Change is constant and unrelenting in online marketing and, most notably, in search. New technologies continue to emerge as others fade. Some promising innovations wither and never live up to their initial hype; others become household names. Who could have known that a university project – Google – would grow into a gigantic corporation with enormous worldwide influence and power? Through all of this change there is one constant, beyond change itself: the online user’s need to find information in an ever-expanding digital universe. There is another constant: users rely on search to navigate their way to the information they need. If we expect users to come to our sites, we must consider very carefully what information we are providing, not just whether a search engine will like the page. The days of chasing changes in search algorithms are history. Today, understanding user intent, and the language used to convey that intent, are the requisite keys to search success.
For most businesses, search is the primary engine of growth. It is the tip of the spear and a key strategic element in the marketing mix, not just an add-on. Many marketers, however, still do not stay current with changes in search, leaving it to search experts; in so doing, they neglect to integrate search across their entire organisation.
Blurred Lines
This approach shortchanges learning about and understanding the dynamics of user intent. In today’s online world, it is essential to integrate search with the rest of the marketing effort. The digital universe has evolved, and the boundaries between offline and online have almost completely blurred. Whether a business’s goal is audience development or selling widgets, the wise marketer must integrate online with offline strategies to capitalise on the synergies now available.
To do this effectively, the silos between technology and marketing must be breached. Brand development, content marketing and reputation management are now as important for search engine marketing success as a well-thought-out keyword strategy and a technically sound, search-friendly site. This is because search engines, most notably Google, have integrated these traditionally offline elements into the algorithm used to answer the traditional keyword query.
The keyword query itself has evolved. The shift from the desktop to mobile devices has spurred and accelerated many of the current changes in search technology. As more users rely on mobile devices, search must align with their needs. Users on mobile devices expect immediate answers to very direct questions: ‘Where is Anfield?’ In response, Google has made significant changes to its algorithm, moving from a strict keyword-matching formula to one that is sensitive to more conversational phrases and to user intent. When we type a query into a search box, we may be providing only a partial signal of what we really want to know; our intent is often buried in a string of circumstances, and even in previous queries. Search today digs into the implied intent of the query.
For example, user behaviour on mobile phones is predictable. The user often wants directions to a specific location but may not know exactly where they currently are. The technology solves the puzzle. To respond accurately to this type of geo-locational question, Google has developed sophisticated local search algorithms that use the GPS built into the phone to deliver ultra-specific, geo-locational results.
Once the user learns one piece of information, their curiosity often leads them along a predictable path. For example, the query ‘Where is Anfield?’ immediately calls for an answer to a further question: ‘How far away is the stadium?’ The searcher really wants both pieces of information, and today the technology often answers both upon receiving the initial query.
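To make the ‘how far away’ half of that query pair concrete, the sketch below computes the straight-line distance between a user’s GPS position and a known venue using the standard haversine formula. It is a minimal illustration only: the coordinates are approximate placeholders, not the method Google itself uses.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius of roughly 6,371 km

# Hypothetical user position in central Liverpool versus Anfield (approximate coordinates)
print(round(haversine_km(53.408, -2.992, 53.431, -2.961), 1), "km")
```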
Can You Hear the Humming? Probably Not
In September 2013, Google unveiled Hummingbird, the single largest revamp of its core search algorithm in over ten years. Unless you follow search closely, the change was invisible, but its reach is clear: it is expected to affect 90% of searches. Hummingbird is a fundamental change in how Google processes the query in the background before delivering the results to the search results page. The intent of this major change was to improve the speed and precision of that processing. The algorithm now uses a number of signals derived from the query and the user’s behaviour to deliver a result that quickly and precisely answers what the user really wanted to find. It is breakthrough technology that is, for the most part, invisible.
What are the magic ingredients in this formula? The new processing enriches the query with data from the Knowledge Graph – the quick-look panel that typically appears on the right side of the search page. The intent is to provide answers to questions posed in conversational language – ‘Where is Anfield?’ This methodology allows Google to get at the user’s intent more accurately than keyword matching alone. It is easy to see how this works. The query ‘Where is Anfield?’ is clearly a request for information or directions, and the search results provide both immediately. Figure 1 shows the contents of the Knowledge Graph box, which gives the user basic information at a glance. By placing the official site for the stadium at #1 in the search results (Figure 2), Google lets the user easily find more detailed, specific information on events held at the stadium. This new processing algorithm gives faster and more precise answers to a seemingly simple query.
Although Hummingbird affects 90% of all search queries, the impact on traffic for most sites was negligible; no significant shifts in web traffic were reported worldwide after its launch. Most site owners never heard the humming. This was not, however, an invitation for site owners to stand pat on their search efforts. As users become more familiar with conversational search, they will expect content that is more directly aimed at the types of questions they might form into queries. Additionally, since the Knowledge Graph and the official site figure large in the results, it behoves site owners to tighten up their unique value proposition so that they stand out as the authoritative voice. The site links shown in the search results should also be carefully honed and made more appealing to users. Web design in the future will map to users’ queries, not just to workable information silos.
Search Success with Hummingbird
When the initial reports showed no change in worldwide traffic as a result of Hummingbird, the challenge was not how to react to Google’s change but the more unfamiliar task of adapting to a changed future. The shift from keyword matching to query intent does not eliminate the need for basic SEO. A page must still be found before Google’s new system can deliver it in response to a query, which means site owners still need to make sure their sites are both crawled and indexed by Google. Google’s Webmaster Tools give site owners considerable insight into crawl statistics and indexing numbers, and all online marketers should review these statistics periodically. With the ability to feed Google lists of URLs for crawling, as sketched below, there is no reason for a site not to be in the index.
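As a minimal sketch of what ‘feeding Google a list of URLs’ can look like, the snippet below writes a basic XML sitemap using only the Python standard library. The domain and page paths are placeholders; in practice the finished sitemap.xml would be referenced from robots.txt or submitted through Webmaster Tools.

```python
from xml.etree.ElementTree import Element, SubElement, ElementTree

# Placeholder URLs -- substitute the site's real pages
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/faq",
    "https://www.example.com/events",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = page
    SubElement(url, "changefreq").text = "weekly"  # a hint only; crawlers may ignore it

# Writes sitemap.xml with an XML declaration, ready to be submitted or linked from robots.txt
ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```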
The real key to success with Hummingbird is content. Publishers create a wealth of content, so this is not a challenge. For Hummingbird success, the content should present answers to the questions that might be posed in a search query. This does not mean writing that is stilted or artificial, but rather writing that is rich in information and presented clearly. If you expect your content to appear near the top of the search results, it must meet three criteria: fresh, frequent and unique.
Fresh content does not necessarily mean all-new content. If your site has evergreen pieces such as frequently asked questions or how-to articles, consider how long they have been on the site. Would they benefit from a bit of a refresh? For Google, fresh content is better than stale content. ‘Who reads yesterday’s newspaper?’ is the applicable thinking.
Frequency is another metric used to measure content. It is better to have a continuous stream of content flowing into the site than a single large upload. Frequency can be achieved through a number of strategies familiar to publishers; each should choose the one that fits its publishing model.
Google has provided authorship and publisher markup to allow publishers to clearly identify their content. This gives publishers the opportunity to brand their content and develop a following for their writers, and Google appears to give preference to verified authors. If you are not currently using this markup to identify your authors, adopting it will go a long way toward improving your results under Hummingbird. As publishers and searchers adapt more fully to the changes, we can expect to see improved results. The promise of semantic search is being fulfilled: whether the query is informational, transactional or navigational, a precise and fast answer is forthcoming.
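As a closing illustration of the kind of markup in question, the sketch below scans an HTML fragment for rel="author" and rel="publisher" links using only the Python standard library. It is a simplified check, not Google’s own verification process, and the profile URLs shown are hypothetical.

```python
from html.parser import HTMLParser

class RelLinkFinder(HTMLParser):
    """Collect href values from <a>/<link> tags carrying rel="author" or rel="publisher"."""

    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").strip().lower()
        if tag in ("a", "link") and rel in ("author", "publisher"):
            self.found[rel] = attrs.get("href")

# Hypothetical page fragment showing the two rel attributes
sample = """<head>
  <link rel="author" href="https://plus.google.com/+ExampleWriter"/>
  <link rel="publisher" href="https://plus.google.com/+ExamplePublication"/>
</head>"""

finder = RelLinkFinder()
finder.feed(sample)
print(finder.found)  # e.g. {'author': '...', 'publisher': '...'}
```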