Search is changing and evolving as users change how they interact with the web. Originally the relationship was linear: a user placed a query in a public search engine, such as Yahoo! or Google, and received back a list of textual links that the search engine deemed relevant to the query. The activity may have stayed the same, but users today have an altered relationship with search. First, the definition of search has by necessity expanded beyond its original narrow view to include anywhere a user might go to seek any type of information. For example, Amazon.com is in fact a search engine for books, and the large travel sites are search engines for travel information. YouTube is a giant video search engine, and Flickr is a searchable photo archive. The addition of text link advertising in search results added a new dimension and enriched the texture of search, but it did not change the fundamental activity: the searcher still got ten blue links, along with the advertisements decorating the page.
Today, the rapid growth of social networking sites continues to shape the face of search. These social networks, with their rapidly changing memes, require almost instant, real-time indexing; otherwise, the information flowing through these virtual rivers of communication is as interesting as yesterday’s news. The inclusion of news results in search, both through dedicated sections (Google News) and within the main results, allows users to rapidly access time-sensitive information from trusted sources. Rapid indexing has created opportunities for publications and news outlets such as blogs to bring up-to-the-minute information to a broad spectrum of readers through search engines. The negative impact of the growth of online sources on the publishing industry has been well documented. Search has played a significant role in making the economics of publishing more difficult; however, this should be viewed within the larger context of how users relate to information in the current environment.
Not only is the search experience shifting in terms of what types of information can be sought and how fast they can be found, but a major shift has also occurred in where and how the searcher queries the web. Originally, the user was anchored to a computer with its internet connectivity and browser. Then laptops and wireless connectivity freed the user to search in more locations. Today, the browser is as likely to be on a smartphone or an iPad. The user is completely untethered from the traditional computer and can query the web from almost any location. The browser, the search results and the web page must now be configured to match the specific confines of these devices. The user has come to expect ubiquitous access to information.
Search marketers have had to expand their horizons beyond seeking traffic from the ten blue links on the search engine results page to cover a vast array of new searchable media and devices used for search. They have had to change their tactics and evolve along with search. The evolution of search brings with it opportunities and threats for publishers.
Search Optimisation Is Not the Same Old Game
In the past, search optimisation was often described as a game of cat and mouse played by clever search marketers eager to stay ahead of the search engines. The cleverest players were rewarded with the top positions on page one of the search results. Search tactics focused on overcoming technical difficulties that might prevent search crawlers from accessing all of the pages of a site, and on making sure that page titles and other metadata contained the keywords matching the queries searchers were putting into the search engines. These were not trivial tasks: as web technologies advanced, their increased complexity brought problems with content duplication and with technologies that defeated the capabilities of search spiders, a notoriously slow-evolving segment of the web. Today, some of these tasks seem simple and almost quaint.
Some of these tactics have persisted and are still best practice for search today. Carefully crafted document titles remain a very important ranking element for organic search. Search marketers still focus on making sure that the content on a site is accessible to search engines and that duplicate pages are not presented. Duplicate content creates confusion for a search engine as it tries to identify which of several identical pieces of content is the “canonical” one. Content duplication is a serious challenge for publishers, and it has two main sources. First, duplication arises when a site manager publishes a print-page version of each article or story and has not blocked search crawlers from accessing it. The second source is reuse and syndication. There are broadly accepted tactics for preventing and remedying duplicate content, and solving these problems is the search marketer’s challenge today.
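One of those broadly accepted tactics is to point every duplicate at a single canonical address, for example via the rel="canonical" link element. The sketch below is a minimal illustration, assuming a hypothetical site where the print version is flagged with a `print=1` query parameter; real URL schemes vary.

```python
# A minimal sketch, assuming print versions are flagged with a
# hypothetical "print=1" query parameter; real sites differ.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    """Strip the print-version flag so duplicate URLs collapse to one."""
    scheme, netloc, path, query, frag = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(query) if k != "print"]
    return urlunsplit((scheme, netloc, path, urlencode(params), ""))

def canonical_link_tag(url):
    """Emit the link element that tells engines which copy to index."""
    return '<link rel="canonical" href="%s" />' % canonical_url(url)

print(canonical_url("http://example.com/article/123?print=1"))
# http://example.com/article/123
```

Placing the resulting tag in the head of both versions of the page tells the crawler which copy should receive the ranking credit.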
Some of the Same Old Problems Persist
Getting a site indexed and making sure that its content is included in the search engines has long been a key task for search marketers, for in search the county fair principle applies – you must be present to win. Today, the search engines have made it easier for site owners to feed their content into the index via XML feeds of page URLs (sitemaps), but this has not reduced the challenge.
Although it is easier to make a search engine aware of the pages on a site, there is still no guarantee that they will be indexed. For owners of very large sites (hundreds of thousands of URLs) or publications with large searchable archives, this can mean entire content areas remain unindexed even though they have been fed to the search engine. The reason is simple: the web is growing rapidly, and indexing all of its content is still a major challenge for search engines.
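For reference, an XML feed of page URLs in the sitemap format is little more than a list of `<loc>` entries wrapped in a `urlset`. The sketch below builds a minimal feed; the example.com URLs are placeholders.

```python
# Illustrative sketch: build a minimal sitemap-format XML feed of
# page URLs to submit to the engines. The URLs are placeholders.
def build_sitemap(urls):
    """Return a minimal sitemap.org-style urlset for the given URLs."""
    entries = "\n".join("  <url><loc>%s</loc></url>" % u for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )

print(build_sitemap([
    "http://example.com/",
    "http://example.com/archive/2010/",
]))
```

Submitting such a feed makes the engine aware of the URLs, but, as noted above, awareness is not the same as inclusion in the index.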
This past year, Google, whose ambition is to index all of the world’s information, announced to search marketers that a site’s speed and performance would be added to the metrics used to evaluate its relevancy. The change went live in April of this year. The logic is simple: a slow site, or one with a large number of bad pages (those returning 404 errors), is by extension a site that is poorly maintained and hence less likely to be attractive to searchers. The net result has been a renewed interest among search marketers in the technology that drives their sites; many had previously left the arcane areas of server speed and load balancing to specialised technologists. It is this author’s belief that we are seeing the first salvos in a new arms race, whereby the search engines – most notably Google – prompt site owners to create ever faster and better performing sites.
Forcing search marketers to focus on the speed and performance of their sites also has a positive impact for the search engine itself: if a site is fast, lean and performs well, the search engine’s spiders can harvest its pages more quickly, allowing the engine to index more pages with fewer resources.
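The bad-pages signal, at least, is something a site owner can monitor from their own server logs. A hedged sketch, assuming entries in the common Apache-style log format (the sample lines below are invented):

```python
# Hedged sketch: estimate the share of 404 responses from access-log
# lines in common log format; the sample entries are invented.
def error_404_rate(log_lines):
    """Fraction of logged requests that returned HTTP 404."""
    statuses = [line.split()[8] for line in log_lines
                if len(line.split()) > 8]  # field 9 is the status code
    return statuses.count("404") / len(statuses) if statuses else 0.0

sample = [
    '1.2.3.4 - - [10/Apr/2010:13:55:36 +0000] "GET /a HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Apr/2010:13:55:37 +0000] "GET /b HTTP/1.1" 404 210',
    '1.2.3.4 - - [10/Apr/2010:13:55:38 +0000] "GET /c HTTP/1.1" 200 734',
    '1.2.3.4 - - [10/Apr/2010:13:55:39 +0000] "GET /d HTTP/1.1" 404 180',
]
print(error_404_rate(sample))  # 0.5
```

A rising 404 rate is worth fixing for readers regardless of how heavily any engine weighs it.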
Site owners can improve their site’s speed through template modifications, such as reconfiguring when scripts load. These changes are just the first level of performance enhancement; down the road, site owners should expect to consider back-end re-engineering to enhance speed further. Some ecommerce firms are already making such changes in an attempt to further optimise their sites’ performance and retain their coveted positions in the search results. For a multi-publication publisher, now would be a good time to review what steps need to be taken to improve site performance.
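One such template modification is to defer non-critical external scripts so they no longer block page rendering. The sketch below is a deliberately simplified, regex-based illustration; in practice the attribute would be added in the template markup itself.

```python
# Simplified illustration (regex-based; a real change would edit the
# template markup directly): add defer to external scripts lacking it.
import re

def defer_scripts(html):
    """Add a defer attribute to external <script> tags without one."""
    return re.sub(
        r'<script (?![^>]*\bdefer\b)([^>]*\bsrc="[^"]+"[^>]*)>',
        r'<script defer \1>',
        html,
    )

print(defer_scripts('<script src="/js/ads.js"></script>'))
# <script defer src="/js/ads.js"></script>
```

Deferred scripts download without blocking and execute after the page has been parsed, which shortens the time before content appears to both users and crawlers.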
New Devices, New Opportunities
As users adopt iPads, Kindles and ereaders at an accelerating pace, there are opportunities for publications to capture subscribers and new readers devoted to these new technologies. The challenges go beyond just making sure that there is a version compatible with the most popular device; it is essential that the reader can find the publication in its ereader format via search. iPhone app developers not only sell their apps through iTunes, but they also have microsites dedicated to their apps, showing screenshots and the selling features of the particular app. These app microsites are easy to find when the developers have made sure that they are adequately search optimised. The user will use the microsite to review the app and then purchase it through iTunes. Publishers of ereader versions will want to take a leaf from the app developers’ book and make sure that the special ereader version is easy to find via search.
Search Has Changed Yet Stayed the Same
The activity of searching is by now an integral part of the online experience. Search is no longer the linear activity of placing a query in a favourite search engine and then scanning pages of ten blue links. Search is now part of every online activity. Publishers who recognise that users want the information as quickly as possible and make their information available instantly will be rewarded.