The Evolution of the Google SEO Algorithm


The most coveted real estate in the world is, without a doubt, Google’s first results page. Getting there is a tall order, but it is important to understand the basics of the Google algorithm; adjusting your website accordingly can work in your favor.

 

This article will cover:

  • Overview of the Google algorithm

  • What are Keyword Tags

  • What is On Page Optimization

  • What is Anchor Text

  • What is Keyword Density

  • What is Topic Modeling

  • What is Domain Authority

  • What is Link Diversity

  • What are Brand Signals

  • The updates to the Google algorithm

 

It’s hard to believe, given the vast number of websites online, but in this decentralized environment a single company dictates the behavior of millions of sites – Google. Almost any company wishing to have a significant presence in the market depends on the search giant, and may even be a direct Google customer through AdWords.

 

The Most Secretive Algorithm Of Them All

Google’s algorithms have an immediate impact on a site’s architecture, its inbound and outbound links, and the volume and quality of its content. This gives Google, a business entity, de facto regulatory status, setting the rules for how sites behave. Sites that do not adhere to its guidelines find themselves marginalized and increasingly irrelevant.

The exact nature of Google’s algorithm changes (reportedly involving hundreds of ranking components) is never fully disclosed. These changes, designed to prevent manipulation of search results, are covered later in this article. And although Google keeps the details hidden, SEO agencies conduct extensive research and use the data they gather to draw conclusions that help their clients stay in the top search results.

 

From Birth To Beta: Google Introducing PageRank

Google began as a research project in Stanford’s Computer Science department. Sergey Brin and Larry Page set out to build a large-scale search engine that would crawl and index the Internet, improving on the methods in use at the time. This took place in 1996, two years after WebCrawler, a broad website index culled from a mere 4,000 servers, was introduced. At the time, Yahoo, another search property, was still powered by human editors rather than automated crawling.

While still in beta, Google introduced the PageRank algorithm to rank websites based on incoming links. PageRank grew out of Brin and Page’s idea that pages can be ranked by link popularity. In other words, the more links a webpage received from other pages – and the more important those linking pages were – the higher it ranked.

This concept led to what Larry Page described as “an objective measure of its citation importance”. Link-based ranking has since been adopted by many other search engines, and is considered a key factor in determining a site’s value and worth.
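
To make the idea concrete, here is a minimal, illustrative PageRank sketch in Python. The link graph, damping factor and iteration count are assumptions chosen for the example, not Google’s data or implementation.

    # Illustrative PageRank: repeatedly redistribute each page's rank across its
    # outgoing links until the scores settle. Graph and parameters are made up.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links out to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                targets = outgoing or pages          # dangling page: spread rank evenly
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            rank = new_rank
        return rank

    graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
    print(pagerank(graph))  # pages with more, better-connected inbound links score higher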

 

Keyword Tags

In the late ’90s, webmasters and web page managers could determine how a page would appear in search engine indices and web crawlers – by using keyword meta tags. These are words and phrases describing, or related to, the main subject of the page. Search engines used those keywords blindly, relying on them when ranking search results. The need arose for a more discerning ranking signal, one that could not be as easily manipulated by false or inflated keywords.
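
For illustration, the hedged sketch below uses Python’s standard html.parser to read the old-style keywords meta tag from a made-up page; the HTML snippet and its stuffed keyword list are invented for the example.

    # Illustrative only: extract the legacy <meta name="keywords"> tag that early
    # engines trusted at face value. The HTML snippet is a made-up example.
    from html.parser import HTMLParser

    class KeywordTagParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.keywords = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "keywords":
                self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

    page = '<head><meta name="keywords" content="shoes, running shoes, cheap shoes, best shoes"></head>'
    parser = KeywordTagParser()
    parser.feed(page)
    print(parser.keywords)  # nothing stopped a webmaster from inflating this list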

 

On Page Optimization

In addition, on-page optimization introduced a variety of technical elements that help the search engine deduce the subject of a page. These elements include the title and description meta tags, the page’s headings (the HTML H tags), as well as the page content and the appearance of keywords within it.
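
As a rough illustration, the sketch below checks a handful of on-page elements for a target keyword. The function name, fields and sample values are assumptions for the example; Google has never published such a checklist.

    # Hypothetical on-page checklist – only to show the kinds of elements involved,
    # not Google's actual scoring.
    def on_page_signals(title, description, headings, body, keyword):
        keyword = keyword.lower()
        return {
            "keyword_in_title": keyword in title.lower(),
            "keyword_in_description": keyword in description.lower(),
            "keyword_in_headings": any(keyword in h.lower() for h in headings),
            "keyword_mentions_in_body": body.lower().count(keyword),
        }

    print(on_page_signals(
        title="Running Shoes | Shop Online",
        description="The best running shoes for every budget.",
        headings=["Running Shoes", "How to choose a pair"],
        body="Our running shoes are light and durable. Running shoes for every season.",
        keyword="running shoes",
    ))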

 

More Technical Tools Are Coming

Between 2003 and 2005, new technical tools enabled the search engine to deduce independently what a site was about. Those tools included anchor text, topic modeling and keyword density. Anchor text is the clickable text that leads via hyperlink to another webpage, document or location. Search engines take this text into consideration when ranking the target page: the more relevant the anchor texts pointing to a page, the higher it ranks in the search results. As a result, the relative weight of PageRank within the overall assessment diminished.
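
A small, hedged sketch of the anchor-text idea: aggregate the clickable text of every link pointing at a page, and the dominant anchors hint at what the page is about. The link data below is invented.

    # Aggregate anchor texts per target page; the (source, anchor, target) tuples
    # are made up for illustration.
    from collections import Counter, defaultdict

    links = [
        ("blog-a.example", "best running shoes", "shop.example/shoes"),
        ("forum-b.example", "running shoes", "shop.example/shoes"),
        ("news-c.example", "click here", "shop.example/shoes"),
    ]

    anchors_by_target = defaultdict(Counter)
    for source, anchor, target in links:
        anchors_by_target[target][anchor.lower()] += 1

    print(anchors_by_target["shop.example/shoes"].most_common())
    # the dominant anchors suggest the target page is about running shoes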

 

Keyword Density

The keyword density of a page was a prominent metric during those years – it measures how often a certain keyword or phrase recurs, relative to the overall word count of the page. Metrics like this reduced the search engine’s reliance on keyword tags alone, but were still susceptible to manipulation.
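
The usual definition is simple enough to write down: occurrences of the phrase divided by the total word count. The function and sample text below are illustrative; nothing here is Google’s formula.

    # Keyword density as commonly defined; the sample text is deliberately stuffed.
    def keyword_density(text, phrase):
        words = text.lower().split()
        phrase_words = phrase.lower().split()
        if not words:
            return 0.0
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return hits * len(phrase_words) / len(words)

    text = "running shoes for trail running and road running shoes"
    print(f"{keyword_density(text, 'running shoes'):.1%}")  # ~44% – an obviously stuffed page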

 

Topic Modeling

Topic modeling – most commonly associated with Latent Dirichlet Allocation (LDA) – is another technique most likely incorporated in Google’s algorithm. LDA is used to measure the relevance between a webpage and contextually related search queries. It identifies the topics expressed within documents and estimates which documents are most likely to have generated a given collection of search terms. Thus, the more relevant a page is to the query’s topics, the higher it will rank in the search results.
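
For a feel of how LDA behaves, here is a toy example using scikit-learn’s LatentDirichletAllocation. The documents, topic count and query are invented, and this is of course not Google’s implementation.

    # Toy LDA demo: learn two topics from tiny documents, then see which topic a
    # query leans toward. Requires scikit-learn; all data here is made up.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "running shoes trail shoes marathon training",
        "marathon training plan running pace",
        "wine tasting vineyard tour cellar",
        "vineyard wine cellar tasting notes",
    ]

    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(docs)

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(counts)                      # per-document topic mixture
    query_topics = lda.transform(vectorizer.transform(["best shoes for a marathon"]))

    print(doc_topics.round(2))
    print(query_topics.round(2))  # pages sharing the query's dominant topic are the relevant ones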

 

Don’t Wear That Black Hat

During this era, black-hat SEO practitioners could still get away with dishonest methods such as writing scripts, creating web graffiti and stuffing blogs and forums with search terms. These methods got their sites top search results and earned them money, but they underlined the need for a newer, cleaner way to gain top rankings without resorting to spam.

 

[Image: Google Algorithm - Black Top Hat]

 

The Era of Authority and Trust

The next phase, between 2006 and 2009, marked Google’s continued shift from purely technical SEO metrics to a richer set of signals favoring authority and trust. The main concepts introduced during this era were domain authority and link diversity.

 

Domain Authority

Domain authority came along to differentiate between the treatment of serious, valuable sites and that of so-called garbage sites. Until this time, the official site of a well-known brand (e.g., Nike.com, Adidas.com) did not have any notable SEO advantage over a low-quality exact match domain site that shared its topic and keywords (e.g., shoesforrunning.com).

From here on, the entire domain was graded according to authority, or trust. If you had a high-authority site, your pages were pushed up within the search results. When competing against a site of lesser authority, even child or secondary pages on your site were boosted above your competitor’s primary and landing pages and received preference.
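
Purely as an illustration of the mechanism described above – Google has never published such a formula – here is a toy blend of a page’s own relevance with a domain-level authority score. The weights and scores are invented.

    # Invented weights and scores: only to show how a domain-level trust score can
    # lift every page hosted on that domain.
    def effective_score(page_relevance, domain_authority, authority_weight=0.4):
        """Blend a page's own relevance with its domain's authority (both on a 0-1 scale)."""
        return (1 - authority_weight) * page_relevance + authority_weight * domain_authority

    # A secondary page on a trusted brand domain vs. a landing page on a low-trust domain.
    print(effective_score(page_relevance=0.55, domain_authority=0.95))  # brand child page: 0.71
    print(effective_score(page_relevance=0.70, domain_authority=0.20))  # exact-match domain page: 0.50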

 

Link Diversity

Link diversity, on the other hand, meant giving preference to a site with a diverse link profile. A site with inbound links from many different, unique domains fared better than one with a homogeneous link profile (many links from the same domain).
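
A minimal sketch of one way to express link diversity: count unique referring domains rather than raw link volume. The backlink lists are made up for the example.

    # Unique referring domains vs. raw link count; backlink URLs are invented.
    from urllib.parse import urlparse

    def referring_domains(backlinks):
        return {urlparse(url).netloc for url in backlinks}

    diverse = ["https://news.example/a", "https://blog.example/b", "https://forum.example/c"]
    homogeneous = ["https://linkfarm.example/1", "https://linkfarm.example/2", "https://linkfarm.example/3"]

    print(len(referring_domains(diverse)))      # 3 unique domains from 3 links
    print(len(referring_domains(homogeneous)))  # 1 unique domain from 3 links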

The build-up of the brand concept continued between 2009 and 2011, as Google introduced brand signals to differentiate a strong brand from a generic one.

 

Brand Signals

Those signals included search volume – how many searches were made for a particular brand – and social presence (maintaining authentic, popular accounts on social networks). By monitoring user behavior and click-through rate (CTR), and examining whether a user returned to the search engine soon after clicking on a search result, Google was now able to gauge the relevance of the displayed results and tweak them for better performance.
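
To illustrate the kind of behavioral measurement described above, here is a hedged sketch that computes a click-through rate and a “quick return” rate from hypothetical search-log entries; the field names, threshold and data are all invented.

    # Hypothetical log format and 30-second threshold – invented for illustration.
    def result_quality(log):
        """log: list of dicts with 'clicked' and optional 'returned_within_seconds'."""
        impressions = len(log)
        clicks = [e for e in log if e["clicked"]]
        quick_returns = [e for e in clicks if e.get("returned_within_seconds", 9999) < 30]
        ctr = len(clicks) / impressions if impressions else 0.0
        quick_return_rate = len(quick_returns) / len(clicks) if clicks else 0.0
        return ctr, quick_return_rate

    log = [
        {"clicked": True, "returned_within_seconds": 8},    # bounced straight back
        {"clicked": True, "returned_within_seconds": 600},  # stayed on the page
        {"clicked": False},
    ]
    print(result_quality(log))  # a high CTR but many quick returns suggests a misleading result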

 

The Infamous Google Algorithm Updates

From 2011 onwards, Google invested its efforts in three main algorithmic paths in order to solve spam and trust problems. That same year, it deployed Panda, a shift in its search algorithms aimed at eliminating content farms (low-quality sites that reach top-tier rankings by aggregating and manipulating content) and rewarding original content.

 

Panda: The Rise Of The Landing Page

Among Panda’s main focuses: increasing the importance of landing pages and their presentation; determining what makes a page relevant and trusted, using signals such as word count and content freshness; and penalizing sites with thin or low-quality content. As Google’s top search engineers described it, Panda was intended, among other things, to provide better rankings for “high-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on”.
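
As a purely illustrative sketch of the thin-content checks mentioned above, the function below flags pages by word count and content age. The thresholds and fields are invented, not Panda’s actual rules.

    # Invented thresholds and page fields – illustrative only, not Panda's logic.
    from datetime import date

    def looks_thin(page, min_words=300, max_age_days=365):
        reasons = []
        word_count = len(page["body"].split())
        age_days = (date.today() - page["last_updated"]).days
        if word_count < min_words:
            reasons.append(f"only {word_count} words")
        if age_days > max_age_days:
            reasons.append(f"not updated for {age_days} days")
        return reasons  # an empty list means the page passes this rough check

    page = {"body": "Buy cheap shoes here. Best shoes. Cheap shoes.", "last_updated": date(2012, 1, 5)}
    print(looks_thin(page))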

 

Penguin: The Fight Against Spammy Links

A year later, another update was introduced, this time called Penguin. This update acted as a filter of sorts against sites that boosted their rankings by building low-quality links from thin-content pages, directories and blog networks. If Panda was aimed at sites with poor content and user experience, Penguin saw its main enemy in sites that manipulated search engine indices.

 

Penguin scrutinizes links, favoring natural, genuine links over manipulated ones. It also examines the balance between branded anchor links and exact-match search-term anchors. Sites that failed to update their practices found themselves falling behind in the rankings and were quick to rectify it.
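
The balance between branded and exact-match anchors can be sketched roughly as below; the brand name, anchor list and the reading of the output are assumptions for illustration, not Penguin’s actual thresholds.

    # Invented brand and anchors – only to show the branded vs. exact-match split.
    def anchor_profile(anchors, brand="acme"):
        branded = sum(1 for a in anchors if brand in a.lower())
        exact_match = len(anchors) - branded
        share = exact_match / len(anchors) if anchors else 0.0
        return {"branded": branded, "exact_match": exact_match, "exact_match_share": round(share, 2)}

    anchors = ["Acme Shoes", "acme.example", "cheap running shoes",
               "cheap running shoes", "best running shoes", "cheap running shoes"]
    print(anchor_profile(anchors))  # a profile dominated by exact-match anchors looks unnatural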

 

[Image: Google Algorithm - Penguin with Megaphone]

 

Hummingbird: Searching Semantics & User Intent

2013 saw the introduction of Google’s Hummingbird search algorithm, built around contextual search. Unlike the previous updates, which were layered on top of the old search algorithm, Hummingbird introduced a new algorithm altogether. It marked a clear move towards semantic search technology: the engine could now analyze search queries in terms of human meaning, focusing on user intent, which in turn reduced the need to create separate landing pages for similar terms.

 

Introducing The Knowledge Graph

Expanding the use of Google’s Knowledge Graph, Hummingbird shows that Google has come full circle from simply indexing the web to truly understanding it. If, in the past, a restaurant owner wishing to build a significant web presence had to build one landing page for the restaurant, a separate one for “a place to eat” and another for “dining”, these days Google can discern that these are all terms referring to the same entity.
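
As a toy illustration of the entity idea – nothing like the Knowledge Graph’s internals – the hand-written mapping below resolves several phrasings to one entity.

    # Hand-written alias table, purely illustrative of "different phrasings, one entity".
    ENTITY_ALIASES = {
        "restaurant": "LocalRestaurant",
        "a place to eat": "LocalRestaurant",
        "dining": "LocalRestaurant",
    }

    def resolve_entity(query):
        return ENTITY_ALIASES.get(query.lower().strip(), "unknown entity")

    for q in ["Restaurant", "a place to eat", "dining"]:
        print(q, "->", resolve_entity(q))  # all three resolve to the same entity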

 

Furthermore, Google’s handling of queries relies increasingly on broader context – where you are when you send your search query, which device you searched on, your recent search queries, and so on. In fact, Google’s understanding of searches has become so contextually broad that typing in an abstract query such as “movie where two guys drink wine” will actually bring you an accurate result – “Sideways”, a successful 2004 film about two male friends taking a wine-tasting trip.

 

So What’s Next? Quality Content

Over the past decade and a half, SEO marketing has shifted from relying on purely technical tools to a position where content carries equal weight. Of course, Google’s algorithms are completely automated and nothing is manual, so technology is still necessary for the construction and architecture of a site, and technical know-how remains imperative to maintain quality SEO and stay on Google’s good side.

Content marketing is becoming increasingly dominant, as can be seen in Google’s efforts to eradicate manipulation and the creation of pages built solely for search engines. Instead, it aims to steer websites towards creating pages that keep the user in mind and cater to their wishes and queries.

 

Looking At The Big Picture

In such a perpetually changing environment, with Google calling the shots, it is extremely important to team up with an expert who has a broad understanding of the processes and trends of the field. To avoid losing your company’s rankings in the search results, it is of paramount importance to take a holistic approach and invest in this area, examining the site’s content, links, authority, keywords and much more, keeping them fresh and constantly evolving.

 

[Image: Google Algorithm - Content Tools]

 

This Is How We Do It @eTraffic

It is more apparent than ever that having a great product, awesome service or kickass content is simply not enough to get you noticed in search engines. This is why we at eTraffic invest in research and perpetual optimization. We help our clients achieve higher rankings and find the best way to reach steady, long-lasting results.

by Guy Regev, CEO of eTraffic Web Marketing