
Google algorithm updates: the past, the present & the future

These days, it seems like Google releases a significant algorithm update every two minutes. The SERPs always change, SEO professionals are constantly scrambling to figure out what’s going on, and Google continues to wheel out the “just make better content” statement each time.

So, what is actually happening? Where do we go from here? And does each page need to be precisely 2,146 words with a 3% keyword density and 14 links from DA 50+ domains to rank?

In this blog, we’ll go through a selection of Google’s major algorithm updates over the years, break down the key trends and use that information to try to figure out what those mischievous devs up in Google Towers will do next.

Google’s early years

During Google’s early years, algorithm updates were a rare event rather than part of the constant cycle we have now. There would be one or two updates to the SERPs a year, followed by months of silence. Between 2003 and 2010 we got just six major named updates:

  • Florida (2003)
  • Jagger (2005)
  • Big Daddy (2005)
  • Vince (2009)
  • Caffeine (2009)
  • MayDay (2010)

2003’s Florida was the first big Google update and the first sign of what was to come over the next few years. Tactics such as keyword stuffing, hidden text and hidden links were targeted, and we saw the state of the SERPs change dramatically overnight. 

This was followed over the next few years by a handful of updates designed to contribute continuous small improvements – from the Jagger update which specifically targeted link spam to Caffeine which focused on improving how fresh results were. Ultimately though, the SERPs remained pretty stable over this time. Then, in 2011, Google released the update which truly started modern search and eventually led to the results we see today: the Panda Update.

Panda (2011)

Initially rolled out in February 2011, Panda was updated nine times before the year was out, with an unrelated Freshness Update also popping up in November, as Google truly got into the rhythm of pushing updates out on a regular basis.

At this time, the SERPs were dominated by content farms which churned out thousands of low-quality pieces of content a day and users were growing increasingly frustrated by the poor quality of the results from Google. Panda was launched to combat this and can now be seen as the first step towards the famous E-E-A-T we know today.

Human quality raters were involved for the first time at this juncture and asked a series of questions focused on how trustworthy a site and its content were. Panda was then developed by comparing various ranking signals against those human quality ratings. If you’re interested, Google published the 23 questions it asked its quality raters on its Webmaster Central blog.
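As a purely illustrative aside – this is not Google’s actual system, and the signals and data below are invented for the example – the general “compare signals against human ratings” approach looks a lot like training a simple classifier:

```python
# Illustrative only: a toy version of "compare ranking signals against human
# quality ratings". The signals, data and model are invented for this example
# and bear no relation to Google's real systems.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-site signals: [ad density, word count (thousands), duplicate-content ratio]
signals = np.array([
    [0.70, 0.25, 0.60],  # content-farm-style site
    [0.10, 1.80, 0.05],  # in-depth, lightly monetised site
    [0.55, 0.40, 0.40],
    [0.15, 1.20, 0.10],
])

# Human quality-rater verdicts for the same sites: 1 = trustworthy, 0 = not
ratings = np.array([0, 1, 0, 1])

# Fit a simple model that learns which mix of signals predicts a "high quality" rating
model = LogisticRegression().fit(signals, ratings)

# Score a new, unseen site against what the raters' judgements taught the model
new_site = np.array([[0.65, 0.30, 0.50]])
print(model.predict_proba(new_site)[0, 1])  # estimated probability the site is "high quality"
```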

The end result of this was content farms dropping from the rankings and businesses operating with this model quickly having to pivot. Demand Media – one of the biggest players in this space – lost $6.4m in Q4 2012 and cited Google’s updates as a direct contributor to that loss.

Penguin (2012)

Coming hot on the heels of Panda’s 13th update, April 2012 saw the release of Google’s famous Penguin update.

While Panda focused on on-page elements, Penguin had its own singular focus: links.

Penguin targeted link spam and manipulative link building practices. Links have always been an important part of Google’s algorithm, so SEOs have always tried to build links. Pre-Penguin, the easiest way to go about this was to buy and build directory, PBN and other blog links from websites designed to provide links at scale. And this strategy worked incredibly well for a while.

But Penguin aimed to put a stop to that. It was designed to detect low quality links and penalise websites that engaged in spammy practices designed to manipulate the algorithm. Nowadays, Penguin is a real-time part of Google’s Core algorithm after a series of updates designed to further enhance its capabilities – and is the reason you can’t just buy 1,000 PBN links and rank.

Hummingbird (2013)

Google’s next big named update came in September 2013 in the form of the Hummingbird update. This was described by some Googlers as an almost total update of Google’s Core algorithm.

Unlike the previous large updates, Hummingbird focused on improving Google’s overall search technology, rather than punishing websites attempting to manipulate the algorithm.

Hummingbird set the stage for dramatic advances in Google’s capabilities by focusing on improving its ability to parse conversational search queries. The end result of this update was a faster, more precise algorithm that, while not showing immediate impact, paved the way for many of the changes that came later. 

Essentially, Hummingbird enabled Google to better understand the longer queries which were becoming the norm. In previous versions of the algorithm, Google matched each query against a page more or less word for word. This change enabled Google to ignore certain words and cut through to what the user actually meant, so matching your target keywords precisely in the text started to matter a lot less.
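As a rough sketch of that shift – again, not Google’s actual matching logic, just an illustration of “word-for-word” versus “ignore the filler words”:

```python
# Illustrative only: contrasting a strict word-for-word match with a looser
# match that ignores filler words, in the spirit of the Hummingbird change.
STOPWORDS = {"what", "are", "the", "for", "a", "an", "of", "to", "is", "in"}

def exact_match(query: str, page_text: str) -> bool:
    """Pre-Hummingbird-style check: the query must appear verbatim in the page."""
    return query.lower() in page_text.lower()

def loose_match(query: str, page_text: str) -> bool:
    """Drop filler words, then check the meaningful terms are all covered."""
    meaningful = {word for word in query.lower().split() if word not in STOPWORDS}
    page_words = set(page_text.lower().split())
    return meaningful.issubset(page_words)

page = "Our guide to the best running shoes for a beginner runner tested on road and trail"
query = "what are the best running shoes for a beginner"

print(exact_match(query, page))  # False - the exact phrase never appears on the page
print(loose_match(query, page))  # True - the terms that carry the meaning are all there
```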

Fred (2017)

2013 to 2017 saw continual updates in the SERPs, many of which came from refinements of previous algorithms such as Penguin alongside updates such as Pigeon which focused on improving local results. The next algorithm change to shake things up in a big way, however, was March 2017’s Fred.

Following hot on the heels of a Core Update released in February 2017, Fred saw some sites lose up to 90% of their traffic and transformed business models overnight.

Fred’s focus built on the work done with Panda back in 2011, with the main objective being to clear out pages Google deemed to be low quality results. In this instance, that meant sites relying on thin content and aggressive ad placements.

Once again, the focus here was on E-A-T and based on the quality raters’ guidelines and data their team had built up. Sites creating content designed solely to capture search traffic and serve a tonne of ads suffered, while detailed pieces with a less aggressive focus on monetisation won the day.

BERT (2019)


Announced on 25th October 2019, BERT was presented as the biggest change to Google search in five years.

BERT (which stands for Bidirectional Encoder Representations from Transformers) is a natural language processing model pre-trained on large bodies of text. Or in simple, human terms: it helps Google understand the context of the words on the page.

For example, before BERT, if you searched Google for “can you get medicine for someone at the pharmacy”, your results would be pages telling you how to collect your medication. After BERT, Google knows that ‘for someone’ changes the intent of a query from simply collecting medication to doing it for another person and shows the relevant results.
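To make the “context of the words” idea tangible, here’s a small sketch using the open-source bert-base-uncased model via the Hugging Face transformers library. This is emphatically not Google’s ranking system, and mean-pooling a base BERT model is a crude proxy for relevance, but it shows the shift from matching words to comparing context-aware vectors:

```python
# Illustrative only: comparing a query against candidate passages using
# contextual BERT embeddings (open-source bert-base-uncased), rather than
# word-for-word matching. Not Google's system - just a demo of the idea.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool BERT's final hidden states into a single vector for the text."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

query = "can you get medicine for someone at the pharmacy"
passages = {
    "collecting your own prescription": "How to collect your own prescription from the pharmacy counter.",
    "collecting for another person": "Rules for picking up a prescription on behalf of another person.",
}

query_vector = embed(query)
for label, passage in passages.items():
    similarity = torch.nn.functional.cosine_similarity(query_vector, embed(passage), dim=0)
    print(f"{label}: {similarity.item():.3f}")
```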

SEOs writing exact-match articles aligned with Google’s previous understanding of language suffered, while users received results more aligned with the information they required. This means you’re now rewarded for writing more naturally rather than forcing specific phrasing because of Google’s limited contextual understanding.

Passage Ranking (2021)

Building on this increased understanding of language and context, Google then launched the Passage Ranking update in February 2021.

While BERT improves Google’s understanding of context and rewards publishers who focus on writing naturally and with a user-first approach, Passage Ranking enables Google to get incredibly granular by (you guessed it) ranking specific passages of text rather than just whole pages.

Again, the focus here is to reward publishers who cover a topic in as in-depth a manner as possible. We know users want answers quickly and are likely to bounce if they can’t rapidly find the information they’re looking for – which historically pushed publishers towards lots of short articles, each designed to hit a specific search.

Thanks to the Passages update, it’s now much more realistic to put together large, in-depth pieces of content, safe in the knowledge that Google will rank the correct portion of it when searches get highly specific. 
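The rough shape of that idea – with a deliberately crude overlap score standing in for whatever relevance model Google actually uses – looks like this:

```python
# Illustrative only: split a long page into passages, score each against a
# highly specific query, and surface the best passage. The term-overlap score
# is a crude stand-in for whatever relevance model Google actually uses.

def score(query: str, passage: str) -> float:
    """Fraction of query terms that appear in the passage."""
    query_terms = set(query.lower().split())
    passage_terms = set(passage.lower().split())
    return len(query_terms & passage_terms) / len(query_terms)

def best_passage(query: str, page_text: str) -> str:
    """Split the page on blank lines and return the highest-scoring passage."""
    passages = [p.strip() for p in page_text.split("\n\n") if p.strip()]
    return max(passages, key=lambda passage: score(query, passage))

long_guide = """
Choosing a running shoe starts with knowing your gait and weekly mileage.

Waterproof trail shoes use membranes such as Gore-Tex to keep feet dry.

Road shoes prioritise cushioning and low weight over grip.
"""

print(best_passage("are trail running shoes waterproof", long_guide))
```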

Product Reviews (2021)

Launched in April 2021, the Product Reviews update came shortly after the Passage Ranking update and, in many ways, continued to build on what that update had started.

This update was notable for its significant impact on affiliate sites, as it was designed to reward “product reviews that share in-depth research, rather than thin content that simply summarises a bunch of products.” So within the space of a few months, Google essentially released two updates designed to reward publishers who provide in-depth information rather than shorter, summary-focused pages.

MUM (2021)

In what was an incredibly busy year for search, 2021 also saw the arrival of MUM (that’s Multitask Unified Model), unveiled at Google I/O in May and positioned as the successor to BERT.

Upon its launch, MUM was heralded as being 1,000 times more powerful than BERT, capable of connecting information across languages and formats that previous systems struggled to surface. Basically, MUM is BERT 2.0 and makes Google even more effective at understanding natural language and providing contextually relevant results.

For SEOs, it further drove home the need to focus on matching user intent and providing relevant information, rather than hitting specific keywords.

Page Experience (2021)

Google continued its productive summer in 2021 – a period which saw eight significant updates released across June and July – by launching the Page Experience update on 15th June.

This was a long-awaited release, with Google taking the somewhat unusual step of formally announcing the update well in advance, back in November 2020. Its key focus was every SEO’s favourite three words: Core Web Vitals.

Core Web Vitals formed just part of this update, with its major focus ultimately being on rewarding sites which are secure, fast, usable across devices and don’t drown users in ads.
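If you want to check where a site stands on those metrics, one common route is Google’s public PageSpeed Insights API, which reports real-user Core Web Vitals data where it’s available. The sketch below reflects the v5 API as I understand it – treat the endpoint and field names as assumptions and check the current documentation:

```python
# A rough sketch (not an official snippet) of pulling a page's real-user
# Core Web Vitals from Google's public PageSpeed Insights API (v5).
# Field names are assumptions based on the documented response shape.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str, strategy: str = "mobile") -> dict:
    """Return the field-data metrics PageSpeed Insights reports for a URL."""
    response = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy})
    response.raise_for_status()
    data = response.json()
    # "loadingExperience" holds real-user (CrUX) field data when it's available
    return data.get("loadingExperience", {}).get("metrics", {})

for name, metric in core_web_vitals("https://example.com").items():
    print(name, metric.get("percentile"), metric.get("category"))
```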

Helpful Content (2022)

Released in August 2022, this update aimed to do exactly what it said on the tin: reward helpful content.

Once again, the official line from Google on this update was simple: if you create user-centric content designed to provide quality information and match intent you’ll be rewarded. Spam pages designed solely to rank and low quality AI content won’t perform.

What we can learn from more than a decade of algorithm updates

Obviously, this isn’t a complete, exhaustive list of every algorithm update Google’s released over the past few years. There are regular Core updates in addition to Link Spam updates designed to punish spammy link building tactics, local updates which focus solely on improving local search results and other regular iterations on previous updates.

This selection of key updates is enough, however, to paint a clear picture of what Google has been trying to achieve over the past decade(ish). From Panda through to Helpful Content, every update has in some way aimed to do one of the following:

  • Ensure content that ranks is written or fact-checked by an expert in the field, meaning users can trust they’re getting reliable information
  • Differentiate relevant, high quality citations from irrelevant, poor quality citations. Again, this helps ensure users get reliable information rather than receiving knowledge from sources that can’t be trusted to be accurate
  • Reward user-first content which covers the topic in detail, rather than pieces only created to hit a certain keyword group
  • Encourage websites to put user experience, rather than ad monetisation, at the forefront of their thinking

In short: Google’s algorithm updates have consistently focused on rewarding websites which are:

  • Fast
  • Uncluttered
  • User intent-focused
  • Trustworthy

So websites focused on E-E-A-T and providing a great user experience should be reaping the rewards from an organic perspective.

How the SERPs have changed

Of course, it’s not just how the results are ordered that’s evolved over the last few years – the SERPs themselves have changed a lot too!

In 2011, the SERPs were pretty basic: ten blue links with a couple of text adverts at the top and the right.

Skip forward to today, however, and a search for something like ‘buy running shoes’ serves up a SERP which features:

  • Text ads
  • Image-based shopping ads
  • A ‘Find Results On’ box which links to relevant directories
  • The map pack which shows local shops selling running shoes
  • People Also Ask
  • A ‘Refine by brand’ option which enables people to select a specific brand
  • Related searches

And ten blue links – some of which now feature product imagery, so users can get a sense of your inventory without even leaving the SERPs.

Much like with the algorithm updates, this has been an evolution which has occurred over a period of years, but all of these changes have had a clear overriding focus: to give users as much information as possible before they leave the SERPs and visit a website. What this often leads to is:

  • A win for Google – people spend more time on its properties and are thus more likely to click an ad and become a profitable customer
  • A win for users – having the answer to a quick question given to you without having to go to a new page is incredibly convenient
  • A loss for website owners – fewer people visit their website because they’ve already got the answer they need or had their eye caught by an image-led result

In some ways, this is something we just have to accept. People who want a quick, one-line answer will get what they want and move on with their day without even recognising you provided it.

The future of Google

So where do we go from here? And what do we think the next decade or so of search will look like? Honestly, as much as we all say Google and SEO are wildly unpredictable, ever-changing beasts, the trends on a macro level are pretty clear – Google will reward content which:

  • Is original
  • Matches the intent of search
  • Covers the stated topic thoroughly
  • Isn’t plastered in ads or slow to load
  • Cites relevant sources
  • Is delivered from a position of proven expertise

So when you’re putting together an eCommerce category page, for example, you should consider the following:

  • Do you have the inventory to justify a page? 
  • Do you have clear, compressed images of each product to use?
  • What questions do people have when searching for that product, and how do they want them to be answered?
  • Is the category relevant to your business’s core area of expertise?

From this, you’ll have a great starting point for producing a category page and supporting content. Ensuring this is created by experts and that all technical elements are in place will put you on the path to success.

We also know from previous changes that Google is focused on providing users with as much information directly on the SERPs as possible. Bard and other ChatGPT-inspired integrations, along with new SERP features, only continue down a path Google has already made significant headway on in recent years. Queries requiring short, sharp answers will continue to send little traffic, and people will become even more used to getting answers quickly with minimal fluff. This means website owners and SEO professionals will get much better engagement by getting to the point when answering questions.

On the eCommerce side, product pages are probably going to become much more important. Google is already moving towards making searches which would usually return category pages more visual, and we know that most traffic comes from people browsing products on our category pages. The logical endpoint of the changes Google has already made may be to create its own category page which ranks products – and honestly, browsing that rather than opening 100 tabs to compare 100 shops does seem pretty appealing!

Ultimately though, if every indexable, canonical page has a clear purpose, provides users with all relevant information and a great experience, and proves itself to be a trustworthy source, you’re probably going to be okay in the long run. 

Admittedly that’s far less fun than over-analysing, catastrophising and speculating about every little thing! So, maybe the official (very much my own) Re:signal consensus can be: in a decade the internet won’t even be a thing any more after Bard becomes sentient and accidentally sends us all back to the year 457, so there’s no point to any of this.