
The Google Search API leak: What it actually means for your site

On Sunday 5th May 2024, internal API documentation was leaked from Google’s Search division. Erfan Azimi, an SEO and founder of EA Eagle Digital, shared the documents, and Rand Fishkin and Michael King then broke the news.

For decades, the way search works has been shrouded in mystery, and for good reason. Old SEO tactics like keyword stuffing, buying links and cloaking keywords or content stopped Google and other search engines from surfacing the most relevant sites. Whenever algorithm factors are revealed, they tend to be abused, and subsequent algorithm updates were designed to stop those tactics from being successful.

Every search update since has been designed to stop sites “gaming the system” and to surface the sites best placed to answer specific queries. How we do SEO has changed wildly as a result: we now focus on making sites more user-friendly and helpful, which in turn improves search.

In a bid to avoid repeating the past, Google has been understandably guarded about confirming specifics. The Google Search Liaison team have been great at confirming or denying information, or simply pointing us in the right direction.

Or so we thought.

The leak revealed that multiple factors Google has previously denied do influence search results. This is important because most SEOs are in a constant battle with internal stakeholders to justify actioning specific dev tickets, and they have cited Google’s advice in their reasoning for making their sites better. But it appears they’d been misinformed.

Unsurprisingly, this has eroded trust towards Google from the SEO industry. However, in many cases revealed in the API documents, SEOs now have confirmation of what we know to be important.

As a site owner or SEO, what do you need to know from the Google Search API Leak? In this guide, I’ll break down the key findings and the actions you’ll need to build into your SEO strategy.

The important ranking factors we now know to be true from the Google Search API Leak

There are many factors in the document that Google previously denied. Some of them are impactful, but others are relatively insignificant or aren’t easily manipulated. We’ve compiled the most important ones below, as highlighted by Rand here:

  1. SERP click data plays a role in ranking (Navboost, CTR%, long vs short clicks, and user data). Several modules in the documentation mention features such as “goodClicks,” “badClicks,” “lastLongestClicks,” impressions, squashed, unsquashed, and unicorn clicks. These indicate that clicks are categorised by the time spent on a result after clicking through from the SERP, whether the click was a mistake, and so on (see the sketch after this list).

  2. Chrome browser clickstream data. It makes sense that Google would use its own data to improve its product, which is why the SEO industry was puzzled when this was denied. The documentation indicates this data is measured and used at both page level and domain level.

  3. Whitelisting sites for travel, politics, and YMYL (your money or your life) keywords. This has been a hot topic before now, particularly during the pandemic, when several COVID-related updates seemed to favour government and official CDC/WHO sites. Again, a whitelist had been denied previously. Here we see indications of a whitelist of domains preferred for travel, politics and YMYL topics.

  4. Google search quality rater feedback is used algorithmically. This has been a particularly interesting factor for me. It shows that human search raters have an algorithmic impact, meaning that even sites which think they’re safe from manual actions could still be affected by rater judgements without being aware of it.

  5. Click data is used to determine link quality. As a former Digital PR Exec, I know this to be true. The links I built from PR activity that I knew were driving volumes of referral traffic always had the largest effect on page-level and site-level search traffic. It makes sense from Google’s point of view too: links that drive traffic are more likely to be relevant.
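None of us outside Google knows exactly how these click features are calculated, but the categorisation described in point 1 can be pictured as a simple bucketing exercise. The sketch below is a rough mental model only: the thresholds, field names and category labels are illustrative assumptions, not values taken from the leaked documentation.

```python
from dataclasses import dataclass

# Illustrative only: a rough mental model of how SERP clicks might be
# bucketed by dwell time and outcome. Thresholds and category names are
# assumptions, not values from the leaked documentation.

@dataclass
class SerpClick:
    query: str
    url: str
    dwell_seconds: float    # time on the page after clicking through from the SERP
    returned_to_serp: bool  # did the user bounce back and pick another result?

def classify_click(click: SerpClick) -> str:
    """Bucket a click into a hypothetical good/bad/long-click category."""
    if click.dwell_seconds < 5 and click.returned_to_serp:
        return "bad_click"           # likely a mis-click or instant pogo-stick
    if click.dwell_seconds >= 60 and not click.returned_to_serp:
        return "last_longest_click"  # the result that ended the search session
    if click.dwell_seconds >= 30:
        return "good_click"          # meaningful engagement with the result
    return "short_click"             # opened, skimmed, went back to the SERP

clicks = [
    SerpClick("running shoes", "example.com/shoes", 3, True),
    SerpClick("running shoes", "example.com/guide", 95, False),
]
for c in clicks:
    print(c.url, "->", classify_click(c))
```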

What does the Google Search API Leak mean for ecommerce brands?

The findings are relevant for all sites, but for ecommerce sites the implications of click data are the most pronounced. Now more than ever, ecommerce sites need to stay ahead of the curve in the following areas:

  • User experience – UX and SEO go hand in hand. Given what we’ve learned about how click data is used, it makes sense to ensure sites load fast, give the user what they need as quickly as possible, offer different or complementary product options, and are accessible to users with disabilities.
  • Focus on niche product categories – Many of us in ecommerce SEO recommend managing crawl budget by reducing the number of niche PLPs (product listing pages) available for Google to index. Ecommerce sites that index a specific, chosen set of facets and optimise them may find they can stay in front of the competition (see the sketch after this list).
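One simplified way to act on that second point is to maintain a deliberate whitelist of facet combinations that earn an indexable landing page, while every other filter combination stays out of the index. The facet names, URL logic and whitelist below are hypothetical examples, not rules from the leaked documentation.

```python
# Illustrative sketch: decide which faceted PLP URLs should be indexable.
# The facet whitelist and filter names are hypothetical examples.

# Facets you have deliberately chosen to optimise, because keyword research
# shows real search demand for them.
INDEXABLE_FACETS = {
    ("category", "running-shoes"),
    ("category", "running-shoes", "feature", "waterproof"),
}

def facet_key(filters: dict) -> tuple:
    """Flatten a filter selection into a sorted, hashable key."""
    return tuple(item for pair in sorted(filters.items()) for item in pair)

def robots_meta(filters: dict) -> str:
    """Return the robots meta value for a product listing page."""
    if facet_key(filters) in INDEXABLE_FACETS:
        return "index, follow"
    # Everything else (sort orders, price bands, colour/size combos) stays out
    # of the index to protect crawl budget.
    return "noindex, follow"

print(robots_meta({"category": "running-shoes"}))                           # index, follow
print(robots_meta({"category": "running-shoes", "feature": "waterproof"}))  # index, follow
print(robots_meta({"category": "running-shoes", "colour": "red"}))          # noindex, follow
```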

What do we need to do to adapt our SEO strategies?

It’s important not to rush to change your entire SEO strategy because of the Google Search API Leak. The main things we know to be true still stand: genuinely useful content always wins. Keep doing the following:

  1. Continue checking crawling and indexing viability on the site regularly

Sitemap.xml files should be updated regularly so they only contain viable URLs, pages returning 404s should be checked regularly and internal links to them changed or removed, orphaned content should be kept in check, and so on. No content strategy will overturn a lacklustre technical strategy.
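As an illustration of how this kind of check can be automated, here is a minimal sketch that fetches a sitemap.xml and flags URLs that no longer return a 200. The sitemap URL is a placeholder, and a production version would also need to handle sitemap index files, redirects, timeouts and rate limiting.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Minimal sketch: flag sitemap URLs that no longer resolve cleanly.
# Replace SITEMAP_URL with your own sitemap.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url: str) -> list[str]:
    """Return the <loc> URLs listed in a standard sitemap.xml file."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.fromstring(resp.read())
    return [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]

def status_of(url: str) -> int:
    """Return the HTTP status code for a HEAD request to the URL."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

for url in sitemap_urls(SITEMAP_URL):
    code = status_of(url)
    if code != 200:
        print(f"{code}  {url}  <- fix, redirect or drop from the sitemap")
```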

  2. Invest in a good CRUX strategy

UX research is a great way to see where users are getting frustrated on-page. Marrying CRUX with SEO is a brilliant way to make existing traffic work harder, and also to nurture customer relationships from the search results and avoid ‘pogo-sticking’.
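If your CRUX work includes field data from the Chrome UX Report, the CrUX API is one way to spot where real users are having a slow or frustrating experience. This is a minimal sketch only: it assumes you have your own CrUX API key, and the origin used is a placeholder.

```python
import json
import urllib.request

# Minimal sketch, assuming CRUX work includes Chrome UX Report field data.
# Requires your own CrUX API key; the origin below is a placeholder.
API_KEY = "YOUR_CRUX_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

def crux_p75(origin: str, form_factor: str = "PHONE") -> dict:
    """Return 75th-percentile Core Web Vitals for an origin from the CrUX API."""
    body = json.dumps({"origin": origin, "formFactor": form_factor}).encode()
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        metrics = json.load(resp)["record"]["metrics"]
    return {name: m["percentiles"]["p75"] for name, m in metrics.items() if "percentiles" in m}

# Example: spot origins or templates worth pairing with on-page UX research.
print(crux_p75("https://www.example.com"))
```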

  3. Keep doing SERP analysis

Keyword research only gives us so much. Trust proprietary data less, and your own data more. Use the best source of semantic analysis available: look at the SERP. It’s the world’s best crawler and natural language processor, and it’s freely available. What content is surfacing? What issues does it resolve that your content doesn’t? What other questions are searchers asking? Are there videos? What does the ‘People Also Ask’ box say? Is this SERP commercially focussed, or does it aim to resolve a problem? SERP analysis really is the secret to a good SEO strategy.
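SERP analysis is largely manual, but your observations become far more useful when they’re recorded consistently. The sketch below assumes a hypothetical serp_notes.csv you maintain yourself (columns: keyword, serp_feature); it simply tallies which SERP features you’ve observed for each target keyword.

```python
import csv
from collections import Counter, defaultdict

# Illustrative sketch: summarise manual SERP observations per keyword.
# serp_notes.csv is a hypothetical file you maintain yourself, with one row
# per observed SERP feature (e.g. video, People Also Ask, shopping results).
features_by_keyword = defaultdict(Counter)
with open("serp_notes.csv", newline="") as f:
    for row in csv.DictReader(f):
        features_by_keyword[row["keyword"]][row["serp_feature"]] += 1

for keyword, features in features_by_keyword.items():
    print(keyword)
    for feature, count in features.most_common():
        print(f"  {feature}: {count}")
```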

Your SEO strategy post-Google API Leak

The insights the SEO industry has gleaned from the Google API Leak will certainly have an influence on strategy recommendations and techniques. Although many of the fundamental SEO best practices remain the same, we now have more concrete information about how pages are ranked. 

Should you adjust your SEO strategy in light of the Google API Leak? At Re:signal, we can help. Our experienced teams can provide analysis of your ecommerce website, and recommendations for improvement based on the latest SEO information available. Contact us today for a tailored analysis.