Unless you have been living under a rock (or you just don’t talk about SEO as much as I do), you will have heard that on May 27th, thousands of pages of Google Search API documentation were leaked – and the SEO community went wild. May 30th was the real date to note, though, as that is when Google appeared to confirm the leaked documents were genuine.
As you can imagine, 2,500+ pages of API leaks take a long, long, long time to read. We spent days trawling through the documents themselves and the reactions from the SEO community, so that you don’t have to. We also heard directly from the source, Rand Fishkin, at MozCon last week! Here’s our take!
What is an API, and why does it matter that Google’s API was leaked?
Some quick basics before we dive in (skip ahead to our key takeaways if you already know this!):
An API, or Application Programming Interface, is a set of rules that allows applications to communicate with each other. Google’s APIs are crucial because they dictate how software interacts with Google services, including Search. When documentation for these APIs is leaked, it reveals potentially sensitive information about how Google manages and prioritises search results, which can be gold dust for SEO strategists and digital marketers.
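To make that idea concrete, here is a toy sketch in Python of the request/response contract an API defines. The endpoint name and fields are invented purely for illustration – this is not any real Google API:

```python
import json

# A toy "API endpoint": a documented contract saying what a caller sends
# and what it gets back. The function name and fields are invented for
# illustration only -- this is not a real Google API.
def search_api(request: dict) -> str:
    query = request.get("q", "")
    results = [{"title": f"Result for {query}", "rank": 1}]
    return json.dumps({"query": query, "results": results})

# A caller only needs the contract, not the internals, to use the API.
response = json.loads(search_api({"q": "seo leaks"}))
print(response["results"][0]["title"])
```

The leaked documentation matters precisely because it describes contracts like this at scale: the names and types of the attributes Google's internal systems pass around.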
Our key takeaways from the leaks
Let me start by saying, there were over 14,000 ‘ranking factors’ released as part of this leak, so if you take anything from this article, it’s that SEO is not one-size-fits-all. That being said, there were some little nuggets of gold that were particularly interesting.
User data, including clicks, engagement and time on site, is important for SEO
The leaked documents seem to show that metrics such as clicks, time on site, and overall engagement are used by Google to evaluate website quality and relevance.
Your SEO strategy doesn’t stop at writing a great article, or creating a nice web page. A website must deliver valuable content and a positive user experience. Metrics like scroll depth, time on page, and pages per visit may be more important than previously assumed. Was the introduction of ‘engaged sessions’ in GA4 a clue all along?
Click data is used to determine ‘good’ backlinks
It appears that Google uses click data not just for ranking sites but also for assessing the quality of backlinks (yes, they still matter)! High-engagement clicks from reputable sites provide stronger backlink quality signals to Google.
The aim of this, it seems, is to ensure that rankings are influenced by genuinely useful and relevant backlinks, reflecting actual human interest and approval, rather than by manipulative link-building practices.
Google has long denied that backlinks are an important ranking factor at all, so this one is a double hitter!
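Conceptually, click-weighted link evaluation might look something like the sketch below. To be clear, the field names, weights and formula here are entirely hypothetical – the leak names signals, not formulas:

```python
# Hypothetical sketch: weight a backlink's value by the engagement its
# clicks show. Every field name and the formula itself are invented for
# illustration; this is not Google's actual scoring.
def backlink_score(link: dict) -> float:
    authority = link["source_authority"]  # 0..1, credibility of the linking site
    if link["clicks"] == 0:
        return 0.0  # a link nobody follows carries little signal
    engaged_ratio = link["engaged_clicks"] / link["clicks"]
    return authority * engaged_ratio

# A high-authority link with throwaway clicks vs a modest link people engage with.
spammy = {"source_authority": 0.9, "clicks": 1000, "engaged_clicks": 10}
genuine = {"source_authority": 0.6, "clicks": 200, "engaged_clicks": 150}
print(backlink_score(genuine) > backlink_score(spammy))  # True
```

The point of the sketch is the shape of the idea: real human engagement can outweigh raw authority, which is exactly the manipulation-resistance the leak hints at.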
Effective SEO varies by location
Two things mentioned a lot were “localityScore” and “geolocation.” These terms are used to describe how Google assesses and integrates geographical relevance into its ranking factors.
We have always known that an SEO strategy for Australia will not work the same way, nor produce the same results, in America. Localisation of content has always been a focus, but these leaks take it one step further: it appears that Google may assign different weights to specific ranking factors depending on the market.
In a nutshell, Google is confirming that what works in one market, may not work in another. So if you’re recycling the same strategy in all markets, stop!
Site authority is real – and so is author authority
The documents seem to confirm the existence of a site-wide authority metric that Google might use to rank websites, assessing credibility and the quality of content and backlinks. This one isn’t much of a shock, just again validation for us SEOs.
Author authority is interesting though, and is explicitly mentioned throughout the API leaks. Although author bios are already commonplace in any good SEO strategy, it may actually be a factor we are underestimating the breadth of. It seems like when Google is assessing whether a piece of content is trustworthy, it considers things like the author’s overall credibility on a topic, reputation, past work, and their involvement in professional communities.
Rand offered an interesting perspective of his own, suggesting this was going to change the way he hires for his own internal writers, emphasising their topic expertise and online credibility as potentially even more important than their copywriting experience. Are we about to enter the age of content influencers?
Refreshing your content should never be underestimated
According to the leaks, Google might track how frequently a site updates its content and use this as a ranking signal. Regular updates may signal to Google that a site remains relevant and active, potentially maintaining or improving its search ranking position.
As for why this is important, I guess that is fairly logical. Google needs to be able to provide users with answers to their search queries, and if the answers it is serving are out of date then Google loses trust with its users. It’s in Google’s best interest to make sure the information it is serving is fresh.
So what?
Good question. Well, whether you’re a business owner or a marketing professional, the takeaways are fairly similar.
Firstly, think critically. We don’t know how many of the factors in these leaks are even active anymore, so don’t waste your time trying to dissect and win at every single one.
Secondly, Google is a business significantly driven by ad revenue. If the number of people who visit Google on a daily basis drops, so does Google’s ability to drive revenue. Google may have lied (let’s face it) about a lot of the ranking factors we now see confirmed in the API leaks, but one thing that will always be true is that Google has to adapt to meet our (searchers’) needs, and those needs are always changing. People like to consume images? Shopping carousels appear. TikTok is suddenly popular? Videos pop up in the SERPs. Focus on optimising your site based on what your users need, the value you can provide, and their preferred way to consume information. Nail this, and the algorithm will favour you.