The Google API Leak: What Happened, and What's Next for SEO
The recent leak of Google Search API data has sparked controversy (and a fair amount of anxiety) in the world of digital marketing and SEO. With Google’s “secret rules” for search now exposed, many marketers, SEOs, and site owners are left wondering: have we been wasting our time (and money) playing a rigged game? What are the REAL ranking factors for organic search? And most importantly, do we need to change our SEO strategies now that Google’s secrets have been exposed?
In this post, we’ll answer all those questions and more. We’re going to explore the recent Google API leak, the ramifications for online business owners, how it affects ecommerce SEO services, and what you can do to keep your website at the top of Google Search results.
About the author
First, a little bit about me. My name is Bryan Swift. I’m the SEO Director at BlueSwitch, and I’ve been working in SEO and web-based marketing for about a decade. When I first started doing SEO, I didn’t even realize that’s what I was doing. Back then, my goal was to help small businesses expand their reach by creating relevant content that would resonate with their audience.
I didn’t know a backlink or a canonical URL from a hole in the ground, but I knew that the best way to help my clients succeed in search was through the use of well-written content. Pretty soon I realized that the better my content was, the more people Google would show it to. As it turns out, that principle is the heart and backbone of SEO.
Once I figured out that what I was doing was called “SEO,” I started doing research. Then I realized that if I was going to do SEO professionally, then I had to make a decision about my career. Would I be the kind of SEO specialist that uses black hat tricks and exploits to artificially inflate the results? Or would I be the kind of SEO professional that uses legitimate tools to tip the scales in my clients’ favor? I chose the latter.
What I believed then, and still believe now, is that the best tools we have to influence rankings are the tools that anyone, professional or amateur, can use: great content, social signals, reviews, engagement. The Google leak has only served to reinforce those tenets of BlueSwitch SEO.
First things first: What happened?
If you haven’t been following the news about Google’s API data leak, there are a few things you’ll need to know to understand what’s going on here. It all started when a then-anonymous source (who would later reveal himself as Erfan Azimi) contacted a blogger named Rand Fishkin and shared some troubling news.
The source claimed that he’d stumbled onto a treasure trove of super-secret Google API documentation from inside Google’s Search division. You can read the whole story on Rand Fishkin’s blog, SparkToro, or in this great article on The Verge. This documentation seemed to disprove a number of claims that Google has made about Search ranking factors over the years. Here is a list of contradictions and confirmations uncovered in the leak, compiled by a Redditor named Coolsheet:
Google claimed they don't use a "domain authority" metric, but the docs show they totally do - it's called "siteAuthority."
G said clicks don't affect rankings, but there's a whole system called "NavBoost" that uses click data to change search results.
Google denied having a "sandbox" that holds back new sites, but yep, the docs confirm it exists.
G assured us Chrome data isn't used for ranking, but surprise! It is.
The number and diversity of your backlinks still matter a lot.
Having authors with expertise and authority helps.
Putting keywords in your title tag and matching search queries is important.
Google tracks the dates on your pages to determine freshness.
A lot of long-held SEO theories have been validated, so trust your instincts.
Creating great content and promoting it well is still the best approach.
We should experiment more to see what works, rather than just listening to what Google says.
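Several of the on-page factors in the list above, like keywords in your title tag, are easy to spot-check yourself. As a purely illustrative sketch (not a BlueSwitch tool), here’s a small Python script using only the standard library that checks whether every word of a target query appears in a page’s title tag:

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Extract the contents of the first <title> tag in an HTML document."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title> we encounter
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def title_matches_query(html: str, query: str) -> bool:
    """Return True if every word of the query appears in the page title."""
    parser = TitleParser()
    parser.feed(html)
    title_words = parser.title.lower().split()
    return all(word.lower() in title_words for word in query.split())


# Hypothetical example page, for illustration only
page = "<html><head><title>Ecommerce SEO Services in NYC</title></head></html>"
print(title_matches_query(page, "ecommerce SEO"))  # True
```

A check like this won’t tell you how Google weighs the title, only whether the basic hygiene is in place; pair it with observation of actual ranking results.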
Why is this Google API leak a big deal?
SEO is a high-stakes game in which potentially millions of dollars ride on which sites are the top-ranked results for certain keywords. Professional SEOs spend a lot of time and energy poring over Google’s public search documentation to identify the most important ranking factors, and then pass that information along to their clients. Understanding ranking factors is the key to optimizing a website and pushing it up to the #1 spot.
According to Google, everything we need to know about Google’s ranking factors can be found in their documentation on Google Search Central, which is like an instruction manual for SEO professionals. Visit Search Central and you’ll see, right there on the main page: “How to get your website on Google Search.”
One would assume that if we follow Google’s instructions, we’ll be able to push ANY website to the top of Google Search results. But what many SEOs (including me) have noticed over the years is that ranking a website isn’t always as easy as Google makes it out to be.
How does the Google API leak affect SEO services?
Anyone who’s done SEO for a significant amount of time can tell you that when it comes to Search Engine Optimization, what Google SAYS is important and what’s ACTUALLY important are two completely different things.
Luckily, I’ve always considered Google’s Search Central to be merely a starting point for SEO, not an endpoint. In a post-leak internet, I can still use Search Central to guide my team’s optimization efforts, but at the end of the day, we will continue to rely first and foremost on two factors to steer our SEO campaigns: attention to detail and pattern recognition.
If anything, the Google API leak proves that my team and I have been doing SEO the RIGHT way all along. We already knew that SEO campaigns are not one-size-fits-all. We figured out a long time ago that if we have five different websites, and we do the exact same optimizations on all five sites, we’re going to get five very different sets of results. That’s why data and observation are such important factors in an SEO campaign, but the most important factor is, and will always be, the HUMAN element of SEO.
How do we know which optimizations will have the most impact?
To succeed in SEO, you can’t be afraid to take risks. Making changes to page content is a risk. Sometimes the change results in a win. Sometimes the change results in a loss. But no matter what the outcome is, as long as we pay attention and learn from the results, then we’ve gained valuable information that will eventually help us outrank the competition.
We make changes to content, then observe the results, then make adjustments based on those results, then we repeat the process over and over. If we do something that doesn’t work, then we try something different. If we do something that works, we examine it closely to figure out WHY it works, then we apply what we’ve learned to the next set of changes. Rolling optimizations help us continually refine and improve our content, a little bit at a time, until we find the winning formula.
How will BlueSwitch SEO campaigns be affected by the Google API leak?
BlueSwitch SEO campaigns aren’t based on a list of rules that was published ten years ago. We perform optimizations based on data, observation, insights, and experience. And that’s the way it should be. Google Search Central doesn’t optimize my clients’ websites, my team and I do. An algorithm doesn’t tell us which optimizations to perform. Our observations and experience do.
So while the information in the Google leak is a little shocking, it doesn’t disprove or nullify anything that my team and I have been doing for our clients. Quite the opposite. It turns out that BlueSwitch has been doing better SEO than a large percentage of the SEO industry.
How can we use what we’ve learned from the Google API leak?
One glaring factor that stood out to me as I researched this story was that a lot of things that for years we’ve only suspected, are actually true. Here’s a short list of principles that were reinforced by the Google API leak:
E-E-A-T is even more important than we thought – If you want your page to rank, then the content must illustrate these qualities: Experience, Expertise, Authority, and Trustworthiness, or E-E-A-T. Every piece of content that a brand pushes out should illustrate one or more of the E-E-A-T tenets. If every page of your site flexes these key attributes, your organic presence will benefit from it.
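One concrete way to surface authorship on a page is schema.org Article markup with an explicit author. As an illustrative sketch (all names and dates below are placeholders, not a prescription), here’s a small Python helper that builds that JSON-LD:

```python
import json


def article_jsonld(headline, author_name, author_title, date_published):
    """Build schema.org Article JSON-LD that names the author explicitly.

    All argument values used below are illustrative placeholders.
    """
    return json.dumps(
        {
            "@context": "https://schema.org",
            "@type": "Article",
            "headline": headline,
            "datePublished": date_published,  # ISO 8601 date string
            "author": {
                "@type": "Person",
                "name": author_name,
                "jobTitle": author_title,
            },
        },
        indent=2,
    )


snippet = article_jsonld(
    "The Google API Leak: What Happened, and What's Next for SEO",
    "Bryan Swift",
    "SEO Director",
    "2024-06-01",  # placeholder date
)
# Embed in the page head as: <script type="application/ld+json">…</script>
print(snippet)
```

Structured data like this doesn’t guarantee anything by itself; it simply makes the author’s identity and credentials machine-readable, which supports the E-E-A-T signals described above.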
Helpful content is the key to success. The API leak has shown that Google does not like content that’s written “for machines.” When publishing content, ask yourself: Is this content helpful? Does it answer important questions? Is it written for humans, or for machines? Is it worth sharing? Is it easy to understand? Is it entertaining? And most importantly, what can I do to make it even better?
New domain? Be patient. Domains with more history are trusted more than brand-new domains. A brand-new domain is “sandboxed” (i.e., held back in rankings) until Google is sure that it’s a reputable, trustworthy domain and not a fraudulent site. So if you’re a new brand with a new website and a new domain, you can’t rush to greatness. Focus on building your audience by publishing great content, then promoting that content on legitimate organic channels.
During times of crisis, certain websites are trusted more than others. During COVID, Google whitelisted certain high-authority domains (like CDC.gov) so that they would rank above other less-reputable, but more-optimized sites. In theory, this is bad, because it means that the playing field isn’t level. But in practice, that policy probably saved at least a few lives during the crisis.
Traffic and engagement beget more traffic and engagement. The use of NavBoost shows us that the more people who click on a website, the more likely that website is to be shown to other people. And the best way to get people to click on and engage with your website is to publish GREAT CONTENT. Again, we come back to this tenet that the best way to outrank the competition is to produce better, more helpful content that people will want to read, engage with, and share.
Local brand, local content. NavBoost tracks user location to measure local relevance. That means that the more localized users that engage with a post, the more likely that post will be shown to other users in that area. So if you’re a New York City brand, and you want to grow your local audience, publish content that’s optimized for local relevance. In other words, if you want to rank in NYC, write content that appeals to PEOPLE IN NYC.
In the end, the best content always wins
The Google API leak is important because it reinforced many ideas that, for a long time, were theoretical at best. The leak doesn’t prove that Google has been “cheating” this whole time, and it certainly doesn’t prove that we’ve been wasting our time with SEO campaigns. If anything, the Google leak just reinforces what we already knew.
We can’t cheat our way to the #1 spot. We can’t use tricks and exploits to get Google to like us. We can’t take shortcuts and expect to win. Publishing “decent” content isn’t enough; we have to publish great content. Content that’s written by humans, for humans, will always win out eventually. Helpful content is king. And when it comes to your organic presence, nothing, and I mean NOTHING, is more valuable than Experience, Expertise, Authority, and Trustworthiness.
Want to learn more about BlueSwitch SEO? Visit our SEO service page, or contact us to schedule a discovery call today.