It’s been a few weeks since Google began rolling out its latest major search algorithm update, BERT, and many members of the SEM community still have questions about what this change means for search engine optimization and content marketing. For good reason, too: the sheer volume of speculation and “expert” guides published just days after the update was announced has made separating algorithmic fact from fiction difficult.

 

Thankfully, we’ve done most of the heavy lifting for you and put together a no-nonsense guide breaking down the key components of this update:

 

What is BERT?

In his blog post announcing the algorithm change, Google’s Vice President of Search, Pandu Nayak, called this update “…one of the biggest leaps forward in the history of Search.” But what exactly is it, and why is it so important?

 

Before we dive in, it’s important to acknowledge that BERT, which stands for Bidirectional Encoder Representations from Transformers, has actually been around for at least a year now. It was first unveiled in October 2018 in an academic paper from the Google AI Language research team, “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,” and then announced as an open-source contribution on the Google AI Blog the following month.

 

The AI Blog post described BERT as a neural network-based technique for natural language processing (NLP) pre-training, which sounds intense but is more straightforward than it appears. In layman’s terms, BERT is a method for training Google to understand language more the way humans do, or, to put it even more simply, it’s a way to help Google identify the context of words in search queries.
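
To make that idea concrete, here’s a minimal sketch of what “understanding words in context” looks like, using the open-source BERT model Google released, loaded through the third-party Hugging Face transformers library (an illustration under those assumptions; Google’s production search systems are not public). The same word gets a different internal representation depending on the words around it:

```python
# A minimal sketch: the same word gets different BERT representations in
# different contexts. Uses the open-source bert-base-uncased checkpoint via
# Hugging Face's `transformers` library, not Google's production system.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's hidden-state vector for `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    position = tokens.index(word)  # assumes `word` is a single BERT token
    return outputs.last_hidden_state[0, position]

# "bank" in a river context vs. a money context:
river = embed_word("we sat on the bank of the river", "bank")
money = embed_word("i deposited the check at the bank", "bank")
similarity = torch.cosine_similarity(river, money, dim=0).item()
print(f"similarity across contexts: {similarity:.2f}")
```

An older, context-free word embedding would assign “bank” the exact same vector in both sentences; here the two vectors differ, which is precisely the property that lets a BERT-based system tell the two senses apart.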

 

For example, prior to the BERT update, googling the query “can you get medicine for someone pharmacy” would provide you with a lot of generic results about the prescription-filling process. Not exactly what you were looking for, right? Now, post-update, googling this exact query gives you results about whether or not you can pick up a friend or family member’s prescription, indicating that Google now correctly understands how important the word “someone” is to the intent of this query.

BERT Algorithm Example 1

Similarly, before BERT was rolled out, googling the query “math practice books for adults” would provide you with a lot of results for math books in the “Young Adult” category. Now BERT understands that those young-adult results are being matched out of context and instead gives you a more helpful result: math books for adults, i.e., those over the age of 18.

BERT Algorithm Example 2

 

What makes BERT so groundbreaking?

The answer to this question lies in the “B” of BERT: bidirectional, which refers to the way BERT trains language models differently from previous NLP pre-training techniques.

 

Prior to BERT, the standard way of training language models was on an ordered sequence of words (either left-to-right, or left-to-right combined with right-to-left), but now language models can be trained bidirectionally. This breakthrough in NLP means a language model can look at the entire set of words in a search query at once, allowing it to understand each word’s context based on all the words in the query, not just the word immediately before or after it.
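
If you want to see that bidirectional training objective in action, the open-source BERT checkpoint can be queried with the masked-word task described in the BERT paper. Below is a small sketch using Hugging Face’s fill-mask pipeline (an assumed setup for illustration, not how Google serves search):

```python
# Masked language modeling: BERT predicts a hidden word from context on
# BOTH sides of it. A left-to-right model would have to guess the blank
# before ever reading "pharmacy".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

predictions = fill_mask("can you pick up a [MASK] for someone else at the pharmacy?")
for p in predictions[:3]:  # top three candidate words
    print(f"{p['token_str']:>15}  score={p['score']:.3f}")
```

The words after the blank are just as informative to the model as the words before it, and that is the “bidirectional” part of BERT’s name.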

 

Which search queries does BERT affect?

At this point, BERT is not being applied to every search made on Google, just organic search queries in U.S. English, and even then, not every search is affected. By Google’s own estimate, BERT is currently used to enhance only 10% of U.S. English search queries, “particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning,” said Nayak in the official announcement post. That said, the company plans to roll BERT out to more countries and apply it to more searches in the coming months.

 

Outside the realm of organic search, Google stated that BERT is also being used to improve featured snippets in the two dozen countries where they’re available. Details on exactly how BERT is being used to improve featured snippets aren’t readily available, but I think it’s safe to assume it’s similar to the way the algorithm improves organic search: by better understanding the context of the query.

 

What does BERT mean for SEO?

According to Google’s public Search Liaison, Danny Sullivan, not a whole lot. When he was asked on Twitter about ways to optimize content for the new update, he responded, “there’s nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.”

 

This update is more about making it easier for people to use natural language in Google search and less about altering the way content is ranked on the search engine results page (SERP). As a result, Google’s advice on how to rank well remains unchanged: produce quality, user-centered content that satisfies searchers’ intent.

 

If anything, this algorithm update will make most search engine optimizers’ jobs easier, since Google is now better at interpreting language the way real people do. Just keep doing what you should be doing, and it will pay dividends.

 

Is BERT responsible for ranking declines?

Since this update is only meant to affect the way Google processes language, not SERP rankings, you may be wondering why you’ve heard about, or maybe even experienced, drops in organic traffic and ranking since it started rolling out last month. Are these drops due to BERT?

 

Google Webmaster Trends Analyst John Mueller says no.

 

In the latest Webmaster Hangout, Mueller responded to a publisher’s question about a 40% decline in his site’s traffic, saying, “… this would not be from BERT. The BERT changes are particularly about understanding user queries better and around being able to understand text better in general.

So it’s not that we would say that suddenly your page is less relevant. But rather with BERT we would try to understand does this question that someone is asking us, does it match this website best. And usually that’s for more complicated questions.

The thing to keep in mind is that we make changes all the time. We’ve made several core algorithm changes as well over the… last month or so, which kind of overlap with the rollout of BERT as well.”

 

Going back to Nayak’s announcement, I think it’s important to mention an interesting phenomenon he brings up: “keyword-ese,” his term for the unnaturally phrased strings of words people type when they believe Google won’t understand the way they would normally phrase their question.

 

Keyword-ese is a search engine optimizer’s bread and butter, but it doesn’t make for a great search experience for the average user, which is why Google is trying to eliminate the need for it entirely. Thanks to BERT, keyword-ese may be going the way of the dinosaurs sooner rather than later, making search more intuitive and natural for everyone involved.

