
What is Google’s BERT Update? – What it Means for SEO

Google’s BERT update changes the way the search engine handles naturally phrased search queries. Of the billions of searches made daily, many are too nuanced for the search engine to fully understand. This major update is designed to let it handle queries it can’t anticipate and to understand complex phrases it previously could not parse.

So, what is BERT and what does it mean for Google search?

Google’s update is meant to help it process natural language using a model called Bidirectional Encoder Representations from Transformers, or BERT.

The BERT update is meant to make headway in the science of language understanding by applying machine learning to a full body of text – in this case, a Google search. Google claims it represents the largest update to its search system in at least five years, and the biggest change to the search algorithm since the search giant’s famous 2015 RankBrain update.

The BERT search query algorithm processes words entered into Google in relation to all the other words in the phrase, rather than one by one in order. The model is then applied to both ranking and featured snippet results so that it can more accurately match results to users. Google claims that the BERT model will affect 1 in 10 searches in the U.S., and that its ability to parse conversational language means it can now understand the context that prepositions like “for” and “to” provide to a phrase.

More to the point, the BERT model for language processing can contextualize each word in a phrase – in relation to every other word in the phrase – simultaneously. This sets it apart from natural language processing (NLP) models that contextualize words in a single direction, left to right or right to left.
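To make that concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint (an illustration of the technique only, not Google’s production search system). It shows that BERT assigns the same word a different vector depending on the words around it:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    """Return the contextual vector BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one vector per token
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]  # assumes `word` is a single wordpiece

# The same word, "bank", read in two different contexts:
v_river = word_vector("he sat on the bank of the river", "bank")
v_money = word_vector("she deposited cash at the bank", "bank")

# A static, non-contextual embedding would give identical vectors (1.0);
# BERT's contextual vectors differ because the surrounding words differ.
print(torch.cosine_similarity(v_river, v_money, dim=0))
```

Each occurrence of “bank” is encoded together with every other word in its sentence at once, which is exactly the all-at-once, bidirectional contextualization described above.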

Part of the way it does this is by “masking” some of the words in the input text and then having the BERT model bidirectionally predict the masked word. This way, it can deduce the meaning of each word based on language context and avoid confusion between synonymous words.
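Masking can be demonstrated with the same public checkpoint through the transformers fill-mask pipeline (the sentence and model here are stand-ins, not Google’s actual training setup):

```python
from transformers import pipeline

# BERT predicts a hidden word from the words on BOTH sides of it at once.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The traveler needs a [MASK] to enter the country."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

Candidates like “visa” or “passport” should score highly here because the model weighs the full surrounding context, not just the words to the left of the blank.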

“This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.” – Google Fellow and Vice President of Search, Pandu Nayak

The algorithm can process the meaning of keywords contextually by cross-referencing search queries against the data it derived from its training text – in this case, Wikipedia. By feeding the BERT model a large data set of training text, Google enables it to better understand never-before-seen text, including search phrases.

When Google announced the BERT update, its Fellow and Vice President of Search, Pandu Nayak, explained how the AI-based algorithm could affect a specific query like “2019 brazil traveler to usa need a visa.”

Previously, in a search like this, the word “to” may not have been given significance. But BERT’s understanding of phrasing and context helps the search engine understand that this query is specifically about a Brazilian traveling to the U.S., and not just about visas for those two countries in general.

This example demonstrates how these advances in language processing are helping Google better understand what people intend when they enter queries into the search bar or speak out voice commands to their digital assistants.
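A rough way to see why the preposition matters: the two opposite readings of this query contain exactly the same words, so any model that ignores word order scores them as identical, while BERT’s encoding distinguishes them. This sketch again uses the public bert-base-uncased checkpoint purely as a stand-in:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(query):
    """Mean-pooled BERT vector for a whole query."""
    inputs = tokenizer(query, return_tensors="pt")
    with torch.no_grad():
        return model(**inputs).last_hidden_state[0].mean(dim=0)

# Same bag of words, opposite meaning: who is traveling where?
a = embed("2019 brazil traveler to usa need a visa")
b = embed("2019 usa traveler to brazil need a visa")

# An order-insensitive model would score these as identical (1.0);
# BERT's order- and context-sensitive encoding keeps them apart (< 1.0).
print(torch.cosine_similarity(a, b, dim=0))
```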

What does the Google BERT update mean for SEO strategies?

How does the BERT NLP model affect search engine optimization techniques for webmasters and SEO experts?

Because BERT aims mainly to address searcher intent, it likely won’t change much for SEO. It doesn’t represent a change in Google’s ranking factors, but instead aims to more accurately determine which results appear with which queries.

This new update does not replace RankBrain; instead, it fills gaps in Google’s language processing abilities by handling phrases that RankBrain is not sophisticated enough to parse.

Because the update is similar in intent to RankBrain, the advice for optimization may be the same as what Google suggested for RankBrain: simply write content for humans that aims to match what users want. Google claimed that traditional SEO adjustments would not be enough. Instead, the best approach is to build content from the ground up to be authoritative and accurate, and to focus on what audiences want.

Many traditional search engine strategies focus on metadata creation, linking, and content optimization/keyword density. With algorithm changes like BERT, the best strategy is to focus on site content that serves people and helps them reach the goal of their original search. Creating content that is built around topics rather than desired keyword rankings is likely the best way to optimize for BERT.

For web pages that see a drop in long-tail keyword rankings or clicks, this could be a sign that the content doesn’t match those queries as well as Google originally thought. It could also suggest that meta title tags and content written primarily for search algorithms may inadvertently suffer. Search engines have long ignored prepositions – or “stop words” – like “to,” “for,” and “in,” but just as these words carry meaning for humans, Google’s advances in NLP suggest they could be important clues for search engines when found in page titles and H-tags.
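To illustrate what classic stop-word removal throws away, here is a toy sketch (the stop-word list is invented for the example; real search pipelines varied):

```python
# Toy stop-word list, invented for illustration only.
STOP_WORDS = {"a", "an", "the", "to", "for", "in", "of"}

def strip_stop_words(query):
    """Return the set of query terms left after stop-word removal."""
    return {w for w in query.lower().split() if w not in STOP_WORDS}

q1 = "brazil traveler to usa need visa"
q2 = "usa traveler to brazil need visa"

# Dropping "to" (and treating queries as word sets) collapses two
# opposite questions into exactly the same set of terms.
print(strip_stop_words(q1) == strip_stop_words(q2))  # True
```

This is the information BERT preserves, and it’s why prepositions in titles and H-tags may deserve more care than they used to get.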

With BERT’s reliance on natural language and textual context, paying attention to the surrounding context of keywords could be one way to optimize for this new model. Creating on-page content that focuses on topics rather than keywords will keep site content coherent for users. Additionally, site owners should focus on building quality content that meets Google’s best practice guidelines and displays a high level of E-A-T (Expertise, Authoritativeness, Trustworthiness).

Since the BERT model also applies to featured snippets, content writing strategies with these snippets in mind are as important as ever. There’s no direct strategy for winning featured-snippet appearances on search results pages, but the odds improve for pages with clearly structured content, helpful section headings, and content that aims to answer questions or give step-by-step guidance.

With BERT’s focus on NLP, it makes sense that pages with niche and detailed advice would be more suited to complex, atypical, and question-based searches.

Google has long made it its goal to provide better, more accurate search results to its users. Because of this, its algorithm has become more and more sophisticated in its ability to understand language and searcher intent – first with RankBrain and now with BERT. By focusing on NLP to help the search engine understand never-before-seen phrases and queries, the BERT update brings Google closer to that goal.
