With the release of BERT, the biggest upgrade to Google’s algorithm in years, search has seen a revolution. The update came out in fall 2019 and has already rolled out worldwide, affecting search results in dozens of languages. So, how will it impact your website?

It’s important to keep up with Google’s algorithm updates so you can get the best results in the rankings. What did Google BERT change? How did it affect SEO? Let’s go over Google’s latest update, how it uses machine learning to better understand queries, and what, if anything, you can do to optimize for it.

What is Google BERT?

BERT is, of course, an acronym: it stands for Bidirectional Encoder Representations from Transformers. It is the latest major update to Google’s search algorithm and one of the biggest in a long time. While it rolled out to Search in October 2019, the model was in development for at least a year before that; Google open-sourced the research version in November 2018.

The purpose of BERT is to help Google Search better interpret what its users are asking. This is especially necessary for conversational queries, such as those from voice search, which are wordy and lean on prepositions. Before, the best way to search with Google was to keep your phrases simple. Now, it can understand all sorts of inputs.

[Image: Google RankBrain dissecting a search term. Source: Perficient]

It does this by using a neural network, a highly complex, brain-like system that can recognize patterns in data. With trillions of past searches to analyze, BERT is able to decode users’ queries like never before.

It’s important to note that BERT is an extension of Google Search, not a replacement. Some queries, like those that are longer and more conversational, might use BERT or a combination of it and other algorithms such as RankBrain. Some simpler searches may not use it at all. Right now, it affects about 10% of searches.

It’s just one small piece in the huge puzzle that is Google Search.

Why BERT?

Google has historically struggled with certain types of queries. The algorithm has gotten smarter every year, but it has never fully understood context or the relationships signaled by words like prepositions.

Most users got around this by keeping their queries short and sticking to simple words. But with projections saying that 50% of all searches will soon come via voice, things are getting more difficult for Google.

Unlike the fragmented questions typed on computers and phones, voice searches tend to be complete sentences. They are usually longer and use complex grammar that would easily confuse the old systems. Consequently, it became necessary for Google to keep up with the wave of conversational searches taking over the engine.

[Image: BERT explained, understanding conversational search queries. Source: Moz]

Google is also always looking for ways to make their system more accurate, complex as their algorithm already is. They were already making use of machine learning for RankBrain, so it was only a matter of time before they dove into natural language processing to improve their systems.

On that note, let’s take a look at how exactly Google BERT functions.

How Does Google BERT Work?

BERT is a complicated beast, built on top of an even more complex system called Transformer. It would be difficult to explain in depth how exactly it functions without writing an entire research paper.

It’s a lot easier to break these difficult concepts down to their basics and explain in simpler terms how Google BERT works. Basically, it uses neural networks and machine learning to teach itself to better understand user queries and the context behind the words in them.

As mentioned before, BERT stands for Bidirectional Encoder Representations from Transformers. “Bidirectional Encoder Representations” means it reads the entire set of words in an input to understand how they interact with each other. A “Transformer” is a machine learning architecture designed for processing natural language.

Confused? Let’s explain what all of this means.

Neural Networks and Machine Learning

Machine learning aims to give computers the ability to perform high-level tasks that were previously only possible for humans. It does this by feeding a system huge amounts of data.

For instance, you might have a self-driving car that needs to accurately recognize stop signs (and know what isn’t a stop sign). If you feed its system thousands of stop sign images, it will eventually learn to recognize them itself. At first, it might only be able to identify clear, close-up images, or it might get confused by pictures of similar objects. But with enough data, the system will learn to accurately identify stop signs in any image, even ones it has never encountered before.

[Image: machine learning allows systems to recognize images of stop signs like this one.]
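To make that idea concrete, here’s a minimal sketch of the train-then-generalize loop using Python’s scikit-learn library. The data is synthetic (random feature vectors standing in for image pixels), so treat it as an illustration of the concept rather than a real stop sign detector:

```python
# A toy illustration of machine learning, not a real stop sign detector:
# the "images" here are synthetic feature vectors generated for the demo.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 2,000 labeled examples: label 1 = "stop sign", label 0 = "not a stop sign"
X, y = make_classification(n_samples=2000, n_features=64, random_state=42)

# Hold back a quarter of the examples that the model will never train on
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# A small neural network learns the pattern purely from the examples
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=42)
model.fit(X_train, y_train)

# With enough data, it can classify inputs it has never encountered before
print(f"Accuracy on unseen examples: {model.score(X_test, y_test):.2%}")
```

The key point is the last line: the system is judged on examples it never saw during training, which is exactly what lets the self-driving car recognize a stop sign it has never photographed before.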

Neural networks are a subtype of machine learning that works similarly to the neurons in a brain. They gradually learn through example. Unlike simpler machine learning models, however, these networks can become advanced enough to start making predictions and finding correlations on their own. BERT, for example, can draw on the billions of searches Google handles every day to learn more and more about what we’re looking for.

BERT is built on the Transformer encoder, a neural network architecture primarily used for natural language processing (NLP), a field dedicated to developing AI that can truly understand what humans are trying to say.

The best example for this is of course Google Search; it needs to understand what you’re asking for and output results that are relevant. That’s the entire point of BERT, and its advances have had a huge impact not just in Google’s algorithm but in NLP as a whole.
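If you want to poke at this yourself, the research version of BERT that Google open-sourced in 2018 is freely available. Here’s a minimal sketch using the open-source Hugging Face transformers library (note that this is the public research model, not Google’s production search system):

```python
# A minimal sketch of running a query through the open-source BERT model
# via the Hugging Face `transformers` library (pip install transformers torch).
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Encode a conversational query and run it through the Transformer encoder
inputs = tokenizer("what computer should I buy for college", return_tensors="pt")
outputs = model(**inputs)

# Every token comes back as a 768-dimensional vector that reflects the
# whole sentence around it, not just the word in isolation
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 9, 768])
```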

Context Sensitivity

BERT’s biggest breakthrough is that it reads an input bidirectionally. Most earlier natural language models read left to right, right to left, or both in sequence. This meant they could only understand how a word interacted with the words right next to it.

“Bidirectional” here doesn’t mean that BERT reads both right-to-left and left-to-right, but rather that it processes the entire input as a whole to understand what every word means put together.

For instance, take the conversational search “what computer should I buy for college”. Before, Google would have struggled with the prepositions and long-winded nature of this question. It might have picked out the nouns “college” and “computer”, or relied on articles that used parts of that keyphrase. But you would have been better off just searching “best college computer” to get cleaner, more relevant results.

Now, the system takes into account every word in the sentence to understand the context of your search. It also looks at how each word interacts with the others, even those that aren’t right next to it.
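To see what this context sensitivity looks like in practice, here’s a small sketch, again using the open-source model via Hugging Face transformers rather than Google’s production stack. It shows that BERT gives the same word a different vector depending on the sentence around it:

```python
# Demonstrating context sensitivity with the open-source BERT model:
# the same word receives a different representation in different sentences.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state[0]
    return hidden_states[tokens.index(word)]

# "bank" appears in both sentences but means two different things
river = word_vector("i sat down on the bank of the river", "bank")
money = word_vector("i deposited the money at the bank", "bank")

# Cosine similarity comes out noticeably below 1.0: same word,
# different vector, because BERT reads the whole sentence at once
print(torch.cosine_similarity(river, money, dim=0).item())
```

Older static word embeddings such as word2vec would hand “bank” the exact same vector in both sentences, which is precisely the limitation BERT’s bidirectional reading overcomes.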

That’s not to say it’s perfect. BERT still struggles with negatives (no, not, etc.) and often seems to ignore them entirely. It also doesn’t have nearly the same understanding of context and implications as a human. There’s still a long way to go.

Language processing has always been difficult for machines, but BERT has made huge strides in the NLP field.

Optimizing for Google BERT

There aren’t any hard-and-fast rules for BERT optimization. Unlike with other algorithm updates, you won’t be able to find any easy-to-follow SEO checklist. If you do find one, its creators are probably making a lot of false assumptions.

The only thing you can do is continue to focus on your content and examine user intent to better give people what they’re looking for. Instead of looking for a simple BERT checklist, start researching content marketing and better blogging strategies.

That said, if you’re curious how BERT has impacted SEO, here’s what you need to know to improve your content focus.

Conversational Queries

BERT was created to better understand how people use search engines and the actual intent behind their queries. Conversational queries, the long, complete sentences with proper grammar, are exactly what it was designed to decipher.

This particularly affects mobile users, the biggest demographic performing voice searches, which tend toward long-winded, conversational wording. Someone speaking out loud is more likely to say “How do I get to Phoenix from where I am now?” than “Phoenix Arizona directions”.

What does this mean for you? It means exact-match keywords are much less important. They are still a factor, of course, but Google has been moving away from strict keyword matching for the past ten years to combat keyword abuse.

[Image: BERT explained, the move away from keyword-focused SEO.]

However, keyword research is still very much useful to you. You can use it to understand what your demographic is searching for, especially on mobile, and which conversational searches dominate your field of interest.

And keywords will still impact your SEO. Google is just a lot better at understanding searches that don’t use the kinds of keywords you were optimizing for in the past. BERT was simply the final nail in the coffin for that approach.

Content is King

Google’s primary goal has always been to match users with good content, so it should be every website’s aim to create it for them. The algorithm has been steadily moving away from artificial ways of inflating your SEO and toward promoting high-quality articles. Now it’s more important than ever to focus on content.

Now that search is extra sensitive to context, user intent is more relevant than the actual keywords used. Content marketing remains the best way to improve your SEO; just modify your strategy to focus less on keywords and more on addressing intent.

Consistently providing quality, accurate information is the closest you can get to “optimizing for BERT”. If your current content strategy is shoving as many high-ranking keyword phrases into your articles as possible, regardless of how awkward or out of place they sound, now’s the time to rethink it. Don’t drop keywords entirely, but they’re no longer the most important thing.

This update also impacts your featured snippets, not just the content of your website, so keep an eye out for that.

BERT in Non-English Languages

BERT originally launched solely for English searches. A few months later, in December 2019, it rolled out fully in 72 languages. So even if you blog in a language other than English, this update will more than likely affect you. It probably already has.

There’s no news on whether BERT will be updated to support additional languages any time soon, but it’s possible that it will affect even more searches worldwide in the future.

So how did Google manage to do this so fast? BERT is actually so advanced that it can, to an extent, take the patterns it learns from one language and apply them to any number of others. So even as it continues learning from these 72 languages, English searches will improve as well.

The impact on non-English searches is probably not quite as large as 10%, but it will definitely be a lasting one that only continues growing as BERT gets smarter.

Demystifying Google’s Latest Update

With the Google BERT update, there aren’t any quick tricks to gaining free SEO. There are no checklists to follow or easy changes you can make to boost your rankings; good content is automatically rewarded by the algorithm. So the best thing you can do is create more of it.

The new neural network has made Google better than ever at understanding conversational queries and matching them with relevant content. If you’re already putting out awesome articles, great! Lackluster websites will be hit the hardest by BERT, so you should analyze your content marketing strategy to see if what you’re creating is really relevant to your user base.

Focus on articles that are well-written, appeal to your demographics’ interests, and exist to give people access to genuinely helpful information. This way, you’ll be doing the best you can to optimize for BERT.

How has your traffic been impacted since BERT was launched in October 2019? Was it a positive or negative effect? Let us know about any changes you’ve noticed in the comments.