Google BERT Update and What You Should Know

The latest update from Google has the world of search engine optimization in a frenzy. Google considers this update to be “one of the biggest leaps forward in the history of Search.” But according to recent chatter, SEOs feel its impact is minimal.

What is going on?

What is BERT?

BERT is an open-sourced “neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers” introduced by Google last year. At MarketMuse, we use BERT internally for several things. Not knowing much about it, I asked our Chief Product Officer and Co-Founder, Jeff Coyle, to provide some background.

“BERT is a technology to generate “contextualized” word embeddings/vectors, which is its biggest advantage but also its biggest disadvantage: it is very compute-intensive at inference time, meaning that if you want to use it in production at scale, it can become costly.

Whereas some word vectors (word2vec) are pre-calculated for each word/phrase and saved in a database for retrieval whenever needed, with BERT the vectors must be computed fresh every time.

Here’s an example.

Sentence 1: A sentence is a linguistic structure made up of words and syntactic rules.

Sentence 2: The judge gave a hard sentence to the criminal.

If I want a computer to get a good sense of the meaning of “sentence” in both sentences, with word2vec I simply retrieve the pre-calculated vector from the database and use that same vector in both cases.

With BERT, being a technology that creates “contextualized” vectors, I will have to feed both sentences into the BERT network (that means performing thousands of FLOPs, as BERT is a deep neural network with many layers and neurons). BERT will generate two very different vectors for the word “sentence” as it appears in two very different contexts (i.e., contextualized vectors).

This capability of contextualizing makes BERT and its related models state-of-the-art on many natural language understanding tasks. But it also makes them compute-intensive and hard to bring into production. That’s why it is powerful for a question-answering or rewrite service but has limitations.”
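Jeff’s static-versus-contextual distinction can be sketched in a toy example. This is purely illustrative Python, not real word2vec or BERT: the “static” table stands in for a pre-computed word2vec database, and the “contextual” function just blends a word’s vector with its sentence neighbors to mimic context-dependence.

```python
import random

random.seed(0)

vocab = ["a", "sentence", "is", "linguistic", "structure",
         "the", "judge", "gave", "hard", "to", "criminal"]

# Toy static table (word2vec-style): one fixed 4-dim vector per word,
# computed once up front and simply retrieved at query time.
static = {w: [random.uniform(-1, 1) for _ in range(4)] for w in vocab}

def static_vector(word, context):
    # Context is ignored: "sentence" always retrieves the same vector.
    return static[word]

def contextual_vector(word, context):
    # Toy stand-in for a contextual model: blend the word's vector with
    # the average of its neighbors' vectors, so the same word comes out
    # differently in different sentences. (Real BERT runs the whole
    # sentence through a deep network instead.)
    neighbors = [static[w] for w in context if w in static and w != word]
    avg = [sum(dims) / len(neighbors) for dims in zip(*neighbors)]
    return [0.5 * a + 0.5 * b for a, b in zip(static[word], avg)]

s1 = "a sentence is a linguistic structure".split()
s2 = "the judge gave a hard sentence to the criminal".split()

print(static_vector("sentence", s1) == static_vector("sentence", s2))      # True
print(contextual_vector("sentence", s1) == contextual_vector("sentence", s2))  # False
```

The static lookup returns an identical vector for “sentence” in both contexts; the contextual function does not, which is the whole point, and also why the contextual approach costs compute on every query instead of once up front.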

The computationally intensive requirements mean that its application to search queries will be limited, for now. No doubt there will be improvements in the near future that reduce the algorithm’s processing requirements.

How does BERT help searchers?

The Google BERT update means searchers can get better results from longer, conversational-style queries. Now there’s less need to resort to “keyword-ese” queries – typing strings you think the search engine will understand, even if that’s not how you would normally ask a question.

The Impact of Google’s BERT Update

The impact of this algorithm update varies depending on who you ask. According to Google, it’s “the biggest leap forward in the past five years” with 10% of search results affected by the change.

Providers of keyword tools, however, feel the impact will be minimal. They’re not seeing significant changes because they don’t track long-tail, conversational queries. They monitor mostly shorter phrases and head terms that don’t require BERT’s natural language processing capacity.

How Does Google’s BERT Affect Content Marketers?

While the Google BERT update directly affects searchers using Google, it’s a different story for content marketers. Just because Google can better understand context doesn’t mean you should go out and create thousands of pages targeting the long tail.

This update provides further incentive to continue writing well-organized, topically-rich, and comprehensive content. 

Google BERT Misinformation

With Google pushing BERT as being a major update, it shouldn’t come as a surprise how quickly misinformation has spread. I’ve already started seeing press releases from SEO firms claiming to be skilled in BERT optimization.

Partial screenshot of press release related to Google BERT, explaining the virtues of quality content while filled with grammatical errors!

While it’s funny to read a press release with poor grammar extolling the virtues of quality content, there is a darker side. Relying on these “alternate facts” could result in a great deal of wasted time and money pursuing an ill-advised strategy using the wrong tactics. 

You Can Optimize for BERT

Let’s be clear: BERT is not a ranking signal. Therefore, you cannot optimize pages for a signal that doesn’t exist. BERT can better understand long-tail queries and, as a result, surface more appropriate results.

BERT models are applied to both organic search results and featured snippets. While you can optimize for those queries, you cannot “optimize for BERT.”

While we use BERT at MarketMuse for natural language processing, it would be incorrect to say that MarketMuse Suite optimizes for BERT.

You Should Focus More on Long-tail Terms

BERT is not the reason to focus on long-tail keywords. A comprehensive and topically-rich page can still rank for hundreds of search queries. Take a look at this.

Partial screenshot showing keyword rankings for an article on natural language processing.

A Simple Introduction to Natural Language Processing ranks for 380 keywords including long-tail search terms like “natural language processing with machine learning,” “nlp meaning in machine learning” and many others that keyword tools do not even register.

One topic for which the page ranks poorly is “natural language understanding course.” Let’s say you decided to create new content instead of optimizing the current page.

What long-tail term would you target?

There are numerous modifiers that could be added to the term “natural language understanding course” to form “BERT-friendly” phrases:

  • The best
  • What is a good
  • What is a popular
  • For dummies
  • For beginners
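Mechanically, combining those modifiers with the base term is trivial, which is exactly why this approach gets overused. A quick sketch (the prefix/suffix split is my own assumption about where each modifier would naturally attach):

```python
base = "natural language understanding course"

# Modifiers from the list above, split by where they'd attach.
prefix_mods = ["the best", "what is a good", "what is a popular"]
suffix_mods = ["for dummies", "for beginners"]

variants = [f"{m} {base}" for m in prefix_mods] + \
           [f"{base} {m}" for m in suffix_mods]

for v in variants:
    print(v)
# the best natural language understanding course
# what is a good natural language understanding course
# ...
```

Five phrases, one underlying search intent – which is the problem the next paragraph describes.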

The danger inherent in creating content targeting these very long-tail keywords is that you end up with pages of content that all address the same search intent. It’s the one-page-one-keyword syndrome, where people used to write separate pages of content about:

  • Yellow widgets
  • Red widgets
  • Green widgets

I remember those times well, and I don’t want to go back there!

My Google BERT Prediction

Google BERT will affect 20% of searches within a year, double its current percentage. Here’s why.

“people often use “keyword-ese,” typing strings of words that they think we’ll understand, but aren’t actually how they’d naturally ask a question.”

This is not natural. 

As more people discover Google’s ability to understand complex queries, the focus will shift from “keyword-ese” toward longer phrases. That’s just human nature.

Summary

As Google’s search algorithms become more sophisticated, it’s increasingly difficult to succeed at manipulating search rankings. The ability to use content to craft a narrative of expertise is what will separate the content marketing haves from the have-nots. Every Google update offers additional reasons to create better content. A content intelligence platform like MarketMuse can help your organization do this at scale, creating a quicker path to authority.

Stephen leads the content strategy blog for MarketMuse, an AI-powered Content Intelligence and Strategy Platform. You can connect with him on social or his personal blog.
