Google BERT: A simple guide to interpreting and measuring impact!

If the last week’s SEO chatter is anything to go by, the Google BERT update seems to have occupied a lot of space everywhere, even though Google has only rolled it out to about 10% of its US-English SERPs.

But then, we SEOs are quite capable of making mountains out of molehills whenever Google says it has made an update, so we’re bound to go a little crazy.

When the news popped up in my timeline, ‘Google Applies New BERT Model to Search Rankings, Affecting 1-in-10 Queries’, I started figuring out what BERT is all about. And whoa! This can bring a lot of change to our SERPs.

To summarize what the BERT update means for SEOs:

  • BERT will be used on 1 in 10 searches in the US (English)
  • Google is already using a BERT model to improve featured snippets in the two dozen countries where they are available.
  • Google has found that BERT helped its algorithms better grasp the nuances of queries and understand connections between words that it previously couldn’t.

Here’s a Before/After of BERT as released by Google itself (Link)

[Image: BERT before/after example from Google]

What does BERT mean anyway?

Going by the definition, here’s what BERT stands for:

“BERT stands for Bidirectional Encoder Representations from Transformers. It is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of NLP tasks.”

‘Of course, we don’t want the definition! Can someone please explain what BERT means?’

To simplify, BERT is an NLP framework that tries to figure out the intent behind sentences. NLP stands for Natural Language Processing, and Google has been trying to make use of it in Search for over a year now.

  • BERT is pre-trained on a large corpus of unlabeled text, including all of Wikipedia (that’s 2,500 million words!) and BookCorpus (800 million words).
    Imagine Google, over time, training the model on an even larger corpus of data! (A quick code sketch of what ‘pre-trained, then fine-tuned’ looks like follows this list.)
  • BERT is a “deeply bidirectional” model. Bidirectional means that BERT learns information from both the left and the right side of a token’s context during the training phase.
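
To make the “fine-tuned with just one additional output layer” part of the definition concrete, here’s a quick sketch. It’s my own illustration using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint, not anything Google has published about its Search setup:

```python
# Rough sketch, assuming the `transformers` library and PyTorch are installed.
from transformers import BertTokenizer, BertForSequenceClassification

# Load the publicly released pre-trained model...
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# ...and bolt a single classification head onto the pre-trained encoder.
# This head is the "one additional output layer" that gets fine-tuned.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("I need to go to the bank to make a deposit", return_tensors="pt")
outputs = model(**inputs)      # logits come from the newly added output layer
print(outputs.logits.shape)    # torch.Size([1, 2])
```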

To give an example, BERT differentiates between the following two sentences by checking the context around the word ‘bank’:

  • We went to the river bank
  • I need to go to the bank to make a deposit
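
If you want to see this bidirectional behaviour for yourself, here’s a small sketch along the same lines (again my own illustration with the public bert-base-uncased checkpoint, not Google’s production setup). It pulls the contextual vector for “bank” out of each sentence and compares them; the two vectors come out different because BERT has read the words on both sides of “bank”:

```python
# Small sketch, assuming `transformers` and PyTorch are installed.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def bank_vector(sentence):
    """Return BERT's contextual embedding for the token 'bank'."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]          # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index("bank")]

v_river = bank_vector("We went to the river bank")
v_deposit = bank_vector("I need to go to the bank to make a deposit")

# Same word, different vectors: the river vs. money context is encoded in them.
print(torch.cosine_similarity(v_river, v_deposit, dim=0).item())
```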

 

What is the Impact of BERT on Current Rankings?

In the short term, BERT will impact long-tail queries and featured snippets. This is in line with how Google is trying to become an answer engine instead of remaining just another search engine.

1. Long-Tail Search Queries

The impact on long-tail, conversational search queries is expected to be huge.

Why?

In Google’s own words: ‘where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query.’ This implies that pages ranking for queries whose intent they don’t actually serve will be affected. At the same time, pages that do match the intent but weren’t getting rewarded should keep an eye on their keyword metrics, since they stand to gain.

Example: when a user types in “2019 brazil traveler to the USA need a visa”, it’s someone from Brazil who’s traveling to the USA. The page ranking before BERT was a Washington Post article about how US citizens can travel to Brazil, which doesn’t answer the query. Now, the US Embassy in Brazil’s page about getting a US visa ranks instead.

2. Featured Snippets

Google has said BERT is already being used globally, in all languages, for featured snippets, so shifts in featured snippets are expected. Pages that were ranking without actually answering user queries could be replaced.

Check the example below, for the query ‘parking on a hill with no curb’.
The previous featured snippet only talked about parking on a hill in general, while the new one specifically addresses the extra detail of ‘no curb’.

Of course, BERT isn’t the ONLY way Google is trying to arrive at results like these, but it has a big role to play in how featured snippets get generated.

How to Optimize for BERT

If you’re spending time reading articles that promise to teach you how to ‘optimize for BERT’, you’re simply wasting that time.

From what it appears, we can’t optimize for BERT, just like we can’t optimize for RankBrain.

Of course, the existing best practices, which aren’t specific to this update and have been around for a long time, now carry even more relevance. As a webmaster, you should:

  1. Ensure the content you rank for in answer boxes actually answers those queries. If not, update the content to match the intent
  2. The same intent check applies to your regular rankings as well
  3. Use more natural language in your content. Write for humans instead of bots
  4. Improve page-to-URL mapping so each query maps to the most relevant page on your site
  5. Merge thin content into power pages that add topical relevancy. Quality over quantity.

How to benchmark & measure the impact of BERT

Considering BERT is more likely to affect long-tail keywords and featured snippets, you should:

  1. Keep tabs on featured snippets:

    -Google Search Console has a ‘Search Appearance’ filter in the Performance report where you can measure changes in metrics
    -Regularly check SERP feature data in Ahrefs or SEMrush for shifts
    -Monitor your existing list of featured snippets manually

  2. Long-tail queries:

    -Create a list of long-tail queries you’re ranking for and refresh their ranks every week (a rough Search Console sketch for pulling these follows this list)
    -Update the list with newly gained or lost long-tail keywords
    -It’s easy to miss many of these queries because most third-party tools won’t give you data on them, so stick to Google Search Console
    -Create a list of queries where you rank but don’t really answer the query
    -Create a similar list of queries where you match the intent but don’t rank yet, and could
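
To pull that long-tail list out of Google Search Console programmatically, here’s a rough Python sketch. It assumes you’ve already set up OAuth credentials for the Search Console API and installed google-api-python-client; the site URL and the “four or more words = long-tail” cut-off are placeholders you’d adjust.

```python
# Rough sketch: pull query-level data from the Search Console API and
# keep only the long-tail queries (loosely: four or more words).
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # placeholder: your verified property

def fetch_long_tail_queries(credentials, start_date, end_date, min_words=4):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start_date,        # e.g. "2019-10-01"
        "endDate": end_date,            # e.g. "2019-10-31"
        "dimensions": ["query"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    return [
        {
            "query": row["keys"][0],
            "clicks": row["clicks"],
            "impressions": row["impressions"],
            "position": row["position"],
        }
        for row in response.get("rows", [])
        if len(row["keys"][0].split()) >= min_words
    ]
```

Run it on the same date window every week and diff the results to spot the new and lost long-tail queries mentioned above.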

So, now what?

Is there still a very simplistic way of looking at BERT?

Yes: write quality content for humans and match your content to the search query. That’s what BERT is all about for Google.

Did your site face any issues because of this? Do update!
