Understanding Google Searches Better Than Ever Before

New Search Algorithm, BERT

What is BERT?
Pandu Nayak | Google Fellow and Vice President, Search

It is Google’s neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers. It was open-sourced last year and written about in more detail on the Google AI blog.

After RankBrain, BERT is the biggest update to Google Search and is expected to affect roughly 1 in 10 search queries. RankBrain, the previous major algorithm update, was introduced by the company around five years ago.

In short, BERT helps computers understand language a bit more the way humans do.
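Because the model has been open-sourced, its core idea, masked-language-model pre-training that reads the context on both sides of a word, is easy to try outside of Search. The sketch below is purely illustrative and assumes the publicly released bert-base-uncased checkpoint loaded through the Hugging Face transformers library, not anything Google Search runs in production:

```python
# Illustrative sketch only: uses the open-source bert-base-uncased checkpoint
# via the Hugging Face `transformers` library, not Google's production Search stack.
from transformers import pipeline

# A fill-mask pipeline exercises BERT's masked-language-model pre-training:
# the model predicts the hidden word from context on both its left and right.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("A traveler from Brazil needs a [MASK] to enter the USA."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```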

The rollout of BERT began earlier this week and should be fully live within a few days. For now, the feature applies only to English-language queries, but Google may try it out in other languages soon.
Google also revealed that BERT is being used globally, in all languages, for featured snippets. Understanding the context of words in searches helps better match those queries with relevant results.

In one example, Google said, with a search for “2019 brazil traveler to usa need a visa,” the word “to” and its relationship to the other words in the query are important for understanding the meaning. Previously, Google did not understand this connection between words and would return results about U.S. citizens traveling to Brazil. Google further explained, “With BERT, Search is able to grasp this nuance and know that the very common word ‘to’ actually matters a lot here, and we can provide a much more relevant result for this query.”
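As a rough illustration of why this works, BERT produces a different vector for the same word depending on the words around it, so the “to” in this query carries directional meaning rather than being ignored. The sketch below again assumes the open-source bert-base-uncased checkpoint and the Hugging Face transformers library as stand-ins for illustration, not Google’s production Search systems:

```python
# Sketch: show that BERT's representation of the word "to" depends on its context.
# Assumes the open-source bert-base-uncased checkpoint and Hugging Face transformers,
# which are illustrative tooling, not what Google Search runs in production.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

to_vec_brazil_to_usa = embedding_of("brazil traveler to usa need a visa", "to")
to_vec_usa_to_brazil = embedding_of("usa traveler to brazil need a visa", "to")

# The two "to" vectors differ because the surrounding words differ,
# which is how the direction of travel can influence the result.
similarity = torch.nn.functional.cosine_similarity(
    to_vec_brazil_to_usa, to_vec_usa_to_brazil, dim=0
)
print(f"cosine similarity of the two 'to' embeddings: {similarity.item():.3f}")
```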


In another example, for the search “do estheticians stand a lot at work,” Google said it previously would have matched the word “stand” in the query with the term “stand-alone.” Google’s BERT models can “understand that ‘stand’ is related to the concepts of the physical demands of the job, and displays a more useful response,” Google said. Similarly, for a search like “Can you get medicine for someone pharmacy,” Google can now understand the query more like a human would and show a more relevant result.

To read more, please download our November SEO Newsletter or visit Google’s post, “Understanding searches better than ever before.”


This post was written by Shoreline Content Team