
All you need to know about the BERT algorithm used by Google

November 13, 2019
Google BERT Update

All of us use Google daily to gather information about the tiniest of things. So how does the search engine work? And what is the mysterious BERT algorithm that Google included in its latest search update? Let us take a closer look.

Basics of BERT

BERT stands for Bidirectional Encoder Representations from Transformers. The technique was first published by researchers at Google AI Language, and it instantly made a significant impact on the Machine Learning community. Essentially, BERT is a neural-network-based technique for pre-training natural language processing models. So why does Google need it? Simply put, BERT helps the renowned search engine better understand the meaning and relevance of every word used in a search.

Now, to understand BERT, we first need to make sense of a few technical terms:

  • Neural Network: Neural networks sit at the heart of many modern algorithms and facilitate pattern recognition. For example, neural networks can successfully recognize handwriting, group similar images, and predict financial trends in the stock market.
  • Natural Language Processing: NLP, or Natural Language Processing, is a branch of artificial intelligence. Essentially, it helps computers make sense of how humans behave and communicate in the real world. A well-known example of the technology is a chatbot trained to construct replies the way a human being would.
  • Bi-directional Training: Bi-directional training is an algorithm’s ability to let a language model read and interpret an entire phrase in a query at once. Traditionally, language models train by reading text in a specified order, say left-to-right or vice versa. Bi-directional training is more human-like, as it helps the algorithm consider the overall context on both sides of a word.

As for BERT, it makes use of both neural networks and NLP. First, it has been pre-trained on a huge pool of text from Wikipedia so that it can recognize linguistic patterns during searches. Furthermore, it incorporates an enhanced form of NLP via bi-directional training.
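To make the pre-training idea concrete, here is a minimal sketch of the masked-word task BERT learns from. It assumes the open-source Hugging Face transformers library and its recent API; the post itself names no implementation, so treat the library choice and the model name as illustrative, not as Google's production setup.

```python
# A hedged sketch of BERT's masked-word pre-training objective, using the
# publicly released "bert-base-uncased" checkpoint via Hugging Face
# transformers (pip install transformers torch). Illustrative only.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Hide one word; the model must guess it from BOTH sides of the blank,
# which is what "bidirectional" means in practice.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and print the model's top five guesses.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))  # "paris" should rank at or near the top
```

Training on millions of such fill-in-the-blank examples from Wikipedia is what gives BERT its feel for which words belong in which contexts.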

Understanding how BERT works

The major technical innovation in BERT is the successful application of bi-directional training. It builds on the Transformer, an attention mechanism that analyzes and understands the contextual connections between different words. In its original form, the Transformer has two parts: an encoder, which reads the text input, and a decoder, which produces a prediction for the given task. Since BERT's goal is to build a language model, only the encoder part is needed. Crucially, it bases its results on all the words surrounding a search term, not just the text preceding or succeeding it.
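For readers who want to see the idea in code, the toy sketch below implements the simplest form of self-attention. It is a deliberate simplification: the real Transformer adds learned query/key/value projections and multiple attention heads, none of which are shown here.

```python
# A toy, single-head version of the scaled dot-product self-attention at
# the heart of the Transformer (simplified; real BERT uses learned
# query/key/value projections and multi-head attention).
import torch
import torch.nn.functional as F

def self_attention(x: torch.Tensor) -> torch.Tensor:
    # Score every word's vector against every other word's vector...
    scores = x @ x.transpose(-2, -1) / (x.size(-1) ** 0.5)
    # ...turn the scores into attention weights...
    weights = F.softmax(scores, dim=-1)
    # ...and rebuild each word as a weighted blend of its whole context.
    return weights @ x

words = torch.randn(5, 8)            # 5 toy "words", each an 8-dim vector
print(self_attention(words).shape)   # torch.Size([5, 8])
```

Each output row mixes information from every input word at once, which is why attention-based models can draw on context from both sides of a term rather than reading strictly left to right.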


An example of BERT’s methodology

Let us consider the word ‘bark’. This word can have two meanings: it can either refer to the “sound made by a dog” or indicate the “outer layer of a tree’s trunk”. If a traditional language model were used, ‘bark’ would get the same context-free representation no matter which meaning of the term was intended. BERT, however, begs to differ!

Consider the following sentence: “The child played with the bark of the tree.” A unidirectional contextual model reading from left to right would represent ‘bark’ using only the preceding words “The child played with the”. Thanks to its bidirectional design, however, BERT takes ‘tree’ into account as well. In a nutshell, it is far better equipped to understand that the word refers to a tree rather than a dog.
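As a rough sanity check of this claim, one can extract BERT's contextual vectors for ‘bark’ from a pre-trained model and compare them across sentences. The sketch again assumes the Hugging Face transformers library and its recent API; the exact similarity numbers will vary, but the two tree senses should land measurably closer to each other than to the dog sense.

```python
# Comparing BERT's context-dependent vectors for the same word "bark"
# (illustrative sketch; assumes Hugging Face transformers + PyTorch).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return the hidden-state vector BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

tree  = vector_for("The child played with the bark of the tree.", "bark")
tree2 = vector_for("Thick bark protected the trunk of the old oak.", "bark")
dog   = vector_for("The dog gave a loud bark at the stranger.", "bark")

cos = torch.nn.functional.cosine_similarity
print(cos(tree, tree2, dim=0))  # tree sense vs tree sense: higher
print(cos(tree, dog, dim=0))    # tree sense vs dog sense: lower
```

A static, context-free embedding of the kind used before BERT would return the identical vector for ‘bark’ in all three sentences.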

Impact of BERT on Google Search Engine

For laymen like you and me, the impact of BERT on our regular searches may be difficult to notice. However, according to the tech giant, BERT can be expected to affect nearly 10% of Google searches, which means it will influence both traffic and the organic visibility of brands. BERT has already brought improvements to even niche Google searches, such as a search for mathematics practice books for adults.

However, it would be a mistake to assume that BERT will help Google understand every search perfectly. According to reports, it will only impact about one out of every ten English searches made in the United States. Longer queries with a conversational, human quality to them will see the biggest improvement. In particular, the inclusion of prepositions such as ‘to’ and ‘for’ will make it easier for the search engine to grasp the contextual aspect of the user’s query.

In addition to the search engine, BERT also has the potential to affect the Google Assistant. Although it is expected to work on search functions only for the moment, if someone asks the Google Assistant a question, the search results returned may well be influenced by BERT, thereby affecting the Assistant as well.

BERT: Thumbs up or Thumbs Down?

Needless to say, BERT is a ground-breaking development in the area of Machine Learning. Its strength lies in the fact that it is easy to understand and can be customized according to specific needs. As a result, it has the potential to be widely used in the years to come.

