Bidirectional Encoder Representations from Transformers (BERT) is an AI model developed by Google to help machines understand language in a manner closer to how humans do. Specifically, it's a pre-trained natural language processing (NLP) model, trained on unlabeled text, that seeks to understand the nuances and context of human language.

Google released it as an open-source project in 2018, and it officially rolled out in Google Search in October 2019. It is now used in Google searches in all languages worldwide and also affects featured snippets.

What is BERT used for?

BERT is primarily used to provide better query results by applying its understanding of language nuance to deliver more useful results. This applies not only to standard results but to featured snippets as well. Google has estimated that it affects roughly 1 in 10 search queries.

When BERT applies its understanding of nuance in language, it can infer a user's intent from connecting words such as "and," "but," "to," "from," and "with." So rather than relying on keywords alone, BERT interprets a query by weighing words like "and" or "versus" when delivering SERP results.

[Image: An example of how BERT uses NLP to distinguish a user's search intent on the SERP.]

In an example provided by Google, a search for "parking on a hill with no curb" now returns results and a featured snippet detailing what to do when parking a vehicle on a hill that has no curb. Thanks to BERT's NLP, Google understands that the word "no" means there is no curb. Previously, the same query would have returned results about parking on a hill WITH a curb, because the query matched the keyword "curb" but Google didn't grasp the significance of the word "no."
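The keyword-versus-context distinction above can be sketched in a few lines of Python. This is a toy illustration, not Google's actual ranking code: a bag-of-keywords matcher that discards connecting words treats the two opposite curb queries as identical, while any model that reads the full token sequence, as BERT does, keeps them distinct.

```python
# Toy illustration of keyword-only matching vs. context-aware matching.
# (Hypothetical helper names; not Google's actual algorithm.)

STOP_WORDS = {"no", "to", "from", "with", "a", "on", "and", "but"}

def keyword_only(query):
    """Reduce a query to a bag of 'content' keywords, as older
    keyword-based matching effectively did."""
    return {w for w in query.lower().split() if w not in STOP_WORDS}

def full_sequence(query):
    """A context-aware model sees every token in order, so
    connecting words like 'no' are never thrown away."""
    return query.lower().split()

q1 = "parking on a hill with no curb"
q2 = "parking on a hill with a curb"

# The keyword bag is identical: the decisive word "no" was discarded.
print(keyword_only(q1) == keyword_only(q2))  # True

# The full token sequences remain distinct.
print(full_sequence(q1) == full_sequence(q2))  # False
```

The point is only that dropping "stop words" erases exactly the words that flip a query's meaning, which is the gap BERT's contextual reading closes.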

What is BERTSUM?

BERTSUM is a variant of BERT used for extractive summarization, meaning it selects the most important sentences from a piece of content to form a summary. In practice, BERTSUM can generate summaries across many web pages and sites at once, which is particularly useful for producing meta descriptions for hundreds or even thousands of pages on a site instead of writing each one individually.
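To make the idea of extractive summarization concrete, here is a minimal sketch in Python. It is not BERTSUM itself (BERTSUM scores sentences using BERT encodings); this toy version ranks sentences by simple word frequency, but the overall extract-and-select structure is the same.

```python
# Minimal extractive summarizer: score sentences, keep the top few.
# A stand-in for BERTSUM's approach, using word frequency instead of
# BERT sentence encodings.
import re
from collections import Counter

def extract_summary(text, max_sentences=2):
    # Split into sentences on terminal punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Build document-wide word frequencies.
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    # Score each sentence by the frequency of the words it contains.
    scored = sorted(
        range(len(sentences)),
        key=lambda i: sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
        reverse=True,
    )
    # Keep the top sentences, restored to their original order.
    keep = sorted(scored[:max_sentences])
    return " ".join(sentences[i] for i in keep)
```

Extractive summarization always returns sentences that already exist in the source, which is why it suits tasks like drafting meta descriptions: the output never says anything the page itself doesn't.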

BERT’s effect on RankBrain

RankBrain, Google's first AI for interpreting queries, has been used to understand queries and content since 2015. While it shares some traits with BERT, the two do not perform the same functions, and BERT has not replaced RankBrain. RankBrain can, for example, figure out what a user is looking for even when the query is misspelled or ungrammatical, whereas BERT focuses on the nuances of the language used in a search query.

So although the two overlap and both perform NLP functions for the Google SERP, they are not the same system.