[Image: Bert and Ernie puppets]

BERT! It's the name of the biggest update to Google's algorithm in the last five years.

Yes, it is seriously called BERT, and it obviously reminds us of the yellow puppet from Sesame Street.

BERT stands for "Bidirectional Encoder Representations from Transformers", and it is a Natural Language Processing (NLP) model.

To make the joke about Sesame Street's yellow puppet even more obvious: another pre-trained contextual representation is called "ELMo".

It was created to gain a better understanding of languages, queries and content, with the end goal of displaying the most relevant results in SERPs (search engine results pages).

Another (long-term) benefit is that this update helps Google understand voice search better, so it can provide more accurate results for voice queries, which will (most likely) lead to more usage of voice search.

So far this update has been rolled out for English (US and UK), but soon it will be available for Dutch (and other languages) as well.

Featured snippets

BERT will also be used for featured snippets, which, according to Google, will lead to better results.

Featured snippets often show an even clearer, more structured answer, the steps in a process (how to bake a cake, for example) or a Q&A.

Google has already made it clear that there is no point in trying to devise strategies to make this update work to your advantage, other than the strategy of providing people with the best information possible.

Sounds good to me, since that is (or should be) the main reason why you write content in general: to help your (potential) customers and others by providing the information they are looking for.

So this is just another logical step in Google's mission, which reads: "Our mission is to organize the world's information and make it universally accessible and useful".

So what does bi-directional mean?

It means that BERT uses the context and relations of all the words in a sentence, rather than processing them one by one in the order they appear.

By working bi-directionally, BERT can figure out the full context of a word by looking at the words that come both before and after it.

And although this is another great step toward understanding language on a human level, it is not the final solution to language understanding, and BERT will still sometimes get it wrong.
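To make the bi-directional idea a bit more concrete, here is a minimal sketch (assuming the Hugging Face transformers package and the publicly available bert-base-uncased checkpoint, neither of which is mentioned in this post) that asks BERT to fill in a blanked-out word. The words that come after the blank help steer the prediction, which is exactly the looking-both-ways behaviour described above.

```python
# Minimal sketch: BERT's masked-word prediction uses context on BOTH sides.
# Assumes: pip install transformers torch  (assumption, not part of this post)
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Same words before the blank, different words after it:
for sentence in [
    "He walked to the [MASK] to deposit his paycheck.",
    "He walked to the [MASK] to watch the river flow by.",
]:
    best = fill_mask(sentence)[0]  # highest-scoring suggestion
    print(f"{sentence} -> {best['token_str']} ({best['score']:.2f})")
```

Because BERT reads the whole sentence at once, the part after the blank (depositing a paycheck versus watching the river) can push the model toward different words, something a strictly left-to-right model could not do at that position.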

[Image: a chart showing bi-directional routes for both BERT and ELMo]

Would you like to stay up to date on our blogs, projects and industry-related news? Then follow our LinkedIn page:
