Google Updates Its Search Algorithm: Brings Neural Network Techniques to Search

Written by Matt Milano

Whenever Google updates, tweaks, replaces or improves its search algorithms, webmasters the world over anxiously wait to see how it will impact their rankings.

Google’s latest update, Bidirectional Encoder Representations from Transformers (BERT), is one of the company’s most interesting to date. Last year Google “introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training,” or BERT.

The company is using BERT to better understand complex, natural language queries and return more relevant results.
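The underlying model itself is freely available, which makes it easy to see what these contextual representations look like in practice. The sketch below loads the open-sourced BERT checkpoint and produces one vector per query token; the use of the Hugging Face Transformers library and the bert-base-uncased checkpoint are assumptions made purely for illustration, not a description of Google's production systems.

import torch
from transformers import BertModel, BertTokenizer

# Illustrative sketch only: Google's ranking stack is not public, so this
# simply shows the open-sourced BERT model producing contextual embeddings.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

query = "2019 brazil traveler to usa need a visa"
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One vector per token, each computed from the words on both sides of it,
# which is what the "bidirectional" in the model's name refers to.
token_vectors = outputs.last_hidden_state
print(token_vectors.shape)  # (batch, token count, hidden size), e.g. [1, 12, 768]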

“By applying BERT models to both ranking and featured snippets in Search, we’re able to do a much better job helping you find useful information,” wrote Pandu Nayak, Google Fellow and Vice President of Search, in a company blog post. “In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.

“Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.

“To launch these improvements, we did a lot of testing to ensure that the changes actually are more helpful. Here are some of the examples that showed up in our evaluation process that demonstrate BERT’s ability to understand the intent behind your search.

“Here’s a search for ‘2019 brazil traveler to usa need a visa.’ The word ‘to’ and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word ‘to’ actually matters a lot here, and we can provide a much more relevant result for this query.”
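Nayak’s example is easy to reproduce with the open-sourced model. The sketch below (again assuming the Hugging Face Transformers library and the bert-base-uncased checkpoint, purely for illustration) compares the vector BERT assigns to the word ‘to’ in the original query against the same word with the travel direction reversed. A traditional, context-free word embedding would return identical vectors for both; BERT’s differ, because each token’s vector depends on the words around it.

import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_of(word: str, sentence: str) -> torch.Tensor:
    """Return the contextual vector for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden[tokens.index(word)]

a = embedding_of("to", "2019 brazil traveler to usa need a visa")
b = embedding_of("to", "2019 usa traveler to brazil need a visa")

# A static word embedding would score exactly 1.0 here; the contextual
# vectors score noticeably lower because the surrounding words differ,
# which is the nuance the article describes BERT picking up on.
print(torch.cosine_similarity(a, b, dim=0).item())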
