Bing reveals that it has been using BERT before Google

December 10, 2019

Last month, Google introduced BERT — Bidirectional Encoder Representations from Transformers — which would help Google better understand online search queries with proper context.

The algorithm update generated many headlines about what it is, what its impact will be, and what search engine professionals and content marketers need to do to keep their websites from dropping in the SERPs.

However, Bing has now revealed in a recent blog post that it had been using BERT before Google, and at a larger scale, too. Here is what the blog post said:

“Over the last couple of years, deep learning has become widely adopted across the Bing search stack and powers a vast number of our intelligent features. We use natural language models to improve our core search algorithm’s understanding of a user’s search intent and the related webpages so that Bing can deliver the most relevant search results to our users.”

The post also specifically mentioned BERT. It added:

“Recently, there was a breakthrough in natural language understanding with a type of model called transformers (as popularized by Bidirectional Encoder Representations from Transformers, BERT).  Starting from April of this year, we used large transformer models to deliver the largest quality improvements to our Bing customers in the past year.”

The post also highlighted an example of how BERT is changing the results on the search engine results pages.

In that example, Bing explains that the word “aggravate” indicates that the user’s intent is to learn about actions that can be taken after a concussion.

Before BERT, the #1 result would have been about the causes and symptoms of a concussion, which does not really align with the user’s search intent. After BERT, the result Bing returns is a page that acknowledges the concussion and discusses the various steps that can be taken.

Bing also admitted that applying a deep learning model like BERT to web search at worldwide scale is expensive. This is one reason we won’t see many other search engines adopting BERT right away, which can be a competitive advantage for the search engines that do use this new natural language processing technique.