Google BERT 2019 Update – 1 in 10!
What does Google’s BERT update mean for SEO companies and website owners?
BERT (Bidirectional Encoder Representations from Transformers) is a transfer learning technique currently making big noise in the NLP research space. BERT improves the handling of “context” in keyword searches. In other words, it allows Google to better interpret search queries. Naturally, this will affect both rankings and featured snippets.
BERT will be used on 1 in 10 searches, and Google says the model is so complex that it is running at the absolute limit of its hardware – hence the 10% limitation.
Google users should see more useful search results – in Google’s own words:
“Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
Google is also using BERT to improve featured snippets in the 20+ countries where they are available.