Google Introduces the BERT Model to Search Rankings

The BERT model, short for Bidirectional Encoder Representations from Transformers, helps Google better understand the intent behind search queries.

At the moment, BERT is being applied to roughly one in ten English-language searches in the US.

BERT processes each word in a search query in relation to all the other words, rather than one word at a time, so it can better interpret the meaning of the query as a whole. Google's examples show BERT surfacing more relevant results in SERPs for nuanced, conversational queries. This could meaningfully reshape SERPs.
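The core idea, processing every word in relation to every other word, is bidirectional self-attention. Below is a toy sketch of that mechanism in plain NumPy; the embedding values and the example query are purely illustrative, not Google's actual model or data.

```python
import numpy as np

def self_attention(embeddings):
    """Each word's output vector is a weighted mix of ALL words' vectors,
    so context from both the left and the right informs every position."""
    scores = embeddings @ embeddings.T              # similarity between every pair of words
    scores = scores / np.sqrt(embeddings.shape[1])  # scale for numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=1, keepdims=True)  # softmax per word
    return weights @ embeddings                     # contextualised word vectors

# Fake 4-dimensional embeddings for the query "2019 brazil traveler to usa"
# (one of the example queries from Google's announcement).
query = np.array([
    [0.1, 0.9, 0.0, 0.2],   # "2019"
    [0.8, 0.1, 0.3, 0.0],   # "brazil"
    [0.2, 0.2, 0.9, 0.1],   # "traveler"
    [0.0, 0.3, 0.7, 0.8],   # "to"
    [0.7, 0.0, 0.2, 0.9],   # "usa"
])

contextual = self_attention(query)
# Unlike a bag-of-words model, the vector for "to" now depends on every
# other word, which is how the direction of travel (brazil -> usa) can
# influence the interpretation of the query.
print(contextual.shape)
```

This is only the attention step; real BERT stacks many such layers with learned projection matrices, but the sketch shows why word order and context both matter to the final representation.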