Definition
BERT (Bidirectional Encoder Representations from Transformers) is a language model developed by Google that interprets each word using the context on both its left and its right, rather than reading text in a single direction. This bidirectionality makes it particularly effective for language-understanding tasks such as text classification, question answering, and semantic search, and many enterprise search systems use BERT behind the scenes.
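The effect of bidirectional context is easiest to see with masked-word prediction, the task BERT is pretrained on. Below is a minimal sketch, assuming the Hugging Face transformers library is installed and using the publicly available bert-base-uncased checkpoint (neither is mentioned in this entry; they are illustrative choices).

```python
# Minimal sketch: BERT fills in [MASK] using the words on BOTH sides of it.
# Assumes: pip install transformers torch
from transformers import pipeline

# The fill-mask pipeline wraps a BERT checkpoint pretrained on masked
# language modeling.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Words after the mask ("was flooded after the storm") influence the
# prediction just as much as the words before it.
for prediction in unmasker("The bank by the [MASK] was flooded after the storm."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The right-hand context ("flooded after the storm") steers the model toward river-related completions, something a purely left-to-right model reading only "The bank by the" could not take into account.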