AI Glossary
BERT

NLP & Language

Definition

BERT (Bidirectional Encoder Representations from Transformers) is a language model developed by Google that captures the context of a word by looking at both the words to its left and to its right. It is pretrained by masking words in a sentence and predicting them from the surrounding context, which makes it particularly effective for comprehension tasks such as classification, question answering, and semantic search. Many enterprise search systems use BERT behind the scenes.
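The bidirectional idea can be illustrated with a toy sketch. This is not BERT itself, just a minimal masked-word predictor over a hand-made corpus: it scores candidate words by how often they occur with both the left and the right neighbour of the mask, showing why evidence from both sides disambiguates better than left-to-right context alone.

```python
from collections import Counter

# Toy illustration of masked-word prediction, the training objective
# behind BERT. NOT the real model: just neighbour counts over a tiny
# hand-made corpus, to show the value of bidirectional context.
corpus = [
    "the bank approved the loan",
    "the river bank was muddy",
    "the bank raised interest rates",
]

def predict_masked(left, right, sentences):
    """Score candidates by how well they fit BOTH the left and the
    right neighbour of the masked position (bidirectional evidence)."""
    scores = Counter()
    for s in sentences:
        words = s.split()
        for i, w in enumerate(words):
            l = words[i - 1] if i > 0 else None
            r = words[i + 1] if i < len(words) - 1 else None
            if l == left and r == right:
                scores[w] += 2  # both sides match: strong evidence
            elif l == left or r == right:
                scores[w] += 1  # one side matches: weak evidence
    return scores.most_common(1)[0][0] if scores else None

# "the [MASK] approved ..." -- the RIGHT context ("approved")
# is what singles out "bank" over "river" or "loan".
print(predict_masked("the", "approved", corpus))
```

In practice one would use a pretrained model (for example, the `bert-base-uncased` checkpoint via a library such as Hugging Face Transformers) rather than counting neighbours, but the principle is the same: the prediction for a masked word conditions on context from both directions.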
