Definition
A word embedding is a vector representation of a word that captures its semantic meaning and its relationships with other words: words with similar meanings are mapped to nearby vectors in the embedding space. Word2Vec and GloVe were among the first widely adopted word embedding models, and each assigns one fixed vector per word. Today, contextual models such as BERT and GPT generate embeddings that vary with the surrounding sentence, so the same word can receive different vectors in different contexts.
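As a minimal sketch of the "nearby vectors" idea, the Python snippet below compares toy embeddings with cosine similarity, the standard measure of how close two word vectors are. The vectors here are made up for illustration; real models produce learned vectors with tens to hundreds of dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings, invented for this example.
# Real models (Word2Vec, GloVe, BERT) learn much higher-dimensional vectors.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.6]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: close to 1.0 means similar direction."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.29: unrelated words
```

Static models like Word2Vec would return the same vector for "apple" in every sentence, whereas a contextual model would embed "apple" differently in "apple pie" than in "Apple stock".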