AI Glossary

Word Embedding

NLP & Language

Definition

A word embedding is a vector representation of a word that captures its semantic meaning and its relationships to other words. Words with similar meanings are mapped to nearby vectors in the embedding space. Word2Vec and GloVe were among the first widely used word embedding models; both assign each word a single, fixed vector regardless of context. Today, contextual models such as BERT and GPT generate embeddings that vary with the surrounding sentence, so the same word can have different vectors in different contexts.
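
To make this concrete, here is a minimal sketch using the gensim library's Word2Vec class (an assumption on our part: gensim 4.x is installed, and the toy corpus and any printed similarity values are purely illustrative). It trains small embeddings and compares words with cosine similarity, the standard measure of how close two vectors point.

```python
import numpy as np
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative only;
# real embeddings are trained on corpora with millions of sentences).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sleeps", "on", "the", "mat"],
    ["the", "dog", "sleeps", "on", "the", "rug"],
]

# Train a small Word2Vec model: each word becomes a 50-dimensional vector.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1,
                 seed=42, workers=1)

# Look up the learned vector for a word: a numpy array of shape (50,).
vec_king = model.wv["king"]

# Cosine similarity: words used in similar contexts end up with
# vectors that point in similar directions.
def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# On a corpus this tiny the numbers are noisy, but the idea holds:
# "king" and "queen" share contexts, "king" and "mat" do not.
print(cosine(model.wv["king"], model.wv["queen"]))
print(cosine(model.wv["king"], model.wv["mat"]))
```

Contextual models work differently: instead of a fixed lookup table, a model like BERT produces a fresh vector for each word occurrence, computed from the whole sentence it appears in.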
