Definition
A token is the basic unit of text processed by language models. A token can correspond to a whole word, part of a word, or a punctuation mark. For example, a common word like 'automation' might be a single token, while a rarer word like 'automatization' could be split into multiple subword tokens. Billing for AI services is often based on the number of tokens processed, so longer or less common text tends to cost more.
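The splitting described above can be sketched with a greedy longest-match subword tokenizer. This is a simplified illustration, not how production tokenizers work: real systems (e.g. BPE-based tokenizers) learn their vocabularies from data, and the tiny `VOCAB` set here is a hypothetical example chosen so the two words above tokenize as described.

```python
# A minimal sketch of subword tokenization (greedy longest-match).
# The vocabulary below is a hypothetical example; real tokenizers
# learn vocabularies of tens of thousands of entries from data.
VOCAB = {"automation", "automat", "ization", "a", "t", "i", "o", "n", "u", "m", "z"}

def tokenize(word: str) -> list[str]:
    """Split a word into the longest matching vocabulary entries, left to right."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest possible match starting at position i first.
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            raise ValueError(f"no vocabulary entry covers {word[i]!r}")
    return tokens

print(tokenize("automation"))      # in the vocabulary: a single token
print(tokenize("automatization"))  # split into multiple subword tokens
```

Because 'automation' appears in the vocabulary as a whole entry, it stays one token, while 'automatization' falls back to the subword pieces 'automat' and 'ization'.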