AI Glossary
AI FUNDAMENTALS

Token


Definition

A token is the basic unit of text processed by language models. It can correspond to a whole word, part of a word, or a punctuation mark. For example, 'automation' might be a single token, while 'automatization' could be split into multiple tokens. The cost of AI services is often calculated based on the number of tokens processed.
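The idea can be sketched with a toy greedy tokenizer. The vocabulary and the per-token price below are illustrative assumptions, not any real model's tokenizer or actual API rates:

```python
def tokenize(text, vocab):
    """Greedy longest-match tokenization against a fixed vocabulary."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest candidate first
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            tokens.append(text[i])  # unknown character becomes its own token
            i += 1
    return tokens

# Hypothetical vocabulary: 'automation' is one token, while
# 'automatization' has to be split into smaller pieces.
vocab = {"automation", "automat", "ization", " "}

print(tokenize("automation", vocab))      # ['automation']
print(tokenize("automatization", vocab))  # ['automat', 'ization']

# Billing is typically per token processed (rate here is made up).
price_per_token = 0.000002  # hypothetical USD per token
tokens = tokenize("automation automatization", vocab)
print(f"{len(tokens)} tokens, cost: {len(tokens) * price_per_token:.6f} USD")
```

Real tokenizers (byte-pair encoding and similar) build their vocabularies from training data, but the billing logic is the same: the longer the text, the more tokens, the higher the cost.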
