AI Glossary
AI FUNDAMENTALS

Hallucination


Definition

An AI hallucination occurs when a language model generates false information and presents it with confidence as if it were true. LLMs do not 'know' what is true; they generate statistically probable text. For businesses, this means every AI output must be verified, and retrieval-augmented generation (RAG) systems are preferred because they anchor responses to real documents.
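The grounding idea behind RAG can be sketched in a few lines: retrieve the documents most relevant to the question, then instruct the model to answer only from that context. This is a minimal illustration, not a real library; the toy word-overlap retriever and all function names here are assumptions for the sketch, and production systems use embedding-based search instead.

```python
def tokens(text: str) -> set[str]:
    # Lowercase and strip trailing punctuation so "policy?" matches "policy".
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (toy scorer)."""
    q = tokens(query)
    ranked = sorted(documents, key=lambda d: len(q & tokens(d)), reverse=True)
    return ranked[:k]

def build_grounded_prompt(query: str, documents: list[str]) -> str:
    """Build a prompt that anchors the model's answer to retrieved documents."""
    context = retrieve(query, documents)
    return (
        "Answer using ONLY the context below. If the answer is not in the "
        "context, say you don't know.\n\n"
        "Context:\n- " + "\n- ".join(context) +
        f"\n\nQuestion: {query}"
    )

docs = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available Monday to Friday, 9:00 to 17:00.",
    "Shipping within the EU takes 3 to 5 business days.",
]
prompt = build_grounded_prompt("What is the refund policy?", docs)
print(prompt)
```

The explicit "say you don't know" instruction is the key anti-hallucination step: it gives the model a sanctioned answer when the retrieved documents do not cover the question, instead of letting it fill the gap with plausible-sounding text.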

