Responsible AI Platform
Technical Concepts

Hallucination

Definition & Explanation

Definition

Erroneous output from an AI system (especially a large language model) in which the model generates fluent, convincing-sounding text that is factually incorrect or fabricated. Hallucination is an important risk of generative AI that users must be informed about as part of AI literacy.
