AI Hallucination

When AI generates confident but false information

Definition

AI hallucination occurs when a large language model generates plausible-sounding but factually incorrect or entirely fabricated information, presented with apparent confidence. It stems from the model predicting statistically likely text rather than retrieving verified facts: the output is optimized to look right, not to be right.

📌 Example

A hallucinating AI might confidently cite a research paper that doesn't exist, complete with a fake author, journal, and publication date.
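The mechanism behind this example can be illustrated with a toy sketch. The function below is not a real language model; it simply assembles a citation from fragments that are individually plausible (a common surname, a real-sounding journal name, a recent year) without checking any of them against an actual bibliography, mirroring how an LLM predicts likely text rather than retrieving verified facts. All names and journal titles here are made up for illustration.

```python
import random

# Hypothetical fragments a "model" considers statistically likely.
SURNAMES = ["Chen", "Muller", "Okafor", "Silva"]
JOURNALS = ["Journal of Machine Intelligence", "Annals of Neural Computation"]


def fabricate_citation(topic: str, seed: int = 0) -> str:
    """Assemble a plausible-looking citation with zero fact-checking.

    Every field is drawn from 'likely' candidates; none is verified
    to exist -- which is exactly the failure mode of hallucination.
    """
    rng = random.Random(seed)
    author = rng.choice(SURNAMES)
    journal = rng.choice(JOURNALS)
    year = rng.randint(2015, 2023)
    return f'{author} et al. ({year}). "{topic.title()}: A Survey." {journal}.'


print(fabricate_citation("ai hallucination"))
```

The result reads like a legitimate reference, yet no step in its construction ever consulted a source, which is why hallucinated citations can pass a casual plausibility check while failing any real lookup.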