Hallucination
Definition
A hallucination occurs when an AI model produces information that sounds plausible but is incorrect or unsupported by its sources.
Plain English
Confident-sounding mistakes.
Why it matters
- It's why citations and verification matter: confident answers still need checkable sources.
- It rewards clear, evidence-backed publishing.
- It can mislead buyers if answers aren't grounded in real sources (a toy groundedness check is sketched after this list).
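To make "grounded" concrete, here is a minimal Python sketch that flags answer sentences with little content-word overlap against a source text. It is a toy illustration only, not how production systems detect hallucinations (those typically use entailment models or retrieval-based verification), and the threshold, stopword list, and example strings are all made up for the demo.

```python
# Toy groundedness check: flag answer sentences whose content words
# barely appear in the cited source. A naive illustration of what
# "grounded" means, not a real fact-checker.

import re

# Small, arbitrary stopword list for this demo.
STOPWORDS = {"the", "a", "an", "is", "are", "was", "were", "of", "in",
             "to", "and", "or", "that", "it", "on", "for", "with"}

def content_words(text: str) -> set[str]:
    """Lowercase word tokens minus common stopwords."""
    return {w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOPWORDS}

def ungrounded_sentences(answer: str, source: str,
                         threshold: float = 0.5) -> list[str]:
    """Return answer sentences whose content-word overlap with the
    source falls below `threshold` (an arbitrary cutoff for the demo)."""
    source_words = content_words(source)
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
        words = content_words(sentence)
        if not words:
            continue
        overlap = len(words & source_words) / len(words)
        if overlap < threshold:
            flagged.append(sentence)
    return flagged

# Hypothetical source and model answer; the second sentence is invented.
source = "Acme's Model X ships with a 2-year warranty and free returns."
answer = ("Model X ships with a 2-year warranty. "
          "It also includes a lifetime battery guarantee.")

for s in ungrounded_sentences(answer, source):
    print("Possibly hallucinated:", s)
```

Run as-is, it flags only the invented sentence about the battery guarantee, since none of its content words appear in the source. Word overlap is a crude proxy; the point is simply that grounded claims can be traced back to a source and hallucinated ones cannot.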