Fact
Hallucination Phenomenon
July 29, 2024
LLM 'hallucinations' occur when a model generates false or nonsensical information and presents it as fact. This happens because the model predicts plausible-sounding text token by token rather than retrieving verified facts (see the sketch below).
Category: LLM Limitations
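
A minimal sketch of the mechanism described above, using hypothetical toy data: the "model" here is just a next-token probability table, and the prompt, candidate tokens, and probabilities are illustrative assumptions, not real model outputs. It shows how picking the most plausible continuation, with no lookup against a verified source, can yield a fluent but false statement.

```python
# Toy next-token distribution for the prompt "The capital of Australia is".
# The numbers are made up for illustration; a real model would produce its
# own distribution, but the selection logic is likelihood-based either way.
next_token_probs = {
    "Sydney": 0.46,    # sounds plausible (famous city) but is false
    "Canberra": 0.41,  # the correct answer
    "Melbourne": 0.11,
    "Perth": 0.02,
}

def generate_next_token(probs: dict[str, float]) -> str:
    """Pick the most likely token -- plausibility, not truth, decides."""
    return max(probs, key=probs.get)

if __name__ == "__main__":
    token = generate_next_token(next_token_probs)
    # Prints "Sydney": a confident, fluent, and wrong completion,
    # i.e. a hallucination produced purely by likelihood ranking.
    print(f"The capital of Australia is {token}")
```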