The Pitfalls of Defining Hallucination
Published 15 Jan 2024 in cs.CL | (2401.07897v1)
Abstract: Despite impressive advances in Natural Language Generation (NLG) and LLMs, researchers remain unclear about important aspects of NLG evaluation. To substantiate this claim, I examine current classifications of hallucination and omission in data-to-text NLG, and I propose a logic-based synthesis of these classifications. I conclude by highlighting some remaining limitations of all current thinking about hallucination and by discussing implications for LLMs.