Hallucination Rates in Real-World PA Letter Generation
Determine the hallucination rate of prior authorization (PA) letters generated by large language models under less structured, real-world deployment conditions, where input records may be incomplete or ambiguous, and quantify the associated safety risks for clinical AI deployment.
References
The hallucination rate under less structured conditions remains an important open question for the safety of clinical AI deployment.
— AI-Generated Prior Authorization Letters: Strong Clinical Content, Weak Administrative Scaffolding
(2603.29366 - Awan et al., 31 Mar 2026), in the Hallucination Assessment subsection of the Results section