The Lindley paradox in optical interferometry
Abstract: The so-called Lindley paradox is a counterintuitive statistical effect in which the Bayesian and frequentist approaches to hypothesis testing give radically different answers depending on the choice of the prior distribution. In this paper we address the occurrence of the Lindley paradox in optical interferometry and discuss its implications for high-precision measurements. In particular, we focus on phase estimation with Mach-Zehnder interferometers and show how to mitigate the conflict between the two approaches by a suitable choice of priors.
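The disagreement the abstract describes can be illustrated with a standard textbook instance of the paradox (not taken from the paper): testing H0: theta = 0 against H1: theta != 0 for the mean of Gaussian data with a diffuse Gaussian prior on theta under H1. The sample size, prior width, and observed z-statistic below are hypothetical toy numbers chosen to land in the regime where the frequentist test rejects H0 while the Bayes factor favors it.

```python
import math

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical toy setup: H0: theta = 0 vs H1: theta != 0,
# with prior theta ~ N(0, tau^2) under H1 and known noise sigma.
sigma = 1.0        # known per-observation noise std
n = 1_000_000      # very large sample, the regime where the paradox bites
tau = 1.0          # diffuse prior std on theta under H1
z = 2.5            # observed z-statistic; sample mean = z * sigma / sqrt(n)
xbar = z * sigma / math.sqrt(n)

# Frequentist two-sided p-value: p = 2 * (1 - Phi(z)) = erfc(z / sqrt(2))
p_value = math.erfc(z / math.sqrt(2))

# Bayes factor BF01 = m(xbar | H0) / m(xbar | H1): under H0 the sample
# mean is N(0, sigma^2/n); under H1 it is N(0, tau^2 + sigma^2/n).
bf01 = (normal_pdf(xbar, 0.0, sigma**2 / n)
        / normal_pdf(xbar, 0.0, tau**2 + sigma**2 / n))

print(f"p-value = {p_value:.4f}")  # ~0.012: rejects H0 at the 5% level
print(f"BF01    = {bf01:.1f}")     # ~44: evidence strongly favors H0
```

With these numbers the frequentist test rejects the null (p ≈ 0.012) while the Bayes factor of about 44 strongly supports it; shrinking the prior width tau moves the two conclusions back toward agreement, which is the sense in which a suitable prior can mitigate the conflict.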