
Generalizing the Fano inequality further

Published 17 Jan 2026 in cs.IT | (2601.12027v1)

Abstract: Interactive statistical decision making (ISDM) features algorithm-dependent data generated through interaction. Existing information-theoretic lower bounds in ISDM largely target expected risk, while tail-sensitive objectives are less developed. We generalize the interactive Fano framework of Chen et al. by replacing the hard success event with a randomized one-bit statistic representing an arbitrary bounded transform of the loss. This yields a Bernoulli f-divergence inequality, which we invert to obtain a two-sided interval for the transform, recovering the previous result as a special case. Instantiating the transform with a bounded hinge and using the Rockafellar-Uryasev representation, we derive lower bounds on the prior-predictive (Bayesian) CVaR of bounded losses. For KL divergence with the mixture reference distribution, the bound becomes explicit in terms of mutual information via Pinsker's inequality.
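The abstract's CVaR bounds rest on the Rockafellar-Uryasev representation, CVaR_α(L) = inf_t { t + E[(L − t)⁺]/α }, which expresses the tail expectation as a one-dimensional minimization. A minimal numerical sketch (illustrative only; the paper works with prior-predictive CVaR in an interactive setting, while this uses i.i.d. samples of a bounded loss) checks that the RU minimization agrees with the direct worst-α-fraction average:

```python
import numpy as np

# Illustrative check of the Rockafellar-Uryasev (RU) representation:
#   CVaR_alpha(L) = inf_t { t + E[(L - t)^+] / alpha }.
# All names and the sampling setup are assumptions for this sketch.

rng = np.random.default_rng(0)
losses = rng.uniform(0.0, 1.0, size=10_000)  # bounded losses in [0, 1]
alpha = 0.1  # tail level: CVaR averages the worst alpha-fraction of losses

# Direct estimate: mean of the worst alpha-fraction of sampled losses.
k = int(alpha * losses.size)
cvar_direct = np.sort(losses)[-k:].mean()

# RU estimate: minimize t + E[(L - t)^+] / alpha over a grid of t in [0, 1].
ts = np.linspace(0.0, 1.0, 201)
ru_vals = ts + np.maximum(losses[None, :] - ts[:, None], 0.0).mean(axis=1) / alpha
cvar_ru = ru_vals.min()

print(cvar_direct, cvar_ru)  # the two estimates should agree closely
```

For uniform losses on [0, 1] the minimizing t is approximately the (1 − α)-quantile, so both estimates land near 0.95 at α = 0.1; this is the representation the paper plugs a bounded hinge transform into before applying the Bernoulli f-divergence inequality.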
