Optimal K-dependence in training-conditional lower bounds for full conformal prediction

Determine the optimal dependence on the number of intervals K in training-conditional lower bounds for full conformal prediction over the algorithm class P_K, in which each prediction set is a union of at most K intervals. Specifically, establish the tight scaling, jointly in K and the sample size n, of the worst-case training-conditional coverage error in the offline regime, beyond the current baseline bound of order \(\widetilde{\Omega}(\min\{1/\sqrt{K},\,1/\sqrt{n}\})\).
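
For concreteness, a schematic formalization of the target quantity reads as follows, where \(\mathrm{err}_P(\widehat{C})\) is illustrative shorthand for the paper's training-conditional coverage error of a procedure \(\widehat{C}\) under sampling distribution P (the paper's precise definition should be substituted):

\[
\inf_{\widehat{C} \in P_K} \; \sup_{P} \; \mathrm{err}_P\big(\widehat{C}\big) \;=\; \widetilde{\Omega}\!\left(\min\Big\{\tfrac{1}{\sqrt{K}},\, \tfrac{1}{\sqrt{n}}\Big\}\right).
\]

The open problem is to pin down the largest rate, as a function of K and n, that can stand on the right-hand side.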

Background

As an offline consequence of their online lower-bound analysis, the authors establish a baseline training-conditional lower bound for full conformal prediction when prediction sets are unions of at most K intervals (the class P_K).

They prove a general result (Proposition 4) showing that any algorithm in P_K suffers worst-case training-conditional coverage error \(\widetilde{\Omega}(\min\{1/\sqrt{K},\,1/\sqrt{n}\})\).
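
A one-line case split makes explicit why only the small-K regime is at stake:

\[
\min\{1/\sqrt{K},\, 1/\sqrt{n}\} \;=\;
\begin{cases}
1/\sqrt{K}, & K \le n,\\
1/\sqrt{n}, & K > n,
\end{cases}
\]

so the bound saturates at the \(1/\sqrt{n}\) rate once K exceeds n, and the unresolved question concerns tightness of the \(1/\sqrt{K}\) branch when K is much smaller than n. For example, with \(n = 10^4\) the bound is of order \(1/5\) at \(K = 25\), but of order \(1/100\) for all \(K \ge 10^4\).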

They note that this result establishes a baseline dependence on K but does not settle whether that dependence is optimal; identifying the exact K-scaling remains unresolved.

References

"Determining the optimal $K$-dependence in training-conditional lower bounds remains an interesting open direction, which we leave for future work."

Optimal training-conditional regret for online conformal prediction (2602.16537, Liang et al., 18 Feb 2026), Section 4.4, "Implications beyond the online setting" (following Proposition 4).