Tightness and weak limit of empirical entropy distributions over prime multisets
Prove that for each m ≥ 1 and each fixed logarithmic resolution M ≥ 2 the following holds: if P_m is any collection of prime multisets of size m, and ν_m = (1/|P_m|) ∑_{𝔭∈P_m} δ_{H(𝔭)} is the empirical distribution of the spectral entropy H(𝔭) computed from the log-binned distance distribution associated with 𝔭, then the family {ν_m}_{m≥1} is tight. Moreover, prove that, after centering by the Poisson null-model entropy, ν_m converges weakly to a limiting probability measure as m → ∞.
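The statement does not fix an explicit formula for H(𝔭), so the sketch below adopts one natural reading, stated here as an assumption: the pairwise distances within the multiset are sorted into M geometrically spaced ("log-binned") buckets, and H(𝔭) is the Shannon entropy of the resulting normalized histogram. The function name `spectral_entropy` and this binning convention are illustrative, not part of the problem.

```python
import math
from itertools import combinations

def spectral_entropy(primes, M=8):
    """Shannon entropy of the log-binned pairwise distance histogram
    of a prime multiset (an assumed reading of H, not fixed by the text)."""
    # positive pairwise distances within the multiset
    dists = [abs(p - q) for p, q in combinations(primes, 2) if p != q]
    if not dists:
        return 0.0
    lo, hi = min(dists), max(dists)
    if hi == lo:  # a single occupied bin carries zero entropy
        return 0.0
    # M geometrically spaced ("log-binned") buckets covering [lo, hi]
    counts = [0] * M
    for d in dists:
        i = int(M * (math.log(d) - math.log(lo)) / (math.log(hi) - math.log(lo)))
        counts[min(i, M - 1)] += 1
    n = len(dists)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)
```

Under this reading, 0 ≤ H(𝔭) ≤ log M for every multiset 𝔭, so each ν_m is supported in the fixed compact interval [0, log M]; this uniform bound is exactly the kind of ingredient from which tightness of {ν_m} follows.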
Conjecture [Ensemble-Level Entropy Tightness]. Fix a logarithmic resolution $M \ge 2$. Let $\mathcal P_m$ denote a collection of prime multisets of fixed size $m$, and define the empirical entropy distribution
\[
\nu_m := \frac{1}{|\mathcal P_m|} \sum_{\mathbf p \in \mathcal P_m} \delta_{H(\mathbf p)}.
\]
Then the family $\{\nu_m\}_{m\ge 1}$ is tight. Moreover, after centering by the Poisson null-model entropy, $\nu_m$ converges weakly to a limiting probability measure as $m\to\infty$.
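The centering step in the conjecture can be probed numerically. The sketch below is a Monte Carlo illustration under stated assumptions: the ensemble $\mathcal P_m$ is sampled as random size-$m$ multisets from the primes up to $N$, the Poisson null model is taken to be $m$ uniform points on $[0, N]$ (a homogeneous Poisson process conditioned on its count), and the entropy definition matches the log-binned reading above; the helper names and parameters are all hypothetical.

```python
import math
import random
from itertools import combinations

def log_binned_entropy(points, M=8):
    """Entropy of the M-bucket geometric histogram of pairwise distances
    (same assumed definition of H as in the conjecture's setting)."""
    d = [abs(x - y) for x, y in combinations(points, 2) if x != y]
    if not d or min(d) == max(d):
        return 0.0
    lo, hi = math.log(min(d)), math.log(max(d))
    counts = [0] * M
    for v in d:
        counts[min(int(M * (math.log(v) - lo) / (hi - lo)), M - 1)] += 1
    n = len(d)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

def primes_up_to(N):
    """Sieve of Eratosthenes."""
    flags = [True] * (N + 1)
    flags[0] = flags[1] = False
    for i in range(2, int(N ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = [False] * len(flags[i * i :: i])
    return [i for i, f in enumerate(flags) if f]

def centered_entropy_sample(m, N=10_000, trials=200, M=8, seed=0):
    """Monte Carlo draw from the centered empirical measure: entropies of
    random size-m prime multisets, minus the mean entropy of the assumed
    Poisson null model (m uniform points on [0, N])."""
    rng = random.Random(seed)
    pool = primes_up_to(N)
    null = sum(
        log_binned_entropy([rng.uniform(0, N) for _ in range(m)], M)
        for _ in range(trials)
    ) / trials
    return [
        log_binned_entropy(rng.choices(pool, k=m), M) - null
        for _ in range(trials)
    ]
```

Since both the prime-multiset entropy and the null entropy lie in $[0, \log M]$, every centered sample lies in $[-\log M, \log M]$; plotting histograms of `centered_entropy_sample(m)` for growing $m$ is one way to look for the conjectured weak limit.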