Tightness and weak limit of empirical entropy distributions over prime multisets

Prove that for each m ≥ 1 and fixed logarithmic resolution M ≥ 2, if P_m is any collection of prime multisets of size m and ν_m = (1/|P_m|) ∑_{𝔭∈P_m} δ_{H(𝔭)} is the empirical distribution of the spectral entropy H(𝔭) computed from the log-binned distance distribution associated with 𝔭, then the family {ν_m}_{m≥1} is tight; moreover, prove that after centering by the Poisson null-model entropy, ν_m converges weakly to a limiting probability measure as m → ∞.
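Below is a minimal numerical sketch of the objects in the statement, assuming H(𝔭) is the Shannon entropy of the normalized histogram of pairwise distances of 𝔭 binned on a logarithmic grid at resolution M (here read as M bins per decade; the paper's exact binning convention may differ). The function names `spectral_entropy` and `empirical_entropy_atoms` are illustrative, not from the source.

```python
import numpy as np
from itertools import combinations

def spectral_entropy(points, M=8):
    """Shannon entropy (in nats) of the log-binned pairwise-distance histogram.

    Hypothetical reading of H(p): pairwise distances |p_i - p_j| over distinct
    elements are binned on a logarithmic grid with M bins per decade, and the
    entropy of the normalized histogram is returned.  Assumes the multiset
    contains at least two distinct values.
    """
    d = np.array([abs(a - b) for a, b in combinations(points, 2) if a != b],
                 dtype=float)
    logs = np.log10(d)
    # Logarithmic bin edges at resolution M, covering the observed range.
    lo = np.floor(logs.min() * M)
    hi = max(np.ceil(logs.max() * M) + 1, lo + 2)  # guarantee at least one bin
    edges = np.arange(lo, hi) / M
    counts, _ = np.histogram(logs, bins=edges)
    q = counts[counts > 0] / counts.sum()
    return float(-np.sum(q * np.log(q)))

def empirical_entropy_atoms(multisets, M=8):
    """Atoms of nu_m: one entropy value H(p) per multiset p in the ensemble P_m."""
    return np.array([spectral_entropy(p, M) for p in multisets])
```

The empirical measure ν_m is then the uniform distribution on the returned atoms; tightness asks that these atoms do not escape to infinity in probability as m grows.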

Background

Beyond single-prime statistics, the paper considers aggregations over multisets of primes and studies the induced distribution of entropy values across ensembles.

The conjecture posits both tightness and a nontrivial limiting law (after centering by the Poisson null-model entropy), suggesting a statistical limit theory for spectral entropy in structured point processes.
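As a concrete illustration of the centering step, the sketch below estimates a Poisson null-model entropy by Monte Carlo and subtracts it from the atoms of ν_m, reusing the `spectral_entropy` sketch above. The choice of null model (m points uniform on an interval, i.e. a homogeneous Poisson sample conditioned on its count) and the function name `poisson_null_entropy` are assumptions for illustration; the paper's null model may be defined differently.

```python
import numpy as np

def poisson_null_entropy(m, length, M=8, trials=200, seed=0):
    """Monte Carlo estimate of a Poisson null-model entropy used for centering.

    Assumed null model: m points dropped uniformly at random on [0, length];
    the centering constant is the mean spectral entropy over independent draws.
    """
    rng = np.random.default_rng(seed)
    draws = rng.uniform(0.0, length, size=(trials, m))
    return float(np.mean([spectral_entropy(row, M) for row in draws]))

# Example: centre the atoms of nu_m before examining their limiting law.
# atoms    = empirical_entropy_atoms(multisets, M)            # H(p) for each p in P_m
# centered = atoms - poisson_null_entropy(m, length, M)       # subtract the null entropy
```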

References

Conjecture [Ensemble-Level Entropy Tightness] Let $\mathcal P_m$ denote a collection of prime multisets of fixed size $m$, and define the empirical entropy distribution

$$\nu_m := \frac{1}{|\mathcal P_m|} \sum_{\mathbf p \in \mathcal P_m} \delta_{H(\mathbf p)}.$$

Then the family $\{\nu_m\}_{m\ge 1}$ is tight. Moreover, after centering by the Poisson null-model entropy, $\nu_m$ converges weakly to a limiting probability measure as $m\to\infty$.

A Scale-Invariant Entropy Statistic for Distance Distributions (2604.02802 - Gewily, 3 Apr 2026) in Section 9: Questions and Conjectures