Sharp Concentration Inequalities: Phase Transition and Mixing of Orlicz Tails with Variance
Abstract: In this work, we develop sharp concentration inequalities for sub-Weibull random variables, a class that includes sub-Gaussian and sub-exponential distributions. Although the random variables may not be sub-Gaussian, their tail probability near the origin behaves as if they were, while the tail decay elsewhere aligns with the Orlicz $\Psi_\alpha$ tail. Specifically, for independent and identically distributed (i.i.d.) $\{X_i\}_{i=1}^n$ with finite Orlicz norm $\|X\|_{\Psi_\alpha}$, our theory unveils an interesting phase transition at $\alpha = 2$: the tail probability $\mathbb{P}\left(\left|\sum_{i=1}^n X_i\right| \geq t\right)$ with $t > 0$ is upper bounded by $2\exp\left(-C\max\left\{\frac{t^2}{n\|X\|_{\Psi_\alpha}^2}, \frac{t^\alpha}{n^{\alpha-1}\|X\|_{\Psi_\alpha}^\alpha}\right\}\right)$ for $\alpha \geq 2$, and by $2\exp\left(-C\min\left\{\frac{t^2}{n\|X\|_{\Psi_\alpha}^2}, \frac{t^\alpha}{n^{\alpha-1}\|X\|_{\Psi_\alpha}^\alpha}\right\}\right)$ for $1 \leq \alpha \leq 2$, where $C$ is a positive constant. In many scenarios it is necessary to distinguish the standard deviation from the Orlicz norm, since the latter can greatly exceed the former. To accommodate this, we build a new theoretical analysis framework, and our sharp, flexible concentration inequalities involve the variance and a mixing of Orlicz $\Psi_\alpha$ tails through the min and max functions. Our theory yields new, improved concentration inequalities even for sub-Gaussian and sub-exponential distributions ($\alpha = 2$ and $\alpha = 1$, respectively). We further demonstrate our theory on martingales, random vectors, random matrices, and covariance matrix estimation. These sharp concentration inequalities can empower more precise non-asymptotic analyses across statistical and machine learning applications.
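To make the phase transition at $\alpha = 2$ concrete, the following sketch evaluates the abstract's upper bound for a given deviation level $t$, sample size $n$, tail index $\alpha$, and Orlicz norm $K = \|X\|_{\Psi_\alpha}$. The constant `C` is unspecified in the abstract, so `C=1.0` here is purely an illustrative assumption, as is the function name.

```python
import math

def subweibull_tail_bound(t, n, alpha, K, C=1.0):
    """Illustrative upper bound on P(|sum_{i=1}^n X_i| >= t) for i.i.d.
    sub-Weibull X_i with Orlicz norm ||X||_{Psi_alpha} = K, following
    the bounds stated in the abstract. C is an unspecified positive
    constant; C = 1.0 is an arbitrary illustrative choice."""
    gaussian_term = t**2 / (n * K**2)  # sub-Gaussian behavior near the origin
    orlicz_term = t**alpha / (n**(alpha - 1) * K**alpha)  # Psi_alpha tail regime
    # Phase transition at alpha = 2: the terms mix via max for alpha >= 2
    # and via min for 1 <= alpha <= 2.
    combine = max if alpha >= 2 else min
    return 2 * math.exp(-C * combine(gaussian_term, orlicz_term))

# At alpha = 2 the two terms coincide (both equal t^2 / (n K^2)),
# so the max- and min-type bounds agree, which is the transition point.
print(subweibull_tail_bound(t=20, n=100, alpha=2, K=1))
```

Note that the two exponents cross exactly when $t^2/(n K^2) = t^\alpha/(n^{\alpha-1} K^\alpha)$, i.e. at $t = nK$, so the sub-Gaussian term governs moderate deviations and the $\Psi_\alpha$ term governs the large-deviation regime.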