Exponential Concentration Inequalities For Independent Random Vectors Under Sublinear Expectations
Abstract: Li and Hu recently established variance-type O(1/n) bounds for the sample mean of independent random vectors under sublinear expectations. We extend their results to the exponential concentration regime. For bounded, independent ℝ^d-valued random vectors under a regular sublinear expectation, we prove: (i) a general concentration principle that reduces vector-valued tail bounds to scalar martingale inequalities via a three-layer architecture; (ii) an Azuma-Hoeffding inequality showing that the distance from the sample mean to the Minkowski average of the expectation sets has sub-Gaussian tails; (iii) a Bernstein inequality incorporating the variance parameter of Li and Hu, interpolating between sub-Gaussian and sub-exponential regimes; (iv) a dimension-free bound replacing the exponential covering prefactor with a polynomial one via the matrix Freedman inequality; and (v) an explicit construction demonstrating that the sub-Gaussian rate is optimal. To the best of our knowledge, these constitute the first exponential concentration inequalities for the multivariate sample mean under sublinear expectations in terms of the set-valued distance to the Minkowski average.
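To fix ideas, a generic template for the Azuma-Hoeffding-type bound in (ii) can be sketched as follows. The specific capacity, constants, and prefactor used in the paper are not reproduced here; the symbols M, c, and C_d below are placeholders for illustration only.

```latex
% Setup: X_1,\dots,X_n independent, \|X_i\| \le M, under a regular
% sublinear expectation \hat{\mathbb{E}} with associated upper capacity V.
% \Theta_i \subset \mathbb{R}^d denotes the expectation set of X_i, and
% \bar{\Theta}_n = \tfrac{1}{n}\sum_{i=1}^n \Theta_i its Minkowski average.
% A sub-Gaussian tail bound for the sample mean \bar{X}_n then takes the
% generic form (constants c, C_d are illustrative placeholders):
\[
  V\!\left( d\!\left( \bar{X}_n,\, \bar{\Theta}_n \right) \ge t \right)
  \;\le\; C_d \, \exp\!\left( - \frac{c\, n\, t^2}{M^2} \right),
  \qquad t > 0,
\]
% where d(x, A) = \inf_{a \in A} \|x - a\| is the point-to-set distance.
% Result (iv) corresponds to replacing an exponential-in-d prefactor C_d
% by one that grows only polynomially in d.
```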