
Gaussian mixtures: entropy and geometric inequalities

Published 15 Nov 2016 in math.PR, cs.IT, math.FA, math.IT, and math.MG | (1611.04921v3)

Abstract: A symmetric random variable is called a Gaussian mixture if it has the same distribution as the product of two independent random variables, one being positive and the other a standard Gaussian random variable. Examples of Gaussian mixtures include random variables with densities proportional to $e^{-|t|^p}$ and symmetric $p$-stable random variables, where $p\in(0,2]$. We obtain various sharp moment and entropy comparison estimates for weighted sums of independent Gaussian mixtures and investigate extensions of the B-inequality and the Gaussian correlation inequality in the context of Gaussian mixtures. We also obtain a correlation inequality for symmetric geodesically convex sets in the unit sphere equipped with the normalized surface area measure. We then apply these results to derive sharp constants in Khintchine inequalities for vectors uniformly distributed on the unit balls with respect to $p$-norms and provide short proofs to new and old comparison estimates for geometric parameters of sections and projections of such balls.
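The defining factorization is easy to illustrate numerically. As a sketch (the function name is ours, not the paper's): for the $p=1$ case, the Laplace density $\frac{1}{2}e^{-|t|}$ arises as a Gaussian mixture by multiplying a standard Gaussian by the square root of an independent exponential variable of mean $2$, since $\mathbb{E}\,e^{-Wt^2/2} = 1/(1+t^2)$ is the Laplace characteristic function when $W$ is exponential with mean $2$.

```python
import numpy as np

def sample_laplace_via_gaussian_mixture(n, rng):
    """Sample the standard Laplace law as a Gaussian mixture.

    If W ~ Exponential(mean 2) and Z ~ N(0, 1) are independent, then
    sqrt(W) * Z has density (1/2) * exp(-|t|), i.e. it is Laplace(0, 1).
    """
    w = rng.exponential(scale=2.0, size=n)  # positive mixing factor V^2
    z = rng.standard_normal(n)              # standard Gaussian factor
    return np.sqrt(w) * z

rng = np.random.default_rng(0)
x = sample_laplace_via_gaussian_mixture(500_000, rng)
# Laplace(0, 1) has mean 0 and variance 2; the empirical moments agree.
print(float(x.mean()), float(x.var()))
```

The same product construction yields the symmetric $p$-stable laws for $p \in (0, 2]$ with a suitable positive mixing variable in place of the exponential one.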

Citations (55)
