
Gaussian-width gradient complexity, reverse log-Sobolev inequalities and nonlinear large deviations

Published 13 Dec 2016 in math.PR and math.FA | (arXiv:1612.04346v6)

Abstract: We prove structure theorems for measures on the discrete cube and on Gaussian space, which provide sufficient conditions for mean-field behavior. These conditions rely on a new notion of complexity for such measures, namely the Gaussian-width of the gradient of the log-density. On the cube $\{-1,1\}^n$, we show that a measure $\nu$ which exhibits low complexity can be written as a mixture of measures $\{\nu_\theta\}_{\theta \in \mathcal{I}}$ such that: i. for each $\theta$, the measure $\nu_\theta$ is a small perturbation of $\nu$ such that $\log \tfrac{d \nu_\theta}{d \nu}$ is a linear function whose gradient is small, and ii. $\nu_\theta$ is close to some product measure, in Wasserstein distance, for most $\theta$. Thus, our framework can be used to study the behavior of low-complexity measures beyond approximation of the partition function, showing that those measures are roughly mixtures of product measures whose entropy is close to that of the original measure. In particular, as a corollary of our theorems, we derive a bound for the na\"ive mean-field approximation of the log-partition function which improves the nonlinear large deviation framework of Chatterjee and Dembo in several ways: 1. It does not require any bounds on second derivatives. 2. The covering number is replaced by the weaker notion of Gaussian-width. 3. We obtain stronger asymptotics with respect to the dimension. Two other corollaries are decomposition theorems for exponential random graphs and large-degree Ising models. In the Gaussian case, we show that measures of low complexity exhibit an almost-tight reverse log-Sobolev inequality.
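The complexity notion at the heart of the abstract is the Gaussian width $w(K) = \mathbb{E}_g \sup_{v \in K} \langle g, v \rangle$ of the set $K$ of gradients of the log-density. As an illustrative, hedged sketch (not from the paper): for the Curie-Weiss measure on $\{-1,1\}^n$ with $f(x) = \tfrac{\beta}{n}\sum_{i<j} x_i x_j$, one can crudely lower-bound this width by Monte Carlo, sampling Gaussian vectors and maximizing over sampled cube points. All function names and sampling sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
beta = 1.0

def grad_f(x):
    # Gradient of the Curie-Weiss Hamiltonian f(x) = (beta/n) * sum_{i<j} x_i x_j:
    # the i-th partial derivative is (beta/n) * sum_{j != i} x_j.
    return (beta / n) * (x.sum() - x)

def gaussian_width_estimate(n_gauss=200, n_points=2000):
    """Crude Monte Carlo lower bound on w(K) for K = {grad_f(x) : x in {-1,1}^n}.

    w(K) = E_g sup_{v in K} <g, v>; the sup is approximated over randomly
    sampled cube points, so this underestimates the true width.
    """
    X = rng.choice([-1.0, 1.0], size=(n_points, n))   # sampled cube points
    G = rng.standard_normal((n_gauss, n))             # Gaussian test vectors
    V = np.array([grad_f(x) for x in X])              # (n_points, n) gradients
    return float(np.mean(np.max(G @ V.T, axis=1)))    # average of sampled suprema

w = gaussian_width_estimate()
print(f"estimated Gaussian width: {w:.3f} for n = {n}")
```

For this model the gradient set is nearly one-dimensional (each gradient is close to a multiple of the all-ones vector), so the estimate grows like $\sqrt{n}$ rather than $n$: a "low-complexity" measure in the sense of the abstract, for which the theorems predict near-product mixture behavior.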

Authors (1)
