Monotonicity of Composed Measures

Updated 16 December 2025
  • Monotonicity of composed measures is the study of how deterministic and stochastic operations like convolution and conditioning preserve partial orders and functional properties.
  • Techniques such as Fourier analysis, kernel representations, and Laplace smoothing classify monotone functionals across convolution semigroups and measure compositions.
  • Applications span stochastic ordering, convex geometry, and information theory, offering insights into risk measures, functional inequalities, and entropic properties.

The monotonicity of composed measures is a unifying theme in modern probability, analysis, and information theory, capturing how deterministic or stochastic operations on measures preserve or induce partial orders, convexity/concavity, or monotonicity properties in associated functionals. This topic encompasses the classification of monotone homomorphisms on measure semigroups, Schur-monotonicity for integrated functionals under product and convolution operations, monotonicity properties under measure compositions along convex combinations, the behavior of entropic and extropic quantities under marginalization and conditioning, and kernel-analytic representations of such monotonicity phenomena. These structures are central to deep results in stochastic ordering, convex geometry, functional inequalities, and the theory of information measures.

1. Foundational Structures: Semigroups, Orderings, and Functionals

A core analytic setting involves the convolution semigroup $\mathcal{S}\subseteq P(\mathbb{R})$, where $P(\mathbb{R})$ denotes the Borel probability measures on $\mathbb{R}$. The operation $(\mu*\nu)(A) = \int \mu(A-y)\,d\nu(y)$ defines the semigroup structure, with $\delta_0$ as identity. The stochastic (monotone) order $\le_{st}$ is defined by

$$\mu \le_{st} \nu \iff \forall x:\ \mu((-\infty,x])\ge \nu((-\infty,x]).$$

This order plays a crucial role in defining monotonicity of functions $\varphi: \mathcal{S} \to \mathbb{R}$, especially those that are homomorphisms under convolution, i.e.,

$$\varphi(\mu*\nu) = \varphi(\mu) + \varphi(\nu).$$

Such monotone homomorphisms form the structural backbone for many monotonicity properties of composed measures; scalar multiples of the expectation are the unique solutions on moment and Cramér-type semigroups with finite $p$-th moments for $p\ge 1$, while only the zero map survives on the full space $P(\mathbb{R})$ or on the moment semigroups with $0<p<1$ (Fritz et al., 2019).
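
As a concrete illustration of these definitions, the following minimal numerical sketch (not taken from the cited work; the discrete grid and example measures are illustrative assumptions) checks, for discrete measures, that the expectation is additive under convolution and monotone with respect to $\le_{st}$.

```python
import numpy as np

def convolve(p, q):
    """Convolution of two pmfs supported on {0, 1, ..., len-1}."""
    return np.convolve(p, q)

def mean(p):
    return float(np.dot(np.arange(len(p)), p))

def st_leq(p, q):
    """True if p <=_st q, i.e. the CDF of p dominates the CDF of q pointwise."""
    n = max(len(p), len(q))
    P = np.cumsum(np.pad(p, (0, n - len(p))))
    Q = np.cumsum(np.pad(q, (0, n - len(q))))
    return bool(np.all(P >= Q - 1e-12))

mu = np.array([0.5, 0.3, 0.2])        # measure on {0, 1, 2}
nu = np.array([0.1, 0.2, 0.3, 0.4])   # measure on {0, 1, 2, 3}

# Homomorphism property of the expectation: E[mu * nu] = E[mu] + E[nu].
assert abs(mean(convolve(mu, nu)) - (mean(mu) + mean(nu))) < 1e-12

# Monotonicity: pushing mass to the right (stochastic dominance) increases the mean.
nu_shifted = np.concatenate(([0.0], nu))   # nu translated one step to the right
assert st_leq(nu, nu_shifted) and mean(nu) <= mean(nu_shifted)
```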

In higher-order settings, monotonicity can be induced via functional inequalities of the form

$$\int_{[0,1]} f((1-t)x+ty)\,d\mu(t)\geq 0,\quad x<y,$$

where $\mu$ is a signed bounded Borel measure on $[0,1]$. The necessary and sufficient conditions for such monotonicity involve higher-order monotonicity of $f$ (i.e., $f$ being $k$-increasing for various $k$) and the sign structure of the generalized moments of $\mu$ (Páles et al., 27 Mar 2025).
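
Two instructive special cases: taking $\mu=\delta_1-\delta_0$, the integral reduces to $f(y)-f(x)$, so the inequality holds for all $x<y$ exactly when $f$ is nondecreasing; taking $\mu=\delta_0+\delta_1-2\delta_{1/2}$, it becomes $f(x)+f(y)-2f(\tfrac{x+y}{2})\ge 0$, i.e. midpoint (Jensen) convexity of $f$.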

2. Monotone Homomorphisms and Classification on Convolution Semigroups

The main classification result is that, for any $1\le p<\infty$, monotone homomorphisms $\varphi$ on $L^p=\{\mu: \int|x|^p\,d\mu(x)<\infty\}$ or on the Cramér semigroup $P_{\mathrm{Cram}}(\mathbb{R})$ are of the form

$$\varphi(\mu)=c\,\mathbb{E}[\mu],\quad c\ge 0.$$

This result is obtained by showing that $\varphi$ is linear on Dirac masses (via additivity and stochastically monotone extension), combined with a catalytic Laplace smoothing lemma and mixing arguments, reducing all monotone homomorphisms to scalar multiples of the mean. On $P(\mathbb{R})$, or on the moment semigroups with $0<p<1$, only the zero homomorphism remains (Fritz et al., 2019). Intermediate semigroups, such as those defined by the sublinear tail functional $\psi(\mu)=\lim_{n\to\infty} n\,\mu((n,\infty))<\infty$, can admit distinct, non-expectational monotone additive functionals, fully determined by tail asymptotics.
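
The following Monte Carlo sketch (the Pareto example and thresholds are illustrative assumptions, not taken from the cited work) indicates how a tail functional of this kind can be additive under convolution: for Pareto-type measures with survival function $a/x$, one has $\psi(\mu)=a$, and the tail of the convolution of two such measures is asymptotically the sum of the individual tails.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pareto(scale, size):
    """Pareto-type sample with P(X > x) = scale / x for x >= scale (inverse-CDF sampling)."""
    return scale / (1.0 - rng.uniform(size=size))

a, b = 1.0, 2.0
n, N = 1_000.0, 2_000_000
x = sample_pareto(a, N)
y = sample_pareto(b, N)

psi_mu   = n * np.mean(x > n)        # ~ a
psi_nu   = n * np.mean(y > n)        # ~ b
psi_conv = n * np.mean(x + y > n)    # ~ a + b: the tail functional adds under convolution

print(psi_mu, psi_nu, psi_conv)      # roughly 1.0, 2.0, 3.0
```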

3. Fourier-Analytic Criteria and Schur Monotonicity of Product/Convolution Functionals

Monotonicity phenomena also appear in functionals associated to independent coordinates or products of measures, analyzed via their Fourier transforms. For a probability measure $\mu$ with density $f$ and positive, integrable Fourier transform $\phi(t)=\int e^{-2\pi i x t}f(x)\,dx$, and for an even, positive-definite $h:\mathbb{R}\to\mathbb{R}$, the functional

$$H(a) = \int_{\mathbb{R}^n} h(\langle a,x\rangle)\,\mu^n(dx)$$

satisfies:

  • If $r\mapsto \phi(rs)$ is log-convex for each $s$, then $H$ is log-convex and Schur-convex in $a$;
  • If it is log-concave, then $H$ is Schur-concave.

This criterion pins down the monotonicity behind moment and Khinchin-type inequalities for sums of i.i.d. random vectors, and extends to intersection-body norms in convex geometry, provided the corresponding Minkowski functionals are positive-definite distributions. The Fourier-analytic structure unifies the analysis of monotonicity for diverse classes of functionals under convolution, products, and linear mappings (Malliaris, 8 Apr 2025).
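
A small Monte Carlo sketch of the Schur-comparison phenomenon (the Laplace coordinates and $h(u)=\cos u$ are illustrative choices, not the setting of the cited paper; the sketch does not verify the Fourier hypothesis itself, only the resulting comparison on a concrete pair of weight vectors): for this pair, $H(a)=\prod_i (1+a_i^2)^{-1}$ in closed form, and the weight vector that majorizes the other in the squared coordinates gives the larger value, i.e. $H$ is Schur-convex in $(a_1^2,\dots,a_n^2)$ for this choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def H_mc(a, n_samples=2_000_000):
    """Monte Carlo estimate of H(a) = E[h(<a, X>)] with h = cos and iid standard Laplace X_i."""
    X = rng.laplace(size=(n_samples, len(a)))
    return float(np.mean(np.cos(X @ np.asarray(a))))

def H_exact(a):
    """Closed form for this choice: E[cos(sum_i a_i X_i)] = prod_i 1 / (1 + a_i^2)."""
    return float(np.prod(1.0 / (1.0 + np.asarray(a) ** 2)))

a_peaked = [1.0, 0.0]                  # (1, 0) majorizes (1/2, 1/2) in squared coordinates
a_flat   = [2 ** -0.5, 2 ** -0.5]

print(H_mc(a_peaked), H_exact(a_peaked))   # ~ 0.500
print(H_mc(a_flat),   H_exact(a_flat))     # ~ 0.444 (= 4/9): H is Schur-convex here
```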

4. Monotonicity in Conditional Laws: Kernel and Covariance Representations

Kernel techniques, originating with the Hoeffding–Shorack identity, represent covariances via integral kernels derived from the distribution function and are instrumental in analyzing the monotonicity of composed (conditional) measures. For $(X,Y)$ on $\mathbb{R}^2$ with density $h$, the conditional measure of $X$ given $X+Y=s$ has survival function $S_X(x;s)$, and under mild regularity, its derivative with respect to $s$ admits the representation

$$\partial_s S_X(x;s_0) = \int K_{\mu_{1,s_0}}(x,x')\,\bigl[\partial^2_{22}\varphi - \partial^2_{12}\varphi\bigr](x', s_0 - x')\,dx',$$

where $\varphi = -\log h$ and $K_{\mu_{1,s_0}}$ is the covariance kernel. Sufficient conditions on the Hessian of $\varphi$ yield monotonicity in $s$ of conditional expectations and survival functions. This generalizes Efron's monotonicity property from independent log-concave measures to broad classes of dependent measures, including copulas and mixtures, and yields quantitative lower bounds on the growth rates of conditional expectations (Saumard et al., 2017).
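
A minimal simulation sketch of Efron-type monotonicity (the Gamma example and binning are illustrative assumptions, not the kernel machinery of the cited paper): for independent log-concave $X\sim\mathrm{Gamma}(2)$ and $Y\sim\mathrm{Gamma}(3)$, the conditional mean $s\mapsto \mathbb{E}[X\mid X+Y=s]$ is nondecreasing; in this case it equals $2s/5$ exactly.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 2_000_000
X = rng.gamma(shape=2.0, size=N)   # log-concave
Y = rng.gamma(shape=3.0, size=N)   # log-concave, independent of X
S = X + Y

# Estimate E[X | S = s] by averaging X over narrow bins of S.
edges = np.linspace(1.0, 9.0, 17)
idx = np.digitize(S, edges)
cond_mean = np.array([X[idx == k].mean() for k in range(1, len(edges))])

print(np.round(cond_mean, 3))              # increases with s (exactly 2s/5 for this pair)
assert np.all(np.diff(cond_mean) > 0)      # Efron-type monotonicity in this example
```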

5. Measure Composition, Higher-order Monotonicity, and Induced Stochastic Orders

Measure composition via convolution, convex combination maps, and push-forwards can induce new partial orders and monotonicity regimes. Integral inequalities of the form

$$\int_{[0,1]}f((1-t)x+ty)\,d\mu(t)\ge 0,\quad \forall\, x<y,$$

hold precisely for $f$ in intersections of $k$-increasing function classes, provided the moment structure and support of $\mu$ satisfy explicit sign and vanishing conditions. This yields exact criteria for “higher-order” convex, monotone, and mixed orders for probability measures, systematically recovering and generalizing classical convex and stochastic orders (Páles et al., 27 Mar 2025). The composition formalism encapsulates, for example, Hermite–Hadamard-type inequalities, mixed-order convexity, and other classical inequalities as special cases of measure-induced orderings.
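
As a concrete instance of this formalism (a minimal numerical sketch; the test functions are illustrative), the signed measure $\mu=\tfrac12(\delta_0+\delta_1)-\lambda_{[0,1]}$, where $\lambda_{[0,1]}$ is Lebesgue measure on $[0,1]$, turns the inequality above into the right-hand Hermite–Hadamard inequality $\tfrac12(f(x)+f(y))\ge \tfrac{1}{y-x}\int_x^y f(u)\,du$, valid for convex $f$.

```python
import numpy as np

def composed_integral(f, x, y, n=100_000):
    """int_[0,1] f((1-t)x + t y) dmu(t) for mu = (delta_0 + delta_1)/2 - Lebesgue[0,1]."""
    atoms = 0.5 * (f(x) + f(y))
    t = (np.arange(n) + 0.5) / n                      # midpoint rule on [0, 1]
    smooth = float(np.mean(f((1 - t) * x + t * y)))   # = (1/(y-x)) * int_x^y f(u) du
    return atoms - smooth

x, y = -1.0, 2.0
print(composed_integral(np.exp, x, y))   # > 0: exp is convex, Hermite-Hadamard holds
print(composed_integral(np.sin, x, y))   # < 0 here: sin is not convex on [-1, 2]
```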

6. Monotonicity in Entropic and Extropic Measures

In information theory, composed measures arise in the monotonicity of linear entropic formulas under local operations. All linear combinations of von Neumann entropies that are monotone under local quantum (or classical) operations form a polyhedral convex cone—characterized exclusively by strong subadditivity (SSA)—with facet inequalities corresponding to conditional mutual informations and their nonnegative combinations. For any $n$-partite system, the exact structure of extremal monotone functionals and their symmetric reductions can be computed (Alhejji et al., 2018).
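
In the classical (Shannon) case the SSA facet is the nonnegativity of conditional mutual information, $I(A;C\mid B)=H(AB)+H(BC)-H(ABC)-H(B)\ge 0$; the following minimal sketch (a random joint distribution chosen purely for illustration) checks this on a tripartite distribution.

```python
import numpy as np

rng = np.random.default_rng(3)

def shannon(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_abc = rng.random((3, 4, 5))
p_abc /= p_abc.sum()                     # random joint distribution of (A, B, C)

H_abc = shannon(p_abc.ravel())
H_ab  = shannon(p_abc.sum(axis=2).ravel())
H_bc  = shannon(p_abc.sum(axis=0).ravel())
H_b   = shannon(p_abc.sum(axis=(0, 2)).ravel())

cmi = H_ab + H_bc - H_abc - H_b          # I(A;C|B), the SSA facet inequality
print(cmi)
assert cmi >= -1e-12
```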

Weighted conditional extropy admits monotonicity under convolution and composition, especially in the presence of log-concave densities. For the conditional weighted extropy $J_w(X\mid S)$ of $X$ given $X\in(c,d)$, monotonicity in the interval endpoints (nondecreasing if $F_X$ is log-concave) is established via analytic differentiation. For convolution-type random variables, such as $V=|Y_1-Y_2|$ for i.i.d. log-concave $Y_i$, monotonicity of the conditional weighted extropy is guaranteed, and it is partially increasing on the natural rectangle of support (Gupta et al., 2022).

7. Applications, Significance, and Scope

The monotonicity of composed measures underpins a broad array of results:

  • Classification of monotone functionals is crucial in the theory of aggregation, risk measures, and decision theory.
  • Schur monotonicity via Fourier criteria unifies results in high-dimensional convex geometry (intersection bodies, norm comparisons), functional inequalities (Khinchin’s inequalities), and moment comparison.
  • Kernel and composition techniques enable quantitative analysis in regression under error, statistical dependence modeling, and mixture models.
  • Measure-induced mixed-order convexities generalize classical inequalities and unlock new stochastic orderings.
  • Monotonicity cones for entropy and extropy, fully determined by classical information inequalities, are foundational in quantum information, secret sharing, and resource theories.

A plausible implication is that, beyond these structural regimes, nontrivial monotonic properties of composed measures arise only under specific growth, smoothness, or symmetry restrictions on the semigroup or the composing measure. The current theory provides a complete classification of such monotonicity in various analytic, probabilistic, and information-theoretic frameworks, with extensions emerging in complex measure composition and functional analysis.
