
Infimal Convolution in Convex Analysis

Updated 27 January 2026
  • Infimal convolution is a fundamental operation in convex analysis that combines two cost functions by selecting the lowest aggregated value over all possible splits.
  • It underpins key methods in smoothing, duality, and regularization, facilitating robust approaches in statistics, imaging, and optimal transport.
  • Its versatile applications include the formulation of proximal maps, metric convolutions, and optimization schemes for handling noise and enhancing image processing.

Infimal convolution is a fundamental operation in convex analysis, optimization, variational methods, and geometric analysis, which combines two (or more) extended-real-valued functions into a new function reflecting the "cheapest" way of splitting an argument between two costs. The infimal convolution is closely related to smoothing, duality, and regularization techniques and underlies robust statistics, optimal transport, nonsmooth analysis, PDE theory, and modern machine learning loss constructions.

1. Definition and Fundamental Properties

Given two functions $f, g: X \rightarrow (-\infty, +\infty]$ on a vector space $X$, the infimal convolution $f \infconv g$ is defined as

$(f \infconv g)(x) := \inf_{y \in X} \left\{ f(y) + g(x - y) \right\}.$

This operation is commutative ($f \infconv g = g \infconv f$) and associative. For convex functions, the infimal convolution inherits convexity, and $\mathrm{dom}(f\infconv g) = \mathrm{dom}\, f + \mathrm{dom}\, g$. If $f$ and $g$ are proper, convex, and lower semicontinuous (l.s.c.), then under mild coercivity assumptions guaranteeing that the infimum is attained, $f\infconv g$ is proper and l.s.c. as well (Lambert et al., 2022, Nam et al., 2014, Burke et al., 2019).

Crucially, in convex analysis, the strict epigraph of $f \infconv g$ is the Minkowski sum of the strict epigraphs of $f$ and $g$; when the infimum is attained (exact infimal convolution), the identity holds for the epigraphs themselves: $\mathrm{epi} (f \infconv g) = \mathrm{epi}\, f + \mathrm{epi}\, g$.
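
On a discretized domain the definition can be checked directly. The following sketch (a brute-force discretization in plain NumPy, with grid size and random data chosen purely for illustration) computes a discrete infimal convolution on the grid $x_i = ih$, $i = 0, \ldots, n-1$, and verifies commutativity:

```python
import numpy as np

def inf_conv(f, g):
    """Discrete infimal convolution of samples on the grid x_i = i*h:
    (f inf-conv g)[i] = min_j { f[j] + g[i-j] }, with g = +inf off-grid."""
    n = len(f)
    out = np.full(n, np.inf)
    for i in range(n):
        for j in range(i + 1):            # keep the index i - j inside [0, n)
            out[i] = min(out[i], f[j] + g[i - j])
    return out

rng = np.random.default_rng(0)
f, g = rng.random(30), rng.random(30)

fg = inf_conv(f, g)
# commutativity: f inf-conv g = g inf-conv f
assert np.allclose(fg, inf_conv(g, f))
# any particular split is an upper bound, e.g. the split y = x (j = i)
assert np.all(fg <= f + g[0] + 1e-12)
```

The double loop makes the "cheapest split" reading explicit; in practice one would vectorize or exploit convexity (e.g., linear-time lower convex envelopes) instead.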

The infimal convolution generalizes to $m$ functions $f_1, \ldots, f_m$ via

$(f_1 \infconv \cdots \infconv f_m)(x) = \inf_{\substack{x = u_1 + \cdots + u_m}} \left( \sum_{i=1}^m f_i(u_i)\right),$

and this formulation provides flexibility for constructing complex composite penalties or data fidelities (Formica et al., 2021, Bredies et al., 2023).

2. Duality and Connections with Convex Analysis

A cornerstone property is the Fenchel–Rockafellar conjugacy: $(f \infconv g)^* = f^* + g^*,$ where $f^*$ is the Fenchel conjugate. Conversely, addition in the primal yields infimal convolution in the dual: $(f + g)^* = f^* \infconv g^*$ (up to closure, under standard qualification conditions). These identities are essential for deriving dual problems in variational and optimal control contexts and underlie the Moreau–Rockafellar theory of regularization (Burke et al., 2019, Mahmudov, 2019, Tibshirani et al., 2024).
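
The identity $(f \infconv g)^* = f^* + g^*$ can be observed numerically. The sketch below is a discretized illustration under stated assumptions (symmetric uniform grid so differences stay on the grid, slopes restricted to $[-1,1]$, and the specific choices $f = |x|$, $g = x^2/2$); it compares the discrete conjugate of the infimal convolution with the sum of the conjugates:

```python
import numpy as np

x = np.linspace(-3, 3, 61)     # primal grid (symmetric: x_i - x_j stays on grid)
s = np.linspace(-1, 1, 21)     # dual grid of slopes, inside dom f* = [-1, 1]
f = np.abs(x)                  # f(x) = |x|,    f*(s) = 0 on [-1, 1]
g = 0.5 * x**2                 # g(x) = x^2/2,  g*(s) = s^2/2

def conjugate(h):
    # discrete Fenchel conjugate h*(s) = max_x { s*x - h(x) }
    return np.max(s[:, None] * x[None, :] - h[None, :], axis=1)

def inf_conv(f, g):
    n, c = len(x), len(x) // 2          # c = index of 0 on the grid
    out = np.full(n, np.inf)
    for i in range(n):
        for j in range(n):
            k = i - j + c               # index of x_i - x_j
            if 0 <= k < n:
                out[i] = min(out[i], f[j] + g[k])
    return out

lhs = conjugate(inf_conv(f, g))         # (f inf-conv g)*
rhs = conjugate(f) + conjugate(g)       # f* + g*
assert np.allclose(lhs, rhs, atol=1e-8)
```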

The Moreau envelope, a smooth approximation of $f$, is the infimal convolution with a rescaled squared norm: $f_\lambda(x) = \big(f \infconv \tfrac{1}{2\lambda}\| \cdot \|^2\big)(x).$ If $f$ is proper, convex, and l.s.c., then $f_\lambda$ is $C^1$ with $\tfrac{1}{\lambda}$-Lipschitz continuous gradient, and the proximal map is given by

$\mathrm{Prox}_{\lambda f}(x) = \operatorname*{arg\,min}_y \left\{ f(y) + \frac{1}{2\lambda}\|x-y\|^2 \right\}.$

This fundamental smoothing via infimal convolution is widely used in first-order optimization schemes (Tibshirani et al., 2024).
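
As a concrete scalar check: for $f = |\cdot|$ the proximal map is soft-thresholding and the Moreau envelope is the Huber function. The sketch below (standard closed forms; the grid and the value $\lambda = 0.5$ are illustrative choices) verifies the envelope both via the prox identity and by brute-force minimization:

```python
import numpy as np

def prox_abs(x, lam):
    # Prox_{lam |.|}(x): soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_abs(x, lam):
    # f_lam(x) = f(p) + |x - p|^2 / (2 lam)  with  p = Prox_{lam f}(x)
    p = prox_abs(x, lam)
    return np.abs(p) + (x - p) ** 2 / (2.0 * lam)

def huber(x, lam):
    # closed form of the envelope: quadratic near 0, linear tails
    return np.where(np.abs(x) <= lam, x**2 / (2.0 * lam), np.abs(x) - lam / 2.0)

x = np.linspace(-3.0, 3.0, 241)
assert np.allclose(moreau_abs(x, 0.5), huber(x, 0.5))

# brute-force infimal convolution agrees as well (lam = 0.5, so 1/(2 lam) = 1)
y = np.linspace(-3.0, 3.0, 2401)
brute = np.min(np.abs(y)[None, :] + (x[:, None] - y[None, :]) ** 2, axis=1)
assert np.allclose(brute, huber(x, 0.5), atol=1e-6)
```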

3. Subdifferential Calculus and Differentiability

Generalized differentiation of the infimal convolution has been characterized under mild regularity conditions. If $f$ is proper, l.s.c., and Lipschitz on $\mathrm{dom}\, f$, and $p$ is a proper, l.s.c., coercive gauge (i.e., subadditive and positively homogeneous) with $p(0)=0$, the Fréchet and Mordukhovich subdifferentials of $f\infconv p$ at $x$ are described by

$\hat{\partial}(f \infconv p)(x) = \hat{\partial} f(x) \cap [-\hat{\partial} p(0)],$

with analogous formulas for the limiting subdifferential, provided the minimizer set is a singleton or other regularity holds (Nam et al., 2014).

Strict differentiability of $f\infconv p$ at a point requires strict differentiability of either factor at the relevant location and single-valuedness of the minimizer map. This unifies Moreau envelopes, distance functions, minimal time functions, and other classical nonsmooth objects (Nam et al., 2014).

4. Applications in Regularization, Statistics, and Imaging

Infimal convolution is employed to build regularization functionals that balance different structural priors. For example, in imaging:

  • The family of $\mathrm{TVL}^{p}$ functionals,

$\mathrm{TVL}^{p}_{\alpha,\beta}(u) := \inf_{w} \left\{ \alpha \|D u - w\|_{\mathcal M} + \beta \|w\|_{L^{p}} \right\},$

interpolates between total variation (TV)-based regularization ($p=1$) and Huber-type smoothness ($p=2$), and in the limit $p=\infty$ it matches second-order TGV while providing enhanced preservation of piecewise-affine features (Burger et al., 2015).

  • The oscillation TGV model uses the infimal convolution over multiple oscillation directions to enable accurate texture-preserving reconstructions. The convexity and lower semicontinuity of the resulting regularizer ensure well-posedness and enable efficient primal-dual algorithms (Gao et al., 2017).

In robust statistics and regression, losses such as the Huber and $\varepsilon$-insensitive losses are constructed as infimal convolutions: $\text{Huber}_\kappa^p(f) = (\tfrac{1}{2}\|\,\cdot\,\|_Y^2) \infconv (\kappa\|\cdot\|_p)(f), \qquad \ell^p_\varepsilon(f) = (\tfrac{1}{2}\|\,\cdot\,\|_Y^2) \infconv \iota_{\|\,\cdot\,\|_p\leq\varepsilon}(f),$ yielding robust, sparse, or outlier-insensitive alternatives to the squared loss (Lambert et al., 2022).
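
In the scalar case the $\varepsilon$-insensitive construction has the closed form $\tfrac12\,\mathrm{dist}(x,[-\varepsilon,\varepsilon])^2$, since the infimal convolution of $\tfrac12|\cdot|^2$ with the indicator of $\{|\cdot|\le\varepsilon\}$ is a squared distance. A brute-force minimization over the constrained variable reproduces this (the value $\varepsilon=0.5$ and the grids are illustrative choices):

```python
import numpy as np

eps = 0.5
x = np.linspace(-2.0, 2.0, 81)

def eps_insensitive(x, eps):
    # closed form: (1/2) * dist(x, [-eps, eps])^2
    return 0.5 * np.maximum(np.abs(x) - eps, 0.0) ** 2

# (1/2 |.|^2) inf-conv (indicator of |.| <= eps):
# substitute z = x - y, so the infimum runs over |z| <= eps
z = np.linspace(-eps, eps, 2001)
brute = np.min(0.5 * (x[:, None] - z[None, :]) ** 2, axis=1)

assert np.allclose(brute, eps_insensitive(x, eps), atol=1e-8)
```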

In mixed-noise image denoising, the infimal-convolution (IC) data fidelity allows simultaneous adaptation to, e.g., Gaussian and Poisson noise, reflecting a data-driven decomposition of the residual into multiple channels (Calatroni et al., 2016, Toader et al., 2021).

Infinite infimal convolution regularization extends this approach to a continuous family of one-homogeneous functionals, processed via a measure-valued lifting. The theory guarantees sparsity of solutions (atomic supports), and efficient Frank–Wolfe algorithms can solve the resulting convex problems (Bredies et al., 2023).

5. Metric and Optimal Transport Convolutions

A metric analog of infimal convolution arises in the geometric context of distances. Given two extended metrics $d_1, d_2$ on a set $U$, the metric infimal convolution is

$(d_1 \nabla d_2)(z_0, z_1) = \inf_{y\in U} \left[ d_1(z_0, y) + d_2(y, z_1) \right].$
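
On a finite set $U$ this is just a min-plus ("tropical") product of distance matrices. A minimal sketch (with random points as an illustrative $U$) also checks that convolving a metric with itself returns the metric: taking $y = z_1$ gives "$\le$", and the triangle inequality gives "$\ge$":

```python
import numpy as np

def metric_inf_conv(d1, d2):
    # (d1 nabla d2)[i, k] = min_j { d1[i, j] + d2[j, k] }:
    # the min-plus ("tropical") matrix product of the distance matrices
    return np.min(d1[:, :, None] + d2[None, :, :], axis=1)

rng = np.random.default_rng(1)
pts = rng.random((6, 2))
# Euclidean metric on the finite set U = {pts[0], ..., pts[5]}
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

# d nabla d = d for any metric on U
assert np.allclose(metric_inf_conv(d, d), d)
```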

Notably, the Hellinger–Kantorovich metric $HK$ is expressed as

$HK^2(\mu, \nu) = \inf_{\eta} \left\{ He^2(\mu, \eta) + W^2(\eta, \nu) \right\},$

where $He$ is the Hellinger distance and $W$ the Wasserstein-2 distance. This expresses a composite geometry interpolating between unbalanced mass change (Hellinger) and transport (Wasserstein) mechanisms; the associated minimization has a rich duality and convex-analytic structure (Ponti et al., 17 Mar 2025).

In multi-marginal optimal transport, an infimal convolution cost yields the Wasserstein barycenter problem:

$\min_{\nu}\sum_{i=1}^N W_p^p(\nu, \mu_i) = \inf_{\gamma} \int \inf_{z} \sum_{i=1}^N c_i(x_i, z) \, d\gamma(x_1, \ldots, x_N),$

connecting barycenters, Benamou–Brenier dynamics, and barycentric measures via natural convex optimization (Krannich, 14 Dec 2025).

6. Smoothing, Approximation, and PDEs

Infimal convolution underpins nonlinear smoothing mechanisms. In optimization and PDEs, the Moreau envelope yields a smooth approximation of a nonsmooth function, and the Hopf–Lax formula provides viscosity solutions to Hamilton–Jacobi equations:

$u_t(x) = \inf_{y} \left[ u(y) + t \, L\big((x-y)/t\big) \right].$

Regularity, convergence, and Sobolev embedding properties induce strong smoothing effects, with sharp characterizations depending on the integrability and growth of $L$ (Luiro, 2012).
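
For convex $L$ the Hopf–Lax operator forms a semigroup: evolving to time $t+s$ equals evolving to time $t$ and then for time $s$. The discretized sketch below checks this numerically under stated assumptions ($L(v) = v^2/2$, initial datum $u_0(x) = |x|$, and a tolerance reflecting the $O(h^2)$ grid error):

```python
import numpy as np

x = np.linspace(-4.0, 4.0, 201)

def hopf_lax(u, t):
    # u_t(x_i) = min_j { u(x_j) + t * L((x_i - x_j)/t) },  L(v) = v^2/2
    cost = t * ((x[:, None] - x[None, :]) / t) ** 2 / 2.0
    return np.min(u[None, :] + cost, axis=1)

u0 = np.abs(x)
one_step  = hopf_lax(u0, 2.0)                  # evolve directly to t = 2
two_steps = hopf_lax(hopf_lax(u0, 1.0), 1.0)   # t = 1, then one more unit

# semigroup property, up to discretization error
assert np.allclose(one_step, two_steps, atol=1e-3)
```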

Laplace’s method gives a smoothing approximation to the infimal convolution: $L_\delta[f \infconv g](x) = -\delta \log \int e^{-(f(y) + g(x-y))/\delta} \, dy,$ with uniform smoothness for $\delta>0$ and gradient/Hessian structure determined by the kernel of integration. For small $\delta$, $L_\delta[f\infconv g](x)$ closely approximates $(f\infconv g)(x)$, enabling Monte Carlo and softmin-based optimization algorithms (Tibshirani et al., 2024).
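
A minimal sketch of this softmin smoothing, discretizing the integral by a Riemann sum on a symmetric grid (the choices $f = |x|$, $g = x^2/2$ and the $\delta$ values are illustrative), shows $L_\delta$ approaching the exact infimal convolution as $\delta \downarrow 0$:

```python
import numpy as np

x = np.linspace(-3.0, 3.0, 121)
h = x[1] - x[0]
c = len(x) // 2                 # index of 0 on the symmetric grid
f = np.abs(x)
g = 0.5 * x**2

def smoothed(delta):
    # L_delta[f inf-conv g](x_i) ~ -delta * log( sum_j exp(-(f_j + g_{i-j})/delta) * h )
    out = np.empty(len(x))
    for i in range(len(x)):
        k = i - np.arange(len(x)) + c        # indices of x_i - x_j
        ok = (k >= 0) & (k < len(x))
        v = f[ok] + g[k[ok]]
        m = v.min()                          # shift by the min for numerical stability
        out[i] = m - delta * np.log(np.sum(np.exp(-(v - m) / delta)) * h)
    return out

def exact():
    out = np.empty(len(x))
    for i in range(len(x)):
        k = i - np.arange(len(x)) + c
        ok = (k >= 0) & (k < len(x))
        out[i] = np.min(f[ok] + g[k[ok]])
    return out

err = lambda d: np.max(np.abs(smoothed(d) - exact()))
assert err(0.01) < 0.05          # close to the hard infimum
assert err(0.01) < err(0.1)      # and closer as delta decreases
```

The shift-by-the-minimum inside the log-sum-exp is the standard stabilization trick; without it the exponentials underflow for small $\delta$.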

In discrete and graph settings, variant notions replace the infimum over points by infima over probability measures; these enter non-Euclidean Hamilton–Jacobi equations and lead to discrete analogs of log–Sobolev and transport inequalities (Shu, 2015).

7. Quantitative Norm and Inequality Results

Infimal convolution operators satisfy sharp norm inequalities in various function spaces. In Lebesgue and Grand Lebesgue spaces, one has: $\|f_1 \infconv \cdots \infconv f_m\|_{L^p} \leq m^{d/p} \left( \prod_{j=1}^m \|f_j\|_{L^p} \right)^{1/m},$ with best possible constants and corresponding extensions to Orlicz, Lorentz, and mixed spaces, reflecting the growth of infimal convolution in high-dimensional and random settings (Formica et al., 2021, Rabier, 2015).

Integral inequalities for infimal convolution have implications for long-time behavior of Hamilton–Jacobi equations and necessary conditions for infimal-convolution equation solvability (Rabier, 2015).


Infimal convolution organizes and extends smoothing, regularization, duality, and variational schemes across convex analysis, PDE theory, optimization, and geometric analysis. Its structure is key for theoretical understanding and practical algorithmic design in robust statistics, signal/image processing, transport, and learning frameworks (Lambert et al., 2022, Gao et al., 2017, Burger et al., 2015, Ponti et al., 17 Mar 2025, Krannich, 14 Dec 2025).
