
Log-Barrier Riemannian Dynamics

Updated 22 January 2026
  • Log-Barrier Riemannian dynamics are defined through self-concordant barrier functions that induce a Hessian metric, offering an affine-invariant local geometry for constrained domains.
  • They underpin advanced sampling methods such as Riemannian Langevin diffusion and Hamiltonian Monte Carlo by adapting classical flows to the barrier-induced metric.
  • The framework supports scalable optimization and precise convergence analysis, with techniques like spectral sketching enabling efficient discretization and robust mixing time guarantees.

Log-barrier Riemannian dynamics refers to the interplay between geometric flows, sampling algorithms, and optimization procedures defined on domains equipped with a Riemannian metric induced by a log-barrier or other self-concordant barrier function. The log-barrier function, typically $\phi(x) = -\sum_i \log(s_i(x))$, where $s_i(x)$ encodes the distance to the constraint boundary, defines a Hessian metric $G(x) = \nabla^2\phi(x)$ that endows the feasible region with an affine-invariant local geometry. This framework is central to modern advances in constrained sampling, Riemannian MCMC architectures, and manifold-based optimization methods, providing convergence guarantees and efficient mixing bounds under precise geometric and analytic conditions.

1. Barrier-induced Riemannian Geometry

Let $P \subseteq \mathbb{R}^n$ denote an open polytope or, more generally, a convex body $K$ characterized via linear or semidefinite inequalities. The log-barrier function $\phi(x)$ diverges near the boundary, shaping the feasible region's interior as a Hessian manifold. The induced Riemannian metric is $G(x) = \nabla^2\phi(x)$, with local norm for tangent vectors $v \in T_xP$ given by $\|v\|_{G(x)}^2 = v^\top G(x)\, v = \sum_i [(a_i^\top v)/(a_i^\top x - b_i)]^2$ for polytopal constraints (Gatmiry et al., 2022). The Dikin ellipsoid at $x$ is defined by $E(x) = \{v : \|v\|_x \le 1\}$, encapsulating local curvature and barrier 'strength' (Gu et al., 2024). For spectrahedral domains, the barrier $\psi(X) = -\log\det S(X)$ leads to $\nabla^2\psi(X) = A(S^{-1}\otimes S^{-1})A^\top$.
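For a polytope $\{x : Ax > b\}$ this metric has the explicit form $G(x) = A^\top S(x)^{-2} A$, where $S(x)$ is the diagonal matrix of slacks, so it is cheap to form directly. The following sketch (illustrative NumPy code; the unit box is a made-up example domain, not one from the cited works) computes the barrier Hessian and the Dikin-ellipsoid norm:

```python
import numpy as np

def barrier_hessian(A, b, x):
    """Hessian metric G(x) = sum_i a_i a_i^T / s_i(x)^2 of the log-barrier
    phi(x) = -sum_i log(a_i^T x - b_i) over the polytope {x : Ax > b}."""
    s = A @ x - b                        # slacks s_i(x), positive in the interior
    assert np.all(s > 0), "x must be strictly feasible"
    return A.T @ (A / s[:, None]**2)     # A^T diag(1/s^2) A

def dikin_norm(A, b, x, v):
    """Local norm ||v||_{G(x)}; the Dikin ellipsoid is {v : norm <= 1}."""
    G = barrier_hessian(A, b, x)
    return np.sqrt(v @ G @ v)

# Example: the unit box [-1, 1]^2 written as Ax > b.
A = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
b = -np.ones(4)
x = np.zeros(2)
# At the center all slacks equal 1, so G(x) = A^T A = 2 I.
G = barrier_hessian(A, b, x)
```

At the center of the box the metric is a multiple of the identity; as $x$ approaches a facet, the corresponding slack shrinks and the ellipsoid flattens against the boundary.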

The log-barrier Hessian is self-concordant with explicit constants; for $G(x) = \nabla^2\phi(x)$, one obtains bounds such as $-2\|v\|_G\, G \preceq DG(v) \preceq 2\|v\|_G\, G$, together with higher-order derivative controls ensuring smooth variation of the metric and curvature. Such self-concordance is crucial for both algorithmic stability and convergence rate analysis (Gatmiry et al., 2022, Gatmiry et al., 2023).

2. Riemannian Langevin and Hamiltonian Flows

The Riemannian setting generalizes classical Langevin and Hamiltonian dynamics for sampling or optimization. The Riemannian Langevin diffusion is governed by

$$dX_t = \left[\nabla \cdot \big(G^{-1}(X_t)\big) - G^{-1}(X_t)\, Df(X_t)\right] dt + \sqrt{2}\, G^{-1/2}(X_t)\, dB_t$$

where $f(x)$ encodes the potential function and $G(x)$ the log-barrier metric. The invariant measure is $\nu(x) = e^{-F(x)}$ with $F(x) = -\log\pi(x) - \frac12\log\det G(x)$, integrating both the target density and the metric geometry (Gatmiry et al., 2022). The steepest-descent direction becomes $\nabla_G \log \pi = G^{-1}(x)\, D \log \pi$.
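A naive Euler–Maruyama discretization of this diffusion can be sketched as below. This is an illustration, not the discretization analyzed in the cited works: the divergence term $\nabla\cdot G^{-1}$ is approximated by central finite differences, which is only practical in low dimension.

```python
import numpy as np

def riemannian_ula_step(x, grad_f, G, rng, eta=1e-3, fd_eps=1e-5):
    """One Euler-Maruyama step of the Riemannian Langevin diffusion
    dX = [div(G^{-1}) - G^{-1} grad f] dt + sqrt(2) G^{-1/2} dB.
    `G` maps a point to the (symmetric positive definite) metric matrix."""
    d = x.size
    Ginv = np.linalg.inv(G(x))
    # Divergence term: (div G^{-1})_j = sum_i d/dx_i [G^{-1}(x)]_{ij},
    # approximated coordinate-wise by central finite differences.
    div = np.zeros(d)
    for i in range(d):
        e = np.zeros(d); e[i] = fd_eps
        dGinv = (np.linalg.inv(G(x + e)) - np.linalg.inv(G(x - e))) / (2 * fd_eps)
        div += dGinv[i, :]
    drift = div - Ginv @ grad_f(x)
    # G^{-1/2} via eigendecomposition of the symmetric matrix G^{-1}.
    w, V = np.linalg.eigh(Ginv)
    Ginv_half = V @ np.diag(np.sqrt(w)) @ V.T
    return x + eta * drift + np.sqrt(2 * eta) * Ginv_half @ rng.standard_normal(d)
```

For a constant metric the divergence term vanishes and the step reduces to preconditioned ULA, which is a useful sanity check.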

Riemannian Hamiltonian Monte Carlo (RHMC) instead exploits geodesics governed by the Hamiltonian $H(x, v) = f(x) + \frac12 v^\top G(x)^{-1} v + \frac12 \log \det G(x)$, with evolution equations

$$\dot{x} = G(x)^{-1} v, \qquad \dot{v} = -\nabla f(x) + \frac12 G(x)^{-1}\partial_x[\log \det G(x)] - \frac12 G(x)^{-1} DG(x)[\dot{x}, \dot{x}]$$

The intrinsic form $\nabla_{\dot{x}}\dot{x} = \mu(x)$ reveals the drift induced by both the target density and the metric's curvature (Gatmiry et al., 2023).

3. Discretization Algorithms and Mixing Rates

Unadjusted Langevin algorithms (ULA) and Metropolized Dikin-walk proposals allow practical implementation in the log-barrier geometry. Each step involves evaluating or approximating $G(x)$, generating $u \sim \mathcal{N}(0, H(x)^{-1})$ as a proposal, and accepting or rejecting it via the Metropolis–Hastings ratio. To manage computational complexity, spectral sketching provides $\widehat{H}(x) \approx G(x)$ within a $(1\pm\varepsilon)$ factor, ensuring nearly affine-invariant sampling with polynomial runtime (Gu et al., 2024).
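A minimal Metropolized Dikin-walk step for uniform sampling over a polytope might look as follows. This is an illustrative sketch without the spectral sketching of $G$; the step radius `r` is a hypothetical tuning parameter, not a value from the cited analyses.

```python
import numpy as np

def dikin_walk_step(A, b, x, rng, r=0.5):
    """One Metropolized Dikin-walk step toward uniform sampling on {x : Ax > b}.
    Proposal: y ~ N(x, (r^2/d) G(x)^{-1}), G the log-barrier Hessian."""
    d = x.size
    def hess(z):
        s = A @ z - b
        if np.any(s <= 0):
            return None                 # infeasible point
        return A.T @ (A / s[:, None]**2)
    def log_q(frm, to, G):
        # Log-density (up to a constant) of N(frm, (r^2/d) G^{-1}) at `to`.
        diff = to - frm
        return 0.5 * np.linalg.slogdet(G)[1] - (d / (2 * r**2)) * diff @ G @ diff
    Gx = hess(x)
    L = np.linalg.cholesky(np.linalg.inv(Gx))
    y = x + (r / np.sqrt(d)) * L @ rng.standard_normal(d)
    Gy = hess(y)
    if Gy is None:                      # proposal left the polytope: reject
        return x
    log_accept = log_q(y, x, Gy) - log_q(x, y, Gx)   # uniform target: q-ratio only
    return y if np.log(rng.uniform()) < log_accept else x
```

Because the proposal covariance shrinks with the Dikin ellipsoid near the boundary, the walk automatically takes small steps where the barrier is steep, which is the source of its affine invariance.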

Mixing time analysis leverages local conductance, isoperimetric inequalities in the cross-ratio metric induced by the barrier, and Lipschitz continuity of the proposal laws (Gu et al., 2024). For polytopes with $n$ facets and domain in $\mathbb{R}^d$, robust sampling via the Lee–Sidford barrier achieves mixing rates $\tilde{O}\big((d^2 + dL^2R^2)\log(w/\delta)\big)$, superseding the ball walk and hit-and-run for both polytopal and spectrahedral domains.

For Riemannian Langevin algorithms, nonasymptotic KL-convergence is established under log-Sobolev inequalities (with constant $\alpha$) and self-concordant barriers. With step size $\eta$ set according to explicit metric, dimension, and Lipschitz constants, one attains exponential KL-contraction:

$$H(\rho_k \,\|\, \pi) \leq \exp\!\left(-\frac{3}{16}\,\alpha\eta k\right) H(\rho_0 \,\|\, \pi) + O(\eta/\alpha)$$

The step-size restriction ensures control of geometric discretization error stemming from metric nonuniformity and curvature (Gatmiry et al., 2022).
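The contraction above directly yields an iteration-count estimate. As a standard bookkeeping step (not stated in this form in the cited work): choose $\eta$ small enough that the bias term $O(\eta/\alpha)$ is at most $\varepsilon/2$, then solve the exponential term for $k$,

```latex
\exp\!\left(-\tfrac{3}{16}\,\alpha\eta k\right) H(\rho_0\|\pi) \le \frac{\varepsilon}{2}
\quad\Longrightarrow\quad
k \;\ge\; \frac{16}{3\,\alpha\eta}\,\log\frac{2\,H(\rho_0\|\pi)}{\varepsilon},
```

so the iteration count scales as $\tilde{O}(1/(\alpha\eta))$ with only logarithmic dependence on the initial KL divergence and target accuracy.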

4. Interior-point Methods on Riemannian Manifolds

Log-barrier Riemannian dynamics extend to optimization via interior-point methods with inequality and equality constraints on manifolds (M,g)(M, g). The Riemannian gradient and Hessian operators adapt standard Newton and trust-region schemes:

  • Barrier objective: $F_\mu(x) = f(x) - \mu\sum_{i=1}^m \ln g_i(x)$, well-defined on the strictly feasible region.
  • Riemannian gradient: $\mathrm{grad}\,F_\mu(x) = \mathrm{grad}\,f(x) - \mu \sum_i (1/g_i(x))\, \mathrm{grad}\,g_i(x)$
  • Riemannian Hessian: $\mathrm{Hess}\,F_\mu(x)[\eta] = \mathrm{Hess}\,f(x)[\eta] - \mu \sum_i (1/g_i(x))\, \mathrm{Hess}\,g_i(x)[\eta] + \mu \sum_i (1/g_i(x)^2)\, \langle \mathrm{grad}\,g_i(x), \eta\rangle_x\, \mathrm{grad}\,g_i(x)$
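The gradient formula above can be sketched in the Euclidean special case, where the Riemannian gradient reduces to the ordinary one (on a general manifold each term would be a Riemannian gradient instead). The 1-D example below is illustrative: minimizing $f(x) = x^2$ subject to $g(x) = x > 0$, the central-path minimizer of $F_\mu$ is $x(\mu) = \sqrt{\mu/2}$, which approaches the constrained minimizer $x^* = 0$ as $\mu \to 0$.

```python
import numpy as np

def barrier_grad(grad_f, gs, grad_gs, x, mu):
    """Gradient of the barrier objective F_mu(x) = f(x) - mu * sum_i log g_i(x),
    in the Euclidean special case. `gs` / `grad_gs` are lists of the constraint
    functions g_i (required positive) and their gradients."""
    g = np.asarray(grad_f(x), dtype=float).copy()
    for gi, dgi in zip(gs, grad_gs):
        g -= (mu / gi(x)) * np.asarray(dgi(x), dtype=float)
    return g

# Example: f(x) = x^2 with the single constraint g(x) = x > 0.
# grad F_mu(x) = 2x - mu/x, which vanishes at x(mu) = sqrt(mu/2).
mu = 0.1
x_mu = np.sqrt(mu / 2)
g = barrier_grad(lambda x: 2 * x, [lambda x: x], [lambda x: 1.0], x_mu, mu)
```

The vanishing gradient at $x(\mu)$ confirms the central-path calculation; an interior-point method follows these minimizers as $\mu$ is driven to zero.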

By forming the KKT map and solving for the Newton step $(\Delta x_k, \Delta \lambda_k, \Delta \mu_k)$ using the block Jacobian, the next iterate $x_{k+1}$ is obtained via second-order retraction. Local convergence is ensured given LICQ (constraint qualification), strict complementarity, and second-order sufficiency (Obara et al., 26 May 2025).

With barrier parameter update $\mu_{k+1} = c\mu_k^{1+\tau}$, near-quadratic convergence is achievable; otherwise, linear reduction yields superlinear contraction towards the solution and corresponding multipliers. Second-order stationarity is maintained by verifying that the minimum eigenvalue of the Hessian model remains above $-O(\mu_k)$ (Obara et al., 26 May 2025).
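To see why the update $\mu_{k+1} = c\mu_k^{1+\tau}$ gives near-quadratic behaviour, consider the case $\tau = 1$ (a short side calculation, not taken from the cited paper): multiplying both sides of $\mu_{k+1} = c\mu_k^2$ by $c$ gives $c\mu_{k+1} = (c\mu_k)^2$, so by induction

```latex
c\,\mu_k = (c\,\mu_0)^{2^k}
\quad\Longrightarrow\quad
\mu_k = c^{\,2^k - 1}\,\mu_0^{\,2^k},
```

i.e. the barrier parameter decays doubly exponentially whenever $c\mu_0 < 1$, matching the quadratic contraction of the Newton iterates along the central path.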

5. Role of Self-Concordance and Curvature Bounds

Self-concordance of the log-barrier and related functions ensures both smoothness and stability of Riemannian algorithms. Explicit bounds on derivatives:

  • $\|D^k G(x)[u_1,\dots,u_k]\| \lesssim \prod_i \|u_i\|_x$ for $k = 1, 2, 3$
  • For the log-barrier, the constants $(\gamma_1, \gamma_2, \gamma_3)$ are $(2, 4, 6)$
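The first-order bound is easy to verify numerically in one dimension, where the log-barrier $\phi(x) = -\log x$ on $x > 0$ gives $G(x) = 1/x^2$ and the self-concordance inequality $|DG(x)[v]| \le 2\|v\|_{G(x)}\, G(x)$ holds with equality (an illustrative check, using finite differences for the directional derivative):

```python
import numpy as np

# 1-D log-barrier phi(x) = -log(x) on x > 0, with metric G(x) = 1/x^2.
def G(x):
    return 1.0 / x**2

def DG(x, v, eps=1e-6):
    # Directional derivative of G at x along v, by central finite differences.
    return (G(x + eps * v) - G(x - eps * v)) / (2 * eps)

x, v = 1.7, 0.3
lhs = abs(DG(x, v))               # exact value: 2|v| / x^3
rhs = 2 * (abs(v) / x) * G(x)     # 2 ||v||_{G(x)} G(x) = 2|v| / x^3
```

In one dimension the two sides coincide, so the constant 2 in the bound is tight, consistent with $\gamma_1 = 2$ above.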

These properties enable the Taylor expansions necessary for algorithmic error control, allowing step-size choices that keep the geometric discretization error of order $O(\eta)$ in dimension and curvature (Gatmiry et al., 2022, Gatmiry et al., 2023).

Smoothness of Hamiltonian curves (as in RHMC) is quantified via second-order variation equations, with curvature and drift normality constants controlling total variation in one-step proposals and yielding polynomial mixing time bounds (Gatmiry et al., 2023).

6. Summary of Practical Operation and Guarantees

Practical algorithms in log-barrier Riemannian dynamics proceed via:

  1. At each iterate, construct or sketch the local Hessian metric H^(x)\widehat H(x).
  2. Propose a step via local Gaussian using the metric inverse.
  3. Accept or reject the proposal by evaluating the barrier-induced Metropolis ratio.
  4. Repeat for an explicitly bounded number of iterations to achieve desired TV-error.

For optimization, a single Newton step per updated barrier parameter suffices for local superlinear or near-quadratic convergence under standard regularity conditions, without inner iterations (Obara et al., 26 May 2025).

Log-barrier Riemannian dynamics unify a spectrum of manifold-based algorithms, balancing affine invariance, efficient sampling, and robust constrained optimization. The framework leverages the geometry of self-concordant barriers, rigorous conductance and isoperimetric analysis, and modern matrix sketching techniques for scalable implementation and sharp convergence guarantees (Gatmiry et al., 2022, Gu et al., 2024, Gatmiry et al., 2023, Obara et al., 26 May 2025).
