
Hyperbolic-Guided Denoising

Updated 12 January 2026
  • Hyperbolic-guided denoising uses hyperbolic space to model data structures, overcoming non-convexity with PSD matrix relaxations.
  • The methodology applies convex optimization, including ADMM, for efficient denoising using geometric constraints on hyperbolic sheets.
  • Applications to Gaussian image processing show improved joint restoration of the pixelwise mean and variance.

Hyperbolic-guided denoising is a technique for restoring data whose underlying structure or features are naturally modeled in hyperbolic space, specifically the hyperbolic sheet denoted $H^d$. This approach leverages convex relaxations based on positive semidefinite (PSD) matrix representations, enabling efficient optimization for denoising tasks on hyperbolic-valued data. The central innovation is overcoming the non-convexity of the hyperbolic sheet by encoding its geometric and algebraic constraints through PSD block-matrices and subsequently applying convex optimization schemes such as ADMM. Applications include Gaussian image processing scenarios where both the pixelwise mean and variance are restored under a unified geometric model (Beinert et al., 2024).

1. Definition and Non-convexity of Hyperbolic Sheets

The $d$-dimensional affine hyperbolic sheet $H^d$ is embedded in $\mathbb{R}^{d+1}$ using the Minkowski bilinear form $(x, y)_m := \sum_{i=1}^d x_i y_i - x_{d+1} y_{d+1}$. The hyperboloid is then defined by

$$H^d := \left\{ x \in \mathbb{R}^{d+1} \mid (x, x)_m = -1 \text{ and } x_{d+1} > 0 \right\}$$

with the Riemannian distance $\operatorname{dist}_H(x, y) = \operatorname{acosh}(-(x, y)_m)$. Although $H^d$ geometrically resides in the half-space $\mathbb{R}^d \times [1, \infty)$, it is not convex in $\mathbb{R}^{d+1}$: convex combinations of two points on $H^d$ do not generally lie on $H^d$, which complicates direct variational minimization over the sheet.
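These definitions can be checked numerically. The following sketch (assuming NumPy; the `lift` helper and sample points are illustrative, not from the paper) evaluates the Minkowski form and the hyperbolic distance, and shows that the Euclidean midpoint of two points on $H^d$ leaves the sheet:

```python
import numpy as np

def minkowski(x, y):
    """Minkowski bilinear form (x, y)_m = sum_{i<=d} x_i y_i - x_{d+1} y_{d+1}."""
    return x[:-1] @ y[:-1] - x[-1] * y[-1]

def dist_H(x, y):
    """Hyperbolic distance dist_H(x, y) = acosh(-(x, y)_m) on the sheet H^d."""
    return np.arccosh(np.clip(-minkowski(x, y), 1.0, None))

def lift(v):
    """Lift v in R^d onto H^d by solving (x, x)_m = -1 for the last coordinate."""
    return np.append(v, np.sqrt(1.0 + v @ v))

x, y = lift(np.array([0.3, -0.2])), lift(np.array([1.0, 0.5]))
mid = 0.5 * (x + y)            # Euclidean midpoint of two sheet points

print(minkowski(x, x))         # approximately -1: x lies on H^d
print(minkowski(mid, mid))     # strictly below -1: the midpoint leaves H^d
print(dist_H(x, y))            # positive hyperbolic distance
```

Since $(\frac{x+y}{2}, \frac{x+y}{2})_m = \tfrac{1}{2}((x,y)_m - 1) < -1$ whenever $x \neq y$, the midpoint always violates the sheet constraint, which is exactly the non-convexity the relaxation must address.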

2. Euclidean Embedding Through PSD Block-Matrix Representation

To bypass the non-convex constraint $x \in H^d$, the framework introduces auxiliary scalar variables and encodes the geometric constraints (such as the hyperboloid identity $-(x, x)_m = 1$ and squared-norm constraints of the form $\|x\|_2^2 = \alpha$) via PSD block-matrices:

  • Tikhonov-type model: For each edge $(n, m) \in E$, introduce scalars $\alpha_n$, $\alpha_m$, $\beta_{(n,m)}$, $\gamma_{(n,m)}$ and construct a $(d+5) \times (d+5)$ block-matrix $A_{(n,m)}$ whose structure incorporates $x_n$, $x_m$, their 'mirrored' versions $\tilde{x} = (x_1, \ldots, x_d, -x_{d+1})^T$, and the scalars. The constraints $x_n, x_m \in H^d$ are equivalent to demanding $\operatorname{rank}(A_{(n,m)}) = d + 1$ and $A_{(n,m)} \succeq 0$.
  • Total variation (TV) model: For each vertex $n$, a unary $(d+3) \times (d+3)$ matrix $B_n$ is constructed similarly, encoding $x_n \in H^d$ and $\|x_n\|_2^2 = \alpha_n$ via $\operatorname{rank}(B_n) = d + 1$ and $B_n \succeq 0$.

This matrix relaxation directly encodes hyperbolic geometry into a convex feasible set defined by PSD conditions.

3. Denoising Energies in Hyperbolic Geometry

Two canonical variational energies are formulated:

  • Tikhonov-type energy:

$$E_{\mathrm{tik}}(u) = \frac{1}{2} \sum_{n \in V} \|u_n - f_n\|_2^2 + \frac{\lambda}{2} \sum_{(n,m) \in E} \|u_n - u_m\|_2^2$$

subject to $u_n \in H^d$.

  • Total Variation (TV) energy:

$$E_{\mathrm{TV}}(u) = \frac{1}{2} \sum_{n \in V} \|u_n - f_n\|_2^2 + \mu \sum_{(n,m) \in E} \|u_n - u_m\|_1$$

with the same geometric constraint.

These energies are rewritten as linear functions of the auxiliary variables plus quadratic forms in the coordinates. Hyperbolic constraints are enforced via assembled PSD block-matrices over nodes and/or edges.
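Ignoring the hyperbolic constraint for a moment, the two objectives can be written down directly. A minimal sketch on a chain graph with synthetic data (the graph, data, and parameter values are illustrative):

```python
import numpy as np

# Chain graph on 4 vertices; each u_n, f_n is a point in R^{d+1} (here d = 2)
V = range(4)
E = [(0, 1), (1, 2), (2, 3)]
rng = np.random.default_rng(0)
f = rng.normal(size=(4, 3))               # noisy input data
u = f + 0.1 * rng.normal(size=(4, 3))     # candidate restoration

def E_tik(u, f, lam):
    """Tikhonov-type energy: quadratic data term + quadratic edge coupling."""
    data = 0.5 * sum(np.sum((u[n] - f[n]) ** 2) for n in V)
    reg = 0.5 * lam * sum(np.sum((u[n] - u[m]) ** 2) for n, m in E)
    return data + reg

def E_tv(u, f, mu):
    """TV energy: quadratic data term + anisotropic (l1) edge differences."""
    data = 0.5 * sum(np.sum((u[n] - f[n]) ** 2) for n in V)
    reg = mu * sum(np.sum(np.abs(u[n] - u[m])) for n, m in E)
    return data + reg

print(E_tik(u, f, lam=1.5))
print(E_tv(u, f, mu=0.15))
```

The hyperbolic constraint $u_n \in H^d$ is what the PSD block-matrices of the previous section re-encode; without it these are ordinary convex problems on $\mathbb{R}^{d+1}$.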

4. Convex Relaxation and Semidefinite Programming

The non-convex rank constraints ($\operatorname{rank}(A_{(n,m)}) = d + 1$, and similarly for $B_n$) are dropped, leaving only the PSD conditions. The resulting feasible sets

$$\{A_{(n,m)} \succeq 0 \ \forall\, (n, m) \in E\} \quad \text{or} \quad \{B_n \succeq 0 \ \forall\, n \in V\}$$

are convex slices of the PSD cone. The variational objective comprises linear and quadratic terms of the block-matrix variables, yielding a convex semidefinite program that admits tractable solution methods.

5. ADMM-Based Solution Algorithm

Both relaxed denoising problems are cast into the form

$$\min\, F(U, X, \dots) + G(Z) \quad \text{s.t.} \quad L(U, X, \dots) = Z$$

where FF is a smooth convex function, GG enforces convex PSD constraints, and LL is linear.

The augmented Lagrangian is

$$\mathcal{L}_\rho(U, X, Z, \Lambda) = F(U, X) + I_{\mathrm{PSD}}(Z) + \langle \Lambda, L(U, X) - Z \rangle + \frac{\rho}{2} \|L(U, X) - Z\|_F^2$$

and the ADMM updates proceed by alternating minimization steps:

  • Update (U,X)(U, X) by solving decoupled small linear systems.
  • Project the $u$ variables onto the half-space $\{x_{d+1} \geq 1\}$.
  • Project block-matrices onto the PSD cone.
  • Update Lagrange multipliers.

Closed-form updates are derived for all steps (Theorems 3.1 and 3.2, Beinert et al., 2024). Convergence is guaranteed by classical ADMM results for convex problems.
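The two projection steps admit simple closed forms. A sketch (the helper names are illustrative, assuming NumPy): PSD projection is eigenvalue thresholding of the symmetrized matrix, and the half-space projection clips the last coordinate.

```python
import numpy as np

def project_psd(Z):
    """Project a (nearly) symmetric matrix onto the PSD cone by
    symmetrizing and clipping negative eigenvalues to zero."""
    w, Q = np.linalg.eigh(0.5 * (Z + Z.T))
    return (Q * np.clip(w, 0.0, None)) @ Q.T

def project_halfspace(u):
    """Project u in R^{d+1} onto the half-space {x_{d+1} >= 1}."""
    u = u.copy()
    u[-1] = max(u[-1], 1.0)
    return u

Z = np.array([[1.0, 2.0],
              [2.0, -3.0]])               # indefinite test matrix
P = project_psd(Z)
print(np.linalg.eigvalsh(P))              # all eigenvalues >= 0
print(project_halfspace(np.array([0.5, 0.2])))  # last coordinate clipped to 1
```

In the full algorithm these projections are applied blockwise to each $A_{(n,m)}$ or $B_n$ and to each vertex variable $u_n$, so every ADMM iteration decomposes into small independent problems.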

6. Influence of Hyperbolic Geometry in Applications

Hyperbolic geometry enables joint modeling of both location and scale (variance) within a unified structure. In Gaussian image processing, the mean and standard deviation at each pixel are bundled into points in $H^2$ via the Fisher–Rao metric. The hyperbolic distance penalizes relative changes in variance more naturally than Euclidean differences, yielding denoised outcomes more attuned to the intrinsic geometry of the data. Tikhonov regularization tends to over-smooth variance maps, while TV variants preserve sharper features in both mean and variance, reflecting the negative curvature of $H^2$.
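A sketch of this embedding, under two assumptions not spelled out in the source: the common convention that the Fisher–Rao geometry of $\mathcal{N}(\mu, \sigma^2)$ corresponds to half-plane coordinates $(\mu/\sqrt{2}, \sigma)$, and one standard Poincaré half-plane to hyperboloid isometry.

```python
import numpy as np

def halfplane_to_hyperboloid(a, b):
    """One standard isometry from the Poincare half-plane {(a, b) : b > 0}
    to H^2 (Minkowski form x1^2 + x2^2 - x3^2 = -1, x3 > 0)."""
    s = a * a + b * b
    return np.array([(s - 1) / (2 * b), a / b, (s + 1) / (2 * b)])

def gaussian_to_H2(mu, sigma):
    # Assumed convention: half-plane coordinates (mu / sqrt(2), sigma).
    return halfplane_to_hyperboloid(mu / np.sqrt(2.0), sigma)

x = gaussian_to_H2(0.7, 1.3)
print(x)
print(x[0]**2 + x[1]**2 - x[2]**2)   # approximately -1: the point lies on H^2
```

Once each pixel's $(\hat{\mu}, \hat{\sigma})$ is mapped this way, the denoising energies of Section 3 apply verbatim with $d = 2$.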

7. Experimental Results: Retina-Scan Denoising

A practical demonstration involves denoising $K = 20$ noisy retina scans, with each pixel modeled as Gaussian. Empirical estimates of the mean $\hat{\mu}_{ij}$ and variance $\hat{\sigma}_{ij}^2$ are mapped to $H^2$ via a Poincaré half-plane to hyperboloid isometry. Both the Tikhonov and TV ADMM algorithms are applied:

  • The Tikhonov denoiser ($\lambda \approx 1.5$, $\rho = 10$) produces smooth mean fields but often overestimates local variance.
  • The TV denoiser ($\mu \approx 0.15$, $\rho = 1$) preserves fine vessel structures in the mean and local contrasts in the variance.

Quantitatively, the denoised signal-to-noise ratio for the mean improves from approximately $5\,\mathrm{dB}$ (input) to $6\,\mathrm{dB}$ (TV), with similar gains for the variance. Hyperboloid constraint violations $|(x, x)_m + 1|$ rapidly fall below $10^{-4}$. Compared to manifold Douglas–Rachford approaches, the TV ADMM achieves similar SNR at substantially reduced computation time ($\approx 3$ min for TV ADMM vs. $\approx 2$ hr for PDRA).

8. Extensions and Future Directions

Potential avenues include:

  • Application of block-matrix relaxations to other Riemannian symmetric spaces (e.g., symmetric positive definite matrices, Grassmannians).
  • Denoising vector-valued variances in diffusion-tensor MRI (higher-dimensional $H^d$).
  • Integration of hyperbolic regularization with learned data priors via plug-and-play ADMM.
  • Extending convex relaxations to hyperbolic graph embeddings for network denoising.

These directions suggest promising opportunities to generalize the hyperbolic-guided denoising paradigm across statistical manifolds and structured data domains (Beinert et al., 2024).

References (1)
