Hyperbolic-Guided Denoising
- Hyperbolic-guided denoising uses hyperbolic space to model data structures, overcoming non-convexity with PSD matrix relaxations.
- The methodology applies convex optimization, including ADMM, for efficient denoising using geometric constraints on hyperbolic sheets.
- Applications in Gaussian image processing reveal improved mean and variance restoration, showcasing practical efficacy.
Hyperbolic-guided denoising is a technique for restoring data whose underlying structure or features are naturally modeled in hyperbolic space, specifically the hyperbolic sheet denoted $\mathbb{H}^d$. This approach leverages convex relaxations based on positive semidefinite (PSD) matrix representations, enabling efficient optimization for denoising tasks on hyperbolic-valued data. The central innovation is overcoming the non-convexity of the hyperbolic sheet by encoding its geometric and algebraic constraints through PSD block-matrices and subsequently applying convex optimization schemes such as ADMM. Applications include Gaussian image processing scenarios where both the pixelwise mean and variance are restored under a unified geometric model (Beinert et al., 2024).
1. Definition and Non-convexity of Hyperbolic Sheets
The $d$-dimensional affine hyperbolic sheet $\mathbb{H}^d$ is embedded in $\mathbb{R}^{d+1}$ using the Minkowski bilinear form $\langle x, y \rangle_{\mathbb{M}} := x_1 y_1 + \dots + x_d y_d - x_{d+1} y_{d+1}$. The hyperboloid is then defined by
$$\mathbb{H}^d := \{ x \in \mathbb{R}^{d+1} : \langle x, x \rangle_{\mathbb{M}} = -1,\; x_{d+1} > 0 \},$$
with the Riemannian metric obtained by restricting $\langle \cdot, \cdot \rangle_{\mathbb{M}}$ to its tangent spaces, inducing the geodesic distance $d_{\mathbb{H}}(x, y) = \operatorname{arcosh}(-\langle x, y \rangle_{\mathbb{M}})$. Although $\mathbb{H}^d$ geometrically resides in the half-space $x_{d+1} \geq 1$, it is not convex in $\mathbb{R}^{d+1}$. This lack of convexity means that convex combinations of two points on $\mathbb{H}^d$ do not generally lie on $\mathbb{H}^d$, complicating direct variational minimization strategies.
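This failure of convexity is easy to check numerically. The sketch below (the helper names `minkowski` and `lift` are illustrative, not from the paper) verifies that the Euclidean midpoint of two points on $\mathbb{H}^2$ leaves the sheet:

```python
import math

# Minkowski bilinear form on R^{d+1}: <x, y>_M = sum_i x_i y_i - x_{d+1} y_{d+1}
def minkowski(x, y):
    return sum(a * b for a, b in zip(x[:-1], y[:-1])) - x[-1] * y[-1]

# Lift a point u in R^d onto the sheet: the last coordinate solves <x, x>_M = -1
def lift(u):
    return list(u) + [math.sqrt(1.0 + sum(c * c for c in u))]

x = lift([1.0, 0.0])    # a point on H^2
y = lift([-1.0, 0.5])   # another point on H^2
mid = [(a + b) / 2 for a, b in zip(x, y)]  # Euclidean midpoint of x and y

print(minkowski(x, x))      # -1.0: x lies on the sheet
print(minkowski(mid, mid))  # != -1: the midpoint leaves the sheet, so H^d is not convex
```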
2. Euclidean Embedding Through PSD Block-Matrix Representation
To bypass the non-convex constraint $x \in \mathbb{H}^d$, the framework introduces auxiliary scalar variables and encodes the geometric constraints (such as $\langle x, x \rangle_{\mathbb{M}} = -1$ and $x_{d+1} \geq 1$) via PSD block-matrices:
- Tikhonov-type model: For each edge $(v, w)$, introduce auxiliary scalars and construct a block-matrix whose structure incorporates $x_v$, $x_w$, their 'mirrored' versions $\tilde{x}$ (where $\tilde{x} := (x_1, \dots, x_d, -x_{d+1})$), and the scalars. Membership of $x_v$ and $x_w$ in $\mathbb{H}^d$ is equivalent to demanding that this block-matrix be PSD and of rank one.
- Total variation (TV) model: For each vertex $v$, a unary matrix is constructed similarly, encoding the hyperboloid equation $\langle x_v, x_v \rangle_{\mathbb{M}} = -1$ and the half-space condition $x_{v,d+1} \geq 1$ via a PSD condition together with a rank-one constraint.
This matrix relaxation directly encodes hyperbolic geometry into a convex feasible set defined by PSD conditions.
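As a minimal illustration of the idea, consider the generic rank-one lifting $M = \bigl(\begin{smallmatrix} 1 & x^\top \\ x & X \end{smallmatrix}\bigr)$ with $X = x x^\top$ (a standard SDP lifting, assumed here for exposition rather than the paper's exact block structure): the hyperboloid equation becomes a *linear* trace constraint on $X$, and dropping the rank condition leaves a convex PSD-representable set.

```python
import numpy as np

d = 2
J = np.diag([1.0] * d + [-1.0])  # Minkowski signature matrix

u = np.array([0.3, -0.7])
x = np.append(u, np.sqrt(1.0 + u @ u))  # a point on H^d

# Rank-one lifting M = [[1, x^T], [x, X]] with X = x x^T
X = np.outer(x, x)
M = np.block([[np.ones((1, 1)), x[None, :]], [x[:, None], X]])

# Convex surrogate constraints: M is PSD, and tr(J X) = <x, x>_M = -1 is linear in X.
eigs = np.linalg.eigvalsh(M)
print(eigs.min() >= -1e-10)               # True: M is PSD
print(np.isclose(np.trace(J @ X), -1.0))  # True: hyperboloid equation, linearized
# Dropping rank(M) = 1 leaves only these convex conditions.
```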
3. Denoising Energies in Hyperbolic Geometry
Two canonical variational energies are formulated:
- Tikhonov-type energy:
$$E_{\mathrm{Tik}}(x) := \sum_{v \in V} d_{\mathbb{H}}^2(x_v, f_v) + \lambda \sum_{(v,w) \in E} d_{\mathbb{H}}^2(x_v, x_w),$$
subject to $x_v \in \mathbb{H}^d$ for all $v \in V$, where $f$ denotes the noisy input and $\lambda > 0$ the regularization weight.
- Total variation (TV) energy:
$$E_{\mathrm{TV}}(x) := \sum_{v \in V} d_{\mathbb{H}}^2(x_v, f_v) + \lambda \sum_{(v,w) \in E} d_{\mathbb{H}}(x_v, x_w),$$
with the same geometric constraint.
These energies are rewritten as linear functions of the auxiliary variables plus quadratic forms in the coordinates. Hyperbolic constraints are enforced via assembled PSD block-matrices over nodes and/or edges.
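Before any relaxation, both energies can be evaluated directly on hyperboloid-valued signals. A minimal sketch on a three-node chain graph (the helper names and the $\mathbb{H}^1$ toy data are illustrative assumptions):

```python
import math

def minkowski(x, y):
    return sum(a * b for a, b in zip(x[:-1], y[:-1])) - x[-1] * y[-1]

def dist(x, y):
    # geodesic distance on H^d: arcosh(-<x, y>_M), clamped against rounding
    return math.acosh(max(1.0, -minkowski(x, y)))

def lift(u):
    return list(u) + [math.sqrt(1.0 + sum(c * c for c in u))]

def tikhonov_energy(x, f, edges, lam):
    data = sum(dist(xv, fv) ** 2 for xv, fv in zip(x, f))
    return data + lam * sum(dist(x[v], x[w]) ** 2 for v, w in edges)

def tv_energy(x, f, edges, lam):
    data = sum(dist(xv, fv) ** 2 for xv, fv in zip(x, f))
    return data + lam * sum(dist(x[v], x[w]) for v, w in edges)

f = [lift([0.0]), lift([0.5]), lift([1.0])]   # noisy signal on H^1
edges = [(0, 1), (1, 2)]                      # chain graph
print(tikhonov_energy(f, f, edges, lam=0.1))  # data term vanishes at x = f
```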
4. Convex Relaxation and Semidefinite Programming
The non-convex rank constraints ($\operatorname{rank} = 1$ for the edge matrices of the Tikhonov model and similarly for the unary vertex matrices of the TV model) are dropped, leaving only the PSD conditions. The resulting feasible sets are convex slices of the PSD cone. The variational objective comprises linear and quadratic terms of the block-matrix variables, yielding a convex semidefinite program that admits tractable solution methods.
5. ADMM-Based Solution Algorithm
Both relaxed denoising problems are cast into the form
$$\min_{x, y} \; f(x) + g(y) \quad \text{subject to} \quad Ax = y,$$
where $f$ is a smooth convex function, $g$ enforces the convex PSD constraints, and $A$ is a linear operator.
The augmented Lagrangian is
$$L_\rho(x, y, \mu) := f(x) + g(y) + \langle \mu, Ax - y \rangle + \frac{\rho}{2} \| Ax - y \|^2,$$
and the ADMM updates proceed by alternating minimization steps:
- Update by solving decoupled small linear systems.
- Project the relevant coordinate variables onto the half-space $x_{d+1} \geq 1$.
- Project block-matrices onto the PSD cone.
- Update Lagrange multipliers.
Closed-form updates are derived for all steps (see Theorems 3.1 and 3.2 in (Beinert et al., 2024)). Convergence is guaranteed by classical ADMM results for convex problems.
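The PSD-projection substep admits a standard closed form via eigenvalue clipping; a minimal sketch (not the paper's exact update, which additionally handles the linear coupling constraints):

```python
import numpy as np

# Euclidean projection of a symmetric block-matrix onto the PSD cone:
# symmetrize, eigendecompose, and clip negative eigenvalues to zero.
def project_psd(M):
    S = (M + M.T) / 2.0
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

A = np.array([[2.0, 0.0],
              [0.0, -3.0]])
P = project_psd(A)
print(P)  # [[2, 0], [0, 0]]: the negative eigenvalue is clipped to zero
```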
6. Influence of Hyperbolic Geometry in Applications
Hyperbolic geometry enables joint modeling of both location and scale (variance) within a unified structure. In Gaussian image processing, the mean and standard deviation at each pixel are bundled into points in $\mathbb{H}^2$ via the Fisher–Rao metric, under which the family of univariate Gaussians is isometric (up to scaling) to the hyperbolic plane. The hyperbolic distance penalizes relative changes in variance more naturally than Euclidean differences, yielding denoised outcomes more attuned to the intrinsic geometry of the data. Tikhonov regularization tends to over-smooth variance maps, while TV variants preserve sharper features in both mean and variance, reflecting the negative curvature of $\mathbb{H}^2$.
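The relative-variance behavior can be seen directly from the standard half-plane-to-hyperboloid isometry: doubling the scale parameter costs the same hyperbolic distance at any level. A sketch (the map below omits the Fisher–Rao rescaling of the mean by $1/\sqrt{2}$, and the helper names are illustrative):

```python
import math

# Standard isometry from the Poincare half-plane point (a, b), b > 0,
# to the hyperboloid model H^2 = {x1^2 + x2^2 - x3^2 = -1}.
def halfplane_to_hyperboloid(a, b):
    s = a * a + b * b
    return (a / b, (s - 1.0) / (2.0 * b), (s + 1.0) / (2.0 * b))

def minkowski(x, y):
    return x[0] * y[0] + x[1] * y[1] - x[2] * y[2]

def dist(x, y):
    return math.acosh(max(1.0, -minkowski(x, y)))

# Doubling the scale costs the same distance whether sigma is small or large:
lo = dist(halfplane_to_hyperboloid(0.0, 1.0), halfplane_to_hyperboloid(0.0, 2.0))
hi = dist(halfplane_to_hyperboloid(0.0, 10.0), halfplane_to_hyperboloid(0.0, 20.0))
print(lo, hi)  # both log(2): the metric penalizes *relative* changes in scale
```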
7. Experimental Results: Retina-Scan Denoising
A practical demonstration involves denoising noisy retina scans, with each pixel modeled as Gaussian. Empirical estimates for mean and variance are mapped to $\mathbb{H}^2$ via a Poincaré half-plane to hyperboloid isometry. Both Tikhonov and TV ADMM algorithms are applied:
- Tikhonov denoiser produces smooth mean fields but often overestimates local variance.
- TV denoiser preserves fine vessel structures in mean and local contrasts in variance.
Quantitatively, the denoised signal-to-noise ratio for the mean improves substantially over the noisy input under the TV model, with similar gains for the variance. The hyperboloid constraint violations (measured via the residual of $\langle x_v, x_v \rangle_{\mathbb{M}} = -1$) decay rapidly toward zero over the iterations. Compared to manifold Douglas–Rachford approaches (PDRA), the TV ADMM achieves similar SNR at substantially reduced computation time.
8. Extensions and Future Directions
Potential avenues include:
- Application of block-matrix relaxations to other Riemannian symmetric spaces (e.g., symmetric positive definite matrices, Grassmannians).
- Denoising vector-valued variances in diffusion-tensor MRI (higher-dimensional sheets $\mathbb{H}^d$).
- Integration of hyperbolic regularization with learned data priors via plug-and-play ADMM.
- Extending convex relaxations to hyperbolic graph embeddings for network denoising.
These directions suggest promising opportunities to generalize the hyperbolic-guided denoising paradigm across statistical manifolds and structured data domains (Beinert et al., 2024).