Soft-Radial Projection in Learning and Cosmology
- Soft-radial projection is a smooth transformation that maps high-dimensional inputs onto constrained sets while retaining crucial structural and gradient information.
- In machine learning, it overcomes gradient degeneracy typical of hard projections by maintaining nonzero Jacobian eigenvalues and ensuring strict feasibility.
- In cosmology, it unmixes radial scale information, enabling precise separation of linear and nonlinear modes for better parameter estimation.
Soft-radial projection refers to a class of mathematical transformations designed to retain crucial structural or statistical information while mapping from high-dimensional or unconstrained spaces to sets obeying specific constraints. Two distinct but related contexts dominate the modern literature: (1) differentiable mappings for constrained end-to-end learning in optimization and machine learning (Schneider et al., 3 Feb 2026), and (2) harmonic-weighted projections in cosmological data analysis to unmix radial scale information in redshift-space distortion (RSD) measurements (Taylor et al., 2021). Both address fundamental limitations of hard or orthogonal projections, such as gradient degeneracy or modal mixing, by introducing "soft" or smooth radial weighting along the transformation direction.
1. Conceptual Foundations and Motivations
Hard projections, such as orthogonal projection onto a constraint set $\mathcal{C}$, are widely used in machine learning pipelines and scientific data processing. However, these projections induce degeneracies in the Jacobian, resulting in rank deficiency and thus non-invertible transformations in directions orthogonal to the active constraints. This can stall optimization, nullify important gradients, and hinder learning dynamics, especially in safety-critical domains requiring constrained predictions (Schneider et al., 3 Feb 2026).
Similarly, in cosmological data analysis, naive tomographic projections of 3D spatial observations (such as RSDs) result in mixing of linear and nonlinear scales due to the broad Fourier kernels associated with tomographic bins. This mixing introduces model bias, particularly problematic for precision constraints on parameters sensitive to small-scale nonlinearities and their theoretical uncertainties (Taylor et al., 2021).
Soft-radial projection schemes mitigate these challenges by replacing abrupt, boundary-collapsing projections with smooth, radially parameterized maps or soft harmonic weights. These constructions recover desirable properties: strictly feasible outputs for constrained learning and nearly lossless, scale-selective mode separation for cosmological statistics.
2. Formal Definition: Soft-Radial Projection in Constrained Learning
Let $\mathcal{C} \subset \mathbb{R}^n$ be a closed convex set with nonempty interior $\operatorname{int}(\mathcal{C})$, and let $c \in \operatorname{int}(\mathcal{C})$ be a fixed anchor point.
Construction
- Hard Radial Projection: For $x \neq c$, write $r = \|x - c\|$ and $u = (x - c)/r$, and let $R(u) = \sup\{t \geq 0 : c + t u \in \mathcal{C}\}$ denote the boundary radius along $u$. Define $\Pi_{\mathrm{hard}}(x) = c + \min\{r, R(u)\}\,u$: points outside $\mathcal{C}$ are pulled back along the ray through $c$ to the boundary.
- Radial Contraction: Fix a strictly increasing function $\sigma : [0, \infty) \to [0, 1)$ with $\sigma(0) = 0$ and $\lim_{t \to \infty} \sigma(t) = 1$. Typical parametrizations include rational ($t/(1+t)$), exponential ($1 - e^{-t}$), or hyperbolic ($\tanh t$) forms, with tunable scale and minimum value. $\sigma$ governs the strength and smoothness of contraction.
- Soft-Radial Projection Map: $\Pi(x) = c + R(u)\,\sigma\!\left(r/R(u)\right) u$ for $x \neq c$, and $\Pi(c) = c$.
- Ray-wise Parameterization: Writing $x = c + r u$ with $\|u\| = 1$, the radius transforms as $r \mapsto R(u)\,\sigma(r/R(u))$. Then $\sigma(r/R(u)) < 1$ for all finite $r$, and the full mapping reads $\Pi(c + r u) = c + R(u)\,\sigma(r/R(u))\,u$.
The key property is that $\Pi(x) \in \operatorname{int}(\mathcal{C})$ for all $x \in \mathbb{R}^n$, achieving strict feasibility without inducing degeneracy in the Jacobian almost everywhere (Schneider et al., 3 Feb 2026).
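The construction above admits a minimal numerical sketch. The rational contraction $\sigma(t) = t/(1+t)$ and the unit-ball constraint below are illustrative choices, and the function names are hypothetical, not the paper's reference implementation:

```python
import numpy as np

def soft_radial_project(x, c, R_fn, sigma=lambda t: t / (1.0 + t)):
    """Soft-radial projection of x toward anchor c.

    R_fn(u) returns the boundary radius along unit direction u; sigma is a
    strictly increasing contraction with sigma(0) = 0 and sup sigma = 1,
    so outputs land strictly inside the constraint set.
    """
    d = x - c
    r = np.linalg.norm(d)
    if r == 0.0:
        return c.copy()          # the anchor maps to itself
    u = d / r
    R = R_fn(u)                  # distance from c to the boundary along u
    return c + R * sigma(r / R) * u

# Example: unit ball centered at the origin, so R(u) = 1 in every direction.
c = np.zeros(2)
x_far = np.array([10.0, 0.0])
y = soft_radial_project(x_far, c, lambda u: 1.0)
# y stays strictly inside the ball: ||y|| = sigma(10) = 10/11 < 1
```

Note that interior points are also contracted slightly toward the anchor; this is the price of strict feasibility and a nondegenerate Jacobian everywhere.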
3. Theoretical Properties and Advantages
Jacobian Structure
For the Jacobian $J\Pi(x)$ at $x \neq c$ (assuming $c = 0$ for brevity):
- Interior ($x \in \operatorname{int}(\mathcal{C})$): for a norm ball with constant boundary radius $R$, $J\Pi(x) = \sigma'(r/R)\,u u^\top + \tfrac{R\,\sigma(r/R)}{r}\,(I - u u^\top)$.
The eigenvalues are strictly positive, preserving gradients along all directions. The transformation is invertible.
- Exterior ($x \notin \mathcal{C}$):
Using the Minkowski gauge $\gamma_{\mathcal{C}}(x) = \inf\{t > 0 : x/t \in \mathcal{C}\}$ (so that $R(u) = 1/\gamma_{\mathcal{C}}(u)$) and the recession function of the convex set $\mathcal{C}$, the Jacobian is generically full-rank except on a null set.
Comparison to Standard Projection
Orthogonal projection onto $\mathcal{C}$ annihilates directions orthogonal to the boundary, which manifests as vanishing singular values in the Jacobian and precludes backpropagation through these collapsed modes. By contrast, soft-radial projection maintains non-zero eigenvalues in all directions, thus fully preserving the gradient signal required for end-to-end learning (Schneider et al., 3 Feb 2026).
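The contrast in Jacobian spectra can be checked numerically. The following sketch (illustrative unit-ball constraint, finite-difference Jacobians, rational contraction) exhibits the orthogonal projection's vanishing singular value outside the set versus the soft-radial map's strictly positive spectrum:

```python
import numpy as np

def num_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    n = x.size
    J = np.zeros((f(x).size, n))
    for j in range(n):
        e = np.zeros(n); e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

sigma = lambda t: t / (1.0 + t)

def ortho_proj_ball(x):      # hard orthogonal projection onto the unit ball
    r = np.linalg.norm(x)
    return x if r <= 1 else x / r

def soft_radial_ball(x):     # soft-radial projection onto the unit ball
    r = np.linalg.norm(x)
    return sigma(r) * x / r if r > 0 else x

x = np.array([3.0, 0.0])     # a point outside the ball
s_hard = np.linalg.svd(num_jacobian(ortho_proj_ball, x), compute_uv=False)
s_soft = np.linalg.svd(num_jacobian(soft_radial_ball, x), compute_uv=False)
# s_hard ~ [1/3, 0]: the radial mode is annihilated
# s_soft ~ [1/4, 1/16]: all singular values strictly positive
```

Here the soft map's radial singular value is $\sigma'(3) = 1/16$ and the tangential one is $\sigma(3)/3 = 1/4$, matching the ball-case Jacobian formula above.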
Universal Approximation
Given any universal approximator class $\mathcal{F}$ (e.g., deep ReLU networks), the composed class $\Pi \circ \mathcal{F} = \{\Pi \circ f : f \in \mathcal{F}\}$ remains universal on $\operatorname{int}(\mathcal{C})$: any continuous target $g : K \to \operatorname{int}(\mathcal{C})$ on a compact set $K$ can be approximated uniformly by some $\Pi \circ f$.
Thus, soft-radial layers can enforce constraints on predictions without loss of expressive power (Schneider et al., 3 Feb 2026).
4. Implementation and Computational Complexity
Algorithmic implementations of soft-radial projection rely on:
- Anchor-shifting: Center inputs at the anchor, $x \mapsto x - c$.
- Boundary search: For polyhedral $\mathcal{C} = \{x : A x \leq b\}$ (after anchor-shifting), compute $R(u) = \min\{\, b_i / (a_i^\top u) : a_i^\top u > 0 \,\}$. For ellipsoids or more general sets, a closed form or root finding is adopted.
- Application of radial map: Evaluate $\sigma(r/R(u))$ and combine with the direction $u$ to form $\Pi(x)$.
- Autodifferentiation: The backward pass uses the composite Jacobian formula for efficient gradient computation.
Per input, complexity is $O(mn)$ for $m$ linear constraints in $\mathbb{R}^n$, $O(n)$ for norm balls, and $O(kn)$ for $k$ root-finding iterations on general convex sets (Schneider et al., 3 Feb 2026).
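The polyhedral boundary search can be sketched as follows (a minimal illustration assuming the anchor has already been shifted to the origin, so $b > 0$; the function name is hypothetical):

```python
import numpy as np

def boundary_radius_polyhedron(u, A, b):
    """Boundary radius R(u) for C = {x : A x <= b} after anchor-shifting,
    i.e. assuming b > 0 so the origin (the anchor) is strictly feasible.

    R(u) = min_i b_i / (a_i . u) over rows with a_i . u > 0; if no row
    restricts the ray, R(u) = +inf (the set is unbounded along u).
    Cost is one matrix-vector product: O(m n) for m constraints.
    """
    s = A @ u
    mask = s > 0
    if not np.any(mask):
        return np.inf
    return np.min(b[mask] / s[mask])

# Example: the box [-1, 1]^2 written as A x <= b, anchor at the origin.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.ones(4)
u = np.array([1.0, 1.0]) / np.sqrt(2.0)
R = boundary_radius_polyhedron(u, A, b)   # distance to the corner: sqrt(2)
```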
5. Extended Context: Soft-Radial Projection in Cosmological Analysis
In spectroscopic RSD and weak lensing surveys, soft-radial (radial-harmonic) projection adopts harmonic radial weighting of the form $w(\chi) \propto e^{i k \chi}$ over the survey's radial selection, where $\chi$ is comoving distance. This weighting, when applied in constructing generalized tomographic window functions, produces angular spectra $C_\ell(k)$ with the kernel sharply localized in $k_\parallel$ due to the harmonic weight's near-delta-function Fourier transform (Taylor et al., 2021).
This unmixes small-scale (FoG) and large-scale (linear) radial modes, enabling scale-selective analysis that is otherwise impossible using top-hat binning:
- Each high-$k$ bin becomes sensitive to a narrow band in $k_\parallel$
- Large-scale modes (low $k_\parallel$) are handled separately in hybrid estimators
The method nearly regains the constraining power of the full 3D power spectrum $P(k)$, essentially eliminating parameter-dependent model bias due to FoG scale mixing. This approach is particularly well-suited for joint RSD and weak lensing analyses (Taylor et al., 2021).
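The localization property can be illustrated with a toy one-dimensional example (grids and numbers are illustrative, not the Taylor et al. estimator): the Fourier transform of a harmonic-weighted radial window peaks at the target wavenumber, whereas a top-hat bin's kernel sits at $k = 0$ with broad sidelobes that mix radial scales:

```python
import numpy as np

# Radial grid (a stand-in for comoving distance across a survey window).
r = np.linspace(0.0, 1000.0, 4096)               # Mpc/h, illustrative
dr = r[1] - r[0]
window = ((r > 200) & (r < 800)).astype(float)   # survey radial selection

k0 = 0.1                                   # target wavenumber, h/Mpc
harmonic = window * np.exp(1j * k0 * r)    # radial-harmonic weight
tophat = window                            # standard tomographic bin weight

# The Fourier transform of each weight sets how strongly radial modes
# are mixed into the projected angular statistic.
k = np.fft.fftfreq(r.size, d=dr) * 2 * np.pi
H = np.abs(np.fft.fft(harmonic)) * dr
T = np.abs(np.fft.fft(tophat)) * dr

k_peak_harmonic = abs(k[np.argmax(H)])     # kernel localized near k0
k_peak_tophat = abs(k[np.argmax(T)])       # kernel centered at k = 0
```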
6. Empirical Results and Applications
End-to-End Learning with Constraints
In constrained machine learning, soft-radial projection achieves strict feasibility and superior convergence relative to baselines.
- Portfolio Optimization (capped simplex constraints): Soft-radial projection (SRP) delivers substantially higher net Sharpe ratio (0.90) and lower turnover (0.06) compared to softmax, orthogonal projection, DC3, and HardNet baselines (Schneider et al., 3 Feb 2026).
- Resource Dispatch (scaled capped simplex constraints): SRP matches the best-served rate (0.84) while maintaining higher robustness in the presence of noisy dynamics.
| Method | SR (net) | Turnover |
|---|---|---|
| Softmax | 0.63 (±0.19) | 0.22 |
| O-Proj | 0.25 (±0.18) | 0.48 |
| DC3 | 0.63 (±0.08) | 0.23 |
| HardNet | 0.62 (±0.11) | 0.19 |
| SRP | 0.90 (±0.03) | 0.06 |
Cosmological Parameter Estimation
Comparisons of cosmological information recovery between standard tomographic projection and soft-radial (radial-harmonic) weighting:
| Estimator | Error on growth rate $f$ | $k_{\max}\,[h\,\mathrm{Mpc}^{-1}]$ |
|---|---|---|
| 3D | 0.013 | 0.1 |
| Tomography | 0.055 | 0.1 |
| Hybrid (radial-harmonic) | 0.013 | 0.1 |
Tomographic estimators lose a factor of $\sim 4$ in statistical precision on the growth rate $f$ (0.055 versus 0.013), while the hybrid soft-radial projection method recovers nearly all 3D information in angular space (Taylor et al., 2021).
7. Limitations and Extensions
Key limitations of current soft-radial projection schemes include the requirement that $\mathcal{C}$ be convex and contain a known interior point, and the computational demand of harmonic kernel evaluation in cosmological settings. Extensions under active research include handling nonconvex feasible sets, principled anchor point selection, piecewise convex covers, and endowing the radial contraction map $\sigma$ with data-driven learnable parameters (Schneider et al., 3 Feb 2026). In cosmology, optimal partitioning of $k$-bins, hybridization with 3D estimators for large-scale modes, and cross-correlation with weak lensing tomography represent continued areas of development (Taylor et al., 2021).