
Soft-Radial Projection in Learning and Cosmology

Updated 4 February 2026
  • Soft-radial projection is a smooth transformation that maps high-dimensional inputs onto constrained sets while retaining crucial structural and gradient information.
  • In machine learning, it overcomes gradient degeneracy typical of hard projections by maintaining nonzero Jacobian eigenvalues and ensuring strict feasibility.
  • In cosmology, it unmixes radial scale information, enabling precise separation of linear and nonlinear modes for better parameter estimation.

Soft-radial projection refers to a class of mathematical transformations designed to retain crucial structural or statistical information while mapping from high-dimensional or unconstrained spaces to sets obeying specific constraints. Two distinct but related contexts dominate the modern literature: (1) differentiable mappings for constrained end-to-end learning in optimization and machine learning (Schneider et al., 3 Feb 2026), and (2) harmonic-weighted projections in cosmological data analysis to unmix radial scale information in redshift-space distortion (RSD) measurements (Taylor et al., 2021). Both address fundamental limitations of hard or orthogonal projections, such as gradient degeneracy or modal mixing, by introducing "soft" or smooth radial weighting along the transformation direction.

1. Conceptual Foundations and Motivations

Hard projections, such as orthogonal projection onto a constraint set $C$, are widely used in machine learning pipelines and scientific data processing. However, these projections induce degeneracies in the Jacobian, resulting in rank deficiency and thus non-invertible transformations in directions orthogonal to the active constraints. This can stall optimization, nullify important gradients, and hinder learning dynamics, especially in safety-critical domains requiring constrained predictions (Schneider et al., 3 Feb 2026).

Similarly, in cosmological data analysis, naive tomographic projections of 3D spatial observations (such as RSDs) result in mixing of linear and nonlinear scales due to the broad Fourier kernels associated with tomographic bins. This mixing introduces model bias, particularly problematic for precision constraints on parameters sensitive to small-scale nonlinearities and their theoretical uncertainties (Taylor et al., 2021).

Soft-radial projection schemes mitigate these challenges by replacing abrupt, boundary-collapsing projections with smooth, radially parameterized maps or soft harmonic weights. These constructions recover desirable properties: strictly feasible outputs for constrained learning and nearly lossless, scale-selective mode separation for cosmological statistics.

2. Formal Definition: Soft-Radial Projection in Constrained Learning

Let $C \subset \mathbb{R}^n$ be a closed convex set with nonempty interior $\mathrm{Int}(C)$, and let $u_0 \in \mathrm{Int}(C)$ be a fixed anchor point.

Construction

  1. Hard Radial Projection: For $u \in \mathbb{R}^n$, define

$$\alpha^*(u) := \sup\{\alpha \in [0,1] : u_0 + \alpha(u - u_0) \in C\}$$

$$q(u) = \begin{cases} u, & u \in C \\ u_0 + \alpha^*(u)(u - u_0), & u \notin C \end{cases}$$

  2. Radial Contraction: Fix a $C^1$ strictly increasing function $r : [0,\infty) \to [\varepsilon, 1)$ with $r(0) = \varepsilon > 0$ and $\lim_{\rho\to\infty} r(\rho) = 1$. Typical parametrizations include rational, exponential, or hyperbolic forms, with tunable scale and minimum value; $r(\cdot)$ governs the strength and smoothness of the contraction.
  3. Soft-Radial Projection Map:

$$\rho(u) := \|u - u_0\|^2$$

$$p(u) := u_0 + r(\rho(u))\big(q(u) - u_0\big)$$

  4. Ray-wise Parameterization: Writing $u = tv$ with $\|v\| = 1$, define $t_{\text{bar}}(v) := \sup\{t \geq 0 : tv \in C\}$. Then $q(tv) = \min\{t, t_{\text{bar}}(v)\}\,v$, and the full mapping reads

$$p(tv) = r(t^2)\min\{t, t_{\text{bar}}(v)\}\,v.$$

The key property is that $p(u) \in \mathrm{Int}(C)$ for all $u$, achieving strict feasibility without inducing degeneracy in the Jacobian almost everywhere (Schneider et al., 3 Feb 2026).
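A minimal sketch of this construction for the unit ball $C = \{u : \|u\| \le 1\}$ with anchor $u_0 = 0$, using one admissible rational contraction; the function name and the parameter values `eps` and `scale` are illustrative assumptions, not choices from the paper:

```python
import numpy as np

def soft_radial_ball(u, eps=0.1, scale=1.0):
    """Soft-radial projection onto the closed unit ball, anchored at u0 = 0.

    A sketch: q is the hard radial projection onto the ball, and
    r(rho) = eps + (1 - eps) * rho / (rho + scale) is one rational choice of
    the contraction, satisfying r(0) = eps and r -> 1 as rho -> infinity.
    The values of eps and scale are illustrative, not from the paper.
    """
    u = np.asarray(u, dtype=float)
    rho = u @ u                                    # rho(u) = ||u - u0||^2
    nrm = np.sqrt(rho)
    q = u if nrm <= 1.0 else u / nrm               # hard radial projection q(u)
    r = eps + (1.0 - eps) * rho / (rho + scale)    # contraction r(rho) in [eps, 1)
    return r * q                                   # p(u) = u0 + r(rho)(q(u) - u0)

# Any input, however far outside, lands strictly inside the ball:
print(np.linalg.norm(soft_radial_ball([10.0, 0.0])) < 1.0)  # True
```

Because $r(\rho) < 1$ everywhere, even boundary points are pulled strictly into the interior, which is exactly the strict-feasibility property stated above.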

3. Theoretical Properties and Advantages

Jacobian Structure

For $p(u) = r(\|u\|^2)\,q(u)$ (taking $u_0 = 0$ for brevity):

  • Interior ($u \in \mathrm{Int}(C)$):

$$J_p(u) = r(\rho)\, I + 2 r'(\rho)\, u u^\intercal$$

The eigenvalues are strictly positive, preserving gradients along all directions; the transformation is invertible.

  • Exterior ($u \notin C$):

Using the Minkowski gauge/recession function of the convex set $C$, the Jacobian is generically full-rank except on a null set.

Comparison to Standard Projection

Orthogonal projection onto $\partial C$ annihilates directions orthogonal to the boundary, which manifests as vanishing singular values in the Jacobian and precludes backpropagation through these collapsed modes. By contrast, soft-radial projection maintains non-zero eigenvalues in all directions, fully preserving the gradient signal required for end-to-end learning (Schneider et al., 3 Feb 2026).
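This contrast can be checked numerically with finite-difference Jacobians on the unit ball. The rational contraction below is an illustrative choice, not the paper's exact parametrization:

```python
import numpy as np

def num_jacobian(f, u, h=1e-6):
    """Central-difference Jacobian of f at u."""
    n = u.size
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        J[:, i] = (f(u + e) - f(u - e)) / (2 * h)
    return J

def orth_proj_ball(u):
    """Orthogonal projection onto the unit ball: flat in the radial direction outside."""
    nrm = np.linalg.norm(u)
    return u if nrm <= 1.0 else u / nrm

def soft_radial_ball(u, eps=0.1):
    """Soft-radial map onto the unit ball with an illustrative rational r (anchor 0)."""
    rho = u @ u
    nrm = np.sqrt(rho)
    q = u if nrm <= 1.0 else u / nrm
    return (eps + (1.0 - eps) * rho / (rho + 1.0)) * q

u = np.array([3.0, 0.0])  # a point outside the ball
s_orth = np.linalg.svd(num_jacobian(orth_proj_ball, u), compute_uv=False)
s_soft = np.linalg.svd(num_jacobian(soft_radial_ball, u), compute_uv=False)
print(s_orth.min())  # ~0: the radial gradient direction is annihilated
print(s_soft.min())  # bounded away from 0: gradients survive in every direction
```

The smallest singular value of the orthogonal projection's Jacobian vanishes at exterior points, while the soft-radial map keeps it strictly positive.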

Universal Approximation

Given any universal approximator $G$ (e.g., deep ReLU networks), the class $\{p \circ g : g \in G\}$ remains universal on $C$:

$$\forall h : Z \to C \text{ continuous},\ \forall \epsilon > 0,\ \exists g \in G : \sup_{z\in Z} \|p(g(z)) - h(z)\| \leq \epsilon$$

Thus, soft-radial layers can enforce constraints on predictions without loss of expressive power (Schneider et al., 3 Feb 2026).

4. Implementation and Computational Complexity

Algorithmic implementations of soft-radial projection rely on:

  1. Anchor-shifting: Center inputs at $u_0$.
  2. Boundary search: For polyhedral $C = \{x : Ax \leq b\}$, compute $\alpha^* = \min_{i:\, a_i^\intercal v > 0} \frac{b_i - a_i^\intercal u_0}{a_i^\intercal v}$. For ellipsoids or more general sets, a closed form or root finding is used.
  3. Application of the radial map $r$: Evaluate $r(\rho)$ and combine with $q(u)$.
  4. Autodifferentiation: The backward pass uses the composite Jacobian formula $J_p(u) = r(\rho)\,J_q(u) + 2 r'(\rho)\, q(u)\, u^\intercal$ for efficient gradient computation.

Complexity per input is $O(m)$ for $m$ linear constraints, $O(1)$ for balls, and $O(\log(1/\tau))$ for root-finding to tolerance $\tau$ on general convex sets (Schneider et al., 3 Feb 2026).
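The polyhedral boundary search of step 2 admits a direct vectorized sketch; the function name is an illustrative assumption, and the clamp to $1$ implements the $\alpha \in [0,1]$ restriction in the definition of $\alpha^*$:

```python
import numpy as np

def alpha_star_polyhedron(A, b, u0, u):
    """Hard radial step length alpha*(u) for C = {x : A x <= b}.

    Returns sup{alpha in [0,1] : u0 + alpha*(u - u0) in C}, assuming the
    anchor u0 lies strictly inside C (all slacks b - A u0 > 0).
    Cost is O(m) for m linear constraints, matching step 2 above.
    """
    v = u - u0
    Av = A @ v
    slack = b - A @ u0                 # strictly positive for u0 in Int(C)
    active = Av > 0                    # only faces the ray approaches matter
    if not np.any(active):
        return 1.0                     # the ray stays inside C up to alpha = 1
    return min(1.0, np.min(slack[active] / Av[active]))
```

For the unit box $\{x : \|x\|_\infty \le 1\}$ with $u_0 = 0$, the point $u = (4, 0)$ gives $\alpha^* = 1/4$, so $q(u) = (1, 0)$ lands on the boundary, while any $u$ already inside returns $\alpha^* = 1$.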

5. Extended Context: Soft-Radial Projection in Cosmological Analysis

In spectroscopic RSD and weak lensing surveys, soft-radial (radial-harmonic) projection adopts harmonic radial weighting:

$$w(\eta, z) = \cos\!\left(2\pi \eta\, \frac{r^{\rm ref}(z)}{\Delta r^{\rm ref}}\right)$$

where $r^{\rm ref}(z)$ is the comoving distance. This weighting, when applied in constructing generalized tomographic window functions, produces angular spectra:

$$C^{\eta_a\eta_b}(\ell) = \frac{1}{r_w^2}\,\frac{1}{\pi} \int_0^{\infty} dk_\parallel\; P\!\left(\sqrt{k_\perp^2 + k_\parallel^2},\, \mu\right) \widetilde{K}(k_\parallel; \eta_a, \eta_b)$$

with the kernel $\widetilde{K}(k_\parallel; \eta_a, \eta_b)$ sharply localized in $k_\parallel$, since the harmonic weight's Fourier transform is close to a delta function (Taylor et al., 2021).

This unmixes small-scale (Fingers-of-God, FoG) and large-scale (linear) radial modes, enabling scale-selective analysis that is otherwise impossible with top-hat binning:

  • Each high-$\eta$ bin becomes sensitive to a narrow band in $k_\parallel$.
  • Large-scale modes ($k_\parallel \lesssim 0.04$) are handled separately in hybrid estimators.
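The localization behind these points can be checked numerically: sampling the harmonic weight over a window of width $\Delta r$ and taking its discrete Fourier transform concentrates the power at $k_\parallel = 2\pi\eta/\Delta r$. The window width and harmonic index below are illustrative values, not survey choices:

```python
import numpy as np

# Illustrative check (not the survey pipeline): a harmonic radial weight
# over a finite window has a Fourier transform concentrated at a single
# k_parallel, unlike a broad top-hat tomographic kernel.
Delta_r = 1000.0      # comoving window width; assumed value
eta = 8               # harmonic index; assumed value
N = 4096
r = np.linspace(0.0, Delta_r, N, endpoint=False)
w = np.cos(2 * np.pi * eta * r / Delta_r)          # harmonic weight w(eta, r)

k = 2 * np.pi * np.fft.rfftfreq(N, d=Delta_r / N)  # k_parallel grid
W = np.abs(np.fft.rfft(w))
k_peak = k[np.argmax(W)]                           # where the kernel peaks

# The peak sits at k_parallel = 2*pi*eta / Delta_r, so each eta-bin
# selects a narrow band of radial wavenumbers.
print(np.isclose(k_peak, 2 * np.pi * eta / Delta_r))  # True
```

Increasing $\eta$ slides this narrow band to higher $k_\parallel$, which is the scale selectivity exploited by the hybrid estimators above.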

The method nearly regains the constraining power of the full 3D power spectrum $P(k,\mu)$, essentially eliminating parameter-dependent model bias due to FoG scale mixing. This approach is particularly well-suited for joint RSD and weak lensing analyses (Taylor et al., 2021).

6. Empirical Results and Applications

End-to-End Learning with Constraints

In constrained machine learning, soft-radial projection achieves strict feasibility and superior convergence relative to baselines.

  • Portfolio Optimization (capped simplex constraints): Soft-radial projection (SRP) delivers substantially higher net Sharpe ratio (0.90) and lower turnover (0.06) compared to softmax, orthogonal projection, DC3, and HardNet baselines (Schneider et al., 3 Feb 2026).
  • Resource Dispatch (scaled capped simplex constraints): SRP matches the best-served rate (0.84) while maintaining higher robustness in the presence of noisy dynamics.

| Method  | SR (net)     | Turnover |
|---------|--------------|----------|
| Softmax | 0.63 (±0.19) | 0.22     |
| O-Proj  | 0.25 (±0.18) | 0.48     |
| DC3     | 0.63 (±0.08) | 0.23     |
| HardNet | 0.62 (±0.11) | 0.19     |
| SRP     | 0.90 (±0.03) | 0.06     |

Cosmological Parameter Estimation

Comparisons of cosmological information recovery between standard tomographic projection and soft-radial (radial-harmonic) weighting:

| Estimator                | $\sigma(f)$ | $|b_f|/\sigma_f$ |
|--------------------------|-------------|------------------|
| 3D $P(k,\mu)$            | 0.013       | 0.1              |
| Tomography               | 0.055       | 0.1              |
| Hybrid (radial-harmonic) | 0.013       | 0.1              |

Tomographic estimators lose a factor of $\sim 4$ in statistical precision on the growth rate $f$, while the hybrid soft-radial projection method recovers nearly all of the 3D information in angular space (Taylor et al., 2021).

7. Limitations and Extensions

Key limitations of current soft-radial projection schemes include the requirement that $C$ be convex and contain a known interior point, and the computational demand of harmonic kernel evaluation in cosmological settings. Extensions under active research include handling nonconvex feasible sets, anchor-point selection, piecewise convex covers, and endowing the radial contraction map $r(\cdot)$ with data-driven learnable parameters (Schneider et al., 3 Feb 2026). In cosmology, optimal partitioning of $\eta$-bins, hybridization with 3D estimators for large-scale modes, and cross-correlation with weak lensing tomography represent continued areas of development (Taylor et al., 2021).
