
Sinh Regularized Nonuniform Sampling Series

Updated 1 February 2026
  • Sinh regularized nonuniform sampling series is a method that employs a compact sinh-based window to regularize classical Lagrange and Shannon kernels, delivering exponential convergence.
  • It integrates the sinh window into both Lagrangian and Shannon sampling frameworks, achieving errors several orders of magnitude lower than those from Gaussian regularization.
  • Optimized parameter selection and localized computation ensure the technique maintains numerical stability and robustness to noise even with nonuniform sampling.

The sinh regularized nonuniform sampling series is a family of high-convergence reconstruction formulas for bandlimited functions sampled at nonuniform nodes. Its central innovation is the use of a compactly supported, even window function of the form $\varphi_{\sinh}(x) = (1/\sinh\beta)\,\sinh(\beta\sqrt{1 - x^2/m^2})$ for $|x|\le m$ and $0$ otherwise, which regularizes classical kernels such as the Lagrange and Shannon sinc bases. When applied to Lagrangian and Shannon sampling frameworks, this regularization yields approximation errors that decay exponentially in $m$ and the regularization parameter $\beta$, often with convergence exponents nearly twice as strong as those of Gaussian windowing, and with robust numerical behavior in the presence of sampling noise (Jiang et al., 25 Jan 2026, Kircheis et al., 2022).

1. Mathematical Definition and Kernel Construction

Given a finite sampling set $\Lambda = \{\lambda_j\}_j$, the classical Lagrange basis for an entire sine-type generating function $F_\Lambda(z)$ is

$$Q_{\Lambda,j}(x) = \frac{F_\Lambda(x)}{F'_\Lambda(\lambda_j)\,(x-\lambda_j)},$$

and reconstruction proceeds via $f(x) \approx \sum_j f(\lambda_j)\,Q_{\Lambda,j}(x)$.

The sinh-regularized kernel augments this basis by multiplication with the window $\varphi_{\beta,m}$:

$$K_{\sinh}(x, \lambda_j) = Q_{\Lambda,j}(x)\,\varphi_{\beta,m}(x-\lambda_j),$$

where $\varphi_{\beta,m}(x)$ is defined as

$$\varphi_{\beta,m}(x) = \begin{cases} \dfrac{1}{\sinh\beta}\,\sinh\!\left(\beta\sqrt{1-\dfrac{x^2}{m^2}}\right), & |x|\le m, \\[4pt] 0, & |x|>m. \end{cases}$$
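As a concrete illustration, the window $\varphi_{\beta,m}$ is straightforward to evaluate numerically. The following NumPy sketch (the function name `sinh_window` is ours, not from the cited papers) implements the piecewise definition above:

```python
import numpy as np

def sinh_window(x, beta, m):
    """Compactly supported sinh window: sinh(beta*sqrt(1 - x^2/m^2)) / sinh(beta)
    for |x| <= m, and 0 otherwise."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    inside = np.abs(x) <= m
    out[inside] = np.sinh(beta * np.sqrt(1.0 - (x[inside] / m) ** 2)) / np.sinh(beta)
    return out if out.size > 1 else out[0]

# The window equals 1 at the origin and vanishes at the support boundary |x| = m,
# which is what removes the truncation tail from the sampling series.
```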

For the regularized Shannon series on a uniform grid, the kernel becomes

$$\psi_{\sinh}(x) = \mathrm{sinc}(L\pi x)\,\varphi_{\sinh}(x),$$

and the reconstruction formula is

$$(R_{\sinh,m} f)(t) = \sum_{k} f\!\left(\frac{k}{L}\right)\,\mathrm{sinc}(L\pi t-\pi k)\,\varphi_{\sinh}\!\left(t-\frac{k}{L}\right),$$

with window support $|t-k/L| \le m/L$ and oversampling $L = N(1+\lambda)$ (Kircheis et al., 2023, Kircheis et al., 2022).
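A minimal sketch of this regularized Shannon reconstruction, assuming samples are available through a callable `f` (names are ours): because the window vanishes for $|t - k/L| > m/L$, the sum at each $t$ involves only the $2m+1$ grid points nearest $Lt$. Note that `np.sinc(x)` computes $\sin(\pi x)/(\pi x)$, so $\mathrm{sinc}(L\pi t - \pi k)$ is `np.sinc(L*t - k)`.

```python
import numpy as np

def sinh_shannon(f, t, L, m, beta):
    """Sinh-regularized Shannon series (R_{sinh,m} f)(t) on the uniform grid k/L."""
    k0 = int(np.round(L * t))
    ks = np.arange(k0 - m, k0 + m + 1)   # the 2m+1 relevant grid indices
    d = t - ks / L                       # distances to grid points, |d| <~ m/L
    window = np.zeros_like(d)
    inside = np.abs(d) <= m / L
    window[inside] = np.sinh(beta * np.sqrt(1.0 - (L * d[inside] / m) ** 2)) / np.sinh(beta)
    return float(np.sum(f(ks / L) * np.sinc(L * t - ks) * window))

# Example: reconstruct f(t) = sinc(t) (bandwidth pi) from its samples on the grid k/2.
approx = sinh_shannon(np.sinc, 0.3, L=2, m=10, beta=np.pi * 10 / 2)
```

Here $\beta = \pi m/2$ is a deliberately conservative choice for oversampling $L = 2$; the cited optimal rule $\beta = \pi m(1+\lambda-2\tau)/(1+\lambda)$ yields somewhat smaller errors.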

2. Error Bounds and Convergence Analysis

The central analytical advantage of the sinh regularizer is exponential error decay, with main error terms proportional to $C\,\beta\,e^{-\beta}$, where $\beta$ is typically chosen as $(N-1)(\pi-\delta)$ for bandwidth $\delta$ (Jiang et al., 25 Jan 2026):

$$\sup_{x \in [-1,1]} \bigl| f(x) - S_{f,Q,N}^{\sinh}(x) \bigr| \le C_\Lambda\,\beta\,e^{-(N-1)(\pi-\delta)}\,\|f\|_{L^2(\mathbb{R})}.$$

For the Shannon setting, the error satisfies

$$\|f - R_{\sinh,m}f\|_{C(\mathbb{R})} \le 3\sqrt{2\delta}\;e^{-\beta}\,\|f\|_{L^2(\mathbb{R})},$$

subject to $\beta = \pi m(1+\lambda-2\tau)/(1+\lambda)$ and $\tau = \delta/N$ (Kircheis et al., 2022).

In the nonuniform case, provided the nodes are $D$-dense (no gaps larger than $T = 1/L$) and separated, identical exponential bounds hold for

$$(R_{\sinh, m}^{\text{non}} f)(t) = \sum_{n:\, |t-x_n|\le m T} f(x_n)\,\mathrm{sinc}\bigl(\pi (t-x_n)/T\bigr)\,\varphi_{\sinh}(t-x_n).$$
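The nonuniform variant differs only in that the neighbor set is found by distance rather than by grid index. A sketch under the same naming assumptions (the nodes must be $D$-dense around $t$ for the error bound to apply):

```python
import numpy as np

def sinh_nonuniform(x_nodes, f_vals, t, T, m, beta):
    """Sinh-regularized nonuniform series: sum over nodes with |t - x_n| <= m*T."""
    x_nodes = np.asarray(x_nodes, dtype=float)
    f_vals = np.asarray(f_vals, dtype=float)
    near = np.abs(t - x_nodes) <= m * T
    d = t - x_nodes[near]
    window = np.sinh(beta * np.sqrt(1.0 - (d / (m * T)) ** 2)) / np.sinh(beta)
    # np.sinc(x) = sin(pi x)/(pi x), so sinc(pi(t - x_n)/T) = np.sinc(d / T)
    return float(np.sum(f_vals[near] * np.sinc(d / T) * window))

# Uniform nodes x_n = n*T are the special case; jittered nodes work the same way
# as long as no gap exceeds T and the nodes stay separated.
nodes = np.arange(-40, 41) * 0.5
val = sinh_nonuniform(nodes, np.sinc(nodes), t=0.3, T=0.5, m=10, beta=np.pi * 10 / 2)
```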

The improved rate, $e^{-(\pi-\delta)N}$ for sinh versus $e^{-(\pi-\delta)N/2}$ for Gaussian, implies errors $2$–$5$ orders of magnitude smaller for moderate $N$.
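The exponential decay in $m$ (and hence in $N$) is easy to observe numerically. The sketch below, with our own helper names and the conservative choice $\beta = \pi m/2$ at oversampling $L = 2$, reconstructs $f(t)=\mathrm{sinc}(t)$ from uniform samples and checks that the pointwise error falls steeply as the support grows:

```python
import numpy as np

def recon_error(m, L=2.0, t=0.37):
    """Error of the sinh-regularized Shannon series for f = sinc at one point t."""
    beta = np.pi * m / 2                 # conservative beta for oversampling L = 2
    k0 = np.round(L * t)
    ks = np.arange(k0 - m, k0 + m + 1)   # only the 2m+1 nearest grid points
    d = t - ks / L
    window = np.zeros_like(d)
    inside = np.abs(d) <= m / L
    window[inside] = np.sinh(beta * np.sqrt(1.0 - (L * d[inside] / m) ** 2)) / np.sinh(beta)
    approx = np.sum(np.sinc(ks / L) * np.sinc(L * t - ks) * window)
    return abs(approx - np.sinc(t))

# Tripling the support m should shrink the error by many orders of magnitude.
e4, e12 = recon_error(4), recon_error(12)
```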

3. Parameter Selection and Localization

Optimal performance requires tuning of $\beta$ and the support $m$:

  • Choice of $\beta$: Set $\beta = (N-1)(\pi-\delta)$, leading to window support $m = N-1$. In practical contexts, further increases beyond this offer marginal returns (Jiang et al., 25 Jan 2026).
  • Localization: At any evaluation point $t$, the series requires only those nodes with $|t-\lambda_j| \le m$ for the Lagrangian series or $|k-Lt|\le m$ for the Shannon series, restricting the summation to $2m+1$ terms.

This compact support directly localizes computation and reduces truncation error (no tail term), as proven in multiple error analyses (Kircheis et al., 2023, Kircheis et al., 2022).
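A small helper collecting the recommended choices (the function name and the $\beta e^{-\beta}$ predicted-error scale are our gloss on the cited rule, not code from the papers):

```python
import numpy as np

def sinh_parameters(N, delta):
    """Recommended tuning: beta = (N-1)*(pi - delta), support m = N-1.
    Returns (beta, m, error_scale), where error_scale = beta*exp(-beta)
    mirrors the main error term C*beta*e^{-beta}."""
    beta = (N - 1) * (np.pi - delta)
    m = N - 1
    return beta, m, beta * np.exp(-beta)

# Example: N = 12 samples at bandwidth delta = pi/2.
beta, m, scale = sinh_parameters(N=12, delta=np.pi / 2)
```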

4. Numerical Stability and Robustness to Noise

Numerical experiments demonstrate that sinh-regularized series maintain exponential error decay even in the presence of sample perturbations. For noise of amplitude $\epsilon$ at each sample, the output perturbation is bounded by $O(\epsilon \sqrt{m})$ (Kircheis et al., 2022). The localized summation and rapid error decay confer robustness for large-scale practical implementations, including the nonuniform fast Fourier transform (NNFFT) and fast sinc transforms (Kircheis et al., 2021).

5. Comparative Performance: Numerical Evidence

Empirical tests reconstruct bandlimited signals under both uniform and nonuniform node distributions. For the canonical example function, the following tables exhibit reconstruction errors for no regularization, Gaussian, and sinh-type regularization ($\delta = \pi/2$):

Non-periodic case (excerpt from Jiang et al., 25 Jan 2026):

| $N$ | No Reg. | Gaussian | Sinh |
|-----|---------|----------|------|
| 6 | $1.85\times10^{-1}$ | $9.25\times10^{-2}$ | $2.67\times10^{-3}$ |
| 12 | $1.64\times10^{-2}$ | $1.53\times10^{-3}$ | $1.75\times10^{-7}$ |
| 18 | $5.90\times10^{-3}$ | $1.79\times10^{-5}$ | $1.75\times10^{-11}$ |

Periodic case ($M=3$ channels):

| $N$ | No Reg. | Gaussian | Sinh |
|-----|---------|----------|------|
| 2 | $2.68\times10^{-3}$ | $2.99\times10^{-3}$ | $7.54\times10^{-5}$ |
| 4 | $7.77\times10^{-4}$ | $6.07\times10^{-6}$ | $2.64\times10^{-9}$ |
| 6 | $3.68\times10^{-4}$ | $2.64\times10^{-8}$ | $6.08\times10^{-14}$ |

Observed convergence matches $e^{-(\pi-\delta)N}$ for sinh and $e^{-(\pi-\delta)N/2}$ for Gaussian regularization.
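As a sanity check, the sinh column of the non-periodic table can be compared against the claimed rate $e^{-(\pi-\delta)N}$: each step of $6$ in $N$ should shrink the error by roughly $e^{-(\pi-\delta)\cdot 6} \approx 8\times10^{-5}$ for $\delta = \pi/2$. A quick back-of-the-envelope script (ours, not from the source):

```python
import numpy as np

# sinh-regularization errors from the non-periodic table (delta = pi/2)
errs = {6: 2.67e-3, 12: 1.75e-7, 18: 1.75e-11}

# predicted decay factor per step of 6 in N under the rate e^{-(pi - delta) N}
predicted = np.exp(-(np.pi - np.pi / 2) * 6)

ratios = [errs[12] / errs[6], errs[18] / errs[12]]
# both observed ratios land within an order of magnitude of the prediction
```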

6. Applications to Fast Algorithms

Sinh windows regularize transform kernels in the NNFFT, producing error terms that decay exponentially as $m_j \to \infty$, specifically as $\exp(-2\pi m_j \sqrt{1-1/\sigma_j})$ for oversampling factor $\sigma_j$ (Kircheis et al., 2021). For the fast sinc transform, these windows are reused to achieve overall error $O(\varepsilon + E_{\text{win}})$, with $E_{\text{win}}$ the compounded window error. The entire process is computationally efficient, with $O\bigl((N+L_1+L_2)\log(N+L_1+L_2)\bigr)$ complexity.

7. Practical Recommendations and Generalization

For high-accuracy reconstruction of bandlimited signals from nonuniform samples, the recommended configuration is:

  • Use oversampling $L = N(1+\lambda)$ with moderate $\lambda$.
  • Set regularization $\beta = (N-1)(\pi-\delta)$ and support $m = N-1$ for direct coverage of the interpolation nodes.
  • Restrict summation to localized nodes, yielding exponential decay in approximation error and robustness to outlier noise.

The methodology extends naturally to general separated, $D$-dense node sets; all key features (exponential decay, numerical stability, and computational efficiency) persist under nonuniform sampling regimes (Kircheis et al., 2022, Jiang et al., 25 Jan 2026). The sinh-type window is thus the preferred regularizer for practical and theoretical applications demanding minimal reconstruction error from finite, possibly irregularly spaced samples.
