
Laplace-Enhanced SINDy (LES-SINDy)

Updated 18 February 2026
  • The paper demonstrates how LES-SINDy leverages the Laplace transform to represent high-order derivatives and discontinuous terms exactly, mitigating numerical differentiation issues.
  • LES-SINDy is a data-driven framework that constructs a Laplace-domain candidate function matrix for sparse regression, enabling robust identification of governing equations even under high noise.
  • LES-SINDy exhibits superior performance on ODEs and PDEs by achieving accurate coefficient recovery and low parameter errors compared to classical SINDy methodologies.

Laplace-Enhanced Sparse Identification of Nonlinear Dynamical Systems (LES-SINDy) is a data-driven framework for governing equation discovery that circumvents limitations of classical SINDy when faced with high-order derivatives, discontinuities, or unbounded growth functions, particularly under noisy conditions. LES-SINDy operates by transferring time-domain data to the Laplace domain, enabling analytic treatment of derivatives and discontinuous terms, and mitigating numerical difficulties in the identification of ordinary and partial differential equations (ODEs and PDEs) (Zheng et al., 2024).

1. Theoretical Foundations

LES-SINDy extends the Sparse Identification of Nonlinear Dynamical Systems (SINDy) paradigm by integrating Laplace transform theory. For a vector-valued measurement time series $u(t)$ sampled at $t_1,\dots,t_m$, the method first constructs a SINDy library $X(t_j)$ that includes constants, $t_j$, $u(t_j)$, and derivatives up to order $k$, along with their tensor-product monomials up to degree $n$.

Each column $X_i(t)$ of the resulting $m \times d$ matrix undergoes a Laplace transform:

$$\mathcal{L}\{X_i(t)\}(s) = \int_0^\infty e^{-st} X_i(t) \, dt,$$

which, on finite data, is approximated by

$$X_i(s) \approx \sum_{j=1}^m e^{-s(t_j - t_1)} X_i(t_j)\,\Delta t_j,$$

with $\Delta t_j = t_{j+1} - t_j$ (and $\Delta t_m = t_m - t_{m-1}$ for the last sample).
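As a concrete illustration, the discrete approximation above takes only a few lines. This is a minimal sketch of my own (not the reference implementation), sanity-checked against the known pair $\mathcal{L}\{e^{-t}\}(s) = 1/(s+1)$:

```python
import numpy as np

def discrete_laplace(x, t, s):
    """Approximate L{x}(s) from samples x(t_j) via the weighted sum above."""
    dt = np.diff(t)
    dt = np.append(dt, dt[-1])          # last step: dt_m = t_m - t_{m-1}
    return np.sum(np.exp(-s * (t - t[0])) * x * dt)

# Sanity check on a known transform pair: L{e^{-t}}(s) = 1/(s + 1).
t = np.linspace(0.0, 30.0, 200001)      # long window so the tail is negligible
x = np.exp(-t)
s = 2.0
approx = discrete_laplace(x, t, s)
exact = 1.0 / (s + 1.0)
```

The approximation error here comes only from the quadrature, not from any differentiation of the data.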

The Laplace transform treats derivatives analytically using integration by parts:

$$\mathcal{L}\left\{\frac{d^k u}{dt^k}\right\}(s) = s^k \mathcal{L}\{u\}(s) - \sum_{n=0}^{k-1} s^{k-1-n}\, u^{(n)}(0).$$

This enables LES-SINDy to avoid numerical differentiation entirely; all high-order derivatives are represented exactly in the transform domain if initial conditions are available.
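The derivative rule can be verified numerically with the same discrete transform. The snippet below is my own check (not from the paper), using the clean test signal $u(t) = e^{-t}$ so that the exact derivative is available without finite differences:

```python
import numpy as np

def discrete_laplace(x, t, s):
    """Approximate L{x}(s) from samples, with dt_m = t_m - t_{m-1}."""
    dt = np.append(np.diff(t), t[-1] - t[-2])
    return np.sum(np.exp(-s * (t - t[0])) * x * dt)

t = np.linspace(0.0, 30.0, 200001)
u = np.exp(-t)
du = -np.exp(-t)                             # exact u'(t), no finite differencing
s = 2.0
lhs = discrete_laplace(du, t, s)             # L{u'}(s)
rhs = s * discrete_laplace(u, t, s) - u[0]   # s L{u}(s) - u(0), first-order rule
```

Both sides approximate $-1/(s+1)$, confirming that derivative columns can be built purely from $s^k$ factors and initial conditions.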

2. Algorithmic Workflow

The LES-SINDy procedure is composed of three main steps:

  1. Laplace-Enhanced Library Construction:
    • Assemble the candidate function matrix $X(t_j)$ from the time-series data.
    • Select $L$ complex frequencies $s_1,\dots,s_L$ with $\mathrm{Re}(s_\ell) > 0$.
    • For each $\ell$, compute the Laplace-domain values $X_i(s_\ell)$ as above; stack them to form $\Theta \in \mathbb{C}^{L \times d}$, whose $(\ell, i)$ entry is $X_i(s_\ell)$.
  2. Sparse Regression in the Laplace Domain:
    • Solve the implicit equation $\Theta \xi = 0$ for a sparse $\xi \in \mathbb{R}^d$.
    • To avoid the trivial solution $\xi = 0$, cycle through each index $i$, fix $\xi_i = 1$, and solve

    $$\min_{\zeta} \|\Theta_{-i}\zeta - \Theta_i\|_2^2 + \lambda R(\zeta),$$

    where $\Theta_i$ is the $i$-th column of $\Theta$, $\Theta_{-i}$ collects the remaining columns, and $R(\zeta)$ is a sparsity-promoting regularizer such as $\ell_1$ or thresholded least squares.
    • This process yields $d$ candidate models $\xi^{(i)}$.
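The cycling regression of step 2 can be sketched on toy data. Everything below is my own illustration (the matrix is synthetic, and sequentially thresholded least squares is just one choice for the sparsifier $R$), not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)
c1, c2, c3 = rng.standard_normal((3, 50))
# Synthetic Theta with one exact linear dependence: c4 = 2*c1 - 3*c2.
Theta = np.column_stack([c1, c2, c3, 2.0 * c1 - 3.0 * c2])

def stls(A, b, thresh=0.1, iters=10):
    """Least squares with hard thresholding of small coefficients."""
    zeta = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(zeta) < thresh
        zeta[small] = 0.0
        big = ~small
        if big.any():
            zeta[big] = np.linalg.lstsq(A[:, big], b, rcond=None)[0]
    return zeta

candidates = []
for i in range(Theta.shape[1]):
    A = np.delete(Theta, i, axis=1)      # Theta_{-i}: all columns except i
    zeta = stls(A, Theta[:, i])          # min ||Theta_{-i} zeta - Theta_i||^2
    xi = np.insert(-zeta, i, 1.0)        # rebuild xi with xi_i = 1, Theta @ xi ~ 0
    candidates.append(xi)
```

For the dependent column ($i = 3$) this recovers the implicit relation $-2 c_1 + 3 c_2 + c_4 = 0$ with the spurious $c_3$ term thresholded away.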

  3. Model Evaluation and Selection:

    • Each candidate $\xi^{(i)}$ is converted to an explicit ODE/PDE, integrated numerically (e.g., via Runge–Kutta), and produces a predicted trajectory $\hat{U}(t)$.
    • Two quality metrics are computed. The log-RMSE,

    $$\varepsilon_1(\xi) = \log\left[\frac{1}{m}\sum_{j=1}^m \|u(t_j) - \hat{U}(t_j;\xi)\|_2^2\right]^{1/2},$$

    is used primarily for hyperparameter tuning. The corrected AIC (AICc),

    $$\varepsilon_2(\xi) = 2p + m\ln(2\pi\sigma^2) + \frac{1}{\sigma^2}\sum_{j=1}^m \|u(t_j) - \hat{U}(t_j)\|^2 + \frac{2(p+1)(p+2)}{m-p-2},$$

    where $p$ is the number of nonzero terms in $\xi$ and $\sigma^2$ is the estimated residual variance, is used for model selection.

The model minimizing AICc is selected as the optimal, parsimonious equation.
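The selection step reduces to scoring each simulated candidate and taking the minimizer. The sketch below is illustrative only: the residuals are hypothetical constants (so the comparison is deterministic), not results from the paper:

```python
import numpy as np

def aicc(residuals, p, sigma2):
    """Corrected AIC for m residuals, p active terms, noise variance sigma2."""
    m = len(residuals)
    rss = np.sum(residuals ** 2)
    return (2 * p + m * np.log(2 * np.pi * sigma2) + rss / sigma2
            + 2 * (p + 1) * (p + 2) / (m - p - 2))

# Hypothetical residual magnitudes for three candidate models:
m, sigma2 = 200, 0.01
residuals = [np.full(m, 0.1),   # accurate, parsimonious (2 terms)
             np.full(m, 0.1),   # equally accurate but overparameterized (5 terms)
             np.full(m, 0.5)]   # parsimonious but biased (2 terms)
p_terms = [2, 5, 2]
scores = [aicc(r, p, sigma2) for r, p in zip(residuals, p_terms)]
best = int(np.argmin(scores))
```

AICc penalizes both the extra terms of the second model and the larger residuals of the third, so the first candidate wins.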

3. Analytic Handling of High-Order Derivatives and Discontinuities

A core innovation of LES-SINDy is its analytic approach for traditionally challenging features:

  • High-order derivatives: In the Laplace domain, derivatives become algebraic (factors of $s^k$), avoiding finite-difference schemes that amplify noise.
  • Discontinuous and impulsive terms: Step functions $H(t - t_0)$ and impulses $\delta(t - t_0)$ possess known Laplace transforms ($e^{-s t_0}/s$ and $e^{-s t_0}$, respectively), allowing explicit inclusion and identification.
  • Unbounded or slowly growing functions: The $e^{-st}$ weighting in the Laplace transform de-emphasizes late times, where numerical errors and exponential growth dominate, provided $\mathrm{Re}(s)$ is sufficiently large.
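The step-function case is easy to check numerically: a sampled Heaviside column, pushed through the discrete transform, should match $e^{-s t_0}/s$. This is my own verification sketch, not code from the paper:

```python
import numpy as np

def discrete_laplace(x, t, s):
    """Approximate L{x}(s) from samples, with dt_m = t_m - t_{m-1}."""
    dt = np.append(np.diff(t), t[-1] - t[-2])
    return np.sum(np.exp(-s * (t - t[0])) * x * dt)

t = np.linspace(0.0, 30.0, 200001)
t0 = 1.5
step = np.where(t >= t0, 1.0, 0.0)       # H(t - t0) sampled on the grid
s = 2.0
approx = discrete_laplace(step, t, s)
exact = np.exp(-s * t0) / s              # analytic transform of H(t - t0)
```

The jump causes no difficulty because the transform integrates rather than differentiates the data.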

This analytic treatment broadens the class of systems addressable by SINDy methodologies.

4. Regression and Model Selection in the Laplace Domain

The transformation to the Laplace domain offers computational and statistical advantages:

  • Resolution-independent regression: The feature matrix $\Theta$ is $L \times d$, independent of the number of time samples $m$, decoupling regression cost from data resolution.
  • Sparse regression is enforced in the Laplace domain, seeking a minimal support solution for ξ\xi that captures the underlying dynamical structure.
  • After candidate equations are inferred, simulation and out-of-sample evaluation ensure that discovered models are both accurate and parsimonious, as quantified by the AICc metric.

5. Empirical Performance Across Dynamical Systems

LES-SINDy has been validated on a range of ODE and PDE identification tasks:

  • High-order ODEs: Fourth-order models (e.g., $u_{tttt} + \alpha u_{tt} + \beta u = 0$) are recovered with exact coefficients even under substantial noise. LES-SINDy achieves substantial improvements in AICc (e.g., from $-587.8$ to $-2559.3$), while classical SINDy fails to resolve high-order derivatives or produces spurious terms.
  • Discontinuous forcing: For ODEs with Heaviside or Dirac impulse terms, LES-SINDy accurately recovers both continuous and discontinuous dynamics. Classical SINDy fails or misestimates these terms, especially under noise.
  • Trigonometric and hyperbolic forcing: While Fourier-domain techniques are limited by unbounded inputs like $\sinh$ or $\cosh$, LES-SINDy successfully recovers their coefficients and structure, tolerating noise levels up to $\sim 10\%$.
  • Nonlinear ODE systems: Canonical models such as Lorenz and Lotka–Volterra are reconstructed with parameter errors $<0.5\%$ and $<1\%$, respectively, at strongly negative AICc.
  • PDEs: For convection–diffusion, Burgers, and Kuramoto–Sivashinsky equations, LES-SINDy recovers governing equations with $<3\%$ parameter error at noise levels up to $30\%$. It outperforms both classical SINDy and Weak-SINDy, tolerating up to twice the noise before failure in the Kuramoto–Sivashinsky test.

Table 1 below summarizes specific empirical findings.

| System Type | Standard SINDy Limitation | LES-SINDy Performance |
| --- | --- | --- |
| High-order ODE | Fails on $u_{tttt}$ with noise | Exact coefficient recovery, strong AICc gains |
| Discontinuities | Misses $\delta$, $H$ terms | Accurate recovery, $<1\%$ error under 10% noise |
| Unbounded forcing | Fourier methods break down | Accurate recovery for $\sinh$, $\cosh$ |
| PDEs (KS, Burgers) | Low noise tolerance | Twice the noise tolerance, low parameter error |

6. Broader Implications and Outlook

LES-SINDy substantially extends the reach of sparse regression-based equation discovery in applied mathematics, physics, and engineering domains where discontinuities, high-order derivatives, and unbounded inputs are prevalent. Its computational efficiency, owing to resolution-independent regression, and its robustness under noise suggest promise for both theoretical analysis and practical applications, including real-world scenarios with measurement challenges and incomplete initial conditions. A plausible implication is that LES-SINDy may catalyze further development of Laplace-domain approaches to model inference.

For implementation details, theoretical justification, and further benchmarks, see "LES-SINDy: Laplace-Enhanced Sparse Identification of Nonlinear Dynamical Systems" (Zheng et al., 2024).
