Learned Energy-Constrained Stencils
- The paper presents a novel data-driven approach that learns convolution stencils subject to skew-adjoint constraints to preserve discrete electromagnetic energy.
- It formulates the stencil learning as a convex quadratic program with rigorous linear and box constraints, ensuring energy conservation under Crank–Nicolson schemes.
- Numerical tests demonstrate that learned stencils maintain energy invariants to machine precision while matching or slightly improving accuracy compared to classical central-difference methods.
Energy-constrained learned stencils are data-driven spatial discretizations tailored for the one-dimensional Maxwell system that preserve discrete electromagnetic energy through explicit enforcement of skew-adjointness. These stencils are designed by learning convolution operators from high-fidelity spectral data, subject to rigorous linear constraints that guarantee exact conservation properties under implicit time discretization schemes such as Crank–Nicolson. The approach connects data-driven numerical methods with the classical structure-preserving finite-difference time-domain (FDTD) paradigm by marrying data fit with structural constraints that ensure energy conservation at the semi-discrete level (Obieke, 5 Jan 2026).
1. Formulation of the Data-Driven Stencil Learning Problem
Let $w = (w_{-r}, \dots, w_r)$ denote the convolution stencil weights representing the discrete spatial derivative, with support width $2r+1$. The learning problem is formulated as a convex quadratic program:

$$\min_{w}\ \tfrac{1}{2}\,\|A w - b\|_2^2 + \tfrac{\lambda}{2}\,\|w\|_2^2$$

subject to

$$w_{-k} + w_{k} = 0 \quad (k = 0, \dots, r), \qquad \ell \le w \le u.$$

Here, $A$ is the design matrix that compiles all sliding-window patches from training data for the $E$ and $H$ fields, $b$ contains the target temporal derivatives, and $\lambda \ge 0$ implements Tikhonov regularization. The linear constraint enforces discrete skew-adjointness via

- $w_{-k} = -w_k$ for $k = 0, \dots, r$ (in particular, $w_0 = 0$)

This is both necessary and sufficient for the induced operator $D$ to satisfy $D^\top = -D$ in the grid inner product. The box constraints $\ell \le w \le u$ bound the coefficients.
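As an illustration, the skew-symmetry constraint can be eliminated exactly by parametrizing the stencil with its positive-offset weights, which (when the box constraints are inactive) reduces the program to ordinary regularized least squares. The sketch below uses synthetic data; `A`, `b`, `r`, and `lam` are placeholder names and values, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: stencil radius r and Tikhonov weight lam are illustrative.
r, lam = 2, 1e-8
A = rng.standard_normal((200, 2 * r + 1))   # design matrix of sliding-window patches
b = rng.standard_normal(200)                # target temporal derivatives

# Parametrize the constraint exactly: w_0 = 0 and w_{-k} = -w_k, so only
# v = (w_1, ..., w_r) is free. Column r+k of A maps to offset +k, hence
# A w = A_red v with A_red[:, k-1] = A[:, r+k] - A[:, r-k].
A_red = np.stack([A[:, r + k] - A[:, r - k] for k in range(1, r + 1)], axis=1)

# Regularized normal equations; the factor 2 accounts for ||w||^2 = 2 ||v||^2.
v = np.linalg.solve(A_red.T @ A_red + 2 * lam * np.eye(r), A_red.T @ b)

# Reassemble the full stencil: skew-symmetry holds by construction.
w = np.concatenate([-v[::-1], [0.0], v])
print(w)
```

Because the constraint is built into the parametrization rather than penalized, the resulting stencil is skew-symmetric to the last bit, mirroring how PG/NAG-style projections enforce it to machine precision.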
2. Skew-Symmetric Convolutions and Discrete Energy Conservation
Consider the semi-discrete Maxwell system on a uniform periodic grid with $N$ points and mesh spacing $\Delta x$:

$$\frac{dE}{dt} = D\,H, \qquad \frac{dH}{dt} = D\,E,$$

where $D$ is the convolution operator defined by the learned stencil $w$. The discrete electromagnetic energy is

$$\mathcal{E}(t) = \frac{\Delta x}{2} \sum_{j=1}^{N} \left( E_j^2 + H_j^2 \right).$$

Differentiation yields

$$\frac{d\mathcal{E}}{dt} = \Delta x \left( \langle E, D H \rangle + \langle H, D E \rangle \right).$$

If $D$ is skew-adjoint ($D^\top = -D$), the two terms cancel exactly, since $\langle H, D E \rangle = \langle D^\top H, E \rangle = -\langle E, D H \rangle$, so $d\mathcal{E}/dt = 0$: the discrete system preserves electromagnetic energy identically. This condition extends to Crank–Nicolson time stepping: the implicit midpoint update

$$\frac{E^{n+1} - E^n}{\Delta t} = D\,\frac{H^{n+1} + H^n}{2}, \qquad \frac{H^{n+1} - H^n}{\Delta t} = D\,\frac{E^{n+1} + E^n}{2}$$

ensures

$$\mathcal{E}^{n+1} = \mathcal{E}^{n}$$

by virtue of the skew-adjointness constraint. Thus, discrete energy is conserved to machine precision.
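The cancellation argument can be checked numerically: for any skew-symmetric $D$, the energy production $\Delta x\,(\langle E, DH\rangle + \langle H, DE\rangle)$ vanishes to roundoff. A minimal sketch, using the periodic centered-difference operator as the skew-symmetric example and random fields:

```python
import numpy as np

rng = np.random.default_rng(2)
N, dx = 128, 1.0 / 128
E, H = rng.standard_normal(N), rng.standard_normal(N)

# Periodic centered difference as a skew-symmetric circulant: (D u)_j = (u_{j+1} - u_{j-1}) / (2 dx).
D = (np.roll(np.eye(N), 1, axis=1) - np.roll(np.eye(N), -1, axis=1)) / (2 * dx)
assert np.allclose(D.T, -D)                # discrete skew-adjointness

# d/dt of (dx/2) * sum(E^2 + H^2) along the semi-discrete flow.
dE_dt = dx * (E @ (D @ H) + H @ (D @ E))
print(abs(dE_dt))                          # vanishes up to floating-point roundoff
```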
3. Fourier Symbol, Numerical Wave Speed, and CFL Condition
The periodic convolution stencil $w$ has the Fourier symbol

$$\widehat{D}(\theta) = \sum_{k=-r}^{r} w_k\, e^{i k \theta} = 2i \sum_{k=1}^{r} w_k \sin(k\theta), \qquad \theta = \kappa\,\Delta x,$$

where the second equality uses the antisymmetry $w_{-k} = -w_k$. This symbol determines the eigenvalues of the semi-discrete evolution for each mode as $\pm\,\widehat{D}(\theta)$ (purely imaginary), and therefore the numerical wave speed

$$c_{\mathrm{num}}(\kappa) = \frac{|\widehat{D}(\kappa\,\Delta x)|}{\kappa}.$$

The maximum modal frequency is

$$\sigma_{\max} = \max_{\theta \in [0,\pi]} |\widehat{D}(\theta)|.$$

When using explicit time-stepping (e.g., the leapfrog scheme), the CFL condition for stability is

$$\Delta t \le \frac{2}{\sigma_{\max}}.$$

The learned stencil's Fourier symbol therefore sets both propagation speeds and stability boundaries in the discrete solver.
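A short sketch of this symbol calculus, using the centered-difference stencil $w_{\pm 1} = \pm 1/(2\,\Delta x)$ as the concrete example (the grid parameters are illustrative, not the paper's):

```python
import numpy as np

dx, r = 0.05, 1
w_pos = np.array([1.0 / (2 * dx)])            # (w_1, ..., w_r); w_{-k} = -w_k implied
theta = np.linspace(1e-9, np.pi, 4000)        # theta = kappa * dx

# |D_hat(theta)| for an antisymmetric stencil: 2 * sum_k w_k sin(k theta).
sigma = 2 * sum(w_pos[k - 1] * np.sin(k * theta) for k in range(1, r + 1))

c_num = sigma * dx / theta                    # numerical wave speed per mode
dt_max = 2.0 / np.max(np.abs(sigma))          # leapfrog CFL bound dt <= 2 / sigma_max

print(c_num[0], dt_max)                       # wave speed near 1 for small theta; dt_max near 2*dx
```

For the centered difference this recovers the familiar dispersion curve $c_{\mathrm{num}} = \sin\theta/\theta$, which approaches the exact speed as $\theta \to 0$.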
4. Comparison of Convex Optimization Solvers
Several convex solvers are evaluated for the quadratic program:
- Projected Gradient (PG) and Nesterov-Accelerated Gradient (NAG): Solve the equality-constrained problem with updates projected onto the skew-symmetric subspace. They enforce the constraints to machine precision and are computationally inexpensive, but for larger stencil radii $r$ they plateau at suboptimal objective values.
- ADMM (Alternating Direction Method of Multipliers): Employs variable splitting $w = z$, handling the equality constraints in the $w$-updates and box clipping in the $z$-updates. ADMM reaches the same objective as interior-point solvers in 1–2 iterations with negligible equality violations, and its data-fit error remains low as $r$ increases.
- Interior-Point (CVXPY+SCS): Used as a reference; achieves the lowest objectives, matching ADMM to 6–7 digits, but incurs higher runtime.
Numerical tests at small stencil radius recover stencils close to the exact centered-difference weights, and all methods deliver comparable final-time electric-field error. At larger radii, PG and NAG errors rise, while ADMM and CVX errors remain at the lower level.
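A minimal projected-gradient sketch on synthetic data illustrates the PG variant: each gradient step is followed by a projection that alternates the skew-symmetric subspace and the box (data, box bounds, and iteration count here are placeholders, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
r = 2
A = rng.standard_normal((300, 2 * r + 1))     # synthetic design matrix
b = rng.standard_normal(300)                  # synthetic targets

def project(w, lo=-10.0, hi=10.0):
    w = 0.5 * (w - w[::-1])                   # orthogonal projection onto {w : w_{-k} = -w_k}
    return np.clip(w, lo, hi)                 # box constraint via clipping

w = np.zeros(2 * r + 1)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L for the least-squares gradient
for _ in range(2000):
    w = project(w - step * A.T @ (A @ w - b))

print(w)                                      # skew-symmetric up to roundoff
```

Because the projection is applied after every step, the equality constraint holds to machine precision throughout, which matches the behavior reported for PG/NAG above.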
5. Discrete Energy Conservation under Crank–Nicolson Schemes
Crank–Nicolson applied to the semi-discrete system with a skew-adjoint $D$ yields an update that preserves the discrete energy invariant:

$$\mathcal{E}^{n} = \frac{\Delta x}{2} \sum_{j=1}^{N} \left( (E_j^n)^2 + (H_j^n)^2 \right).$$

Defining $u^n = (E^n, H^n)$, the update guarantees

$$\mathcal{E}^{n+1} = \mathcal{E}^{n} \quad \text{for all } n.$$

Numerically, the drift after many CN steps is at the level of machine roundoff for both learned and standard stencils.
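The drift experiment can be reproduced in miniature with the Cayley-transform form of the CN update; grid size, time step, and step count below are illustrative, not the paper's values:

```python
import numpy as np

N, dx, dt, steps = 64, 1.0 / 64, 0.02, 500

# Skew-symmetric circulant derivative (centered difference) and block system matrix.
D = (np.roll(np.eye(N), 1, axis=1) - np.roll(np.eye(N), -1, axis=1)) / (2 * dx)
M = np.block([[np.zeros((N, N)), D], [D, np.zeros((N, N))]])

# CN update as a Cayley transform: u^{n+1} = (I - dt/2 M)^{-1} (I + dt/2 M) u^n.
CN = np.linalg.solve(np.eye(2 * N) - 0.5 * dt * M, np.eye(2 * N) + 0.5 * dt * M)

x = np.arange(N) * dx
u = np.concatenate([np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)])
e0 = 0.5 * dx * np.sum(u ** 2)
for _ in range(steps):
    u = CN @ u
drift = abs(0.5 * dx * np.sum(u ** 2) - e0)
print(drift)   # machine-roundoff level, since CN is orthogonal for skew-symmetric M
```

The Cayley transform of a skew-symmetric matrix is orthogonal, which is exactly why the discrete energy $\tfrac{\Delta x}{2}\|u\|^2$ cannot drift beyond accumulated roundoff.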
6. Quantitative Comparison with Classical Central-Difference Stencils
For a fixed grid spacing $\Delta x$ and stencil radius:
- Centered-Difference Stencil: the classical antisymmetric weights $w_{\pm 1} = \pm 1/(2\,\Delta x)$
- Learned Energy-Constrained (ADMM): weights closely matching the centered-difference values; observed value (vs. $32$ for CD)
- Electric Field Error: comparable for both stencils
- Discrete Energy Drift: machine-roundoff level for both
For wider stencils (larger $r$), learned stencils produce accuracy at least comparable to central differences and show modest improvement in some regimes, all while exactly preserving energy under CN.
7. Structural Significance and Implications
Energy-constrained learned stencils provide a framework that bridges traditional structure-preserving discretizations and modern data-driven schemes. By learning spatial discretizations subject to essential physical constraints, the methodology enables exact conservation of semi-discrete invariants while leveraging training data for enhanced accuracy. The Fourier analytic characterization links the learned stencil to propagation and stability traits, and efficient convex solvers enable practical deployment in simulation pipelines. A plausible implication is the extension of such physically-constrained learning approaches to broader PDE classes, where invariants such as energy, mass, or momentum are fundamental.
For implementation details and further mathematical development, see "Energy Conserving Data Driven Discretizations for Maxwell's Equations" (Obieke, 5 Jan 2026).