Full-Wave Solver Surrogate Overview
- Full-wave solver surrogates are reduced-order or learned models that accurately emulate high-cost full-wave PDE solvers used in hydrodynamics, electromagnetics, and elastodynamics.
- They combine classical reduced-basis methods with advanced machine learning architectures, such as CNNs and Fourier Neural Operators, to capture complex operator mappings.
- These surrogates enable large-scale optimization, inverse design, and uncertainty quantification by drastically reducing computational costs while maintaining high fidelity.
A full-wave solver surrogate is a computational framework that replaces the high-cost solution of wave-propagation PDEs (such as those governing hydrodynamics, electromagnetics, or elastodynamics) with a reduced-order or learned model trained to accurately emulate the physical solver’s outputs. Full-wave surrogates are critical for enabling large-scale optimization, inverse design, uncertainty quantification, and probabilistic risk assessment under computational constraints. Modern surrogates leverage both classical reduced-basis methods and advanced machine learning architectures, frequently matched to the underlying problem structure and application domain.
1. Mathematical Foundations and Problem Class
Full-wave surrogates target forward problems governed by systems such as the spectral action-density balance for ocean waves (ADCIRC+SWAN (Gharehtoragh et al., 14 Oct 2025)), the curl–curl Maxwell equations for electromagnetics (Augenstein et al., 2023, Mao et al., 3 Sep 2025), or elastodynamic wave equations for stress/strain in solids (Rashetnia et al., 2019, Meles et al., 6 May 2025). The canonical wave operator, parameterized by a vector encoding material properties, geometry, sources, or boundary conditions, is typically discretized via FDFD, FDTD, FEM, IGA, or FV methods, producing a large linear or nonlinear algebraic system. The full solver’s key computational bottleneck is the repeated assembly and inversion of this system, especially under parameter sweeps or optimization loops.
Surrogate construction either approximates the full operator (surrogate matrix methods (Drzisga et al., 2020)), compresses solution manifolds (reduced-basis, POD-Galerkin (Ribés et al., 2024)), or directly learns the operator mapping by data-driven or operator-learning techniques, such as convolutional neural networks (CNNs), Fourier Neural Operators (FNOs), or bespoke architectures (see Table 1).
| Approach | Governing Equation | Surrogate Type |
|---|---|---|
| ADCIRC+SWAN surrogate | Balance of action-density | Deep CNN |
| Maxwell EM surrogates | Curl–curl (frequency domain) | FNO/MLP/Broadband NN |
| Surrogate matrix (IGA) | Helmholtz/linear waves | B-spline interpolation |
| Electrical machine | Magneto-quasistatic | POD+SVR ensemble |
| Metasurface/circuit | Maxwell’s BC, MoM matrix | Matrix-valued fitting |
2. Surrogate Architectures and Model-Reduction Strategies
Neural-Operator and Deep Network Surrogates
Neural operator models, especially FNOs, are central to recent surrogates for both hydrodynamic and electromagnetic full-wave systems. For parameterized PDEs, the FNO framework approximates the input–output operator via hierarchical Fourier-space convolution layers and local feature transformations, supporting high-dimensional, mesh-independent mapping with low data requirements (Augenstein et al., 2023, Mao et al., 3 Sep 2025). Variants (WINO, FGCS layers) employ physically-motivated coordinate embeddings and hierarchical residual learning for broadband accuracy across parameter sweeps (e.g., wavelength (Seo et al., 2024)).
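The core of an FNO-style spectral layer can be sketched in a few lines: transform the field to Fourier space, apply a weight tensor to a truncated set of low-frequency modes, transform back, and add a pointwise linear path. The NumPy sketch below is illustrative only — in practice the weights are learned by gradient descent, and all names and shapes here are hypothetical:

```python
import numpy as np

def fourier_layer(v, weights, w_local, modes):
    """One FNO-style spectral layer on a 1-D field.

    v        : (n, c) real field, n grid points, c channels
    weights  : (modes, c, c) complex spectral weights (learned in practice)
    w_local  : (c, c) pointwise linear transform
    modes    : number of low-frequency Fourier modes retained
    """
    n, _ = v.shape
    v_hat = np.fft.rfft(v, axis=0)            # spectral coefficients
    out_hat = np.zeros_like(v_hat)
    # Multiply only the retained low-frequency modes by the weight tensor.
    for k in range(min(modes, v_hat.shape[0])):
        out_hat[k] = weights[k] @ v_hat[k]
    spectral = np.fft.irfft(out_hat, n=n, axis=0)
    local = v @ w_local.T                     # pointwise (local) path
    return np.maximum(spectral + local, 0.0)  # ReLU activation

# Toy usage: 64-point grid, 4 channels, 8 retained modes, random weights.
rng = np.random.default_rng(0)
n, c, modes = 64, 4, 8
v = rng.standard_normal((n, c))
W = rng.standard_normal((modes, c, c)) + 1j * rng.standard_normal((modes, c, c))
Wl = rng.standard_normal((c, c))
out = fourier_layer(v, W, Wl, modes)
```

Because the weights act in Fourier space on a fixed number of modes, the same layer applies to any grid resolution — the mesh independence noted above.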
For spatially localized features, as in landscape hydrodynamics, state-of-the-art surrogates adopt a CNN frontend with explicit skip connections and a deep dense head, mapping structured input tensors (comprising morphological, environmental, and dynamic features) to target fields such as peak significant wave height and surge (Gharehtoragh et al., 14 Oct 2025).
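As a rough illustration of that architecture, the sketch below pads each convolution to preserve the grid size, stacks ReLU convolution layers, adds the input back through a skip connection, and maps the flattened features to a small target vector (e.g., peak significant wave height and surge). It is a minimal single-channel stand-in, not the published model; all shapes and names are hypothetical:

```python
import numpy as np

def conv2d_same(x, k):
    """'Same'-padded 2-D convolution of a single-channel field."""
    pad = k.shape[0] // 2
    xp = np.pad(x, pad)
    kh, kw = k.shape
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def cnn_surrogate(x, kernels, w_dense):
    """CNN front end with a skip connection and a dense head."""
    feat = x
    for k in kernels:
        feat = np.maximum(conv2d_same(feat, k), 0.0)  # conv + ReLU
    feat = feat + x                                   # skip connection
    return w_dense @ feat.ravel()                     # dense head -> targets

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))               # structured input field
kernels = [rng.standard_normal((3, 3)) * 0.1 for _ in range(2)]
w_dense = rng.standard_normal((2, 256)) * 0.01  # two targets: peak Hs, surge
y = cnn_surrogate(x, kernels, w_dense)
```

The skip connection lets the network learn a correction to the input field rather than the full mapping, which helps when outputs correlate strongly with inputs.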
Reduced-Basis and Surrogate Matrix Methods
Classical reduction methods, such as Proper Orthogonal Decomposition (POD) and Galerkin projection, compress large spatial–temporal datasets into truncated modal subspaces capturing the dominant share of the solution energy (Ribés et al., 2024). Surrogates subsequently learn the mapping from physical/parameter space to modal coefficients via nonlinear regressors (e.g., support vector regression). This hybridization delivers real-time, high-fidelity predictions for parameter-dependent, nonlinear electromagnetic machines, with mathematically controlled error bounds.
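A minimal sketch of this pipeline: synthetic snapshots form a low-rank matrix, a truncated SVD extracts the POD basis, and a polynomial ridge fit stands in for the SVR ensemble mapping parameters to modal coefficients. All data and names here are synthetic and illustrative:

```python
import numpy as np

# Snapshot matrix: columns are full-order solutions at sampled parameters mu.
rng = np.random.default_rng(1)
n_dof, n_snap, r = 200, 40, 3
mu = np.linspace(0.0, 1.0, n_snap)
basis_true = rng.standard_normal((n_dof, r))
coeffs_true = np.stack([np.sin(2 * np.pi * mu), mu**2, np.cos(np.pi * mu)])
S = basis_true @ coeffs_true                    # (n_dof, n_snap) snapshots

# POD: truncated SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(S, full_matrices=False)
Phi = U[:, :r]                                  # reduced basis (r modes)
A = Phi.T @ S                                   # modal coefficients per snapshot

# Regress modal coefficients on the parameter (ridge stand-in for SVR).
P = np.vander(mu, 8)                            # polynomial features
W = np.linalg.solve(P.T @ P + 1e-8 * np.eye(8), P.T @ A.T)

def rom_predict(mu_new):
    a = np.vander(np.atleast_1d(mu_new), 8) @ W  # predicted coefficients
    return Phi @ a.ravel()                       # reconstructed full field

# Reconstruction error at a held snapshot location.
i = n_snap // 2
err = np.linalg.norm(rom_predict(mu[i]) - S[:, i]) / np.linalg.norm(S[:, i])
```

Online evaluation touches only the r-dimensional coefficient regression plus one basis multiplication, which is the source of the real-time speedup.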
Surrogate matrix methodologies accelerate operator assembly by interpolating patch/stencil functions over bases derived from the near-translational invariance of Galerkin discretizations (e.g., B-splines in IGA (Drzisga et al., 2020)). The result is matrix assembly with formal a priori error control, independent of wave-number, and empirical speedups of 5–30× in large-scale time-harmonic and time-dependent wave propagation.
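The idea behind surrogate matrices can be shown in one dimension: if operator entries vary smoothly across the mesh, they can be sampled at a few points and interpolated, and the interpolation error carries over to the assembled matrix at a controlled level. The toy below uses a hypothetical smooth coefficient and piecewise-linear interpolation (rather than the B-splines of the cited IGA work):

```python
import numpy as np

n = 200
x = np.linspace(0.0, 1.0, n)

def coeff(t):
    """Smooth material coefficient modulating the stencil entries."""
    return 1.0 + 0.3 * np.sin(2 * np.pi * t)

def assemble_tridiag(entries, h):
    """Assemble a variable-coefficient FD operator from stencil entries."""
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1] = -entries[i] / h**2
        A[i, i + 1] = -entries[i] / h**2
        A[i, i] = 2 * entries[i] / h**2
    return A

h = x[1] - x[0]
exact_entries = coeff(x)                    # full assembly: evaluate everywhere
xc = np.linspace(0.0, 1.0, 12)              # surrogate: 12 coarse samples
surr_entries = np.interp(x, xc, coeff(xc))  # interpolate the stencil function

A_exact = assemble_tridiag(exact_entries, h)
A_surr = assemble_tridiag(surr_entries, h)
rel_err = np.linalg.norm(A_surr - A_exact) / np.linalg.norm(A_exact)
```

The coefficient is evaluated at 12 points instead of 200, yet the assembled matrix agrees to interpolation accuracy — the same trade that yields the reported assembly speedups at scale.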
3. Training Data Generation and Optimization Integration
The fidelity of a full-wave surrogate depends critically on the sampling of the parametric, geometric, or scenario space. High-utility training strategies include:
- Synthetic event suites: For hydrodynamic surrogates, ensembles of synthetic tropical cyclone (TC) events and evolving landscapes are simulated with the full solver; LOLO cross-validation ensures robust generalization to new morphologies (Gharehtoragh et al., 14 Oct 2025).
- Active Learning: Query-by-committee and variance-driven acquisition focus sampling on poorly approximated regions, reducing the required data by an order of magnitude relative to random sampling (Pestourie et al., 2020, Azad et al., 2024).
- Sequential posterior refinement: In Bayesian inversion, surrogates are iteratively retrained on posterior samples in high-impact regions, with expanded frequency bands and principal component spaces at each stage, minimizing bias and surrogate error under fixed computational budgets (Meles et al., 6 May 2025).
- Multiscale decomposition and domain partitioning: Surrogates are constructed on subdomains (with flexible Robin boundary condition inputs) and assembled via multilevel iterative solvers, enabling scalability and mesh adaptivity (Mao et al., 3 Sep 2025).
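A toy query-by-committee loop illustrates the second strategy: a bootstrap ensemble of cheap regressors stands in for the surrogate committee, an analytic function stands in for the full solver, and each iteration queries the candidate where the committee disagrees most. Everything here is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def full_solver(p):
    """Stand-in for the expensive full-wave solver (1-D parameter)."""
    return np.sin(3.0 * p) + 0.5 * p

def committee_fit(X, y, n_members=5, deg=4):
    """Bootstrap committee of polynomial regressors."""
    fits = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), len(X))
        fits.append(np.polyfit(X[idx], y[idx], deg))
    return fits

# Start from a few random labels, then query where the committee disagrees.
X = rng.uniform(0.0, 2.0, 8)
y = full_solver(X)
candidates = np.linspace(0.0, 2.0, 200)
for _ in range(10):
    fits = committee_fit(X, y)
    preds = np.stack([np.polyval(c, candidates) for c in fits])
    query = candidates[np.argmax(preds.std(axis=0))]  # max disagreement
    X = np.append(X, query)
    y = np.append(y, full_solver(query))              # one solver call per query

final = np.polyfit(X, y, 6)
err = np.max(np.abs(np.polyval(final, candidates) - full_solver(candidates)))
```

Each iteration spends exactly one solver call, placed where the current ensemble is least certain, which is how the cited works cut data requirements relative to random sampling.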
Surrogates are routinely embedded into inverse design, optimization, or uncertainty quantification loops. Adjoint gradients, scenario generation, and hybrid global–local search schemes leverage efficient surrogate evaluation to enable computations previously infeasible with conventional solvers.
4. Accuracy, Error Metrics, and Statistical Validation
Quantitative metrics for surrogate assessment include:
- RMSE / NRMSE: Pointwise and normalized errors against ground-truth outputs, typically yielding small NRMSE values for well-posed scenarios (hydrodynamics (Gharehtoragh et al., 14 Oct 2025), electromagnetics (Augenstein et al., 2023), electrical machines (Ribés et al., 2024)).
- Statistical equivalence: Kolmogorov–Smirnov tests on predicted hazard curves (hydrodynamics), with grid cell rejection rates as low as 5–8% for optimally coupled surrogates (Gharehtoragh et al., 14 Oct 2025).
- Inference/calculation speedup: Substantial acceleration factors are routinely reported, with large-scale wave, field, or optimization problems evaluated in seconds to minutes (Xu et al., 14 Dec 2025, Seo et al., 2024, Budhu et al., 2024).
- Posterior-coverage and uncertainty: Ensemble and Bayesian surrogate approaches deliver uncertainty quantification and robust error estimates, crucial for active learning and MCMC-based inversion (Pestourie et al., 2020, Meles et al., 6 May 2025).
- Physical fidelity in coupled and extrapolative regimes: Surrogates capturing mutual coupling (via local-neighbor encoding) and trained across extrapolative scenarios (e.g., extreme SLR landscapes) maintain accuracy across all physically relevant domains (Xu et al., 14 Dec 2025, Gharehtoragh et al., 14 Oct 2025).
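The first two metrics are simple to compute from samples. Below is a self-contained sketch of a range-normalized NRMSE and the two-sample Kolmogorov–Smirnov statistic (maximum ECDF gap), applied to synthetic hazard-style samples — the data are illustrative, not drawn from any cited study:

```python
import numpy as np

def nrmse(pred, truth):
    """Root-mean-square error normalized by the range of the truth."""
    rmse = np.sqrt(np.mean((pred - truth) ** 2))
    return rmse / (truth.max() - truth.min())

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between ECDFs."""
    grid = np.sort(np.concatenate([a, b]))
    ecdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    ecdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(ecdf_a - ecdf_b))

rng = np.random.default_rng(3)
truth = rng.gumbel(2.0, 0.5, 2000)            # synthetic extreme-value samples
pred = truth + rng.normal(0.0, 0.05, 2000)    # surrogate with small error
nr = nrmse(pred, truth)
ks = ks_statistic(pred, truth)
```

A small KS statistic indicates the surrogate reproduces the full solver's output distribution (not just pointwise values), which is the relevant notion of equivalence for hazard curves.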
5. Specializations: Coupled, Broadband, and Hybrid Surrogates
Advanced surrogates address domain-specific requirements:
- Full-physics coupling: Joint prediction of wave and surge in hydrodynamic models (Gharehtoragh et al., 14 Oct 2025), mutual-coupling in large-scale meta-optics via local-patch MLPs (Xu et al., 14 Dec 2025), impedance boundary surrogates for metasurface circuits (Budhu et al., 2024).
- Broadband generalization: Wave-informed embeddings and element-wise spectral consistency (WINO/WIME) enable surrogate accuracy at unseen wavelengths, outperforming previous Fourier- and U-Net-based models (Seo et al., 2024).
- Sequential and surrogate matrix fusion: Combination of PCA-based prior reduction, polynomial chaos, and GP surrogates, iteratively refined, yields unbiased Bayesian posterior estimates in high-dimensional FWI at two orders of magnitude speedup over FDTD MCMC (Meles et al., 6 May 2025).
- Many-body expansions (MBE): Surrogates for WEC farm hydrodynamics use one- and two-body ANNs reassembled via MBE, validated via query-by-committee and hybrid global-local optimization, recovering near-identical farm layouts to MS approaches with ≳90× cost reduction (Azad et al., 2024).
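The MBE assembly itself is simple: the farm output is the sum of isolated one-body terms plus pairwise two-body corrections, each supplied in practice by its own trained surrogate. A toy version with analytic stand-ins for the ANNs — all functions and constants here are hypothetical:

```python
import numpy as np
from itertools import combinations

def one_body(pos):
    """Stand-in for the single-device surrogate (e.g., a one-body ANN)."""
    return 1.0  # isolated device output, normalized

def two_body(pi, pj):
    """Pairwise interaction correction, decaying with separation."""
    d = np.linalg.norm(np.asarray(pi) - np.asarray(pj))
    return -0.2 * np.exp(-d / 3.0)  # destructive interference at short range

def mbe_farm_output(positions):
    """Many-body expansion truncated at two-body terms."""
    total = sum(one_body(p) for p in positions)
    total += sum(two_body(pi, pj) for pi, pj in combinations(positions, 2))
    return total

layout = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
farm_output = mbe_farm_output(layout)
```

Cost scales with the number of pairs rather than with a full multi-body hydrodynamic solve, which is where the reported ≳90× reduction comes from; the truncation is valid only while higher-order wave-body interactions stay weak.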
6. Limitations, Best Practices, and Future Directions
Across methodologies, several recommendations and caveats emerge:
- Validity Domain: Surrogates must be retrained or extended if input parameters, geometry, or dynamics fall outside the training manifold (e.g., new coastlines, higher-order wave-body interactions, broadband scenarios) (Gharehtoragh et al., 14 Oct 2025, Budhu et al., 2024).
- Adaptive Sampling and Physics-Informed Learning: Active learning and incorporation of domain knowledge (physics-informed nets, multiscale decomposition (Mao et al., 3 Sep 2025)) address data-efficiency and generalization beyond noisy interpolation regimes (Pestourie et al., 2020).
- Scalability and Integration: Hybrid schemes (e.g., neural-preconditioned domain decomposition (Mao et al., 3 Sep 2025), patchwise design (Xu et al., 14 Dec 2025)) address computational scaling, nonlocal effects, and global coupling.
- Uncertainty Quantification: Ensemble and Bayesian models supply error certificates and guide adaptive refinement (Meles et al., 6 May 2025, Pestourie et al., 2020).
- Recommended practices: Use physically coupled features when available, perform robust cross-validation on out-of-sample scenarios, augment training for extreme or extrapolative cases, and exploit surrogates for scenario generation in probabilistic frameworks (Gharehtoragh et al., 14 Oct 2025, Xu et al., 14 Dec 2025).
Future directions include multi-physics surrogates, more expressive operator-learning frameworks, direct end-to-end integration with optimization and probabilistic inference pipelines, and theoretical advances in error bounds and generalization for high-dimensional parameter regimes (Mao et al., 3 Sep 2025, Seo et al., 2024).
References:
- Gharehtoragh et al., 14 Oct 2025
- Augenstein et al., 2023
- Mao et al., 3 Sep 2025
- Seo et al., 2024
- Xu et al., 14 Dec 2025
- Ribés et al., 2024
- Drzisga et al., 2020
- Pestourie et al., 2020
- Meles et al., 6 May 2025
- Rashetnia et al., 2019
- Budhu et al., 2024
- Azad et al., 2024