
Temporal Neural Operator for Modeling Time-Dependent Physical Phenomena

Published 28 Apr 2025 in cs.LG | (2504.20249v1)

Abstract: Neural Operators (NOs) are machine learning models designed to solve partial differential equations (PDEs) by learning to map between function spaces. Neural Operators such as the Deep Operator Network (DeepONet) and the Fourier Neural Operator (FNO) have demonstrated excellent generalization properties when mapping between spatial function spaces. However, they struggle in mapping the temporal dynamics of time-dependent PDEs, especially for time steps not explicitly seen during training. This limits their temporal accuracy as they do not leverage these dynamics in the training process. In addition, most NOs tend to be prohibitively costly to train, especially for higher-dimensional PDEs. In this paper, we propose the Temporal Neural Operator (TNO), an efficient neural operator specifically designed for spatio-temporal operator learning for time-dependent PDEs. TNO achieves this by introducing a temporal-branch to the DeepONet framework, leveraging the best architectural design choices from several other NOs, and a combination of training strategies including Markov assumption, teacher forcing, temporal bundling, and the flexibility to condition the output on the current state or past states. Through extensive benchmarking and an ablation study on a diverse set of example problems we demonstrate the TNO long range temporal extrapolation capabilities, robustness to error accumulation, resolution invariance, and flexibility to handle multiple input functions.

Summary

  • The paper introduces TNO, a neural operator that overcomes limitations in temporal extrapolation and error accumulation in time-dependent PDEs.
  • It employs a dual-branch structure combining DeepONet advances, U-Net encoding with adaptive pooling, and temporal bundling for efficient spatiotemporal modeling.
  • Benchmarked on weather, climate, and CO2 sequestration, TNO achieves robust resolution invariance, long-term accuracy, and generalization across multiphysics scenarios.

Temporal Neural Operator for Time-Dependent Physical Phenomena

Introduction

The paper "Temporal Neural Operator for Modeling Time-Dependent Physical Phenomena" (2504.20249) presents the Temporal Neural Operator (TNO), a neural operator architecture tailored to efficiently model spatio-temporal dynamics governed by time-dependent PDEs. The TNO synthesizes architectural and training advances from DeepONet, Fourier Neural Operator, and spatiotemporal processing frameworks. Critically, TNO addresses significant limitations in existing operator learning methods, specifically their inability to robustly handle temporal extrapolation, error accumulation, and resolution invariance when deployed on real-world, high-dimensional, and noisy scientific datasets.

Methodology: The Temporal Neural Operator Framework

The TNO augments the classical DeepONet operator learning pipeline with a dedicated temporal branch (t-branch), architectural features for efficient high-dimensional input handling, and novel training regimes that mitigate temporal error accumulation.

The TNO operates as follows:

  • Input Encoding: The branch network processes auxiliary or parameter fields (e.g., atmospheric pressure level or reservoir static parameters), while the t-branch encodes a trajectory of system states (history length L) using a shared U-Net encoder. Both branches employ adaptive pooling and a U-Net backbone to ensure input resolution invariance.
  • Trunk Network: The trunk maps spatiotemporal query coordinates into the shared latent space.
  • Feature Synthesis: The resulting latent representations are combined via Hadamard product, and the output is decoded via an MLP to predict multi-step future trajectories (bundle length K). This nonlocal spatiotemporal fusion underpins TNO's expressive capacity for PDEs with complex temporal evolution.

    Figure 1: TNO architecture with temporal bundling, illustrative case L=1, K=3; the t-branch enables robust modeling of temporal dynamics.
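The branch–trunk dataflow above can be sketched in a few lines of numpy. This is a minimal illustration of the fusion pattern only; the shapes, the encoder stand-in, and the random weights are hypothetical, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes (hypothetical): a 32x32 grid, history length L=1,
# bundle length K=3, latent width p=64.
H = W = 32
L, K, p = 1, 3, 64

def encode(field, out_dim, rng):
    """Stand-in for the shared U-Net encoder with adaptive pooling:
    any input resolution is reduced to a fixed-size latent vector."""
    pooled = field.reshape(-1)[:16]  # crude fixed-size summary
    Wm = rng.standard_normal((out_dim, pooled.size))
    return np.tanh(Wm @ pooled)

# Branch: auxiliary/parameter field; t-branch: trajectory of L past states.
b = encode(rng.standard_normal((H, W)), p, rng)      # branch latent
t = encode(rng.standard_normal((L, H, W)), p, rng)   # t-branch latent

# Trunk: latent features of one spatiotemporal query coordinate (x, y, t).
query = np.array([0.5, 0.5, 0.1])
Wq = rng.standard_normal((p, 3))
trunk = np.tanh(Wq @ query)

# Feature synthesis: Hadamard (elementwise) product of the latents,
# decoded into K future values at the query point.
fused = b * t * trunk
Wd = rng.standard_normal((K, p))
pred = Wd @ fused        # K-step bundled prediction at this coordinate
print(pred.shape)        # (3,)
```

In practice the decoder is an MLP rather than a single linear map, but the Hadamard fusion of branch, t-branch, and trunk latents is the essential structural idea.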

Key training strategies include:

  • Temporal Bundling (K>1): Enables prediction of multiple future steps per forward pass, improving efficiency and stabilizing rollouts.
  • Autoregressive Conditioning and Teacher Forcing: Supports both Markov and memory-based system evolution, with training optionally stabilized by supplying ground truth at intermediate rollout steps.
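These strategies can be illustrated with a toy rollout loop. This is a hedged sketch: `model` is a stand-in callable, not TNO, and the Markov case L=1 with bundle length K=4 is assumed:

```python
import numpy as np

# History length L and bundle length K (Markov assumption: L=1).
L, K = 1, 4

def model(history):
    """Toy surrogate operator: damped copy of the last state, K times."""
    return np.stack([0.9 * history[-1]] * K)

def rollout(model, init_states, n_steps):
    """Autoregressive rollout: feed predictions back in, K steps per call."""
    states = list(init_states)
    while len(states) - len(init_states) < n_steps:
        bundle = model(np.stack(states[-L:]))
        states.extend(bundle)
    return np.stack(states[len(init_states):len(init_states) + n_steps])

def teacher_forced_error(model, trajectory):
    """Training-style pass: condition each bundle on ground truth rather
    than on the model's own (possibly drifting) predictions."""
    errs = []
    for i in range(L, len(trajectory) - K + 1, K):
        pred = model(trajectory[i - L:i])
        errs.append(np.mean((pred - trajectory[i:i + K]) ** 2))
    return float(np.mean(errs))

traj = rollout(model, [np.ones((4, 4))], 8)
print(traj.shape)  # (8, 4, 4)
```

Bundling halves the number of autoregressive calls here (two calls for eight steps), which is the efficiency and stability argument behind K>1.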

Benchmarking: Weather and Climate Forecasting Applications

European Regional Air Temperature Forecast

TNO is evaluated on the E-OBS observational climate dataset, addressing noisy, incomplete, and high-resolution (0.1°, 0.25°) weather data with strong spatial heterogeneity. Noteworthy results:

  • Multi-step forecasting with limited history (L=1): TNO achieves an MAE of 2.68 °C on blind test data (0.25° grid) and 2.83 °C on a higher-resolution 0.1° grid without retraining, demonstrating robust resolution invariance.
  • Error Accumulation: Examination of multi-step rollouts shows negligible error drift, confirming the stabilization protocol's efficacy in long-term predictions.
  • Ablation Analysis: Removal of the t-branch or U-Net severely degrades performance and resolution invariance, quantifying the contributions of each architectural component.

Figure 2: TNO-predicted air temperature field for 24/06/2023 (0.25° grid) demonstrates high-fidelity spatial structure and low error versus ground truth.

Figure 3: TNO-predicted air temperature field (12/11/2023, 0.25° grid); similar accuracy is observed for higher-resolution predictions.

Figure 4: MAE and RMSE benchmarks on coarse (top) and fine (bottom) grids for TNO and ablated models; "Multistep" indicates use of temporal bundling (K=4).

Global Climate Modeling

The TNO is applied to NCEP/NCAR Reanalysis data for global air temperature spanning 16 atmospheric pressure levels. The framework collapses the 3D spatiotemporal field into 2D slices conditioned on the pressure level, exploiting efficient 2D convolutional architectures:

  • Temporal Extrapolation: TNO accurately forecasts air temperatures up to five years beyond the training window, achieving a mean relative L2 error of 0.016 across all pressure levels.
  • Vertical Interpolation & Extrapolation: The model generalizes to atmospheric levels entirely excluded during training.
  • Single Unified Model: Unlike Ditto and other foundation models that require separate per-level models or massive multi-level parameterization, TNO covers all levels with a single lightweight network.

Figure 5: Example global air temperature field (26/03/2021, 1000 mb) predicted by TNO, showcasing strong qualitative agreement.
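The mean relative L2 error metric reported above is standard; a generic numpy sketch (field shapes and level count are illustrative, not the actual reanalysis grid):

```python
import numpy as np

def relative_l2(pred, true):
    """Relative L2 error of a predicted field against ground truth."""
    return np.linalg.norm(pred - true) / np.linalg.norm(true)

# Toy fields on 16 hypothetical pressure levels.
rng = np.random.default_rng(1)
true = rng.standard_normal((16, 73, 144)) + 280.0   # e.g. temperature in K
pred = true + 0.01 * rng.standard_normal(true.shape)

# Mean relative L2 error across levels.
mean_err = float(np.mean([relative_l2(pred[k], true[k]) for k in range(16)]))
print(0.0 < mean_err < 1.0)
```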

Operator Generalization: Geologic Carbon Sequestration

TNO is assessed on the high-dimensional, multiphysics task of CO2 plume migration and pressure evolution for geological storage, which requires learning coupled elliptic-hyperbolic PDE dynamics and generalizing robustly to unseen geology and well configurations.

  • Data Efficiency & Extrapolation: With training restricted to snapshots up to 1.8 years, the TNO delivers accurate long-term forecasts (out to 30 years) and robust responses to entirely unseen test cases, tasks not solvable by prior neural operators such as U-FNO.
  • Multiphysics Coupling: The same operator generalizes over distinct dynamics: saturation plume evolution (localized, sharp gradients) and pressure buildup (smoother, global response).
  • Failure Modes Quantified: Some error growth is observed at the extreme extrapolation horizon, with quantitative error curves mapping the limits of stability.

    Figure 6: Example pressure buildup predictions under simultaneous extrapolation and generalization; TNO predictions closely match ground truth, and absolute error remains well controlled.

Discussion: Theoretical and Practical Implications

The TNO framework advances operator learning by offering:

  • Superior Temporal Extrapolation: Through its explicit modeling of spatiotemporal operator structure and temporal bundling, TNO achieves accurate, stable long rollouts not previously demonstrated by neural operator architectures.
  • Resolution Invariance: TNO decouples training and inference resolutions, enabled by adaptive pooling in the U-Net encoders and the coordinate-based trunk network. This is indispensable for scientific deployment scenarios with evolving sensor networks or regridding requirements.
  • Unified, Low-Memory Models: A single TNO model handles multiresolution, multi-physics, and multivariable cases—often with fewer parameters and lower memory use than dedicated specialized models.
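The adaptive-pooling mechanism behind this resolution invariance can be sketched as follows. This is an assumption about the role adaptive pooling plays, not TNO's exact encoder:

```python
import numpy as np

def adaptive_avg_pool2d(x, out_h, out_w):
    """Minimal adaptive average pooling: map an input of any resolution
    to a fixed (out_h, out_w) grid by averaging over variable-size bins."""
    in_h, in_w = x.shape
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        r0, r1 = (i * in_h) // out_h, -(-(i + 1) * in_h // out_h)
        for j in range(out_w):
            c0, c1 = (j * in_w) // out_w, -(-(j + 1) * in_w // out_w)
            out[i, j] = x[r0:r1, c0:c1].mean()
    return out

# The encoder sees the same fixed-size output regardless of grid resolution:
coarse = np.ones((72, 144))    # e.g. a coarse-grid field
fine = np.ones((180, 360))     # e.g. a finer-grid field
print(adaptive_avg_pool2d(coarse, 8, 8).shape)  # (8, 8)
print(adaptive_avg_pool2d(fine, 8, 8).shape)    # (8, 8)
```

Because downstream layers only ever see the fixed pooled size, a model trained at one resolution can be evaluated at another without retraining, which is the behavior demonstrated on the 0.25° and 0.1° grids.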

These results position TNO as a state-of-the-art solution for time-dependent PDE operator learning across spatiotemporal domains with practical utility in real-time forecasting and scientific simulation replacement.

Conclusion

The Temporal Neural Operator robustly addresses core challenges in operator learning for time-dependent PDEs: temporal extrapolation, error stability, generalization to novel parameterizations, and resolution invariance. Architecturally, the introduction of a temporal branch, U-Net encoders with adaptive pooling, and Hadamard product synthesis yields significant gains over DeepONet, FNO, and their variants. Extensive benchmarking on weather forecasting, climate modeling, and multiphysics geoscience demonstrates TNO's flexibility, accuracy, and practical deployment capabilities. Future directions include scaling TNO architectures for even higher-dimensional coupled systems, integration with active data assimilation workflows, and further exploration of theoretical generalization bounds for operator learning in non-Markovian and multi-scale physical settings.
