Accelerating PDE-Constrained Optimization by the Derivative of Neural Operators

Published 16 Jun 2025 in cs.LG (arXiv:2506.13120v1)

Abstract: PDE-Constrained Optimization (PDECO) problems can be solved significantly faster than with traditional numerical solvers by employing gradient-based methods with surrogate models such as neural operators. However, this approach faces two key challenges: (1) Data inefficiency: a lack of efficient data sampling and effective training methods for neural operators, particularly for optimization purposes. (2) Instability: a high risk of the optimization derailing due to inaccurate neural operator predictions and gradients. To address these challenges, we propose a novel framework: (1) Optimization-oriented training: we leverage data from the full steps of traditional optimization algorithms and employ a specialized training method for neural operators. (2) Enhanced derivative learning: we introduce a Virtual-Fourier layer to improve derivative learning within the neural operator, a crucial aspect of gradient-based optimization. (3) Hybrid optimization: we integrate neural operators with numerical solvers, providing robust regularization for the optimization process. Extensive experimental results demonstrate that our model accurately learns operators and their derivatives, and that our hybrid optimization approach converges robustly.
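To make the overall setup concrete, the following is a minimal toy sketch of the idea the abstract describes: gradient-based optimization of a PDE control through a cheap surrogate of the solution operator, periodically re-anchored by the exact numerical solver. Everything here is an illustrative assumption, not the paper's method: the PDE is a 1-D discrete Poisson problem, and the "surrogate" is a rank-5 truncated SVD of the exact solution operator standing in for a trained neural operator (no Virtual-Fourier layer or optimization-oriented training is reproduced).

```python
import numpy as np

# Toy PDECO sketch (hypothetical setup, not the paper's implementation).
# Discrete 1-D Poisson operator: -u'' = c on (0, 1), zero Dirichlet BCs,
# so the forward map is u = A^{-1} c for the discrete Laplacian A.
n = 50
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

Ainv = np.linalg.inv(A)          # exact solution operator ("numerical solver")

# Stand-in surrogate: rank-5 truncated SVD of the solution operator,
# playing the role a trained neural operator would play in the framework.
U, s, Vt = np.linalg.svd(Ainv)
k = 5
S = (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
u_target = Ainv @ rng.standard_normal(n)     # desired state

def J(c):
    """Exact objective: squared misfit of the true PDE solution."""
    r = Ainv @ c - u_target
    return float(r @ r)

# Gradient descent on the control c with derivatives taken through the
# surrogate; every 20th step re-anchors the residual with the exact
# solver, mimicking the hybrid surrogate/solver regularization.
c = np.zeros(n)
lr = 0.1 / s[0] ** 2                         # step size from surrogate scale
for it in range(200):
    if it % 20 == 0:
        r = Ainv @ c - u_target              # hybrid: exact residual
    else:
        r = S @ c - u_target                 # cheap surrogate residual
    c -= lr * 2.0 * (S.T @ r)                # surrogate-gradient step

J0, Jf = J(np.zeros(n)), J(c)
print(f"objective: {J0:.4f} -> {Jf:.4f}")
```

In this sketch the surrogate's derivative (here simply `S.T`) drives the descent, while the occasional exact solve guards against the surrogate's error accumulating; the paper's actual training scheme and architecture are what make the analogous neural-operator gradients reliable at scale.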
