
Deep adaptive basis Galerkin method for high-dimensional evolution equations with oscillatory solutions

Published 29 Dec 2021 in math.NA and cs.NA (arXiv:2112.14418v2)

Abstract: In this paper, we study deep neural networks (DNNs) for solving high-dimensional evolution equations with oscillatory solutions. Unlike deep least-squares methods that treat the time and space variables simultaneously, we propose a deep adaptive basis Galerkin (DABG) method, which employs a spectral-Galerkin method for the time variable of oscillatory solutions and deep neural networks for the high-dimensional space variables. The proposed method leads to a linear system of differential equations whose unknowns are DNNs trained via a loss function. We establish a posteriori estimates of the solution error, which is bounded by the minimal loss and a term $O(N^{-m})$, where $N$ is the number of basis functions and $m$ characterizes the regularity of the equation. We also show that if the true solution is a Barron-type function, the error bound converges to zero as $M=O(N^p)$ approaches infinity, where $M$ is the width of the networks used and $p$ is a positive constant. Numerical examples, including high-dimensional linear evolution equations and the nonlinear Allen-Cahn equation, demonstrate that the proposed DABG method outperforms existing DNN-based methods.
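The core ansatz described in the abstract, a spectral expansion in time whose coefficients are neural networks in space, can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the basis choice (Legendre polynomials), the one-hidden-layer networks, and all names and sizes are assumptions for illustration.

```python
import numpy as np

# DABG-style ansatz (illustrative): approximate
#   u(x, t) ≈ sum_n phi_n(t) * c_n(x),
# where phi_n are Legendre polynomials in time and each spatial
# coefficient c_n is a small neural network. In the paper, the c_n
# would be trained by minimizing a Galerkin loss; here we only
# evaluate the ansatz with random weights.

rng = np.random.default_rng(0)

def legendre_basis(t, N):
    """Evaluate Legendre polynomials P_0..P_{N-1} at points t in [-1, 1]
    via the three-term recurrence."""
    P = np.zeros((len(t), N))
    P[:, 0] = 1.0
    if N > 1:
        P[:, 1] = t
    for n in range(1, N - 1):
        P[:, n + 1] = ((2 * n + 1) * t * P[:, n] - n * P[:, n - 1]) / (n + 1)
    return P

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer tanh network mapping R^d -> R (batched)."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Tiny example: d = 2 space dimensions, N = 4 temporal basis functions,
# width M = 8 for each coefficient network (all sizes are illustrative).
d, N, M = 2, 4, 8
params = [(rng.normal(size=(d, M)), rng.normal(size=M),
           rng.normal(size=(M, 1)), rng.normal(size=1)) for _ in range(N)]

def u(x, t):
    """Evaluate the DABG-style approximation at space points x (B, d)
    and time points t (T,); returns a (B, T) array."""
    phi = legendre_basis(t, N)                  # (T, N) temporal basis
    c = np.hstack([mlp(x, *p) for p in params])  # (B, N) spatial coefficients
    return c @ phi.T                             # (B, T)

x = rng.uniform(-1, 1, size=(5, d))  # 5 space samples in [-1, 1]^d
t = np.linspace(-1, 1, 7)            # 7 time samples
print(u(x, t).shape)                 # (5, 7)
```

Because the time dependence is carried by a fixed spectral basis, the oscillatory behavior in time is resolved by classical high-order approximation, while the networks only have to represent the high-dimensional spatial coefficients.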

Citations (4)


Authors (2)
