Removing numerical dispersion from linear evolution equations
Published 22 Jun 2019 in math.NA, cs.NA, physics.comp-ph, and physics.geo-ph | (1906.10743v3)
Abstract: We describe a method for removing the numerical errors in the modeling of linear evolution equations that are caused by approximating the time derivative by a finite difference operator. The method is based on integral transforms realized as certain Fourier integral operators, called time dispersion transforms, and we prove that, under an assumption about the frequency content, it yields a solution with correct evolution throughout the entire lifespan. We demonstrate the method on a model equation as well as on the simulation of elastic and viscoelastic wave propagation.
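To make the idea concrete, here is a minimal sketch (all names and the specific scheme are my own assumptions, not taken from the paper): a leapfrog discretisation of the oscillator equation u'' = -w0^2*u oscillates at the shifted numerical frequency W = (2/dt)*arcsin(w0*dt/2) rather than at w0, which is exactly the kind of time-discretisation error the abstract describes. A frequency-warping transform in the spirit of a time dispersion transform maps each Fourier component of the computed trace from its grid frequency w back to (2/dt)*sin(w*dt/2), undoing the dispersion as a post-processing step.

```python
import numpy as np

def leapfrog(w0, dt, n):
    """Second-order central-difference integration of u'' = -w0^2 u,
    u(0) = 1, u'(0) = 0.  The result oscillates at the numerical
    frequency W = (2/dt)*arcsin(w0*dt/2), not at w0."""
    u = np.empty(n)
    u[0] = 1.0
    u[1] = 1.0 - 0.5 * (w0 * dt) ** 2  # consistent Taylor start
    for k in range(1, n - 1):
        u[k + 1] = 2.0 * u[k] - u[k - 1] - (w0 * dt) ** 2 * u[k]
    return u

def remove_time_dispersion(u, dt):
    """Illustrative dispersion removal: resynthesise the trace with each
    DFT grid frequency w warped to (2/dt)*sin(w*dt/2)."""
    n = len(u)
    U = np.fft.rfft(u)
    w = 2.0 * np.pi * np.fft.rfftfreq(n, dt)
    w_warped = (2.0 / dt) * np.sin(w * dt / 2.0)
    t = dt * np.arange(n)
    # One-sided (rfft) weights: DC once, interior bins twice, Nyquist once.
    scale = np.full(len(w), 2.0)
    scale[0] = 1.0
    if n % 2 == 0:
        scale[-1] = 1.0
    # Non-uniform inverse transform at the warped frequencies.
    basis = np.exp(1j * np.outer(t, w_warped))
    return (basis @ (scale * U)).real / n

w0, dt, n = 2.0 * np.pi, 0.04, 1000
t = dt * np.arange(n)
raw = leapfrog(w0, dt, n)
corrected = remove_time_dispersion(raw, dt)
exact = np.cos(w0 * t)
raw_err = np.max(np.abs(raw - exact))            # phase drift accumulates
corr_err = np.max(np.abs(corrected - exact))     # drift largely removed
```

Note that this sketch corrects only the temporal discretisation error, consistent with the paper's scope; spatial discretisation errors in a wave simulation would require a separate treatment. The dense `basis` matrix is quadratic in the trace length and is used here only for clarity.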