Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition
Published 23 Jan 2020 in math.OC, cs.NA, math.DS, and math.NA (arXiv:2001.08423v3)
Abstract: We propose a deep neural network architecture for representing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed to approximate a Lyapunov function to fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
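The polynomial scaling claimed in the abstract stems from the compositional structure that a small-gain condition induces: the Lyapunov function decomposes into a sum of terms, each depending only on a low-dimensional group of state variables, so each term can be approximated by a small fixed-size subnetwork. Below is a minimal PyTorch sketch of such a separable architecture; the class name, the coordinate grouping, the layer widths, and the squaring used for nonnegativity are illustrative assumptions, not the paper's exact construction.

```python
import torch
import torch.nn as nn

class CompositionalLyapunovNet(nn.Module):
    """Sketch of a separable Lyapunov candidate V(x) = sum_i V_i(x_{I_i}).

    Each subnetwork V_i sees only a low-dimensional slice of the state,
    so the total neuron count scales with (number of subsystems) times
    (cost of a fixed-dimension approximator) rather than exponentially
    in the full state dimension.
    """

    def __init__(self, index_groups, hidden=32):
        super().__init__()
        self.index_groups = index_groups  # e.g. [[0, 1], [2, 3], ...]
        self.subnets = nn.ModuleList([
            nn.Sequential(
                nn.Linear(len(idx), hidden),
                nn.Softplus(),
                nn.Linear(hidden, hidden),
                nn.Softplus(),
                nn.Linear(hidden, 1),
            )
            for idx in index_groups
        ])

    def forward(self, x):
        # Sum the low-dimensional subnetwork outputs; squaring keeps each
        # contribution nonnegative (one common way to encourage positive
        # definiteness in learned Lyapunov candidates).
        contributions = [
            net(x[:, idx]) ** 2
            for net, idx in zip(self.subnets, self.index_groups)
        ]
        return torch.stack(contributions, dim=0).sum(dim=0)

# Usage: a 100-dimensional state split into fifty 2-dimensional subsystems.
groups = [[2 * i, 2 * i + 1] for i in range(50)]
V = CompositionalLyapunovNet(groups)
x = torch.randn(8, 100)
print(V(x).shape)  # torch.Size([8, 1])
```

Because every subnetwork has a fixed input dimension, adding state variables adds subnetworks of constant size instead of enlarging one monolithic approximator, which is the mechanism behind the paper's polynomial bound.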