Fundamental performance bounds on time-series generation using reservoir computing

Published 27 Oct 2024 in nlin.CD (arXiv:2410.20393v2)

Abstract: Reservoir computing (RC) harnesses the intrinsic dynamics of a chaotic system, called the reservoir, to perform various time-varying functions. An important use case of RC is the generation of target temporal sequences via a trainable output-to-reservoir feedback loop. Despite the promise of RC in various domains, we lack a theory of performance bounds on RC systems. Here, we formulate an existence condition for a feedback loop that produces the target sequence. We next demonstrate that, given a sufficiently chaotic neural network reservoir, two separate factors are needed for successful training: global network stability of the target orbit, and the ability of the training algorithm to drive the system close enough to the target, which we term "reach". By computing the training phase diagram over a range of target output amplitudes and periods, we verify that reach-limited failures depend on the training algorithm while stability-limited failures are invariant across different algorithms. We leverage dynamical mean field theory (DMFT) to provide an analytical amplitude-period bound on achievable outputs by RC networks and propose a way of enhancing algorithm reach via forgetting. The resulting mechanistic understanding of RC performance can guide the future design and deployment of reservoir networks.
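
The abstract describes the standard RC-with-feedback setup: a chaotic rate network whose scalar output is fed back into the reservoir through fixed weights, with only the readout trained online to reproduce a target sequence of a given amplitude and period. Below is a minimal sketch of that setup, assuming a FORCE/recursive-least-squares (RLS) trainer; the paper does not specify its algorithms here, and the forgetting factor lam is a hypothetical stand-in for the "forgetting" mechanism the abstract mentions. Network sizes, gains, and the sinusoidal target are illustrative choices, not the paper's parameters.

    import numpy as np

    # A minimal sketch, not the paper's implementation: a chaotic rate-network
    # reservoir with a trained output-to-reservoir feedback loop. FORCE/RLS is
    # an assumed training algorithm; `lam` is a hypothetical forgetting factor.

    rng = np.random.default_rng(0)
    N, dt, tau = 500, 0.1, 1.0
    g = 1.5                                            # g > 1: chaotic regime
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent weights (fixed)
    w_fb = rng.uniform(-1.0, 1.0, N)                   # feedback weights (fixed)
    w_out = np.zeros(N)                                # trained readout
    P = np.eye(N)                                      # RLS inverse-correlation estimate
    lam = 0.999                                        # forgetting factor (assumed)

    # Target output: a sinusoid with amplitude A and period T, the two axes
    # of the training phase diagram discussed in the abstract.
    A, T = 1.5, 60.0
    t_axis = np.arange(0.0, 2000.0, dt)
    f = A * np.sin(2 * np.pi * t_axis / T)

    x = 0.5 * rng.standard_normal(N)                   # reservoir state
    for k in range(len(t_axis)):
        r = np.tanh(x)
        z = w_out @ r                                  # network output
        # reservoir dynamics with the output fed back into the network
        x += dt / tau * (-x + J @ r + w_fb * z)
        if k % 2 == 0:                                 # RLS readout update
            Pr = P @ r
            gain = Pr / (lam + r @ Pr)
            P = (P - np.outer(gain, Pr)) / lam
            w_out += (f[k] - z) * gain

    print("final output error:", abs(z - f[-1]))

In this sketch, training failures of the two kinds the abstract distinguishes would show up differently: if the RLS updates never pull z close to f, the failure is reach-limited (and sensitive to lam and the update schedule), whereas if the error collapses during training but the orbit escapes once updates stop, the failure is stability-limited.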
