Attention-Enhanced Reservoir Computing as a Multiple Dynamical System Approximator

Published 9 May 2025 in nlin.CD (arXiv:2505.05852v2)

Abstract: Reservoir computing has proven effective for tasks such as time-series prediction, particularly in the context of chaotic systems. However, conventional reservoir computing frameworks often face challenges in achieving high prediction accuracy and adapting to diverse dynamical problems due to their reliance on fixed weight structures. A concept of an attention-enhanced reservoir computer has been proposed, which integrates an attention mechanism into the output layer of the reservoir computing model. This addition enables the system to prioritize distinct features dynamically, enhancing adaptability and prediction performance. In this study, we demonstrate the capability of the attention-enhanced reservoir computer to learn and predict multiple chaotic attractors simultaneously with a single set of weights, thus enabling transitions between attractors without explicit retraining. The method is validated using benchmark tasks, including the Lorenz system, Rössler system, Hénon map, Duffing oscillator, and Mackey-Glass delay-differential equation. Our results indicate that the attention-enhanced reservoir computer achieves superior prediction accuracy, valid prediction times, and improved representation of spectral and histogram characteristics compared to traditional reservoir computing methods, establishing it as a robust tool for modeling complex dynamical systems.
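The abstract describes attaching an attention mechanism to the output layer of an otherwise standard reservoir computer. A minimal, self-contained sketch of that idea is below. It is not the authors' architecture: the fixed random query matrix `W_q`, the per-unit softmax gating, and the logistic-map benchmark are all simplifications chosen for brevity (the paper's benchmarks are Lorenz, Rössler, Hénon, Duffing, and Mackey-Glass, and its attention weights are presumably trained rather than random).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in teacher signal: the logistic map, a simple chaotic system.
def logistic_series(n, x=0.3, r=3.9):
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

u = logistic_series(1200)
washout, train, test = 100, 1000, 200

# Fixed random reservoir (echo-state style), spectral radius scaled to 0.9.
N = 200
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

states = np.zeros((len(u), N))
r_state = np.zeros(N)
for t in range(len(u) - 1):
    r_state = np.tanh(W @ r_state + W_in * u[t])
    states[t + 1] = r_state  # states[t] is used to predict u[t]

# Attention gate on the readout: softmax scores over reservoir units,
# computed from the state itself via a fixed random query matrix
# (hypothetical simplification; real attention weights would be trained).
W_q = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

def attend(S):
    scores = S @ W_q
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum(axis=1, keepdims=True)    # softmax over units
    return alpha * S                             # attention-gated features

# Ridge-regression readout on the gated states (one-step-ahead prediction).
X, y = attend(states[washout:train]), u[washout:train]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

pred = attend(states[train:train + test]) @ W_out
nrmse = np.sqrt(np.mean((pred - u[train:train + test]) ** 2)) / np.std(u)
print(f"one-step NRMSE: {nrmse:.3f}")
```

Because the gate makes the readout's effective weights state-dependent, the same trained `W_out` can emphasize different reservoir units in different dynamical regimes, which is the mechanism the abstract credits for handling multiple attractors with a single weight set.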
