Convergence Guarantees for the Two-Timescale RSGN Learning System
Establish rigorous convergence guarantees for the full two-timescale learning system of Resonant Sparse Geometry Networks (RSGN), which combines gradient descent with Hebbian structural learning, and determine the conditions (e.g., step-size schedules and timescale separation) under which the combined training dynamics converge.
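The RSGN update rules themselves are not given here, so the following is only a hypothetical sketch of the two-timescale structure in question: a fast stochastic-gradient variable coupled with a slower Hebbian tracking variable, with step sizes chosen to satisfy the standard two-timescale stochastic-approximation conditions (square-summable steps and a vanishing slow-to-fast step-size ratio). The model, step-size constants, and the particular Hebbian rule are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (hypothetical, not the RSGN model): linear regression
# y = x @ w_true + noise, with a fast gradient-descent variable w and
# a slow Hebbian variable h tracking the pre/post co-activation
# E[x * (x @ w)], which equals w here since E[x x^T] = I.
d = 5
w_true = rng.normal(size=d)
w = np.zeros(d)   # fast timescale: weights trained by SGD
h = np.zeros(d)   # slow timescale: Hebbian co-activation estimate

for t in range(1, 20001):
    x = rng.normal(size=d)
    y = x @ w_true + 0.01 * rng.normal()

    # Step sizes satisfy the usual two-timescale conditions:
    # sum a_t = sum b_t = inf, sum (a_t^2 + b_t^2) < inf,
    # and b_t / a_t = 10 * t^{-0.3} -> 0 (timescale separation).
    a_t = 0.1 * t ** -0.6
    b_t = t ** -0.9

    # Fast update: stochastic gradient step on the squared loss.
    w -= a_t * (x @ w - y) * x

    # Slow update: Hebbian rule nudging h toward pre * post activity.
    h += b_t * (x * (x @ w) - h)

print(np.linalg.norm(w - w_true))  # fast variable converges
print(np.linalg.norm(h - w_true))  # slow variable tracks the equilibrium
```

Under this separation, the slow variable effectively sees the fast variable at its equilibrium, which is the key structural assumption behind two-timescale convergence arguments; the open question is whether such conditions can be verified for the full RSGN dynamics, where the Hebbian updates change the network structure rather than a passive statistic.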
References
Theoretical Understanding: While we provide complexity analysis and stability conditions, complete convergence guarantees for the full system combining gradient descent with Hebbian structural learning remain an open theoretical question.
— Resonant Sparse Geometry Networks
(2601.18064 - Hays, 26 Jan 2026) in Section 6 (Discussion), Limitations and Future Work