Optimizer-invariance of capabilities under preserved spectral edge event ordering (Conjecture)

Prove that the capabilities of the trained model are invariant under continuous deformations of the optimizer that preserve the ordering and type of spectral edge events, thereby establishing an optimizer-invariant characterization of capabilities in terms of spectral edge dynamics.
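One way to state the claim symbolically (the notation below is ours, not taken from the paper): write $E(\mathcal{O})$ for the ordered sequence of spectral edge events produced by training under optimizer $\mathcal{O}$, each event typed as a gap collapse or a gap opening.

```latex
% Hypothetical formalization; all symbols are introduced here for
% illustration and do not appear in the source.
% O^{(s)}, s in [0,1]: a continuous family (deformation) of optimizers.
% E(O): the ordered, typed sequence of spectral edge events under O.
\[
  E\!\left(\mathcal{O}^{(s)}\right) = E\!\left(\mathcal{O}^{(0)}\right)
  \ \text{for all } s \in [0,1]
  \;\Longrightarrow\;
  \mathrm{Cap}\!\left(\theta^{*}_{\mathcal{O}^{(1)}}\right)
  = \mathrm{Cap}\!\left(\theta^{*}_{\mathcal{O}^{(0)}}\right),
\]
% where theta*_O denotes the trained parameters obtained with optimizer O
% and Cap(.) the capability profile of the resulting model.
```

Under this reading, the conjecture asserts that capabilities depend on the optimizer only through the combinatorial event sequence $E(\mathcal{O})$, not through the continuous details of the trajectory.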

Background

In the geometric-flow perspective, spectral edge events (gap collapses/openings) play a role analogous to critical points and flow lines. The authors observe that different optimizers (e.g., AdamW versus Muon) can produce distinct spectral edge dynamics yet reach comparable capabilities, suggesting a deeper invariant.

The conjecture posits that as long as optimizer changes preserve the ordering and type of spectral edge events, the final capabilities remain unchanged, hinting at a Floer-theory-style invariant for training dynamics.
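The hypothesis of the conjecture is combinatorial: two training runs must produce the same ordered, typed sequence of gap collapses and openings, even if the events occur at different times. A minimal sketch of how one might extract and compare such event sequences from spectral-gap trajectories (all function names, thresholds, and data below are hypothetical, not from the paper):

```python
# Illustrative sketch: detect spectral edge events (gap collapses and
# openings) along a training trajectory, then check whether two
# optimizers share the same event ordering and types. The threshold
# eps and all trajectories are invented for illustration.

def edge_events(gap_trajectory, eps=0.05):
    """Return the ordered list of event types as the spectral gap
    crosses eps: 'collapse' when it drops below, 'open' when it
    rises back above."""
    events = []
    below = gap_trajectory[0] < eps
    for g in gap_trajectory[1:]:
        if not below and g < eps:
            events.append("collapse")
            below = True
        elif below and g >= eps:
            events.append("open")
            below = False
    return events

def same_event_structure(gaps_a, gaps_b, eps=0.05):
    """The conjecture's hypothesis: both runs exhibit the same
    ordering and type of spectral edge events (timing may differ)."""
    return edge_events(gaps_a, eps) == edge_events(gaps_b, eps)

# Two hypothetical gap trajectories with identical event structure
# (one collapse, then one opening) at different times.
adamw_gaps = [0.30, 0.20, 0.04, 0.02, 0.10, 0.25]
muon_gaps = [0.40, 0.03, 0.01, 0.06, 0.30, 0.35]
print(same_event_structure(adamw_gaps, muon_gaps))  # True
```

If the conjecture holds, agreement of these event sequences would suffice for the two trained models to have the same capabilities; the sketch only formalizes the hypothesis side, since "capabilities" has no comparably simple computational proxy.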

References

The capabilities of the trained model are invariant under continuous deformations of the optimizer, provided these deformations preserve the ordering and type of spectral edge events.

The Spectral Edge Thesis: A Mathematical Framework for Intra-Signal Phase Transitions in Neural Network Training  (2603.28964 - Xu, 30 Mar 2026) in Conjecture [Informal], Section 15 (The Geometric Flow Connection)