Forsythe Conjecture on Restarted s-Gradient Methods

Prove that for every restart length s with 2 ≤ s < d(A), where d(A) is the degree of the minimal polynomial of A, each of the two subsequences {y_{2k}} and {y_{2k+1}} of normalized residual (gradient) vectors generated by the conjugate gradient method restarted every s steps converges to a single limit vector; equivalently, establish Forsythe's conjecture beyond the known case s = 1.

Background

Forsythe studied the asymptotic behavior of the conjugate gradient method restarted every s steps and observed that, for small s, the normalized residuals settle into two alternating limiting directions; he proved this behavior for s = 1 and gave numerical evidence for s = 2.

The general case remains unresolved despite the foundational importance of gradient-based methods; a positive resolution would imply that fixed-length restarted methods converge at best linearly in the asymptotic regime.
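The proved s = 1 case can be observed numerically: restarting CG after one step is steepest descent with exact line search, and the normalized gradients asymptotically alternate between two fixed directions. The sketch below, a minimal illustration with an arbitrarily chosen diagonal SPD matrix (the matrix, seed, and iteration count are assumptions, not from the source), tracks the normalized gradients y_k and compares the even, odd, and consecutive subsequence gaps.

```python
import numpy as np

# Steepest descent on the quadratic f(x) = x^T A x / 2 with exact
# line search, i.e. the s = 1 restarted gradient method of the
# Forsythe conjecture.  A and the starting point are illustrative.
rng = np.random.default_rng(0)
A = np.diag([1.0, 3.0, 7.0, 15.0])    # SPD, distinct eigenvalues, d(A) = 4
x = rng.standard_normal(4)

ys = []                               # normalized gradients y_k = g_k / ||g_k||
for k in range(200):
    g = A @ x                         # gradient of the quadratic
    alpha = (g @ g) / (g @ (A @ g))   # exact line-search step length
    ys.append(g / np.linalg.norm(g))
    x = x - alpha * g

# The even and odd subsequences each approach a single limit vector,
# while consecutive normalized gradients stay far apart (they are
# exactly orthogonal under exact line search, so the gap is sqrt(2)).
even_gap = np.linalg.norm(ys[196] - ys[198])
odd_gap = np.linalg.norm(ys[197] - ys[199])
cross_gap = np.linalg.norm(ys[198] - ys[199])
print(even_gap, odd_gap, cross_gap)
```

The printed even and odd gaps shrink toward zero while the consecutive gap stays near sqrt(2), matching the alternating two-limit picture; the open question is whether the analogous subsequence convergence holds for every restart length 2 ≤ s < d(A).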

References

In general, however, the Forsythe conjecture remains largely open, which is quite surprising in light of the popularity and widespread use of methods like steepest descent, gradient descent, and CG.

Linear Systems and Eigenvalue Problems: Open Questions from a Simons Workshop  (2602.05394 - Amsel et al., 5 Feb 2026) in Subsection "The Forsythe conjecture" (Section 2)