A Derivative-Free Saddle-Search Algorithm with Linear Convergence Rate
Abstract: We propose a derivative-free saddle-search algorithm designed to locate transition states using only function evaluations. The algorithm employs a nested architecture consisting of an inner eigenvector search and an outer saddle-point search. Through rigorous analysis, we prove almost sure convergence of the inner step under suitable assumptions. We further establish convergence of the outer search with a decaying step size, and linear convergence under a constant step size together with boundedness conditions. Numerical experiments validate the theoretical results and demonstrate the algorithm's practical applicability.
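To make the nested architecture concrete, below is a minimal sketch of one way such a scheme can be organized: an inner loop that estimates the minimum-curvature direction by descending the Rayleigh quotient with finite-difference Hessian-vector products, and an outer loop that reverses the force component along that direction (a dimer-style update). This is an illustrative generic construction, not the paper's algorithm; all function names and step-size choices here are assumptions.

```python
import numpy as np

def fd_grad(f, x, h=1e-5):
    """Forward-difference gradient built from function evaluations only."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def saddle_search(f, x0, outer_steps=500, lr=0.02, inner_steps=20):
    """Illustrative derivative-free saddle search (dimer-style sketch).

    Inner loop: refine the softest-mode direction v via Rayleigh-quotient
    descent, using finite-difference Hessian-vector products.
    Outer loop: flip the force component along v so the iterate climbs
    uphill along the soft mode and descends in the orthogonal complement.
    """
    x = np.asarray(x0, dtype=float).copy()
    v = np.random.default_rng(0).standard_normal(x.size)
    v /= np.linalg.norm(v)
    for _ in range(outer_steps):
        for _ in range(inner_steps):
            d = 1e-4
            # Finite-difference Hessian-vector product H(x) v.
            Hv = (fd_grad(f, x + d * v) - fd_grad(f, x - d * v)) / (2 * d)
            v -= 0.1 * (Hv - (v @ Hv) * v)  # Rayleigh-quotient descent step
            v /= np.linalg.norm(v)
        g = fd_grad(f, x)
        x -= lr * (g - 2 * (g @ v) * v)     # reflect force along soft mode
    return x, v
```

On the quadratic test function f(x, y) = x^2 - y^2, whose only stationary point is a saddle at the origin with soft mode along y, the iteration drives the iterate to the saddle and aligns v with the negative-curvature direction.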