Secant acceleration of sequential residual methods for solving large-scale nonlinear systems of equations
Abstract: Sequential Residual Methods solve nonlinear systems of equations $F(x)=0$ by iteratively updating the current approximate solution along a residual-related direction. Their memory requirements are therefore minimal, which makes these methods attractive for solving large-scale nonlinear systems. However, the convergence of these algorithms may be slow in critical cases, so acceleration procedures are welcome. In this paper, we suggest employing a variation of the Sequential Secant Method to accelerate Sequential Residual Methods. The performance of the resulting algorithm is illustrated by applying it to the solution of very large problems arising from the discretization of partial differential equations.
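To make the idea concrete, below is a minimal sketch of a sequential residual iteration of the kind the abstract describes: the search direction is the residual $-F(x_k)$, scaled by a spectral (Barzilai–Borwein) coefficient, which is the simplest secant-type scaling. This is an illustrative simplification, not the authors' accelerated algorithm; the problem instance, the crude step-halving safeguard, and all parameter values are assumptions chosen so the example runs self-contained.

```python
import numpy as np

def spectral_residual(F, x0, tol=1e-9, max_iter=200):
    """Minimal spectral residual iteration for F(x) = 0.

    The update is x_{k+1} = x_k - sigma_k * F(x_k), where sigma_k is the
    Barzilai-Borwein coefficient (s @ s) / (s @ y) -- a one-pair secant-type
    scaling.  Only a few vectors are stored, which is why residual methods
    suit large-scale systems.  Production solvers add a nonmonotone line
    search; here we merely halve the step when the residual grows.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    sigma = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        step = sigma
        while True:  # crude safeguard standing in for a line search
            x_new = x - step * Fx
            F_new = F(x_new)
            if np.linalg.norm(F_new) < np.linalg.norm(Fx) or step < 1e-8:
                break
            step *= 0.5
        s, y = x_new - x, F_new - Fx
        denom = s @ y
        sigma = (s @ s) / denom if abs(denom) > 1e-14 else 1.0
        sigma = min(max(abs(sigma), 1e-4), 1e4)  # keep the scaling bounded
        x, Fx = x_new, F_new
    return x, np.linalg.norm(Fx)

# Toy test system (an assumption, not from the paper): F(x) = x - 0.5*cos(M x)
# is a contraction residual, so the residual direction alone converges and the
# secant-type scaling merely speeds it up.
n = 100
M = 0.5 * np.eye(n) + 0.01 * np.ones((n, n))
F = lambda x: x - 0.5 * np.cos(M @ x)
x, res = spectral_residual(F, np.zeros(n))
print(res)
```

The one-pair secant scaling above is the cheapest member of the family; the paper's proposal retains several recent step/residual-difference pairs in the spirit of the Sequential Secant Method, trading a little memory for faster convergence.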