An Enhanced Levenberg--Marquardt Method via Gram Reduction
Abstract: This paper studies the problem of solving the system of nonlinear equations ${\bf F}({\bf x})={\bf 0}$, where ${\bf F}:{\mathbb R}^{d}\to{\mathbb R}^{d}$. We propose the Gram-Reduced Levenberg--Marquardt method, which updates the Gram matrix ${\bf J}(\cdot)^\top{\bf J}(\cdot)$ only once every $m$ iterations, where ${\bf J}(\cdot)$ is the Jacobian of ${\bf F}(\cdot)$. Our method has a global convergence guarantee without relying on any line-search step or sub-problem solve. We prove that our method takes at most $\mathcal{O}(m^2+m^{-0.5}\epsilon^{-2.5})$ iterations to find an $\epsilon$-stationary point of $\frac{1}{2}\|{\bf F}(\cdot)\|^2$, which leads to an overall computational cost of $\mathcal{O}(d^3\epsilon^{-1}+d^2\epsilon^{-2})$ by taking $m=\Theta(\epsilon^{-1})$. This result is strictly better than the $\mathcal{O}(d^3\epsilon^{-2})$ cost of existing Levenberg--Marquardt methods. We also show that the proposed method enjoys a local superlinear convergence rate under a non-degeneracy assumption. We provide experiments on real-world applications in scientific computing and machine learning to validate the efficiency of the proposed method.
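The core idea above can be sketched in a few lines: run damped Gauss--Newton steps on $\frac{1}{2}\|{\bf F}({\bf x})\|^2$, but refresh the expensive Gram matrix ${\bf J}^\top{\bf J}$ only once every $m$ iterations. The damping constant, stopping rule, and toy test problem below are illustrative assumptions, not the paper's exact algorithm or schedule.

```python
import numpy as np

def gram_reduced_lm(F, J, x0, m=5, lam=1e-2, max_iter=200, tol=1e-10):
    """Levenberg--Marquardt iteration that reuses a cached Gram matrix.

    The O(d^3) work of forming (and factorizing) J^T J is paid only once
    every m iterations; each step still uses the current gradient J^T F.
    """
    x = x0.astype(float)
    G = None
    for k in range(max_iter):
        if k % m == 0:                        # refresh Gram matrix every m steps
            Jx = J(x)
            G = Jx.T @ Jx                     # Gram matrix J(x)^T J(x)
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        g = J(x).T @ r                        # gradient of 0.5 * ||F(x)||^2
        # LM step with the (possibly stale) Gram matrix: (G + lam*I) d = -g
        d = np.linalg.solve(G + lam * np.eye(len(x)), -g)
        x = x + d
    return x

# Toy system (illustrative): F(x) = [x0^2 - 1, x0 + x1 - 2], root at (1, 1).
F = lambda x: np.array([x[0]**2 - 1.0, x[0] + x[1] - 2.0])
J = lambda x: np.array([[2.0 * x[0], 0.0], [1.0, 1.0]])
root = gram_reduced_lm(F, J, np.array([2.0, 0.0]))
```

In a practical implementation one would also cache a factorization of ${\bf G} + \lambda{\bf I}$ across the $m$ reuse steps, since that factorization, not the matrix product alone, is where the per-iteration $d^3$ saving comes from.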