An Additively Preconditioned Trust Region Strategy for Machine Learning

Published 16 Dec 2025 in math.NA and math.OC | (2512.14286v1)

Abstract: Modern machine learning, especially the training of deep neural networks, depends on solving large-scale, highly nonconvex optimization problems whose objective functions exhibit rough landscapes. Motivated by the success of parallel preconditioners for Krylov methods applied to large-scale linear systems, we introduce a novel nonlinearly preconditioned Trust-Region method that applies an additive Schwarz correction at each minimization step, thereby accelerating convergence. More precisely, we propose a variant of the Additively Preconditioned Trust-Region Strategy (APTS), which combines a right-preconditioned additive Schwarz framework with a classical Trust-Region algorithm. By decomposing the parameter space into sub-domains, APTS solves local nonlinear sub-problems in parallel and assembles their corrections additively. The resulting method not only converges quickly; thanks to the underlying Trust-Region strategy, it also largely obviates the need for hyperparameter tuning.
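The abstract outlines the key loop: split the parameters into sub-domains, solve a local trust-region sub-problem on each block in parallel, sum the corrections, and accept or reject the combined step with a global trust-region ratio test. The following is a minimal sketch of that structure, not the paper's actual algorithm: the function names, the clipped steepest-descent local solver, and the acceptance thresholds (0.25, factor-2 radius updates) are illustrative assumptions, and the per-block solves are written sequentially where the paper runs them in parallel.

```python
import numpy as np

def local_tr_step(grad_block, radius):
    # Illustrative local solver: a steepest-descent (Cauchy-point-style)
    # correction clipped to the trust radius. The paper solves a full
    # nonlinear sub-problem on each sub-domain instead.
    norm = np.linalg.norm(grad_block)
    if norm == 0:
        return np.zeros_like(grad_block)
    step = -grad_block
    if norm > radius:
        step *= radius / norm
    return step

def apts_step(f, grad, x, blocks, radius):
    """One additively preconditioned trust-region step (sketch).

    Each block computes a local correction independently (parallel in
    the paper); corrections are assembled additively, then accepted or
    rejected by a global trust-region ratio test.
    """
    g = grad(x)
    correction = np.zeros_like(x)
    for idx in blocks:                       # independent per-block solves
        correction[idx] = local_tr_step(g[idx], radius)
    pred = -g @ correction                   # predicted decrease (linear model)
    ared = f(x) - f(x + correction)          # actual decrease
    rho = ared / pred if pred > 0 else 0.0
    if rho > 0.25:                           # accept; possibly enlarge radius
        return x + correction, min(2 * radius, 1.0)
    return x, 0.5 * radius                   # reject; shrink radius

# Usage on a simple quadratic: the iterates are driven to the minimizer
# without any step-size (learning-rate) tuning.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([1.0, -2.0, 3.0, 0.5])
blocks = [slice(0, 2), slice(2, 4)]          # two parameter sub-domains
radius = 1.0
for _ in range(30):
    x, radius = apts_step(f, grad, x, blocks, radius)
```

The ratio test is what removes the hyperparameter sensitivity the abstract highlights: the radius adapts automatically, so no learning-rate schedule is needed.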
