First- and Second-Order Stochastic Adaptive Regularization with Cubics: High Probability Iteration and Sample Complexity
Abstract: We present high-probability (and expectation) complexity bounds for two versions of stochastic adaptive regularization methods with cubics (SARC), also known as regularized Newton methods. The first algorithm aims to find first-order stationary points, while the second targets second-order optimality conditions. Both methods employ stochastic zeroth-, first-, and second-order oracles with specific accuracy and reliability requirements. These oracles, which have been previously used with other stochastic adaptive methods such as trust-region and line-search algorithms, are applicable to a variety of optimization settings, including expected risk minimization and simulation optimization. In this paper, we establish the first high-probability iteration and sample complexity bounds for both first- and second-order SARC algorithms. Our analysis demonstrates that, as in the deterministic case, their complexity bounds improve on those of other stochastic adaptive methods.
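To fix ideas, the deterministic adaptive regularization with cubics (ARC) scheme that SARC generalizes proceeds by minimizing, at each iterate, a cubic-regularized second-order model and adapting the regularization weight based on how well the model predicts the actual decrease. The sketch below is an illustrative implementation of that classical deterministic template only, not of the paper's stochastic algorithms or oracle machinery; the subproblem solver uses the standard characterization that the global minimizer of the cubic model satisfies \((H + \lambda I)s = -g\) with \(\lambda = \sigma\|s\|\), and ignores the degenerate "hard case" for simplicity.

```python
import numpy as np

def cubic_subproblem(g, H, sigma):
    """Globally minimize m(s) = g^T s + 0.5 s^T H s + (sigma/3)||s||^3.
    Uses the optimality condition (H + lam*I) s = -g with lam = sigma*||s||,
    solved by bisection on lam. The so-called "hard case" (g orthogonal to
    the leftmost eigenvector) is ignored in this illustrative sketch."""
    n = len(g)
    lam_min = np.linalg.eigvalsh(H)[0]
    lo = max(0.0, -lam_min) + 1e-12      # keep H + lam*I positive definite
    norm_s = lambda lam: np.linalg.norm(
        np.linalg.solve(H + lam * np.eye(n), -g))
    # ||s(lam)|| decreases in lam while lam/sigma increases: bracket the
    # crossing point, then bisect.
    hi = max(2 * lo, 1.0)
    while norm_s(hi) > hi / sigma:
        hi *= 2
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_s(mid) > mid / sigma:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return np.linalg.solve(H + lam * np.eye(n), -g)

def arc(f, grad, hess, x0, sigma=1.0, eta=0.1, max_iter=500, gtol=1e-8):
    """Deterministic ARC iteration with a simple sigma update rule
    (accept and halve sigma on success, reject and double on failure)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < gtol:     # first-order stationarity reached
            break
        s = cubic_subproblem(g, H, sigma)
        model_decrease = -(g @ s + 0.5 * s @ H @ s
                           + sigma / 3 * np.linalg.norm(s) ** 3)
        rho = (f(x) - f(x + s)) / max(model_decrease, 1e-16)
        if rho >= eta:                   # successful step
            x = x + s
            sigma = max(sigma / 2, 1e-8)
        else:                            # unsuccessful: keep x, regularize more
            sigma *= 2
    return x
```

The stochastic variants analyzed in the paper replace the exact values `f(x)`, `grad(x)`, and `hess(x)` above with zeroth-, first-, and second-order oracle outputs that are only accurate with sufficiently high probability, which is what drives the high-probability complexity analysis.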