
Stochastic versus Deterministic in Stochastic Gradient Descent

Published 3 Sep 2025 in math.OC (arXiv:2509.02912v1)

Abstract: This paper considers mini-batch stochastic gradient descent (SGD) for a structured minimization problem whose objective is the sum of a finite-sum function, whose gradient is approximated stochastically, and an independent term, whose gradient is computed deterministically. We focus on the stochastic versus deterministic behavior of mini-batch SGD in this setting. A convergence analysis is provided that captures the different roles of these two parts. Under smoothness and convexity assumptions, linear convergence of the algorithm to a neighborhood of the minimizer is established. The step size, the convergence rate, and the radius of the convergence region depend asymmetrically on the characteristics of the two components, which shows the distinct impacts of stochastic approximation versus deterministic computation in mini-batch SGD. Moreover, a better convergence rate is obtained when the independent term endows the objective function with sufficient strong convexity. In addition, the expected convergence rate of the algorithm approaches that of classical gradient descent as the batch size increases. Numerical experiments support the theoretical analysis.
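As a rough illustration of the setting only (not the paper's algorithm or analysis), here is a minimal mini-batch SGD sketch in Python. It assumes a hypothetical test problem: a least-squares finite sum for the stochastically approximated part and a quadratic regularizer as the independent, deterministically computed term; all names (grad_f_batch, grad_h, minibatch_sgd) and parameter values are illustrative.

```python
# Sketch of the structured problem: minimize
#   F(x) = (1/n) * sum_i f_i(x) + h(x),
# where the gradient of the finite-sum part is estimated on a mini-batch
# and the gradient of the independent term h is computed exactly.
# The quadratic instance below is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(0)

n, d = 1000, 10                   # number of components, dimension
A = rng.standard_normal((n, d))   # data for f_i(x) = 0.5 * (a_i^T x - b_i)^2
b = rng.standard_normal(n)
mu = 0.1                          # h(x) = 0.5 * mu * ||x||^2 adds strong convexity

def grad_f_batch(x, idx):
    """Stochastic gradient of the finite-sum part on mini-batch idx."""
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def grad_h(x):
    """Deterministic gradient of the independent term h."""
    return mu * x

def minibatch_sgd(x0, step, batch_size, iters):
    x = x0.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=batch_size, replace=False)  # sample batch
        x -= step * (grad_f_batch(x, idx) + grad_h(x))       # combined step
    return x

x_hat = minibatch_sgd(np.zeros(d), step=1e-3, batch_size=32, iters=5000)
```

Increasing batch_size shrinks the variance of the stochastic part of the update, which is consistent with the abstract's claim that the expected convergence rate approaches that of classical gradient descent as the batch size grows.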

