
A Proximal Stochastic Gradient Method with Adaptive Step Size and Variance Reduction for Convex Composite Optimization

Published 14 Sep 2025 in math.OC (arXiv:2509.11043v1)

Abstract: In this paper, we propose a proximal stochastic gradient algorithm (PSGA) for solving composite optimization problems by incorporating variance reduction techniques and an adaptive step-size strategy. In the PSGA method, the objective function consists of two components: a smooth convex function and a non-smooth convex function. We establish the strong convergence of the proposed method, provided that the smooth convex function is Lipschitz continuous. We also prove that the expected error between the estimated gradient and the actual gradient converges to zero. Furthermore, we obtain an $O(\sqrt{1/k})$ convergence rate for our method. Finally, the effectiveness of the proposed method is validated through numerical experiments on logistic regression and Lasso regression.
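
The abstract describes a proximal stochastic gradient scheme for a composite objective (smooth convex part plus non-smooth convex part) with variance reduction and an adaptive step size. As a rough illustration only, and not the authors' exact PSGA, the sketch below applies an SVRG-style variance-reduced proximal stochastic gradient iteration to a Lasso problem; the diminishing step-size rule eta0/sqrt(k), the function names, and the problem sizes are assumptions made for this example.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_sgd_vr_lasso(A, b, lam, n_epochs=20, eta0=1.0, seed=0):
    """Illustrative variance-reduced proximal stochastic gradient method
    for the Lasso problem
        min_x (1/2n) * ||A x - b||^2 + lam * ||x||_1.
    The step size eta_k = eta0 / sqrt(k) is a placeholder rule, not
    necessarily the adaptive strategy analyzed in the paper."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    k = 0
    for _ in range(n_epochs):
        # Full gradient at a reference point (variance-reduction anchor).
        x_ref = x.copy()
        full_grad = A.T @ (A @ x_ref - b) / n
        for _ in range(n):
            k += 1
            eta = eta0 / np.sqrt(k)  # assumed diminishing step size
            i = rng.integers(n)
            a_i = A[i]
            # Variance-reduced stochastic gradient of the smooth part.
            g = a_i * (a_i @ x - b[i]) - a_i * (a_i @ x_ref - b[i]) + full_grad
            # Proximal step handles the non-smooth l1 term.
            x = soft_threshold(x - eta * g, eta * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = prox_sgd_vr_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```

The inner update is the standard proximal stochastic gradient step: take a variance-reduced gradient step on the smooth part, then apply the proximal operator of the non-smooth part (here, soft-thresholding for the l1 penalty).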
