Stochastic ADMM with batch size adaptation for nonconvex nonsmooth optimization
Abstract: Stochastic alternating direction method of multipliers (SADMM) is a popular method for solving nonconvex nonsmooth finite-sum optimization problems in various applications. It usually requires an empirical choice of a static batch size for gradient estimation, which leads to a tricky trade-off between variance reduction and computational cost. In this work, we instead propose adaptive batch size SADMM, a practical method that dynamically adjusts the batch size based on the history of differences accumulated along the optimization path. A simple convergence analysis is developed to handle the dependence introduced by the batch size adaptation, and it matches the best known complexity while allowing flexible parameter choices. Furthermore, we extend this adaptive strategy to reduce the overall complexity of the popular variance-reduced algorithms SVRG-ADMM and SPIDER-ADMM. Numerical results validate the improvement achieved by our proposed SADMM with batch size adaptation.
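Below is a minimal sketch of the general idea, assuming a toy finite-sum lasso-style problem min_x (1/n) Σ_i ½(a_iᵀx − b_i)² + λ‖z‖₁ subject to x − z = 0. The abstract does not specify the paper's actual adaptation rule, so the rule used here (growing the batch once accumulated iterate differences stop shrinking) is a hypothetical illustration of batch size adaptation, and all names and parameters are invented for the example.

```python
# Hypothetical sketch: stochastic ADMM with an adaptive batch size.
# The adaptation rule below is an illustrative guess, not the paper's method.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (nonsmooth z-update)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_sadmm(A, b, lam=0.1, rho=1.0, eta=0.1,
                   batch0=8, max_batch=256, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d); z = np.zeros(d); u = np.zeros(d)
    batch = batch0
    diffs = []  # accumulated ||x_{k+1} - x_k||^2 along the optimization path
    for k in range(iters):
        # Minibatch stochastic gradient of the smooth finite-sum part.
        idx = rng.choice(n, size=min(batch, n), replace=False)
        g = A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
        # Linearized x-update: closed form of the proximal-linearized subproblem.
        x_new = (rho * (z - u) + x / eta - g) / (rho + 1.0 / eta)
        # z-update: soft-thresholding handles the l1 (nonsmooth) term.
        z = soft_threshold(x_new + u, lam / rho)
        # Dual update for the constraint x - z = 0.
        u = u + x_new - z
        # Hypothetical adaptation: if recent iterate differences stop
        # decreasing, enlarge the batch to cut gradient variance.
        diffs.append(np.sum((x_new - x) ** 2))
        if len(diffs) >= 20:
            recent = np.mean(diffs[-10:])
            older = np.mean(diffs[-20:-10])
            if recent >= 0.9 * older:
                batch = min(2 * batch, max_batch)
        x = x_new
    return x, z, batch

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((500, 50))
    x_true = np.zeros(50); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(500)
    x, z, final_batch = adaptive_sadmm(A, b)
    print("final batch size:", final_batch)
    print("recovered support:", np.flatnonzero(np.abs(z) > 0.1))
```

Starting with a small batch keeps early iterations cheap, while the growth rule spends larger batches only once progress stalls; this is one plausible way to realize the variance-versus-cost trade-off the abstract describes.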