A better convergence analysis of the block coordinate descent method for large-scale machine learning
Abstract: This paper considers the problem of unconstrained minimization of large-scale smooth convex functions with block-coordinate-wise Lipschitz continuous gradients. The block coordinate descent (BCD) method is among the first optimization schemes suggested for solving such problems \cite{nesterov2012efficiency}. We obtain a new, lower bound (to the best of our knowledge, the lowest currently known) on the information-based complexity of the BCD method, which is $16p^3$ times smaller than the best previously known bound. Our analysis is based on an effective technique called the Performance Estimation Problem (PEP), recently proposed by Drori and Teboulle \cite{drori2012performance} for analyzing the performance of first-order black-box optimization methods. Numerical tests confirm our analysis.
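To make the setting concrete, below is a minimal sketch of randomized block coordinate descent on a strongly convex quadratic, where each block $i$ is updated with the standard step size $1/L_i$ and $L_i$ is the block-wise Lipschitz constant of the gradient (here, the spectral norm of the corresponding diagonal block of the Hessian). The function name, block partition, and test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def block_coordinate_descent(A, b, block_size=10, n_iters=1000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x by randomized block coordinate descent.

    Each block i is updated with step 1/L_i, where L_i is the Lipschitz
    constant of the gradient restricted to block i (the largest eigenvalue
    of the corresponding diagonal block of A).
    """
    rng = np.random.default_rng(seed)
    n = len(b)
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    # Block-wise Lipschitz constants: largest eigenvalue of each diagonal block.
    L = [np.linalg.eigvalsh(A[np.ix_(idx, idx)])[-1] for idx in blocks]
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.integers(len(blocks))        # uniform random block choice
        idx = blocks[i]
        grad_block = A[idx] @ x - b[idx]     # gradient restricted to block i
        x[idx] -= grad_block / L[i]          # 1/L_i gradient step on the block
    return x

# Usage: a random strongly convex quadratic in R^100.
rng = np.random.default_rng(1)
M = rng.standard_normal((100, 100))
A = M @ M.T + np.eye(100)                    # positive definite Hessian
b = rng.standard_normal(100)
x = block_coordinate_descent(A, b)
print(np.linalg.norm(A @ x - b))             # residual should be small
```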