Adaptive Conditional Gradient Sliding: Projection-Free and Line-Search-Free Acceleration
Abstract: We study convex optimization problems over a compact convex set where projections are expensive but a linear minimization oracle (LMO) is available. We propose the adaptive conditional gradient sliding method (AdCGS), a projection-free and line-search-free method that achieves Nesterov-type acceleration using adaptive stepsizes based on local Lipschitz estimates. AdCGS combines an accelerated outer scheme with an LMO-based inner routine. It reuses gradients across multiple LMO calls to reduce gradient evaluations, while controlling the subproblem inexactness via a prescribed accuracy level coupled with the adaptive stepsizes. We prove accelerated convergence rates for convex objective functions matching those of projection-based accelerated methods, while requiring no projection oracle. For strongly convex objective functions, we further establish linear convergence without additional geometric assumptions on the constraint set, such as requiring it to be a polytope or strongly convex. Experiments on constrained $\ell_p$ regression, logistic regression on real-world datasets, and least-squares problems demonstrate improvements over both projection-free and projection-based baselines.
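To make the structure described above concrete, here is a minimal sketch of a conditional-gradient-sliding-style method with an adaptive Lipschitz estimate. This is an illustration under stated assumptions, not the paper's exact AdCGS specification: the $\ell_1$-ball LMO, the AdGD-style estimate of the local Lipschitz constant from consecutive gradients, and the particular choices of the weights `gamma`, the prox parameter `beta`, and the inner tolerance `eta` are all hypothetical placeholders for the paper's parameter rules.

```python
import numpy as np

def lmo_l1(g, radius=1.0):
    """Linear minimization oracle over the l1 ball: argmin_{||u||_1 <= r} <g, u>."""
    i = np.argmax(np.abs(g))
    u = np.zeros_like(g)
    u[i] = -radius * np.sign(g[i])
    return u

def inner_cg(g, x0, beta, eta, lmo, max_iter=100):
    """Approximately minimize <g, u> + (beta/2)||u - x0||^2 with Frank-Wolfe steps,
    stopping once the Frank-Wolfe gap certifies eta-inexactness. Only LMO calls,
    no projections; the gradient g is reused across all inner LMO calls."""
    u = x0.copy()
    for _ in range(max_iter):
        grad_phi = g + beta * (u - x0)        # gradient of the quadratic subproblem
        v = lmo(grad_phi)
        gap = grad_phi @ (u - v)              # Frank-Wolfe gap bounds the suboptimality
        if gap <= eta:
            break
        d = v - u
        step = min(1.0, gap / (beta * (d @ d) + 1e-16))  # exact step for the quadratic
        u = u + step * d
    return u

def adcgs_sketch(grad, x0, lmo, n_iters=100, L0=1.0):
    """Accelerated outer loop with an LMO-based inner routine and an adaptive
    local Lipschitz estimate; no line search and no projection oracle."""
    x = x0.copy()
    z = x0.copy()                              # momentum/aggregation sequence
    y_prev, g_prev, L = x0.copy(), grad(x0), L0
    for k in range(1, n_iters + 1):
        gamma = 2.0 / (k + 1)                  # standard accelerated weights (illustrative)
        y = (1 - gamma) * x + gamma * z
        g = grad(y)
        dy = np.linalg.norm(y - y_prev)
        if dy > 1e-12:                         # local Lipschitz estimate (assumed rule)
            L = max(np.linalg.norm(g - g_prev) / dy, 1e-12)
        beta = L * gamma                       # stepsize tied to the local estimate
        eta = L / (k * (k + 1))                # prescribed, shrinking inner accuracy
        z = inner_cg(g, z, beta, eta, lmo)     # inexact prox step via LMO calls only
        x = (1 - gamma) * x + gamma * z
        y_prev, g_prev = y, g
    return x
```

For instance, on a least-squares instance $f(x) = \tfrac{1}{2}\|Ax - b\|^2$ one can call `adcgs_sketch(lambda x: A.T @ (A @ x - b), np.zeros(A.shape[1]), lmo_l1)`; the key design point is that each outer gradient is amortized over many cheap LMO calls, while the adaptive estimate `L` replaces both a known Lipschitz constant and a line search.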