Scalable Projection-Free Optimization Methods via MultiRadial Duality Theory
Abstract: Recent works have developed new projection-free first-order methods that use linesearches and normal vector computations to maintain feasibility. These oracles can be cheaper than orthogonal projection or linear optimization subroutines, but they require a known strictly feasible point with respect to which the linesearches are performed. In this work, we develop new theory and algorithms that can operate with these cheaper linesearches while requiring only knowledge of points strictly satisfying each constraint separately. We establish convergence theory for several resulting "multiradial" gradient methods, and we provide preliminary numerics on synthetic quadratically constrained quadratic programs showing that performance is essentially independent of how the reference points are selected.
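To make the feasibility-maintaining linesearch concrete, the following is a minimal illustrative sketch (not the paper's algorithm): given a single constraint g(x) <= 0 and a reference point e strictly satisfying it (g(e) < 0), a bisection along the segment from e toward a candidate x finds the largest feasible step. The function name `radial_linesearch`, the unit-ball example, and all tolerances are our own assumptions for illustration.

```python
import numpy as np

def radial_linesearch(g, e, x, tol=1e-10, max_iter=200):
    """Bisection from a strictly feasible reference point e (g(e) < 0)
    toward x: returns the largest t in [0, 1] with g(e + t*(x - e)) <= 0.
    Illustrative sketch; the paper's multiradial methods use one such
    reference point per constraint rather than one for the whole set."""
    assert g(e) < 0, "reference point must be strictly feasible"
    if g(x) <= 0:
        return 1.0  # x is already feasible; take the full step
    lo, hi = 0.0, 1.0  # invariant: g at lo-point feasible, at hi-point infeasible
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if g(e + mid * (x - e)) <= 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return lo

# Example: unit-ball constraint g(v) = ||v||^2 - 1 with reference e = 0.
g = lambda v: float(np.dot(v, v) - 1.0)
e = np.zeros(2)
x = np.array([2.0, 0.0])
t = radial_linesearch(g, e, x)  # boundary is hit near t = 0.5
```

Each bisection step costs only one constraint evaluation, which is the sense in which such oracles can be cheaper than an orthogonal projection onto the full feasible region.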