Accelerated Algorithms for a Class of Optimization Problems with Equality and Box Constraints

Published 8 May 2023 in math.OC and cs.LG | arXiv:2305.04433v1

Abstract: Convex optimization with equality and inequality constraints arises ubiquitously in optimization and control problems for large-scale systems. Recently there has been considerable interest in establishing accelerated convergence of the loss function. A class of high-order tuners was recently proposed to achieve accelerated convergence in the unconstrained case. In this paper, we propose a new high-order tuner that accommodates the presence of equality constraints. To accommodate the underlying box constraints, time-varying gains are introduced into the high-order tuner; these leverage convexity and ensure anytime feasibility of the constraints. Numerical examples are provided to support the theoretical derivations.
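The paper's exact algorithm is not reproduced in this abstract, but the key idea it describes — keep iterates on the equality constraint while shrinking the step with a time-varying gain whenever a full step would leave the box — can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the toy problem, the heavy-ball-style momentum update standing in for the paper's high-order tuner, and the gains `gamma`, `beta` are all hypothetical, not the authors' method.

```python
import numpy as np

def box_step_cap(theta, d, lo, hi, eps=1e-12):
    """Largest t in [0, 1] such that theta + t*d stays in [lo, hi].

    theta is assumed feasible; shrinking the step this way is what
    keeps every iterate inside the box ("anytime feasibility").
    """
    t = 1.0
    for di, th, l, h in zip(d, theta, lo, hi):
        if di > eps:
            t = min(t, (h - th) / di)
        elif di < -eps:
            t = min(t, (l - th) / di)
    return max(t, 0.0)

# Toy problem (not from the paper): min 0.5*||theta - c||^2
# subject to sum(theta) = 1 and 0 <= theta <= 1.
c = np.array([0.6, 0.4, 0.2])
lo, hi = np.zeros(3), np.ones(3)
A = np.ones((1, 3))                      # equality constraint A @ theta = 1
P = np.eye(3) - A.T @ A / 3.0            # projector onto null(A): A@theta stays fixed

theta = np.full(3, 1.0 / 3.0)            # feasible start (on the affine set, in the box)
v = np.zeros(3)                          # momentum state, kept in null(A)
gamma, beta = 0.6, 0.8                   # illustrative gains
for _ in range(300):
    g = P @ (theta - c)                  # gradient projected onto null(A)
    v = beta * v - gamma * g             # momentum update (stand-in for the tuner)
    t = box_step_cap(theta, v, lo, hi)   # time-varying gain induced by the box
    theta = theta + t * v
```

Because the box is convex, any shrunken step from a feasible point along the capped direction remains feasible, which is the convexity argument the abstract alludes to; the equality constraint is preserved separately by restricting the search direction to the null space of A.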

