On the Softplus Penalty for Constrained Convex Optimization
Published 21 May 2023 in math.OC (arXiv:2305.12603v1)
Abstract: We study a new penalty reformulation of constrained convex optimization based on the softplus penalty function. By analyzing the solution path of the reformulation with respect to the smoothness parameter, we develop novel, tight upper bounds on the objective-value gap and the constraint violation of the reformulation's solutions. We then use these bounds to analyze the complexity of applying gradient methods to the reformulation, an approach that is advantageous when the number of constraints is large.
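To make the idea concrete, here is a minimal sketch of the kind of reformulation the abstract describes, under stated assumptions: the constraint g(x) ≤ 0 is replaced by a smooth softplus penalty softplus_mu(t) = mu·log(1 + exp(t/mu)), which approaches max(0, t) as the smoothness parameter mu → 0, and the penalized objective is minimized by plain gradient descent. The toy problem, penalty weight `rho`, smoothness `mu`, and step size are illustrative choices, not the paper's actual construction or parameter settings.

```python
import math

# Hypothetical toy instance: minimize f(x) = (x - 2)^2 subject to x <= 1,
# i.e. constraint g(x) = x - 1 <= 0 in standard form.
# Softplus penalty (smooth surrogate for the hinge max(0, g(x))):
#   softplus_mu(t) = mu * log(1 + exp(t / mu))  ->  max(0, t) as mu -> 0.

def softplus(t, mu):
    # Numerically stable evaluation of mu * log(1 + exp(t / mu)).
    z = t / mu
    if z > 30:                      # exp(z) would overflow; log(1 + e^z) ~ z
        return t
    return mu * math.log1p(math.exp(z))

def softplus_grad(t, mu):
    # d/dt softplus_mu(t) = sigmoid(t / mu)
    z = t / mu
    if z > 30:
        return 1.0
    if z < -30:
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def solve_penalized(rho=10.0, mu=0.01, lr=0.005, iters=5000):
    # Gradient descent on the penalized objective
    #   F(x) = (x - 2)^2 + rho * softplus_mu(x - 1).
    # lr is chosen below 2/L, where L = 2 + rho/(4*mu) bounds F''.
    x = 0.0
    for _ in range(iters):
        grad = 2.0 * (x - 2.0) + rho * softplus_grad(x - 1.0, mu)
        x -= lr * grad
    return x

x_star = solve_penalized()
# The exact constrained minimizer is x = 1; the penalized solution lands
# slightly inside the feasible region, within O(mu) of it.
print(x_star)
```

The gap between `x_star` and the true constrained minimizer shrinks with `mu`, which is the kind of dependence the paper's upper bounds quantify; the smoothness parameter also controls the gradient Lipschitz constant, and hence the step size and iteration count, which is the complexity trade-off the abstract refers to.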