Proximal Activation of Smooth Functions in Splitting Algorithms for Convex Image Recovery
Abstract: Structured convex optimization problems typically involve a mix of smooth and nonsmooth functions. The common practice is to activate the smooth functions via their gradient and the nonsmooth ones via their proximity operator. We show that, although intuitively natural, this approach is not necessarily the most efficient numerically and that, in particular, activating all the functions proximally may be advantageous. To make this viewpoint viable computationally, we derive a number of new examples of proximity operators of smooth convex functions arising in applications. A novel variational model to relax inconsistent convex feasibility problems is also investigated within the proposed framework. Several numerical applications to image recovery are presented to compare the behavior of fully proximal versus mixed proximal/gradient implementations of several splitting algorithms.
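To illustrate the distinction the abstract draws, here is a minimal sketch contrasting a gradient (forward, explicit) step with a proximal (backward, implicit) step on the same smooth function. The specific choice f(x) = (α/2)·‖x‖², for which the proximity operator has a simple closed form, is an illustrative assumption and not taken from the paper.

```python
import numpy as np

def prox_quadratic(v, gamma, alpha):
    """Proximity operator of f(x) = (alpha/2)*||x||^2:
    prox_{gamma f}(v) = argmin_x f(x) + (1/(2*gamma))*||x - v||^2,
    which for this quadratic has the closed form v / (1 + gamma*alpha)."""
    return v / (1.0 + gamma * alpha)

def gradient_step(v, gamma, alpha):
    """Explicit gradient step on the same f: v - gamma * grad f(v)."""
    return v - gamma * alpha * v

v = np.array([2.0, -1.0])
gamma, alpha = 0.5, 1.0

p = prox_quadratic(v, gamma, alpha)   # implicit update; stable for any gamma > 0
g = gradient_step(v, gamma, alpha)    # explicit update; needs a step-size bound
```

Both steps activate the same smooth function, but the proximal step is unconditionally stable in the step size, which hints at why activating smooth terms proximally can pay off numerically, as the paper investigates.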