
Accelerated forward-backward method with fast convergence rate for nonsmooth convex optimization beyond differentiability

Published 4 Oct 2021 in math.OC (arXiv:2110.01454v1)

Abstract: We propose an accelerated forward-backward method with a fast convergence rate for finding a minimizer of a decomposable nonsmooth convex function over a closed convex set, and name it the smoothing accelerated proximal gradient (SAPG) algorithm. The proposed algorithm combines the smoothing method with the proximal gradient algorithm using the extrapolation coefficient $\frac{k-1}{k+\alpha-1}$, where $\alpha>3$. The updating rule for the smoothing parameter $\mu_k$ guarantees a global convergence rate of $o(\ln^{\sigma}k/k)$, with $\sigma\in(\frac{1}{2},1]$, on the objective function values. Moreover, we prove that the sequence of iterates converges to an optimal solution of the problem. Furthermore, we introduce an error term into the SAPG algorithm to obtain an inexact smoothing accelerated proximal gradient algorithm, and we establish the same convergence results as for the SAPG algorithm under a summability condition on the errors. Finally, numerical experiments show the effectiveness and efficiency of the proposed algorithm.
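The abstract names two concrete algorithmic ingredients: the extrapolation coefficient $\frac{k-1}{k+\alpha-1}$ with $\alpha>3$, and a decreasing smoothing parameter $\mu_k$. The sketch below shows how these pieces typically fit together in a smoothing accelerated proximal gradient iteration. It is not the paper's algorithm: the $\mu_k$ schedule, the step-size rule, and the helper names (`grad_smooth`, `prox_g`, `sapg`) are all assumptions made for illustration.

```python
import numpy as np

def sapg(grad_smooth, prox_g, x0, L, alpha=4.0, mu0=1.0, sigma=1.0, iters=500):
    """Minimal SAPG-style sketch, assuming the objective splits as
    f_mu(x) + g(x): f_mu is a smooth approximation (parameter mu) of the
    nonsmooth term, and g (e.g. the indicator of the feasible set) is
    handled via its proximal operator. The mu_k schedule below is a
    hypothetical placeholder, not the paper's update rule."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        mu = mu0 / k**sigma                 # assumed decreasing smoothing parameter
        beta = (k - 1) / (k + alpha - 1)    # extrapolation coefficient from the abstract
        y = x + beta * (x - x_prev)         # extrapolated point
        step = mu / L                       # smoothed gradient is typically (L/mu)-Lipschitz,
                                            # so the step size scales with mu
        x_prev, x = x, prox_g(y - step * grad_smooth(y, mu), step)
    return x

# Example (hypothetical): minimize ||Ax - b||_1 over the box [-1, 1]^n,
# with the l1 term replaced by its Huber smoothing.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

def grad_smooth(x, mu):
    # Gradient of sum_i huber_mu((Ax - b)_i), where huber_mu smooths |t|;
    # its derivative is clip(t / mu, -1, 1).
    r = A @ x - b
    return A.T @ np.clip(r / mu, -1.0, 1.0)

proj_box = lambda v, _: np.clip(v, -1.0, 1.0)   # prox of the box indicator
x_star = sapg(grad_smooth, proj_box, np.zeros(5), L=np.linalg.norm(A, 2)**2)
```

Here the feasible set is handled by projection, matching the abstract's setting of minimization over a closed convex set; any closed convex set with a computable projection could play the same role.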
