Randomized dual proximal gradient for large-scale distributed optimization

Published 19 Sep 2016 in math.OC and cs.DC (arXiv:1609.05713v1)

Abstract: In this paper we consider distributed optimization problems in which the cost function is separable (i.e., a sum of possibly non-smooth functions all sharing a common variable) and can be split into a strongly convex term and a convex one. The second term is typically used to encode constraints or to regularize the solution. We propose an asynchronous, distributed optimization algorithm over an undirected topology, based on a proximal gradient update on the dual problem. We show that by means of a proper choice of primal variables, the dual problem is separable and the dual variables can be stacked into separate blocks. This allows us to show that a distributed gossip update can be obtained by means of a randomized block-coordinate proximal gradient on the dual function.
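The core update the abstract describes, a proximal gradient step applied to one randomly chosen block of the stacked dual variable, can be sketched in a few lines. The sketch below is illustrative only: the quadratic coupling term and the per-block l1 penalty are stand-in assumptions, not the paper's actual dual function, and the names `randomized_block_prox_grad` and `prox_l1` are hypothetical.

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def randomized_block_prox_grad(grad_block, prox_block, lam, step, n_iters, rng):
    """Randomized block-coordinate proximal gradient: at each iteration one
    block of the stacked dual variable is drawn uniformly at random and
    updated; all other blocks are left untouched."""
    lam = [b.copy() for b in lam]
    for _ in range(n_iters):
        i = rng.integers(len(lam))                    # block that "wakes up"
        g = grad_block(lam, i)                        # gradient of the smooth dual term w.r.t. block i
        lam[i] = prox_block(lam[i] - step * g, step)  # prox step on block i only
    return lam

# Toy instance (an assumption, not the paper's dual): the smooth term
# d(lam) = 0.5 * ||sum_i lam_i - b||^2 couples the blocks, and each block
# carries an l1 penalty. The block sum then converges to prox_l1(b, 1).
rng = np.random.default_rng(0)
b = np.array([1.0, -2.0, 0.5])
grad_block = lambda lam, i: sum(lam) - b              # same residual for every block here
lam0 = [np.zeros(3) for _ in range(4)]
lam_star = randomized_block_prox_grad(grad_block, prox_l1, lam0, 0.2, 2000, rng)
print(np.round(sum(lam_star), 3))                     # approx. prox_l1(b, 1.0)
```

In the paper's asynchronous setting the randomly drawn block corresponds to a node (together with its neighbors) waking up, which is what makes the update a gossip step; the uniform draw above simulates that centrally for illustration.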

Citations (5)
