Bregman Proximal Method for Efficient Communications under Similarity
Abstract: We propose a novel stochastic distributed method for both monotone and strongly monotone variational inequalities with a Lipschitz operator and proper convex regularizers, arising in applications ranging from game theory to adversarial training. By exploiting similarity, our algorithm overcomes the communication bottleneck that is a major issue in distributed optimization. The proposed method enjoys optimal communication complexity. All existing distributed algorithms achieving the lower bounds under the similarity condition essentially rely on the Euclidean setup. In contrast, our method is built upon Bregman proximal maps and is compatible with an arbitrary problem geometry, thereby filling an existing gap in this area of research. Our theoretical results are confirmed by numerical experiments on a stochastic matrix game.
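To illustrate the kind of Bregman proximal map the abstract refers to, the sketch below shows a generic (single-node) mirror-prox iteration on a matrix game over probability simplices, using the negative-entropy distance-generating function, for which the Bregman proximal step reduces to a multiplicative-weights update. This is a minimal illustrative sketch, not the paper's distributed algorithm; the step size, game size, and iteration count are arbitrary choices for the example.

```python
import numpy as np

def bregman_prox_simplex(x, grad, step):
    """Bregman proximal step on the probability simplex with the
    negative-entropy generator: an exponentiated-gradient update."""
    logits = np.log(x) - step * grad
    logits -= logits.max()          # shift for numerical stability
    y = np.exp(logits)
    return y / y.sum()

# Illustrative mirror-prox loop for the matrix game min_x max_y x^T A y
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
x = np.full(5, 0.2)                 # uniform starting points
y = np.full(5, 0.2)
step = 0.1
for _ in range(200):
    # extrapolation: proximal step at the current point
    x_half = bregman_prox_simplex(x, A @ y, step)
    y_half = bregman_prox_simplex(y, -A.T @ x, step)
    # update: proximal step using the operator at the extrapolated point
    x = bregman_prox_simplex(x, A @ y_half, step)
    y = bregman_prox_simplex(y, -A.T @ x_half, step)

# duality gap of the current pair (always nonnegative, shrinks as O(1/N))
gap = float(np.max(A.T @ x) - np.min(A @ y))
```

With the entropy geometry the per-step cost is closed-form, which is the practical advantage of matching the Bregman map to the problem geometry rather than forcing a Euclidean projection onto the simplex.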