
Distributed Online Randomized Gradient-Free Optimization with Compressed Communication

Published 5 Dec 2025 in math.OC (arXiv:2512.05775v1)

Abstract: This paper addresses two fundamental challenges in distributed online convex optimization: communication efficiency and optimization under limited feedback. We propose a unified framework named Online Compressed Gradient Tracking (OCGT), which includes two variants: One-point Bandit Feedback (OCGT-BF) and Stochastic Gradient Feedback (OCSGT). The proposed algorithms combine data compression with either gradient-free or stochastic-gradient optimization techniques over distributed networks. The framework incorporates a compression scheme with an error-compensation mechanism that reduces communication overhead while preserving convergence guarantees. Unlike traditional approaches that assume perfect communication and full gradient access, OCGT operates effectively under practical constraints by combining gradient-like tracking with one-point bandit or stochastic gradient estimation. We provide a theoretical analysis establishing dynamic regret bounds for both variants. Finally, extensive experiments validate that OCGT achieves low dynamic regret while significantly reducing communication requirements.
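The two building blocks the abstract names — error-compensated compression of exchanged messages and one-point bandit (gradient-free) estimation — can be sketched generically as below. This is a minimal illustration under standard textbook forms of these techniques, not the paper's exact OCGT update; the top-k compressor and all function names are illustrative assumptions.

```python
import numpy as np

def topk_compress(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

class ErrorCompensatedCompressor:
    """Compress outgoing messages, carrying the compression error
    forward so that it is re-injected into later messages."""
    def __init__(self, dim, k):
        self.residual = np.zeros(dim)
        self.k = k

    def compress(self, v):
        corrected = v + self.residual        # add back previously dropped mass
        msg = topk_compress(corrected, self.k)
        self.residual = corrected - msg      # remember what was dropped now
        return msg

def one_point_gradient_estimate(f, x, delta, rng):
    """One-point bandit estimator (d/delta) * f(x + delta*u) * u,
    with u drawn uniformly from the unit sphere; unbiased for the
    gradient of a smoothed version of f."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return (d / delta) * f(x + delta * u) * u
```

A key property of error compensation, visible in the sketch, is that the transmitted message plus the stored residual always equals the corrected input, so no information is permanently discarded — it is only delayed.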
