Distributed Online Randomized Gradient-Free Optimization with Compressed Communication
Abstract: This paper addresses two fundamental challenges in distributed online convex optimization: communication efficiency and optimization under limited feedback. We propose a unified framework, Online Compressed Gradient Tracking (OCGT), with two variants: one using one-point bandit feedback (OCGT-BF) and one using stochastic gradient feedback (OCSGT). The algorithms combine data compression with gradient-free or stochastic gradient optimization over distributed networks. The framework incorporates a compression scheme with an error compensation mechanism to reduce communication overhead while preserving convergence guarantees. Unlike traditional approaches that assume perfect communication and full gradient access, OCGT operates effectively under practical constraints by combining gradient tracking with one-point bandit or stochastic gradient estimates. We establish dynamic regret bounds for both variants. Finally, extensive experiments validate that OCGT achieves low dynamic regret while significantly reducing communication requirements.
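To make the abstract's ingredients concrete, below is a minimal Python sketch of the three mechanisms it names: error-compensated compression (here top-k, one common choice), a one-point bandit gradient estimate, and a gradient-tracking update over a mixing matrix. This is an illustrative reconstruction under our own assumptions (function and parameter names such as `top_k`, `one_point_grad`, `ocgt_bf_step`, `eta`, `delta` are hypothetical), not the paper's exact algorithm, which may also compress the trackers and handle time-varying losses.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest.
    One common compression operator; the paper may use a different one."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def one_point_grad(f, x, delta, rng):
    """One-point bandit estimate (d/delta) * f(x + delta*u) * u,
    with u uniform on the unit sphere. High variance: step sizes
    must be small in practice."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return (d / delta) * f(x + delta * u) * u

def ocgt_bf_step(fs, X, Y, E, G_prev, W, eta, delta, k, rng):
    """One illustrative round for all n agents (rows of X).
    W is a doubly stochastic mixing matrix of the network."""
    n, _ = X.shape
    # Error-compensated compression of the local iterates: only the
    # compressed messages Q would be transmitted to neighbors.
    Q = np.stack([top_k(X[i] + E[i], k) for i in range(n)])
    E = X + E - Q                      # residual carried to the next round
    X_new = W @ Q - eta * Y            # mix compressed states, then descend
    # Fresh one-point bandit estimates of the local gradients.
    G = np.stack([one_point_grad(fs[i], X_new[i], delta, rng)
                  for i in range(n)])
    Y = W @ Y + G - G_prev             # track the average gradient estimate
    return X_new, Y, E, G

# Toy usage: 4 agents on a complete graph with static quadratic losses
# (the online setting would swap fs at every round).
rng = np.random.default_rng(0)
n, d = 4, 10
W = np.full((n, n), 1.0 / n)
X = rng.standard_normal((n, d))
E = np.zeros((n, d))
targets = rng.standard_normal((n, d))
fs = [lambda x, t=targets[i]: 0.5 * np.sum((x - t) ** 2) for i in range(n)]
G = np.stack([one_point_grad(fs[i], X[i], 0.5, rng) for i in range(n)])
Y = G.copy()                           # tracker initialized at first estimates
for _ in range(200):
    X, Y, E, G = ocgt_bf_step(fs, X, Y, E, G, W,
                              eta=1e-3, delta=0.5, k=3, rng=rng)
```

The stochastic-gradient variant (OCSGT) would replace `one_point_grad` with a noisy gradient oracle; everything else in the sketch stays the same, which is what makes the framework "unified" in the abstract's sense.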