
Consensus Needs Broadcast in Noiseless Models but can be Exponentially Easier in the Presence of Noise

Published 15 Jul 2018 in cs.DC (arXiv:1807.05626v1)

Abstract: Consensus and Broadcast are two fundamental problems in distributed computing, whose solutions have several applications. Intuitively, Consensus should be no harder than Broadcast, and this can be rigorously established in several models. Can Consensus be easier than Broadcast? In models that allow noiseless communication, we prove a reduction of (a suitable variant of) Broadcast to binary Consensus that preserves the communication model and all complexity parameters such as randomness, number of rounds, and communication per round, at the cost of a loss in the protocol's success probability. Using this reduction, we obtain, among other applications, the first logarithmic lower bound on the number of rounds needed to achieve Consensus in the uniform GOSSIP model on the complete graph. The lower bound is tight and, in this model, Consensus and Broadcast are equivalent. We then turn to distributed models with noisy communication channels, which have been studied in the context of some bio-inspired systems. In such models, only one noisy bit is exchanged when a communication channel is established between two nodes, so one cannot easily simulate a noiseless protocol by using error-correcting codes. An $\Omega(\epsilon^{-2} n)$ lower bound on the number of rounds needed for Broadcast was proved by Boczkowski et al. [PLOS Comp. Bio. 2018] in one such model (noisy uniform PULL, where $\epsilon$ is a parameter that measures the amount of noise). In that model, we prove a new $\Theta(\epsilon^{-2} n \log n)$ bound for Broadcast and a $\Theta(\epsilon^{-2} \log n)$ bound for binary Consensus, thus establishing an exponential gap between the number of rounds necessary for Consensus versus Broadcast.
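
To give some intuition for where the $\Theta(\epsilon^{-2} \log n)$ scaling for binary Consensus comes from, below is a minimal Python simulation of the noisy uniform PULL model described in the abstract. This is a hedged sketch, not the authors' protocol: the names noisy_pull_round and majority_decode_sim are invented for illustration, and the instance is deliberately degenerate (all nodes start with the same bit). It only demonstrates the standard concentration argument that $\Theta(\epsilon^{-2} \log n)$ noisy one-bit pulls let every node recover a common bit by majority vote.

import math
import random


def noisy_pull_round(opinions, eps, rng):
    # One round of noisy uniform PULL on the complete graph: every node
    # pulls the bit of a uniformly random node; the bit arrives flipped
    # with probability 1/2 - eps (a binary symmetric channel).
    n = len(opinions)
    received = []
    for _ in range(n):
        bit = opinions[rng.randrange(n)]
        if rng.random() < 0.5 - eps:
            bit = 1 - bit
        received.append(bit)
    return received


def majority_decode_sim(n=300, eps=0.1, c=8, seed=0):
    # Toy protocol (NOT the paper's): all nodes hold the same bit, and
    # each node majority-votes over T = c * eps^-2 * ln(n) noisy pulled
    # samples. By a Hoeffding bound, each node errs with probability at
    # most exp(-2 * eps^2 * T) = n^(-2c), so T = Theta(eps^-2 log n)
    # rounds suffice for all n nodes simultaneously w.h.p.
    rng = random.Random(seed)
    opinions = [1] * n                        # the bit to be recovered
    T = math.ceil(c * eps ** -2 * math.log(n))
    ones = [0] * n                            # per-node count of received 1s
    for _ in range(T):
        for i, bit in enumerate(noisy_pull_round(opinions, eps, rng)):
            ones[i] += bit
    decisions = [1 if 2 * k > T else 0 for k in ones]
    return T, all(d == 1 for d in decisions)


if __name__ == "__main__":
    T, agreed = majority_decode_sim()
    print(f"rounds = {T}, all nodes recovered the bit: {agreed}")

The paper's actual Consensus protocols must additionally handle arbitrary initial opinions; the sketch only conveys why roughly $\epsilon^{-2} \log n$ samples are enough to push a single bit through this much noise, and why fewer cannot be (each noisy pull carries only $O(\epsilon^2)$ bits of information about the sampled opinion).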
