Communication-efficient Quantum Algorithm for Distributed Machine Learning

Published 11 Sep 2022 in quant-ph (arXiv:2209.04888v1)

Abstract: The growing demands of remote detection and the increasing volume of training data make distributed machine learning under communication constraints a critical issue. This work provides a communication-efficient quantum algorithm that tackles two traditional machine learning problems, least-squares fitting and softmax regression, in the scenario where the data set is distributed across two parties. Our quantum algorithm finds the model parameters with a communication complexity of $O(\frac{\log_2(N)}{\epsilon})$, where $N$ is the number of data points and $\epsilon$ is the bound on parameter errors. Compared to classical algorithms and other quantum algorithms that achieve the same output task, our algorithm provides a communication advantage in its scaling with the data volume. The building block of our algorithm, quantum-accelerated estimation of the distributed inner product and Hamming distance, could be further applied to various tasks in distributed machine learning to reduce communication.
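The communication advantage claimed in the abstract can be made concrete with a back-of-the-envelope comparison. The sketch below contrasts a naive classical baseline, in which one party ships its raw data to the other (communication linear in $N$), with the paper's reported $O(\log_2(N)/\epsilon)$ scaling. The cost functions and constant factors here are illustrative assumptions, not the paper's actual protocol.

```python
import math

def classical_comm_cost(n_points, dim=1):
    # Naive classical baseline: one party transmits its raw data set,
    # so communication grows linearly with the number of data points N.
    return n_points * dim

def quantum_comm_cost(n_points, epsilon):
    # Scaling reported in the abstract: O(log2(N) / epsilon) communicated
    # qubits to estimate the model parameters to error epsilon.
    # Constant factors are omitted; this is illustrative only.
    return math.log2(n_points) / epsilon

if __name__ == "__main__":
    for n in (10**3, 10**6, 10**9):
        c = classical_comm_cost(n)
        q = quantum_comm_cost(n, epsilon=0.01)
        print(f"N={n:>10}: classical ~ {c:.0e}, quantum ~ {q:.0f}")
```

Even at a modest target error of $\epsilon = 0.01$, the logarithmic dependence on $N$ dominates: growing the data set by a factor of $10^6$ only doubles the quantum communication cost in this model, while the classical baseline grows a million-fold.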

Citations (8)
