
Classical algorithm for simulating experimental Gaussian boson sampling

Published 6 Jun 2023 in quant-ph (arXiv:2306.03709v2)

Abstract: Gaussian boson sampling is a promising candidate for demonstrating experimental quantum advantage. While there is evidence that noiseless Gaussian boson sampling is hard to simulate efficiently on a classical computer, current Gaussian boson sampling experiments inevitably suffer from photon loss and other noise. Despite their high photon loss rates and the presence of noise, these experiments are currently claimed to be hard to simulate classically with the best-known classical algorithm. In this work, we present a classical tensor-network algorithm that simulates Gaussian boson sampling and whose complexity can be significantly reduced when the photon loss rate is high. By generalizing the existing thermal-state approximation algorithm for lossy Gaussian boson sampling, the proposed algorithm achieves higher accuracy as its running time increases, in contrast to the thermal-state sampler, which attains only a fixed accuracy. This generalization enables us to simulate the largest-scale Gaussian boson sampling experiment to date using relatively modest computational resources, even though the output state of these experiments is not believed to be close to a thermal state. By demonstrating that our new classical algorithm outperforms the large-scale experiments on the benchmarks used as evidence for quantum advantage, we provide evidence that our classical sampler can simulate the ground-truth distribution better than the experiments can, which disputes the experimental quantum advantage claims.
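The fixed-accuracy baseline the abstract contrasts with is sampling from a thermal state, whose photon-number statistics per mode are geometric with success probability 1/(1 + n̄) for mean photon number n̄. The sketch below illustrates that baseline only (not the paper's tensor-network algorithm); the function name and interface are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sample_thermal_baseline(mean_photons, num_samples, rng=None):
    """Draw photon-number patterns from independent per-mode thermal states.

    Illustrative sketch of the thermal-state approximation: a thermal
    state with mean photon number nbar has
        P(n) = nbar**n / (1 + nbar)**(n + 1),  n = 0, 1, 2, ...
    which is a geometric distribution on {0, 1, ...} with success
    probability p = 1 / (1 + nbar).
    """
    rng = np.random.default_rng(rng)
    nbar = np.asarray(mean_photons, dtype=float)
    p = 1.0 / (1.0 + nbar)
    # NumPy's geometric sampler is supported on {1, 2, ...}; shift by 1.
    return rng.geometric(p, size=(num_samples, nbar.size)) - 1

# Example: 3 modes with mean photon numbers 0.5, 1.0, 2.0.
samples = sample_thermal_baseline([0.5, 1.0, 2.0], num_samples=10)
```

Because the modes are sampled independently, this baseline ignores all photon-number correlations between modes, which is exactly why its accuracy is fixed regardless of running time.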

Citations (19)
