
Second-order asymptotics for source coding, dense coding and pure-state entanglement conversions

Published 11 Mar 2014 in quant-ph (1403.2543v6)

Abstract: We introduce two variants of the information spectrum relative entropy defined by Tomamichel and Hayashi which have the particular advantage of satisfying the data-processing inequality, i.e., monotonicity under quantum operations. This property allows us to obtain one-shot bounds for various information-processing tasks in terms of these quantities. Moreover, these relative entropies have a second-order asymptotic expansion, which in turn yields tight second-order asymptotics for the optimal rates of these tasks in the i.i.d. setting. The tasks studied in this paper are fixed-length quantum source coding, noisy dense coding, entanglement concentration, pure-state entanglement dilution, and transmission of information through a classical-quantum channel. In the latter case, we retrieve the second-order asymptotics obtained by Tomamichel and Tan. Our results also yield the known second-order asymptotics of fixed-length classical source coding derived by Hayashi. The second-order asymptotics of entanglement concentration and dilution provide a refinement of the inefficiency of these protocols - a quantity which, in the case of entanglement dilution, was studied by Harrow and Lo. We show that the discrepancy between the optimal rates of these two processes at second order implies the irreversibility of entanglement concentration established by Kumagai and Hayashi. In addition, the spectral divergence rates of the Information Spectrum Approach (ISA) can be retrieved from our relative entropies in the asymptotic limit. This enables us to directly obtain the more general results of the ISA from our one-shot bounds.
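To fix ideas, a second-order asymptotic expansion of the kind discussed in the abstract typically takes the following schematic form; the first-order coefficient $C$ and the dispersion $V$ are task-dependent quantities (e.g., for classical-quantum channel coding they are the capacity and the channel dispersion of Tomamichel and Tan), and this display is an illustrative sketch rather than a statement of the paper's specific theorems:

```latex
% Schematic second-order expansion of an optimal rate.
% M^*(n, \varepsilon): optimal one-shot size for blocklength n and error \varepsilon;
% \Phi^{-1}: inverse of the standard normal cumulative distribution function.
\log M^*(n, \varepsilon)
  \;=\; n\,C \;+\; \sqrt{n\,V}\,\Phi^{-1}(\varepsilon) \;+\; O(\log n)
```

The $\sqrt{n}$ term is what distinguishes second-order analysis from the first-order (capacity) term, and its sign via $\Phi^{-1}(\varepsilon)$ captures the back-off from (or overshoot of) the asymptotic rate at finite blocklength.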
