
Petz-Rényi Information

Updated 17 January 2026
  • Petz-Rényi Information is a family of quantum Rényi-type divergences that generalizes the Umegaki relative entropy and serves as a key measure in quantum information theory.
  • It exhibits crucial properties such as nonnegativity, monotonicity under CPTP maps, additivity, and convexity, ensuring robust operational and theoretical interpretations.
  • Computational methods like mirror-descent and alternating minimization algorithms enable efficient optimization in applications ranging from channel coding to device-independent cryptography.

The Petz-Rényi information encompasses a family of quantum Rényi-type divergences and their associated mutual and conditional information measures based on the original α-Rényi divergence introduced by Dénes Petz. These quantities generalize the Umegaki relative entropy (the standard quantum relative entropy), forming a structural backbone for operational and theoretical analyses in quantum information, statistical mechanics, and quantum field theory. The Petz-Rényi formalism yields nonnegative and data-processing-monotone quantities over broad parameter ranges and captures strong converse exponents, uncertainty relations, and refined quantum correlation structures.

1. Definition and Primary Formalism

The Petz-Rényi α-relative entropy for two positive semi-definite operators ρ and σ on a Hilbert space (with appropriate support conditions) is defined for \alpha\in(0,1)\cup(1,\infty) by

D_\alpha(\rho\,\|\,\sigma) = \frac{1}{\alpha-1} \log\,\mathrm{Tr}\left[\rho^{\alpha}\,\sigma^{1-\alpha}\right]\,,

with D_\alpha = +\infty when \mathrm{supp}\,\rho \nsubseteq \mathrm{supp}\,\sigma for \alpha>1. The limiting case \alpha\to 1 recovers the Umegaki relative entropy D(\rho\|\sigma) (Berta et al., 2015; Androulakis et al., 2022).
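For finite-dimensional, full-rank states the definition above can be evaluated directly. The following minimal sketch (helper names `mpow` and `petz_renyi` are our own, assuming NumPy) computes D_\alpha by taking matrix powers in the eigenbases of ρ and σ:

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, p)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    """D_alpha(rho || sigma) for full-rank density matrices, alpha in (0,1) u (1,inf)."""
    assert alpha > 0 and alpha != 1
    t = np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real
    return np.log(t) / (alpha - 1)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])   # full-rank qubit state
sigma = np.eye(2) / 2                      # maximally mixed state

print(petz_renyi(rho, sigma, 0.5))   # nonnegative
print(petz_renyi(rho, rho, 0.5))     # ~ 0, since D_alpha(rho || rho) = 0
```

The eigendecomposition route avoids fractional powers of singular matrices only because both states are chosen full rank, matching the support condition above.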

Given a bipartite quantum state \rho_{AB}, the (single-argument, non-minimized) Petz-Rényi mutual information is

I_{\alpha}(A:B)_\rho = D_\alpha\left(\rho_{AB}\;\|\;\rho_A\otimes\rho_B\right)\,.

Doubly minimized and singly minimized variants are defined via optimization over positive-definite product states \sigma_A\otimes\tau_B:

I_\alpha^{\downarrow\downarrow}(A\!:\!B)_\rho = \inf_{\sigma_A,\tau_B} D_\alpha(\rho_{AB}\,\|\,\sigma_A\otimes\tau_B)\,,\qquad I_\alpha^{\uparrow\downarrow}(A\!:\!B)_\rho = \inf_{\tau_B} D_\alpha(\rho_{AB}\,\|\,\rho_A\otimes\tau_B)\,.

These measures collapse to the standard quantum mutual information at \alpha=1. Conditional and multipartite Petz-Rényi measures are defined analogously for functionals that are linear combinations of von Neumann entropies with integer coefficients (Berta et al., 2015).
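As a small numerical illustration (a sketch; the helper names are our own), the non-minimized mutual information of a two-qubit isotropic state can be evaluated against the product of its marginals, which here are both maximally mixed:

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, p)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    t = np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real
    return np.log(t) / (alpha - 1)

# Two-qubit isotropic state: a Bell state mixed with white noise (full rank).
phi = np.zeros(4); phi[0] = phi[3] = 2 ** -0.5
rho_AB = 0.5 * np.outer(phi, phi) + 0.5 * np.eye(4) / 4

# Both marginals of this state are maximally mixed.
rho_A = rho_B = np.eye(2) / 2

I_alpha = petz_renyi(rho_AB, np.kron(rho_A, rho_B), 0.75)
print(I_alpha)  # strictly positive: the state is correlated
```

For a product state the same quantity vanishes, consistent with the mutual-information interpretation.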

2. Core Properties and Monotonicity

For \alpha\in[0,2], D_\alpha(\rho\|\sigma) is nonnegative with equality if and only if \rho=\sigma; it is monotone under completely positive trace-preserving (CPTP) maps: D_\alpha(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)) \leq D_\alpha(\rho\,\|\,\sigma) (Berta et al., 2015; Kudler-Flam, 2022; Androulakis et al., 2022).

The mapping \alpha \mapsto D_\alpha(\rho\|\sigma) is nondecreasing for all density matrices, and (\alpha-1)D_\alpha(\rho\|\sigma) is convex in \alpha (Androulakis et al., 2022). Additivity holds on tensor products:

D_\alpha(\rho_1\otimes\rho_2\,\|\,\sigma_1\otimes\sigma_2) = D_\alpha(\rho_1\|\sigma_1) + D_\alpha(\rho_2\|\sigma_2)\,.

The Petz-Rényi mutual and conditional information inherit these properties, including symmetry under subsystem permutation, additivity across independent systems, and data-processing inequalities under local CPTP maps (Burri, 2024; Burri, 7 Jul 2025; Cheng et al., 2018).
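Both data processing and additivity are easy to verify numerically. The sketch below (helper names ours; the qubit depolarizing channel stands in as a representative CPTP map) checks them for \alpha = 1.5:

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, p)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    t = np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real
    return np.log(t) / (alpha - 1)

def depolarize(X, q):
    """Qubit depolarizing channel, a simple CPTP map."""
    return (1 - q) * X + q * np.trace(X) * np.eye(2) / 2

rho = np.array([[0.8, 0.1], [0.1, 0.2]])
sigma = np.array([[0.4, -0.1], [-0.1, 0.6]])
alpha = 1.5

d_before = petz_renyi(rho, sigma, alpha)
d_after = petz_renyi(depolarize(rho, 0.3), depolarize(sigma, 0.3), alpha)
print(d_after <= d_before)               # data processing for alpha in [0, 2]

d_pair = petz_renyi(np.kron(rho, rho), np.kron(sigma, sigma), alpha)
print(np.isclose(d_pair, 2 * d_before))  # additivity on tensor products
```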

For the conditional mutual information, it is conjectured that \alpha \mapsto I^{P}_\alpha(A;B|C)_\rho is nondecreasing; regime-specific proofs and numerical verification support the conjecture, but a general proof remains open (Berta et al., 2015).

3. Computational Methods and Optimization Algorithms

The Petz-Rényi divergence and mutual information arise in convex optimization problems, especially in quantum channel coding and hypothesis testing. Optimizing over input distributions or reference states (σ) typically yields functions that are convex or strongly convex in the parameters.

Mirror-descent algorithms, including exponentiated-gradient methods generalizing the Blahut-Arimoto scheme, exhibit provable sublinear and linear convergence rates for maximizing the Petz-Rényi capacity and mutual information of classical-quantum channels. These methods are viable due to Hölder-smoothness and strong convexity properties of the objective (Lai et al., 15 Jan 2026; Chu et al., 10 Jan 2026; Cheng et al., 2018). For the doubly minimized mutual information, alternating minimization algorithms exploiting quantum Sibson identities yield guaranteed sublinear (for \alpha\in(1/2,1)) and linear (for \alpha\in(1,2]) convergence, with unique global minimizers (Burri, 7 Jul 2025).
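A minimal alternating-minimization sketch for the doubly minimized mutual information is given below. It assumes a Sibson-type closed form for each partial minimization — the optimal marginal is taken proportional to M^{1/\alpha}, where M is a sandwiched partial trace of \rho_{AB}^{\alpha} — and this update rule, like all helper names, is our own simplification rather than the algorithm of the cited papers:

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian PSD matrix via eigendecomposition (clipped for safety)."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(np.clip(w, 1e-300, None), p)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    t = np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real
    return np.log(t) / (alpha - 1)

def ptrace(X, keep):
    """Partial trace of a two-qubit (4x4) operator; keep=0 retains A, keep=1 retains B."""
    X4 = X.reshape(2, 2, 2, 2)
    return np.trace(X4, axis1=1, axis2=3) if keep == 0 else np.trace(X4, axis1=0, axis2=2)

alpha = 0.75
phi = np.zeros(4); phi[0] = phi[3] = 2 ** -0.5        # Bell state |Phi+>
rho = 0.6 * np.outer(phi, phi) + 0.4 * np.eye(4) / 4  # full-rank isotropic state
rho_pow = mpow(rho, alpha)

sigma = np.eye(2) / 2  # iterate for the A-marginal
tau = np.eye(2) / 2    # iterate for the B-marginal
vals = []
for _ in range(30):
    # Update sigma_A: assumed Sibson-type optimizer sigma ~ M^(1/alpha).
    K = np.kron(np.eye(2), mpow(tau, (1 - alpha) / 2))
    M = ptrace(K @ rho_pow @ K, keep=0)
    sigma = mpow(M, 1 / alpha); sigma /= np.trace(sigma)
    # Update tau_B symmetrically.
    K = np.kron(mpow(sigma, (1 - alpha) / 2), np.eye(2))
    M = ptrace(K @ rho_pow @ K, keep=1)
    tau = mpow(M, 1 / alpha); tau /= np.trace(tau)
    vals.append(petz_renyi(rho, np.kron(sigma, tau), alpha))

print(vals[-1])  # approximate doubly minimized Petz-Renyi mutual information
```

Because each step exactly minimizes the objective over one marginal with the other fixed, the recorded values are nonincreasing, which is a useful sanity check on the iteration.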

In continuous-variable settings and infinite-dimensional spaces, spectral and pushforward-of-observables formulations remain valid, and support conditions for the finiteness of D_\alpha are explicitly characterized (Androulakis et al., 2022; Androulakis et al., 2023).

4. Applications: Quantum Hypothesis Testing, Coding, and Device Independence

Petz-Rényi information measures critically determine strong converse exponents in quantum hypothesis testing and classical-quantum channel coding problems. The auxiliary function E_0^P(s,P) = s\,I^P_{1/(1+s)}(P, \mathscr{W}) is used in the minimax expressions for channel-coding exponents, and its concavity in s is now established in previously unresolved parameter domains (Cheng et al., 2018).

In device-independent quantum cryptography and key distribution, the α-Rényi entropy accumulation theorem and variational semidefinite programming bounds involving Petz-Rényi divergences enable sharper finite-size key-rate analyses and enhanced noise tolerance beyond von Neumann approaches (Hahn et al., 2024).

For one-shot entanglement transmission, the entanglement fidelity achieved by the Petz decoder equals the negative exponent of the singly minimized Petz-Rényi mutual information of order 1/2 between the reference and environment across the channel's Stinespring dilation (Burri, 24 Feb 2025). In composite binary state discrimination, the direct error exponent is precisely characterized by the doubly minimized Petz-Rényi mutual information in the interval \alpha\in(1/2,1) (Burri, 2024; Burri, 25 Feb 2025).

5. Quantum Field Theory and Many-Body Physics

The Petz-Rényi mutual information in quantum field theory, as established via replica path-integral techniques, is a bona fide measure of quantum correlations satisfying nonnegativity, monotonicity under local operations, and ultraviolet finiteness in the continuum limit. In 1+1D conformal field theories, explicit expressions involve twist operators and replica-analytic continuation, bounding correlation functions and exhibiting excellent agreement with lattice numerics (Kudler-Flam, 2022).

In the modular-theoretic QFT context, the Petz-Rényi entropy generalizes the Araki-Uhlmann formula and captures genuinely quantum corrections—depending on both symmetric and antisymmetric parts of two-point functions—beyond the Umegaki entropy (Fröb et al., 2024).

6. Symmetric Petz-Rényi Relations and Uncertainty Principles

The symmetric Petz-Rényi relative entropy \widetilde D_\alpha(\rho,\sigma) = \tfrac{1}{2}\left[D_\alpha(\rho\|\sigma)+D_\alpha(\sigma\|\rho)\right] yields tight inequalities interpolating between classical Pinsker and Holevo-type bounds, capturing the trade-off between fidelity and trace distance. A generalized uncertainty relation provides explicit trade-offs between mean differences and averaged variances of observables under arbitrary quantum states, unifying quantum, classical, and thermodynamic uncertainty relations (Salazar, 2024).

At \alpha=1/2, the symmetric version equals the negative logarithm of the Holevo affinity, with tight trace-distance bounds improving on the asymmetric case. Monotonicity and data-processing properties are preserved for \alpha\in[0,1], while the problem remains open for \alpha>1.
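In the normalization of the definition above, D_{1/2}(\rho\|\sigma) = -2\log\mathrm{Tr}[\sqrt{\rho}\sqrt{\sigma}], so at \alpha=1/2 the symmetric quantity is minus the log of the Holevo affinity \mathrm{Tr}[\sqrt{\rho}\sqrt{\sigma}] up to the factor of two fixed by that convention. A quick numerical check (helper names ours):

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, p)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    t = np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real
    return np.log(t) / (alpha - 1)

rho = np.array([[0.6, 0.2], [0.2, 0.4]])
sigma = np.array([[0.5, -0.1], [-0.1, 0.5]])

sym = 0.5 * (petz_renyi(rho, sigma, 0.5) + petz_renyi(sigma, rho, 0.5))
affinity = np.trace(mpow(rho, 0.5) @ mpow(sigma, 0.5)).real  # Holevo affinity
print(np.isclose(sym, -2 * np.log(affinity)))
```

Note that \mathrm{Tr}[\sqrt{\rho}\sqrt{\sigma}] is symmetric in its arguments by cyclicity of the trace, which is why the symmetrization is trivial exactly at \alpha = 1/2.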

7. Comparison to Alternative Rényi Generalizations and "Pretty Good" Quantities

Apart from the Petz (standard) Rényi divergence, minimal/sandwiched and log-Euclidean quantum Rényi divergences are also in use, each with distinctive operational meanings. The "pretty good" measurement and fidelity constructions are linked to the Petz divergence at specific orders (\alpha=1/2), with explicit optimality conditions and performance bounds (Iten et al., 2016).

The Petz divergence typically has more relaxed support constraints for \alpha<1 and coincides with the sandwiched divergence only when the states commute or in limiting values of \alpha. The Araki-Lieb-Thirring inequality and its reverse bound the gap between the Petz and sandwiched divergences, providing a precise operational comparison (Iten et al., 2016).
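The ordering between the two families follows from the Araki-Lieb-Thirring inequality (the sandwiched divergence never exceeds the Petz divergence), with equality for commuting states. A numerical sketch (helper names ours):

```python
import numpy as np

def mpow(A, p):
    """Power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.power(w, p)) @ V.conj().T

def petz(rho, sigma, alpha):
    return np.log(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real) / (alpha - 1)

def sandwiched(rho, sigma, alpha):
    S = mpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log(np.trace(mpow(S @ rho @ S, alpha)).real) / (alpha - 1)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.4, -0.1], [-0.1, 0.6]])
alpha = 2.0
print(sandwiched(rho, sigma, alpha) <= petz(rho, sigma, alpha))  # Araki-Lieb-Thirring

# Equality when rho and sigma commute:
rho_c = np.diag([0.7, 0.3]); sigma_c = np.diag([0.4, 0.6])
print(np.isclose(sandwiched(rho_c, sigma_c, alpha), petz(rho_c, sigma_c, alpha)))
```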

Table: Fundamental Properties of Petz-Rényi α-Relative Entropy

Property | Parameter regime | Statement
Positivity | all \alpha | D_\alpha(\rho\|\sigma) \geq 0, equality iff \rho=\sigma
Data processing (CPTP) | \alpha\in[0,2] | D_\alpha(\mathcal{E}(\rho)\|\mathcal{E}(\sigma)) \leq D_\alpha(\rho\|\sigma)
Additivity | all \alpha | D_\alpha(\rho_1\otimes\rho_2\|\sigma_1\otimes\sigma_2) = D_\alpha(\rho_1\|\sigma_1)+D_\alpha(\rho_2\|\sigma_2)
Monotonicity in \alpha | all \alpha | \alpha \mapsto D_\alpha(\rho\|\sigma) is nondecreasing
Limit \alpha\to 1 (Umegaki) | \alpha\to 1 | D_\alpha(\rho\|\sigma) \to D(\rho\|\sigma)
Convexity of (\alpha-1)D_\alpha | all \alpha | (\alpha-1)D_\alpha(\rho\|\sigma) is convex in \alpha
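The \alpha\to 1 row can be checked numerically by comparing D_{1+\epsilon} with the Umegaki relative entropy D(\rho\|\sigma)=\mathrm{Tr}[\rho(\log\rho-\log\sigma)] for shrinking \epsilon (a sketch; helper names ours):

```python
import numpy as np

def mfun(A, f):
    """Apply a scalar function f to a Hermitian matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    t = np.trace(mfun(rho, lambda w: w ** alpha) @ mfun(sigma, lambda w: w ** (1 - alpha))).real
    return np.log(t) / (alpha - 1)

def umegaki(rho, sigma):
    """Umegaki relative entropy Tr[rho (log rho - log sigma)]."""
    return np.trace(rho @ (mfun(rho, np.log) - mfun(sigma, np.log))).real

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])

for eps in (1e-2, 1e-3, 1e-4):
    print(abs(petz_renyi(rho, sigma, 1 + eps) - umegaki(rho, sigma)))  # shrinks with eps
```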

References

  • (Kudler-Flam, 2022) “Rényi Mutual Information in Quantum Field Theory”
  • (Berta et al., 2015) “Rényi generalizations of quantum information measures”
  • (Cheng et al., 2018) “Properties of Noncommutative Renyi and Augustin Information”
  • (Androulakis et al., 2022) “Relative Entropy via Distribution of Observables”
  • (Burri, 2024) “Doubly minimized Petz Renyi mutual information: Properties and operational interpretation from direct exponent”
  • (Burri, 7 Jul 2025) “Alternating minimization for computing doubly minimized Petz Renyi mutual information”
  • (Burri, 25 Feb 2025) “Min-reflected entropy = doubly minimized Petz Renyi mutual information of order 1/2”
  • (Burri, 24 Feb 2025) “Entanglement fidelity of Petz decoder for one-shot entanglement transmission”
  • (Salazar, 2024) “Symmetric Petz-Rényi relative entropy uncertainty relation”
  • (Lai et al., 15 Jan 2026) “A Mirror-Descent Algorithm for Computing the Petz-Rényi Capacity of Classical-Quantum Channels”
  • (Fröb et al., 2024) “Petz-Rényi relative entropy in QFT from modular theory”
  • (Androulakis et al., 2023) “Petz-Rényi Relative Entropy of Thermal States and their Displacements”
  • (Hahn et al., 2024) “Bounds on Petz-Rényi Divergences and their Applications for Device-Independent Cryptography”
  • (Iten et al., 2016) “Pretty good measures in quantum information theory”
