
Tsallis Relative Entropy

Updated 26 September 2025
  • Tsallis relative entropy is a q-parametric divergence measuring dissimilarity between probability distributions and quantum states, converging to KL divergence as q approaches 1.
  • It underpins nonextensive statistical mechanics by enabling maximum entropy formulations that derive q-canonical and q-Gaussian distributions.
  • Widely applied in quantum coherence, robust optimization, and financial risk, it supports operator inequalities and resource theories in modern physics.

Tsallis relative entropy is a parametric generalization of Kullback–Leibler divergence, foundational to nonextensive statistical mechanics and nonadditive information theory. It quantifies the dissimilarity between probability distributions or quantum states and introduces a deformation parameter that interpolates between different divergence regimes, making it fundamental in the analysis of systems exhibiting long-range interactions, heavy-tailed distributions, or robustness against anomalies. The precise formulation, properties under quantum operations, and its broad spectrum of applications in statistical physics, quantum information theory, resource theories of coherence and imaginarity, as well as robust optimization under model uncertainty, have been systematically developed over the past decades.

1. Mathematical Definition and Structural Properties

The Tsallis relative entropy (also termed $q$-relative entropy, $q$-logarithmic relative entropy, or Tsallis divergence) between probability distributions $p = (p_i)$ and $r = (r_i)$ (for $q \in \mathbb{R}$, $q \ne 1$) is defined by

$$S_q(p \| r) = \frac{1}{q - 1} \left( \sum_i p_i^q r_i^{1-q} - 1 \right)$$

For quantum states (density matrices) $\rho, \sigma$, with $0 < q < 2$,

$$D_q(\rho \| \sigma) = \frac{1}{q-1} \left( \operatorname{Tr}\left[\rho^q\,\sigma^{1-q}\right] - 1 \right)$$
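As a sanity check on these definitions, the classical formula and its $q \to 1$ limit can be verified numerically. A minimal illustrative sketch (function names are my own, not from the cited literature):

```python
import math

def tsallis_rel_entropy(p, r, q):
    # Classical Tsallis relative entropy S_q(p || r), q != 1
    return (sum(pi**q * ri**(1 - q) for pi, ri in zip(p, r)) - 1) / (q - 1)

def kl_divergence(p, r):
    # Kullback-Leibler divergence in nats
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

p = [0.5, 0.3, 0.2]
r = [0.2, 0.5, 0.3]

assert tsallis_rel_entropy(p, r, 0.5) >= 0          # nonnegativity
# S_q converges to the KL divergence as q -> 1
assert abs(tsallis_rel_entropy(p, r, 0.999) - kl_divergence(p, r)) < 1e-2
```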

Various operator extensions have been constructed, including the "sandwiched" Tsallis relative entropy:

$$\widetilde{D}^T_q(\rho\|\sigma) = \frac{\operatorname{Tr}\left\{ \left(\sigma^{\frac{1-q}{2q}}\, \rho\, \sigma^{\frac{1-q}{2q}} \right)^q \right\} - 1}{q-1}$$

The Tsallis logarithm and exponential, which underpin all such formulas, are

$$\ln_q(x) = \frac{x^{1-q} - 1}{1-q}, \qquad \exp_q(x) = \left[1 + (1-q)x\right]_+^{\frac{1}{1-q}},$$

so that, for example, $S_q(p \| r) = -\sum_i p_i \ln_q(r_i/p_i)$. As $q \to 1$, the Tsallis relative entropy converges to the classical (Kullback–Leibler), respectively quantum (Umegaki), relative entropy.
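The deformed logarithm and exponential are mutually inverse on their domains, and the $\ln_q$ representation of $S_q$ agrees with the direct formula. A minimal check (helper names are illustrative):

```python
def ln_q(x, q):
    # Tsallis q-logarithm: (x^(1-q) - 1) / (1 - q), q != 1
    return (x**(1 - q) - 1) / (1 - q)

def exp_q(x, q):
    # Tsallis q-exponential: [1 + (1-q) x]_+^(1/(1-q)), q != 1
    base = 1 + (1 - q) * x
    return base**(1 / (1 - q)) if base > 0 else 0.0

q = 1.5
# exp_q inverts ln_q
assert abs(exp_q(ln_q(0.7, q), q) - 0.7) < 1e-12

# S_q(p || r) written via ln_q matches the direct trace-form definition
p, r = [0.6, 0.4], [0.3, 0.7]
direct = (sum(pi**q * ri**(1 - q) for pi, ri in zip(p, r)) - 1) / (q - 1)
via_lnq = -sum(pi * ln_q(ri / pi, q) for pi, ri in zip(p, r))
assert abs(direct - via_lnq) < 1e-12
```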

Key structural properties of Tsallis relative entropy include:

  • Nonnegativity: $S_q(p \| r) \ge 0$, with equality if and only if $p = r$.
  • Joint Convexity: $D_q(\rho \| \sigma)$ is jointly convex in $(\rho, \sigma)$ for $0 < q < 2$.
  • Monotonicity (Data Processing Inequality): For quantum channels (completely positive, trace-preserving maps) $\Phi$, $D_q(\Phi(\rho) \| \Phi(\sigma)) \le D_q(\rho \| \sigma)$ for $0 < q < 2$.
  • Partition Inequality/Data Processing: A data-processing (partition) inequality holds for general convex partitionings if and only if the divergence is a Tsallis relative entropy (Vigelis et al., 2018).
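The classical data processing inequality can be checked directly: applying a column-stochastic matrix (a classical channel) to both arguments never increases $S_q$. A small numerical sketch (the channel and distributions below are arbitrary choices):

```python
def tsallis(p, r, q):
    # Classical Tsallis relative entropy S_q(p || r), q != 1
    return (sum(pi**q * ri**(1 - q) for pi, ri in zip(p, r)) - 1) / (q - 1)

def apply_channel(W, p):
    # W[j][i] = probability of output j given input i (columns sum to 1)
    return [sum(W[j][i] * p[i] for i in range(len(p))) for j in range(len(W))]

W = [[0.8, 0.3],
     [0.2, 0.7]]
p = [0.9, 0.1]
r = [0.4, 0.6]

# data processing: S_q(Wp || Wr) <= S_q(p || r)
for q in (0.3, 0.7, 1.5, 1.9):
    assert tsallis(apply_channel(W, p), apply_channel(W, r), q) <= tsallis(p, r, q) + 1e-12
```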

2. Information-Theoretic Role: Maximum Entropy Principles and Inequalities

In nonextensive statistical mechanics, Tsallis relative entropy underpins the derivation of $q$-canonical and $q$-Gaussian equilibrium distributions via a maximum entropy principle. Specifically, for $q$-expectation (escort) constraints,

$$\langle A \rangle_q = \frac{\sum_i p_i^q A_i}{\sum_j p_j^q},$$

the maximization of Tsallis entropy yields $q$-canonical and $q$-Gaussian forms, with the nonnegativity of Tsallis relative entropy allowing direct proofs of optimality without recourse to Lagrange multipliers (Furuichi, 2010). This methodology:

  • Establishes Tsallis relative entropy as a fundamental technical device for solving constrained extremal entropy problems.
  • Ensures that candidate maximizers ($q$-Gaussian, $q$-canonical) are unique via positivity of the divergence.
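A $q$-canonical family can be generated from the $q$-exponential. The sketch below uses a simplified, non-escort normalization with fixed $\beta$ purely for illustration (the full Tsallis formalism uses escort expectations and a self-consistent $\beta$); it recovers Boltzmann weights as $q \to 1$ and exhibits the heavier tail characteristic of $q > 1$:

```python
import math

def exp_q(x, q):
    # Tsallis q-exponential: [1 + (1-q) x]_+^(1/(1-q)), q != 1
    base = 1 + (1 - q) * x
    return base**(1 / (1 - q)) if base > 0 else 0.0

def q_canonical(energies, beta, q):
    # Simplified q-canonical weights (non-escort, fixed beta), normalized
    w = [exp_q(-beta * e, q) for e in energies]
    Z = sum(w)
    return [wi / Z for wi in w]

E = [0.0, 1.0, 2.0, 3.0]
p_q = q_canonical(E, 1.0, 1.2)            # q > 1: heavy tail
p_near1 = q_canonical(E, 1.0, 1.000001)   # q -> 1: Boltzmann

gibbs = [math.exp(-e) for e in E]
Zg = sum(gibbs)
gibbs = [g / Zg for g in gibbs]
assert all(abs(a - b) < 1e-4 for a, b in zip(p_near1, gibbs))
assert p_q[-1] > gibbs[-1]                # heavier tail for q > 1
```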

Moreover, Tsallis relative entropy is integral to a suite of functional inequalities:

  • Trace inequalities: For positive matrices, extensions of the Golden–Thompson and Peierls–Bogoliubov inequalities to the Tsallis regime provide upper and lower bounds on the corresponding $q$-deformed trace functionals.

  • Pinsker-type bounds: Link the trace distance $\|\rho - \sigma\|_1$ to Tsallis relative entropy, e.g.

$$D_q(\rho \| \sigma) \ge \frac{q}{2}\, \|\rho - \sigma\|_1^2, \qquad 0 < q \le 1.$$

  • Fannes-type continuity bounds: Guarantee uniform continuity of the Tsallis entropy and related coherence measures under small perturbations in state (quantified via trace distance) (Rastegin, 2011, Vershynina, 2022).
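The classical Pinsker-type bound $S_q(p \Vert r) \ge \tfrac{q}{2}\Vert p - r\Vert_1^2$ for $0 < q \le 1$ can be probed numerically on random distributions (a spot check, not a proof):

```python
import random

def tsallis(p, r, q):
    # Classical Tsallis relative entropy S_q(p || r), q != 1
    return (sum(pi**q * ri**(1 - q) for pi, ri in zip(p, r)) - 1) / (q - 1)

def l1(p, r):
    # L1 (total variation-type) distance sum |p_i - r_i|
    return sum(abs(pi - ri) for pi, ri in zip(p, r))

random.seed(0)
for _ in range(500):
    raw_p = [random.random() + 1e-3 for _ in range(4)]
    raw_r = [random.random() + 1e-3 for _ in range(4)]
    p = [x / sum(raw_p) for x in raw_p]
    r = [x / sum(raw_r) for x in raw_r]
    for q in (0.2, 0.5, 0.8, 0.99):
        # Pinsker-type lower bound in terms of the L1 distance
        assert tsallis(p, r, q) >= (q / 2) * l1(p, r) ** 2 - 1e-12
```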

3. Quantum Information Applications: Coherence, Correlations, and Imaginarity

Resource Theory of Quantum Coherence

Distance-based coherence measures employ Tsallis relative entropy:

$$C_q(\rho) = \min_{\delta \in \mathcal{I}} D_q(\rho \| \delta),$$

where $\mathcal{I}$ denotes the set of incoherent (diagonal in the reference basis) states; the minimization admits the closed form $C_q(\rho) = \frac{1}{q-1}\left[\left(\sum_i \langle i|\rho^q|i\rangle^{1/q}\right)^q - 1\right]$. However, although $C_q$ is nonnegative and vanishes on $\mathcal{I}$ (Vershynina, 2019), it generally fails the strong monotonicity property unless carefully modified (Zhao et al., 2017, Vershynina, 2022). Remedying this, families of coherence monotones are defined via corrected functionals of the quantities $\langle i|\rho^q|i\rangle$, which satisfy all resource-theoretic axioms (nullity, monotonicity, convexity, strong monotonicity). For block-diagonal (subspace independent) states, additivity is also maintained (Guo et al., 2020).
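For a pure state the minimization over incoherent states can be carried out explicitly: with $p_i = |\psi_i|^2$, a Lagrange-multiplier argument gives the minimizer $\delta_i \propto p_i^{1/q}$ and hence $C_q = \bigl(\bigl(\sum_i p_i^{1/q}\bigr)^q - 1\bigr)/(q-1)$. The sketch below (helper names are my own) cross-checks this closed form against a brute-force grid search for a qubit:

```python
def coherence_pure_closed_form(p, q):
    # C_q for a pure state with Born probabilities p_i = |psi_i|^2
    Z = sum(pi**(1 / q) for pi in p)
    return (Z**q - 1) / (q - 1)

def coherence_pure_grid(p, q, steps=20000):
    # Brute-force minimization of D_q(|psi><psi| || diag(s, 1-s)) for a qubit
    best = float("inf")
    for k in range(1, steps):
        s = k / steps
        val = (p[0] * s**(1 - q) + p[1] * (1 - s)**(1 - q) - 1) / (q - 1)
        best = min(best, val)
    return best

p = [0.7, 0.3]   # Born probabilities in the reference basis
q = 0.5
assert abs(coherence_pure_closed_form(p, q) - coherence_pure_grid(p, q)) < 1e-5
```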

Quantum Correlations and Discord

Geometric measures of quantum correlations and discord quantify the distance from the set of classical–quantum (or classical–classical) states using Tsallis relative entropy:

$$Q_q(\rho) = \min_{\chi \in \mathcal{CQ}} D_q(\rho \| \chi).$$

The minimization has analytic solutions for certain $q$, and for pure states it reduces to explicit functions of the Schmidt coefficients (1811.11453, Vershynina, 2019).

Imaginarity Resource Theory

Tsallis relative entropy directly quantifies the "imaginarity" resource, measuring deviation from reality in a fixed basis via

$$\mathcal{I}_q(\rho) = \min_{\tau \in \mathcal{R}} D_q(\rho \| \tau),$$

where $\mathcal{R}$ is the set of density matrices with real entries in the reference basis; this vanishes if and only if $\rho$ is real and is efficiently computable for Gaussian states (Xu, 2023).

4. Operator and Matrix Analysis

Tsallis relative entropy admits operator analogues, essential for quantum systems analysis, such as the Tsallis relative operator entropy

$$T_q(A \mid B) = \frac{A \,\sharp_q\, B - A}{q}, \qquad A \,\sharp_q\, B = A^{1/2}\left(A^{-1/2} B A^{-1/2}\right)^q A^{1/2},$$

and associated matrix trace inequalities tied to convexity properties, operator monotonicity, and Hermite–Hadamard–based bounds (Furuichi, 2010, Moradi et al., 2017, Furuichi et al., 2017, Furuichi et al., 2020).
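In the commuting (simultaneously diagonal) case, operator formulas of this kind reduce entrywise to scalars, and the $q \to 0$ limit recovers the relative operator entropy $A^{1/2} \log(A^{-1/2} B A^{-1/2}) A^{1/2}$. A minimal commuting-case sketch (not a general matrix implementation):

```python
import math

def t_q_diag(a, b, q):
    # Entrywise Tsallis relative operator entropy (A #_q B - A)/q for
    # commuting positive matrices A = diag(a), B = diag(b)
    return [ai * ((bi / ai)**q - 1) / q for ai, bi in zip(a, b)]

a = [1.0, 2.0]
b = [0.5, 3.0]
small_q = t_q_diag(a, b, 1e-8)
limit = [ai * math.log(bi / ai) for ai, bi in zip(a, b)]   # q -> 0 limit
assert all(abs(x - y) < 1e-6 for x, y in zip(small_q, limit))
```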

Significant advances include:

  • Sharp operator bounds: Both upper and lower, exploiting convexity and generalized Young inequalities.
  • Monotonicity under positive maps: Key for operational applications, now proven in improved forms.

5. Robust Optimization and Stochastic Control

Tsallis relative entropy has emerged as a penalty in robust stochastic control and mathematical finance. In robust utility maximization, the objective incorporates a Tsallis penalty for the deviation of a candidate model $Q$ from a reference measure $P$. This distortive term modifies the generator of the associated quadratic backward stochastic differential equation (BSDE), whose solution yields the value function process of the robust control problem.

The stochastic maximum principle derived in this context yields necessary conditions for optimal consumption and terminal wealth in the presence of model ambiguity penalized by Tsallis divergence (Huang et al., 25 Sep 2025).

6. Financial Risk, Portfolio Construction, and Asymmetric Extensions

In finance, Tsallis relative entropy serves as a risk measure that captures a portfolio's "distance" from the market distribution, particularly effective for systems with heavy-tailed, asymmetric return profiles. The risk (TRE) between an asset return distribution $p$ and a market index distribution $r$ is

$$\mathrm{TRE} = S_q(p \| r) = \frac{1}{q - 1} \left( \sum_i p_i^q r_i^{1-q} - 1 \right).$$

Explicit modeling with $q$-Gaussians (fitting positive and negative return regimes separately to capture asymmetry) yields robust, stable risk–return profiles, often outperforming classical risk measures such as the CAPM beta, notably in turbulent financial periods (Devi, 2019, Devi et al., 2022).

The asymmetric extension (ATRE) incorporates distinct $q^-$ and $q^+$ parameters for returns below and above zero, respectively, resulting in improved goodness-of-fit and steeper risk–return slopes for portfolios constructed under crisis regimes.
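A piecewise density of this kind can be sketched with the $q$-exponential; the construction below is illustrative (parameter values are arbitrary), showing continuity at zero and the heavier tail on the side with the larger $q$:

```python
def exp_q(x, q):
    # Tsallis q-exponential: [1 + (1-q) x]_+^(1/(1-q)), q != 1
    base = 1 + (1 - q) * x
    return base**(1 / (1 - q)) if base > 0 else 0.0

def asym_q_gaussian(x, q_minus, q_plus, beta=1.0):
    # Unnormalized piecewise q-Gaussian: q_minus governs x < 0, q_plus governs x >= 0
    q = q_minus if x < 0 else q_plus
    return exp_q(-beta * x * x, q)

# Continuous at the origin (both branches equal 1 at x = 0) ...
assert abs(asym_q_gaussian(-1e-9, 1.5, 1.2) - asym_q_gaussian(1e-9, 1.5, 1.2)) < 1e-6
# ... with a heavier left tail when q_minus > q_plus
assert asym_q_gaussian(-3.0, 1.5, 1.2) > asym_q_gaussian(3.0, 1.5, 1.2)
```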

7. Continuity, Uniqueness, and Limitations

Explicit continuity bounds demonstrate that Tsallis relative entropy–based coherence measures are stable under perturbations, with explicit $q$-dependent error control (Rastegin, 2011, Vershynina, 2022). Rigorous characterization theorems establish that, under minimal symmetry and multiplicativity axioms, Tsallis relative entropy is (up to scaling) the only divergence possessing these properties (Leinster, 2017). However, Tsallis-based coherence differences are not, in general, genuine monotones unless restricted to a very narrow class of operations (e.g., genuinely incoherent operations, GIO) (Vershynina, 2022).

8. Summary Table of Key Mathematical Objects and Properties

| Concept | Definition | Key Property/Role |
|---|---|---|
| Classical $S_q(p \Vert r)$ | $\frac{1}{q-1}\left(\sum_i p_i^q r_i^{1-q} - 1\right)$ | Nonnegativity, data processing, recovers KL divergence as $q \to 1$ |
| Quantum $D_q(\rho \Vert \sigma)$ | $\frac{1}{q-1}\left(\operatorname{Tr}[\rho^q \sigma^{1-q}] - 1\right)$ | Operator monotonicity, convexity, inequality bounds |
| $C_q$ (coherence) | $\min_{\delta \in \mathcal{I}} D_q(\rho \Vert \delta)$ (with corrected form for monotonicity) | Resource monotone under incoherent operations (with suitable correction) |
| $Q_q$ (correlations) | $\min_{\chi} D_q(\rho \Vert \chi)$ over classical–quantum states $\chi$ | Quantum discord measure, analytic for many classes |
| Tsallis penalty | Divergence of model $Q$ from reference $P$ in the robust objective | Robust control penalty, quadratic BSDE generator |
| Pinsker-type bound | $S_q(p \Vert r) \ge \frac{q}{2}\Vert p - r\Vert_1^2$ ($0 < q \le 1$) | Lower bound in terms of variational distance |
| ATRE | Piecewise $q$-Gaussians ($q^-$, $q^+$) for negative/positive returns | Financial risk measure for asymmetric distributions |

9. Outlook and Further Developments

The multidimensional applicability of Tsallis relative entropy, from operator inequalities to resource quantification, robust financial optimization, and information geometry, stems from its unique blend of mathematical flexibility and operational interpretability. Outstanding challenges include the design of further corrected resource measures with full monotonicity, investigation of operational/thermodynamic tasks leveraging the flexibility of the $q$-parametric divergence, exploration of numerical schemes for quadratic BSDEs with Tsallis penalties, and further links to information geometry via general deformed exponential families (Vigelis et al., 2018).

Tsallis relative entropy thus continues to be a central tool in the advancement of both theoretical frameworks and applied methodologies in nonextensive statistical mechanics, quantum information theory, and robust optimization.
