
Logical Entropy: Theory & Applications

Updated 4 February 2026
  • Logical entropy is a quadratic measure defined as 1 minus the sum of squared probabilities, quantifying the chance that two independent picks yield distinct outcomes.
  • It extends to joint, conditional, and mutual forms via the dit–bit transform, offering exact, set-theoretic analogs to traditional Shannon identities.
  • In quantum settings, logical entropy measures state purity and decoherence through density matrices, serving as a practical tool for analyzing distinguishability.

Logical entropy is a measure-theoretically grounded, quadratic information measure that quantifies distinctions—interpreted as pairs of elements or states that are separated by a partition, probability distribution, or quantum observable. Unlike Shannon/von Neumann entropy, which is based on coding-theoretic notions of uncertainty and surprise, logical entropy arises directly from the logic of partitions and has a straightforward combinatorial and probabilistic interpretation as the normalized count or probability of distinctions. This construction has broad applicability, including classical probability, quantum information, qualitative logics of knowability, and the formal analysis of computational irreversibility.

1. Logical Entropy in Partition Logic

Logical entropy originates from the categorical duality between subsets and partitions. In set theory, the Boolean logic of subsets treats elements as the primitive atoms. Partition logic, by contrast, treats distinctions (dits), ordered pairs of elements $(u, u')$ that are separated by a partition, as atomic (Ellerman, 2016). For a finite set $U$ partitioned into blocks $\pi = \{B_1, \dots, B_m\}$, the distinction set is

$$\mathrm{dit}(\pi) = \{\, (u, u') \in U \times U : u \text{ and } u' \text{ lie in different blocks of } \pi \,\}.$$

The logical entropy of $\pi$ is defined as the normalized size of this set:

$$h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U|^2} = 1 - \sum_{i=1}^m \left( \frac{|B_i|}{|U|} \right)^2.$$

With a probability distribution $p = (p_1, \dots, p_m)$ over the blocks, the logical entropy is

$$h(p) = 1 - \sum_{i=1}^m p_i^2.$$

Probabilistically, $h(\pi)$ is the chance that two independent samples drawn from $U$ land in different blocks of $\pi$ (Ellerman, 2021).
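As a quick sanity check on the definition, the closed form $1 - \sum_i p_i^2$ can be compared against a direct two-draw simulation. A minimal Python sketch; the block sizes below are illustrative choices, not from the cited papers:

```python
import random

def logical_entropy(p):
    # h(p) = 1 - sum(p_i^2): probability two independent draws differ.
    return 1.0 - sum(pi * pi for pi in p)

# Illustrative partition of a 10-element set into blocks of sizes 4, 3, 2, 1.
blocks = [4, 3, 2, 1]
n = sum(blocks)
p = [b / n for b in blocks]
h = logical_entropy(p)

# Monte Carlo check: draw two elements uniformly and compare their blocks.
random.seed(0)
labels = [i for i, b in enumerate(blocks) for _ in range(b)]
trials = 200_000
distinct = sum(
    labels[random.randrange(n)] != labels[random.randrange(n)]
    for _ in range(trials)
)
print(h, distinct / trials)  # the two estimates should agree closely
```

The simulated frequency of "two draws fall in different blocks" converges to the closed-form value, which is the probabilistic reading of $h(\pi)$.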

2. Compound Logical Entropies and the Dit–Bit Transform

Logical entropy admits joint, conditional, and mutual definitions that precisely mimic measure-theoretic Venn-diagram relations:

  • Joint logical entropy: $h(\pi, \sigma) = \dfrac{|\mathrm{dit}(\pi) \cup \mathrm{dit}(\sigma)|}{n^2}$, where $n = |U|$
  • Conditional logical entropy: $h(\pi \mid \sigma) = h(\pi \vee \sigma) - h(\sigma)$
  • Mutual logical information: $m(\pi, \sigma) = h(\pi) + h(\sigma) - h(\pi, \sigma)$
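These compound definitions can be verified directly on dit sets for a small example. The two partitions below are illustrative choices (not drawn from the cited papers):

```python
from itertools import product

def dits(partition, U):
    # Ordered pairs (u, u') whose elements lie in different blocks.
    block_of = {u: i for i, blk in enumerate(partition) for u in blk}
    return {(u, v) for u, v in product(U, repeat=2) if block_of[u] != block_of[v]}

U = range(6)
pi = [{0, 1, 2}, {3, 4, 5}]        # partition pi of U
sigma = [{0, 3}, {1, 4}, {2, 5}]   # partition sigma of U
n2 = len(U) ** 2

h_pi = len(dits(pi, U)) / n2
h_sigma = len(dits(sigma, U)) / n2
h_joint = len(dits(pi, U) | dits(sigma, U)) / n2  # joint: union of dit sets
h_cond = h_joint - h_sigma                        # conditional h(pi | sigma)
m = h_pi + h_sigma - h_joint                      # mutual logical information
print(h_pi, h_sigma, h_joint, h_cond, m)
```

Because the join $\pi \vee \sigma$ has $\mathrm{dit}(\pi \vee \sigma) = \mathrm{dit}(\pi) \cup \mathrm{dit}(\sigma)$, the joint entropy computed from the union agrees with $h(\pi \vee \sigma)$, and the Venn-diagram identities hold exactly.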

These relationships yield exact analogs of Shannon's identities, but with a direct basis in set measure. The Shannon formulas are systematically derived by the "dit–bit transform," which uniformly replaces each $1 - p_i$ by $\log_2(1/p_i)$ inside the expectation (Ellerman, 2016; Ellerman, 2017; Ellerman, 2021):

$$h(p) = \sum_i p_i (1 - p_i) \;\longmapsto\; H(p) = \sum_i p_i \log_2(1/p_i).$$

This transform yields the standard Shannon information measures while preserving all Venn-diagram relations as formal consequences of the set-theoretic basis for logical entropy.
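The transform can be illustrated numerically; the distribution below is an arbitrary example:

```python
import math

def h_logical(p):
    # Logical entropy: the expectation of (1 - p_i).
    return sum(pi * (1 - pi) for pi in p)

def H_shannon(p):
    # Dit-bit transform: replace (1 - p_i) by log2(1 / p_i) in the expectation.
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(h_logical(p))  # 0.625
print(H_shannon(p))  # 1.5
```

Both measures are expectations of a "surprise" term; only the term changes under the transform, which is why the Venn-diagram identities carry over formula by formula.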

3. Quantum Logical Entropy

Logical entropy extends naturally to quantum mechanics by replacing the classical probability distribution with a density operator $\rho$ on a Hilbert space (Ellerman, 2017; Tamir et al., 2014; Tamir et al., 2021):

$$h(\rho) = 1 - \mathrm{Tr}(\rho^2).$$

If $\rho$ has eigenvalues $\lambda_i$, then $h(\rho) = 1 - \sum_i \lambda_i^2$, mirroring the classical case. Logical entropy is zero for pure states and maximal for the maximally mixed state:

$$0 \leq h(\rho) \leq 1 - \frac{1}{d},$$

where $d$ is the Hilbert space dimension (Tamir et al., 2021). The value $h(\rho)$ quantifies the probability that two independent preparations yield distinct (orthogonal) outcomes and is operationally equivalent to the probability of distinguishing two eigenstates in a projective measurement.

Quantum logical entropy, also known as the "linear entropy" or the Tsallis entropy at $q = 2$, is widely used as a measure of mixedness or decoherence and exhibits properties including unitary invariance, joint convexity, subadditivity, and monotonicity under unital maps and partial trace. For product states,

$$h(\rho_A \otimes \rho_B) = h(\rho_A) + h(\rho_B) - h(\rho_A)\, h(\rho_B)$$

(Tamir et al., 2014; Tamir et al., 2021). Measurement, modeled by the Lüders rule, cannot decrease logical entropy.
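A short numerical sketch of the quantum definition, checking the pure-state and maximally-mixed bounds and the product-state identity (the density matrices are illustrative, assuming NumPy is available):

```python
import numpy as np

def h_q(rho):
    # Quantum logical entropy: h(rho) = 1 - Tr(rho^2).
    return 1.0 - np.trace(rho @ rho).real

d = 2
pure = np.array([[1, 0], [0, 0]], dtype=complex)  # pure state: h = 0
mixed = np.eye(d) / d                             # maximally mixed: h = 1 - 1/d

# Product-state identity: h(A tensor B) = h(A) + h(B) - h(A) h(B).
rho_a = np.diag([0.7, 0.3])
rho_b = np.diag([0.6, 0.4])
lhs = h_q(np.kron(rho_a, rho_b))
rhs = h_q(rho_a) + h_q(rho_b) - h_q(rho_a) * h_q(rho_b)
print(h_q(pure), h_q(mixed), lhs, rhs)
```

The product formula is the quantum counterpart of the classical fact that two draws from a product distribution are distinct iff they differ in at least one factor.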

4. Measurement, Decoherence, and the Fundamental Theorem

A central result in quantum logical entropy is the explicit connection between projective measurement and the creation of distinctions (decoherence) (Ellerman, 2016; Ellerman, 2017). For a projective measurement $\rho \mapsto \rho'$ with $\rho' = \sum_i P_i \rho P_i$, the increase in logical entropy is

$$\Delta h = h(\rho') - h(\rho) = \sum_{j \neq k} |\rho_{jk}|^2,$$

where the sum runs over the off-diagonal matrix elements "zeroed out" (i.e., decohered) by the measurement. This theorem quantifies the amount of coherence lost as a count of distinctions made, an analysis for which the von Neumann entropy $S(\rho) = -\mathrm{Tr}[\rho \log \rho]$ provides no direct counterpart. Logical entropy thus provides a fine-grained instrument for tracking decoherence and state distinguishability under quantum measurements (Ellerman, 2016; Ellerman, 2017).
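The theorem can be checked numerically for a single qubit; the superposition amplitudes below are arbitrary illustrative values:

```python
import numpy as np

def h_q(rho):
    # Quantum logical entropy: h(rho) = 1 - Tr(rho^2).
    return 1.0 - np.trace(rho @ rho).real

# Pure qubit |psi> = sqrt(0.6)|0> + sqrt(0.4)|1> (arbitrary amplitudes).
psi = np.array([np.sqrt(0.6), np.sqrt(0.4)])
rho = np.outer(psi, psi.conj())

# Projective measurement in the computational basis (Lueders rule):
# rho' = sum_i P_i rho P_i keeps the diagonal and zeroes the off-diagonal terms.
rho_after = np.diag(np.diag(rho))

delta_h = h_q(rho_after) - h_q(rho)
off_diag = sum(abs(rho[j, k]) ** 2 for j in range(2) for k in range(2) if j != k)
print(delta_h, off_diag)  # equal: entropy gain = squared coherences zeroed out
```

The entropy gain exactly matches the total squared magnitude of the decohered off-diagonal elements, as the theorem states.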

5. Logical Entropy in Information Theory and Computation

Logical entropy serves as a measure of information in terms of distinctions rather than bits, aligning information theory with the structure of partition logic (Ellerman, 2016; Ellerman, 2021). In computational settings, variants such as "arithmetic logical entropy" quantify the average information lost by a deterministic operation $f : X \to Y$ (Lapin, 2022). Here the arithmetic logical entropy of $f$ equals the conditional Shannon entropy $H(X \mid Y)$, explicitly counting how many bits are irreversibly erased due to the many-to-one nature of $f$. This measure is foundational for the analysis of computational irreversibility, Landauer's principle, and the limits of algorithmic decidability.
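The bit-erasure count can be sketched for a uniform input distribution; the 2-to-1 map and constant map below are hypothetical examples, not from the cited paper:

```python
import math
from collections import Counter

def bits_erased(f, xs):
    # H(X | Y) for Y = f(X) with X uniform on xs: sum over outputs y of
    # p(y) * log2 |f^{-1}(y)|, the average bits lost to many-to-one collapse.
    preimage_sizes = Counter(f(x) for x in xs)
    n = len(xs)
    return sum((c / n) * math.log2(c) for c in preimage_sizes.values())

# A 2-to-1 map on 8 uniform inputs erases exactly one bit per application.
print(bits_erased(lambda x: x // 2, range(8)))  # 1.0
# A constant map on 4 inputs erases all log2(4) = 2 bits.
print(bits_erased(lambda x: 0, range(4)))       # 2.0
```

An injective (reversible) $f$ gives $H(X \mid Y) = 0$: no distinctions among inputs are destroyed, which is the partition-logic reading of logical reversibility.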

Table 1: Logical entropy in different domains

| Domain | Definition | Interpretation |
|---|---|---|
| Classical (partition) | $h(\pi) = 1 - \sum_i p_i^2$ | Probability two independent draws fall in different blocks |
| Classical (distribution) | $h(p) = 1 - \sum_i p_i^2$ | Chance two independent draws yield different outcomes |
| Quantum (density operator) | $h(\rho) = 1 - \mathrm{Tr}(\rho^2)$ | Chance of distinguishing eigenstates in a projective measurement |
| Computation | $H_{\mathrm{arith}}(f) = H(X \mid Y)$ | Average bits erased by a deterministic operation |

6. Qualitative and Relational Logical Entropy

Logical entropy can be interpreted in qualitative, non-numeric frameworks as well. Informational (logical) entropy in modal logic is modeled via a reflexive, not necessarily symmetric, binary relation $E \subseteq Z \times Z$ on a state space $Z$, where $a\,E\,b$ encodes that $a$ is indiscernible from $b$. This approach demarcates the "boundary to knowability" within semantic and conceptual structures, relevant to epistemic logic and the modeling of limits in perception, language, and knowledge (Conradie et al., 2019). Here, logical entropy abstracts away from quantitative measures, providing a structural boundary to the distinctions that can be detected or known.

7. Negative Probabilities and Continuous Extensions

The quadratic form of logical entropy, $1 - \sum_i p_i^2$, allows extensions to contexts involving negative or quasi-probabilities, such as those encountered in the Wigner-function formalism and quasi-distributions in quantum mechanics (Manfredi, 2022). The evolution of logical-entropy-preserving densities can lead, under mild assumptions, directly to the quantum-mechanical evolution (e.g., the Wigner–Moyal equation), suggesting a fundamental role for logical entropy in underpinning non-classical probabilistic structures in physics.
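Because $1 - \sum_i p_i^2$ involves no logarithms, it remains well-defined for quasi-distributions with negative entries, unlike Shannon entropy. A minimal sketch; the values below are illustrative only, not tied to a specific Wigner function:

```python
def logical_entropy(p):
    # 1 - sum(p_i^2): needs no logarithms, so negative entries are admissible.
    return 1.0 - sum(pi * pi for pi in p)

# A quasi-distribution summing to 1 with one negative entry, loosely modeled
# on the negative regions that Wigner functions can exhibit.
q = [0.6, 0.5, -0.1]
print(sum(q), logical_entropy(q))
```

A Shannon-type formula would require $\log p_i$ and fail on the negative entry; the quadratic form evaluates without complication.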


References: Ellerman (2016); Ellerman (2017); Ellerman (2021); Tamir et al. (2014); Tamir et al. (2021); Conradie et al. (2019); Lapin (2022); Manfredi (2022).
