Logical Entropy: Theory & Applications
- Logical entropy is a quadratic measure defined as 1 minus the sum of squared probabilities, quantifying the chance that two independent picks yield distinct outcomes.
- It extends to joint, conditional, and mutual forms via the dit–bit transform, offering exact, set-theoretic analogs to traditional Shannon identities.
- In quantum settings, logical entropy measures state purity and decoherence through density matrices, serving as a practical tool for analyzing distinguishability.
Logical entropy is a measure-theoretically grounded, quadratic information measure that quantifies distinctions—interpreted as pairs of elements or states that are separated by a partition, probability distribution, or quantum observable. Unlike Shannon/von Neumann entropy, which is based on coding-theoretic notions of uncertainty and surprise, logical entropy arises directly from the logic of partitions and has a straightforward combinatorial and probabilistic interpretation as the normalized count or probability of distinctions. This construction has broad applicability, including classical probability, quantum information, qualitative logics of knowability, and the formal analysis of computational irreversibility.
1. Logical Entropy in Partition Logic
Logical entropy originates from the categorical duality between subsets and partitions. In set theory, the Boolean logic of subsets treats elements as the primitive atoms. Partition logic, by contrast, treats distinctions (dits), the ordered pairs of elements separated by a partition, as atomic (Ellerman, 2016). For a finite set $U$ partitioned into blocks $\pi = \{B_1, \dots, B_m\}$, the distinction set is

$$\mathrm{dit}(\pi) = \{(u, u') \in U \times U : u \text{ and } u' \text{ lie in distinct blocks of } \pi\}.$$
The logical entropy of $\pi$ is defined as the normalized size of this set:

$$h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U|^2}.$$

With a probability distribution $p = (p_1, \dots, p_m)$ over blocks (e.g., $p_i = |B_i|/|U|$), the logical entropy is

$$h(p) = 1 - \sum_{i=1}^{m} p_i^2 = \sum_{i \neq j} p_i p_j.$$
Probabilistically, $h(p)$ is the chance that two independent samples fall into different blocks, i.e., the probability of drawing a distinction under the product distribution $p \times p$ (Ellerman, 2021).
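These definitions are easy to check numerically. The following sketch (with a hypothetical partition of a six-element set) verifies that the normalized dit-count of a partition agrees with the quadratic formula applied to the block probabilities:

```python
from itertools import product

def logical_entropy_partition(blocks, universe_size):
    """h(pi) = |dit(pi)| / |U|^2: fraction of ordered pairs split by the partition."""
    # Count ordered pairs (u, u') whose elements lie in different blocks.
    dits = sum(len(b1) * len(b2)
               for b1, b2 in product(blocks, repeat=2) if b1 is not b2)
    return dits / universe_size**2

def logical_entropy_dist(p):
    """h(p) = 1 - sum_i p_i^2."""
    return 1.0 - sum(pi * pi for pi in p)

# A partition of U = {0,...,5} into blocks of sizes 3, 2, 1.
blocks = [{0, 1, 2}, {3, 4}, {5}]
h_pi = logical_entropy_partition(blocks, 6)
# Equivalent block-probability computation: p = (3/6, 2/6, 1/6).
h_p = logical_entropy_dist([3/6, 2/6, 1/6])
assert abs(h_pi - h_p) < 1e-12
```

Both routes give $h = 22/36 \approx 0.611$, the probability that two independent uniform draws land in different blocks.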
2. Compound Logical Entropies and the Dit–Bit Transform
Logical entropy admits joint, conditional, and mutual definitions that precisely mimic measure-theoretic Venn-diagram relations:
- Joint logical entropy: $h(X, Y) = 1 - \sum_{x, y} p(x, y)^2$
- Conditional: $h(X \mid Y) = h(X, Y) - h(Y)$
- Mutual: $m(X; Y) = h(X) + h(Y) - h(X, Y)$
These relationships yield exact analogs of Shannon's identities, but with a direct basis in set measure. The Shannon formulas are systematically derived by the "dit–bit transform," which uniformly replaces each factor $1 - p$ by $\log_2(1/p)$ inside the expectation (Ellerman, 2017; Ellerman, 2021; Ellerman, 2016):

$$h(p) = \sum_i p_i (1 - p_i) \;\longmapsto\; H(p) = \sum_i p_i \log_2 \frac{1}{p_i}.$$

This transform yields the standard Shannon information measures while preserving all Venn-diagram relations as formal consequences of the set-theoretic basis for logical entropy.
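Because the compound quantities are defined by Venn-diagram arithmetic, the identities hold exactly, not just asymptotically. A minimal sketch, using a hypothetical 2×2 joint distribution of my own choosing, checks the decomposition $h(X,Y) = h(X \mid Y) + m(X;Y) + h(Y \mid X)$ and shows the dit–bit transform alongside:

```python
import math

def h(p):
    """Logical entropy h(p) = 1 - sum p_i^2 = E[1 - p]."""
    return 1.0 - sum(x * x for x in p)

def shannon(p):
    """Dit-bit transform of h: replace (1 - p) by log2(1/p) in the expectation."""
    return sum(x * math.log2(1.0 / x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) on a 2x2 alphabet.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = [0.5, 0.5]  # marginal of X
py = [0.6, 0.4]  # marginal of Y

h_xy = h(list(joint.values()))   # joint logical entropy
h_x, h_y = h(px), h(py)
h_x_given_y = h_xy - h_y         # conditional h(X|Y)
h_y_given_x = h_xy - h_x         # conditional h(Y|X)
m_xy = h_x + h_y - h_xy          # mutual m(X;Y)

# Exact Venn-diagram decomposition of the joint entropy.
assert abs(h_xy - (h_x_given_y + m_xy + h_y_given_x)) < 1e-12
```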
3. Quantum Logical Entropy
Logical entropy extends naturally to quantum mechanics by replacing the classical probability distribution with a density operator $\rho$ on a Hilbert space (Ellerman, 2017; Tamir et al., 2014; Tamir et al., 2021):

$$h(\rho) = 1 - \mathrm{tr}(\rho^2).$$

If $\rho$ is diagonal with eigenvalues $\lambda_i$, then $h(\rho) = 1 - \sum_i \lambda_i^2$, mirroring the classical case. Logical entropy is zero for pure states and maximal for the maximally mixed state:

$$h\!\left(\frac{I}{n}\right) = 1 - \frac{1}{n},$$

where $n$ is the Hilbert space dimension (Tamir et al., 2021). The value $h(\rho)$ quantifies the probability that two independent preparations yield orthogonal outcomes and is operationally equivalent to the probability of distinguishing two eigenstates in a projective measurement.
Quantum logical entropy, also known as the "linear entropy" or the Tsallis entropy at $q = 2$, is widely used as a mixedness or decoherence measure; its properties include unitary invariance, joint convexity, subadditivity, and monotonicity under unital maps and partial trace. For product states, $h(\rho_1 \otimes \rho_2) = h(\rho_1) + h(\rho_2) - h(\rho_1)\,h(\rho_2)$ (Tamir et al., 2014; Tamir et al., 2021). Measurement, modeled by the Lüders rule, cannot decrease logical entropy.
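The purity extremes and the product-state identity are straightforward to verify. The sketch below, using hypothetical diagonal density matrices, checks $h = 0$ for a pure state, $h = 1 - 1/n$ for the maximally mixed state, and the identity $h(\rho_1 \otimes \rho_2) = h(\rho_1) + h(\rho_2) - h(\rho_1)h(\rho_2)$:

```python
import numpy as np

def quantum_logical_entropy(rho):
    """h(rho) = 1 - tr(rho^2): zero for pure states, 1 - 1/n when maximally mixed."""
    return 1.0 - np.trace(rho @ rho).real

n = 4
pure = np.zeros((n, n))
pure[0, 0] = 1.0              # the pure state |0><0|
mixed = np.eye(n) / n         # maximally mixed state I/n
assert abs(quantum_logical_entropy(pure)) < 1e-12
assert abs(quantum_logical_entropy(mixed) - (1 - 1 / n)) < 1e-12

# Product-state identity, checked on two hypothetical qubit states.
rho1 = np.diag([0.7, 0.3])
rho2 = np.diag([0.6, 0.4])
h1 = quantum_logical_entropy(rho1)
h2 = quantum_logical_entropy(rho2)
h12 = quantum_logical_entropy(np.kron(rho1, rho2))
assert abs(h12 - (h1 + h2 - h1 * h2)) < 1e-12
```

The identity follows from $\mathrm{tr}((\rho_1 \otimes \rho_2)^2) = \mathrm{tr}(\rho_1^2)\,\mathrm{tr}(\rho_2^2)$, so purities multiply while logical entropies combine inclusion-exclusion style.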
4. Measurement, Decoherence, and the Fundamental Theorem
A central result in quantum logical entropy is the explicit connection between projective measurement and the creation of distinctions (decoherence) (Ellerman, 2017; Ellerman, 2016). For a projective (Lüders) measurement $\rho \mapsto \hat\rho = \sum_i P_i \rho P_i$ (with $\sum_i P_i = I$), the increase in logical entropy is

$$h(\hat\rho) - h(\rho) = \sum_{(j,k)\ \text{zeroed}} |\rho_{jk}|^2,$$
where the sum is over the off-diagonal matrix elements "zeroed out" (i.e., decohered) by the measurement. This theorem quantifies the coherence lost as the number of distinctions made, an accounting for which von Neumann entropy offers no direct analog. Logical entropy thus provides a fine-grained instrument for tracking decoherence and state distinguishability under quantum measurements (Ellerman, 2017; Ellerman, 2016).
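The theorem can be checked directly in a small example. The following sketch, using a hypothetical single-qubit state with coherences, compares the entropy increase under a computational-basis Lüders measurement against the total squared magnitude of the decohered entries:

```python
import numpy as np

def h(rho):
    """Quantum logical entropy h(rho) = 1 - tr(rho^2)."""
    return 1.0 - np.trace(rho @ rho).real

# A hypothetical qubit state with off-diagonal coherences.
rho = np.array([[0.6, 0.3],
                [0.3, 0.4]], dtype=complex)

# Projective (Lüders) measurement in the computational basis:
# rho_hat = sum_i P_i rho P_i zeroes the off-diagonal entries.
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
rho_hat = sum(Pi @ rho @ Pi for Pi in P)

# Fundamental theorem: the entropy increase equals the summed squared
# magnitudes of the matrix elements zeroed out by the measurement.
zeroed = rho - rho_hat
assert abs((h(rho_hat) - h(rho)) - np.sum(np.abs(zeroed) ** 2)) < 1e-12
```

Here $h(\rho) = 0.30$ rises to $h(\hat\rho) = 0.48$, and the increase $0.18$ is exactly $2 \times |0.3|^2$, the two zeroed coherences.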
5. Logical Entropy in Information Theory and Computation
Logical entropy serves as a measure of information in terms of distinctions rather than bits, aligning information theory with the structure of partition logic (Ellerman, 2016; Ellerman, 2021). In computational settings, variants such as "arithmetic logical entropy" quantify the average information lost by a deterministic operation $f$ (Lapin, 2022). Here, logical entropy is the conditional Shannon entropy $H(X \mid f(X))$, explicitly counting how many bits are irreversibly erased due to the many-to-one nature of $f$. This measure is foundational for the analysis of computational irreversibility, Landauer's principle, and the limits of algorithmic decidability.
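As an illustration of the erased-bits reading of $H(X \mid f(X))$, the sketch below (with a hypothetical helper `conditional_entropy_bits`, assuming a uniform input distribution) computes the information a 2-bit AND gate destroys:

```python
import math
from collections import defaultdict

def conditional_entropy_bits(f, inputs):
    """H(X | f(X)) for uniformly distributed X: average bits erased by f."""
    n = len(inputs)
    fibers = defaultdict(list)
    for x in inputs:
        fibers[f(x)].append(x)
    # Output y occurs with probability |f^-1(y)|/n; given y, X is uniform
    # on the fiber, so log2 |f^-1(y)| bits of the input are unrecoverable.
    return sum(len(fx) / n * math.log2(len(fx)) for fx in fibers.values())

# A 2-bit AND gate: (a, b) -> a & b over the four equiprobable inputs.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
bits_erased = conditional_entropy_bits(lambda ab: ab[0] & ab[1], inputs)
# Three inputs collapse to output 0 and one to output 1,
# so H(X | f(X)) = (3/4) log2(3) ≈ 1.19 bits erased per evaluation.
```

A bijective $f$ gives zero erased bits, matching the intuition that only many-to-one operations are irreversible.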
Table 1: Logical entropy in different domains
| Domain | Definition | Interpretation |
|---|---|---|
| Classical (partition) | $h(\pi) = \lvert \mathrm{dit}(\pi) \rvert / \lvert U \rvert^2$ | Probability of distinction for two draws |
| Classical (distribution) | $h(p) = 1 - \sum_i p_i^2$ | Chance two draws yield different outcomes |
| Quantum (density) | $h(\rho) = 1 - \mathrm{tr}(\rho^2)$ | Chance of distinguishing eigenstates |
| Computation | $H(X \mid f(X))$ | Avg. bits erased by a deterministic operation |
6. Qualitative and Relational Logical Entropy
Logical entropy can also be interpreted in qualitative, non-numeric frameworks. Informational (logical) entropy in modal logic is modeled via a reflexive, not-necessarily-symmetric binary relation $E$ on a state space $X$, where $xEy$ encodes that $y$ is indiscernible from $x$. This approach demarcates the "boundary to knowability" within semantic/conceptual structures, relevant to epistemic logic and the modeling of limits in perception, language, and knowledge (Conradie et al., 2019). Here, logical entropy abstracts away from quantitative measures, providing a structural boundary on the distinctions that can be detected or known.
7. Negative Probabilities and Continuous Extensions
The quadratic form of logical entropy, $h = 1 - \sum_i p_i^2$, allows extensions to contexts involving negative or quasi-probabilities, such as those encountered in the Wigner-function formalism and quasi-distributions in quantum mechanics (Manfredi, 2022). The evolution of logical-entropy-preserving densities can lead, under mild assumptions, directly to quantum-mechanical evolution (e.g., the Wigner–Moyal equation), suggesting a fundamental role for logical entropy in underpinning non-classical probabilistic structures in physics.
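The point that the quadratic form stays well defined off the probability simplex can be made concrete. A minimal sketch, with a hypothetical quasi-distribution of my own choosing (normalized to 1 but containing a negative entry, as Wigner-function marginals can), shows $h$ remains a finite real number where $\log p$ would be undefined:

```python
def logical_entropy(p):
    """h = 1 - sum p_i^2: a polynomial in the p_i, defined even for p_i < 0."""
    return 1.0 - sum(x * x for x in p)

# Hypothetical quasi-distribution: entries sum to 1, one entry negative.
quasi = [0.6, 0.5, -0.1]
assert abs(sum(quasi) - 1.0) < 1e-12
h_q = logical_entropy(quasi)  # finite and real; Shannon's log(p) would fail here
```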
References: Ellerman (2016); Ellerman (2017); Ellerman (2021); Tamir et al. (2014); Tamir et al. (2021); Conradie et al. (2019); Lapin (2022); Manfredi (2022)