
Shannon Entropy Reinterpreted

Published 23 Jun 2017 in cond-mat.stat-mech | (1706.07735v2)

Abstract: In this paper we remark that Shannon entropy can be expressed as a function of the self-information (i.e. the logarithm) and the inverse of the Lambert $W$ function. That is, Shannon entropy admits the trace form $-k \sum_{i} W^{-1} \circ \ln(p_{i})$. Based on this remark we define a generalized entropy which has the Shannon entropy as a limit. To facilitate the reasoning, this generalized entropy is obtained by a one-parameter deformation of the logarithmic function. Introducing a new concept of independence of two systems, the Shannon additivity is replaced by a non-commutative and non-associative law whose limit is the usual addition. The main properties of the generalized entropy are established, particularly those corresponding to statistical ensembles. The Boltzmann-Gibbs statistics is recovered as a limit. The connection with thermodynamics is also studied. We also provide a guideline for systematically defining a deformed algebra whose limit is classical linear algebra. As an illustrative example we study a generalized entropy based on the Tsallis self-information. We point out possible connections between deformed algebras and fuzzy logic. Finally, noticing that the new concept of independence is based on a t-norm, the one-parameter deformation of the logarithm is interpreted as an additive generator of t-norms.
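The trace form in the abstract can be checked numerically: since the Lambert $W$ function satisfies $W(x)\,e^{W(x)} = x$, its inverse is $W^{-1}(y) = y\,e^{y}$, so $W^{-1}(\ln p) = p \ln p$ and the trace form reduces to the usual Shannon entropy. A minimal sketch of this identity (function names are illustrative, not from the paper):

```python
import math

def W_inv(y):
    # Inverse of the Lambert W function: W^{-1}(y) = y * e^y,
    # since W(x) * e^{W(x)} = x by definition.
    return y * math.exp(y)

def shannon_entropy(p, k=1.0):
    # Standard form: H = -k * sum_i p_i ln p_i
    return -k * sum(pi * math.log(pi) for pi in p)

def entropy_via_lambert(p, k=1.0):
    # Trace form from the abstract: H = -k * sum_i (W^{-1} o ln)(p_i),
    # using W^{-1}(ln p) = ln(p) * e^{ln p} = p ln p.
    return -k * sum(W_inv(math.log(pi)) for pi in p)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))       # ~1.2130 nats
print(entropy_via_lambert(p))   # same value
```

Both routes agree to machine precision, which is the observation the paper's generalization starts from.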


Authors (1)
