On Tsallis extropy with an application to pattern recognition
Published 12 Mar 2021 in math.PR (arXiv:2103.07168v2)
Abstract: Recently, a new measure of information called extropy was introduced by Lad, Sanfilippo and Agrò as the dual of Shannon entropy. Earlier, Tsallis introduced a measure for a discrete random variable, named Tsallis entropy, as a generalization of Boltzmann-Gibbs statistics. In this work, a new measure of discrimination, called Tsallis extropy, is introduced and some of its properties are discussed. The relation between Tsallis extropy and entropy is given, and some bounds are presented. Finally, an application of this extropy to pattern recognition is demonstrated.
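The quantities named in the abstract can be sketched numerically. The sketch below assumes the standard forms of Tsallis entropy, $S_q = (1 - \sum_i p_i^q)/(q-1)$, and of the extropy of Lad, Sanfilippo and Agrò, $J(X) = -\sum_i (1-p_i)\log(1-p_i)$; the `tsallis_extropy` function is one natural dual combination of the two that recovers extropy as $q \to 1$, and the paper's exact definition may differ.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    Recovers Shannon entropy in the limit q -> 1.
    """
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def extropy(p):
    """Extropy of Lad, Sanfilippo and Agrò: J(X) = -sum (1-p_i) log(1-p_i)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def tsallis_extropy(p, q):
    """One plausible dual form (assumption, not necessarily the paper's):
    (1/(q-1)) * sum (1-p_i) * (1 - (1-p_i)^(q-1)),
    which reduces to the extropy J(X) as q -> 1.
    """
    if q == 1:
        return extropy(p)
    return sum((1 - pi) * (1 - (1 - pi) ** (q - 1)) for pi in p) / (q - 1)
```

For a two-point distribution, entropy and extropy coincide (e.g. both equal `log 2` for `p = [0.5, 0.5]`), which makes a convenient sanity check of the duality.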