
Bounds of Shannon entropy and Extropy and their application in exploring the extreme value behavior of a large set of data

Published 18 Jul 2025 in math.ST and stat.TH | (2507.13656v1)

Abstract: This paper derives bounds for two ubiquitous information-theoretic measures: the Shannon entropy and its complementary dual, the extropy. For a large sample drawn from a log-concave model, these bounds are obtained for the entropy and the extropy of the distribution of the largest order statistic and of the corresponding normalized sequence, in the extreme value theory setting. A characterization of the exponential distribution is provided as the model that maximizes the Shannon entropy and the extropy associated with the distribution of the sample maximum, in the large-sample regime. This characterization yields an alternative, immediate proof of the convergence of the Shannon entropy and extropy of the normalized maxima of a large sample to the corresponding measures of the Gumbel distribution, a result studied recently for Shannon entropy in Johnson (2024) and references therein.
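
For context, the two measures named in the abstract admit the following standard definitions for an absolutely continuous random variable X with density f; these definitions are standard in the information-theory literature and are stated here for the reader's convenience, not taken from the paper itself:

```latex
% Shannon (differential) entropy of X with density f
H(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,\mathrm{d}x

% Extropy, the complementary dual of Shannon entropy
J(X) = -\frac{1}{2}\int_{-\infty}^{\infty} f^{2}(x)\,\mathrm{d}x
```

In the extreme value setting of the abstract, these functionals are evaluated at the density of the largest order statistic of the sample, and at its normalized version.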
