On principles of large deviation and selected data compression

Published 24 Apr 2016 in cs.IT, math.IT, and math.PR | (1604.06971v1)

Abstract: The Shannon noiseless coding theorem (the data-compression principle) asserts that for an information source with an alphabet $\mathcal X=\{0,\ldots,\ell-1\}$ satisfying an asymptotic equipartition property, one can reduce the number of stored strings $(x_0,\ldots,x_{n-1})\in\mathcal X^n$ to $\ell^{nh}$ with an arbitrarily small error probability. Here $h$ is the entropy rate of the source (calculated to the base $\ell$). We consider a further reduction based on the concept of the utility of a string, measured in terms of the rate of a weight function. The novelty of the work is that the distribution of memory is analyzed from a probabilistic point of view. A convenient tool for assessing the degree of reduction is a probabilistic large deviation principle. Assuming a Markov-type setting, we discuss some relevant formulas, including the case of a general alphabet.
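The data-compression principle stated in the abstract can be illustrated numerically. The following sketch (not from the paper; the i.i.d. Bernoulli source, `eps`, and all function names are illustrative assumptions) counts the $\varepsilon$-typical strings of a binary source and shows that roughly $\ell^{nh}$ of the $\ell^n$ strings carry most of the probability mass:

```python
import math
from itertools import product

def entropy_rate(p, ell=2):
    """Entropy rate of an i.i.d. Bernoulli(p) source, to the base ell."""
    return -(p * math.log(p, ell) + (1 - p) * math.log(1 - p, ell))

def typical_set(p, n, eps, ell=2):
    """Probability mass and size of the eps-typical set for strings
    of length n; brute force, so only feasible for small n."""
    h = entropy_rate(p, ell)
    mass, count = 0.0, 0
    for s in product(range(2), repeat=n):
        k = sum(s)
        prob = p**k * (1 - p)**(n - k)
        # eps-typical: empirical per-symbol log-probability is
        # within eps of the entropy rate.
        if abs(-math.log(prob, ell) / n - h) < eps:
            mass += prob
            count += 1
    return mass, count

p, n, eps = 0.2, 16, 0.2
h = entropy_rate(p)
mass, count = typical_set(p, n, eps)
print(f"h = {h:.4f}  (so ell^(nh) = 2^{n*h:.1f} of 2^{n} strings)")
print(f"typical strings: {count}, total mass: {mass:.3f}")
```

For this toy source the typical set is a small fraction of all $2^{16}$ strings yet already carries most of the probability; as $n$ grows, the AEP drives that mass to one while the set size scales as $2^{nh}$.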

Citations (10)

Authors (2)
