On the Use of Variability Measures to Analyze Source Coding Data Based on the Shannon Entropy

Helio M. de Oliveira, Raydonal Ospina, Carlos Martin-Barreiro, Víctor Leiva, Christophe Chesneau

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Source coding maps elements from an information source to a sequence of alphabetic symbols, from which the source symbols can then be recovered exactly from the binary units. In this paper, we derive an approach that incorporates information variation into source coding, making it more realistic than the standard version. We employ the Shannon entropy for coding the sequences of a source. Our approach is also helpful for short sequences, where the central limit theorem does not apply. We rely on a quantifier of the information variation of a source: the second central moment of the random variable that measures the information content of a source symbol, that is, its variance, whose square root is the standard deviation. Through this approach, we also provide an interpretation of typical sequences. We illustrate the method with a binary memoryless source. In addition, Monte Carlo simulation studies are conducted to evaluate the performance of our approach. We apply this approach to two real datasets related to purity and wheat prices in Brazil.
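The variability quantifier described in the abstract — the second central moment of the information content of a source symbol — can be sketched for the binary memoryless source example. This is a minimal illustration, not the authors' implementation; the function name and interface are hypothetical, assuming a Bernoulli(p) source measured in bits.

```python
import math

def entropy_and_varentropy(p: float) -> tuple[float, float]:
    """Shannon entropy H and its second central moment V (both in bits)
    for a binary memoryless source with symbol probability p.

    The information content of a symbol with probability q is -log2(q);
    H is its mean and V its variance (the quantifier of information
    variation; the standard deviation is sqrt(V))."""
    probs = [p, 1.0 - p]
    info = [-math.log2(q) for q in probs]            # per-symbol information content
    H = sum(q * i for q, i in zip(probs, info))      # mean information = entropy
    V = sum(q * (i - H) ** 2 for q, i in zip(probs, info))  # second central moment
    return H, V
```

For the uniform case p = 0.5, every symbol carries exactly 1 bit, so H = 1 and V = 0; for skewed sources, V > 0 captures the spread of information content that the standard entropy-only analysis ignores.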

Original language: English
Article number: 293
Issue number: 2
State: Published - Jan 2023


  • Monte Carlo simulation
  • Newton–Raphson method
  • communication science
  • discrete memoryless source
  • entropy
  • information theory
  • statistical moments
  • variance


