Asymptotic Normality for Plug-In Estimators of Generalized Shannon's Entropy.
Zhang, Jialin; Shi, Jingyi.
Affiliation
  • Zhang J; Department of Mathematics and Statistics, Mississippi State University, Mississippi State, MS 39762, USA.
  • Shi J; Department of Mathematics and Statistics, Mississippi State University, Mississippi State, MS 39762, USA.
Entropy (Basel) ; 24(5)2022 May 12.
Article in En | MEDLINE | ID: mdl-35626567
ABSTRACT
Shannon's entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is finitely defined only for distributions with fast decaying tails on a countable alphabet. The unboundedness of Shannon's entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon's entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML method packages, is one of the most popular approaches to estimating Shannon's entropy. The asymptotic distribution of the plug-in estimator of Shannon's entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon's entropy on countable alphabets. The developed asymptotic properties require no assumptions on the original distribution. The proposed asymptotic properties allow for interval estimation and statistical tests with generalized Shannon's entropy.
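For readers unfamiliar with the plug-in estimator mentioned above, a minimal sketch follows. It implements only the standard plug-in (maximum-likelihood) estimator of ordinary Shannon's entropy, substituting empirical frequencies for the unknown probabilities; Zhang's (2020) generalized entropy and the asymptotic results of this paper are not reproduced here.

```python
# Plug-in estimator of Shannon's entropy: replace each unknown
# probability p_k with the empirical frequency n_k / n and evaluate
# H = -sum_k p_k * log(p_k) at those frequencies.
from collections import Counter
from math import log

def plugin_entropy(sample):
    """Plug-in estimate of Shannon's entropy (in nats) from an
    i.i.d. sample drawn on a countable alphabet."""
    n = len(sample)
    counts = Counter(sample)
    # Terms with zero observed count contribute nothing, matching
    # the convention 0 * log(0) = 0.
    return -sum((c / n) * log(c / n) for c in counts.values())

# Example: a balanced two-letter sample gives exactly log(2) nats.
print(plugin_entropy(["H", "T", "H", "T"]))  # log(2) ≈ 0.6931
```

Note that this estimator is known to be biased downward in finite samples; the asymptotic normality results discussed in the abstract concern its large-sample distribution.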
Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Entropy (Basel) Year: 2022 Document type: Article Affiliation country: United States