On the Accurate Estimation of Information-Theoretic Quantities from Multi-Dimensional Sample Data.
Álvarez Chaves, Manuel; Gupta, Hoshin V; Ehret, Uwe; Guthke, Anneli.
Affiliation
  • Álvarez Chaves M; Stuttgart Center for Simulation Science, Cluster of Excellence EXC 2075, University of Stuttgart, 70569 Stuttgart, Germany.
  • Gupta HV; Hydrology and Atmospheric Sciences, The University of Arizona, Tucson, AZ 85721, USA.
  • Ehret U; Institute of Water and River Basin Management, Karlsruhe Institute of Technology (KIT), 76131 Karlsruhe, Germany.
  • Guthke A; Stuttgart Center for Simulation Science, Cluster of Excellence EXC 2075, University of Stuttgart, 70569 Stuttgart, Germany.
Entropy (Basel); 26(5), 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38785636
ABSTRACT
Using information-theoretic quantities in practical applications with continuous data is often hindered by the fact that probability density functions need to be estimated in higher dimensions, which can become unreliable or even computationally infeasible. To make these useful quantities more accessible, alternative approaches such as binned frequencies using histograms and k-nearest neighbors (k-NN) have been proposed. However, a systematic comparison of the applicability of these methods has been lacking. We fill this gap by comparing kernel-density-based estimation (KDE) with these two alternatives in carefully designed synthetic test cases. Specifically, we estimate the information-theoretic quantities entropy, Kullback-Leibler divergence, and mutual information from sample data. As a reference, the results are compared to closed-form solutions or numerical integrals. We generate samples from distributions of various shapes in dimensions ranging from one to ten. We evaluate the estimators' performance as a function of sample size, distribution characteristics, and chosen hyperparameters, and we further compare the required computation time and specific implementation challenges. Notably, k-NN estimation tends to outperform the other methods, considering algorithmic implementation, computational efficiency, and estimation accuracy, especially with sufficient data. This study provides valuable insights into the strengths and limitations of the different estimation methods for information-theoretic quantities. It also highlights the importance of considering the characteristics of the data, as well as the targeted information-theoretic quantity, when selecting an appropriate estimation technique. These findings will assist scientists and practitioners in choosing the most suitable method for their specific application and available data. We have collected the compared estimation methods in a ready-to-use, open-source Python 3 toolbox and thereby hope to promote the use of information-theoretic quantities by researchers and practitioners to evaluate the information in data and models across various disciplines.
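
For readers unfamiliar with the k-NN approach highlighted in the abstract, the following is a minimal sketch of the classical Kozachenko-Leonenko k-NN entropy estimator in Python using NumPy and SciPy. It is illustrative only and does not reproduce the authors' toolbox or its API; the function name knn_entropy and the default k=3 are assumptions made here for the example.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    # Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)
    # for an (n, d) array of continuous samples. Illustrative sketch only,
    # not the authors' implementation.
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor
    # (column 0 of the query result is the point itself, distance 0).
    tree = cKDTree(x)
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit ball under the Euclidean norm.
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

# Quick sanity check against the closed-form entropy of a 3-D standard normal,
# 0.5 * d * log(2 * pi * e) ~= 4.257 nats.
rng = np.random.default_rng(0)
print(knn_entropy(rng.normal(size=(10_000, 3))))

The estimator relates local sample spacing to density rather than estimating a density function explicitly, which is why such k-NN methods remain tractable in the higher-dimensional settings the abstract discusses.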
Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: Entropy (Basel) Year: 2024 Document type: Article Country of affiliation: Germany