On the Accurate Estimation of Information-Theoretic Quantities from Multi-Dimensional Sample Data

Manuel Álvarez Chaves, Hoshin V. Gupta, Uwe Ehret, Anneli Guthke

Research output: Contribution to journal › Article › peer-review

Abstract

Using information-theoretic quantities in practical applications with continuous data is often hindered by the fact that probability density functions need to be estimated in higher dimensions, which can become unreliable or even computationally infeasible. To make these useful quantities more accessible, alternative approaches such as binned frequencies using histograms and k-nearest neighbors (k-NN) have been proposed. However, a systematic comparison of the applicability of these methods has been lacking. We fill this gap by comparing kernel density estimation (KDE) with these two alternatives in carefully designed synthetic test cases. Specifically, we estimate three information-theoretic quantities from sample data: entropy, Kullback–Leibler divergence, and mutual information. As a reference, the results are compared to closed-form solutions or numerical integrals. We generate samples from distributions of various shapes in dimensions ranging from one to ten and evaluate the estimators’ performance as a function of sample size, distribution characteristics, and chosen hyperparameters. We further compare the required computation time and specific implementation challenges. Notably, k-NN estimation tends to outperform the other methods in terms of ease of implementation, computational efficiency, and estimation accuracy, especially when sufficient data are available. This study provides valuable insights into the strengths and limitations of the different estimation methods for information-theoretic quantities. It also highlights the importance of considering the characteristics of the data, as well as the targeted information-theoretic quantity, when selecting an appropriate estimation technique. These findings will assist scientists and practitioners in choosing the most suitable method for their specific application and available data. We have collected the compared estimation methods in a ready-to-use, open-source Python 3 toolbox and thereby hope to promote the use of information-theoretic quantities by researchers and practitioners to evaluate the information in data and models across disciplines.
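
To illustrate the k-NN approach that the abstract singles out, a minimal sketch of the classic Kozachenko–Leonenko k-NN entropy estimator using NumPy and SciPy might look as follows. This is not the published toolbox; the function name knn_entropy and the default k=4 are illustrative assumptions only.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def knn_entropy(samples, k=4):
        """Kozachenko-Leonenko k-NN estimate of differential entropy (nats).

        samples : (n, d) array of continuous observations
        k       : number of neighbors; trades bias against variance
        """
        x = np.asarray(samples, dtype=float)
        n, d = x.shape
        # Query k+1 neighbors because each point's nearest neighbor is itself;
        # assumes no duplicate points, so the k-th neighbor distance is > 0.
        dist, _ = cKDTree(x).query(x, k=k + 1)
        r_k = dist[:, -1]  # distance to the k-th true neighbor
        # Log-volume of the unit ball in d dimensions
        log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
        return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r_k))

As in the study's benchmarking against closed-form references, the estimate can be checked on a standard d-dimensional Gaussian, whose differential entropy is 0.5 * d * log(2 * pi * e) nats:

    rng = np.random.default_rng(42)
    d = 3
    sample = rng.standard_normal((10_000, d))
    print(knn_entropy(sample))                  # k-NN estimate
    print(0.5 * d * np.log(2 * np.pi * np.e))   # closed form for N(0, I_d)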

Original language: English (US)
Article number: 387
Journal: Entropy
Volume: 26
Issue number: 5
DOIs
State: Published - May 2024
Externally published: Yes

Keywords

  • Kullback–Leibler divergence
  • binning
  • data
  • entropy
  • information theory
  • k-NN
  • k-nearest neighbors
  • kernel density estimation
  • mutual information
  • non-parametric estimation
  • relative entropy

ASJC Scopus subject areas

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • General Physics and Astronomy
  • Electrical and Electronic Engineering
