Bounds on mutual information of mixture data for classification tasks

Yijun Ding, Amit Ashok

Research output: Contribution to journal › Article › peer-review


To quantify the optimum performance for classification tasks, Shannon mutual information is a natural information-theoretic metric, as it is directly related to the probability of error. The data produced by many imaging systems can be modeled by mixture distributions. The mutual information between mixture data and the class label has neither an analytical expression nor an efficient computational algorithm. We introduce a variational upper bound, a lower bound, and three approximations, all employing pairwise divergences between mixture components. We compare the new bounds and approximations with Monte Carlo stochastic sampling and with bounds derived from entropy bounds. Finally, we evaluate the performance of the bounds and approximations through numerical simulations.
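The Monte Carlo baseline mentioned in the abstract can be sketched for a toy problem. The example below is not the authors' implementation; it is a minimal illustration, assuming a hypothetical two-class setup with 1-D Gaussian class conditionals and equal priors, where I(X;C) = E[log p(x|c) − log p(x)] is estimated by sampling and is bounded above by the label entropy H(C) = log 2 nats:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical example: two classes with p(x|c) = N(mu_c, 1), equal priors.
mus = np.array([0.0, 2.0])
priors = np.array([0.5, 0.5])

n = 200_000
c = rng.choice(2, size=n, p=priors)   # sample class labels from the prior
x = rng.normal(mus[c], 1.0)           # sample data from the class conditional

# Class-conditional density and marginal (mixture) density at each sample.
p_x_given_c = norm.pdf(x, loc=mus[c], scale=1.0)
p_x = priors[0] * norm.pdf(x, mus[0]) + priors[1] * norm.pdf(x, mus[1])

# Monte Carlo estimate of I(X;C) in nats.
mi_mc = np.mean(np.log(p_x_given_c) - np.log(p_x))
print(f"Monte Carlo I(X;C) estimate: {mi_mc:.4f} nats (upper bound log 2 = {np.log(2):.4f})")
```

The variance of this estimator shrinks only as 1/n, which is why closed-form bounds built from pairwise divergences between components, as studied in the paper, are attractive when many evaluations are needed.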

Original language: English (US)
Pages (from-to): 1160-1171
Number of pages: 12
Journal: Journal of the Optical Society of America A: Optics and Image Science, and Vision
Issue number: 7
State: Published - Jul 2022

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Atomic and Molecular Physics, and Optics
  • Computer Vision and Pattern Recognition


