Abstract
To quantify the optimum performance for classification tasks, the Shannon mutual information is a natural information-theoretic metric, as it is directly related to the probability of error. The data produced by many imaging systems can be modeled by mixture distributions. The mutual information between mixture data and the class label has neither an analytical expression nor an efficient computational algorithm. We introduce a variational upper bound, a lower bound, and three approximations, all employing pair-wise divergences between mixture components. We compare the new bounds and approximations with Monte Carlo stochastic sampling and with bounds derived from entropy bounds. Finally, we evaluate the performance of the bounds and approximations through numerical simulations.
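The sketch below illustrates only the Monte Carlo baseline mentioned in the abstract, not the paper's variational bounds or pairwise-divergence approximations: it estimates the mutual information I(X;C) between mixture-distributed data X and a class label C as the sample mean of log p(x|c) - log p(x). The two-class, one-dimensional Gaussian-mixture setup, the priors, and all parameter values are hypothetical choices for illustration, assuming NumPy and SciPy are available.

```python
# Minimal sketch (assumed setup, not the paper's method): Monte Carlo estimate of
# I(X;C) for hypothetical Gaussian-mixture class-conditional densities p(x|c).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical two-class problem; each p(x|c) is a 1-D Gaussian mixture
# specified by (weights, means, standard deviations).
priors = np.array([0.5, 0.5])
mixtures = [
    (np.array([0.6, 0.4]), np.array([-1.0, 1.5]), np.array([0.7, 0.5])),  # class 0
    (np.array([0.5, 0.5]), np.array([0.5, 3.0]), np.array([0.6, 0.8])),   # class 1
]

def class_conditional_pdf(x, c):
    """Evaluate the mixture density p(x|c) pointwise."""
    w, mu, sigma = mixtures[c]
    return np.sum(w * norm.pdf(x[:, None], loc=mu, scale=sigma), axis=1)

def sample_class_conditional(c, n):
    """Draw n samples from the mixture p(x|c)."""
    w, mu, sigma = mixtures[c]
    comp = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[comp], sigma[comp])

def mc_mutual_information(n_samples=200_000):
    """Monte Carlo estimate of I(X;C) = E[log p(x|c) - log p(x)] in nats."""
    c = rng.choice(len(priors), size=n_samples, p=priors)
    x = np.empty(n_samples)
    cond = np.empty(n_samples)                 # p(x|c) at each sample
    for k in range(len(priors)):
        idx = (c == k)
        x[idx] = sample_class_conditional(k, idx.sum())
        cond[idx] = class_conditional_pdf(x[idx], k)
    # Marginal p(x) = sum_c p(c) p(x|c), evaluated at every sample.
    marg = sum(priors[k] * class_conditional_pdf(x, k) for k in range(len(priors)))
    return np.mean(np.log(cond) - np.log(marg))

if __name__ == "__main__":
    mi = mc_mutual_information()
    h_c = -np.sum(priors * np.log(priors))     # I(X;C) is always bounded by H(C)
    print(f"MC estimate of I(X;C): {mi:.4f} nats (H(C) = {h_c:.4f} nats)")
```

The printed class-label entropy H(C) serves as a simple sanity check, since I(X;C) can never exceed it; the paper's bounds and approximations would be compared against this kind of Monte Carlo reference.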
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1160-1171 |
| Number of pages | 12 |
| Journal | Journal of the Optical Society of America A: Optics and Image Science, and Vision |
| Volume | 39 |
| Issue number | 7 |
| DOIs | |
| State | Published - Jul 2022 |
ASJC Scopus subject areas
- Electronic, Optical and Magnetic Materials
- Atomic and Molecular Physics, and Optics
- Computer Vision and Pattern Recognition