TY - GEN
T1 - Strong data processing inequality in neural networks with noisy neurons and its implications
AU - Zhou, Chuteng
AU - Zhuang, Quntao
AU - Mattina, Matthew
AU - Whatmough, Paul N.
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/12
Y1 - 2021/7/12
N2 - Neural networks have gained importance as machine learning models that achieve state-of-the-art performance on large-scale image classification, object detection, and natural language processing tasks. In this paper, we consider noisy binary neural networks, where each neuron has a non-zero probability of producing an incorrect output. These noisy models may arise from biological, physical, and electronic contexts and constitute an important class of models relevant to the physical world. Intuitively, the number of neurons in such systems has to grow to compensate for the noise while maintaining the same level of expressive power and computational reliability. Our key finding is a lower bound on the required number of neurons in noisy neural networks, which is the first of its kind. To prove this lower bound, we take an information-theoretic approach and obtain a strong data processing inequality (SDPI), which not only generalizes the Evans-Schulman results for binary channels to general channels but also drastically improves tightness when applied to estimate end-to-end information contraction. Applying the SDPI to noisy binary neural networks, we obtain our key lower bound and investigate its implications for network depth-width trade-offs; our results suggest a depth-width trade-off for noisy neural networks that is very different from the established understanding of noiseless neural networks. This paper offers a new understanding of noisy information processing systems through the lens of information theory.
AB - Neural networks have gained importance as machine learning models that achieve state-of-the-art performance on large-scale image classification, object detection, and natural language processing tasks. In this paper, we consider noisy binary neural networks, where each neuron has a non-zero probability of producing an incorrect output. These noisy models may arise from biological, physical, and electronic contexts and constitute an important class of models relevant to the physical world. Intuitively, the number of neurons in such systems has to grow to compensate for the noise while maintaining the same level of expressive power and computational reliability. Our key finding is a lower bound on the required number of neurons in noisy neural networks, which is the first of its kind. To prove this lower bound, we take an information-theoretic approach and obtain a strong data processing inequality (SDPI), which not only generalizes the Evans-Schulman results for binary channels to general channels but also drastically improves tightness when applied to estimate end-to-end information contraction. Applying the SDPI to noisy binary neural networks, we obtain our key lower bound and investigate its implications for network depth-width trade-offs; our results suggest a depth-width trade-off for noisy neural networks that is very different from the established understanding of noiseless neural networks. This paper offers a new understanding of noisy information processing systems through the lens of information theory.
UR - http://www.scopus.com/inward/record.url?scp=85115063197&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85115063197&partnerID=8YFLogxK
U2 - 10.1109/ISIT45174.2021.9517787
DO - 10.1109/ISIT45174.2021.9517787
M3 - Conference contribution
AN - SCOPUS:85115063197
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 1170
EP - 1175
BT - 2021 IEEE International Symposium on Information Theory, ISIT 2021 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE International Symposium on Information Theory, ISIT 2021
Y2 - 12 July 2021 through 20 July 2021
ER -