TY - GEN
T1 - Thinning, photonic beamsplitting, and a general discrete entropy power inequality
AU - Guha, Saikat
AU - Shapiro, Jeffrey H.
AU - Garcia-Patron Sanchez, Raul
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/10
Y1 - 2016/8/10
N2 - Many partially successful attempts have been made to find the most natural discrete-variable version of Shannon's entropy power inequality (EPI). We develop an axiomatic framework from which we deduce the natural form of a discrete-variable EPI and an associated entropic monotonicity in a discrete-variable central limit theorem. In this discrete EPI, the geometric distribution, which has the maximum entropy among all discrete distributions with a given mean, assumes a role analogous to the Gaussian distribution in Shannon's EPI. The entropy power of X is defined as the mean of a geometric random variable with entropy H(X). The crux of our construction is a discrete-variable version of Lieb's scaled addition X ⊞η Y of two random variables X and Y with η ∈ (0, 1). We discuss the relationship of our discrete EPI with recent work of Yu and Johnson, who developed an EPI for a restricted class of random variables that have ultra-log-concave (ULC) distributions. Even though we leave open the proof of the aforesaid natural form of the discrete EPI, we show that this discrete EPI holds true for variables with arbitrary discrete distributions when the entropy power is redefined as e^H(X), in analogy with the continuous version. Finally, we show that our conjectured discrete EPI is a special case of the yet-unproven Entropy Photon-number Inequality (EPnI), which assumes a role analogous to Shannon's EPI in capacity proofs for Gaussian bosonic (quantum) channels.
AB - Many partially successful attempts have been made to find the most natural discrete-variable version of Shannon's entropy power inequality (EPI). We develop an axiomatic framework from which we deduce the natural form of a discrete-variable EPI and an associated entropic monotonicity in a discrete-variable central limit theorem. In this discrete EPI, the geometric distribution, which has the maximum entropy among all discrete distributions with a given mean, assumes a role analogous to the Gaussian distribution in Shannon's EPI. The entropy power of X is defined as the mean of a geometric random variable with entropy H(X). The crux of our construction is a discrete-variable version of Lieb's scaled addition X ⊞η Y of two random variables X and Y with η ∈ (0, 1). We discuss the relationship of our discrete EPI with recent work of Yu and Johnson, who developed an EPI for a restricted class of random variables that have ultra-log-concave (ULC) distributions. Even though we leave open the proof of the aforesaid natural form of the discrete EPI, we show that this discrete EPI holds true for variables with arbitrary discrete distributions when the entropy power is redefined as e^H(X), in analogy with the continuous version. Finally, we show that our conjectured discrete EPI is a special case of the yet-unproven Entropy Photon-number Inequality (EPnI), which assumes a role analogous to Shannon's EPI in capacity proofs for Gaussian bosonic (quantum) channels.
UR - http://www.scopus.com/inward/record.url?scp=84985920087&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84985920087&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2016.7541390
DO - 10.1109/ISIT.2016.7541390
M3 - Conference contribution
AN - SCOPUS:84985920087
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 705
EP - 709
BT - Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE International Symposium on Information Theory, ISIT 2016
Y2 - 10 July 2016 through 15 July 2016
ER -