TY - JOUR
T1 - Designing Finite Alphabet Iterative Decoders of LDPC Codes Via Recurrent Quantized Neural Networks
AU - Xiao, Xin
AU - Vasic, Bane
AU - Tandon, Ravi
AU - Lin, Shu
N1 - Funding Information:
Manuscript received September 17, 2019; revised February 13, 2020; accepted March 26, 2020. Date of publication April 6, 2020; date of current version July 15, 2020. The work of X. Xiao and B. Vasić was funded by the NSF under grants NSF SaTC-1813401 and NSF CCF-1855879. The work of R. Tandon was supported in part by the 2018 Keysight Early Career Professor Award and by the NSF under grants CAREER 1651492 and CNS 1715947. The associate editor coordinating the review of this article and approving it for publication was A. Graell i Amat. (Corresponding author: Xin Xiao.) Xin Xiao, Bane Vasić, and Ravi Tandon are with the Department of Electrical and Computer Engineering, The University of Arizona, Tucson, AZ 85721 USA (e-mail: [email protected]; [email protected]; [email protected]).
Publisher Copyright:
© 1972-2012 IEEE.
PY - 2020/7
Y1 - 2020/7
N2 - In this paper, we propose a new approach to designing finite alphabet iterative decoders (FAIDs) for Low-Density Parity-Check (LDPC) codes over the binary symmetric channel (BSC) via recurrent quantized neural networks (RQNNs). We focus on the linear FAID class and use RQNNs to optimize the message update look-up tables by jointly training their message levels and RQNN parameters. Existing neural networks for channel coding work well over the Additive White Gaussian Noise Channel (AWGNC) but are inefficient over the BSC because the BSC provides only finitely many channel values as inputs to the networks. We propose the bit error rate (BER) as the loss function for training the RQNNs over the BSC. The low-precision activations in the RQNN and the quantization in the BER cause a critical issue: their gradients vanish almost everywhere, making classical backpropagation difficult to apply. We leverage straight-through estimators as surrogate gradients to tackle this issue and provide a joint training scheme. We show that the framework is flexible for various code lengths and column weights. Specifically, in the high-column-weight case, it automatically designs low-precision linear FAIDs that outperform floating-point belief propagation algorithms in the waterfall region while offering lower complexity and faster convergence.
AB - In this paper, we propose a new approach to designing finite alphabet iterative decoders (FAIDs) for Low-Density Parity-Check (LDPC) codes over the binary symmetric channel (BSC) via recurrent quantized neural networks (RQNNs). We focus on the linear FAID class and use RQNNs to optimize the message update look-up tables by jointly training their message levels and RQNN parameters. Existing neural networks for channel coding work well over the Additive White Gaussian Noise Channel (AWGNC) but are inefficient over the BSC because the BSC provides only finitely many channel values as inputs to the networks. We propose the bit error rate (BER) as the loss function for training the RQNNs over the BSC. The low-precision activations in the RQNN and the quantization in the BER cause a critical issue: their gradients vanish almost everywhere, making classical backpropagation difficult to apply. We leverage straight-through estimators as surrogate gradients to tackle this issue and provide a joint training scheme. We show that the framework is flexible for various code lengths and column weights. Specifically, in the high-column-weight case, it automatically designs low-precision linear FAIDs that outperform floating-point belief propagation algorithms in the waterfall region while offering lower complexity and faster convergence.
KW - Binary symmetric channel
KW - finite alphabet iterative decoders
KW - low-density parity-check codes
KW - quantized neural network
KW - straight-through estimator
UR - http://www.scopus.com/inward/record.url?scp=85088535637&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85088535637&partnerID=8YFLogxK
U2 - 10.1109/TCOMM.2020.2985678
DO - 10.1109/TCOMM.2020.2985678
M3 - Article
AN - SCOPUS:85088535637
SN - 0090-6778
VL - 68
SP - 3963
EP - 3974
JO - IEEE Transactions on Communications
JF - IEEE Transactions on Communications
IS - 7
M1 - 9057584
ER -