Low Complexity Bit Reliability and Prediction Based Symbol Value Selection Decoding Algorithms for Non-Binary LDPC Codes

IEEE Access (2020)

Abstract
The main challenge for hardware implementation of non-binary (NB) LDPC decoding is the high computational complexity and large memory requirement. To address this challenge, five new low complexity, yet effective, decoding algorithms for NB LDPC codes are proposed in this paper. The proposed algorithms iteratively update the hard-decision received vector to search for a valid codeword in the vector space over the Galois field (GF). The selection criterion for the least reliable symbol positions combines information from the failed checks, reliability information derived from the Galois field structure, and the received channel soft information. Two methods are used to choose the correct value for a candidate symbol. The first predicts the error symbol as the Galois field element that maximizes an objective function. The second flips individual bits based on the reliability information obtained from the channel. Algorithms 1 and 2 flip a single symbol per iteration, whilst Algorithms 3, 4 and 5 flip multiple symbols in each iteration. The proposed voting-based Algorithms 1, 2 and 5 first shortlist the unreliable positions using a majority voting scheme and then choose the candidate symbol value from the set of symbols in GF(q) without violating the field order q. These methods reduce the decoding complexity in terms of both computation and memory. Results and analysis of these algorithms show an appealing tradeoff between computational complexity and bit error rate performance for NB LDPC codes.
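
As a rough illustration of the single-symbol-flipping idea summarized above (Algorithms 1 and 2), the sketch below implements one possible decoding loop over GF(4). It is not the authors' implementation: the helper names (GF4_MUL, syndrome, flip_decode) are invented for this example, and the objective function is simplified to minimizing the number of unsatisfied checks, whereas the paper's criterion also incorporates Galois field structure and channel soft information.

    # Minimal sketch of symbol-flipping decoding over GF(4); assumptions noted above.
    import numpy as np

    # GF(4) multiplication table for elements {0, 1, a, a+1} encoded as {0, 1, 2, 3},
    # built from a^2 = a + 1; addition in GF(2^m) is bitwise XOR of the encodings.
    GF4_MUL = np.array([[0, 0, 0, 0],
                        [0, 1, 2, 3],
                        [0, 2, 3, 1],
                        [0, 3, 1, 2]])

    def syndrome(H, y):
        """Compute s = H*y over GF(4); a check is satisfied when its entry is 0."""
        s = np.zeros(H.shape[0], dtype=int)
        for i in range(H.shape[0]):
            acc = 0
            for j in range(H.shape[1]):
                acc ^= GF4_MUL[H[i, j], y[j]]   # multiply in GF(4), then add (XOR)
            s[i] = acc
        return s

    def flip_decode(H, y, max_iter=20):
        """Iteratively flip the symbol voted least reliable by the failed checks."""
        y = y.copy()
        for _ in range(max_iter):
            s = syndrome(H, y)
            if not s.any():                      # all checks satisfied: valid codeword
                return y, True
            # Majority voting: every failed check votes for each symbol it touches.
            votes = ((s != 0)[:, None] * (H != 0)).sum(axis=0)
            j = int(np.argmax(votes))            # most-voted (least reliable) position
            # Predict the replacement value: try all q symbols at position j and keep
            # the one leaving the fewest unsatisfied checks (simplified objective).
            best_val, best_fails = y[j], np.count_nonzero(s)
            for v in range(4):
                trial = y.copy()
                trial[j] = v
                fails = np.count_nonzero(syndrome(H, trial))
                if fails < best_fails:
                    best_val, best_fails = v, fails
            y[j] = best_val
        return y, False

The multi-symbol variants (Algorithms 3, 4 and 5) would instead flip every position whose vote count exceeds a threshold in each iteration, and the paper's bit-level method replaces the symbol-value search with per-bit flips weighted by the channel reliabilities.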
Key words
Multiple vote, non-binary LDPC, iterative reliability decoding, symbol flipping, sum product algorithm, low complexity decoding