Galligan, Kevin (2024) Universal Error Correction Decoding Algorithms. PhD thesis, National University of Ireland Maynooth.
Abstract
There is no perfect communication channel, and any communication necessarily
involves some level of noise. Attempting to hold a conversation across a crowded
room, for instance, will likely result in miscommunication due to background noise.
Channel coding, as a field, is concerned with reducing the rate of error in such noisy
communication channels. This can be achieved by encoding messages with channel
codes, which allow communication errors to be detected and corrected. The study
of channel coding was launched in 1948 [78], and it now underlies critical technology
such as the Internet, space communications, and storage of digital information [57].
In this thesis, we develop new algorithms and channel coding techniques based
on Guessing Random Additive Noise Decoding (GRAND), a recently introduced
family of decoders for channel codes. GRAND algorithms, unusually, can decode
any channel code of any length that has a moderate amount of redundancy.
Assuming that all messages are equally likely, they achieve maximum-likelihood
decoding, which is the best possible outcome of decoding a channel code. GRAND
challenges several assumptions of traditional channel coding and establishes a new decoding
paradigm in which the particular channel code being used does not matter,
allowing greater flexibility in the design of communication schemes. Because an
upper bound on GRAND's computational complexity grows exponentially with
the amount of redundancy in a code, however, it is impractical to directly decode arbitrary
channel codes that have a large amount of redundancy.
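
To make the guessing procedure concrete, the following is a minimal sketch of hard-decision GRAND for a binary linear code over a binary symmetric channel, written in Python: candidate noise patterns are queried in order of decreasing likelihood (here, increasing Hamming weight), and the first pattern whose removal yields a codebook member is returned. The function name, the use of a parity-check matrix H for the membership test, and the worked example are illustrative assumptions rather than the thesis's implementation.

import itertools
import numpy as np

def grand_decode(y, H, max_weight=None):
    # Hard-decision GRAND sketch (assumed interface). H is the (n-k) x n
    # parity-check matrix of the code; y is the received binary word.
    # Noise patterns are guessed by increasing Hamming weight, which is the
    # maximum-likelihood order for a binary symmetric channel with
    # crossover probability below one half.
    n = H.shape[1]
    if max_weight is None:
        max_weight = n
    for w in range(max_weight + 1):
        for positions in itertools.combinations(range(n), w):
            z = np.zeros(n, dtype=int)
            z[list(positions)] = 1
            c = y ^ z                        # remove the putative noise effect
            if not np.any(H @ c % 2):        # codebook membership: H c^T = 0 (mod 2)
                return c, z                  # first hit is the maximum-likelihood codeword
    return None, None                        # abandon decoding

# Example: the [7,4] Hamming code with a single bit flipped in the all-zero codeword.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
y = np.zeros(7, dtype=int)
y[2] = 1
c_hat, z_hat = grand_decode(y, H)            # recovers the all-zero codeword

The number of queries needed grows with the number of parity bits n - k, which is why this direct approach is limited to codes with a moderate amount of redundancy.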
The goal of this thesis is thus to explore whether GRAND can be used to decode such
high-redundancy codes, which are suitable for the noisiest channel environments.
To that end, we introduce and develop two iterative decoding algorithms, Iterative GRAND (IGRAND) and block turbo decoding with GRAND, for a powerful
class of channel codes known as product codes. Product codes are, in general,
high-redundancy codes formed from a concatenation of low- to moderate-redundancy
component codes. The key insight of the algorithms considered here
is that GRAND can decode product codes by decoding each of their component
codes in turn, circumventing the aforementioned complexity constraint.
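
The row-and-column structure that these algorithms exploit can be sketched as follows: a received product codeword is arranged as a matrix whose rows and columns should each be codewords of the component codes, and a component decoder (for instance, the GRAND sketch above) is applied alternately to every row and every column for a fixed number of iterations. This is a bare hard-decision sketch under assumed names; IGRAND and block turbo decoding with GRAND additionally manage which component decisions to accept and what reliability information to exchange between iterations.

import numpy as np

def iterative_product_decode(Y, decode_row, decode_col, iterations=4):
    # Illustrative iterative decoder for a two-dimensional product code.
    # Y is the received binary matrix; decode_row and decode_col map a
    # received component word to a decoded component codeword (returning
    # the input unchanged if decoding is abandoned).
    C = Y.copy()
    for _ in range(iterations):
        for i in range(C.shape[0]):          # decode every row
            C[i, :] = decode_row(C[i, :])
        for j in range(C.shape[1]):          # then decode every column
            C[:, j] = decode_col(C[:, j])
    return C

# Example usage with the GRAND sketch above as the component decoder:
#   def component(r):
#       c, _ = grand_decode(r, H)
#       return c if c is not None else r
#   C_hat = iterative_product_decode(Y, component, component)

Because each component code carries only a moderate number of parity bits, every component decoding stays within GRAND's practical complexity range even though the overall product code has high redundancy.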
Soft information indicates the reliability of a received message and is useful for
a wide range of applications, including error detection and turbo decoding. In
addition to the goal of decoding high-redundancy codes, this thesis also investigates
whether GRAND decoding can output accurate soft
information. We derive probabilistic soft output formulae for GRAND algorithms,
evaluate their accuracy, and explore their application to error detection.
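
As a rough indication of the kind of quantity involved, suppose the decoder queries noise patterns z_1, z_2, ... in decreasing order of probability and first finds a codebook member at query q. A Bayesian estimate of the probability that the returned codeword is correct weighs the chance that the true noise really was z_q against the chance that the true noise has not yet been queried and the hit was an incorrect codeword. Under the simplifying assumption that any unqueried string is a codeword with probability about 2^{-(n-k)}, this takes the approximate form below; it is an illustrative approximation, not the formula derived in the thesis.

\[
  P(\text{correct} \mid \text{codebook hit at query } q)
  \;\approx\;
  \frac{p_q}{\,p_q + \bigl(1 - \sum_{i=1}^{q} p_i\bigr)\, 2^{-(n-k)}\,},
  \qquad p_i = P(Z = z_i),
\]

where Z denotes the channel noise and n - k is the number of redundant bits in the [n, k] code. Estimates of this kind can be thresholded to flag unreliable decodings, which is the error-detection application explored in the thesis.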
Item Type: Thesis (PhD)
Keywords: Universal; Error; Correction; Decoding; Algorithms
Academic Unit: Faculty of Science and Engineering > Research Institutes > Hamilton Institute
Item ID: 18343
Depositing User: IR eTheses
Date Deposited: 02 Apr 2024 10:56
URI:
Use Licence: This item is available under a Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA).