MURAL - Maynooth University Research Archive Library



    Guessing noise, not code-words


    Duffy, Ken R. and Li, Jiange and Médard, Muriel (2018) Guessing noise, not code-words. In: 2018 IEEE International Symposium on Information Theory (ISIT). IEEE, pp. 671-675. ISBN 9781538647813

    Abstract

    We introduce a new algorithm for Maximum Likelihood (ML) decoding for channels with memory. The algorithm is based on the principle that the receiver rank-orders noise sequences from most likely to least likely. The receiver subtracts these noise sequences from the received signal in that order, and the first instance that results in an element of the code-book is the ML decoding. In contrast to traditional approaches, this novel scheme has the desirable property that it becomes more efficient as the code-book rate increases. We establish that the algorithm is capacity achieving for randomly selected code-books. When the code-book rate is less than capacity, we identify asymptotic error exponents as the block length becomes large. When the code-book rate is beyond capacity, we identify asymptotic success exponents. We determine properties of the complexity of the scheme in terms of the number of computations the receiver must perform per block symbol. Worked examples are presented for binary memoryless and Markovian noise. These demonstrate that block lengths that offer a good complexity–rate tradeoff are typically smaller than the reciprocal of the bit error rate.
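    Below is a minimal, illustrative sketch of the noise-guessing idea described in the abstract, assuming a binary symmetric channel with crossover probability below 1/2, so that noise sequences of lower Hamming weight are more likely. The function and variable names (noise_sequences_by_likelihood, guess_noise_decode, codebook) are assumptions made for illustration, not the authors' implementation, and the sketch covers only the memoryless binary case.

        # Hedged sketch: enumerate noise patterns from most to least likely,
        # XOR each one off the received word, and return the first result that
        # lies in the code-book (the ML decoding under the assumed BSC model).
        from itertools import combinations

        def noise_sequences_by_likelihood(n):
            """Yield length-n binary noise patterns in decreasing likelihood,
            i.e. by increasing Hamming weight (assumes a memoryless BSC with
            crossover probability < 1/2)."""
            for weight in range(n + 1):
                for flips in combinations(range(n), weight):
                    pattern = [0] * n
                    for i in flips:
                        pattern[i] = 1
                    yield tuple(pattern)

        def guess_noise_decode(received, codebook):
            """Subtract (XOR, in the binary case) candidate noise sequences
            from the received word in likelihood order; return the first
            code-book member produced, along with the guessed noise."""
            code_set = set(codebook)
            for noise in noise_sequences_by_likelihood(len(received)):
                candidate = tuple(r ^ z for r, z in zip(received, noise))
                if candidate in code_set:
                    return candidate, noise

        # Toy usage: a two-word code-book and a received word with one bit flipped.
        codebook = [(0, 0, 0, 0), (1, 1, 1, 1)]
        decoded, noise = guess_noise_decode((1, 0, 1, 1), codebook)
        print(decoded, noise)  # (1, 1, 1, 1) (0, 1, 0, 0)

    For Markovian noise, as treated in the paper, only the ordering produced by the noise generator would change; the query loop itself stays the same.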

    Item Type: Book Section
    Additional Information: This work is in part supported by the National Science Foundation (NSF) under Grant No. 6932716. Cite as: K. R. Duffy, J. Li and M. Médard, "Guessing noise, not code-words," 2018 IEEE International Symposium on Information Theory (ISIT), Vail, CO, 2018, pp. 671-675, doi: 10.1109/ISIT.2018.8437648.
    Keywords: ML Decoding; Noise Guessing; Complexity Analysis; Error and Success Exponents; Decoding; Manganese; Receivers; Channel coding; Entropy; Source coding;
    Academic Unit: Faculty of Science and Engineering > Research Institutes > Hamilton Institute
    Item ID: 13337
    Identification Number: https://doi.org/10.1109/ISIT.2018.8437648
    Depositing User: Dr Ken Duffy
    Date Deposited: 30 Sep 2020 15:38
    Publisher: IEEE
    Refereed: Yes
    URI:
    Use Licence: This item is available under a Creative Commons Attribution Non Commercial Share Alike Licence (CC BY-NC-SA).
