Design of High Rate RCM-LDGM Codes
Issue Date: 
2019
Defense Date: 
29-Sep-2017
Publisher: 
Servicio de Publicaciones. Universidad de Navarra
Citation: 
GRANADA, Imanol. "Design of High Rate RCM-LDGM Codes". Crespo, P. Trabajo fin de Máster. Universidad de Navarra, 2017.
Abstract
This master's thesis studies the design of high rate RCM-LDGM codes and is divided into two parts. The first part proposes an EXIT chart analysis and a Bit Error Rate (BER) prediction procedure suitable for implementing high rate codes based on the parallel concatenation of a Rate Compatible Modulation (RCM) code with a Low Density Generator Matrix (LDGM) code. The decoder of a parallel RCM-LDGM code is based on the iterative Sum-Product algorithm, which exchanges information between variable nodes (VN) and the corresponding two types of check nodes: RCM-CN and LDGM-CN. To obtain good codes that achieve near-Shannon-limit performance, one is required to obtain BER versus SNR behaviors for different families of possible code design parameters. For large input block lengths, this can take a large amount of simulation time. To overcome this design drawback, the proposed EXIT-BER chart procedure predicts this BER versus SNR behavior in a much faster way, and consequently speeds up the design procedure. The second part studies two different strategies for transmitting sources with memory. The first strategy consists of exploiting the source statistics in the decoding process by attaching the factor graph of the source to the RCM-LDGM one and running the Sum-Product algorithm over the entire factor graph. The second strategy uses the Burrows-Wheeler Transform to convert the source with memory into several independent Discrete Memoryless binary Sources (DMS) and encodes them separately.
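To illustrate the second strategy, the Burrows-Wheeler Transform permutes a sequence so that symbols with similar preceding contexts are grouped together; the transformed output can then be segmented into runs that behave approximately like independent memoryless binary sources. A minimal Python sketch of the forward transform via sorted cyclic rotations (the function name and end-of-string marker are illustrative choices, not the thesis's implementation):

```python
def bwt(s: str, eos: str = "$") -> str:
    """Burrows-Wheeler Transform via sorted cyclic rotations.

    A unique end-of-string marker (assumed absent from s) is appended
    so the transform is invertible. Returns the last column of the
    sorted rotation matrix.
    """
    s = s + eos
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rotation[-1] for rotation in rotations)


# Example: similar characters cluster in the output.
print(bwt("banana"))  # → annb$aa
```

In practice the transform is applied blockwise; the clustering it produces is what allows each segment of the output to be modeled and encoded as a separate memoryless source.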

Files in This Item:
File: Granada_Imanol_MIT.pdf
Size: 743.25 kB
Format: Adobe PDF



Items in Dadun are protected by copyright, with all rights reserved, unless otherwise indicated.