Click on the paper title for full-text access.

Journal papers

Title: Constellations Cross Circular auto-Correlation C4-sequences (June 2024)
Authors: Emmanuel Boutillon

Abstract: This paper introduces a novel type of sequences called C4-sequences. C4-sequences share similar optimal autocorrelation properties with Zadoff-Chu sequences. However, C4-sequences offer the additional advantage of also being optimal (in the sense of minimal Euclidean distance between sequences) for four truncation lengths, providing flexibility in adapting to different channel conditions without compromising performance. Moreover, unlike Zadoff-Chu sequences, the points of a constellation associated with a C4-sequence are not limited to the unit circle. This opens up possibilities for achieving shaping gain, leading to enhanced spectral efficiency. By combining a truncated C4-sequence modulation as an inner code with a fixed-rate non-binary outer code, flexible and performant rate-adaptive communication systems can also be achieved. Finally, the notion of C4-sequences can be generalized.
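
As a point of reference for the autocorrelation property mentioned above, the following sketch numerically checks that a Zadoff-Chu sequence has constant amplitude and an ideal (impulse-like) periodic autocorrelation. It does not construct C4-sequences themselves, and the length and root used are arbitrary illustrative choices.

```python
# Minimal sketch (not from the paper): check the two Zadoff-Chu properties
# referenced in the abstract -- constant amplitude and ideal periodic
# autocorrelation. N and u are illustrative choices with gcd(u, N) = 1.
import numpy as np

def zadoff_chu(u, N):
    """Zadoff-Chu sequence of odd length N with root u."""
    n = np.arange(N)
    return np.exp(-1j * np.pi * u * n * (n + 1) / N)

def circular_autocorrelation(x):
    """Periodic (cyclic) autocorrelation at every lag, computed via the FFT."""
    X = np.fft.fft(x)
    return np.fft.ifft(X * np.conj(X))

N, u = 63, 5
x = zadoff_chu(u, N)
r = circular_autocorrelation(x)

print(np.allclose(np.abs(x), 1.0))        # all points on the unit circle
print(np.isclose(np.abs(r[0]), N))        # autocorrelation peak at lag 0
print(np.max(np.abs(r[1:])) < 1e-9)       # zero at every nonzero cyclic lag
```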

To appear in IEEE Transactions on Communications.

Title: Decoding Short LDPC Codes via BP-RNN Diversity and Reliability-Based Post-Processing (Nov 2022)
Authors: Joachim Rosseel, Valérian Mannoni, Inbar Fijalkow, and Valentin Savin

Abstract: This paper investigates decoder diversity architectures for short low-density parity-check (LDPC) codes, based on recurrent neural network (RNN) models of the belief-propagation (BP) algorithm. We propose a new approach to achieve decoder diversity in the waterfall region, by specializing BP-RNN decoders to specific classes of errors, with absorbing set support. We further combine our approach with an ordered statistics decoding (OSD) post-processing step, which effectively leverages the bit-error rate optimization resulting from the use of the binary cross-entropy loss function. We show that a single specialized BP-RNN decoder combines more effectively with the OSD post-processing step than the standard BP decoder does. Moreover, combining OSD post-processing with the diversity brought by the use of multiple BP-RNN decoders provides an efficient way to bridge the gap to maximum likelihood decoding.
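
The OSD step referred to here and in several papers below works by re-encoding from the most reliable positions. The sketch below is a generic order-0 OSD pass, not code from any of the papers: positions are ranked by reliability, a most-reliable basis is extracted from the generator matrix by Gaussian elimination over GF(2), and the hard decisions on that basis are re-encoded into a candidate codeword. Higher orders additionally test low-weight error patterns on the basis; the LLR sign convention and the tiny example code are assumptions made for illustration.

```python
# Generic order-0 OSD sketch (illustrative only): re-encode from the most
# reliable independent positions. LLR convention: positive favours bit 0.
import numpy as np

def osd0(llr, G):
    """llr: soft values per bit; G: k x n generator matrix over GF(2)."""
    llr = np.asarray(llr, dtype=float)
    G = np.asarray(G, dtype=int) % 2
    k, n = G.shape
    perm = np.argsort(-np.abs(llr))            # most reliable positions first
    Gp = G[:, perm].copy()
    llr_p = llr[perm]

    # Gaussian elimination over GF(2): pick k independent columns (the most
    # reliable basis) scanning from the reliable side, and diagonalize on them.
    pivots, row = [], 0
    for col in range(n):
        if row == k:
            break
        nz = np.nonzero(Gp[row:, col])[0]
        if nz.size == 0:
            continue
        pr = nz[0] + row
        Gp[[row, pr]] = Gp[[pr, row]]          # move the pivot row up
        for r in range(k):                     # clear the pivot column elsewhere
            if r != row and Gp[r, col]:
                Gp[r] ^= Gp[row]
        pivots.append(col)
        row += 1

    u = (llr_p[pivots] < 0).astype(int)        # hard decisions on the basis
    cand_p = (u @ Gp) % 2                      # re-encode from the basis
    cand = np.empty(n, dtype=int)
    cand[perm] = cand_p                        # undo the reliability ordering
    return cand

# Tiny illustrative example (hypothetical 2 x 4 code):
G = np.array([[1, 0, 1, 1],
              [0, 1, 0, 1]])
print(osd0(np.array([2.0, -1.5, 0.2, -0.3]), G))   # a valid codeword, here [0 1 0 1]
```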

Published in IEEE Transactions on Communications, vol. 70, no. 12, Dec. 2022.

A companion dataset is available on this page.

Conference papers

Title: Extrinsic Versus APP Information Feedback in Turbo VEP MU-MIMO Receivers: Optimization Via Deep Unfolding (Apr. 2024)
Authors: Arthur Michon, Charly Poulliat, Adam Mekhiche, and Antonio Maria Cipriano

Abstract: The joint use of Soft-Input Soft-Output (SISO) detectors and channel decoders in an iterative manner has received growing attention for Multi-User Multiple-Input Multiple-Output (MU-MIMO) transmission schemes in recent years, as it has been shown to operate close to fundamental limits, at least asymptotically. Among SISO detectors, message-passing algorithms such as Vector Expectation Propagation (VEP) have been shown to significantly outperform linear detectors such as the Linear Minimum Mean Square Error (LMMSE) detector. Aside from their higher computational complexity, turbo VEP receivers rely on several hyper-parameters that can be optimized. In this context, we propose a joint optimization, through deep unfolding, of the hyper-parameters that naturally arise in this kind of doubly iterative turbo VEP receiver. One of the difficulties arising for this type of receiver is when and how to choose between an extrinsic or an A Posteriori (APP) information feedback within the turbo receiver. The optimal selection is shown here to depend on the type of the considered SISO components. By properly choosing the hyper-parameters to be optimized, we show that deep unfolding can naturally optimize the trade-off between extrinsic and APP information feedback and bring performance gains.
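
To make the deep-unfolding idea concrete, here is a deliberately simplified sketch, not the paper's turbo VEP receiver: a toy gradient-descent MIMO detector is unrolled into a fixed number of layers, and one step-size (damping-like) hyper-parameter per layer is learned end-to-end. Dimensions, loss, and training setup are illustrative assumptions.

```python
# Illustrative deep-unfolding sketch (a toy detector, not the paper's turbo
# VEP receiver): each unrolled iteration becomes a layer with one learnable
# hyper-parameter, here a step size for gradient-descent detection of y = Hx + n.
import torch

torch.manual_seed(0)
n_tx, n_rx, n_layers = 4, 8, 6                    # made-up dimensions

class UnfoldedDetector(torch.nn.Module):
    def __init__(self, n_layers):
        super().__init__()
        # One learnable hyper-parameter per unfolded iteration.
        self.step = torch.nn.Parameter(0.05 * torch.ones(n_layers))

    def forward(self, y, H):
        x = torch.zeros(H.shape[-1])
        for mu in self.step:
            x = x + mu * H.T @ (y - H @ x)        # one damped iteration = one layer
        return x

model = UnfoldedDetector(n_layers)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(200):                              # end-to-end training
    H = torch.randn(n_rx, n_tx) / n_rx ** 0.5
    x_true = torch.sign(torch.randn(n_tx))        # BPSK symbols for simplicity
    y = H @ x_true + 0.1 * torch.randn(n_rx)
    loss = torch.mean((model(y, H) - x_true) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(model.step.detach())                        # learned per-iteration hyper-parameters
```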

Published in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2024, Seoul, Korea, 14-19 Apr. 2024 (link to full paper in IEEE Xplore)

Title: C4-Sequences: Rate Adaptive Coded Modulation for Few Bits Message (Sept. 2023)
Authors: Emmanuel Boutillon

Abstract: This paper introduces a novel type of sequences called C4-sequences. C4-sequences share similar optimal autocorrelation properties with Zadoff-Chu sequences. However, C4-sequences offer the additional advantage of also being optimal (in the sense of minimal Euclidean distance between sequences) for several truncation lengths, providing flexibility in adapting to different channel conditions without compromising performance. Moreover, unlike Zadoff-Chu sequences, the points of a constellation associated with a C4-sequence are not limited to the unit circle. This opens up possibilities for achieving shaping gain, leading to enhanced spectral efficiency. By combining a truncated C4-sequence modulation as an inner code with a fixed-rate non-binary outer code, flexible and performant rate-adaptive communication systems can be achieved.

Published in 12th International Symposium on Topics in Coding (ISTC), Brest, France, 4-8 Sept. 2023 (link to full paper in IEEE Xplore)

Title: Rate-Adaptive Cyclic Complex Spreading Sequence for Non-Binary Decoders (Sept. 2023)
Authors: Cédric Marchand, Alexandru-Liviu Olteanu, and Emmanuel Boutillon

Abstract: This paper presents rate-adaptive spreading sequences for Non-Binary (NB) decoders. The sequence's rate-adaptive property enables matching the data rate to the channel conditions. It is obtained by truncating codewords of a cyclic code shift keying (CCSK) sequence while keeping the NB decoder optimized at a fixed code rate. Each chip of the sequence is mapped to a q-ary constellation. The paper presents q-PSK sequence construction processes giving bi-orthogonal codewords at different levels of puncturing. Considering a 256-QAM CCSK modulation and a 120-bit payload, the spectral efficiency ranges from 0.02 to 6 bits/s/Hz for an SNR range from -15.5 dB to 20 dB, with better FER performance than 5G for all SNR values.
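
As background on the modulation being truncated, the sketch below illustrates generic CCSK spreading (a q-ary symbol selects a cyclic shift of a base sequence) and correlation detection under truncation. The random QPSK base sequence, the truncation length, and the noise level are illustrative assumptions, not the bi-orthogonal construction proposed in the paper.

```python
# Generic CCSK sketch (illustrative only): symbol s is sent as the base
# sequence cyclically shifted by s, truncated to L chips for rate adaptation;
# the receiver correlates against all q truncated shifts.
import numpy as np

rng = np.random.default_rng(0)
q = 64                                                    # alphabet size / full sequence length
base = np.exp(1j * np.pi / 2 * rng.integers(0, 4, q))     # random QPSK chips, for illustration

def ccsk_modulate(symbol, L):
    """Map symbol in {0, ..., q-1} to the first L chips of the shifted base sequence."""
    return np.roll(base, symbol)[:L]

def ccsk_detect(rx, L):
    """Pick the shift whose truncated reference correlates best with the received chips."""
    scores = [np.abs(np.vdot(np.roll(base, s)[:L], rx)) for s in range(q)]
    return int(np.argmax(scores))

L, s = 16, 37                                             # truncation length and transmitted symbol
noise = 0.3 * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
rx = ccsk_modulate(s, L) + noise
print(ccsk_detect(rx, L) == s)                            # matched-filter detection recovers the symbol
```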

Published in 12th International Symposium on Topics in Coding (ISTC), Brest, France, 4-8 Sept. 2023 (link to full paper in IEEE Xplore)

Title: Sets of complementary LLRs to improve OSD post-processing of BP decoding (Sept. 2023)
Authors: Joachim Rosseel, Valérian Mannoni, Valentin Savin, and Inbar Fijalkow

Abstract: This article deals with Ordered Statistics Decoding (OSD) applied to the soft outputs of the Belief Propagation (BP) algorithm. We first model the weighted sum of the a posteriori LLRs across BP decoding iterations as a single neuron. The neuron is then trained with the focal loss to compute, for each BP decoding failure, a set of accumulated Log Likelihood Ratios (LLRs) suited for OSD post-processing. We then propose a recursive procedure for selecting sets of LLRs for multiple OSD post-processing. This selection is carried out from the sets of a posteriori LLRs calculated at each BP iteration and from the accumulated LLRs optimized for the OSD, based on their joint probabilities of failure with OSD post-processing. An OSD is then applied to each set of LLRs belonging to the selection. In addition, we propose to reduce the OSD post-processing decoding complexity without significantly degrading its performance. Our results show that this new decoding method provides an effective way to bridge the gap to maximum likelihood decoding for short and long Low Density Parity Check (LDPC) codes.
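
A minimal sketch of the first ingredient, under assumed shapes and an assumed LLR sign convention (not the paper's exact training setup): a single neuron holds one weight per BP iteration, forms the accumulated LLRs as a weighted sum of the per-iteration a posteriori LLRs, and is trained with the focal loss.

```python
# Illustrative sketch: one weight per BP iteration, trained with the focal loss.
# Shapes, the focusing parameter, and the synthetic data are assumptions.
import torch

gamma = 2.0                       # focal-loss focusing parameter (assumed value)
T, N = 20, 128                    # BP iterations and code length (illustrative)

class LLRAccumulator(torch.nn.Module):
    def __init__(self, n_iters):
        super().__init__()
        self.w = torch.nn.Parameter(torch.ones(n_iters) / n_iters)

    def forward(self, llrs):      # llrs: (batch, T, N), sign convention log P(b=1)/P(b=0)
        return torch.einsum('t,btn->bn', self.w, llrs)

def focal_loss(acc_llr, bits):
    """Focal loss of the accumulated LLRs against the true bits in {0, 1}."""
    p = torch.sigmoid(acc_llr)                       # P(b = 1)
    p_true = torch.where(bits == 1, p, 1.0 - p)      # probability assigned to the true bit
    return torch.mean(-(1.0 - p_true) ** gamma * torch.log(p_true + 1e-12))

model = LLRAccumulator(T)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

# The real training set would be per-iteration LLRs collected on BP decoding
# failures; random tensors are used here only to show the shapes involved.
llrs = torch.randn(8, T, N)
bits = torch.randint(0, 2, (8, N))
loss = focal_loss(model(llrs), bits)
loss.backward()
opt.step()
```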

Published in 12th International Symposium on Topics in Coding (ISTC), Brest, France, 4-8 Sept. 2023 (link to full paper in IEEE Xplore)

Title: Post-traitement OSD pour le décodage BP basé sur des ensembles de LLRs complémentaires (Aug. 2023)
Authors: Joachim Rosseel, Valérian Mannoni, Valentin Savin, and Inbar Fijalkow

Abstract: This article deals with OSD (Ordered Statistics Decoding) post-processing applied to the soft outputs of the BP (Belief Propagation) algorithm. Our approach first models the weighted sum of the a posteriori LLRs (Log Likelihood Ratios) computed over the BP iterations as a single neuron. This neuron is then trained with the focal loss so as to obtain, for each BP decoding failure, a set of accumulated LLRs suited to OSD post-processing. We then propose a recursive procedure for selecting sets of LLRs for multiple OSD post-processing. This selection is carried out from the sets of a posteriori LLRs computed at each BP iteration, as well as from the accumulated LLRs optimized for the OSD, according to their joint probabilities of failure with OSD post-processing. An OSD is then applied to each set of LLRs in the selection. Our results show that this new decoding method provides an effective way to reach the performance of maximum likelihood decoding for short LDPC (Low Density Parity Check) codes.

Published in the 29th edition of the GRETSI colloquium, Grenoble, France, 28 Aug. - 1 Sep. 2023.

Title: Décodeurs BP-RNNs mis en parallèle et spécialisés dans le décodage de codes LDPC courts (Sept. 2022)
Authors: Joachim Rosseel, Inbar Fijalkow, Valentin Savin, and Valérian Mannoni

Abstract: This article deals with the decoding of short LDPC (Low Density Parity Check) codes with the BP (Belief Propagation) algorithm. Since the latter can be modelled as a recurrent neural network (BP-RNN), we introduce a new training method whose objective is to specialize the BP-RNN decoder on error events sharing similar structural properties. This approach is then combined with a new decoding architecture composed of several specialized BP-RNNs run in parallel, where each BP-RNN is trained to correct a different type of error event. Our simulation results show that the specialized BP-RNNs run in parallel effectively improve the decoding capability of short LDPC codes.

Published in the 28th edition of the GRETSI colloquium, Nancy, France, 6-9 Sept. 2022.

Title: Error Structure Aware Parallel BP-RNN Decoders for Short LDPC Codes (Aug. 2021)
Authors: Joachim Rosseel, Valérian Mannoni, Valentin Savin, and Inbar Fijalkow

Abstract: This article deals with the decoding of short block length Low Density Parity Check (LDPC) codes. It has already been demonstrated that Belief Propagation (BP) can be adjusted to short coding lengths, thanks to its modeling by a Recurrent Neural Network (BP-RNN). To strengthen this adaptation, we introduce a new training method for the BP-RNN. Its aim is to specialize the BP-RNN on error events sharing the same structural properties. This approach is then associated with a new decoder composed of several parallel specialized BP-RNN decoders, each trained to correct a different type of error event. Our results show that the proposed specialized BP-RNNs working in parallel effectively enhance the decoding capacity for short block length LDPC codes.
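
The decoder-diversity idea amounts to running the specialized decoders in parallel and selecting among their outputs. The selection rule sketched below (keep candidates that pass the syndrome check, then pick the one that best matches the channel LLRs) is a plausible illustrative choice and not necessarily the rule used in the paper.

```python
# Illustrative parallel-diversity sketch: the decoders themselves are passed in
# as callables (e.g. specialized BP-RNNs); the selection rule is an assumption.
import numpy as np

def decode_with_diversity(llr, H, decoders):
    """llr: channel LLRs (positive favours bit 0); H: parity-check matrix over GF(2);
    decoders: callables mapping LLRs to a hard-decision word. Returns the most
    likely candidate satisfying all parity checks, or None if there is none."""
    best, best_metric = None, -np.inf
    for dec in decoders:
        x_hat = np.asarray(dec(llr), dtype=int)
        if np.any(H @ x_hat % 2):                  # discard candidates failing the syndrome check
            continue
        metric = np.sum((1 - 2 * x_hat) * llr)     # correlation with the channel LLRs
        if metric > best_metric:
            best, best_metric = x_hat, metric
    return best
```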

Published in 11th International Symposium on Topics in Coding (ISTC), Montreal, Canada, Aug 2021.

PhD Thesis

Title: Décodage de codes correcteurs d'erreurs assisté par apprentissage pour l'IOT (Defended on December 13, 2023)
Author: Joachim Rosseel

Résumé: Les communications sans fil, déjà très présentes dans notre société, soulèvent de nouveaux défis dans le cadre du déploiement de l'Internet des Objets (IoT), tels que le développement de nouvelles méthodes de décodage au niveau de la couche physique permettant d'assurer de bonnes performances pour la transmission de messages courts. En particulier, les codes LDPC (Low Density Parity Check) sont une famille de codes correcteurs d'erreurs très connus pour leurs excellentes performances asymptotiques lorsqu'ils sont décodés par l'algorithme de propagation de croyance (BP, pour Belief Propagation, en anglais). Cependant, la capacité de correction de l'algorithme BP se retrouve fortement dégradée pour les codes LDPC courts. Ainsi, cette thèse porte sur l'amélioration du décodage des codes LDPC courts, grâce notamment à des outils d'apprentissage automatique, tels que les réseaux de neurones. Après avoir introduit les notions et caractéristiques des codes LDPC et du décodage BP, ainsi que la modélisation du BP par un réseau de neurones récurrent (BP-Recurrent Neural Network ou BP-RNN), nous développons de nouvelles méthodes d'entraînement afin de spécialiser le décodeur BP-RNN sur des motifs d'erreurs partageant des propriétés structurelles similaires. Ces approches de spécialisation sont associées à des architectures de décodage composées de plusieurs BP-RNNs spécialisés, où chaque BP-RNN est entraîné à corriger un type différent de motif d'erreurs (diversité de décodage). Nous nous intéressons ensuite au post-traitement du BP (ou du BP-RNN) avec un décodage par statistiques ordonnées (Ordered Statistics Decoding ou OSD) afin de se rapprocher de la performance du décodage par maximum de vraisemblance. Pour améliorer les performances du post-traitement, nous optimisons son entrée grâce à un neurone simple, puis nous introduisons une stratégie de décodage pour un post-traitement par OSD multiples. Il est alors montré que cette stratégie tire efficacement parti de la diversité de ses entrées, fournissant ainsi un moyen efficace de combler l'écart avec le décodage par maximum de vraisemblance.

Abstract: Wireless communications, already very present in our society, still raise new challenges as part of the deployment of the Internet of Things (IoT), such as the development of new decoding methods at the physical layer ensuring good performance for the transmission of short messages. In particular, Low Density Parity Check (LDPC) codes are a family of error correcting codes well known for their excellent asymptotic error correction performance under iterative Belief Propagation (BP) decoding. However, the error correcting capacity of the BP algorithm is severely deteriorated for short LDPC codes. This thesis therefore focuses on improving the decoding of short LDPC codes, thanks in particular to machine learning tools such as neural networks. After introducing the notions and characteristics of LDPC codes and BP decoding, as well as the modeling of the BP algorithm by a Recurrent Neural Network (BP-RNN), we develop new training methods specializing the BP-RNN on decoding error events sharing similar structural properties. These specialization approaches are then associated with decoding architectures composed of several specialized BP-RNNs, where each BP-RNN is trained to decode a specific kind of error event (decoding diversity). We then turn to the post-processing of the BP (or BP-RNN) output with an Ordered Statistics Decoding (OSD) step in order to close the gap to the maximum likelihood (ML) decoding performance. To improve the post-processing performance, we optimize its input with a single neuron, and we introduce a multiple OSD post-processing decoding strategy. We show that this strategy effectively takes advantage of the diversity of its inputs, thus providing an effective way to close the gap with ML decoding.