Deep Learning-Based One-Bit Signal Detection: A Model-Aware Approach with LoRD-Net
Reconstructing a high-dimensional signal from a noisy, low-dimensional quantized version of that signal is a common challenge in communications and sensing. This paper addresses the extreme case of one-bit quantizers, which arise in many modern communication systems. Specifically, we present LoRD-Net, a deep neural network that recovers information symbols from one-bit measurements. The proposed data-driven framework is based on deep unfolding of iterative first-order optimization schemes. LoRD-Net is blind in the sense that it recovers the signal of interest from noisy one-bit measurements without requiring knowledge of the one-bit channel through which they were obtained, thereby avoiding explicit modeling of the acquisition system. Because domain knowledge is incorporated into the design of its architecture, the proposed deep detector contains far fewer parameters than black-box deep networks. At the same time, the blind operation of LoRD-Net, combined with the non-linear nature of the one-bit data-acquisition system, makes it challenging to identify an appropriate optimization criterion for signal retrieval.
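To make the unfolding idea concrete, below is a minimal sketch of such a detector, assuming a probit measurement model r = sign(Hs + w) with Gaussian noise w. The class name UnfoldedOneBitDetector, the learnable channel surrogate W (standing in for the unknown channel, so the detector never sees H), the layer count, and the use of PyTorch are all illustrative assumptions, not the paper's implementation.

```python
import math
import torch
import torch.nn as nn

class UnfoldedOneBitDetector(nn.Module):
    """Illustrative deep-unfolded first-order detector (a sketch, not the paper's code).

    Assumed model: r = sign(H s + w) with Gaussian noise w. A probit
    maximum-likelihood surrogate minimizes
        L(s) = -sum_i log Phi(r_i (W s)_i / sigma),
    where W is a learnable surrogate for the unknown channel H, so the
    detector operates blindly. Each "layer" applies one gradient step on L
    with its own learnable step size.
    """

    def __init__(self, num_meas: int, num_symbols: int,
                 num_layers: int = 10, sigma: float = 1.0):
        super().__init__()
        self.sigma = sigma
        # Learnable channel surrogate and per-layer step sizes.
        self.W = nn.Parameter(0.1 * torch.randn(num_meas, num_symbols))
        self.step_sizes = nn.Parameter(0.1 * torch.ones(num_layers))

    def forward(self, r: torch.Tensor) -> torch.Tensor:
        s = torch.zeros(self.W.shape[1])  # initial symbol estimate
        for mu in self.step_sizes:
            z = r * (self.W @ s) / self.sigma
            # phi(z)/Phi(z): gradient weight of the probit log-likelihood.
            pdf = torch.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
            cdf = 0.5 * (1.0 + torch.erf(z / math.sqrt(2.0)))
            ratio = pdf / cdf.clamp_min(1e-12)
            # Closed-form gradient of L(s); one unfolded first-order step.
            grad = -(self.W.t() @ (r * ratio)) / self.sigma
            s = s - mu * grad
        return s

# Toy usage: recover BPSK symbols from one-bit measurements.
torch.manual_seed(0)
H = torch.randn(64, 8)                        # true (unknown) channel
s_true = torch.sign(torch.randn(8))           # BPSK symbols
r = torch.sign(H @ s_true + 0.1 * torch.randn(64))
det = UnfoldedOneBitDetector(num_meas=64, num_symbols=8)
s_hat = det(r)                                # estimate (untrained here)
```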
Consequently, we propose a two-stage learning procedure for LoRD-Net: the first stage identifies the proper optimization process to be unfolded, and the second trains the resulting architecture end-to-end. We evaluate the proposed receiver architecture for one-bit signal recovery in wireless communication systems, and show that the resulting hybrid approach outperforms state-of-the-art data-driven and model-based methods while requiring only a limited amount of training data, on the order of ∼500 samples.
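Under the same assumptions as the sketch above, a generic version of such a two-stage procedure might look as follows; the split into "fit the objective, then train end-to-end," the MSE loss, and the epoch counts are illustrative guesses, not the paper's exact training recipe.

```python
import torch

def two_stage_train(detector, train_set, lr=1e-2, epochs=(50, 200)):
    """Illustrative two-stage training loop for the sketch above
    (a plausible reading, not the paper's exact procedure).

    Stage 1 fixes the unfolded step sizes and fits only the parameters
    that define the optimization objective (the channel surrogate W).
    Stage 2 then trains the whole unfolded network end-to-end.
    """
    mse = torch.nn.MSELoss()

    # Stage 1: learn the objective (W), keeping the iteration map fixed.
    detector.step_sizes.requires_grad_(False)
    opt = torch.optim.Adam([detector.W], lr=lr)
    for _ in range(epochs[0]):
        for r, s_true in train_set:
            opt.zero_grad()
            mse(detector(r), s_true).backward()
            opt.step()

    # Stage 2: end-to-end training of all parameters.
    detector.step_sizes.requires_grad_(True)
    opt = torch.optim.Adam(detector.parameters(), lr=lr)
    for _ in range(epochs[1]):
        for r, s_true in train_set:
            opt.zero_grad()
            mse(detector(r), s_true).backward()
            opt.step()
```

Here train_set is any iterable of (r, s_true) pairs; a few hundred such pairs corresponds to the ∼500-sample training budget reported above.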