In the upcoming Internet-of-Things (IoT) era, communication is often characterized by massive connectivity, sporadic transmissions, and small data packets, which imposes new requirements on the access delay and resource allocation efficiency of the random access (RA) mechanisms in the IoT communication stack. This paper considers a grant-free non-orthogonal random access (NORA) system, which can simultaneously reduce the access delay and support more machine-type communication (MTC) devices on limited resources. To address the joint user activity detection (UAD) and channel estimation (CE) problem in the grant-free NORA system, we propose a deep neural network-aided message-passing-based block sparse Bayesian learning (DNN-MP-BSBL) algorithm. In DNN-MP-BSBL, the iterative message-passing process is transferred from a factor graph to a deep neural network (DNN): weights are imposed on the messages in the DNN and trained to minimize the estimation error. It is shown that the trained weights alleviate the convergence problem of the MP-BSBL algorithm, especially in crowded RA scenarios. Simulation results show that the proposed DNN-MP-BSBL algorithm improves UAD and CE accuracy with fewer iterations, indicating its advantages for low-latency grant-free NORA systems.
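The core idea described above is deep unfolding: a fixed number of message-passing iterations is mapped to DNN layers, and trainable weights are attached to the per-iteration updates and learned to minimize the estimation error. The sketch below illustrates this idea only under simplifying assumptions; it is not the authors' exact DNN-MP-BSBL factor-graph implementation. The real-valued linear model y = Ax + n, the damped gradient-plus-soft-threshold update standing in for the MP-BSBL messages, and the MSE training objective are all illustrative assumptions.

```python
# Minimal deep-unfolding sketch (assumed, simplified stand-in for DNN-MP-BSBL):
# T message-passing iterations become T layers; each layer carries trainable
# weights (here a damping factor and a threshold) learned to reduce the
# channel estimation error within few iterations.
import torch
import torch.nn as nn


class UnfoldedEstimator(nn.Module):
    def __init__(self, num_layers: int):
        super().__init__()
        self.num_layers = num_layers
        # One trainable weight pair per unfolded iteration, playing the role
        # of the weights imposed on the messages in the abstract.
        self.damping = nn.Parameter(torch.full((num_layers,), 0.5))
        self.threshold = nn.Parameter(torch.full((num_layers,), 0.1))

    def forward(self, y: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        # y: (batch, n_pilots) received pilot observations
        # A: (n_pilots, n_users) known pilot matrix (assumed real-valued here)
        x = torch.zeros(y.shape[0], A.shape[1], dtype=y.dtype, device=y.device)
        for t in range(self.num_layers):
            # Correction toward the observations (one "iteration" of updates).
            residual = y - x @ A.T
            update = residual @ A
            # Weighted (damped) update, then a sparsity-promoting soft
            # threshold standing in for the block-sparse prior, so inactive
            # users' channels are driven to zero (joint UAD and CE).
            x = x + self.damping[t] * update
            x = torch.sign(x) * torch.clamp(x.abs() - self.threshold[t], min=0.0)
        return x


def train_step(model, optimizer, y, A, x_true):
    # Supervised training: the true sparse channel vectors provide the target,
    # and the per-layer weights are tuned to minimize the estimation MSE.
    optimizer.zero_grad()
    x_hat = model(y, A)
    loss = nn.functional.mse_loss(x_hat, x_true)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the number of layers is fixed in advance, the learned weights effectively compress the iterative procedure into a short, fixed-latency pipeline, which is what makes this style of unfolding attractive for low-latency grant-free access.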