
User Detection Performance Analysis for Grant-Free Uplink Transmission in Large-Scale Antenna Systems

Posted by Jonghyun Kim
Publication date: 2019
Paper language: English





In this paper, the user detection performance of grant-free uplink transmission in a large-scale antenna system is analyzed. A general grant-free multiple access scheme is considered as the system model, and Zadoff-Chu sequences are used as the uplink pilots. The false alarm probabilities of various user detection schemes are evaluated under target detection probabilities.
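The detection pipeline the abstract describes can be illustrated with a short simulation. The sketch below is a minimal, hypothetical setup rather than the paper's exact system: it generates Zadoff-Chu pilots, lets a subset of users transmit them to an M-antenna base station, and declares a user active when its matched-filter energy exceeds a threshold. The sequence length, root indices, SNR, and threshold value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def zadoff_chu(root, n_zc):
    """Zadoff-Chu sequence of odd length n_zc; root must be coprime to n_zc."""
    n = np.arange(n_zc)
    return np.exp(-1j * np.pi * root * n * (n + 1) / n_zc)

N_ZC, M = 139, 64                  # pilot length, BS antennas (assumed values)
roots = [1, 2, 3, 5]               # one candidate pilot root per potential user
active = [True, False, True, False]
snr_lin = 1.0                      # 0 dB per-antenna SNR (assumed)

# Received pilot block: superposition of active users' pilots plus noise.
Y = np.zeros((M, N_ZC), dtype=complex)
for r, a in zip(roots, active):
    if a:
        h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
        Y += np.sqrt(snr_lin) * np.outer(h, zadoff_chu(r, N_ZC))
Y += (rng.standard_normal(Y.shape) + 1j * rng.standard_normal(Y.shape)) / np.sqrt(2)

# Declare a user active when its matched-filter energy, summed over antennas,
# exceeds a threshold. The paper fixes a target detection probability and then
# evaluates the resulting false alarm probability; the value here is ad hoc.
threshold = 2.0 * M
for r in roots:
    stat = np.sum(np.abs(Y @ np.conj(zadoff_chu(r, N_ZC))) ** 2) / N_ZC
    print(f"root {r}: statistic {stat:9.1f} -> {'active' if stat > threshold else 'idle'}")
```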




Read also

Large-scale antenna arrays employed by the base station (BS) constitute an essential next-generation communications technique. However, due to the constraints of size, cost, and power consumption, it is usually considered unrealistic to use a large-scale antenna array at the user side. Inspired by the emerging technique of reconfigurable intelligent surfaces (RIS), we first propose the concept of user-side RIS (US-RIS) for facilitating the employment of a large-scale antenna array at the user side in a cost- and energy-efficient way. In contrast to the existing employments of RIS, which belong to the family of base-station-side RISs (BSS-RISs), the US-RIS concept by definition facilitates the employment of RIS at the user side for the first time. This is achieved by conceiving a multi-layer structure to realize a compact form factor. Furthermore, our theoretical results demonstrate that, in contrast to the existing single-layer structure, where only the phase of the signal reflected from the RIS can be adjusted, the amplitude of the signal penetrating a multi-layer US-RIS can also be partially controlled, which brings about a new degree of freedom (DoF) for beamformer design that can be beneficially exploited for performance enhancement. In addition, based on the proposed multi-layer US-RIS, we formulate the signal-to-noise ratio (SNR) maximization problem of US-RIS-aided communications. Due to the non-convexity of the problem introduced by this multi-layer structure, we propose a multi-layer transmit beamformer design relying on an iterative algorithm that finds the optimal solution by alternately updating each variable. Finally, our simulation results verify the superiority of the proposed multi-layer US-RIS as a compact realization of a large-scale antenna array at the user side for uplink transmission.
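As a rough illustration of the alternating-update idea, the sketch below solves the simpler single-layer, phase-only version of the problem: it alternates between the MRC receive combiner (optimal for fixed phases) and per-element phase alignment (optimal for a fixed combiner). The multi-layer amplitude control that distinguishes the US-RIS is omitted, and the channel names (h_d, H, h_r) and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 8, 32                                   # BS antennas, RIS elements (assumed)

h_d = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)   # direct UE->BS
h_r = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)   # UE->RIS
H = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)  # RIS->BS

theta = np.zeros(N)                            # RIS phase shifts
for _ in range(20):                            # alternate until (near) convergence
    h_eff = h_d + H @ (np.exp(1j * theta) * h_r)
    w = h_eff / np.linalg.norm(h_eff)          # MRC combiner: optimal for fixed theta
    # For fixed w, each phase aligns its cascaded term with the direct term.
    cascade = (np.conj(w) @ H) * h_r           # per-element cascaded coefficients
    theta = np.angle(np.conj(w) @ h_d) - np.angle(cascade)

print("received SNR gain over the direct link:",
      np.linalg.norm(h_d + H @ (np.exp(1j * theta) * h_r)) ** 2
      / np.linalg.norm(h_d) ** 2)
```

Each step of the loop can only increase the objective |w^H h_eff|^2, which is why this kind of alternating update converges to a (locally) optimal solution.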
Ke Lai, Jing Lei, Yansha Deng (2021)
Grant-free sparse code multiple access (GF-SCMA) is considered to be a promising multiple access candidate for future wireless networks. In this paper, we focus on characterizing the performance of uplink GF-SCMA schemes in a network with ubiquitous connections, such as Internet of Things (IoT) networks. To provide a tractable approach to evaluate the performance of GF-SCMA, we first develop a theoretical model taking into account the property of multi-user detection (MUD) in the SCMA system. We then analyze the error rate performance of GF-SCMA in the case of codebook collision to investigate the reliability of GF-SCMA when reusing codebooks in massive IoT networks. For performance evaluation, accurate approximations for both the success probability and the average symbol error probability (ASEP) are derived. To elaborate further, we utilize the analytical results to discuss the impact of codeword sparse degree in GF-SCMA. After that, we conduct a comparative study between SCMA and its variant, dense code multiple access (DCMA), with GF transmission to offer insights into the effectiveness of these two schemes. This facilitates GF-SCMA system design in practical implementation. Simulation results show that denser codebooks can help to support more UEs and increase the reliability of data transmission in a GF-SCMA network. Moreover, a higher success probability can be achieved by GF-SCMA with denser UE deployment at low detection thresholds, since SCMA can achieve overloading gain.
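The codebook-collision event at the heart of this analysis is easy to sanity-check by Monte Carlo. The sketch below estimates the probability that at least one other active UE reuses a given UE's codebook when codebooks are drawn uniformly at random; the parameters are illustrative, and the paper derives closed-form approximations rather than simulating.

```python
import numpy as np

rng = np.random.default_rng(2)
J, K, trials = 6, 4, 100_000     # codebooks, active UEs, Monte Carlo runs (assumed)

picks = rng.integers(0, J, size=(trials, K))              # each UE draws a codebook
collided = np.any(picks[:, 1:] == picks[:, :1], axis=1)   # anyone reuse UE 0's?

print("empirical collision probability:  ", collided.mean())
print("analytical 1 - (1 - 1/J)^(K - 1):", 1 - (1 - 1 / J) ** (K - 1))
```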
With recent advances in dense low-earth orbit (LEO) constellations, the LEO satellite network has become a promising solution for providing global coverage for Internet-of-Things (IoT) services. Confronted with the sporadic transmission from randomly activated IoT devices, we consider the random access (RA) mechanism and propose a grant-free RA (GF-RA) scheme to reduce the access delay to the mobile LEO satellites. A Bernoulli-Rician message passing with expectation maximization (BR-MP-EM) algorithm is proposed for this terrestrial-satellite GF-RA system to address the user activity detection (UAD) and channel estimation (CE) problem. The BR-MP-EM algorithm is divided into two stages. In the inner iterations, the Bernoulli messages and Rician messages are updated for the joint UAD and CE problem. Based on the output of the inner iterations, the expectation maximization (EM) method is employed in the outer iterations to update the hyper-parameters related to the channel impairments. Finally, simulation results show the UAD and CE accuracy of the proposed BR-MP-EM algorithm, as well as its robustness against channel impairments.
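The two-stage structure (inner message passing, outer EM hyper-parameter update) can be sketched with a much simpler inner solver. In the sketch below, iterative soft thresholding stands in for the Bernoulli-Rician message passing, and the outer loop re-estimates the noise variance from the residual, EM-style; the pilot matrix, dimensions, and threshold rule are all assumptions, so this is a structural sketch rather than BR-MP-EM itself.

```python
import numpy as np

rng = np.random.default_rng(3)
L, N, K = 64, 100, 8                 # pilot length, devices, active devices (assumed)

A = (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N))) / np.sqrt(2 * L)
x = np.zeros(N, dtype=complex)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
sigma2_true = 0.01
y = A @ x + np.sqrt(sigma2_true / 2) * (rng.standard_normal(L) + 1j * rng.standard_normal(L))

mu = 1 / np.linalg.norm(A, 2) ** 2   # gradient step size for the inner solver
sigma2 = 1.0                         # deliberately poor initial noise-variance guess
for outer in range(5):
    # Inner stage (a stand-in for the Bernoulli/Rician message passing):
    # iterative soft thresholding under the current noise-variance belief.
    x_hat = np.zeros(N, dtype=complex)
    for _ in range(100):
        r = x_hat + mu * (A.conj().T @ (y - A @ x_hat))
        lam = mu * 2 * np.sqrt(sigma2)            # threshold tracks sigma2
        mag = np.maximum(np.abs(r), 1e-12)
        x_hat = np.maximum(1 - lam / mag, 0) * r  # complex soft threshold

    # Outer EM-style stage: refit the noise hyper-parameter from the residual.
    sigma2 = np.linalg.norm(y - A @ x_hat) ** 2 / L
    print(f"outer {outer}: sigma2 estimate {sigma2:.4f}, "
          f"{np.count_nonzero(np.abs(x_hat) > 0.1)} devices detected")
```

Running this shows the pattern the abstract describes: as the outer iterations refine the hyper-parameter, the inner detector's threshold shrinks and weaker devices become detectable.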
This paper presents a novel framework for traffic prediction of IoT devices activated by binary Markovian events. First, we consider a massive set of IoT devices whose activation events are modeled by an On-Off Markov process with known transition probabilities. Next, we exploit the temporal correlation of the traffic events and apply the forward algorithm in the context of hidden Markov models (HMM) in order to predict the activation likelihood of each IoT device. Finally, we apply the fast uplink grant scheme in order to allocate resources to the IoT devices that have the maximal likelihood for transmission. In order to evaluate the performance of the proposed scheme, we define the regret metric as the number of missed resource allocation opportunities. The proposed fast uplink scheme based on traffic prediction outperforms both conventional random access and time division duplex in terms of regret and efficiency of system usage, while it maintains its superiority over random access in terms of average age of information for massive deployments.
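A minimal version of this predict-and-grant loop is easy to write down. The sketch below propagates each device's On-probability one step through the On-Off chain (the prediction step of the forward recursion), grants resources to the K most likely devices, counts missed active devices as regret, and folds the granted devices' revealed states back into the beliefs. The transition probabilities, device counts, and the perfect-observation assumption for granted devices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
N_dev, T, K = 20, 50, 3          # devices, time slots, grants per slot (assumed)

p01, p10 = 0.1, 0.3              # Off->On and On->Off transition probabilities
P = np.array([[1 - p01, p01],
              [p10, 1 - p10]])   # rows: current state (0 = Off, 1 = On)

belief = np.full(N_dev, p01 / (p01 + p10))   # stationary On-probability prior
state = rng.random(N_dev) < belief           # true hidden On/Off states
regret = 0

for t in range(T):
    # Prediction step: propagate each device's On-probability one slot ahead.
    belief = belief * P[1, 1] + (1 - belief) * P[0, 1]

    # Fast uplink grant: allocate resources to the K most likely active devices.
    granted = np.argsort(belief)[-K:]

    # Update step: granted devices reveal their state exactly in this sketch;
    # a real system would also fold in random-access outcomes here.
    regret += np.sum(state) - np.sum(state[granted])   # missed active devices
    belief[granted] = state[granted].astype(float)

    # Evolve the true hidden states according to the Markov chain.
    state = rng.random(N_dev) < np.where(state, P[1, 1], P[0, 1])

print("total regret (missed transmission opportunities):", regret)
```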
In the upcoming Internet-of-Things (IoT) era, communication is often featured by massive connection, sporadic transmission, and small-sized data packets, which poses new requirements on the delay expectation and resource allocation efficiency of the Random Access (RA) mechanisms of the IoT communication stack. A grant-free non-orthogonal random access (NORA) system is considered in this paper, which could simultaneously reduce the access delay and support more Machine Type Communication (MTC) devices with limited resources. In order to address the joint user activity detection (UAD) and channel estimation (CE) problem in the grant-free NORA system, we propose a deep neural network-aided message passing-based block sparse Bayesian learning (DNN-MP-BSBL) algorithm. In the DNN-MP-BSBL algorithm, the iterative message passing process is transferred from a factor graph to a deep neural network (DNN). Weights are imposed on the messages in the DNN and trained to minimize the estimation error. It is shown that the trained weights can alleviate the convergence problem of the MP-BSBL algorithm, especially in crowded RA scenarios. Simulation results show that the proposed DNN-MP-BSBL algorithm can improve the UAD and CE accuracy with a smaller number of iterations, indicating its advantages for low-latency grant-free NORA systems.
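The deep-unfolding idea, turning each iteration of a message-passing algorithm into a network layer with its own trainable weights, can be illustrated with a LISTA-style unrolled iterative thresholding pass, a related but much simpler technique than the paper's DNN-MP-BSBL. The sketch below runs only the forward pass with hand-picked per-layer parameters; in a trained network, `step` and `thresh` would be fitted per layer by backpropagation to minimize the estimation error, mirroring how the paper trains weights on the messages.

```python
import numpy as np

rng = np.random.default_rng(5)
L, N, layers = 64, 100, 8        # pilot length, devices, unrolled layers (assumed)

A = (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N))) / np.sqrt(2 * L)
x = np.zeros(N, dtype=complex)
x[rng.choice(N, 8, replace=False)] = rng.standard_normal(8) + 1j * rng.standard_normal(8)
y = A @ x + 0.05 * (rng.standard_normal(L) + 1j * rng.standard_normal(L))

# Per-layer parameters; a trained unrolled network would learn one pair
# (or full weight matrices) per layer instead of these shared fixed values.
step = np.full(layers, 1 / np.linalg.norm(A, 2) ** 2)
thresh = np.full(layers, 0.05)

x_hat = np.zeros(N, dtype=complex)
for k in range(layers):          # one loop pass == one network layer
    r = x_hat + step[k] * (A.conj().T @ (y - A @ x_hat))
    mag = np.maximum(np.abs(r), 1e-12)
    x_hat = np.maximum(1 - thresh[k] / mag, 0) * r   # soft-threshold activation

print("relative estimation error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

The fixed layer count is the point: instead of iterating to convergence, the unrolled network is trained so that a small, known number of layers already gives accurate estimates, which is what makes this approach attractive for low-latency grant-free systems.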