This paper presents a novel framework for traffic prediction of IoT devices activated by binary Markovian events. First, we consider a massive set of IoT devices whose activation events are modeled by an On-Off Markov process with known transition probabilities. Next, we exploit the temporal correlation of the traffic events and apply the forward algorithm, in the context of hidden Markov models (HMMs), to predict the activation likelihood of each IoT device. Finally, we apply the fast uplink grant scheme to allocate resources to the IoT devices with the highest transmission likelihood. To evaluate the performance of the proposed scheme, we define the regret metric as the number of missed resource allocation opportunities. The proposed fast uplink scheme based on traffic prediction outperforms both conventional random access and time division duplex in terms of regret and efficiency of system usage, while maintaining its superiority over random access in terms of average age of information for massive deployments.
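As an illustration of the prediction step described above, the following is a minimal sketch of the HMM forward algorithm for a single two-state On-Off device, assuming known transition probabilities and an illustrative noisy binary observation model; the function name and all numerical values are hypothetical and not taken from the paper.

```python
import numpy as np

def forward_activation_likelihood(A, B, pi, obs):
    """
    Forward algorithm for a two-state (Off=0, On=1) hidden Markov model.

    A   : 2x2 transition matrix, A[i, j] = P(state j at t+1 | state i at t)
    B   : 2xM emission matrix, B[i, k] = P(observation k | state i)
    pi  : length-2 initial state distribution
    obs : sequence of observed symbols (integers in [0, M))

    Returns the filtered probability that the device is On after the
    last observation, i.e. P(state = On | obs_1..obs_T).
    """
    alpha = pi * B[:, obs[0]]          # initialisation
    alpha /= alpha.sum()               # normalise to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # predict, then update with the observation
        alpha /= alpha.sum()
    return alpha[1]                    # probability of the On state

# Example with assumed values: sticky On-Off chain, noisy activity reports
A  = np.array([[0.9, 0.1],   # Off -> {Off, On}
               [0.3, 0.7]])  # On  -> {Off, On}
B  = np.array([[0.8, 0.2],   # P(observation | Off)
               [0.1, 0.9]])  # P(observation | On)
pi = np.array([0.5, 0.5])
print(forward_activation_likelihood(A, B, pi, obs=[0, 1, 1]))
```

In a scheme of this kind, a per-device value like the one returned here would drive the fast uplink grant, which is issued to the devices with the largest filtered On-probability.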
Grant-free sparse code multiple access (GF-SCMA) is considered a promising multiple access candidate for future wireless networks. In this paper, we focus on characterizing the performance of uplink GF-SCMA schemes in a network with ubiquitous connections, such as Internet of Things (IoT) networks. To provide a tractable approach to evaluating the performance of GF-SCMA, we first develop a theoretical model that accounts for the multi-user detection (MUD) property of the SCMA system. We then analyze the error rate performance of GF-SCMA in the case of codebook collision to investigate the reliability of GF-SCMA when codebooks are reused in massive IoT networks. For performance evaluation, accurate approximations for both the success probability and the average symbol error probability (ASEP) are derived. To elaborate further, we utilize the analytical results to discuss the impact of the codeword sparsity degree in GF-SCMA. After that, we conduct a comparative study between SCMA and its variant, dense code multiple access (DCMA), under GF transmission to offer insights into the effectiveness of these two schemes. This facilitates GF-SCMA system design in practical implementations. Simulation results show that denser codebooks can help support more UEs and increase the reliability of data transmission in a GF-SCMA network. Moreover, a higher success probability can be achieved by GF-SCMA with denser UE deployment at low detection thresholds, since SCMA can achieve an overloading gain.
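To make the codebook-collision setting concrete, the sketch below is a simple Monte Carlo estimate (not the paper's analytical derivation) of how often a tagged UE's randomly selected codebook is also chosen by another active UE under grant-free random codebook selection; the pool size of six codebooks and the UE counts are assumed values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def collision_probability(n_active, n_codebooks, trials=100_000):
    """Monte Carlo estimate of the probability that a tagged UE's randomly
    chosen codebook is also chosen by at least one other active UE."""
    picks = rng.integers(0, n_codebooks, size=(trials, n_active))
    # a collision occurs for UE 0 if any other active UE picked the same codebook
    collided = (picks[:, 1:] == picks[:, [0]]).any(axis=1)
    return collided.mean()

# Example: 6 codebooks in the pool (assumed), varying number of active UEs
for n in (2, 4, 8):
    print(n, collision_probability(n, n_codebooks=6))
```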
In this paper, the user detection performance of grant-free uplink transmission in a large-scale antenna system is analyzed, in which a general grant-free multiple access scheme is considered as the system model and Zadoff-Chu sequences are used for the uplink pilots. The false alarm probabilities of various user detection schemes under target detection probabilities are evaluated.
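For context, a Zadoff-Chu pilot and a simple correlation-threshold detector can be sketched as follows; this is a generic illustration rather than the specific detection schemes evaluated in the paper, and the sequence length, root indices, noise level, and threshold are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def zadoff_chu(root, length):
    """Generate a Zadoff-Chu sequence of the given root index and (prime) length."""
    n = np.arange(length)
    return np.exp(-1j * np.pi * root * n * (n + 1) / length)

def detect_user(rx, pilot, threshold):
    """Declare the user active if the normalised correlation between the
    received signal and the user's pilot exceeds the detection threshold."""
    corr = np.abs(np.vdot(pilot, rx)) ** 2
    corr /= np.linalg.norm(pilot) ** 2 * np.linalg.norm(rx) ** 2
    return corr > threshold

# Example (assumed parameters): length-139 pilots, roots 1 and 2, additive noise
pilot = zadoff_chu(root=1, length=139)
noise = (rng.standard_normal(139) + 1j * rng.standard_normal(139)) / np.sqrt(2)
rx = pilot + 0.3 * noise
print(detect_user(rx, pilot, threshold=0.5))                           # transmitted pilot
print(detect_user(rx, zadoff_chu(root=2, length=139), threshold=0.5))  # non-transmitted pilot
```

Sweeping the threshold trades the false alarm probability against the detection probability, which is the operating-point view taken in the evaluation above.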
Current random access (RA) allocation techniques suffer from congestion and high signaling overhead while serving massive machine-type communication (mMTC) applications. To this end, 3GPP introduced the need for fast uplink grant (FUG) allocation to reduce latency and increase reliability for smart Internet-of-Things (IoT) applications with strict QoS constraints. We propose a novel FUG allocation based on support vector machines (SVM). First, MTC devices are prioritized using an SVM classifier. Second, a long short-term memory (LSTM) architecture is used for traffic prediction, with correction techniques to overcome prediction errors. Both results are combined into an efficient resource scheduler in terms of average latency and total throughput. A coupled Markov modulated Poisson process (CMMPP) traffic model with mixed alarm and regular traffic is applied to compare the proposed FUG allocation with other existing allocation techniques. In addition, an extended CMMPP-based traffic model is used to evaluate the proposed algorithm in a denser network. We test the proposed scheme using real-time measurement data collected from the Numenta Anomaly Benchmark (NAB) database. Our simulation results show that the proposed model outperforms existing RA allocation schemes, achieving the highest throughput and the lowest access delay, on the order of 1 ms, with a prediction accuracy of 98%, when serving the target massive and critical MTC applications with a limited number of resources.
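As a rough illustration of the traffic model, the sketch below generates two-state Markov-modulated Poisson traffic (a regular state and an alarm state) for a single MTC device; it omits the coupling across devices that characterizes the CMMPP model, and all rates and transition probabilities are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)

def mmpp_traffic(steps, p_enter_alarm=0.01, p_leave_alarm=0.2,
                 rate_regular=0.05, rate_alarm=1.0):
    """Two-state Markov-modulated Poisson traffic for a single MTC device:
    a 'regular' state with a low packet rate and an 'alarm' state with a
    high packet rate. Returns the number of packets generated per slot."""
    packets = np.zeros(steps, dtype=int)
    alarm = False
    for t in range(steps):
        # transition of the modulating Markov chain
        if alarm:
            alarm = rng.random() >= p_leave_alarm
        else:
            alarm = rng.random() < p_enter_alarm
        rate = rate_alarm if alarm else rate_regular
        packets[t] = rng.poisson(rate)   # packet arrivals in this slot
    return packets

print(mmpp_traffic(20))
```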
Future wireless communications are largely inclined to deploy a massive number of antennas at the base station (BS) by exploiting energy-efficient and environmentally friendly technologies. An emerging technology called dynamic metasurface antennas (DMAs) promises to realize such massive antenna arrays with reduced physical size, hardware cost, and power consumption. This paper aims to optimize the energy efficiency (EE) of DMA-assisted massive MIMO uplink communications. We propose an algorithmic framework for designing the transmit precoding of each multi-antenna user and the DMA tuning strategy at the BS to maximize the EE performance, considering the availability of instantaneous and statistical channel state information (CSI), respectively. Specifically, the proposed framework combines Dinkelbach's transform, alternating optimization, and deterministic equivalent methods. In addition, we obtain a closed-form solution for the optimal transmit signal directions in the statistical CSI case, which simplifies the corresponding transmission design. Numerical results show good convergence of the proposed algorithms as well as considerable EE gains of DMA-assisted massive MIMO uplink communications over the baseline schemes.
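To illustrate the role of Dinkelbach's transform in such a framework, the following is a minimal sketch on a toy single-link energy-efficiency problem rather than the paper's DMA-assisted MIMO formulation; the channel gain, circuit power, and power budget are assumed values, and the inner maximization is done by grid search purely for clarity.

```python
import numpy as np

def dinkelbach_ee(gain=10.0, noise=1.0, p_circuit=0.5, p_max=2.0,
                  tol=1e-6, max_iter=50):
    """Dinkelbach's transform on a toy single-link energy-efficiency problem:
        maximise  R(p) / (p + p_circuit),  R(p) = log2(1 + gain * p / noise),
    over transmit power p in [0, p_max]. Each iteration replaces the fractional
    objective by  R(p) - lam * (p + p_circuit)  and re-maximises it."""
    p_grid = np.linspace(0.0, p_max, 10_001)
    rate = np.log2(1.0 + gain * p_grid / noise)
    lam = 0.0
    for _ in range(max_iter):
        # parametric subproblem: maximise R(p) - lam * total consumed power
        idx = np.argmax(rate - lam * (p_grid + p_circuit))
        p_opt = p_grid[idx]
        f_val = rate[idx] - lam * (p_opt + p_circuit)
        lam = rate[idx] / (p_opt + p_circuit)   # Dinkelbach update of the ratio
        if abs(f_val) < tol:                    # F(lam) ~ 0  =>  lam is optimal EE
            break
    return p_opt, lam   # EE-optimal power and the achieved energy efficiency

print(dinkelbach_ee())
```

In the paper's setting, the parametric subproblem would itself be handled by alternating optimization over the user precoders and the DMA tuning, with deterministic equivalents replacing the expectations in the statistical CSI case.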
This paper investigates full-duplex orthogonal frequency-division multiple access (OFDMA)-based multi-UAV-enabled wireless-powered Internet-of-Things (IoT) networks. In this paper, a swarm of unmanned aerial vehicles (UAVs) is first deployed in three dimensions (3D) to simultaneously charge all devices, i.e., a downlink (DL) charging period, and then flies to new locations within this area to collect information from scheduled devices over several epochs via OFDMA, due to the potentially limited number of channels available in narrowband IoT (NB-IoT), i.e., an uplink (UL) communication period. To maximize the UL throughput of the IoT devices, we jointly optimize the UL and DL 3D deployment of the UAV swarm, including the device-UAV association, the scheduling order, and the UL-DL time allocation. In particular, the DL energy harvesting (EH) threshold of the devices and the UL signal decoding threshold of the UAVs are taken into consideration when studying the problem. Besides, both line-of-sight (LoS) and non-line-of-sight (NLoS) channel models are considered, depending on the positions of the sensors and UAVs. The influence of the potential channel limitation in NB-IoT is also captured by studying the IoT scheduling policy. Two scheduling policies, a near-first (NF) policy and a far-first (FF) policy, are studied. It is shown that the NF scheme outperforms the FF scheme in terms of sum-throughput maximization, whereas the FF scheme outperforms the NF scheme in terms of system fairness.
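As a small illustration of the two scheduling policies, the sketch below orders devices by their distance to the serving UAV under near-first (NF) and far-first (FF) rules; the distances are assumed values and the helper function is hypothetical.

```python
import numpy as np

def schedule(distances, policy="NF"):
    """Return the device service order under a near-first (NF) or
    far-first (FF) scheduling policy, given device-to-UAV distances."""
    order = np.argsort(distances)            # nearest device first
    return order if policy == "NF" else order[::-1]

# Example with assumed device-to-UAV distances (metres)
d = np.array([120.0, 35.0, 80.0, 200.0])
print("NF order:", schedule(d, "NF"))   # -> [1 2 0 3]
print("FF order:", schedule(d, "FF"))   # -> [3 0 2 1]
```

Serving nearer devices first favors sum throughput, while serving farther devices first gives the weaker links earlier access and thus improves fairness, which matches the comparison reported above.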