We provide an overview of the 3rd Generation Partnership Project (3GPP) work on evolving the 5G wireless technology to support non-terrestrial satellite networks. Adapting 5G to support non-terrestrial networks entails a holistic design spanning multiple areas, from the radio access network to services and system aspects to the core network and terminals. In this article, we describe the main topics of non-terrestrial networks, explain the design aspects in detail, and share various design rationales influencing standardization.
Non-terrestrial networks (NTNs) have traditionally had certain limited applications. However, recent technological advancements have opened up myriad applications of NTNs for 5G and beyond networks, especially when integrated into terrestrial networks (TNs). This article comprehensively surveys the evolution of NTNs, highlighting their relevance to 5G networks and, essentially, how they will play a pivotal role in the development of 6G and beyond wireless networks. The survey discusses important features of NTN integration into TNs by delving into the new range of services and use cases, various architectures, and new approaches being adopted to develop a new wireless ecosystem. Our survey includes the major progress and outcomes from academic research as well as industrial efforts. We first introduce the relevant 5G use cases and general integration challenges, such as handover and deployment difficulties. Then, we review NTN operations in the mmWave band and their potential for the Internet of Things (IoT). Further, we discuss the significance of mobile edge computing (MEC) and machine learning (ML) in NTNs by reviewing the relevant research works. We also discuss the corresponding higher-layer advancements and relevant field trials/prototyping at both academic and industrial levels. Finally, we identify and review 6G and beyond application scenarios, novel architectures, technological enablers, and higher-layer aspects pertinent to NTN integration.
Evolving 5G New Radio (NR) to support non-terrestrial networks (NTNs), particularly satellite communication networks, is under exploration in 3GPP. The movement of the spaceborne platforms in NTNs may result in a large, time-varying Doppler shift that differs for devices in different locations. With orthogonal frequency-division multiple access (OFDMA) in the uplink, each device needs to apply a different frequency adjustment value to compensate for the Doppler shift. To this end, the 3GPP Release-17 work on NTNs assumes that an NTN device is equipped with a global navigation satellite system (GNSS) chipset and can thereby determine its position and calculate the needed frequency adjustment value using its position information and satellite ephemeris data. This makes GNSS support essential for NTN operation. However, GNSS signals are weak, not ubiquitous, and susceptible to interference and spoofing. We show that devices without access to GNSS signals can utilize reference signals in more than one frequency position in an OFDM carrier to estimate the Doppler shift and thereby determine the frequency adjustment value needed to pre-compensate the Doppler shift in the uplink. We analyze the performance, elaborate on how to utilize the NR reference signals, and present simulation results. The solution can reduce the dependency of NTN operation on GNSS with a reasonable complexity and performance trade-off.
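The GNSS-based baseline that this work relaxes can be illustrated with a short geometric sketch: given the device's position, the satellite's position and velocity from ephemeris data, and the carrier frequency, the device computes the line-of-sight range rate and negates the resulting Doppler shift. This is a minimal illustration of the principle only; the function name, coordinate choices, and example numbers are assumptions, not the 3GPP-specified procedure.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def doppler_precompensation(ue_pos, sat_pos, sat_vel, carrier_hz):
    """Sketch of the uplink frequency pre-compensation a GNSS-equipped
    NTN device could apply. Inputs: device and satellite positions (ECEF,
    meters), satellite velocity (m/s) from ephemeris, carrier frequency (Hz).
    Hypothetical helper for illustration, not a standardized algorithm."""
    los = np.asarray(sat_pos, float) - np.asarray(ue_pos, float)
    los_unit = los / np.linalg.norm(los)          # unit line-of-sight vector
    range_rate = np.dot(np.asarray(sat_vel, float), los_unit)  # >0: receding
    doppler = -carrier_hz * range_rate / C        # observed Doppler shift
    return -doppler                               # pre-compensate by negating

# Illustrative LEO geometry: satellite ~600 km up moving ~7.56 km/s, 2 GHz carrier
shift = doppler_precompensation(
    ue_pos=(6_371e3, 0.0, 0.0),      # device on the Earth's surface (ECEF)
    sat_pos=(6_971e3, 100e3, 0.0),   # satellite slightly past zenith
    sat_vel=(0.0, 7.56e3, 0.0),      # orbital velocity vector
    carrier_hz=2e9,
)
```

Near zenith the radial velocity, and hence the shift, is small; at low elevation angles the same calculation yields Doppler shifts of tens of kHz at 2 GHz, which is why per-device pre-compensation is needed in the first place.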
The next wave of wireless technologies is connecting things both among themselves and to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond 5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies like artificial intelligence, non-terrestrial networks, and new spectra is elaborated. Finally, future research directions toward beyond 5G IoT networks are pointed out.
The recently standardized millimeter wave-based 3GPP New Radio technology is expected to become an enabler for both enhanced Mobile Broadband (eMBB) and ultra-reliable low-latency communication (URLLC) services specified for future 5G systems. One of the first steps in the mathematical modeling of such systems is the characterization of the session resource request probability mass function (pmf) as a function of the channel conditions, cell size, application demands, user location, and system parameters, including the modulation and coding schemes employed at the air interface. Unfortunately, this pmf cannot be expressed via elementary functions. In this paper, we develop an accurate approximation of the sought pmf. First, we show that the Normal distribution provides a fairly accurate approximation to the cumulative distribution function (CDF) of the signal-to-noise ratio for communication systems operating in the millimeter frequency band, which in turn allows evaluating the resource request pmf via the error function. We also investigate the impact of shadow fading on the resource request pmf.
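The approximation described above can be sketched concretely: once the SNR CDF is taken to be Normal, the probability that a session requests the amount of resources associated with MCS region j is the difference of two Normal CDF values at the region's SNR thresholds, each evaluated through the error function. The threshold values and parameterization below are illustrative placeholders, not the 3GPP MCS tables or the paper's exact model.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a Normal(mu, sigma) random variable via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def resource_request_pmf(snr_mu_db, snr_sigma_db, mcs_thresholds_db):
    """Sketch: pmf of the per-session resource request, assuming the SNR
    (in dB) is approximately Normal and each MCS region, delimited by
    thresholds T_j <= SNR < T_{j+1}, maps to one resource request size.
    Thresholds are illustrative assumptions."""
    edges = [-math.inf] + list(mcs_thresholds_db) + [math.inf]
    pmf = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        p = (normal_cdf(hi, snr_mu_db, snr_sigma_db)
             - normal_cdf(lo, snr_mu_db, snr_sigma_db))
        pmf.append(p)
    return pmf

# Mean SNR 10 dB, standard deviation 4 dB (e.g., from shadow fading),
# five hypothetical MCS switching thresholds -> six request sizes
probs = resource_request_pmf(snr_mu_db=10.0, snr_sigma_db=4.0,
                             mcs_thresholds_db=[-6.7, 0.0, 6.3, 12.6, 18.0])
```

The first and last pmf entries cover outage (SNR below the lowest threshold) and the highest MCS, respectively; the probabilities sum to one by construction.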
Mobile apps are increasingly relying on high-throughput and low-latency content delivery, while the available bandwidth on wireless access links is inherently time-varying. The handoffs between base stations and access modes due to user mobility present additional challenges to delivering a high level of user Quality-of-Experience (QoE). The ability to predict the available bandwidth and the upcoming handoffs will give applications valuable leeway to make proactive adjustments to avoid significant QoE degradation. In this paper, we explore the possibility and accuracy of real-time mobile bandwidth and handoff predictions in 4G/LTE and 5G networks. Towards this goal, we collect long, consecutive traces with rich bandwidth, channel, and context information from public transportation systems. We develop Recurrent Neural Network models to mine the temporal patterns of bandwidth evolution in fixed-route mobility scenarios. Our models consistently outperform conventional univariate and multivariate bandwidth prediction models. For co-existing 4G and 5G networks, we propose the new problem of predicting handoffs between 4G and 5G, which is important for low-latency applications like self-driving in realistic 5G scenarios. We develop classification- and regression-based prediction models, which achieve more than 80% accuracy in predicting 4G and 5G handoffs in a recent 5G dataset.
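A core preprocessing step implied by this line of work is turning a continuous mobility trace into supervised examples: a window of past samples as the input, and a label indicating whether a 4G/5G handoff occurs within a short future horizon. The sketch below shows one plausible windowing scheme under an assumed trace format of (bandwidth, technology) tuples; the paper's actual dataset schema and features are richer than this.

```python
import numpy as np

def make_windows(trace, history, horizon):
    """Slice a per-sample trace into (X, y) pairs for handoff classification.
    X: the past `history` bandwidth samples. y: 1 if the access technology
    changes within the next `horizon` samples, else 0. `trace` is a list of
    (bandwidth_mbps, tech) tuples with tech in {"4G", "5G"} -- an assumed,
    illustrative format, not the dataset schema used in the paper."""
    X, y = [], []
    for t in range(history, len(trace) - horizon):
        past_bw = [bw for bw, _ in trace[t - history:t]]
        current_tech = trace[t - 1][1]
        future_techs = {tech for _, tech in trace[t:t + horizon]}
        X.append(past_bw)
        y.append(int(future_techs != {current_tech}))  # handoff ahead?
    return np.array(X), np.array(y)

# Toy trace: 10 s on 4G (~50 Mbps), then 10 s on 5G (~300 Mbps)
trace = [(50.0, "4G")] * 10 + [(300.0, "5G")] * 10
X, y = make_windows(trace, history=4, horizon=3)
```

Positive labels cluster just before the technology switch, which is exactly the advance notice a low-latency application would exploit; the resulting (X, y) pairs can feed either a classifier for handoff prediction or, with a different target, a regression model for bandwidth prediction.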