The Internet has shown itself to be a catalyst for economic growth and social equity, but its potency is thwarted by the fact that it remains off limits to the vast majority of human beings. Mobile phones---the fastest growing technology in the world, now reaching around 80% of humanity---can enable universal Internet access if the coverage problems that have historically plagued previous cellular architectures (2G, 3G, and 4G) can be resolved. These conventional architectures have not been able to sustain universal service provisioning since they depend on having enough users per cell for economic viability, and are thus ill suited to rural areas (which are by definition sparsely populated). The new generation of mobile cellular technology (5G), currently in a formative phase and expected to be finalized around 2020, aims at orders-of-magnitude performance enhancements. 5G offers a clean slate to network designers and can be molded into an architecture amenable to universal Internet provisioning. Given the great social benefits of democratizing the Internet and connectivity, we believe the time is ripe for emphasizing universal Internet provisioning as an important goal on the 5G research agenda. In this paper, we investigate the opportunities and challenges in utilizing 5G for global access to the Internet for all (GAIA). We also identify the major technical issues involved in a 5G-based GAIA solution and set up a future research agenda by defining open research problems.
For many, this is no longer a valid question: the case is considered settled, with SDN/NFV (Software Defined Networking/Network Function Virtualization) providing the inevitable innovation enablers that solve many outstanding management issues in 5G. However, given the monumental task of softwarizing the radio access network (RAN) while 5G is just around the corner---some companies have already started unveiling their 5G equipment---there is a realistic concern that we may only see point solutions involving SDN technology instead of a fully SDN-enabled RAN. This survey identifies the important obstacles in the way and reviews the state of the art of the relevant solutions. It differs from previous surveys on SDN-based RAN in that it focuses on the salient problems and discusses solutions proposed both within and outside the SDN literature. Our main focus is on the fronthaul, backward compatibility, the supposedly disruptive nature of SDN deployment, business cases and monetization of SDN-related upgrades, the latency of general-purpose processors (GPPs), and the additional security vulnerabilities that softwarization brings to the RAN. We also provide a summary of architectural developments in the SDN-based RAN landscape, as not all work can be covered under the focused issues. This paper provides a comprehensive survey of the state of the art of SDN-based RAN and clearly points out the gaps in the technology.
Cellular technology is largely an urban technology that has been unable to serve rural areas well, because traditional cellular models are not economical in areas with low user density and lower revenues. In 5G cellular networks, this coverage dilemma is likely to persist, widening the rural-urban digital divide further. It is about time to identify the root cause that has hindered rural technology growth and to analyse the options available in the 5G architecture for addressing this issue. We advocate that this can only be accomplished in two phases, by sequentially addressing economic viability followed by performance progression. We discuss how various works in the literature focus on the latter stage of this two-phase problem and are therefore not feasible to implement in the first place. We propose the concept of TV band white space (TVWS) dovetailed with 5G infrastructure for rural coverage and show that it can yield cost-effectiveness from a service provider's perspective.
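The economic case for TVWS in rural coverage rests largely on propagation: path loss grows with carrier frequency, so a TV-band cell covers a far larger area per site than a mid-band 5G cell. A back-of-envelope sketch (not from the paper; the 140 dB link budget and the 600 MHz / 3.5 GHz band choices are illustrative assumptions, and free-space loss ignores terrain and clutter):

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55."""
    return 20 * math.log10(dist_m) + 20 * math.log10(freq_hz) - 147.55

BUDGET_DB = 140.0  # tolerable path loss (illustrative link budget)

def cell_radius_m(freq_hz):
    """Distance at which free-space path loss exhausts the link budget."""
    return 10 ** ((BUDGET_DB + 147.55 - 20 * math.log10(freq_hz)) / 20)

r_tvws = cell_radius_m(600e6)   # a TV white space channel
r_mid = cell_radius_m(3.5e9)    # a common 5G mid-band carrier
ratio = r_tvws / r_mid          # ~5.8x radius, i.e. ~34x coverage area
```

At a fixed link budget the radius scales inversely with frequency, so one TVWS site covers roughly as much area as ~34 mid-band sites in this idealized model, which is the cost-effectiveness lever the abstract refers to.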
This paper presents a reinforcement learning solution to dynamic resource allocation for 5G radio access network slicing. Available communication resources (frequency-time blocks and transmit powers) and computational resources (processor usage) are allocated to stochastic arrivals of network slice requests. Each request arrives with priority (weight), throughput, computational resource, and latency (deadline) requirements, and, if feasible, is served with available communication and computational resources allocated over its requested duration. Since each resource allocation decision makes some of the resources temporarily unavailable for future requests, a myopic solution that optimizes only the current allocation becomes ineffective for network slicing. Therefore, a Q-learning solution is presented to maximize the network utility, defined as the total weight of granted network slicing requests over a time horizon, subject to communication and computational constraints. Results show that reinforcement learning provides major improvements in 5G network utility relative to myopic, random, and first-come-first-served solutions. While reinforcement learning sustains scalable performance as the number of served users increases, it can also be used effectively to assign resources to network slices when 5G needs to share the spectrum with incumbent users that may dynamically occupy some of the frequency-time blocks.
Fifth Generation (5G) wireless service for sensor networks involves significant challenges in coordinating an ever-increasing number of devices accessing shared resources. This has drawn major interest from the research community, as many existing works focus on radio access network congestion control to efficiently manage resources in the context of device-to-device (D2D) interaction in huge sensor networks. In this context, this paper pioneers a study of the impact of D2D link reliability in group-assisted random access protocols, shedding light on the performance benefits and potential limitations of approaches of this kind with respect to tunable parameters such as group size, number of sensors, and reliability of D2D links. Additionally, we leverage association with a Geolocation Database (GDB) to assist the grouping decisions, drawing parallels with recent regulatory-driven initiatives around GDBs and arguing the benefits of the suggested proposal. Finally, the proposed method is shown, through an exhaustive simulation campaign, to significantly reduce the delay over random access channels.
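The intuition behind group-assisted random access can be illustrated with a small Monte Carlo model: if only one head per group contends for preambles on behalf of its members, the contender population shrinks and collisions (hence access delay) drop, even when imperfect D2D links force some members to fall back to direct access. This is an illustrative sketch, not the paper's simulation model; the sensor count, preamble count, group size, and D2D reliability below are assumptions.

```python
import random

random.seed(1)

def collision_rate(n, preambles, trials=20000):
    """Monte Carlo estimate of the fraction of contenders whose chosen
    preamble is also picked by at least one other contender."""
    collided = 0
    for _ in range(trials):
        counts = {}
        for _ in range(n):
            p = random.randrange(preambles)
            counts[p] = counts.get(p, 0) + 1
        collided += sum(c for c in counts.values() if c > 1)
    return collided / (n * trials)

N, P = 60, 54        # sensors and contention preambles (54 as in LTE RACH)
g, r = 6, 0.9        # group size and D2D link reliability (assumptions)

direct = collision_rate(N, P)             # baseline: every sensor contends
heads = N // g                            # one head contends per group
fallback = round((N - heads) * (1 - r))   # expected members with failed D2D
grouped = collision_rate(heads + fallback, P)
# Grouping shrinks the contender population from 60 to ~15, sharply
# cutting the collision rate even with 10% unreliable D2D links.
```

Sweeping `g` and `r` in this model reproduces the qualitative trade-off the abstract studies: larger groups reduce contention, but lower D2D reliability erodes the gain by pushing members back onto the random access channel.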
The newly introduced ultra-reliable low latency communication service class in 5G New Radio depends on innovative low latency radio resource management solutions that can guarantee high reliability. Grant-free random access, where channel resources are accessed without undergoing assignment through a handshake process, is proposed in 5G New Radio as an important latency-reducing solution. However, this comes at an increased likelihood of collisions resulting from uncontrolled channel access when the same resources are preallocated to a group of users, so novel reliability enhancement techniques are needed. This article provides an overview of grant-free random access in 5G New Radio, focusing on the ultra-reliable low latency communication service class, and presents two reliability-enhancing solutions. The first proposes retransmissions over shared resources, whereas the second combines grant-free transmission with non-orthogonal multiple access, with overlapping transmissions resolved through the use of advanced receivers. Both proposed solutions result in significant performance gains in terms of reliability as well as resource efficiency. For example, the proposed non-orthogonal multiple access scheme can support a normalized load of more than 1.5 users/slot at packet loss rates of ~10^{-5} - a significant improvement over the maximum supported load with conventional grant-free schemes like slotted-ALOHA.
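To see how severe the slotted-ALOHA baseline really is at a 10^{-5} packet loss target, a back-of-envelope calculation helps. Assuming an idealized model (Poisson load G transmissions/slot, a transmission succeeds iff no other node transmits in the same slot, and retransmission attempts treated as independent, which ignores the extra load retransmissions themselves create):

```python
import math

def aloha_loss(load, attempts=1):
    """Packet loss after 'attempts' independent slotted-ALOHA tries;
    a single try fails with probability 1 - exp(-load)."""
    return (1.0 - math.exp(-load)) ** attempts

def max_load(target=1e-5, attempts=1):
    """Largest load (via bisection) keeping loss below 'target'."""
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if aloha_loss(mid, attempts) < target:
            lo = mid
        else:
            hi = mid
    return lo

single = max_load(attempts=1)   # ~1e-5 users/slot: single-shot collapses
four = max_load(attempts=4)     # ~0.06 users/slot with 4 tries
```

Even with four attempts per packet, this idealized model sustains only ~0.06 users/slot at 10^{-5} loss, which puts the >1.5 users/slot figure reported for the non-orthogonal multiple access scheme in perspective.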