
NeuroRAN: Rethinking Virtualization for AI-native Radio Access Networks in 6G

Posted by James Gross
Publication date: 2021
Language: English





Network softwarization has revolutionized the architecture of cellular wireless networks. State-of-the-art container-based virtual radio access networks (vRAN) provide enormous flexibility and reduced life-cycle management costs, but they also come with prohibitive energy consumption. We argue that for future AI-native wireless networks to be flexible and energy efficient, network softwarization needs a new abstraction that caters for neural-network workloads and allows a large degree of service composability. In this paper we present the NeuroRAN architecture, which adopts stateful functions as the user-facing execution model, complemented by virtualized resources and decentralized resource management. We show that neural-network-based implementations of common transceiver functional blocks fit the proposed architecture, and we discuss key research challenges related to compilation and code generation, resource management, reliability, and security.
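To make the stateful-function execution model concrete, here is a minimal sketch, assuming a hypothetical runtime; the class names (StatefulFunction, NeuralEqualizer) and the tiny network are illustrative and not the paper's API. The point is that a transceiver block is deployed as an invocable function whose neural weights live in durable per-instance state, so it can be updated or migrated independently of other services.

```python
# Minimal sketch of a stateful function hosting a neural transceiver block.
# All names and dimensions are hypothetical, for illustration only.
import numpy as np

class StatefulFunction:
    """A user-facing unit of deployment: logic plus durable per-instance state."""
    def __init__(self):
        self.state = {}

    def invoke(self, message):
        raise NotImplementedError

class NeuralEqualizer(StatefulFunction):
    """Transceiver functional block realized as a small neural network whose
    weights are kept in function state rather than in the process image."""
    def __init__(self, taps=8, hidden=16):
        super().__init__()
        rng = np.random.default_rng(0)
        self.state["w1"] = rng.normal(scale=0.1, size=(taps, hidden))
        self.state["w2"] = rng.normal(scale=0.1, size=(hidden, 1))

    def invoke(self, rx_window):
        # One forward pass: equalize a window of received samples.
        h = np.tanh(rx_window @ self.state["w1"])
        return h @ self.state["w2"]

eq = NeuralEqualizer()
symbol = eq.invoke(np.ones((1, 8)))  # invoked like any other service call
```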




Read also

164 - Wen Wu, Conghao Zhou, Mushu Li (2021)
With the global roll-out of the fifth generation (5G) networks, it is necessary to look beyond 5G and envision the sixth generation (6G) networks. The 6G networks are expected to have space-air-ground integrated networking, advanced network virtualization, and ubiquitous intelligence. This article proposes an artificial intelligence (AI)-native network slicing architecture for 6G networks to facilitate intelligent network management and support emerging AI services. AI is built into the proposed network slicing architecture to enable the synergy of AI and network slicing. AI solutions are investigated for the entire lifecycle of network slicing to facilitate intelligent network management, i.e., AI for slicing. Furthermore, network slicing approaches are discussed to support emerging AI services by constructing slice instances and performing efficient resource management, i.e., slicing for AI. Finally, a case study is presented, followed by a discussion of open research issues that are essential for AI-native network slicing in 6G.
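As an illustration of the "AI for slicing" direction, the sketch below shows a minimal closed loop in which a predictor forecasts a slice's demand and the slice's resource share is rescaled accordingly. The moving average stands in for a learned model; all names, numbers, and the headroom policy are hypothetical, not from the article.

```python
# Illustrative "AI for slicing" closed loop: predict next-interval demand,
# then rescale the slice's resource share. Hypothetical sketch.
import numpy as np

def predict_load(history, window=4):
    return float(np.mean(history[-window:]))  # placeholder for an AI model

def scale_slice(predicted, capacity, headroom=1.2):
    target = min(predicted * headroom / capacity, 1.0)
    return max(target, 0.05)  # keep a minimum guaranteed share

loads = [10, 12, 15, 18, 22]  # observed slice demand (e.g., Mbps)
share = scale_slice(predict_load(loads), capacity=100.0)
print(f"new resource share: {share:.2f}")
```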
Current network access infrastructures are characterized by heterogeneity, low latency, high throughput, and high computational capability, enabling massive concurrent connections and various services. Unfortunately, this design does not pay significant attention to mobile services in underserved areas. In this context, the use of aerial radio access networks (ARANs) is a promising strategy to complement existing terrestrial communication systems. Involving airborne components such as unmanned aerial vehicles, drones, and satellites, ARANs can quickly establish a flexible access infrastructure on demand. ARANs are expected to support the development of seamless mobile communication systems toward a comprehensive sixth-generation (6G) global access infrastructure. This paper provides an overview of recent studies regarding ARANs in the literature. First, we investigate related work to identify areas for further exploration in terms of recent knowledge advancements and analyses. Second, we define the scope and methodology of this study. Then, we describe ARAN architecture and its fundamental features for the development of 6G networks. In particular, we analyze the system model from several perspectives, including transmission propagation, energy consumption, communication latency, and network mobility. Furthermore, we introduce technologies that enable the success of ARAN implementations in terms of energy replenishment, operational management, and data delivery. Subsequently, we discuss application scenarios envisioned for these technologies. Finally, we highlight ongoing research efforts and trends toward 6G ARANs.
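For a flavor of the transmission-propagation analysis such surveys cover, here is a sketch of the widely cited probabilistic line-of-sight air-to-ground model (Al-Hourani et al.): mean pathloss is free-space pathloss plus an elevation-angle-weighted mix of LoS and NLoS excess losses. The environment constants a, b and the excess-loss values below are illustrative, not taken from this paper.

```python
# Mean air-to-ground pathloss for a UAV at height h to a user at ground
# distance r. Environment parameters are illustrative placeholder values.
import numpy as np

def avg_pathloss_db(h, r, f_hz=2e9, a=9.61, b=0.16, eta_los=1.0, eta_nlos=20.0):
    """Mean pathloss (dB); h, r in meters, f_hz is the carrier frequency."""
    d = np.hypot(h, r)                                   # slant range
    theta = np.degrees(np.arctan2(h, r))                 # elevation angle (deg)
    p_los = 1.0 / (1.0 + a * np.exp(-b * (theta - a)))   # LoS probability
    fspl = 20 * np.log10(4 * np.pi * f_hz * d / 3e8)     # free-space pathloss
    return fspl + p_los * eta_los + (1 - p_los) * eta_nlos

print(f"{avg_pathloss_db(h=100.0, r=300.0):.1f} dB")
```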
Driven by the emerging use cases in massive access future networks, there is a need for technological advancements and evolutions for wireless communications beyond the fifth-generation (5G) networks. In particular, we envisage the upcoming sixth-generation (6G) networks to consist of numerous devices demanding extremely high-performance interconnections even under strenuous scenarios such as diverse mobility, extreme density, and dynamic environment. To cater for such a demand, investigation on flexible and sustainable radio access network (RAN) techniques capable of supporting highly diverse requirements and massive connectivity is of utmost importance. To this end, this paper first outlines the key driving applications for 6G, including smart city and factory, which trigger the transformation of existing RAN techniques. We then examine and provide in-depth discussions on several critical performance requirements (i.e., the level of flexibility, the support for massive interconnectivity, and energy efficiency), issues, enabling technologies, and challenges in designing 6G massive RANs. We conclude the article by providing several artificial-intelligence-based approaches to overcome future challenges.
We present DeepIA, a deep neural network (DNN) framework for enabling fast and reliable initial access (IA) for AI-driven beyond-5G and 6G millimeter-wave (mmWave) networks. DeepIA reduces the beam sweep time compared to a conventional exhaustive-search-based IA process by utilizing only a subset of the available beams. DeepIA maps received signal strengths (RSSs) obtained from a subset of beams to the beam that is best oriented to the receiver. In both line-of-sight (LoS) and non-line-of-sight (NLoS) conditions, DeepIA reduces the IA time and outperforms the conventional IA in beam prediction accuracy. We show that the beam prediction accuracy of DeepIA saturates with the number of beams used for IA and depends on the particular selection of the beams. In LoS conditions the selection of the beams is consequential and improves the accuracy by up to 70%; in NLoS conditions it improves the accuracy by up to 35%. We find that averaging multiple RSS snapshots further reduces the number of beams needed and achieves more than 95% accuracy in both LoS and NLoS conditions. Finally, we evaluate the beam prediction time of DeepIA through an embedded hardware implementation and show the improvement over conventional beam sweeping.
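A minimal sketch of the mapping at DeepIA's core as we read the abstract: a small classifier takes RSS measurements from a fixed subset of probed beams and predicts the index of the best beam, replacing an exhaustive sweep. The weights below are untrained and the dimensions illustrative; the paper's actual architecture and training procedure are not reproduced here.

```python
# RSS-to-best-beam classifier sketch: probe 8 of 64 beams, predict the best.
# Untrained random weights; dimensions are illustrative assumptions.
import numpy as np

N_PROBE, N_BEAMS, HIDDEN = 8, 64, 128
rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.1, size=(N_PROBE, HIDDEN))
W2 = rng.normal(scale=0.1, size=(HIDDEN, N_BEAMS))

def predict_best_beam(rss_dbm):
    """Map an RSS vector (dBm, one entry per probed beam) to a beam index."""
    x = (rss_dbm - rss_dbm.mean()) / (rss_dbm.std() + 1e-9)  # normalize
    logits = np.maximum(x @ W1, 0.0) @ W2                    # ReLU MLP
    return int(np.argmax(logits))

rss = rng.uniform(-90.0, -60.0, size=N_PROBE)  # one snapshot of probed beams
print("predicted beam:", predict_best_beam(rss))
```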
161 - Weisen Shi, Junling Li, Nan Cheng (2019)
Drone base station (DBS) is a promising technique to extend wireless connections for uncovered users of terrestrial radio access networks (RAN). To improve user fairness and network performance, in this paper, we design 3D trajectories of multiple DBSs in the drone-assisted radio access networks (DA-RAN) where DBSs fly over associated areas of interest (AoIs) and relay communications between the base station (BS) and users in AoIs. We formulate the multi-DBS 3D trajectory planning and scheduling as a mixed integer non-linear programming (MINLP) problem with the objective of minimizing the average DBS-to-user (D2U) pathloss. The 3D trajectory variations in both horizontal and vertical directions, as well as the state-of-the-art DBS-related channel models, are considered in the formulation. To address the non-convexity and NP-hardness of the MINLP problem, we first decouple it into multiple integer linear programming (ILP) and quasi-convex sub-problems in which AoI association, D2U communication scheduling, horizontal trajectories and flying heights of DBSs are respectively optimized. Then, we design a multi-DBS 3D trajectory planning and scheduling algorithm to solve the sub-problems iteratively based on the block coordinate descent (BCD) method. A k-means-based initial trajectory generation and a search-based start slot scheduling are considered in the proposed algorithm to improve trajectory design performance and ensure the inter-DBS distance constraint, respectively. Extensive simulations are conducted to investigate the impacts of DBS quantity, horizontal speed and initial trajectory on the trajectory planning results. Compared with the static DBS deployment, the proposed trajectory planning can achieve a 10-15 dB reduction in average D2U pathloss and reduce the D2U pathloss standard deviation by 68%, indicating improved network performance and user fairness.
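The k-means-based initial trajectory generation mentioned in the abstract can be sketched as follows: cluster the AoI locations so that each DBS starts from waypoints near its own cluster, before the BCD loop refines trajectories. This is a plain Lloyd's iteration under assumed positions and counts; the BCD refinement and scheduling steps are omitted.

```python
# k-means initialization for multi-DBS trajectories: one cluster center per
# DBS serves as its initial waypoint. Positions and counts are illustrative.
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = np.argmin(dists, axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

aois = np.random.default_rng(2).uniform(0, 1000, size=(30, 2))  # AoI positions (m)
waypoints, assignment = kmeans(aois, k=3)                        # 3 DBSs
print(waypoints)  # initial waypoint per DBS, refined later by BCD
```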