
Computing Research Challenges in Next Generation Wireless Networking

Added by Elisa Bertino
Publication date: 2021
Language: English





By all measures, wireless networking has seen explosive growth over the past decade. Fourth Generation Long Term Evolution (4G LTE) cellular technology has increased the bandwidth available for smartphones, in essence delivering broadband speeds to mobile devices. The most recent 5G technology further enhances transmission speeds and cell capacity, and reduces latency, through the use of different radio technologies, and is expected to provide Internet connections that are an order of magnitude faster than 4G LTE. Technology continues to advance rapidly, however, and the next generation, 6G, is already being envisioned. 6G will make possible a wide range of powerful new applications, including holographic telepresence, telehealth, remote education, ubiquitous robotics and autonomous vehicles, smart cities and communities (IoT), and advanced manufacturing (Industry 4.0, sometimes referred to as the Fourth Industrial Revolution), to name but a few. The advances we will see begin at the hardware level and extend all the way to the top of the software stack. Artificial Intelligence (AI) will also play a greater role in the development and management of wireless networking infrastructure, becoming embedded in applications throughout all levels of the network. The resulting benefits to society will be enormous. At the same time that these exciting new wireless capabilities are appearing on the horizon, a broad range of research challenges looms ahead. These stem from the ever-increasing complexity of the hardware and software systems, along with the need to provide infrastructure that is robust and secure while simultaneously protecting the privacy of users. Here we outline some of those challenges and provide recommendations for the research needed to address them.



Related Research

Computing has dramatically changed nearly every aspect of our lives, from business and agriculture to communication and entertainment. As a nation, we rely on computing in the design of systems for energy, transportation and defense; and computing fuels scientific discoveries that will improve our fundamental understanding of the world and help develop solutions to major challenges in health and the environment. Computing has changed our world, in part, because our innovations can run on computers whose performance and cost-performance have improved a million-fold over the last few decades. A driving force behind this has been a repeated doubling of the transistors per chip, dubbed Moore's Law. A concomitant enabler has been Dennard scaling, which has permitted these performance doublings at roughly constant power, but, as we will see, both trends face challenges. Consider for a moment the impact of these two trends over the past 30 years. A 1980s supercomputer (e.g. a Cray 2) was rated at nearly 2 Gflops and consumed nearly 200 kW of power. At the time, it was used for high-performance and national-scale applications ranging from weather forecasting to nuclear weapons research. A computer of similar performance now fits in our pocket and consumes less than 10 watts. What would be the implications of a similar computing/power reduction over the next 30 years - that is, taking a petaflop-scale machine (e.g. the Cray XK7, which requires about 500 kW for 1 Pflop (= 10^15 operations/sec) performance) and repeating that process? What is possible with such a computer in your pocket? How would it change the landscape of high-capacity computing? In the remainder of this paper, we articulate some opportunities and challenges for dramatic performance improvements of computing at scales from personal to national, and discuss some out-of-the-box possibilities for achieving computing at this scale.
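The scale of the improvement described above is easy to make concrete. The back-of-the-envelope sketch below reuses only the figures quoted in the abstract; the "pocket petaflop" power budget at the end is an extrapolation under the assumption that the Cray-2-to-phone efficiency gain simply repeats, not a measured result.

```python
# Back-of-the-envelope sketch of the performance-per-watt trend described
# above. Figures are taken from the abstract (Cray 2: ~2 Gflops at ~200 kW;
# a modern phone: comparable performance under ~10 W; Cray XK7: ~1 Pflop at
# ~500 kW); the extrapolation at the end is an assumption, not a measurement.

CRAY2_FLOPS = 2e9        # ~2 Gflops
CRAY2_WATTS = 200e3      # ~200 kW

PHONE_FLOPS = 2e9        # similar performance, in your pocket
PHONE_WATTS = 10.0       # < 10 W

XK7_FLOPS = 1e15         # ~1 Pflop
XK7_WATTS = 500e3        # ~500 kW

def flops_per_watt(flops: float, watts: float) -> float:
    """Energy efficiency in operations per second per watt."""
    return flops / watts

improvement = (flops_per_watt(PHONE_FLOPS, PHONE_WATTS)
               / flops_per_watt(CRAY2_FLOPS, CRAY2_WATTS))
print(f"Cray 2 -> phone efficiency gain: {improvement:,.0f}x")  # ~20,000x

# Repeating that gain on the XK7 would put ~1 Pflop in a ~25 W envelope.
pocket_petaflop_watts = XK7_WATTS / improvement
print(f"Hypothetical pocket-petaflop power budget: {pocket_petaflop_watts:.1f} W")
```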
The Quantum Internet is envisioned as the final stage of the quantum revolution, opening fundamentally new communications and computing capabilities, including distributed quantum computing. But the Quantum Internet is governed by the laws of quantum mechanics. Phenomena with no counterpart in classical networks, such as no-cloning, quantum measurement, entanglement and teleportation, impose very challenging constraints on network design. Specifically, classical network functionalities, ranging from error-control mechanisms to overhead-control strategies, are based on the assumption that classical information can be safely read and copied. But this assumption does not hold in the Quantum Internet. As a consequence, the design of the Quantum Internet requires a major network-paradigm shift to harness the specificities of quantum mechanics. The goal of this work is to shed light on the challenges and the open problems of Quantum Internet design. To this aim, we first introduce some basic knowledge of quantum mechanics, needed to understand the differences between a classical and a quantum network. Then, we introduce quantum teleportation as the key strategy for transmitting quantum information without physically transferring the particle that stores the quantum information or violating the principles of quantum mechanics. Finally, the key research challenges in designing quantum communication networks are described.
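Since teleportation is the transmission primitive this abstract singles out, a toy state-vector simulation helps make the protocol concrete. This is an illustrative sketch only (the amplitudes, qubit ordering, and random seed are assumptions for the example), not an implementation from the paper:

```python
import numpy as np

# Toy simulation of quantum teleportation: Alice sends an unknown qubit
# state to Bob using a shared Bell pair plus two classical bits.

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# The (unknown) state Alice wants to send: |psi> = a|0> + b|1>.
a, b = 0.6, 0.8j
psi = np.array([a, b], dtype=complex)

# Qubit order: [Alice's message, Alice's half of Bell pair, Bob's half].
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00>+|11>)/sqrt(2)
state = np.kron(psi, bell)

# Bell measurement circuit: CNOT (control qubit 0, target qubit 1), then H on qubit 0.
CNOT01 = np.zeros((8, 8), dtype=complex)
for i in range(8):
    bits = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
    if bits[0] == 1:
        bits[1] ^= 1
    CNOT01[(bits[0] << 2) | (bits[1] << 1) | bits[2], i] = 1
state = kron(H, I, I) @ (CNOT01 @ state)

# Alice measures qubits 0 and 1 (sample one outcome, then project and renormalise).
rng = np.random.default_rng(0)
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
mask = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)])
state = np.where(mask, state, 0)
state /= np.linalg.norm(state)

# Bob applies the corrections X^m1 then Z^m0, conditioned on the classical bits.
correction = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1)
state = kron(I, I, correction) @ state

# Bob's qubit now carries |psi> even though no particle crossed the network.
bob = np.array([state[(m0 << 2) | (m1 << 1)], state[(m0 << 2) | (m1 << 1) | 1]])
print("teleported state matches:", np.allclose(bob, psi))
```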
The 5G Phase-2 and beyond wireless systems will focus more on vertical applications such as autonomous driving and the industrial Internet-of-Things, many of which are categorized as ultra-Reliable Low-Latency Communications (uRLLC). In this article, an alternative view on uRLLC is presented: that information latency, which measures the distortion of information resulting from the time lag of its acquisition process, is more relevant than the conventional communication latency of uRLLC in wireless networked control systems. An AI-assisted Situationally-aware Multi-Agent Reinforcement learning framework for wireless neTworks (SMART) is presented to address the information-latency optimization challenge. Case studies of typical applications in Autonomous Driving (AD) are demonstrated, namely dense platooning and intersection management, which show that SMART can effectively optimize information latency and, more importantly, that information-latency-optimized systems significantly outperform conventional uRLLC-oriented systems in terms of AD performance such as traffic efficiency, thus pointing to a new research and system-design paradigm.
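To make the distinction between the two latency notions tangible, here is a small self-contained sketch that contrasts average communication latency with an age-of-information style staleness measure, used purely as a stand-in for the paper's information-latency metric; the exact definition in the paper, like the traffic parameters below, is not reproduced here:

```python
import numpy as np

# Illustrative sketch (not the paper's SMART framework): a sensor sends
# periodic updates to a controller over a network with random delays. We
# compare the mean per-packet delay with the mean "age" of the freshest
# information at the controller. All numbers are synthetic assumptions.

rng = np.random.default_rng(1)

gen_times = np.arange(0, 1.0, 0.02)                  # updates every 20 ms
delays = rng.uniform(0.005, 0.05, gen_times.size)    # 5-50 ms network delay
recv_times = gen_times + delays

avg_comm_latency = delays.mean()

# Age of information at time t: t minus the generation time of the
# freshest update already delivered by t, evaluated on a fine time grid.
grid = np.linspace(0, 1.0, 10_000)
age = np.empty_like(grid)
for k, t in enumerate(grid):
    delivered = gen_times[recv_times <= t]
    age[k] = t - delivered.max() if delivered.size else t

print(f"average communication latency: {avg_comm_latency * 1e3:.1f} ms")
print(f"average information age:       {age.mean() * 1e3:.1f} ms")
```

The point of the toy model is that the controller's view can be stale even when each packet is fast: the information age grows between updates, so optimizing it trades off sampling, scheduling, and delay jointly rather than per-packet delay alone.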
The advent of miniature biosensors has generated numerous opportunities for deploying wireless sensor networks in healthcare. However, an important barrier is that acceptance by healthcare stakeholders is influenced by the effectiveness of privacy safeguards for personal and intimate information, which is collected and transmitted over the air, within and beyond these networks. In particular, these networks are progressing beyond traditional sensors towards also using multimedia sensors, which raise further privacy concerns. Paradoxically, less research has addressed privacy protection than security. Nevertheless, privacy protection has gradually evolved from being assumed to be an implicit by-product of security measures and is maturing into a research concern in its own right. However, further technical and socio-technical advances are needed. As a contribution towards galvanising further research, the hallmarks of this paper include: (i) a literature survey explicitly anchored on privacy preservation; it is underpinned by untangling privacy goals from security goals, to avoid mixing privacy and security concerns, as is often the case in other papers; (ii) a critical survey of privacy-preservation services for wireless sensor networks in healthcare, including threat analysis and assessment methodologies; it also offers classification trees for the multifaceted challenge of privacy protection in healthcare, and for privacy threats, attacks and countermeasures; (iii) a discussion of technical advances complemented by reflection on the implications of regulatory frameworks; (iv) a discussion of open research challenges, leading on to directions for future research towards privacy protection that is appropriate for healthcare in the twenty-first century.
A high-rate yet low-cost air-to-ground (A2G) communication backbone is conceived for integrating the space and terrestrial networks by harnessing the opportunistic assistance of passenger planes or high-altitude platforms (HAPs) as mobile base stations (BSs) and millimetre-wave communication. The airliners act as the network provider for terrestrial users while relying on satellite backhaul. Three different beamforming techniques relying on a large-scale planar array are used for transmission by the airliner/HAP to achieve a high directional gain, thereby minimizing interference among the users. Furthermore, approximate spectral efficiency (SE) and area spectral efficiency (ASE) expressions are derived and quantified for diverse system parameters.
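To give a feel for why a large planar array yields the directional gain this abstract relies on, the sketch below computes the gain of a conventional (matched) beamformer on a uniform planar array toward a served user and toward a nearby off-beam user; the array size, half-wavelength spacing, and angles are illustrative assumptions, not the paper's system parameters:

```python
import numpy as np

# Illustrative sketch: directional gain of an N x M uniform planar array
# with conventional (matched-filter) beamforming. All parameters below are
# assumptions chosen for the example.

def steering_vector(n, m, theta, phi, d=0.5):
    """Unit-norm response of an n x m planar array (element spacing d, in
    wavelengths) toward elevation theta and azimuth phi (radians)."""
    kx = 2 * np.pi * d * np.sin(theta) * np.cos(phi)
    ky = 2 * np.pi * d * np.sin(theta) * np.sin(phi)
    ax = np.exp(1j * kx * np.arange(n))
    ay = np.exp(1j * ky * np.arange(m))
    return np.kron(ax, ay) / np.sqrt(n * m)

N = M = 16                                        # 256-element array
w = steering_vector(N, M, np.deg2rad(30), np.deg2rad(45))  # beam at served user

# Beamforming gain toward the served user vs. a user 10 degrees off in azimuth.
served = abs(np.vdot(w, steering_vector(N, M, np.deg2rad(30), np.deg2rad(45)))) ** 2
offbeam = abs(np.vdot(w, steering_vector(N, M, np.deg2rad(30), np.deg2rad(55)))) ** 2

print(f"gain toward served user:  {10 * np.log10(served * N * M):.1f} dB")   # ~24 dB
print(f"gain 10 deg off the beam: {10 * np.log10(offbeam * N * M):.1f} dB")
```

The steep roll-off between the two printed gains is what keeps inter-user interference low and underpins the SE and ASE gains the abstract quantifies.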
