
An Overview of Machine Learning Approaches in Wireless Mesh Networks

Posted by Haris Gacanin
Publication date: 2018
Research field: Informatics Engineering
Paper language: English





Wireless Mesh Networks (WMNs) have been extensively studied for nearly two decades as one of the most promising candidates expected to power the high bandwidth, high coverage wireless networks of the future. However, consumer demand for such networks has only recently caught up, rendering efforts at optimizing WMNs to support high capacities and offer high QoS, while being secure and fault tolerant, more important than ever. To this end, a recent trend has been the application of Machine Learning (ML) to solve various design and management tasks related to WMNs. In this work, we discuss key ML techniques and analyze how past efforts have applied them in WMNs, while noting some existing issues and suggesting potential solutions. We also provide directions on how ML could advance future research and examine recent developments in the field.
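To make the setting concrete, the sketch below shows one task of the kind the survey covers: supervised prediction of a mesh link's packet delivery ratio from simple radio features. The feature set, the synthetic data, and the choice of a random-forest model are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch: supervised link-quality prediction for a mesh link.
# Features, data, and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
rssi = rng.uniform(-90, -40, n)            # received signal strength, dBm
snr = rng.uniform(5, 35, n)                # signal-to-noise ratio, dB
retx = rng.poisson(2, n)                   # recent retransmission count
# Synthetic target: packet delivery ratio loosely driven by the features.
pdr = np.clip(0.5 + 0.01 * (rssi + 70) + 0.01 * snr - 0.05 * retx
              + rng.normal(0, 0.05, n), 0, 1)

X = np.column_stack([rssi, snr, retx])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[:400], pdr[:400])
print("held-out R^2:", round(model.score(X[400:], pdr[400:]), 3))
```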


Read also

Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. Such complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the usage of coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper by proposing new possible research directions.
Wireless mesh networks are a promising technology for connecting sensors and actuators with high flexibility and low investment costs. In industrial applications, however, reliability is essential. Therefore, two time-slotted medium access methods, DSME and TSCH, were added to the IEEE 802.15.4 standard. They allow collision-free communication in multi-hop networks and provide channel hopping for mitigating external interferences. The slot schedule used in these networks is of high importance for the network performance. This paper supports the development of efficient schedules by providing an analytical model for the assessment of such schedules, focused on TSCH. A Markov chain model for the finite queue on every node is introduced that takes the slot distribution into account. The models of all nodes are interconnected to calculate network metrics such as packet delivery ratio, end-to-end delay and throughput. An evaluation compares the model with a simulation of the Orchestra schedule. The model is applied to Orchestra as well as to two simple distributed scheduling algorithms to demonstrate the importance of traffic-awareness for achieving high throughput.
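The abstract above builds a per-node Markov chain for a finite queue and combines the nodes to obtain network metrics. The following is a minimal, self-contained sketch of that idea for a single node only, assuming Bernoulli arrivals with probability p per slot and a fixed per-slot service probability s on the node's allocated slots; these parameters and the single-node simplification are illustrative assumptions, not the paper's actual model.

```python
# Single-node sketch of a discrete-time Markov chain for a finite queue
# under a TSCH-like schedule. All parameters are illustrative assumptions.
import numpy as np

def queue_transition_matrix(p, s, Q):
    """One-slot transition matrix over queue lengths 0..Q:
    a packet arrives w.p. p; a packet departs w.p. s if the queue is
    non-empty; arrivals to a full queue are dropped."""
    P = np.zeros((Q + 1, Q + 1))
    for q in range(Q + 1):
        s_eff = s if q > 0 else 0.0
        for arr, pa in ((1, p), (0, 1 - p)):
            for dep, pd in ((1, s_eff), (0, 1 - s_eff)):
                if pd == 0.0:
                    continue
                nq = min(max(q + arr - dep, 0), Q)
                P[q, nq] += pa * pd
    return P

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalised to a distribution."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

p, s, Q = 0.3, 0.5, 8                         # illustrative parameters
pi = stationary_distribution(queue_transition_matrix(p, s, Q))
throughput = s * (1 - pi[0])                  # packets served per slot
pdr = 1 - pi[Q] * (1 - s)                     # arriving packet not dropped at a full queue
mean_queue = float(np.dot(np.arange(Q + 1), pi))
delay = mean_queue / throughput               # Little's law estimate, in slots
print(f"PDR~{pdr:.3f}  throughput~{throughput:.3f} pkt/slot  delay~{delay:.2f} slots")
```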
Future wireless networks have substantial potential in terms of supporting a broad range of complex, compelling applications in both military and civilian fields, where the users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in the compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services as well as scenarios of future wireless networks.
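As a concrete illustration of one ML family reviewed above, the sketch below shows reinforcement learning in its simplest tabular form: epsilon-greedy value learning for choosing between two channels in a toy dynamic-spectrum-access setting. The environment (fixed per-channel busy probabilities) and all hyperparameters are invented for illustration and are not taken from the article.

```python
# Toy stateless Q-learning (a bandit special case) for channel selection.
import random

N_CHANNELS = 2
BUSY_PROB = [0.8, 0.3]        # hypothetical primary-user occupancy per channel
Q = [0.0] * N_CHANNELS        # value estimate of transmitting on each channel
alpha, eps = 0.1, 0.1         # learning rate and exploration rate

random.seed(0)
for step in range(5000):
    # Epsilon-greedy channel selection.
    a = random.randrange(N_CHANNELS) if random.random() < eps \
        else max(range(N_CHANNELS), key=lambda c: Q[c])
    # Reward 1 if the chosen channel is idle (transmission succeeds), else 0.
    reward = 0.0 if random.random() < BUSY_PROB[a] else 1.0
    Q[a] += alpha * (reward - Q[a])   # stateless value update

print("learned channel values:", [round(q, 2) for q in Q])  # channel 1 should score higher
```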
Xin Fan, Yan Huo (2021)
The ultra-low latency supported by the fifth generation (5G) gives impetus to the prosperity of many wireless network applications, such as autonomous driving, robotics, telepresence, virtual reality and so on. Ultra-low latency is not achieved in a moment, but requires long-term evolution of network structure and key enabling communication technologies. In this paper, we provide an evolutionary overview of low latency in mobile communication systems, including two different evolutionary perspectives: 1) network architecture; 2) physical layer air interface technologies. We first describe in detail the evolution of communication network architecture from the second generation (2G) to 5G, highlighting the key points reducing latency. Moreover, we review the evolution of key enabling technologies in the physical layer from 2G to 5G, which is also aimed at reducing latency. We also discuss the challenges and future research directions for low latency in network architecture and the physical layer.
Weifeng Sun, Rong Cong, Feng Xia (2010)
Even though channel assignment has been studied for years, the performance of most IEEE 802.11-based multi-hop wireless networks such as wireless sensor networks (WSNs), wireless mesh networks (WMNs) and mobile ad hoc networks (MANETs) is limited by channel interference. Properly assigning orthogonal channels to wireless links can improve the throughput of multi-hop networks. To solve the dynamic channel assignment problem, a routing-based channel assignment algorithm called R-CA is proposed. R-CA can allocate channels for wireless nodes when needed and free channels after data transmission. Thus, more channel resources can be exploited by wireless nodes. Simulation results show that R-CA can effectively enhance the network throughput and packet delivery rate.
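The abstract does not spell out R-CA's details, so the sketch below only illustrates the general idea of routing-driven channel assignment: when a route is set up, each link greedily picks an orthogonal channel not used by interfering links, and the channels are released when the flow ends. The class, the method names and the shared-node interference test are hypothetical simplifications, not the algorithm from the paper.

```python
# Illustrative routing-driven channel assignment sketch (not the paper's R-CA).
CHANNELS = [1, 6, 11]        # e.g. non-overlapping IEEE 802.11 channels

class ChannelAllocator:
    def __init__(self):
        self.link_channel = {}                     # (u, v) -> assigned channel

    @staticmethod
    def _interferes(link_a, link_b):
        # Simplification: two links interfere if they share a node.
        return bool(set(link_a) & set(link_b))

    def assign_route(self, route):
        """Assign a channel to every link of route (a list of node ids)."""
        chosen = {}
        for link in zip(route, route[1:]):
            in_use = {**self.link_channel, **chosen}
            busy = {ch for other, ch in in_use.items() if self._interferes(link, other)}
            free = [ch for ch in CHANNELS if ch not in busy]
            # Fall back to the least-used channel if no interference-free one exists.
            chosen[link] = free[0] if free else min(
                CHANNELS, key=lambda c: list(in_use.values()).count(c))
        self.link_channel.update(chosen)
        return chosen

    def release_route(self, route):
        """Free the channels once the flow on this route has finished."""
        for link in zip(route, route[1:]):
            self.link_channel.pop(link, None)

alloc = ChannelAllocator()
print(alloc.assign_route(["A", "B", "C", "D"]))  # e.g. {('A','B'): 1, ('B','C'): 6, ('C','D'): 1}
alloc.release_route(["A", "B", "C", "D"])
```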