
Three-layer Approach to Detect Anomalies in Industrial Environments based on Machine Learning

Posted by Pedro Henrique Juliano Nardelli
Publication date: 2020
Paper language: English





This paper introduces a general approach to designing tailored solutions for detecting rare events in different industrial applications based on Internet of Things (IoT) networks and machine learning algorithms. We propose a general framework based on three layers (physical, data and decision) that defines the possible design options so that rare events/anomalies can be detected ultra-reliably. This framework is then applied to a well-known benchmark scenario, the Tennessee Eastman Process. We analyze this benchmark along three threads of data processing: acquisition, fusion and analytics. Our numerical results indicate that: (i) event-driven data acquisition can significantly decrease the number of samples while filtering measurement noise, (ii) the mutual information data fusion method can significantly reduce the variable space, and (iii) the quantitative association rule mining method for data analytics is effective for rare event detection, identification and diagnosis. These results indicate the benefits of an integrated solution that jointly considers the different levels of data processing following the proposed three-layer framework, including details of the communication network and computing platform to be employed.
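The data-layer ideas above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example (not the authors' code): it applies a send-on-delta rule as a stand-in for event-driven acquisition and ranks variables by mutual information as a stand-in for the fusion step; the threshold, data shapes and the synthetic "rare event" label are all assumptions.

```python
# Minimal sketch (not the authors' code): event-driven acquisition via a
# send-on-delta rule, followed by mutual-information ranking of variables.
# Threshold values, data shapes and the synthetic label are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def send_on_delta(signal, delta):
    """Keep a sample only when it deviates from the last kept sample by more than delta."""
    kept_idx = [0]
    for i, x in enumerate(signal[1:], start=1):
        if abs(x - signal[kept_idx[-1]]) > delta:
            kept_idx.append(i)
    return np.array(kept_idx)

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 52))            # e.g. 52 process variables, as in Tennessee Eastman
y = (X[:, 7] + 0.1 * rng.normal(size=2000) > 1.5).astype(int)  # synthetic "rare event" label

# Data acquisition: event-driven sampling of one noisy sensor stream
kept = send_on_delta(X[:, 7], delta=0.5)
print(f"samples kept: {len(kept)} / {len(X)}")

# Data fusion: keep only the variables most informative about the event
mi = mutual_info_classif(X, y, random_state=0)
top_vars = np.argsort(mi)[::-1][:10]
print("10 most informative variables:", top_vars)
```

In an integrated pipeline of the kind described above, the reduced variable set would then feed the association-rule-mining analytics stage at the decision layer.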




Read also

Wireless cellular networks have many parameters that are normally tuned upon deployment and re-tuned as the network changes. Many operational parameters affect reference signal received power (RSRP), reference signal received quality (RSRQ), signal-to-interference-plus-noise ratio (SINR), and, ultimately, throughput. In this paper, we develop and compare two approaches for maximizing coverage and minimizing interference by jointly optimizing the transmit power and downtilt (elevation tilt) settings across sectors. To evaluate different parameter configurations offline, we construct a realistic simulation model that captures geographic correlations. Using this model, we evaluate two optimization methods: deep deterministic policy gradient (DDPG), a reinforcement learning (RL) algorithm, and multi-objective Bayesian optimization (BO). Our simulations show that both approaches significantly outperform random search and converge to comparable Pareto frontiers, but that BO converges with two orders of magnitude fewer evaluations than DDPG. Our results suggest that data-driven techniques can effectively self-optimize coverage and capacity in cellular networks.
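As a rough illustration of the BO side of this comparison, the sketch below runs a small Gaussian-process Bayesian optimization loop over per-sector transmit power and downtilt against a toy, scalarized coverage/interference cost. The cost function, bounds and weights are invented for illustration, and a single scalarized objective is used instead of the paper's multi-objective Pareto search.

```python
# Hedged sketch (not the paper's simulator): Bayesian optimization of transmit
# power and downtilt against a toy coverage/interference cost (lower is better).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def network_cost(x):
    """Toy stand-in for the offline simulator: x = [tx_power_dBm, downtilt_deg]."""
    power, tilt = x
    coverage = -((power - 43.0) / 5.0) ** 2 - ((tilt - 8.0) / 3.0) ** 2
    interference = 0.02 * max(power - 40.0, 0.0) ** 2
    return -coverage + interference

bounds = np.array([[30.0, 46.0], [0.0, 15.0]])   # assumed power/tilt ranges
rng = np.random.default_rng(1)

# Initial random design, then sequential GP-based suggestions
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([network_cost(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(25):
    gp.fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(512, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    # Expected improvement acquisition (minimization form)
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, network_cost(x_next))

print("best (power dBm, downtilt deg):", X[np.argmin(y)], "cost:", y.min())
```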
With the explosively increasing demands on network capacity, throughput and the number of connected wireless devices, massive connectivity is an urgent problem for next generation wireless communications. In this paper, we propose a grant-free access protocol for massive connectivity that utilizes a large number of antennas at a base station (BS) and is expected to be widely deployed in cellular networks. The scheme consists of a sparse structure in sparse code multiple access (SCMA) and receiver processing based on dictionary learning (DL). A large number of devices can transmit data without any scheduling process. Unlike existing schemes, whose signal scheduling requires considerable overhead, the scheduling overhead required by the proposed scheme is negligible, which is attractive for resource utilization and transmission power efficiency. The numerical results show that the proposed scheme has promising performance in massive connectivity scenarios of cellular networks.
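A hedged sketch of the sparse-recovery idea behind such grant-free reception is shown below: a handful of active devices are detected from one received snapshot using a known spreading codebook and orthogonal matching pursuit. Note that the paper's receiver learns its dictionary from data (DL) rather than assuming a known codebook, and the problem sizes below are illustrative.

```python
# Minimal sketch, not the paper's receiver: detect which of many devices are
# active from one received snapshot, assuming a *known* spreading codebook.
# The paper learns this dictionary (DL); plain OMP stands in for sparse recovery.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)
n_devices, n_resources, n_active = 200, 64, 5   # illustrative sizes

codebook = rng.normal(size=(n_resources, n_devices)) / np.sqrt(n_resources)
active = rng.choice(n_devices, size=n_active, replace=False)
symbols = np.zeros(n_devices)
symbols[active] = rng.choice([-1.0, 1.0], size=n_active)   # BPSK for simplicity

received = codebook @ symbols + 0.05 * rng.normal(size=n_resources)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_active)
omp.fit(codebook, received)
detected = np.flatnonzero(omp.coef_)

print("true active devices:    ", np.sort(active))
print("detected active devices:", np.sort(detected))
```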
Unmanned aerial vehicles (UAVs) are emerging in commercial spaces and will support many applications and services, such as smart agriculture, dynamic network deployment and coverage extension, surveillance, and security. The unmanned aircraft system (UAS) traffic management (UTM) provides a framework for safe UAV operation, integrating UAV controllers and central databases via a communications network. This paper discusses the challenges and opportunities for machine learning (ML) in effectively providing critical UTM services. We introduce the four pillars of UTM (operation planning, situational awareness, status and advisories, and security) and discuss the main services, specific opportunities for ML, and the ongoing research. We conclude that the multi-faceted operating environment and operational parameters will benefit from collected data and data-driven algorithms, as well as online learning to face new UAV operation situations.
Yan Liu, Yansha Deng, Nan Jiang (2020)
NarrowBand-Internet of Things (NB-IoT) is a new 3GPP radio access technology designed to provide better coverage for Low Power Wide Area (LPWA) networks. To provide reliable connections with extended coverage, a repetition transmission scheme and up to three Coverage Enhancement (CE) groups are introduced into NB-IoT during both the Random Access CHannel (RACH) procedure and the data transmission procedure, where each CE group is configured with different repetition values and transmission resources. To characterize the RACH performance of the NB-IoT network with three CE groups, this paper develops a novel traffic-aware spatio-temporal model to analyze the RACH success probability, where both the preamble transmission outage and the collision events of each CE group jointly determine the traffic evolution and the RACH success probability. Based on this analytical model, we derive the analytical expression for the RACH success probability of a randomly chosen IoT device in each CE group over multiple time slots with different RACH schemes, including baseline, back-off (BO), access class barring (ACB), and hybrid ACB and BO schemes (ACB&BO). Our results show that the RACH success probabilities of devices in the three CE groups outperform that of a single-CE-group network, though not for all groups; this depends on the choice of the categorizing parameters. This mathematical model and analytical framework can be applied to evaluate the performance of multiple user groups in other networks with spatial separations.
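To make the collision part of this RACH analysis concrete, the toy Monte Carlo below estimates the probability that a randomly chosen device picks a preamble nobody else in its CE group picks. It ignores preamble transmission outage, back-off and traffic evolution, and the group loads and preamble pool sizes are assumed numbers, so it is only a sanity-check counterpart to the paper's spatio-temporal analytical model.

```python
# Toy Monte Carlo, not the paper's model: probability that a tagged device's
# preamble choice is collision-free within its CE group. All numbers are assumed.
import numpy as np

rng = np.random.default_rng(7)

def collision_free_prob(n_devices, n_preambles, trials=20000):
    success = 0
    for _ in range(trials):
        picks = rng.integers(0, n_preambles, size=n_devices)
        # device 0 succeeds if nobody else chose its preamble
        if np.count_nonzero(picks == picks[0]) == 1:
            success += 1
    return success / trials

# Three CE groups with different loads and preamble partitions (assumed numbers)
groups = {"CE0": (30, 24), "CE1": (15, 12), "CE2": (8, 12)}
for name, (n_dev, n_pre) in groups.items():
    p = collision_free_prob(n_dev, n_pre)
    print(f"{name}: {n_dev} devices, {n_pre} preambles -> P(no collision) = {p:.3f}")
```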
With the development and widespread use of wireless devices in recent years (mobile phones, Internet of Things, Wi-Fi), the electromagnetic spectrum has become extremely crowded. In order to counter security threats posed by rogue or unknown transmitters, it is important to identify RF transmitters not by the data content of the transmissions but based on the intrinsic physical characteristics of the transmitters. RF waveforms represent a particular challenge because of the extremely high data rates involved and the potentially large number of transmitters present in a given location. These factors outline the need for rapid fingerprinting and identification methods that go beyond the traditional hand-engineered approaches. In this study, we investigate the use of machine learning (ML) strategies for the classification and identification problems, and the use of wavelets to reduce the amount of data required. Four different ML strategies are evaluated: deep neural nets (DNN), convolutional neural nets (CNN), support vector machines (SVM), and multi-stage training (MST) using accelerated Levenberg-Marquardt (A-LM) updates. The A-LM MST method preconditioned by wavelets was by far the most accurate, achieving 100% classification accuracy of transmitters, as tested using data originating from 12 different transmitters. We discuss strategies for extending MST to a much larger number of transmitters.
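The wavelet-preconditioning idea can be sketched as follows: compress each waveform with a discrete wavelet decomposition and classify the result. The snippet below uses synthetic waveforms and an SVM (one of the four strategies evaluated) rather than the A-LM MST pipeline that achieved the reported 100% accuracy, so it only illustrates the data-reduction step, not the study's results; the waveform model and feature choice are assumptions.

```python
# Hedged sketch, not the study's pipeline: wavelet decomposition (PyWavelets)
# to compress each waveform, then transmitter classification with an SVM.
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_tx, per_tx, length = 12, 40, 1024   # 12 "transmitters", illustrative sizes

def synth_waveform(tx_id):
    # each synthetic "transmitter" gets its own small phase/amplitude quirk plus noise
    t = np.arange(length)
    return np.sin(0.05 * t + 0.3 * tx_id) * (1 + 0.02 * tx_id) + 0.1 * rng.normal(size=length)

def wavelet_features(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # keep only the coarse approximation -> large reduction in data volume
    return coeffs[0]

X = np.array([wavelet_features(synth_waveform(tx)) for tx in range(n_tx) for _ in range(per_tx)])
y = np.repeat(np.arange(n_tx), per_tx)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```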