
Zero-bias Deep Learning Enabled Quick and Reliable Abnormality Detection in IoT

 Added by Yongxin Liu
Publication date: 2021
Research language: English





Abnormality detection is essential to the performance of safety-critical and latency-constrained systems. However, as systems become increasingly complicated and handle large quantities of heterogeneous data, conventional statistical change-point detection methods are becoming less effective and efficient. Although Deep Learning (DL) and Deep Neural Networks (DNNs) are increasingly employed to handle heterogeneous data, they still lack theoretically assured performance and explainability. This paper integrates zero-bias DNNs and Quickest Event Detection algorithms to provide a holistic framework for quick and reliable detection of both abnormalities and time-dependent abnormal events in the Internet of Things (IoT). We first use the zero-bias dense layer to increase the explainability of DNNs. We provide a solution to convert zero-bias DNN classifiers into performance-assured binary abnormality detectors. Using the converted abnormality detector, we then present a sequential quickest detection scheme that provides the theoretically assured lowest abnormal event detection delay under false alarm constraints. Finally, we demonstrate the effectiveness of the framework using both massive signal records from real-world aviation communication systems and simulated data. Code and data for our work are available at https://github.com/pcwhy/AbnormalityDetectionInZbDNN
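The abstract names two building blocks: a zero-bias dense layer that scores inputs by their similarity to per-class fingerprints, and a sequential quickest-detection stage that accumulates evidence over time. Below is a minimal PyTorch sketch of the first idea, assuming a cosine-similarity formulation of the zero-bias layer and a placeholder calibration threshold; it is illustrative only and not the authors' reference code (see the linked repository for that).

```python
# Minimal sketch (not the authors' reference code) of a zero-bias dense layer:
# a final classification layer with no bias term that scores each class by the
# cosine similarity between the input feature vector and a per-class weight
# ("fingerprint") vector. Sizes and the threshold below are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZeroBiasDense(nn.Module):
    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        # One weight vector per class; no bias parameter at all.
        self.weight = nn.Parameter(torch.randn(num_classes, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between normalized features and normalized weights,
        # so each score reflects only the angle to a class fingerprint.
        x = F.normalize(x, dim=-1)
        w = F.normalize(self.weight, dim=-1)
        return x @ w.t()

# Example: turn similarity scores into a binary abnormality decision by
# thresholding the maximum similarity; in practice the threshold would be
# calibrated on normal data to meet a target false-alarm rate.
features = torch.randn(4, 128)                 # e.g., penultimate-layer features
layer = ZeroBiasDense(128, num_classes=10)
scores = layer(features)
is_abnormal = scores.max(dim=-1).values < 0.5  # 0.5 is a placeholder threshold
```

A hedged sketch of the second stage follows: a classic one-sided CUSUM recursion over per-sample scores, one standard quickest-detection scheme. The drift and threshold values are placeholders that would be derived from the desired false-alarm constraint, and the paper's exact statistic may differ.

```python
# Minimal sketch of a CUSUM-style sequential (quickest) detector driven by
# per-sample abnormality scores.
def cusum(scores, drift=0.0, threshold=5.0):
    """Return the first time the cumulative statistic crosses the threshold."""
    s = 0.0
    for t, z in enumerate(scores):
        s = max(0.0, s + z - drift)   # classic one-sided CUSUM recursion
        if s > threshold:
            return t                  # detection time (index of the alarm)
    return None                       # no abnormal event detected

alarm_at = cusum([0.1, -0.2, 0.0, 1.5, 2.0, 2.2], drift=0.3, threshold=3.0)
```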




Related research

Udit Gupta, 2015
As network size continues to grow exponentially, there has been a proportionate increase in the number of nodes in the corresponding network. With the advent of the Internet of Things (IoT), many more devices are expected to be connected to the existing network infrastructure. As a result, monitoring is expected to become more complex for administrators as networks grow more heterogeneous. Moreover, addressing for IoT devices will be more complex given the scale at which devices are added to the network, and monitoring is therefore bound to become an uphill task due to the management of a larger range of addresses. This paper sheds light on the kinds of monitoring mechanisms that can be deployed in the Internet of Things (IoT) and their overall effectiveness.
We present DeepIA, a deep neural network (DNN) framework for enabling fast and reliable initial access (IA) for AI-driven beyond-5G and 6G millimeter wave (mmWave) networks. DeepIA reduces the beam sweep time compared to a conventional exhaustive search-based IA process by utilizing only a subset of the available beams. DeepIA maps received signal strengths (RSSs) obtained from a subset of beams to the beam that is best oriented to the receiver. In both line-of-sight (LoS) and non-line-of-sight (NLoS) conditions, DeepIA reduces the IA time and outperforms conventional IA in beam prediction accuracy. We show that the beam prediction accuracy of DeepIA saturates with the number of beams used for IA and depends on the particular selection of the beams. In LoS conditions, the selection of the beams is consequential and improves the accuracy by up to 70%; in NLoS conditions, it improves accuracy by up to 35%. We find that averaging multiple RSS snapshots further reduces the number of beams needed and achieves more than 95% accuracy in both LoS and NLoS conditions. Finally, we evaluate the beam prediction time of DeepIA through an embedded hardware implementation and show the improvement over conventional beam sweeping.
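As a rough illustration of the mapping this abstract describes, here is a minimal PyTorch sketch of a beam-prediction network that takes RSS values from a handful of probed beams and outputs a choice over candidate beams. The layer sizes and beam counts are assumptions for the example, not DeepIA's actual architecture.

```python
# Illustrative sketch (not the DeepIA reference implementation): a small fully
# connected network mapping RSS measurements from a subset of swept beams to
# the index of the best-oriented beam.
import torch
import torch.nn as nn

num_probed_beams = 8      # beams actually swept during initial access (assumed)
num_candidate_beams = 64  # beams the network chooses among (assumed)

beam_predictor = nn.Sequential(
    nn.Linear(num_probed_beams, 128),
    nn.ReLU(),
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, num_candidate_beams),   # logits over candidate beams
)

rss = torch.randn(1, num_probed_beams)     # normalized received signal strengths
best_beam = beam_predictor(rss).argmax(dim=-1)
```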
The Internet of Things (IoT), with its growing number of deployed devices and applications, raises significant challenges for network maintenance procedures. In this work, we formulate the problem of autonomous maintenance in IoT networks as a Partially Observable Markov Decision Process. Subsequently, we utilize Deep Reinforcement Learning (DRL) algorithms to train agents that decide whether a maintenance procedure is in order and, if so, the proper type of maintenance needed. To avoid wasting the scarce resources of IoT networks, we utilize the Age of Information (AoI) metric as a reward signal for training the smart agents. AoI captures the freshness of the sensory data which are transmitted by the IoT sensors as part of their normal service provision. Numerical results indicate that AoI integrates enough information about the past and present states of the system to be successfully used in training smart agents for the autonomous maintenance of the network.
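A hedged sketch of the reward idea described above: an Age-of-Information counter that resets on fresh sensor updates, folded into a reward that penalizes stale data and maintenance cost. The exact recursion and weighting in the paper may differ; this is illustrative only.

```python
# Illustrative AoI bookkeeping and reward shaping (assumed form, not the
# paper's exact design) for training a DRL maintenance agent.
def update_aoi(aoi: float, received_fresh_update: bool, dt: float = 1.0) -> float:
    """AoI resets when a fresh sensor update arrives, otherwise grows linearly."""
    return dt if received_fresh_update else aoi + dt

def reward(aoi_per_sensor, maintenance_cost: float) -> float:
    """Penalize stale data (high average AoI) and the cost of any maintenance action."""
    return -sum(aoi_per_sensor) / len(aoi_per_sensor) - maintenance_cost
```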
The number of connected Internet of Things (IoT) devices within cyber-physical infrastructure systems grows at an increasing rate. This poses significant device management and security challenges to current IoT networks. Among several approaches to cope with these challenges, data-based methods rooted in deep learning (DL) are receiving increased interest. In this paper, motivated by the upcoming surge of 5G IoT connectivity in industrial environments, we propose to integrate DL-based anomaly detection (AD) as a service into the 3GPP mobile cellular IoT architecture. The proposed architecture embeds autoencoder-based anomaly detection modules both at the IoT devices (ADM-EDGE) and in the mobile core network (ADM-FOG), thereby balancing system responsiveness and accuracy. We design, integrate, demonstrate, and evaluate a testbed that implements the above service in a real-world deployment integrated within the 3GPP Narrow-Band IoT (NB-IoT) mobile operator network.
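As a rough sketch of the autoencoder-based detection that modules like ADM-EDGE/ADM-FOG rely on, the following minimal PyTorch example shows only the inference-time rule (training on normal data is omitted): reconstruct a sample and flag it when the reconstruction error exceeds a threshold calibrated on normal data. Dimensions and the threshold are assumptions.

```python
# Illustrative autoencoder-based anomaly detection: flag samples whose
# reconstruction error is large relative to what was seen on normal data.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, n_features: int = 32, latent: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU(), nn.Linear(16, latent))
        self.decoder = nn.Sequential(nn.Linear(latent, 16), nn.ReLU(), nn.Linear(16, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder()
x = torch.randn(4, 32)                          # a batch of (normalized) IoT measurements
error = ((model(x) - x) ** 2).mean(dim=-1)      # per-sample reconstruction error
is_anomalous = error > 0.1                      # threshold calibrated on normal data (assumed)
```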
Machine learning is a modern approach to problem-solving and task automation. In particular, machine learning is concerned with the development and applications of algorithms that can recognize patterns in data and use them for predictive modeling. Artificial neural networks are a particular class of machine learning algorithms and models that evolved into what is now described as deep learning. Given the computational advances made in the last decade, deep learning can now be applied to massive data sets and in innumerable contexts. Therefore, deep learning has become its own subfield of machine learning. In the context of biological research, it has been increasingly used to derive novel insights from high-dimensional biological data. To make the biological applications of deep learning more accessible to scientists who have some experience with machine learning, we solicited input from a community of researchers with varied biological and deep learning interests. These individuals collaboratively contributed to this manuscript's writing using the GitHub version control platform and the Manubot manuscript generation toolset. The goal was to articulate a practical, accessible, and concise set of guidelines and suggestions to follow when using deep learning. In the course of our discussions, several themes became clear: the importance of understanding and applying machine learning fundamentals as a baseline for utilizing deep learning, the necessity for extensive model comparisons with careful evaluation, and the need for critical thought in interpreting results generated by deep learning, among others.
