
Deep Learning Anomaly Detection for Cellular IoT with Applications in Smart Logistics

Added by Dejan Vukobratovic
Publication date: 2021
Language: English





The number of connected Internet of Things (IoT) devices within cyber-physical infrastructure systems is growing at an increasing rate. This poses significant device management and security challenges to current IoT networks. Among several approaches to cope with these challenges, data-driven methods rooted in deep learning (DL) are receiving increased interest. In this paper, motivated by the upcoming surge of 5G IoT connectivity in industrial environments, we propose to integrate DL-based anomaly detection (AD) as a service into the 3GPP mobile cellular IoT architecture. The proposed architecture embeds autoencoder-based anomaly detection modules both at the IoT devices (ADM-EDGE) and in the mobile core network (ADM-FOG), thereby balancing system responsiveness against accuracy. We design, integrate, demonstrate and evaluate a testbed that implements this service in a real-world deployment integrated within a 3GPP Narrow-Band IoT (NB-IoT) mobile operator network.
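
As a concrete illustration of the building block described above, the following is a minimal sketch of an autoencoder-based anomaly detection module: it is trained on presumably normal traffic and flags samples whose reconstruction error exceeds a threshold. The network size, the 99th-percentile threshold rule and the use of PyTorch are illustrative assumptions, not the ADM-EDGE/ADM-FOG implementation from the testbed.

    # Minimal sketch (assumptions noted above), not the paper's code.
    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        def __init__(self, n_features, latent_dim=4):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU(),
                                         nn.Linear(16, latent_dim))
            self.decoder = nn.Sequential(nn.Linear(latent_dim, 16), nn.ReLU(),
                                         nn.Linear(16, n_features))

        def forward(self, x):
            return self.decoder(self.encoder(x))

    def fit(model, normal_data, epochs=50, lr=1e-3):
        # Train on anomaly-free data only; the model learns to reconstruct it well.
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(normal_data), normal_data)
            loss.backward()
            opt.step()
        # Illustrative threshold: 99th percentile of errors on normal data.
        with torch.no_grad():
            err = ((model(normal_data) - normal_data) ** 2).mean(dim=1)
        return torch.quantile(err, 0.99).item()

    def is_anomaly(model, x, threshold):
        # Flag samples whose per-sample reconstruction error exceeds the threshold.
        with torch.no_grad():
            err = ((model(x) - x) ** 2).mean(dim=1)
        return err > threshold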



Related research


Tie Luo, Sai G. Nagarajan (2018)
Wireless sensor networks (WSN) are fundamental to the Internet of Things (IoT), bridging the gap between the physical and the cyber worlds. Anomaly detection is a critical task in this context, as it is responsible for identifying various events of interest such as equipment faults and undiscovered phenomena. However, this task is challenging because of the elusive nature of anomalies and the volatility of the ambient environments. In a resource-scarce setting like WSN, the challenge is further amplified, which weakens the suitability of many existing solutions. In this paper, for the first time, we introduce autoencoder neural networks into WSN to solve the anomaly detection problem. We design a two-part algorithm that resides on the sensors and on the IoT cloud, respectively, such that (i) anomalies can be detected at the sensors in a fully distributed manner, without the need to communicate with any other sensors or the cloud, and (ii) the relatively more computation-intensive learning task can be handled by the cloud at a much lower (and configurable) frequency. In addition to the minimal communication overhead, the computational load on the sensors is also very low (of polynomial complexity) and readily affordable by most COTS sensors. Using a real WSN indoor testbed and sensor data collected over four consecutive months, we demonstrate experimentally that our proposed autoencoder-based anomaly detection mechanism achieves high detection accuracy and a low false alarm rate. It is also able to adapt to unforeseen changes in a non-stationary environment, thanks to the unsupervised learning nature of the chosen autoencoder neural networks.
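
For intuition, a minimal sketch of the sensor-side half of such a two-part scheme is given below: the cloud is assumed to train the autoencoder and push its weights at a configurable frequency, while the sensor only runs a cheap forward pass and a threshold test. The single-hidden-layer shape and the variable names are assumptions for illustration, not the authors' algorithm.

    # Minimal sketch of sensor-side inference (illustrative assumptions).
    import numpy as np

    def sensor_detect(x, W_enc, b_enc, W_dec, b_dec, threshold):
        """Return True if reading x looks anomalous (one hidden layer, ReLU)."""
        h = np.maximum(0.0, W_enc @ x + b_enc)    # encode
        x_hat = W_dec @ h + b_dec                 # decode
        error = float(np.mean((x - x_hat) ** 2))  # reconstruction error
        return error > threshold

    # The cloud re-fits (W_enc, b_enc, W_dec, b_dec, threshold) at a
    # configurable frequency and sends them to the sensor as a small update,
    # so no per-sample communication is needed.
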
Pan Wang, Shidong Liu, Feng Ye (2018)
The smart grid relies on many Internet of Things (IoT) applications to support intelligent grid monitoring and control. The requirements of these IoT applications vary because of the different tasks in the smart grid. In this paper, we propose a new computing paradigm to offer location-aware, latency-sensitive monitoring and intelligent control for IoT applications in the smart grid. In particular, a new fog-based architecture and programming model is designed. Fog computing extends computing to the edge of the network, which is a natural match for IoT applications. However, existing schemes can hardly satisfy the distributed coordination among fog computing nodes in the smart grid. In the proposed model, we introduce a new distributed fog computing coordinator, which periodically gathers information from the fog computing nodes, e.g., remaining resources and pending tasks. Moreover, the fog computing coordinator also manages jobs so that all computing nodes can collaborate on complex tasks. In addition, we construct a working prototype of an intelligent electric vehicle service to evaluate the proposed model. Experimental results demonstrate that the proposed model outperforms traditional fog computing schemes for IoT applications in the smart grid.
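
The coordinator idea can be sketched as a registry that is periodically refreshed with each fog node's remaining resources plus a dispatcher that places jobs on the node with the most headroom. The greedy placement rule, class names and resource fields below are illustrative assumptions, not the paper's programming model.

    # Minimal sketch of a fog coordinator (illustrative assumptions).
    from dataclasses import dataclass, field

    @dataclass
    class FogNode:
        name: str
        free_cpu: float                 # e.g. available cores
        free_mem: float                 # e.g. available MB
        tasks: list = field(default_factory=list)

    class Coordinator:
        def __init__(self):
            self.nodes = {}

        def report(self, node):
            # Called periodically by each fog node with its remaining resources.
            self.nodes[node.name] = node

        def dispatch(self, job, cpu, mem):
            # Place the job on the node with the most free CPU that can fit it.
            candidates = [n for n in self.nodes.values()
                          if n.free_cpu >= cpu and n.free_mem >= mem]
            if not candidates:
                raise RuntimeError("no fog node can host this job")
            best = max(candidates, key=lambda n: n.free_cpu)
            best.tasks.append(job)
            best.free_cpu -= cpu
            best.free_mem -= mem
            return best.name
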
Abnormality detection is essential to the performance of safety-critical and latency-constrained systems. However, as systems become increasingly complicated, with large quantities of heterogeneous data, conventional statistical change-point detection methods are becoming less effective and efficient. Although Deep Learning (DL) and Deep Neural Networks (DNNs) are increasingly employed to handle heterogeneous data, they still lack theoretically assured performance and explainability. This paper integrates zero-bias DNNs and Quickest Event Detection algorithms into a holistic framework for quick and reliable detection of both abnormalities and time-dependent abnormal events in the Internet of Things (IoT). We first use the zero-bias dense layer to increase the explainability of the DNN. We then provide a solution to convert zero-bias DNN classifiers into performance-assured binary abnormality detectors. Using the converted abnormality detector, we present a sequential quickest detection scheme that provides the theoretically assured lowest abnormal event detection delay under false alarm constraints. Finally, we demonstrate the effectiveness of the framework using both massive signal records from real-world aviation communication systems and simulated data. Code and data are available at https://github.com/pcwhy/AbnormalityDetectionInZbDNN.
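
The sequential part of such a framework is commonly realized with a CUSUM-type rule. The sketch below applies a generic CUSUM statistic to a stream of per-sample abnormality scores and raises an alarm when the accumulated evidence crosses a threshold chosen for the false-alarm constraint; the drift and threshold parameters are assumptions, and this is not the paper's exact detection scheme.

    # Generic CUSUM-style quickest detection sketch (illustrative only).
    def cusum_alarm(scores, drift, h):
        """Yield (t, alarm) per score; the statistic resets after an alarm."""
        g = 0.0
        for t, s in enumerate(scores):
            g = max(0.0, g + s - drift)   # accumulate evidence above the drift level
            alarm = g >= h                # alarm once the threshold h is crossed
            if alarm:
                g = 0.0                   # restart monitoring after the alarm
            yield t, alarm
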
Age of Information (AoI) has gained importance as a Key Performance Indicator (KPI) for characterizing the freshness of information in information-update systems and time-critical applications. Recent theoretical research on the topic has generated significant understanding of how various algorithms perform in terms of this metric across system models and networking scenarios. In this paper, building on these theoretical results, we analyze AoI behavior in real-life networks using two test-beds, one addressing IoT networks and one addressing regular computers. An extensive set of AoI measurements is provided for variations of transport protocols such as TCP, UDP and WebSocket, over wired and wireless links. Practical issues such as clock synchronization and the selection of hardware and transport protocol, together with their effects on AoI, are discussed. The results provide insight into application- and transport-layer mechanisms for optimizing AoI in real-life networks.
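
For reference, AoI at a receiver is typically tracked as the elapsed time since the generation of the freshest delivered update; the sketch below follows that generic definition and assumes synchronized sender and receiver clocks. The class and method names are hypothetical and do not come from the paper's test-beds.

    # Generic AoI tracking sketch (assumes synchronized clocks).
    import time

    class AoIMonitor:
        def __init__(self):
            self.last_gen_time = None   # generation timestamp of freshest update

        def on_update(self, gen_time):
            # Keep only the freshest update; older ones do not reduce the age.
            if self.last_gen_time is None or gen_time > self.last_gen_time:
                self.last_gen_time = gen_time

        def age(self, now=None):
            # Age grows linearly between receptions.
            now = time.time() if now is None else now
            if self.last_gen_time is None:
                return float("inf")
            return now - self.last_gen_time
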
The adoption of Internet of Things (IoT) technologies is increasing, and IoT is thus seemingly shifting from hype to reality. However, the actual use of IoT over significant timescales has not been empirically analyzed; in other words, this reality remains unexplored. Furthermore, despite the variety of IoT verticals, the use of IoT across vertical industries has not been compared. This paper uses a two-year IoT dataset from a major Finnish mobile network operator to investigate different aspects of cellular IoT traffic, including temporal evolution and the use of IoT devices across industries. We present a variety of novel findings. For example, our results show that IoT traffic volume per device increased three-fold over the last two years. Additionally, we illustrate diversity in IoT usage among different industries, with orders-of-magnitude differences in traffic volume and device mobility. We also note, however, that the daily traffic patterns of all devices can be clustered into only three patterns, which differ mainly in the presence and timing of a peak hour. Finally, we show that the share of LTE-enabled IoT devices has remained low at around 2%, and that 30% of IoT devices are still 2G only.
