As we embark upon the much-hyped Society 5.0, powered by the Internet of Things (IoT), traditional ways of monitoring human heart signals to track cardiovascular conditions become challenging, particularly in remote healthcare settings. In terms of low power consumption, portability, and non-intrusiveness, no suitable IoT solution can provide information comparable to conventional Electrocardiography (ECG). In this paper, we propose an IoT device utilizing a spintronic ultra-sensitive sensor that measures the magnetic fields produced by cardiovascular electrical activity, i.e., Magnetocardiography (MCG). We then address the low-frequency noise generated by the sensors, a challenge shared by most sensors dealing with low-frequency bio-magnetic signals. Instead of relying on generic signal-processing techniques such as averaging or filtering, we train a deep-learning model on bio-magnetic signals. MCG labels are synthetically constructed from an existing dataset of ECG records. A deep-learning architecture combining a Convolutional Neural Network (CNN) with a Gated Recurrent Unit (GRU) is trained on the labeled data through a striding window, capturing and eliminating the noise features. Simulation results evaluating the proposed method are reported and demonstrate encouraging performance.
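As a rough illustration of the CNN+GRU denoising idea described in this abstract, the sketch below (PyTorch) pairs a 1-D convolutional encoder with a bidirectional GRU over striding windows, trained to map noisy windows to their synthetic MCG labels. All layer sizes, the window length, and the stride are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class CNNGRUDenoiser(nn.Module):
    """Sketch of a CNN+GRU denoiser for windowed 1-D bio-magnetic signals."""
    def __init__(self, conv_channels=32, gru_hidden=64):
        super().__init__()
        # 1-D convolutions extract local noise/signal features within a window.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, conv_channels, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(conv_channels, conv_channels, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        # A bidirectional GRU models temporal structure across time steps.
        self.gru = nn.GRU(conv_channels, gru_hidden,
                          batch_first=True, bidirectional=True)
        # Project back to one denoised sample per time step.
        self.head = nn.Linear(2 * gru_hidden, 1)

    def forward(self, x):          # x: (batch, 1, window_len)
        h = self.encoder(x)        # (batch, conv_channels, window_len)
        h = h.transpose(1, 2)      # (batch, window_len, conv_channels)
        h, _ = self.gru(h)         # (batch, window_len, 2 * gru_hidden)
        return self.head(h).transpose(1, 2)  # (batch, 1, window_len)

def sliding_windows(signal, window_len=256, stride=64):
    """Slice a 1-D tensor into overlapping (striding) windows."""
    return signal.unfold(0, window_len, stride)  # (num_windows, window_len)

# Toy training step: noisy windows as inputs, synthetic MCG labels as targets.
if __name__ == "__main__":
    model = CNNGRUDenoiser()
    noisy = torch.randn(8, 1, 256)   # stand-in for raw sensor windows
    clean = torch.randn(8, 1, 256)   # stand-in for synthetic MCG labels
    loss = nn.functional.mse_loss(model(noisy), clean)
    loss.backward()
```

The convolutional front end handles local, high-variance noise features, while the recurrent layer exploits the quasi-periodic structure of cardiac signals across the window; overlapping windows let predictions be averaged at inference time.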
This work presents a novel approach for context-aware connectivity and processing optimization of Internet of Things (IoT) networks. Unlike state-of-the-art approaches, the proposed approach simultaneously selects the best connectivity
Besides being part of the Internet of Things (IoT), drones can play a relevant role as its enablers. The 3D mobility of UAVs can be exploited to improve node localization in IoT networks for, e.g., search and rescue or goods localization and tracking.
Wearable sensor-based human activity recognition (HAR) has been a research focus in the field of ubiquitous and mobile computing for years. In recent years, many deep models have been applied to HAR problems. However, deep learning methods typically
This paper considers the problem of time-difference-of-arrival (TDOA) source localization using possibly unreliable data collected by Internet of Things (IoT) sensors in error-prone environments. The Welsch loss function is integrated into a
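For reference, the Welsch loss named in this abstract is a standard robust loss that down-weights large residuals, which is why it suits TDOA data containing gross outliers from unreliable sensors. A minimal sketch follows; the tuning constant c and applying the loss directly to raw TDOA residuals are assumptions, since the abstract is truncated before specifying how the loss is integrated.

```python
import torch

def welsch_loss(residual, c=1.0):
    """Welsch loss: (c^2 / 2) * (1 - exp(-(r / c)^2)).

    Near zero it behaves like a scaled squared error; for large
    residuals it saturates at c^2 / 2, so gross TDOA outliers
    contribute only a bounded penalty to the localization objective.
    """
    return 0.5 * c**2 * (1.0 - torch.exp(-(residual / c) ** 2))
```

Smaller values of c reject outliers more aggressively but also discount moderately noisy measurements, so c is typically tied to the expected measurement noise scale.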
As Internet of Things (IoT) has emerged as the next logical stage of the Internet, it has become imperative to understand the vulnerabilities of the IoT systems when supporting diverse applications. Because machine learning has been applied in many I