
Roughsets-based Approach for Predicting Battery Life in IoT

Publication date: 2021
Language: English





Internet of Things (IoT) and related applications have successfully contributed towards enhancing the quality of life on this planet. Advanced wireless sensor networks and their revolutionary computational capabilities have enabled IoT applications to become the next frontier, touching almost all domains of life. With this enormous progress, energy optimization has become a primary concern, given the need to attend to green technologies. The present study focuses on predicting the sustainability of battery life in IoT frameworks in the marine environment. The data used is a publicly available dataset collected from the Chicago district beach water. First, missing values in the data are replaced with the attribute mean. Next, one-hot encoding is applied to achieve data homogeneity, followed by standard scaling to normalize the data. Then, rough set theory is used for feature extraction, and the resultant data is fed into a Deep Neural Network (DNN) model for optimized prediction results. The proposed model is compared with state-of-the-art machine learning models, and the results justify its superiority on the basis of performance metrics such as Mean Squared Error, Mean Absolute Error, Root Mean Squared Error, and Test Variance Score.
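To make the pipeline concrete, here is a minimal sketch in Python, assuming a pandas DataFrame loaded from a CSV of the beach water sensor data. The file name, the column names (beach_name, battery_life), and the variance-threshold filter standing in for the rough-set reduct step are all assumptions for illustration, not the paper's exact implementation.

```python
# Hypothetical sketch of the preprocessing + DNN pipeline described above.
# Column names and the dataset layout are assumptions; the rough-set reduct
# step is approximated here by a simple variance-threshold filter.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import VarianceThreshold
from sklearn.metrics import mean_squared_error, mean_absolute_error, explained_variance_score
from sklearn.neural_network import MLPRegressor

df = pd.read_csv("beach_water_quality_sensors.csv")    # assumed file name

# 1. Replace missing values with the attribute (column) mean.
num_cols = df.select_dtypes(include=np.number).columns
df[num_cols] = df[num_cols].fillna(df[num_cols].mean())

# 2. One-hot encode categorical attributes for data homogeneity.
df = pd.get_dummies(df, columns=["beach_name"])        # assumed categorical column

X = df.drop(columns=["battery_life"])                  # assumed target column
y = df["battery_life"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# 3. Standard-scale the features.
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# 4. Feature reduction (stand-in for the rough-set reduct).
selector = VarianceThreshold(threshold=0.01)
X_train = selector.fit_transform(X_train)
X_test = selector.transform(X_test)

# 5. Deep neural network regressor and the reported metrics.
dnn = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
dnn.fit(X_train, y_train)
pred = dnn.predict(X_test)
mse = mean_squared_error(y_test, pred)
print("MSE :", mse)
print("MAE :", mean_absolute_error(y_test, pred))
print("RMSE:", np.sqrt(mse))
print("EVS :", explained_variance_score(y_test, pred))
```

The explained variance score printed last corresponds to the Test Variance Score metric reported in the abstract.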



Related research

Hundreds of millions of people lack access to electricity. Decentralised solar-battery systems are key to addressing this while avoiding carbon emissions and air pollution, but they are hindered by relatively high costs and rural locations that inhibit timely preventative maintenance. Accurate diagnosis of battery health and prediction of end of life from operational data improves user experience and reduces costs. However, the lack of controlled validation tests and variable data quality mean that existing lab-based techniques fail to work. We apply a scalable probabilistic machine learning approach to diagnose health in 1027 solar-connected lead-acid batteries, each running for 400-760 days, totalling 620 million data rows. We demonstrate 73% accurate prediction of end of life eight weeks in advance, rising to 82% at the point of failure. This work highlights the opportunity to estimate health from existing measurements using 'big data' techniques, without additional equipment, extending lifetime and improving performance in real-world applications.
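The paper's exact probabilistic model is not specified in this abstract; the sketch below only illustrates the general idea with a gradient-boosted classifier predicting end of life within an eight-week horizon. The synthetic features, their meanings, and the label construction are purely hypothetical.

```python
# Hypothetical sketch in the spirit of the approach above: classify whether a
# battery will reach end of life within eight weeks from operational features.
# The actual model and feature set in the paper are not reproduced here.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Assumed features per battery-week: mean voltage, charge throughput,
# depth of discharge, temperature; label: end of life within eight weeks.
X = rng.normal(size=(1000, 4))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) < -0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]      # probabilistic health estimate
print("accuracy:", accuracy_score(y_te, proba > 0.5))
```
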
Arlene John, Barry Cardiff, 2021
Internet of Things (IoT) enabled wearable sensors for health monitoring are widely used to reduce the cost of personal healthcare and improve quality of life. The sleep apnea-hypopnea syndrome, characterized by an abnormal reduction or pause in breathing, greatly affects the quality of sleep of an individual. This paper introduces a novel method for apnea (pause in breathing) detection from electrocardiogram (ECG) signals obtained from wearable devices. The novelty stems from the high resolution of apnea detection on a second-by-second basis, achieved using a 1-dimensional convolutional neural network for feature extraction and detection of sleep apnea events. The proposed method exhibits an accuracy of 99.56% and a sensitivity of 96.05%, outperforming several lower-resolution state-of-the-art apnea detection methods. The complexity of the proposed model is analyzed, along with the feasibility of model pruning and binarization to reduce the resource requirements on a wearable IoT device. The pruned model with 80% sparsity exhibits an accuracy of 97.34% and a sensitivity of 86.48%, while the binarized model exhibits an accuracy of 75.59% and a sensitivity of 63.23%. The performance of low-complexity patient-specific models derived from the generic model is also studied to assess the feasibility of retraining existing models to fit patient-specific requirements; on average, these models exhibit an accuracy of 97.79% and a sensitivity of 92.23%. The source code for this work is made publicly available.
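As a rough illustration of such a model, the following is a minimal 1-dimensional CNN sketch in PyTorch; the window length, sampling rate, and layer sizes are assumptions, not the architecture from the paper.

```python
# Hypothetical 1-D CNN sketch for second-by-second apnea detection from ECG
# windows; layer sizes and the 100 Hz sampling assumption are illustrative.
import torch
import torch.nn as nn

class ApneaCNN(nn.Module):
    def __init__(self, in_samples=3000):   # e.g. a 30 s ECG window at 100 Hz
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 2)       # apnea vs. normal for the window

    def forward(self, x):                  # x: (batch, 1, in_samples)
        z = self.features(x).squeeze(-1)
        return self.head(z)

model = ApneaCNN()
logits = model(torch.randn(8, 1, 3000))   # one window per 1 s step
print(logits.shape)                        # torch.Size([8, 2])
```

Sliding the window forward one second at a time yields the second-by-second labels the abstract describes.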
As digitization increases, the need to automate various entities becomes crucial for development. The data generated by IoT devices needs to be processed accurately and securely. The success of such a scenario rests on blockchain as a means of unalterable data storage, improving the overall security of and trust in the system. By providing trust in an automated system, with real-time data updates to all stakeholders, an improved form of implementation takes the stage and can reduce the difficulty of adapting to fully automated systems. This research focuses on a use case involving a real-time Internet of Things (IoT) network deployed at the beaches of the Chicago Park District. The real-time data collected from various sensors is used to design a predictive model, based on Deep Neural Networks, for estimating the battery life of the IoT sensors deployed at the beach. The proposed model could help the government plan orders for replacement batteries ahead of time, ensuring uninterrupted service. Since this data is sensitive and needs to be secured, the predicted battery life value is stored in a blockchain, providing a tamper-proof record of the data.
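A minimal sketch of the tamper-evident storage idea follows: each predicted battery-life value is chained to the previous record by its SHA-256 hash, so any later alteration is detectable. A real deployment would use an actual blockchain platform; the sensor identifier and values here are hypothetical.

```python
# Minimal sketch of hash-chained, tamper-evident storage of predictions.
# This only illustrates why later tampering is detectable; it is not a
# full blockchain implementation.
import hashlib, json, time

chain = []

def add_prediction(sensor_id, predicted_life):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"sensor": sensor_id, "life": predicted_life,
              "ts": time.time(), "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify():
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i and rec["prev"] != chain[i - 1]["hash"]:
            return False
    return True

add_prediction("beach-sensor-07", 41.3)   # hypothetical sensor id, days of life
add_prediction("beach-sensor-07", 39.8)
print(verify())                            # True until any record is altered
```
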
Sicong Liu, Liang Xiao, Zhu Han, 2020
Narrowband internet-of-things (NB-IoT) is a competitive 5G technology for massive machine-type communication scenarios, but it also introduces narrowband interference (NBI) to existing broadband transmissions such as Long Term Evolution (LTE) systems in enhanced mobile broadband (eMBB) scenarios. To facilitate harmonious and fair coexistence in wireless heterogeneous networks, it is important to eliminate NB-IoT interference to LTE systems. In this paper, a novel sparse machine learning based framework is proposed, and a sparse combinatorial optimization problem is formulated for accurate NBI recovery, which can be efficiently solved using the proposed iterative sparse learning algorithm called sparse cross-entropy minimization (SCEM). To further improve the recovery accuracy and convergence rate, regularization is introduced to the loss function in the enhanced algorithm called regularized SCEM. Moreover, exploiting the spatial correlation of the NBI, the framework is extended to multiple-input multiple-output systems. Simulation results demonstrate that the proposed methods are effective in eliminating NB-IoT interference to LTE systems and significantly outperform state-of-the-art methods.
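The exact SCEM update rules are not given in this abstract; the sketch below shows a generic cross-entropy-method approach to sparse support recovery in the same spirit, with all dimensions, parameters, and data chosen purely for illustration.

```python
# Hypothetical cross-entropy-style sparse recovery sketch (not the paper's
# SCEM algorithm): candidate supports are sampled from Bernoulli
# probabilities, scored by least-squares residual against the measurements,
# and the probabilities are refit to the elite samples.
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 64, 32, 3                    # NBI dimension, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 5 * rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=m)

p = np.full(n, k / n)                  # Bernoulli support probabilities
for _ in range(50):
    supports = rng.random((200, n)) < p
    scores = []
    for s in supports:
        if not s.any():
            scores.append(np.inf)
            continue
        xs, *_ = np.linalg.lstsq(A[:, s], y, rcond=None)
        scores.append(np.linalg.norm(y - A[:, s] @ xs))
    elite = supports[np.argsort(scores)[:20]]
    p = 0.7 * p + 0.3 * elite.mean(axis=0)   # smoothed cross-entropy update

est_support = np.argsort(p)[-k:]
print(sorted(est_support), sorted(np.flatnonzero(x_true)))
```
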
This paper presents a novel framework for traffic prediction of IoT devices activated by binary Markovian events. First, we consider a massive set of IoT devices whose activation events are modeled by an On-Off Markov process with known transition probabilities. Next, we exploit the temporal correlation of the traffic events and apply the forward algorithm in the context of hidden Markov models (HMM) to predict the activation likelihood of each IoT device. Finally, we apply the fast uplink grant scheme to allocate resources to the IoT devices with the maximal likelihood of transmission. To evaluate the performance of the proposed scheme, we define the regret metric as the number of missed resource allocation opportunities. The proposed fast uplink scheme based on traffic prediction outperforms both conventional random access and time division duplex in terms of regret and efficiency of system usage, while maintaining its superiority over random access in terms of average age of information for massive deployments.
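As an illustration of the prediction step, the following sketch applies the standard HMM forward recursion to a two-state On-Off chain and ranks devices by their one-step-ahead activation likelihood; the transition and emission probabilities and the device histories are made up for the example.

```python
# Two-state On-Off Markov chain per device, with the HMM forward recursion
# used to estimate the activation likelihood from noisy past observations.
import numpy as np

T = np.array([[0.95, 0.05],    # P(next state | Off): stay Off, turn On
              [0.30, 0.70]])   # P(next state | On):  turn Off, stay On
E = np.array([[0.9, 0.1],      # P(observation | Off): silent, activity seen
              [0.2, 0.8]])     # P(observation | On)

def activation_likelihood(obs, prior=(0.5, 0.5)):
    """Forward algorithm: one-step-ahead P(device is On | observations)."""
    alpha = np.array(prior, dtype=float)
    for o in obs:
        alpha = E[:, o] * (T.T @ alpha)   # predict, then weight by evidence
        alpha /= alpha.sum()              # normalise to a filtered posterior
    return (T.T @ alpha)[1]               # predicted probability of On

# Fast uplink grant: rank devices by predicted activation likelihood.
histories = {"dev-1": [0, 0, 1, 1], "dev-2": [0, 0, 0, 0], "dev-3": [1, 0, 1, 0]}
ranked = sorted(histories, key=lambda d: activation_likelihood(histories[d]),
                reverse=True)
print(ranked)   # devices to grant uplink resources first
```
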
