
Predicting battery end of life from solar off-grid system field data using machine learning

Published by: David Howey
Publication date: 2021
Research field: Informatics Engineering
Language: English





Hundreds of millions of people lack access to electricity. Decentralised solar-battery systems are key to addressing this whilst avoiding carbon emissions and air pollution, but they are hindered by relatively high costs and by rural locations that inhibit timely preventative maintenance. Accurate diagnosis of battery health and prediction of end of life from operational data improves user experience and reduces costs. However, the lack of controlled validation tests and variable data quality mean that existing lab-based techniques fail to work. We apply a scalable probabilistic machine learning approach to diagnose health in 1027 solar-connected lead-acid batteries, each running for 400-760 days, totalling 620 million data rows. We demonstrate 73% accurate prediction of end of life eight weeks in advance, rising to 82% at the point of failure. This work highlights the opportunity to estimate health from existing measurements using 'big data' techniques, without additional equipment, extending lifetime and improving performance in real-world applications.
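As a minimal illustrative sketch of the general idea (not the paper's actual pipeline, features, or data), a probabilistic classifier can be trained on per-battery summary features from operational data to output the probability that a battery reaches end of life within an eight-week horizon. All feature names, the labelling rule, and the synthetic data below are hypothetical placeholders.

```python
# Hypothetical sketch: probabilistic end-of-life prediction from operational
# summary features (NOT the paper's actual model or data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1027  # one row of summary features per battery (placeholder)
X = np.column_stack([
    rng.normal(12.5, 0.4, n),   # mean float voltage (V), hypothetical feature
    rng.normal(0.85, 0.10, n),  # typical daily depth of discharge (fraction)
    rng.normal(30.0, 5.0, n),   # mean temperature (deg C)
    rng.uniform(400, 760, n),   # days in service
])
# Synthetic label: 1 if end of life is reached within the next eight weeks.
score = (2.0 * (X[:, 1] - 0.85) + 0.03 * (X[:, 2] - 30.0)
         + 0.002 * (X[:, 3] - 580.0) + rng.normal(0, 0.2, n))
y = (score > 0.1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
p_fail = clf.predict_proba(X_te)[:, 1]  # probability of EOL within eight weeks
print("held-out accuracy:", accuracy_score(y_te, p_fail > 0.5))
```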




Read also

Safety control is a primary concern for autonomous vehicles, and the slip ratio is critical to the safety of the vehicle control system. In this paper, different machine learning algorithms (Neural Networks, Gradient Boosting Machine, Random Forest, and Support Vector Machine) are used to train slip ratio estimation models based on the acceleration signals ($a_x$, $a_y$, and $a_z$) from the tri-axial Micro-Electro-Mechanical System (MEMS) accelerometer used in the intelligent tire system, where the acceleration signals are divided into four input sets ($a_x/a_y/a_z$, $a_x/a_z$, $a_y/a_z$, and $a_z$). The experimental data used in this study are collected on the MTS Flat-Trac tire test platform. The performance of the different slip ratio estimation models is compared using NRMS errors in 10-fold cross-validation (CV). The results indicate that NN and GBM achieve the most promising accuracy, and that the $a_z$ input type performs better than the other input types; the best result is the NN model with $a_z$ as input, which achieves an NRMS error of 4.88%. The present study, fusing the intelligent tire system with machine learning, paves the way for accurate estimation of tire slip ratio under different driving conditions, opening a new path for autonomous vehicles, intelligent tires, and tire slip ratio estimation.
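A hedged sketch of the evaluation protocol described in the abstract above (not the authors' code or data): two of the mentioned regressors are compared across the four accelerometer input sets using 10-fold cross-validation and an NRMS error normalised by the target range. The synthetic signals and the slip relation are placeholders for the MTS Flat-Trac measurements, and the normalisation choice is an assumption.

```python
# Sketch: compare slip ratio estimators across accelerometer input sets
# using 10-fold CV and NRMS error (synthetic stand-in data).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 2000
ax, ay, az = rng.normal(size=(3, n))
slip = 0.1 * az + 0.03 * ax + rng.normal(0, 0.01, n)  # hypothetical relation

input_sets = {"ax/ay/az": np.column_stack([ax, ay, az]),
              "ax/az": np.column_stack([ax, az]),
              "ay/az": np.column_stack([ay, az]),
              "az": az.reshape(-1, 1)}
models = {"NN": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=1),
          "GBM": GradientBoostingRegressor(random_state=1)}

cv = KFold(n_splits=10, shuffle=True, random_state=1)
for m_name, model in models.items():
    for s_name, X in input_sets.items():
        pred = cross_val_predict(model, X, slip, cv=cv)
        nrms = np.sqrt(np.mean((pred - slip) ** 2)) / (slip.max() - slip.min())
        print(f"{m_name:>3} | {s_name:<8} NRMS = {100 * nrms:.2f}%")
```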
Modern smart grid systems are heavily dependent on Information and Communication Technology, and this dependency makes them prone to cyberattacks. The occurrence of cyberattacks has increased in recent years, resulting in substantial damage to power systems. For reliable and stable operation, cyber protection, control, and detection techniques are becoming essential. Automated detection of cyberattacks with high accuracy is a challenge. To address this, we propose a two-layer hierarchical machine learning model with an accuracy of 95.44% to improve the detection of cyberattacks. The first layer of the model is used to distinguish between the two modes of operation (normal state or cyberattack). The second layer is used to classify the state into different types of cyberattacks. The layered approach allows the model to focus its training on the targeted task of each layer, resulting in improved model accuracy. To validate the effectiveness of the proposed model, we compared its performance against other recent cyberattack detection models proposed in the literature.
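The structural idea of the two-layer hierarchy described above can be sketched as follows; this is not the authors' model, features, or dataset. Layer 1 separates normal operation from attacks, and layer 2 classifies the attack type only for samples flagged as attacks. The data and classifier choice (random forests) below are hypothetical.

```python
# Sketch of a two-layer hierarchical classifier: normal vs. attack, then attack type.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(3000, 10))               # measurement features (placeholder)
attack_type = rng.integers(0, 4, size=3000)   # 0 = normal, 1..3 = attack classes
y_binary = (attack_type > 0).astype(int)

layer1 = RandomForestClassifier(random_state=2).fit(X, y_binary)
mask = attack_type > 0
layer2 = RandomForestClassifier(random_state=2).fit(X[mask], attack_type[mask])

def predict_hierarchical(X_new):
    """Return 0 for normal operation, otherwise the predicted attack class."""
    out = layer1.predict(X_new)
    is_attack = out == 1
    if is_attack.any():
        out[is_attack] = layer2.predict(X_new[is_attack])
    return out

print(predict_hierarchical(X[:5]))
```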
The Internet of Things (IoT) and related applications have successfully contributed towards enhancing the quality of life on this planet. Advanced wireless sensor networks and their revolutionary computational capabilities have enabled various IoT applications to become the next frontier, touching almost all domains of life. With this enormous progress, energy optimization has also become a primary concern, given the need for green technologies. The present study focuses on predicting the sustainability of battery life in IoT frameworks in the marine environment. The data used is a publicly available dataset collected from the Chicago district beach water. Firstly, the missing values in the data are replaced with the attribute mean. Next, one-hot encoding is applied to achieve data homogeneity, followed by standard scaling to normalize the data. Then, rough set theory is used for feature extraction, and the resulting data is fed into a Deep Neural Network (DNN) model to obtain optimized prediction results. The proposed model is then compared with state-of-the-art machine learning models, and the results justify its superiority on the basis of performance metrics such as Mean Squared Error, Mean Absolute Error, Root Mean Squared Error, and Test Variance Score.
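A minimal sketch of the preprocessing chain described above, using scikit-learn stand-ins: mean imputation of missing values, one-hot encoding of categorical columns, standard scaling, and a small neural-network regressor in place of the paper's DNN. The rough-set feature-extraction step is omitted, and all column names, values, and the target are hypothetical.

```python
# Sketch: impute mean -> one-hot encode -> standard scale -> neural network regressor.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "water_temperature": [20.1, np.nan, 18.7, 21.3],
    "turbidity":         [1.2, 0.9, np.nan, 1.5],
    "beach":             ["Montrose", "Ohio", "Montrose", "Calumet"],
    "battery_life":      [9.1, 8.4, 8.9, 7.6],   # hypothetical target
})

numeric, categorical = ["water_temperature", "turbidity"], ["beach"]
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="mean")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])
model = Pipeline([("prep", preprocess),
                  ("dnn", MLPRegressor(hidden_layer_sizes=(64, 32),
                                       max_iter=5000, random_state=3))])
model.fit(df[numeric + categorical], df["battery_life"])
print(model.predict(df[numeric + categorical]))
```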
Lithium-ion cells may experience rapid degradation in later life, especially under more extreme usage protocols. The onset of rapid degradation is called the 'knee point', and forecasting it is important for the safe and economically viable use of batteries. We propose a data-driven method that uses automated feature selection to produce inputs for a Gaussian process regression model that estimates changes in battery health, from which the entire capacity fade trajectory, knee point, and end of life may be predicted. The feature selection procedure flexibly adapts to varying inputs and prioritises those that impact degradation. For the datasets considered, it was found that calendar time and time spent in specific voltage regions had a strong impact on degradation rate. The approach produced median root mean square errors on capacity estimates under 1%, and median knee point and end of life prediction errors of 2.6% and 1.3% respectively.
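The sketch below illustrates only the regression step named above, not the authors' automated feature selection, health model, or data: a Gaussian process is fitted to a capacity fade curve, and an end-of-life crossing is read off from the predicted mean. The synthetic fade curve, the knee location, and the 80% capacity EOL threshold are assumptions for illustration.

```python
# Sketch: Gaussian process regression over a synthetic capacity fade trajectory.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
t = np.linspace(0, 1200, 80)                                   # calendar time (days)
capacity = 1.0 - 1e-4 * t - 8e-7 * np.maximum(t - 600, 0) ** 2  # knee after ~600 days
capacity += rng.normal(0, 0.003, t.size)                        # measurement noise

gp = GaussianProcessRegressor(kernel=RBF(length_scale=200.0) + WhiteKernel(1e-5),
                              normalize_y=True).fit(t[:, None], capacity)
t_grid = np.linspace(0, 1200, 600)[:, None]
mean, std = gp.predict(t_grid, return_std=True)

below = np.flatnonzero(mean < 0.8)                              # 80% capacity = EOL
if below.size:
    print("predicted end of life near day", round(float(t_grid[below[0], 0]), 1))
else:
    print("capacity stays above the EOL threshold in the fitted window")
```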
As power systems undergo a significant transformation, with more uncertainty, less inertia, and operation closer to limits, there is an increasing risk of large outages. Thus, there is an imperative need to enhance grid emergency control to maintain system reliability and security. Towards this end, great progress has been made in recent years in developing deep reinforcement learning (DRL) based grid control solutions. However, existing DRL-based solutions have two main limitations: 1) they cannot handle a wide range of grid operating conditions, system parameters, and contingencies well; 2) they generally lack the ability to adapt quickly to new grid operating conditions, system parameters, and contingencies, limiting their applicability in real-world applications. In this paper, we mitigate these limitations by developing a novel deep meta reinforcement learning (DMRL) algorithm. The DMRL combines meta strategy optimization with DRL and trains policies modulated by a latent space that can quickly adapt to new scenarios. We test the developed DMRL algorithm on the IEEE 300-bus system. We demonstrate fast adaptation of the meta-trained DRL policies, via the latent variables, to new operating conditions and scenarios, achieving superior performance compared to state-of-the-art DRL and model predictive control (MPC) methods.
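As a structural sketch of the adaptation idea stated above only (not the DMRL algorithm, its training procedure, or the IEEE 300-bus environment): a fixed policy is conditioned on a latent vector, and adaptation to a new scenario searches over the latent space rather than retraining the policy weights. The toy rollout, reward, and dimensions below are hypothetical placeholders.

```python
# Sketch: latent-conditioned policy with adaptation by searching the latent space.
import numpy as np

rng = np.random.default_rng(5)
W1, W2 = rng.normal(size=(16, 8)), rng.normal(size=(1, 16))   # frozen "trained" weights

def policy(state, z):
    """Action from a latent-conditioned policy: action = tanh(W2 tanh(W1 [state; z]))."""
    h = np.tanh(W1 @ np.concatenate([state, z]))
    return np.tanh(W2 @ h)

def episode_return(z, scenario_seed):
    """Toy rollout: reward prefers actions that track a scenario-specific target."""
    env_rng = np.random.default_rng(scenario_seed)
    target = env_rng.uniform(-1, 1)
    total = 0.0
    for _ in range(20):
        state = env_rng.normal(size=4)
        action = policy(state, z)[0]
        total -= (action - target) ** 2
    return total

# Fast adaptation to a new scenario: random search over latent candidates.
candidates = rng.normal(size=(64, 4))
returns = [episode_return(z, scenario_seed=123) for z in candidates]
z_star = candidates[int(np.argmax(returns))]
print("best latent return on the new scenario:", max(returns))
```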
