The Battery Asset Management problem determines the minimum-cost replacement schedule for each individual asset in a group of battery assets operating in parallel. Battery cycle life varies under different operating conditions, including temperature, depth of discharge, and charge rate, and a battery deteriorates with usage, effects that current asset management models cannot capture. This paper presents battery cycle life prognosis and its integration with parallel asset management to reduce the lifecycle cost of a Battery Energy Storage System (BESS). A nonlinear capacity fade model is incorporated into the parallel asset management model to update battery capacity. Parametric studies were conducted to explore the influence of different model inputs (e.g. usage rate, unit battery capacity, operating conditions, and periodic demand) over a five-year time horizon. Experimental results verify the soundness of this new framework and suggest that increasing battery lifetime decreases lifecycle cost.
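The abstract does not specify the paper's nonlinear capacity fade model. As a minimal sketch of what "updating battery capacity" under temperature and throughput stress can look like, the following uses a common empirical form (Arrhenius temperature dependence times a power law in cycle count); the function name, parameter values, and exponents are illustrative assumptions, not figures from the paper.

```python
import math

def remaining_capacity(q_rated_kwh, cycles, temp_k,
                       k=0.0025, ea=24500.0, r=8.314,
                       t_ref=298.15, z=0.55):
    """Remaining capacity (kWh) after `cycles` at temperature `temp_k`.

    Illustrative only: k, ea, z are assumed fit parameters, not values
    from the paper. Fade grows as a power law in cycle count and is
    accelerated above the reference temperature t_ref (Arrhenius term).
    """
    arrhenius = math.exp(-ea / r * (1.0 / temp_k - 1.0 / t_ref))
    fade = min(k * arrhenius * cycles ** z, 1.0)  # cap at full loss
    return q_rated_kwh * (1.0 - fade)

# A 10 kWh unit after 500 cycles at 35 degrees C (308.15 K).
cap = remaining_capacity(10.0, cycles=500, temp_k=308.15)
```

An asset management model would call such a function each period to refresh the usable capacity of every battery before deciding whether replacement is cheaper than continued operation.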
The complex nature of lithium-ion battery degradation has led to many machine-learning-based approaches to health forecasting being proposed in the literature. However, machine learning can be computationally intensive. Linear approaches are faster but have previously been too inflexible for successful prognosis. For both techniques, the choice and quality of the inputs is a limiting factor of performance. Piecewise-linear models, combined with automated feature selection, offer a fast and flexible alternative without being as computationally intensive as machine learning. Here, a piecewise-linear approach to battery health forecasting was compared to a Gaussian process regression tool and found to perform equally well. The input feature selection process demonstrated the benefit of limiting the correlation between inputs. Further trials found that the piecewise-linear approach was robust to changing input size and availability of training data.
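The paper's piecewise-linear method and its automated feature selection are not detailed in the abstract. As a minimal sketch of the underlying idea, the following fits a two-segment piecewise-linear model to a capacity series by scanning candidate breakpoints and keeping the split with the lowest squared error; the synthetic data and segment count are assumptions for illustration.

```python
import numpy as np

def fit_two_segment(x, y):
    """Fit two linear segments to (x, y), choosing the breakpoint that
    minimises total squared error. Returns (breakpoint, seg1, seg2),
    where each segment is (slope, intercept)."""
    best = None
    for i in range(2, len(x) - 2):        # leave >= 2 points per side
        p1 = np.polyfit(x[:i], y[:i], 1)
        p2 = np.polyfit(x[i:], y[i:], 1)
        sse = (np.sum((np.polyval(p1, x[:i]) - y[:i]) ** 2)
               + np.sum((np.polyval(p2, x[i:]) - y[i:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, x[i], tuple(p1), tuple(p2))
    return best[1], best[2], best[3]

# Synthetic fade curve: slow linear decline, then faster decline after
# cycle 60 (continuous at the join).
cycles = np.arange(0, 100, dtype=float)
capacity = np.where(cycles < 60, 1.0 - 0.001 * cycles,
                    0.94 - 0.004 * (cycles - 60))
bp, seg1, seg2 = fit_two_segment(cycles, capacity)
```

The brute-force breakpoint scan is O(n) model fits, which is the kind of low computational cost the abstract contrasts with heavier machine-learning approaches.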
Lithium-ion cells may experience rapid degradation in later life, especially under more extreme usage protocols. The onset of rapid degradation is called the 'knee point', and forecasting it is important for the safe and economically viable use of batteries. We propose a data-driven method that uses automated feature selection to produce inputs for a Gaussian process regression model that estimates changes in battery health, from which the entire capacity fade trajectory, knee point, and end of life may be predicted. The feature selection procedure flexibly adapts to varying inputs and prioritises those that impact degradation. For the datasets considered, calendar time and time spent in specific voltage regions were found to have a strong impact on degradation rate. The approach produced median root mean square errors on capacity estimates under 1%, and median knee point and end of life prediction errors of 2.6% and 1.3%, respectively.
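The paper predicts the knee point via Gaussian process regression; as a much simpler illustration of the concept, the sketch below locates a knee geometrically, as the point of maximum deviation from the chord joining the endpoints of a capacity trajectory. This is a stand-in heuristic, not the authors' method, and the synthetic fade curve is an assumption.

```python
import numpy as np

def knee_point(cycles, capacity):
    """Index of the point deviating most from the straight line
    (chord) through the first and last points of the trajectory."""
    x0, y0 = cycles[0], capacity[0]
    x1, y1 = cycles[-1], capacity[-1]
    # Unnormalised point-to-line distance for the chord through the
    # endpoints; the sign is irrelevant, only the magnitude matters.
    dev = (y1 - y0) * cycles - (x1 - x0) * capacity + x1 * y0 - y1 * x0
    return int(np.argmax(np.abs(dev)))

# Synthetic trajectory with accelerating (cubic) fade in later life.
cycles = np.arange(0.0, 1000.0, 10.0)
capacity = 1.0 - 1e-4 * cycles - 4e-10 * cycles ** 3
knee = knee_point(cycles, capacity)
```

Once a knee index is located, end of life can be estimated by extrapolating the post-knee degradation rate to a capacity threshold (commonly 80% of nominal).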
Off-grid systems have emerged as a sustainable and cost-effective solution for rural electrification. In sub-Saharan Africa (SSA), a great number of solar-hybrid microgrids have been installed or planned, operating stand-alone or tied to a weak grid. The presence of intermittent energy sources necessitates energy storage for system balancing. The reliability and economic performance of these rural microgrids strongly depend on the specific control strategy. This work develops a predictive control framework for rural microgrids that incorporates a temperature-dependent battery degradation model. Based on a scalable DC PV-battery microgrid, realistic simulation shows superior performance in reliability improvement and cost reduction. Compared with day-ahead control without the temperature-dependent battery degradation model, this control strategy improves reliability by 5.5% and extends lead-acid battery lifetime by 26%, equivalent to lowering the levelised cost of electricity (LCOE) by 13%.
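The LCOE figure quoted above is the standard metric of discounted lifetime cost per unit of discounted energy served. As a minimal sketch of how extending battery life translates into a lower LCOE, the following computes it from placeholder capital cost, operating cost, and energy figures; none of the numbers come from the paper.

```python
def lcoe(capex, annual_opex, annual_energy_kwh, years, discount_rate):
    """Levelised cost of electricity: discounted lifetime cost divided
    by discounted lifetime energy. Returns cost units per kWh."""
    disc = [(1.0 + discount_rate) ** -t for t in range(1, years + 1)]
    cost = capex + sum(annual_opex * d for d in disc)
    energy = sum(annual_energy_kwh * d for d in disc)
    return cost / energy

# Illustrative microgrid: 50k upfront, 2k/year O&M (including battery
# replacement reserves), 40 MWh/year served, 10-year horizon, 8% rate.
base = lcoe(capex=50_000, annual_opex=2_000,
            annual_energy_kwh=40_000, years=10, discount_rate=0.08)
# Longer battery life lowers the replacement reserve, hence the opex term.
extended_life = lcoe(capex=50_000, annual_opex=1_500,
                     annual_energy_kwh=40_000, years=10, discount_rate=0.08)
```

This makes the paper's chain of results concrete: a degradation-aware controller that stretches battery lifetime shrinks the recurring cost term, which is exactly what drives the reported 13% LCOE reduction.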
In metropolitan areas dense with commercial buildings, the electric power supply is strained, especially during business hours. Demand-side management using batteries is a promising way to mitigate peak demand; however, long payback times create barriers to large-scale adoption. In this paper, we develop a design-phase battery lifecycle cost assessment tool and a runtime controller for building owners, taking battery degradation into account. In the design phase, perfect knowledge of the building load profile is assumed in order to estimate the ideal payback time. At runtime, stochastic programming and load prediction are applied to address load uncertainty and produce optimal battery operation. For validation, we performed numerical experiments using the real-life tariff model serving New York City, a Zn/MnO2 battery, and a state-of-the-art building simulation tool. Experimental results show a small gap between the design-phase assessment and runtime control. To examine the proposed methods further, we applied the same tariff model and performed numerical experiments on nine weather zones and three types of commercial buildings. Contrary to the common practice of shallow-discharging batteries to avoid rapid degradation, the experimental results show a promising payback time achieved by optimally deep-discharging the battery.
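The shallow-versus-deep discharge trade-off at the end of the abstract can be made concrete with a back-of-the-envelope payback calculation: deeper discharge earns more demand-charge savings per day but consumes cycle life faster. All figures below are illustrative assumptions, not the paper's optimisation or its results.

```python
def simple_payback_years(capex, daily_savings, cycle_life_cycles,
                         cycles_per_day=1.0):
    """Simple payback time in years, or None if the battery wears out
    before the investment is recovered."""
    life_years = cycle_life_cycles / (cycles_per_day * 365.0)
    payback = capex / (daily_savings * 365.0)
    return payback if payback <= life_years else None

# Shallow cycling (e.g. 30% DoD): small daily savings, long cycle life.
shallow = simple_payback_years(capex=20_000, daily_savings=6.0,
                               cycle_life_cycles=6_000)
# Deep cycling (e.g. 80% DoD): larger savings, shorter cycle life.
deep = simple_payback_years(capex=20_000, daily_savings=18.0,
                            cycle_life_cycles=3_000)
```

Under these assumed numbers, deep discharge pays back sooner despite the shorter cycle life, mirroring the paper's finding that optimal deep discharge can beat the conservative shallow-discharge practice; the real trade-off depends on the tariff structure and the battery's actual degradation curve.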
The increasing gap between electricity prices and feed-in tariffs for photovoltaic (PV) electricity in many countries, along with the recent sharp decline in battery costs, has led to a rise in installed combined PV and battery systems worldwide. The load profile of a property greatly affects the self-consumption rate and, thus, the profitability of the system. Therefore, insights from analyses of residential applications, which are well studied, cannot simply be transferred to other types of properties. Compared to residential applications, PV is especially suitable for municipal buildings, due to their better match of demand and supply. In order to analyze the value of additional batteries, municipal PV battery systems of different sizes were simulated, taking load profiles of 101 properties as inputs. It was found that self-consumption differs significantly from that of households, while different types of municipal buildings are largely similar in terms of the indicators analyzed. The share of electricity consumed during summertime was found to have the most significant impact on the self-consumption rate for most considered system sizes. Due to lower electricity tariffs and smaller increases in self-consumption provided by batteries in municipal buildings, the investment in a battery is not economically advantageous in most of the cases considered.
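The self-consumption rate discussed throughout this abstract is the share of PV generation consumed on site rather than exported. As a minimal sketch, the following computes it for a battery-less system from toy hourly profiles; the profiles are invented for illustration, not drawn from the study's 101 measured properties.

```python
def self_consumption_rate(pv_kwh, load_kwh):
    """Share of PV generation consumed on site (no battery): in each
    interval, local consumption is capped at both generation and load."""
    consumed = sum(min(p, l) for p, l in zip(pv_kwh, load_kwh))
    generated = sum(pv_kwh)
    return consumed / generated if generated else 0.0

# Toy 8-interval day: daytime PV generation against a municipal load
# that peaks during business hours (good demand/supply match).
pv = [0.0, 0.0, 1.5, 3.0, 3.5, 2.0, 0.5, 0.0]
load = [1.0, 1.0, 2.0, 2.0, 1.5, 1.5, 1.0, 1.0]
scr = self_consumption_rate(pv, load)
```

Adding a battery raises this rate by shifting the surplus (the `p - l` excess in sunny intervals) to later intervals; the study's conclusion is that for municipal buildings this uplift, multiplied by their lower tariffs, rarely covers the battery's cost.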