Large-scale integration of converter-based renewable energy sources (RESs) into the power system will lead to a higher risk of frequency nadir limit violation and even frequency instability after a large power disturbance. Therefore, it is essential to consider the frequency nadir constraint (FNC) in power system scheduling. Nevertheless, the FNC is highly nonlinear and non-convex. The state-of-the-art method to simplify the constraint is to construct a low-order frequency response model first and then linearize the frequency nadir equation. In this letter, an extreme learning machine (ELM)-based network is built to derive the linear formulation of the FNC, where the two-step fitting process is integrated into a single training process and more details of the generator's physical model are considered to reduce the fitting error. Simulation results show the superiority of the proposed method in fitting accuracy.
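As a rough illustration of the idea (not the letter's exact network), the sketch below fits a nadir-like quantity with an ELM: random input weights and a single least-squares solve for the output layer, so the fitted constraint remains linear in the hidden features and can be embedded in a scheduling model. The feature set (inertia H, droop R, disturbance size dP) and the synthetic labels are assumptions made for the example.

```python
# Minimal ELM sketch (illustrative only; features [H, R, dP] and labels are synthetic).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: X holds scheduling-related quantities, y a crude proxy for the nadir deviation.
n_samples, n_features, n_hidden = 500, 3, 50
X = rng.uniform([2.0, 0.02, 0.1], [8.0, 0.08, 1.0], size=(n_samples, n_features))  # [H, R, dP]
y = -X[:, 2] / (2.0 * X[:, 0] + 1.0 / X[:, 1]) + 0.01 * rng.standard_normal(n_samples)

# ELM: fixed random input weights and biases, then one closed-form least-squares solve for the readout.
W = rng.standard_normal((n_features, n_hidden))
b = rng.standard_normal(n_hidden)
H_out = np.tanh(X @ W + b)                        # hidden-layer output matrix
beta, *_ = np.linalg.lstsq(H_out, y, rcond=None)  # output weights

# Because the readout is linear in the hidden features, the fitted nadir expression
# enters the scheduling problem as a linear constraint in those features.
y_hat = H_out @ beta
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```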
This work investigates robust monotonic convergent iterative learning control (ILC) for uncertain linear systems in both the time and frequency domains, and an ILC algorithm that optimizes the convergence speed in terms of the $l_{2}$ norm of the error signals is derived. First, it is shown that robust monotonic convergence of the ILC system can be established equivalently by the positive definiteness of a matrix polynomial over some set. Then, a necessary and sufficient condition in the form of a sum of squares (SOS) for this positive definiteness is proposed, which is amenable to checking via the feasibility of linear matrix inequalities (LMIs). Based on this condition, the optimal ILC algorithm that maximizes the convergence speed is obtained by solving a set of convex optimization problems. Moreover, the order of the learning function can be chosen arbitrarily, so the designer has the flexibility to decide the complexity of the learning algorithm.
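The following sketch is a simplified, nominal-case stand-in for the paper's SOS/LMI machinery, assuming a lifted (supervector) ILC description: minimizing the contraction factor of $e_{k+1}=(I-PL)e_{k}$ over the learning matrix $L$ is a convex spectral-norm problem, which the solver handles through the equivalent LMI $\begin{bmatrix}\gamma I & (I-PL)^{T}\\ I-PL & \gamma I\end{bmatrix}\succeq 0$. The plant matrix and dimensions below are illustrative, and robustness over an uncertainty set would add one such constraint per plant sample or vertex.

```python
# Hedged sketch (not the paper's SOS condition): nominal l2-monotonic ILC design via convex optimization.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
N = 8                                        # trial length in the lifted description
P = np.tril(rng.uniform(0.2, 1.0, (N, N)))   # assumed lower-triangular (causal) lifted plant

L = cp.Variable((N, N))                      # learning function to be designed
gamma = cp.Variable(nonneg=True)             # contraction factor: smaller means faster convergence

# ||I - P L||_2 <= gamma is the spectral-norm form of the LMI in the lead-in text.
constraints = [cp.norm(np.eye(N) - P @ L, 2) <= gamma]
cp.Problem(cp.Minimize(gamma), constraints).solve()

print("contraction factor:", gamma.value)    # < 1 implies monotonic l2 convergence (nominal case)
```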
The capability to switch between grid-connected and islanded modes has promoted the adoption of microgrid technology for powering remote locations. Stabilizing frequency during an islanding event, however, is a challenging control task, particularly under high penetration of converter-interfaced sources. In this paper, a numerical optimal control (NOC)-based control synthesis methodology is proposed for microgrid islanding preparedness that ensures guaranteed performance. The key feature of the proposed paradigm is near real-time centralized scheduling for real-time decentralized execution. For tractable computation, linearized models are used in the problem formulation. To accommodate the linearization errors, interval analysis is employed to compute the linearization-induced uncertainty as numerical intervals, so that the NOC problem can be formulated as a robust mixed-integer linear program. The proposed control is verified on the full nonlinear model in Simulink. The simulation results show the effectiveness of the proposed control paradigm and the necessity of considering linearization-induced uncertainty.
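The interval step can be illustrated in isolation. The snippet below (with assumed variable names) bounds the error of linearizing a bilinear power term over an operating box and shows how the worst-case error would tighten the corresponding limit in the robust MILP; it is a sketch of the concept, not the paper's formulation.

```python
# Hedged sketch: bound the linearization error of P = V*I over an operating box, then use the
# worst case to tighten the corresponding constraint in the robust MILP.
from itertools import product

def interval_linearization_error(f, grad, x0, box):
    """Range of f(x) - f(x0) - grad(x0).(x - x0) over a box, evaluated at the corners.
    (For bilinear terms the extrema are attained at corners; general nonlinear terms
    would need proper interval arithmetic instead of corner enumeration.)"""
    f0 = f(*x0)
    g = grad(*x0)
    errs = []
    for corner in product(*box):
        lin = f0 + sum(gi * (ci - xi) for gi, ci, xi in zip(g, corner, x0))
        errs.append(f(*corner) - lin)
    return min(errs), max(errs)

# Bilinear power term P = V*I, linearized at (V0, I0), with V in [0.95, 1.05] and I in [0.0, 1.2].
f = lambda V, I: V * I
grad = lambda V, I: (I, V)                       # (dP/dV, dP/dI)
err_lo, err_hi = interval_linearization_error(f, grad, (1.0, 0.6), [(0.95, 1.05), (0.0, 1.2)])

# In the robust MILP, a limit such as  P_lin <= P_max  becomes  P_lin <= P_max - err_hi,
# so the constraint still holds for the true nonlinear model despite linearization error.
print("linearization error interval:", (err_lo, err_hi))
```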
The main objective of this work is to develop a miniaturized, high-accuracy, single-turn absolute rotary encoder called ASTRAS360. Its measurement principle is based on capturing an image that uniquely identifies the rotation angle. To evaluate this angle, the image first has to be classified into its sector based on its color, and only then can the angle be regressed. Inspired by machine learning, we built a calibration setup able to generate labeled training data automatically. We used these training data to test, characterize, and compare several machine learning algorithms for the classification and the regression. In an additional experiment, we also characterized the tolerance of our rotary encoder to eccentric mounting. Our findings demonstrate that various algorithms can perform these tasks with high accuracy and reliability; furthermore, providing extra inputs (e.g., the rotation direction) allows the machine learning algorithms to compensate for the mechanical imperfections of the rotary encoder.
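A minimal two-stage pipeline in the same spirit, using synthetic stand-ins for the image-derived features (the actual ASTRAS360 features, data, and model choices differ): a classifier recovers the color-coded sector, and a regressor then estimates the angle with the sector supplied as an extra input.

```python
# Illustrative classify-then-regress pipeline with synthetic features (assumed layout, not the real encoder data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n, n_sectors = 3000, 8
angle = rng.uniform(0.0, 360.0, n)                       # ground-truth rotation angle (deg)
sector = (angle // (360.0 / n_sectors)).astype(int)      # sector label encoded by color

# Synthetic "image" features: a noisy per-sector color code plus an intra-sector phase cue.
color = np.eye(n_sectors)[sector] + 0.1 * rng.standard_normal((n, n_sectors))
phase = np.column_stack([np.sin(np.radians(angle) * n_sectors),
                         np.cos(np.radians(angle) * n_sectors)])
X = np.hstack([color, phase])

# Stage 1: classify the sector from the color features.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:2000], sector[:2000])

# Stage 2: regress the angle, using the sector as an extra input.
X_reg = np.hstack([X, sector.reshape(-1, 1)])
reg = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0).fit(
    X_reg[:2000], angle[:2000])

sector_hat = clf.predict(X[2000:])
angle_hat = reg.predict(np.hstack([X[2000:], sector_hat.reshape(-1, 1)]))
print("mean abs angle error (deg):", np.mean(np.abs(angle_hat - angle[2000:])))
```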
In this work, a data-driven modeling framework for switched dynamical systems under time-dependent switching is proposed. The learning technique used to model the system dynamics is the Extreme Learning Machine (ELM). First, a method is developed to detect switching events in the training data extracted from system traces; the training data can thus be segmented at the detected switching instants. Then, ELM is used to learn the dynamics of the subsystems. The learning process includes merging the segmented trace data and modeling the subsystem dynamics. Owing to the specific learning structure of ELM, the modeling process is formulated as an iterative least-squares (LS) optimization problem. Finally, the switching sequence can be reconstructed from the switching detection and segmented trace merging results. An example of data-driven modeling of a DC-DC converter is presented to show the effectiveness of the developed approach.
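A toy version of the workflow, under the assumptions noted in the comments: switching instants are detected from one-step prediction residuals of a sliding-window least-squares fit, the trace is segmented at those instants, and each subsystem is identified by least squares (an ELM variant would apply the same LS solve to a fixed random-feature map of the regressor).

```python
# Hedged sketch: switching detection, trace segmentation, and per-segment LS identification.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic switched scalar trace: mode 0 is x+ = 0.9x + 0.5u, mode 1 is x+ = 0.5x + 2.0u,
# with a constant input u = 1 and a single switch at k = 100.
T, k_switch = 200, 100
u = np.ones(T)
x = np.empty(T + 1); x[0] = 0.0
for k in range(T):
    a, b = (0.9, 0.5) if k < k_switch else (0.5, 2.0)
    x[k + 1] = a * x[k] + b * u[k] + 0.01 * rng.standard_normal()

# 1) Switching detection: fit [a, b] on a sliding window and flag large one-step residuals.
window, threshold, switches = 20, 0.2, []
for k in range(window, T):
    Phi = np.column_stack([x[k - window:k], u[k - window:k]])
    theta, *_ = np.linalg.lstsq(Phi, x[k - window + 1:k + 1], rcond=None)
    if abs(x[k + 1] - (theta[0] * x[k] + theta[1] * u[k])) > threshold:
        switches.append(k)

# 2) Segment the trace at the first detected instant and fit one model per segment.
bounds = [0] + switches[:1] + [T]
for lo, hi in zip(bounds[:-1], bounds[1:]):
    Phi = np.column_stack([x[lo:hi], u[lo:hi]])
    theta, *_ = np.linalg.lstsq(Phi, x[lo + 1:hi + 1], rcond=None)
    print(f"segment [{lo}, {hi}): a ~ {theta[0]:.2f}, b ~ {theta[1]:.2f}")
```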
Coordinating multiple local power sources can restore critical loads after major outages caused by extreme events. A radial topology is required for distribution system restoration, yet determining a good topology in real time for online use is a challenge. In this paper, a graph theory-based heuristic that considers the power flow state is proposed to determine the radial topology quickly. Loops in the distribution network are eliminated iteratively. The proposed method is validated with single-snapshot and multi-period critical load restoration models on different test cases. The case studies indicate that the proposed method can determine a radial topology in a few seconds while maintaining the restoration capacity.
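A compact sketch of the loop-elimination idea using networkx, with made-up flow values: while any loop remains, the switch carrying the least flow in that loop is opened, which leaves a radial (tree) topology. The paper's heuristic ranks candidate switches using the actual power flow state of the restoration model rather than the illustrative weights used here.

```python
# Hedged sketch of iterative loop elimination on a small meshed feeder (illustrative weights).
import networkx as nx

# Nodes are buses; edge weight stands in for |power flow| on that line (p.u.).
G = nx.Graph()
G.add_weighted_edges_from([
    (1, 2, 0.8), (2, 3, 0.5), (3, 4, 0.2), (4, 1, 0.1),   # loop 1-2-3-4
    (3, 5, 0.6), (5, 6, 0.3), (6, 4, 0.05),               # loop 3-5-6-4
])

# Iteratively open the lightest-loaded switch in some remaining loop until the network is radial.
opened = []
while True:
    try:
        cycle = nx.find_cycle(G)                           # any remaining loop
    except nx.NetworkXNoCycle:
        break
    u, v = min(cycle, key=lambda e: G[e[0]][e[1]]["weight"])[:2]
    G.remove_edge(u, v)
    opened.append((u, v))

print("opened switches:", opened)
print("radial:", nx.is_tree(G))                            # True once all loops are eliminated
```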