Differentiation is an important task in control, observation, and fault detection. Levant's differentiator is unique in that it estimates, exactly and robustly, the derivatives of a signal with a bounded high-order derivative. However, its convergence time, although finite, grows unboundedly with the norm of the initial differentiation error, so it is uncertain when the estimated derivative becomes exact. In this paper we propose an extension of Levant's differentiator such that the worst-case convergence time can be assigned arbitrarily, independently of the initial condition, i.e., the estimate converges in fixed time. We also propose a family of continuous differentiators and provide a unified Lyapunov framework for their analysis and design.
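For reference, a minimal sketch of the classical (finite-time) first-order Levant differentiator that this work extends; the gains $\lambda_1,\lambda_0$ and the Lipschitz bound $L$ are the usual tuning parameters, and the fixed-time modification proposed in the paper is not reproduced here:
$$
\begin{aligned}
\dot z_0 &= z_1 - \lambda_1 L^{1/2}\,|z_0 - f(t)|^{1/2}\,\mathrm{sign}\big(z_0 - f(t)\big),\\
\dot z_1 &= -\lambda_0 L\,\mathrm{sign}\big(z_0 - f(t)\big),
\end{aligned}
$$
where $|\ddot f(t)|\le L$. For suitable gains (e.g., $\lambda_1 = 1.5$, $\lambda_0 = 1.1$), $z_0 \to f(t)$ and $z_1 \to \dot f(t)$ exactly in finite time, but that time grows with the initial error, which is precisely the limitation addressed above.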
There is increasing interest in designing differentiators that converge exactly before a prespecified time, regardless of the initial conditions, i.e., that are fixed-time convergent with a predefined Upper Bound of their Settling Time (UBST) […]
Constructing differentiation algorithms with fixed-time convergence and a predefined Upper Bound on their Settling Time (UBST), i.e., predefined-time differentiators, is attracting attention for solving estimation and control problems under […]
Algorithms with uniform convergence with respect to their initial condition (i.e., with fixed-time stability) are receiving increasing attention for solving control and observer design problems under time constraints. However, we still lack a general […]
In power distribution systems, the growing penetration of renewable energy resources brings new challenges to maintaining voltage safety, which is further complicated by the limited model information available for distribution systems. To address these challenges, […]
For optimal power flow problems with chance constraints, a particularly effective method is based on a fixed-point iteration applied to a sequence of deterministic power flow problems. However, a priori, the convergence of such an approach is not necessarily […]
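To make the structure of such a scheme concrete, one way to read it (with $G$ a generic map introduced here for illustration, not notation from the paper) is as a fixed-point iteration on the vector of constraint tightenings $\alpha$: solving the deterministic problem with tightenings $\alpha^k$ yields an operating point, from which updated tightenings $\alpha^{k+1}$ are computed,
$$
\alpha^{k+1} = G(\alpha^k), \qquad k = 0, 1, 2, \dots,
$$
and the scheme has converged once $\alpha^\star = G(\alpha^\star)$. A sufficient, but not a priori guaranteed, condition for convergence is that $G$ be a contraction on a suitable set, $\|G(\alpha) - G(\beta)\| \le \rho\,\|\alpha - \beta\|$ with $\rho < 1$, in which case Banach's fixed-point theorem gives existence, uniqueness, and convergence of the iterates to the fixed point.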