For many years, the Simplified Refined Instrumental Variable method for Continuous-time systems (SRIVC) has been widely used for system identification. The intersample behaviour of the input plays an important role in this method, and it has recently been shown that the SRIVC estimator is not consistent when an incorrect assumption is made about the intersample behaviour. In this paper, we present an extension of the SRIVC algorithm that can handle continuous-time multisine signals, which cannot be interpolated exactly by hold reconstructions. The proposed estimator is generically consistent for any input reconstructed through zero-order or first-order hold devices, and we show that it is also generically consistent for continuous-time multisine inputs. The statistical performance of the proposed estimator is compared with that of the standard SRIVC estimator through extensive simulations.
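As a rough numerical illustration of why a multisine cannot be reproduced exactly by a hold device (a generic sketch only, not the SRIVC algorithm), the Python snippet below samples a continuous-time multisine, rebuilds it with a zero-order hold, and measures the resulting intersample error; the frequencies, amplitudes and sampling period are arbitrary choices for the demo.

import numpy as np

# Hypothetical multisine: sum of three sinusoids (frequencies in rad/s are
# arbitrary demo choices, not taken from the paper).
freqs = np.array([1.0, 2.7, 5.3])
amps = np.array([1.0, 0.5, 0.3])

def multisine(t):
    """Continuous-time multisine evaluated at times t."""
    return sum(a * np.sin(w * t) for a, w in zip(amps, freqs))

Ts = 0.1                                   # sampling period
t_samples = np.arange(0, 10, Ts)
u_samples = multisine(t_samples)

# Zero-order-hold reconstruction: hold the most recent sample constant.
t_fine = np.arange(0, 10, Ts / 50)
idx = np.minimum((t_fine // Ts).astype(int), len(u_samples) - 1)
u_zoh = u_samples[idx]

# The ZOH signal differs from the true multisine between the samples,
# which is the intersample mismatch discussed in the abstract.
err = np.max(np.abs(multisine(t_fine) - u_zoh))
print(f"max intersample error of ZOH reconstruction: {err:.3f}")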
In continuous-time system identification, the intersample behavior of the input signal is known to play a crucial role in the performance of estimation methods. One common input behavior assumption is that the spectrum of the input is band-limited. T
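As a minimal illustration of the band-limited input assumption (with arbitrary demo frequencies, unrelated to any specific estimator), the snippet below checks with a DFT that a multisine input carries energy only at a few frequencies well below the Nyquist rate.

import numpy as np

# Hypothetical band-limited input: a multisine whose highest frequency (4 Hz)
# lies well below the Nyquist frequency (50 Hz) of the sampling grid.
fs, N = 100.0, 1000                        # 10 s of data, 0.1 Hz DFT resolution
t = np.arange(N) / fs
u = (np.sin(2*np.pi*1.0*t)
     + 0.5*np.sin(2*np.pi*2.5*t)
     + 0.3*np.sin(2*np.pi*4.0*t))

U = np.fft.rfft(u)
f = np.fft.rfftfreq(N, d=1/fs)
active = f[np.abs(U) > 1e-6 * np.abs(U).max()]
print("frequencies carrying energy (Hz):", active)   # only 1.0, 2.5 and 4.0 Hz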
In this paper, we introduce the notion of periodic safety, which requires that the system trajectories periodically visit a subset of a forward-invariant safe set, and utilize it in a multi-rate framework where a high-level planner generates a refere
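To make the notion concrete, here is a small Python sketch that checks a toy version of periodic safety on a sampled trajectory: every state must lie in the safe set, and a smaller subset must be visited at least once in every window of a given number of steps. The dynamics, sets and period below are hypothetical placeholders, not the ones studied in the paper.

import numpy as np

def periodically_safe(traj, in_safe, in_subset, period):
    """Toy check of periodic safety on a sampled trajectory: every state lies
    in the safe set, and the subset is visited at least once in every window
    of `period` consecutive steps."""
    if not all(in_safe(x) for x in traj):
        return False
    visits = [in_subset(x) for x in traj]
    return all(any(visits[k:k + period])
               for k in range(0, len(visits) - period + 1))

# Hypothetical example: a scalar state that decays toward zero.
traj = [np.array([2.0 * 0.8**k]) for k in range(30)]
in_safe = lambda x: abs(x[0]) <= 3.0       # forward-invariant safe set (assumed)
in_subset = lambda x: abs(x[0]) <= 0.5     # subset that must be revisited
print(periodically_safe(traj, in_safe, in_subset, period=10))   # True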
In this paper, we first propose a method that can efficiently compute the maximal robust controlled invariant set for discrete-time linear systems with pure delay in input. The key to this method is to construct an auxiliary linear system (without de
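One standard way to obtain a delay-free auxiliary system from a linear system with pure input delay is to augment the state with the inputs still "in flight"; whether this matches the paper's construction is not stated here, but the sketch below shows the idea for x_{k+1} = A x_k + B u_{k-d} with placeholder values of A, B and d.

import numpy as np

def augment_input_delay(A, B, d):
    """Augmented delay-free system for x_{k+1} = A x_k + B u_{k-d}:
    stack the state with the d pending inputs, so that
    z_{k+1} = Aa z_k + Ba u_k with z_k = [x_k, u_{k-d}, ..., u_{k-1}]."""
    n, m = B.shape
    Aa = np.zeros((n + d*m, n + d*m))
    Ba = np.zeros((n + d*m, m))
    Aa[:n, :n] = A
    Aa[:n, n:n+m] = B                      # oldest stored input enters the plant
    for i in range(d - 1):                 # shift the stored inputs forward
        Aa[n + i*m : n + (i+1)*m, n + (i+1)*m : n + (i+2)*m] = np.eye(m)
    Ba[n + (d-1)*m :, :] = np.eye(m)       # current input joins the queue
    return Aa, Ba

# Hypothetical 2-state system with a 3-step input delay.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Aa, Ba = augment_input_delay(A, B, d=3)
print(Aa.shape, Ba.shape)                  # (5, 5) (5, 1)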
Discrete abstractions have become a standard approach to assist control synthesis under complex specifications. Most techniques for the construction of discrete abstractions are based on sampling of both the state and time spaces, which may not be ab
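As a generic illustration of state-space sampling (not the specific abstraction techniques referred to above), the snippet below grids a two-dimensional box into cells and records which cells the corners of each cell are mapped into under one step of a toy linear map; a real abstraction would over-approximate the reachable set rather than propagate corner points.

import numpy as np
from itertools import product

# Hypothetical setting: state space [-1, 1]^2 gridded into 10x10 cells,
# and a toy sampled-time linear map x+ = A x (no input, for brevity).
A = np.array([[0.9, 0.2], [-0.1, 0.9]])
edges = np.linspace(-1.0, 1.0, 11)         # 10 cells per axis

def cell_of(x):
    """Index of the grid cell containing x (None if x left the box)."""
    if np.any(x < -1.0) or np.any(x > 1.0):
        return None
    i = min(int(np.searchsorted(edges, x[0], side='right')) - 1, 9)
    j = min(int(np.searchsorted(edges, x[1], side='right')) - 1, 9)
    return (i, j)

# Build transitions by propagating each cell's corner points one step.
transitions = {}
for i, j in product(range(10), repeat=2):
    corners = [np.array([edges[i+a], edges[j+b]]) for a in (0, 1) for b in (0, 1)]
    transitions[(i, j)] = {cell_of(A @ c) for c in corners} - {None}

print(len(transitions), "abstract states,",
      sum(len(s) for s in transitions.values()), "transitions")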
We study the problem of controlling multi-agent systems under a set of signal temporal logic tasks. Signal temporal logic is a formalism that is used to express time and space constraints for dynamical systems. Recent methods to solve the control syn
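For readers unfamiliar with the formalism, the sketch below evaluates the standard space robustness of two basic signal temporal logic constructs, "always" and "eventually" applied to the predicate x > 0, on a sampled scalar signal; the signal and the time windows are placeholders, and nested formulas are not handled.

import numpy as np

def rho_always(signal, a, b):
    """Robustness of 'always_[a,b] (signal > 0)' at time 0 on a sampled
    signal (indices used as time stamps): the minimum margin on the window."""
    return float(np.min(signal[a:b + 1]))

def rho_eventually(signal, a, b):
    """Robustness of 'eventually_[a,b] (signal > 0)' at time 0:
    the maximum margin on the window."""
    return float(np.max(signal[a:b + 1]))

# Hypothetical distance-to-obstacle signal for one agent.
x = np.array([1.0, 0.8, 0.6, 0.4, 0.5, 0.9, 1.2])
print(rho_always(x, 0, 4))       # 0.4 -> satisfied with margin 0.4
print(rho_eventually(x, 2, 6))   # 1.2 -> satisfied with margin 1.2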