Non-asymptotic Closed-Loop System Identification using Autoregressive Processes and Hankel Model Reduction


Abstract

One of the primary challenges of system identification is determining how much data is necessary to adequately fit a model. Non-asymptotic characterizations of the performance of system identification methods provide this knowledge. Such characterizations are available for several algorithms performing open-loop identification. Often, however, data is collected in closed loop. Applying open-loop identification methods to closed-loop data can result in biased estimates. One method used by subspace identification techniques to eliminate these biases involves first fitting a long-horizon autoregressive model, then performing model reduction. The asymptotic behavior of such algorithms is well characterized, but the non-asymptotic behavior is not. This work provides a non-asymptotic characterization of one particular variant of these algorithms. More specifically, we provide non-asymptotic upper bounds on the generalization error of the produced model, as well as high-probability bounds on the difference between the produced model and the finite-horizon Kalman filter.
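To make the two-step pipeline concrete, the following is a minimal sketch, not the paper's algorithm or its analysis: it fits a long-horizon ARX (autoregressive with exogenous input) model to closed-loop input-output data by least squares, recovers Markov parameters from the fitted polynomials, and then performs a Ho-Kalman-style Hankel reduction to a low-order state-space model. The horizon p, model order n, feedback gain, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- synthetic closed-loop data (placeholder for real measurements) ---
T, p, n = 2000, 20, 2                      # samples, ARX horizon, reduced order (assumed)
A_true = np.array([[0.9, 0.2], [0.0, 0.7]])
B_true = np.array([[0.0], [1.0]])
C_true = np.array([[1.0, 0.0]])
K_fb = np.array([[0.3, 0.1]])              # static feedback closing the loop
x = np.zeros(2)
u_hist, y_hist = [], []
for t in range(T):
    u = (-K_fb @ x).item() + rng.normal()          # feedback plus excitation
    y = (C_true @ x).item() + 0.1 * rng.normal()   # noisy output
    u_hist.append(u)
    y_hist.append(y)
    x = A_true @ x + B_true[:, 0] * u + 0.05 * rng.normal(size=2)
u_arr, y_arr = np.array(u_hist), np.array(y_hist)

# --- step 1: long-horizon ARX fit by least squares ---
# Regress y_t on the past p inputs and outputs; a long horizon mitigates the
# bias that feedback-induced correlation would otherwise cause.
rows, targets = [], []
for t in range(p, T):
    rows.append(np.concatenate([u_arr[t - p:t][::-1], y_arr[t - p:t][::-1]]))
    targets.append(y_arr[t])
theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
b, a = theta[:p], theta[p:]                # input and output polynomial coefficients

# Recover plant Markov parameters h_k from the ARX polynomials via A(q) H(q) = B(q).
h = np.zeros(p + 1)                        # h[0] = 0: no direct feedthrough assumed
for k in range(1, p + 1):
    h[k] = b[k - 1] + sum(a[j - 1] * h[k - j] for j in range(1, k + 1))

# --- step 2: Hankel model reduction (Ho-Kalman / ERA style) ---
half = p // 2
H = np.array([[h[i + j + 1] for j in range(half)] for i in range(half)])
U, s, Vt = np.linalg.svd(H)
Obs = U[:, :n] * np.sqrt(s[:n])            # observability factor
Con = np.sqrt(s[:n])[:, None] * Vt[:n, :]  # controllability factor
C_hat = Obs[:1, :]
B_hat = Con[:, :1]
A_hat = np.linalg.pinv(Obs[:-1, :]) @ Obs[1:, :]   # shift-invariance of Obs

print("true eigenvalues:     ", np.linalg.eigvals(A_true))
print("estimated eigenvalues:", np.linalg.eigvals(A_hat))
```

The eigenvalue comparison at the end is a convenient sanity check because eigenvalues are invariant to the similarity transform left undetermined by the Hankel factorization; the paper's bounds concern generalization error and distance to the finite-horizon Kalman filter rather than this simplified plant-recovery setup.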
