Probabilistic methods have attracted much interest in fault detection design, but their need for complete distributional knowledge is seldom fulfilled. This has spurred efforts in distributionally robust fault detection (DRFD) design, which secures robustness against inexact distributions by using moment-based ambiguity sets as the prime modelling tool. However, because the worst-case distribution in such sets is implausibly discrete, the resulting design suffers from over-pessimism and can mask the true fault. This paper develops a new DRFD design scheme with reduced conservatism by assuming unimodality of the true distribution, a property commonly encountered in practice. To tackle the chance constraint on false alarms, we first derive a new generalized Gauss bound on the probability outside an ellipsoid, which is less conservative than known Chebyshev bounds. As a result, we obtain analytical solutions to DRFD design problems that are less conservative than existing ones disregarding unimodality. We further encode bounded support information into the ambiguity sets, derive a tightened multivariate Gauss bound, and develop approximate reformulations of the design problems as convex programs. Moreover, the derived generalized Gauss bounds are broadly applicable to various change detection tasks for setting alarm thresholds. Results on a laboratory system show that incorporating unimodality information reduces the conservatism of distributionally robust designs and achieves a better tradeoff between robustness and sensitivity.
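As a point of reference for the conservatism reduction claimed above, the classical one-dimensional Chebyshev and Gauss inequalities illustrate how unimodality tightens tail bounds (this is only textbook background, not the generalized ellipsoidal bound derived in the paper). For a random variable $X$ with mean $\mu$ and variance $\sigma^2$, Chebyshev's inequality gives
$$\Pr\{|X-\mu| \ge k\sigma\} \le \frac{1}{k^{2}}, \qquad k>0,$$
whereas, if $X$ is additionally unimodal with mode $m$ and $\tau^{2} = \mathbb{E}\!\left[(X-m)^{2}\right]$, the Gauss inequality tightens the tail to
$$\Pr\{|X-m| \ge k\tau\} \le \frac{4}{9k^{2}}, \qquad k \ge \tfrac{2}{\sqrt{3}},$$
i.e., roughly a factor of $4/9$ smaller in the tail regime, which translates into lower alarm thresholds for the same false-alarm guarantee.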