
Fast Universal Algorithms for Robustness Analysis

 Added by Xinjia Chen
 Publication date 2008
Language: English





In this paper, we develop efficient randomized algorithms for estimating the probabilistic robustness margin and constructing the robustness degradation curve of uncertain dynamic systems. One remarkable feature of these algorithms is their universal applicability to robustness analysis problems with arbitrary robustness requirements and uncertainty bounding sets. In contrast to existing probabilistic methods, our approach does not depend on the feasibility of computing the deterministic robustness margin. We have developed efficient methods such as probabilistic comparison, probabilistic bisection, and backward iteration to facilitate the computation. In particular, confidence intervals for binomial random variables are used frequently, both in estimating the probabilistic robustness margin and in evaluating the accuracy of the estimated robustness degradation function. Motivated by the importance of fast computation of binomial confidence intervals in probabilistic robustness analysis, we have derived an explicit formula for constructing a confidence interval of the binomial parameter with guaranteed coverage probability. The formula overcomes the limitation of the normal approximation, which is asymptotic in nature and thus inevitably introduces unknown errors in applications. Moreover, the formula is extremely simple and very tight in comparison with the classic Clopper-Pearson approach.
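To make the role of these intervals concrete, below is a minimal Python sketch (not the paper's algorithm) that estimates by Monte Carlo the proportion of systems in an uncertainty set of a given radius satisfying a robustness requirement, and attaches a confidence interval to that estimate. Since the explicit formula derived in the paper is not reproduced in this abstract, the sketch uses the classic Clopper-Pearson interval computed from the beta distribution, together with the normal approximation it is compared against; `robust_ok` and `sample_uncertainty` are hypothetical placeholders for a user-supplied robustness test and uncertainty sampler.

```python
import numpy as np
from scipy.stats import beta, norm


def clopper_pearson(k, n, alpha=0.05):
    """Classic Clopper-Pearson (exact) interval for a binomial proportion;
    coverage is guaranteed to be at least 1 - alpha for every n."""
    lower = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    upper = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lower, upper


def normal_approx(k, n, alpha=0.05):
    """Normal-approximation interval; asymptotic only, so its coverage is
    not guaranteed for finite n (the limitation noted in the abstract)."""
    p_hat = k / n
    half = norm.ppf(1 - alpha / 2) * np.sqrt(p_hat * (1 - p_hat) / n)
    return max(0.0, p_hat - half), min(1.0, p_hat + half)


def count_successes(robust_ok, sample_uncertainty, radius, n=2000):
    """Monte Carlo: count sampled uncertainties of the given radius that
    satisfy the robustness requirement. Both callbacks are hypothetical
    placeholders for a user-supplied robustness test and sampler."""
    return sum(bool(robust_ok(sample_uncertainty(radius))) for _ in range(n)), n


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = lambda r: rng.uniform(-r, r)   # toy scalar uncertainty in [-r, r]
    ok = lambda q: abs(q) < 0.8             # toy robustness requirement
    k, n = count_successes(ok, sample, radius=1.0)
    print(f"estimated proportion: {k / n:.3f}")
    print("Clopper-Pearson 95% CI:", clopper_pearson(k, n))
    print("normal approximation 95% CI:", normal_approx(k, n))
```

The Clopper-Pearson interval guarantees at least the nominal coverage for every sample size, whereas the normal approximation achieves it only asymptotically; the explicit formula described in the abstract is motivated by exactly this gap.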




Read More

111 - Xinjia Chen, Kemin Zhou 2008
This paper considers the robust ${\cal D}$-stability margin problem under polynomic structured real parametric uncertainty. Based on the work of De Gaston and Safonov (1988), we have developed techniques such as a parallel frequency sweeping strategy and different domain splitting schemes, which significantly reduce the computational complexity and guarantee convergence.
144 - Xinjia Chen, Kemin Zhou 2008
In this paper, we consider robust control using randomized algorithms. We extend the existing order statistics distribution theory to the general case in which the distribution of the population is not assumed to be continuous and the order statistics are associated with certain constraints. In particular, we derive an inequality on the distribution of related order statistics. Moreover, we propose two different approaches for searching for reliable solutions to the robust analysis and optimal synthesis problems under constraints. Furthermore, the minimum computational effort is investigated and bounds on the sample size are derived.
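For context, sample-size questions of this kind are usually answered in the randomized-algorithms literature with explicit bounds. The following is a small sketch of two classic bounds, the additive Chernoff/Hoeffding bound and the log-over-log bound for probabilistic worst-case analysis; they are standard results and are not the bounds derived in this paper, which the abstract does not reproduce.

```python
import math


def chernoff_sample_size(eps, delta):
    """Additive Chernoff/Hoeffding bound: smallest N such that the empirical
    probability is within eps of the true one with confidence 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))


def worst_case_sample_size(eps, delta):
    """Classic log-over-log bound: with N i.i.d. samples, the set of
    uncertainties performing worse than the empirical worst case has
    measure at most eps, with confidence 1 - delta."""
    return math.ceil(math.log(1.0 / delta) / math.log(1.0 / (1.0 - eps)))


print(chernoff_sample_size(0.01, 1e-3))    # 38005
print(worst_case_sample_size(0.01, 1e-3))  # 688
```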
The dynamic response of power grids to small disturbances influences their overall stability. This paper examines the effect of network topology on the linearized time-invariant dynamics of electric power systems. The proposed framework utilizes ${\cal H}_2$-norm based stability metrics to study the optimal placement of lines on existing networks as well as the topology design of new networks. The design task is first posed as an NP-hard mixed-integer nonlinear program (MINLP) that is exactly reformulated as a mixed-integer linear program (MILP) using McCormick linearization. To improve computation time, graph-theoretic properties are exploited to derive valid inequalities (cuts) and tighten bounds on the continuous optimization variables. Moreover, a cutting-plane generation procedure is put forth that can interject during the MILP solve and augment the problem with additional constraints on the fly. The efficacy of our approach in designing optimal grid topologies is demonstrated through numerical tests on the IEEE 39-bus network.
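The exact MINLP-to-MILP step mentioned above relies on McCormick linearization of bilinear terms. As a minimal illustrative sketch (not the paper's formulation), the function below writes the four McCormick envelope inequalities for a product w = x*y over a box and checks them numerically; the envelope becomes exact when one of the factors is binary, which is presumably what permits an exact reformulation when products involve the binary line-placement variables.

```python
def mccormick_envelope(x_lo, x_hi, y_lo, y_hi):
    """Return a checker for the four McCormick envelope inequalities of the
    bilinear term w = x * y on the box [x_lo, x_hi] x [y_lo, y_hi]."""
    def satisfied(x, y, w):
        return (w >= x_lo * y + x * y_lo - x_lo * y_lo and   # under-estimators
                w >= x_hi * y + x * y_hi - x_hi * y_hi and
                w <= x_hi * y + x * y_lo - x_hi * y_lo and   # over-estimators
                w <= x_lo * y + x * y_hi - x_lo * y_hi)
    return satisfied


# Any true product inside the box satisfies the envelope; when x is binary
# (x_lo = 0, x_hi = 1) the four inequalities force w = x * y exactly.
check = mccormick_envelope(0.0, 1.0, -2.0, 3.0)
assert check(0.4, 1.5, 0.4 * 1.5)
assert check(1.0, 2.5, 2.5) and check(0.0, 2.5, 0.0)
```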
Sample reuse techniques have significantly reduced the numerical complexity of probabilistic robustness analysis. Existing results show that, for a nested collection of hyperspheres, the complexity of performing $N$ equivalent i.i.d. (independent and identically distributed) experiments for each sphere is absolutely bounded, independent of the number of spheres and depending only on the initial and final radii. In this chapter we elevate sample reuse to a new level of generality and establish that the numerical complexity of performing $N$ equivalent i.i.d. experiments for a chain of sets is absolutely bounded whenever the sets are nested. Each set does not even have to be connected, as long as the nesting property holds. Thus, for example, the result permits the integration of deterministic and probabilistic analysis to eliminate regions from an uncertainty set and reduce the complexity of some problems even further. From a more general viewpoint, the result enables the analysis of complex decision problems mixing real-valued and discrete-valued random variables.
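A minimal sketch of the sample reuse idea for a nested chain of balls (hypothetical helper names, not the chapter's algorithm): samples drawn uniformly in a larger ball that happen to land inside a smaller nested ball are still uniformly distributed there, so only the shortfall needs to be drawn fresh, and the total number of draws across the chain stays bounded.

```python
import numpy as np

rng = np.random.default_rng(1)


def uniform_in_ball(radius, dim):
    """Draw one point uniformly from the ball of the given radius."""
    v = rng.normal(size=dim)
    return v * (radius * rng.random() ** (1.0 / dim) / np.linalg.norm(v))


def reuse_samples(radii, dim, n_per_set):
    """For a decreasing chain of nested balls, return n_per_set uniform
    samples per ball while reusing the samples already drawn for larger
    balls; only the shortfall is generated fresh."""
    radii = sorted(radii, reverse=True)
    pool = [uniform_in_ball(radii[0], dim) for _ in range(n_per_set)]
    total_draws = n_per_set
    samples = {radii[0]: list(pool)}
    for r in radii[1:]:
        kept = [x for x in pool if np.linalg.norm(x) <= r]          # reused
        fresh = [uniform_in_ball(r, dim) for _ in range(n_per_set - len(kept))]
        total_draws += len(fresh)
        pool = kept + fresh
        samples[r] = list(pool)
    return samples, total_draws


samples, draws = reuse_samples([1.0, 0.8, 0.6, 0.4], dim=4, n_per_set=500)
print("fresh draws:", draws, "versus", 4 * 500, "without reuse")
```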
We develop a fast algorithm to construct the robustness degradation function, which quantitatively describes the relationship between the proportion of systems satisfying the robustness requirement and the radius of the uncertainty set. This function can be applied to predict whether a controller design based on an inexact mathematical model will perform satisfactorily when implemented on the true system.
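As an illustration of the object being constructed (not the fast algorithm itself), the sketch below tabulates a robustness degradation curve by brute-force Monte Carlo over a grid of radii, assuming a box-shaped uncertainty set and a placeholder `requirement_met` predicate standing in for the actual robustness test.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder robustness test: in practice this would evaluate the actual
# robustness requirement for the system perturbed by the uncertainty q.
requirement_met = lambda q: np.all(np.abs(q) < 0.75)


def degradation_curve(radii, dim=3, n=1000):
    """Proportion of uncertainties in a box of half-width r (for each r in
    radii) that satisfy the requirement, i.e. a crude degradation curve."""
    curve = []
    for r in radii:
        hits = sum(bool(requirement_met(rng.uniform(-r, r, size=dim)))
                   for _ in range(n))
        curve.append((r, hits / n))
    return curve


for r, p in degradation_curve(np.linspace(0.25, 2.0, 8)):
    print(f"radius {r:.2f}: proportion {p:.3f}")
```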