Information-theory based variational principles have proven effective at providing scalable uncertainty quantification (i.e. robustness) bounds for quantities of interest in the presence of nonparametric model-form uncertainty. In this work, we combine such variational formulas with functional inequalities (Poincaré, log-Sobolev, Lyapunov functions) to derive explicit uncertainty quantification bounds for time-averaged observables, comparing a Markov process to a second (not necessarily Markov) process. These bounds are well-behaved in the infinite-time limit and apply to steady states of both discrete and continuous-time Markov processes.
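As a rough sketch of the type of variational formula being combined with functional inequalities here (generic notation, not taken from the abstract): for an observable $f$, a baseline Markov model $Q$, and an alternative model $P$, the Gibbs variational principle yields
$$
\mathbb{E}_P[f]-\mathbb{E}_Q[f]\;\le\;\inf_{c>0}\left\{\frac{1}{c}\log\mathbb{E}_Q\!\left[e^{\,c\,(f-\mathbb{E}_Q[f])}\right]+\frac{1}{c}\,R(P\,\|\,Q)\right\},
$$
where $R(P\|Q)$ denotes relative entropy. Functional inequalities such as the Poincaré inequality, $\operatorname{Var}_\mu(f)\le C_P\,\mathcal{E}(f,f)$, can then be used to control the cumulant generating function on the right-hand side for time-averaged observables, which helps keep the bounds well-behaved in the infinite-time limit.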
We obtain moment and Gaussian bounds for general Lipschitz functions evaluated along the sample path of a Markov chain. We treat Markov chains on general (possibly unbounded) state spaces via a coupling method. If the first moment of the coupling time…
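A schematic instance of the kind of Gaussian bound described (constants are placeholders; the precise assumptions on the coupling time are those of the truncated sentence above): for $f(x_1,\dots,x_n)$ that is $1$-Lipschitz in each coordinate,
$$
\mathbb{P}\Big(\big|f(X_1,\dots,X_n)-\mathbb{E}f(X_1,\dots,X_n)\big|\ge t\Big)\;\le\;2\exp\!\left(-\frac{t^2}{C\,n}\right),
$$
with a constant $C$ depending on the Lipschitz norms of $f$ and on the integrability of the coupling time.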
We prove pathwise large-deviation principles for switching Markov processes by exploiting the connection to associated Hamilton-Jacobi equations, following the method of Jin Feng and Thomas Kurtz. In the limit that we consider, we show how the large-deviation…
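For orientation, a compressed sketch of the Feng-Kurtz scheme referred to above (generic notation; the switching structure of the specific processes is not reproduced): one identifies a limiting Hamiltonian from the nonlinearly rescaled generators,
$$
Hf(x)\;=\;\lim_{n\to\infty}\frac{1}{n}\,e^{-nf(x)}\big(A_n e^{nf}\big)(x),
$$
verifies a comparison principle for the associated Hamilton-Jacobi equation $f-\lambda H(x,\nabla f)=h$, and obtains a pathwise rate function in action form,
$$
I(x)\;=\;I_0\big(x(0)\big)+\int_0^T\mathcal{L}\big(x(s),\dot x(s)\big)\,ds,\qquad \mathcal{L}(x,v)=\sup_p\big[\langle p,v\rangle-H(x,p)\big].
$$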
Rare events, and more general risk-sensitive quantities of interest (QoIs), are significantly impacted by uncertainty in the tail behavior of a distribution. Uncertainty in the tail can take many different forms, each of which leads to a particular a…
We investigate the dissipativity properties of a class of scalar second order parabolic partial differential equations with time-dependent coefficients. We provide explicit conditions on the drift term which ensure that the relative entropy of one par…
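To fix notation for the dissipativity statement being described (an assumed reading of the abstract): for two positive, mass-normalized solutions $u$ and $v$ of such an equation, the relative entropy is
$$
\mathcal{H}(u\,|\,v)(t)\;=\;\int u(t,x)\,\log\frac{u(t,x)}{v(t,x)}\,dx,
$$
and a dissipativity condition on the drift is one under which $t\mapsto\mathcal{H}(u\,|\,v)(t)$ is nonincreasing, or decays at an explicit rate.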
We present a general framework for uncertainty quantification that is a mosaic of interconnected models. We define global first and second order structural and correlative sensitivity analyses for random counting measures acting on risk functionals of…
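For orientation only, the classical variance-based (Sobol) global sensitivity indices of first and second order, for a scalar output $Y=g(\theta_1,\dots,\theta_d)$ with independent inputs, are
$$
S_i=\frac{\operatorname{Var}\big(\mathbb{E}[Y\,|\,\theta_i]\big)}{\operatorname{Var}(Y)},\qquad
S_{ij}=\frac{\operatorname{Var}\big(\mathbb{E}[Y\,|\,\theta_i,\theta_j]\big)}{\operatorname{Var}(Y)}-S_i-S_j;
$$
the structural and correlative indices defined in this work act instead on risk functionals of random counting measures and are not given by this classical formula.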