Rare events, and more generally risk-sensitive quantities of interest (QoIs), are significantly impacted by uncertainty in the tail behavior of a distribution. Uncertainty in the tail can take many different forms, each of which leads to a particular ambiguity set of alternative models. Distributional robustness bounds over such an ambiguity set constitute a stress test of the model. In this paper we develop a method, based on Rényi divergences, for constructing ambiguity sets that capture a user-specified form of tail perturbation. We then obtain distributional robustness bounds (performance guarantees) for risk-sensitive QoIs over these ambiguity sets, using the known connection between Rényi divergences and robustness of risk-sensitive QoIs. We also expand on this connection in several ways, including a generalization of the Donsker-Varadhan variational formula to Rényi divergences, and various tightness results. These ideas are illustrated through applications to uncertainty quantification in a model of lithium-ion battery failure, robustness of large deviations rate functions, and risk-sensitive distributionally robust optimization for option pricing.
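As a hedged sketch of the kind of Rényi-divergence robustness bound alluded to above (the observable g and the exponent α below are illustrative and not taken from the paper): writing the Rényi divergence of an alternative model Q from a baseline model P as
\[
R_\alpha(Q\,\|\,P) \;=\; \frac{1}{\alpha-1}\,\log \mathbb{E}_P\!\left[\Big(\tfrac{dQ}{dP}\Big)^{\alpha}\right], \qquad \alpha > 1,
\]
Hölder's inequality gives, for any bounded observable g,
\[
\log \mathbb{E}_Q\!\left[e^{g}\right] \;\le\; \frac{\alpha-1}{\alpha}\,\log \mathbb{E}_P\!\left[e^{\alpha g/(\alpha-1)}\right] \;+\; \frac{\alpha-1}{\alpha}\,R_\alpha(Q\,\|\,P).
\]
Bounds of this type control a risk-sensitive QoI under every alternative model Q in a Rényi-divergence ball by a risk-sensitive QoI under the baseline P plus a divergence penalty; the tightness results mentioned in the abstract concern when such inequalities are attained.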
We present a general framework for uncertainty quantification that is a mosaic of interconnected models. We define global first- and second-order structural and correlative sensitivity analyses for random counting measures acting on risk functionals o
Information-theory-based variational principles have proven effective at providing scalable uncertainty quantification (i.e., robustness) bounds for quantities of interest in the presence of nonparametric model-form uncertainty. In this work, we combine
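For context, a sketch of the type of information-theoretic variational bound this line of work builds on (the observable f and the parameter c below are illustrative): the Donsker-Varadhan/Gibbs variational formula yields, for an alternative model Q and a baseline model P with relative entropy R(Q‖P),
\[
\mathbb{E}_Q[f] - \mathbb{E}_P[f] \;\le\; \inf_{c>0}\left\{ \frac{1}{c}\,\log \mathbb{E}_P\!\left[e^{\,c\,(f-\mathbb{E}_P[f])}\right] \;+\; \frac{1}{c}\,R(Q\,\|\,P) \right\},
\]
with the analogous lower bound obtained by applying the same inequality to −f. The bound is nonparametric in Q: it holds uniformly over all alternative models in a relative-entropy ball around the baseline.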
This work affords new insights into Bayesian CART in the context of structured wavelet shrinkage. The main thrust is to develop a formal inferential framework for Bayesian tree-based regression. We reframe Bayesian CART as a g-type prior which departs
Bayesian optimization is a class of global optimization techniques. It regards the underlying objective function as a realization of a Gaussian process. Although the outputs of Bayesian optimization are random according to the Gaussian process assumption
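A minimal numerical sketch of the Gaussian-process view of Bayesian optimization described above may help fix ideas; the squared-exponential kernel, expected-improvement acquisition rule, and toy objective below are illustrative assumptions, not details from the paper.

# Minimal Bayesian-optimization sketch: a GP surrogate over a 1-D objective,
# with the next evaluation point chosen by expected improvement on a grid.
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    """GP posterior mean and standard deviation at x_test (zero prior mean)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf_kernel(x_test, x_test)) - np.sum(v ** 2, axis=0)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mean, std, best):
    """Expected improvement (for minimization) at points with given mean/std."""
    z = (best - mean) / std
    return (best - mean) * norm.cdf(z) + std * norm.pdf(z)

def objective(x):  # toy objective, purely for illustration
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(0)
x_train = rng.uniform(-2, 2, size=3)
y_train = objective(x_train)
x_grid = np.linspace(-2, 2, 200)

for _ in range(10):  # sequential Bayesian-optimization iterations
    mean, std = gp_posterior(x_train, y_train, x_grid)
    x_next = x_grid[np.argmax(expected_improvement(mean, std, y_train.min()))]
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))

print("approximate minimizer:", x_train[np.argmin(y_train)])

The GP posterior mean and standard deviation drive the choice of each new evaluation point, so the optimizer's output is itself random under the Gaussian-process model, which is the point the abstract raises.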
Quantifying the impact of parametric and model-form uncertainty on the predictions of stochastic models is a key challenge in many applications. Previous work has shown that the relative entropy rate is an effective tool for deriving path-space uncertainty quantification bounds
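As a sketch of how relative-entropy-rate bounds of the kind mentioned above arise (the observable f, horizon T, and parameter c are illustrative): applying the Gibbs variational inequality to the path measures on [0,T] and to the ergodic average \(\bar f_T = T^{-1}\int_0^T f(X_t)\,dt\) gives, for an alternative path measure \(\tilde P\) and baseline path measure \(P\),
\[
\mathbb{E}_{\tilde P}\big[\bar f_T\big] - \mathbb{E}_{P}\big[\bar f_T\big] \;\le\; \inf_{c>0}\left\{ \frac{1}{cT}\,\log \mathbb{E}_{P}\!\left[e^{\,cT(\bar f_T - \mathbb{E}_P[\bar f_T])}\right] \;+\; \frac{1}{cT}\,R\big(\tilde P_{[0,T]}\,\big\|\,P_{[0,T]}\big) \right\}.
\]
For stationary processes, \(R(\tilde P_{[0,T]}\|P_{[0,T]})/T\) converges (when the limit exists) to the relative entropy rate as \(T\to\infty\), which is what makes such bounds informative over long time horizons.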