This paper deals with measuring the Bayesian robustness of classes of contaminated priors. Two classes of priors in the neighborhood of the elicited prior are considered: the first is the well-known $\epsilon$-contaminated class, while the second is the geometric mixing class. The proposed measure of robustness is based on computing the curvature of the Rényi divergence between posterior distributions. Examples based on simulated and real data sets illustrate the results.
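As a minimal sketch of the setup (the standard forms of these classes are assumed here; the paper's exact parameterizations may differ in detail), the $\epsilon$-contaminated class around an elicited prior $\pi_0$ is $\Gamma_\epsilon = \{\pi : \pi = (1-\epsilon)\pi_0 + \epsilon q, \; q \in \mathcal{Q}\}$, and the geometric mixing class is $\Gamma_g = \{\pi : \pi \propto \pi_0^{1-\epsilon} q^{\epsilon}, \; q \in \mathcal{Q}\}$, with $\mathcal{Q}$ a class of contaminating distributions. The Rényi divergence of order $\alpha$ between densities $p$ and $q$ is $R_\alpha(p \,\|\, q) = \frac{1}{\alpha-1} \log \int p(\theta)^{\alpha} q(\theta)^{1-\alpha} \, d\theta$ for $\alpha > 0$, $\alpha \neq 1$. On this reading, the proposed measure would be the curvature at $\epsilon = 0$ of the map $\epsilon \mapsto R_\alpha\big(\pi_0(\cdot \mid x) \,\|\, \pi_\epsilon(\cdot \mid x)\big)$, where $\pi_\epsilon(\cdot \mid x)$ denotes the posterior obtained from a contaminated prior $\pi_\epsilon$ in the chosen class.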