Kullback-Leibler Divergence for Bayesian Nonparametric Model Checking


Abstract

Bayesian nonparametric statistics is an area of considerable research interest. Although there has recently been extensive work on developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simplest form, together with the Kullback-Leibler divergence has remained an open problem. This is mainly due to the discreteness of the Dirichlet process and the fact that the Kullback-Leibler divergence between any discrete distribution and any continuous distribution is infinite. The approach proposed in this paper, which combines the Dirichlet process, the Kullback-Leibler divergence, and the relative belief ratio, provides the first concrete solution to this issue. The approach is simple to apply and does not require a closed form for the relative belief ratio. A Monte Carlo study and real-data examples show that the developed approach exhibits excellent performance.
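To make the computational recipe the abstract alludes to more concrete, the following Python sketch illustrates one plausible workflow: draw from Dirichlet process priors and posteriors by truncated stick-breaking, measure a Kullback-Leibler distance to the hypothesized model on a common discretization (which sidesteps the infinite discrete-versus-continuous divergence), and compare prior and posterior beliefs about that distance with a relative belief ratio. This is a minimal sketch under stated assumptions, not the paper's algorithm: the truncation level, the probability-integral-transform binning, the concentration parameter `a`, and the small-distance cutoff `eps` are all illustrative choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def dp_stick_breaking(a, base_sampler, n_atoms=500):
    """Truncated stick-breaking draw from DP(a, base): atoms and weights."""
    b = rng.beta(1.0, a, size=n_atoms)
    w = b * np.cumprod(np.concatenate(([1.0], 1.0 - b[:-1])))
    return base_sampler(n_atoms), w / w.sum()

def discretized_kl(atoms, weights, f0_cdf, n_bins=20):
    """KL distance between a (discrete) DP draw and the continuous F0,
    computed on a shared partition via u = F0(x); binning both measures
    keeps the divergence finite (an illustrative device, assumed here)."""
    p = np.histogram(f0_cdf(atoms), bins=np.linspace(0, 1, n_bins + 1),
                     weights=weights)[0]
    q = 1.0 / n_bins                      # F0 puts equal mass in every cell
    return np.sum(p[p > 0] * np.log(p[p > 0] / q))

# --- toy check of H0: data ~ N(0, 1), with data actually drawn from H0 ---
data = rng.normal(0.0, 1.0, size=100)
n, a = len(data), 1.0                     # DP concentration (assumed value)
f0_sampler = lambda k: rng.normal(0.0, 1.0, size=k)

def post_base_sampler(k):
    """Posterior base measure (a*F0 + n*F_n)/(a + n): mix of F0 and data."""
    keep = rng.random(k) < a / (a + n)
    return np.where(keep, f0_sampler(k), rng.choice(data, size=k))

M = 2000
prior_d = np.array([discretized_kl(*dp_stick_breaking(a, f0_sampler),
                                   norm.cdf) for _ in range(M)])
post_d = np.array([discretized_kl(*dp_stick_breaking(a + n, post_base_sampler),
                                  norm.cdf) for _ in range(M)])

# Relative belief ratio for "KL distance near 0": posterior vs prior mass.
eps = np.quantile(prior_d, 0.05)          # illustrative small-distance cutoff
RB = np.mean(post_d <= eps) / np.mean(prior_d <= eps)
print(f"relative belief ratio at small KL distance: {RB:.2f}")
```

In relative belief terms, RB > 1 indicates that the data have increased the belief that the true distribution is close to the hypothesized F0 (evidence in favor of the model), while RB < 1 indicates evidence against it; here, with data generated under H0, the posterior distances fall well below the prior ones and RB exceeds 1.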
