Nonparametric latent structure models provide flexible inference on distinct, yet related, groups of observations. Each component of a vector of $d \ge 2$ random measures models the distribution of a group of exchangeable observations, while their dependence structure regulates the borrowing of information across different groups. Recent work has quantified the dependence between random measures in terms of Wasserstein distance from the maximally dependent scenario when $d=2$. By solving an intriguing max-min problem we are now able to define a Wasserstein index of dependence $I_{\mathcal{W}}$ with the following properties: (i) it simultaneously quantifies the dependence of $d \ge 2$ random measures; (ii) it takes values in $[0,1]$; (iii) it attains the extreme values $\{0,1\}$ under independence and complete dependence, respectively; (iv) since it is defined in terms of the underlying Lévy measures, it is possible to evaluate it numerically in many Bayesian nonparametric models for partially exchangeable data.
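The abstract does not state the definition explicitly. Purely as a hedged schematic consistent with properties (ii) and (iii), and not necessarily the construction actually used in the paper, a distance-from-complete-dependence index can be normalized as
$$
I_{\mathcal{W}}(\tilde\mu_1,\dots,\tilde\mu_d) \;=\; 1 - \frac{\mathcal{D}(\tilde\mu_1,\dots,\tilde\mu_d)}{\mathcal{D}_{\max}},
$$
where $\mathcal{D}$ denotes a Wasserstein-type distance of the vector $(\tilde\mu_1,\dots,\tilde\mu_d)$ from its completely dependent counterpart and $\mathcal{D}_{\max}$ is the largest value such a distance can attain (the role played by the max-min problem in the abstract); this yields $I_{\mathcal{W}}=0$ under independence and $I_{\mathcal{W}}=1$ under complete dependence.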
Dependence measures based on reproducing kernel Hilbert spaces, also known as Hilbert-Schmidt Independence Criterion and denoted HSIC, are widely used to statistically decide whether or not two random vectors are dependent. Recently, non-parametric H
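As a concrete point of reference, and not code taken from the paper above, a minimal biased empirical HSIC estimate with Gaussian kernels can be sketched as follows; the bandwidths sigma_x, sigma_y and the sample sizes are illustrative choices.

import numpy as np

def gaussian_gram(X, sigma):
    # Gram matrix of a Gaussian (RBF) kernel with bandwidth sigma
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic_biased(X, Y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC estimate: trace(K H L H) / n^2
    n = X.shape[0]
    K = gaussian_gram(X, sigma_x)
    L = gaussian_gram(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dep = x + 0.1 * rng.normal(size=(200, 1))   # dependent pair: large HSIC
y_ind = rng.normal(size=(200, 1))             # independent pair: HSIC near 0
print(hsic_biased(x, y_dep), hsic_biased(x, y_ind))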
Consider a standard white Wishart matrix with parameters $n$ and $p$. Motivated by applications in high-dimensional statistics and signal processing, we perform asymptotic analysis on the maxima and minima of the eigenvalues of all the $m \times m$ pr
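For intuition about the quantities whose asymptotics are studied above, the following sketch simulates a white Wishart matrix and scans all $m \times m$ principal submatrices by brute force; the sizes n, p, m are arbitrary illustrative values and the enumeration is only feasible for small p.

import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, p, m = 200, 10, 3            # illustrative sizes only
X = rng.normal(size=(n, p))
W = X.T @ X                     # white Wishart matrix with n degrees of freedom

eig_max, eig_min = -np.inf, np.inf
for idx in combinations(range(p), m):
    sub = W[np.ix_(idx, idx)]            # m x m principal submatrix
    vals = np.linalg.eigvalsh(sub)
    eig_max = max(eig_max, vals[-1])     # largest eigenvalue over all submatrices
    eig_min = min(eig_min, vals[0])      # smallest eigenvalue over all submatrices

print(eig_max, eig_min)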
We extend classic characterisations of posterior distributions under Dirichlet process and gamma random measures priors to a dynamic framework. We consider the problem of learning, from indirect observations, two families of time-dependent processes
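As background on the Dirichlet process prior mentioned above (the dynamic posterior characterisations themselves are not reproduced here), a minimal truncated stick-breaking simulation looks as follows; alpha, the Gaussian base measure, and the truncation level are illustrative choices.

import numpy as np

def dirichlet_process_stick_breaking(alpha, base_sampler, n_atoms, rng):
    # Truncated stick-breaking representation of a Dirichlet process:
    # weights w_k = v_k * prod_{j<k}(1 - v_j), with v_k ~ Beta(1, alpha),
    # and atoms drawn i.i.d. from the base measure.
    v = rng.beta(1.0, alpha, size=n_atoms)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    atoms = base_sampler(n_atoms)
    return w, atoms

rng = np.random.default_rng(2)
weights, atoms = dirichlet_process_stick_breaking(
    alpha=2.0, base_sampler=lambda k: rng.normal(size=k), n_atoms=500, rng=rng)
print(weights.sum())   # close to 1 for a large truncation level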
In data science, it is often required to estimate dependencies between different data sources. These dependencies are typically calculated using Pearson's correlation, distance correlation, and/or mutual information. However, none of these measures sa
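For concreteness, the three measures named above can be estimated as sketched below; this is a generic illustration, not code from the paper, and the binned mutual-information estimate in particular is a crude plug-in approximation.

import numpy as np
from scipy.stats import pearsonr

def distance_correlation(x, y):
    # Empirical distance correlation via double-centred distance matrices
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

def mutual_information_binned(x, y, bins=20):
    # Crude plug-in mutual information estimate from a 2-D histogram
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = x**2 + 0.1 * rng.normal(size=1000)   # nonlinear dependence: Pearson is near 0
print(pearsonr(x, y)[0], distance_correlation(x, y), mutual_information_binned(x, y))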
We establish exponential inequalities for a class of V-statistics under strong mixing conditions. Our theory is developed via a novel kernel expansion based on random Fourier features and the use of a probabilistic method. This type of expansion is n
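As background on the random Fourier feature expansion mentioned above (the exponential inequalities themselves are not reproduced here), a minimal Rahimi-Recht feature map for a Gaussian kernel can be sketched as follows; the bandwidth and feature count are illustrative.

import numpy as np

def random_fourier_features(X, n_features, sigma, rng):
    # Feature map z(x) such that z(x).T z(y) approximates the Gaussian kernel
    # exp(-||x - y||^2 / (2 sigma^2)); frequencies drawn from its spectral density.
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(4)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, n_features=2000, sigma=1.0, rng=rng)
K_approx = Z @ Z.T
sq = np.sum(X**2, axis=1)
K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / 2.0)
print(np.max(np.abs(K_approx - K_exact)))   # small for a large number of features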