
Bootstrapping Exchangeable Random Graphs

Added by Alden Green
Publication date: 2017
Language: English





We introduce two new bootstraps for exchangeable random graphs. One, the empirical graphon, is based purely on resampling, while the other, the histogram stochastic block model, is a model-based sieve bootstrap. We show that both of them accurately approximate the sampling distributions of motif densities, i.e., of the normalized counts of the number of times fixed subgraphs appear in the network. These densities characterize the distribution of (infinite) exchangeable networks. Our bootstraps therefore give, for the first time, a valid quantification of uncertainty in inferences about fundamental network statistics, and so of parameters identifiable from them.
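As a rough illustration of the resampling idea described in the abstract (a sketch, not the authors' implementation), the Python code below resamples vertices with replacement from an observed adjacency matrix and recomputes a motif density, here the triangle density. The function names and the Erdős–Rényi placeholder data are assumptions made for the example.

import numpy as np

def triangle_density(A):
    # Normalized triangle count of a simple undirected graph (0/1 adjacency matrix).
    n = A.shape[0]
    triangles = np.trace(A @ A @ A) / 6.0              # each triangle is counted six times on the trace
    return triangles / (n * (n - 1) * (n - 2) / 6.0)   # divide by the number of vertex triples

def empirical_graphon_bootstrap(A, n_boot=500, seed=None):
    # Bootstrap the triangle density by resampling vertices with replacement.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resampled vertex labels
        A_star = A[np.ix_(idx, idx)].copy()   # induced adjacency submatrix
        np.fill_diagonal(A_star, 0)           # keep the resampled graph simple
        stats[b] = triangle_density(A_star)
    return stats

# Placeholder data: an Erdos-Renyi graph standing in for an observed network.
rng = np.random.default_rng(0)
n, p = 200, 0.05
upper = np.triu((rng.random((n, n)) < p).astype(int), 1)
A = upper + upper.T
boot = empirical_graphon_bootstrap(A, n_boot=200, seed=1)
print(np.quantile(boot, [0.025, 0.975]))      # crude percentile interval for the triangle density

The histogram stochastic block model bootstrap mentioned in the abstract would instead fit a blockmodel to the observed graph and simulate new graphs from it; that model-based step is not attempted in this sketch.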



Related research

Julyan Arbel (2017)
These are written discussions of the paper Sparse graphs using exchangeable random measures by François Caron and Emily B. Fox, contributed to the Journal of the Royal Statistical Society Series B.
Distributions over exchangeable matrices with infinitely many columns, such as the Indian buffet process, are useful in constructing nonparametric latent variable models. However, the distribution implied by such models over the number of features exhibited by each data point may be poorly suited for many modeling tasks. In this paper, we propose a class of exchangeable nonparametric priors obtained by restricting the domain of existing models. Such models allow us to specify the distribution over the number of features per data point, and can achieve better performance on data sets where the number of features is not well-modeled by the original distribution.
We consider ensembles of real symmetric band matrices with entries drawn from an infinite sequence of exchangeable random variables, as far as the symmetry of the matrices permits. In general the entries of the upper triangular parts of these matrices are correlated and no smallness or sparseness of these correlations is assumed. It is shown that the eigenvalue distribution measures still converge to a semicircle but with random scaling. We also investigate the asymptotic behavior of the corresponding $\ell_2$-operator norms. The key to our analysis is a generalisation of a classic result by de Finetti that allows us to represent the underlying probability spaces as averages of Wigner band ensembles with entries that are not necessarily centred. Some of our results appear to be new even for such Wigner band matrices.
Recent work has introduced sparse exchangeable graphs and the associated graphex framework, as a generalization of dense exchangeable graphs and the associated graphon framework. The development of this subject involves the interplay between the statistical modeling of network data, the theory of large graph limits, exchangeability, and network sampling. The purpose of the present paper is to clarify the relationships between these subjects by explaining each in terms of a certain natural sampling scheme associated with the graphex model. The first main technical contribution is the introduction of sampling convergence, a new notion of graph limit that generalizes left convergence so that it becomes meaningful for the sparse graph regime. The second main technical contribution is the demonstration that the (somewhat cryptic) notion of exchangeability underpinning the graphex framework is equivalent to a more natural probabilistic invariance expressed in terms of the sampling scheme.
Bootstrapping was designed to randomly resample data from a fixed sample using Monte Carlo techniques. However, the original sample itself defines a discrete distribution. Convolutional methods are well suited for discrete distributions, and we show the advantages of utilizing these techniques for bootstrapping. The discrete convolutional approach can provide exact numerical solutions for bootstrap quantities, or at least mathematical error bounds. In contrast, Monte Carlo bootstrap methods can only provide confidence intervals which converge slowly. Additionally, for some problems the computation time of the convolutional approach can be dramatically less than that of Monte Carlo resampling. This article provides several examples of bootstrapping using the proposed convolutional technique and compares the results to those of the Monte Carlo bootstrap, and to those of the competing saddlepoint method.
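To make the convolution idea in the preceding abstract concrete, the short sketch below (an illustration under assumed names, not the paper's code) computes the exact bootstrap distribution of a sum of resampled values by convolving the empirical probability mass function with itself n times, with the sample values encoded on an integer grid.

import numpy as np

def exact_bootstrap_sum_pmf(counts):
    # counts[k] = number of sample points equal to k on an integer grid.
    # Returns the exact pmf of the sum of n bootstrap draws, where n = counts.sum().
    n = int(counts.sum())
    single = counts / n              # empirical pmf of one resampled value
    total = np.array([1.0])          # pmf of the empty sum
    for _ in range(n):               # n-fold convolution, one draw at a time
        total = np.convolve(total, single)
    return total

# Example: the sample {0, 1, 1, 2} encoded as counts on the grid 0, 1, 2.
counts = np.array([1.0, 2.0, 1.0])
pmf = exact_bootstrap_sum_pmf(counts)
sums = np.arange(len(pmf))           # possible sums 0, 1, ..., 8
print((sums * pmf).sum(), pmf.sum()) # mean of the sum is 4 = n * sample mean; total mass is 1

Because the statistic here is a sum, its exact bootstrap distribution needs only n convolutions of the empirical pmf, whereas Monte Carlo resampling would approximate the same distribution with simulation error.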