Non-Bayesian Social Learning on Random Digraphs with Aperiodically Varying Network Connectivity


Abstract

We study non-Bayesian social learning on random directed graphs and show that, under mild connectivity assumptions, all the agents almost surely learn the true state of the world asymptotically in time if the sequence of associated weighted adjacency matrices belongs to Class $\mathcal{P}^*$ (a broad class of stochastic chains that subsumes uniformly strongly connected chains). We show that uniform strong connectivity, while unnecessary for asymptotic learning, ensures that all the agents' beliefs converge to a consensus almost surely, even when the true state is not identifiable. We then provide a few corollaries of our main results, some of which apply to variants of the original update rule, such as inertial non-Bayesian learning and learning via diffusion and adaptation; others extend known results on social learning. We also show that, if the network of influences is balanced in a certain sense, then asymptotic learning occurs almost surely even in the absence of uniform strong connectivity.
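
For context, the sketch below simulates one step of a canonical non-Bayesian social learning rule, in which each agent applies a local Bayesian update to its own belief using its private signal and then linearly averages that update with the beliefs reported by its in-neighbors. This is an illustrative assumption about the update rule referenced in the abstract; the function name, argument layout, and exact weighting scheme are hypothetical and may differ from the paper's precise formulation and its variants.

```python
import numpy as np

def social_learning_step(beliefs, A, likelihoods, signals):
    """One step of a non-Bayesian social learning update (illustrative sketch).

    beliefs     : (n, m) array; beliefs[i, k] is agent i's belief in state k
    A           : (n, n) row-stochastic weighted adjacency matrix; A[i, j] is the
                  weight agent i places on agent j (zero if j does not influence i)
    likelihoods : (m, s) array; likelihoods[k, o] = P(observation o | state k)
    signals     : length-n integer array of private observation indices, one per agent
    """
    n, m = beliefs.shape
    new_beliefs = np.empty_like(beliefs)
    for i in range(n):
        # Local Bayesian update of agent i's own belief using its private signal.
        bayes = beliefs[i] * likelihoods[:, signals[i]]
        bayes /= bayes.sum()
        # Linear aggregation: self-weight on the Bayesian update, remaining weight
        # on the previous-step beliefs of the in-neighbors.
        new_beliefs[i] = A[i, i] * bayes + sum(
            A[i, j] * beliefs[j] for j in range(n) if j != i
        )
    return new_beliefs
```

Under the results described above, iterating such a step with a suitable (possibly random, time-varying) sequence of matrices A drives every row of `beliefs` to concentrate on the true state almost surely.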
