The amount of mutual information contained in the time series of two elements gives a measure of how well their activities are coordinated. In a large, complex network of interacting elements, such as a genetic regulatory network within a cell, the average of the mutual information over all pairs, <I>, is a global measure of how well the system can coordinate its internal dynamics. We study this average pairwise mutual information in random Boolean networks (RBNs) as a function of the distribution of Boolean rules implemented at each element, assuming that the links in the network are randomly placed. Efficient numerical methods for calculating <I> show that as the number of network nodes N approaches infinity, the quantity N<I> exhibits a discontinuity at parameter values corresponding to critical RBNs. For finite systems it peaks near the critical value, though slightly inside the disordered regime, for typical parameter variations. The high values of N<I> arise from indirect correlations between pairs of elements lying on different long chains that originate at a common starting point. The contribution from directly linked pairs approaches zero for critical networks and peaks deep in the disordered regime.
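
For concreteness, the pairwise mutual information can be expressed through the joint symbol frequencies observed in the two Boolean time series. A minimal sketch of the definition, assuming a base-2 logarithm, no time lag between the two series, and an average over all ordered pairs of distinct nodes (details not fixed by the text above), is

    I_{ij} = \sum_{a,b \in \{0,1\}} p_{ij}(a,b) \, \log_2 \frac{p_{ij}(a,b)}{p_i(a)\, p_j(b)}, \qquad
    \langle I \rangle = \frac{1}{N(N-1)} \sum_{i \neq j} I_{ij},

where p_{ij}(a,b) is the fraction of time steps at which node i is in state a and node j is in state b, and p_i(a), p_j(b) are the corresponding single-node frequencies. A naive plug-in estimator of I_{ij} from sampled time series could look like the following Python sketch; it is illustrative only, not the efficient numerical method referred to above, and the function name and the use of NumPy are choices made here.

    import numpy as np

    def pair_mutual_information(x, y):
        """Plug-in estimate (in bits) of the mutual information between
        two Boolean time series sampled at the same time steps."""
        x = np.asarray(x, dtype=int)
        y = np.asarray(y, dtype=int)
        I = 0.0
        for a in (0, 1):
            for b in (0, 1):
                p_ab = np.mean((x == a) & (y == b))  # joint frequency
                if p_ab > 0.0:
                    p_a = np.mean(x == a)            # marginal of x
                    p_b = np.mean(y == b)            # marginal of y
                    I += p_ab * np.log2(p_ab / (p_a * p_b))
        return I

    # Two identical unbiased series carry 1 bit; independent series carry ~0 bits.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=100_000)
    print(pair_mutual_information(x, x))                                  # close to 1.0
    print(pair_mutual_information(x, rng.integers(0, 2, size=100_000)))   # close to 0.0

Averaging this estimator over node pairs gives a direct, if slow, estimate of <I> for a simulated RBN; the finite-sample bias of the plug-in estimator is one reason more careful numerical methods are needed for the large-N scaling discussed above.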