
Models of random graph hierarchies

Added by Robert Paluch
Publication date: 2015
Language: English





We introduce two models of inclusion hierarchies: Random Graph Hierarchy (RGH) and Limited Random Graph Hierarchy (LRGH). In both models a set of nodes at a given hierarchy level is connected randomly, as in the Erdős–Rényi random graph, with a fixed average degree equal to a system parameter $c$. Clusters of the resulting network are treated as nodes at the next hierarchy level and they are connected again at this level and so on, until the process cannot continue. In the RGH model we use all clusters, including those of size $1$, when building the next hierarchy level, while in the LRGH model clusters of size $1$ stop participating in further steps. We find that in both models the number of nodes at a given hierarchy level $h$ decreases approximately exponentially with $h$. The height of the hierarchy $H$, i.e. the number of all hierarchy levels, increases logarithmically with the system size $N$, i.e. with the number of nodes at the first level. The height $H$ decreases monotonically with the connectivity parameter $c$ in the RGH model and it reaches a maximum for a certain $c_{max}$ in the LRGH model. The distribution of separate cluster sizes in the LRGH model is a power law with an exponent about $-1.25$. The above results follow from approximate analytical calculations and have been confirmed by numerical simulations.
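The level-by-level construction described in the abstract can be simulated directly. The sketch below (Python with networkx; the function name hierarchy_level_sizes and the exact stopping rule are illustrative choices, not taken from the paper) draws an Erdős–Rényi graph with p = c/(n-1) at each level, turns its clusters into the nodes of the next level, and records how many nodes each level contains, for either the RGH or the LRGH variant.

```python
# Minimal simulation sketch of the RGH / LRGH construction, assuming
# G(n, p) with p = c/(n-1) as the Erdos-Renyi step at every level.
import random
import networkx as nx

def hierarchy_level_sizes(n_nodes, c, limited=False, seed=None):
    """Number of nodes at each hierarchy level for RGH (limited=False)
    or LRGH (limited=True)."""
    rng = random.Random(seed)
    sizes = []
    while n_nodes > 1:
        sizes.append(n_nodes)
        p = min(1.0, c / (n_nodes - 1))               # fixed average degree c
        g = nx.gnp_random_graph(n_nodes, p, seed=rng)
        clusters = list(nx.connected_components(g))
        if limited:                                    # LRGH: singleton clusters drop out
            clusters = [cl for cl in clusters if len(cl) > 1]
        next_n = len(clusters)
        if next_n == 0 or next_n == n_nodes:           # nothing left, or no merging happened
            break
        n_nodes = next_n
    if n_nodes == 1:
        sizes.append(1)                                # top of the hierarchy
    return sizes

# The height H = len(sizes) grows roughly logarithmically with the system size N.
for N in (10**3, 10**4, 10**5):
    levels = hierarchy_level_sizes(N, c=1.5, limited=True, seed=42)
    print(N, len(levels), levels)
```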



Related research

Exponential family Random Graph Models (ERGMs) can be viewed as expressing a probability distribution on graphs arising from the action of competing social forces that make ties more or less likely, depending on the state of the rest of the graph. Such forces often lead to a complex pattern of dependence among edges, with non-trivial large-scale structures emerging from relatively simple local mechanisms. While this provides a powerful tool for probing macro-micro connections, much remains to be understood about how local forces shape global outcomes. One simple question of this type is that of the conditions needed for social forces to stabilize a particular structure. We refer to this property as local stability and seek a general means of identifying the set of parameters under which a target graph is locally stable with respect to a set of alternatives. Here, we provide a complete characterization of the region of the parameter space inducing local stability, showing it to be the interior of a convex cone whose faces can be derived from the change-scores of the sufficient statistics vis-a-vis the alternative structures. As we show, local stability is a necessary but not sufficient condition for more general notions of stability, the latter of which can be explored more efficiently by using the "stable cone" within the parameter space as a starting point. In addition, we show how local stability can be used to determine whether a fitted model implies that an observed structure would be expected to arise primarily from the action of social forces, versus by merit of the model permitting a large number of high probability structures, of which the observed structure is one. We also use our approach to identify the dyads within a given structure that are the least stable, and hence predicted to have the highest probability of changing over time.
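The local-stability condition can be checked mechanically for small graphs: a target graph g has higher probability than an alternative g' exactly when theta . (s(g') - s(g)) < 0, so the stable parameter region is an intersection of half-spaces through the origin, i.e. the interior of a convex cone. The following is only an illustrative reduction, assuming single-edge toggles as the alternatives and (edge count, triangle count) as the sufficient statistics; it is not the paper's general construction.

```python
# Sketch of the half-space / convex-cone idea for ERGM local stability:
# g is locally stable under theta when theta . (s(g') - s(g)) < 0 for
# every one-toggle alternative g'.  Statistics and names are illustrative.
import itertools
import numpy as np
import networkx as nx

def stats(g):
    """Sufficient statistics: (number of edges, number of triangles)."""
    triangles = sum(nx.triangles(g).values()) // 3
    return np.array([g.number_of_edges(), triangles], dtype=float)

def change_scores(g):
    """Change in the statistics for every single-edge toggle of g."""
    base = stats(g)
    deltas = []
    for u, v in itertools.combinations(g.nodes(), 2):
        h = g.copy()
        if h.has_edge(u, v):
            h.remove_edge(u, v)
        else:
            h.add_edge(u, v)
        deltas.append(stats(h) - base)
    return np.array(deltas)

def is_locally_stable(g, theta):
    """True if g beats every one-toggle alternative under parameters theta."""
    return bool(np.all(change_scores(g) @ np.asarray(theta) < 0))

# Example: each toggle contributes one face of the cone of stable parameters.
g = nx.complete_graph(5)
print(is_locally_stable(g, theta=[-1.0, 1.0]))   # True for this (edge, triangle) theta
```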
Random graph null models have found widespread application in diverse research communities analyzing network datasets, including social, information, and economic networks, as well as food webs, protein-protein interactions, and neuronal networks. The most popular family of random graph null models, called configuration models, are defined as uniform distributions over a space of graphs with a fixed degree sequence. Commonly, properties of an empirical network are compared to properties of an ensemble of graphs from a configuration model in order to quantify whether empirical network properties are meaningful or whether they are instead a common consequence of the particular degree sequence. In this work we study the subtle but important decisions underlying the specification of a configuration model, and investigate the role these choices play in graph sampling procedures and a suite of applications. We place particular emphasis on the importance of specifying the appropriate graph labeling (stub-labeled or vertex-labeled) under which to consider a null model, a choice that closely connects the study of random graphs to the study of random contingency tables. We show that the choice of graph labeling is inconsequential for studies of simple graphs, but can have a significant impact on analyses of multigraphs or graphs with self-loops. The importance of these choices is demonstrated through a series of three vignettes, analyzing network datasets under many different configuration models and observing substantial differences in study conclusions under different models. We argue that in each case, only one of the possible configuration models is appropriate. While our work focuses on undirected static networks, it aims to guide the study of directed networks, dynamic networks, and all other network contexts that are suitably studied through the lens of random graph null models.
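As a concrete illustration of the comparison workflow described above, the sketch below uses networkx's configuration_model, which samples stub-labeled multigraphs for a fixed degree sequence, and then collapses multi-edges and self-loops into a simple graph (an "erased" shortcut that, as the paragraph warns, changes the sampling distribution). The compared statistic (average clustering) and the helper name are illustrative choices, not the paper's pipeline.

```python
# Compare an empirical network statistic with a configuration-model null ensemble.
import networkx as nx

def clustering_null_distribution(g_empirical, n_samples=200, seed=0):
    """Observed average clustering vs. values from degree-preserving null graphs."""
    degrees = [d for _, d in g_empirical.degree()]
    observed = nx.average_clustering(g_empirical)
    null_values = []
    for i in range(n_samples):
        multi = nx.configuration_model(degrees, seed=seed + i)     # stub-labeled multigraph
        simple = nx.Graph(multi)                                   # collapse parallel edges
        simple.remove_edges_from(list(nx.selfloop_edges(simple)))  # drop self-loops
        null_values.append(nx.average_clustering(simple))
    return observed, null_values

g = nx.karate_club_graph()
obs, null = clustering_null_distribution(g)
print(f"observed = {obs:.3f}, null mean = {sum(null) / len(null):.3f}")
```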
Carter T. Butts (2017)
Generation of deviates from random graph models with non-trivial edge dependence is an increasingly important problem. Here, we introduce a method which allows perfect sampling from random graph models in exponential family form (exponential family random graph models), using a variant of Coupling From The Past. We illustrate the use of the method via an application to the Markov graphs, a family that has been the subject of considerable research. We also show how the method can be applied to a variant of the biased net models, which are not exponentially parameterized.
We use exponential random graph models to understand the network structure and its generative process for the Japanese bipartite network of banks and firms. One well-known and simple exponential random graph model, the Bernoulli model, shows that the links in the bank-firm network are not independent of each other. Another popular exponential random graph model, the two-star model, indicates that the bank-firm network is in a state where macroscopic variables of the system can show large fluctuations. Moreover, the presence of such large fluctuations reflects the fragile nature of the bank-firm network.
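For reference, the two models mentioned above differ only in their sufficient statistics: the Bernoulli model uses the edge count alone, while the two-star model adds the number of two-stars (pairs of edges sharing an endpoint). A toy computation of both statistics, on a stand-in bipartite graph rather than the actual bank-firm data, might look like this:

```python
# Toy ERGM sufficient statistics: edge count (Bernoulli model) and
# two-star count (two-star model).  The graph is an illustrative stand-in.
from math import comb
import networkx as nx

def ergm_statistics(g):
    edges = g.number_of_edges()
    two_stars = sum(comb(d, 2) for _, d in g.degree())   # pairs of edges sharing a node
    return edges, two_stars

g = nx.complete_bipartite_graph(3, 4)   # e.g. 3 "banks" fully linked to 4 "firms"
print(ergm_statistics(g))               # (12, 30)
```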
Lingqi Meng, Naoki Masuda (2020)
Random walks have been proven to be useful for constructing various algorithms to gain information on networks. The node2vec algorithm employs biased random walks to realize embeddings of nodes into low-dimensional spaces, which can then be used for tasks such as multi-label classification and link prediction. The usefulness of node2vec in these applications is considered to be contingent upon properties of the random walks that the algorithm uses. In the present study, we theoretically and numerically analyze the random walks used by node2vec. The node2vec random walk is a second-order Markov chain. We exploit the mapping of its transition rule to a transition probability matrix among directed edges to analyze the stationary probability, relaxation times, and coalescence time. In particular, we provide a multitude of evidence that the node2vec random walk accelerates diffusion when its parameters are tuned such that walkers avoid both back-tracking and visiting a neighbor of the previously visited node, but not excessively.
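The second-order transition rule analyzed here is compact: from current node v with previous node t, an unweighted node2vec walk moves to a neighbor x of v with unnormalized weight 1/p if x = t, 1 if x is also a neighbor of t, and 1/q otherwise. The sketch below is a plain re-implementation of that rule for illustration, not the reference node2vec code.

```python
# Minimal second-order (biased) node2vec-style walk on an unweighted graph.
import random
import networkx as nx

def node2vec_walk(g, start, length, p=1.0, q=1.0, rng=random):
    walk = [start]
    prev = None
    for _ in range(length - 1):
        cur = walk[-1]
        neighbors = list(g.neighbors(cur))
        if not neighbors:
            break
        if prev is None:
            nxt = rng.choice(neighbors)            # first step is unbiased
        else:
            weights = []
            for x in neighbors:
                if x == prev:
                    weights.append(1.0 / p)        # back-tracking to the previous node
                elif g.has_edge(x, prev):
                    weights.append(1.0)            # neighbor of the previous node
                else:
                    weights.append(1.0 / q)        # move further away
            nxt = rng.choices(neighbors, weights=weights, k=1)[0]
        walk.append(nxt)
        prev = cur
    return walk

g = nx.karate_club_graph()
# Large p penalizes back-tracking; small q favors leaving the previous node's neighborhood.
print(node2vec_walk(g, start=0, length=10, p=4.0, q=0.25))
```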
