
Random planar graphs and the London street network

Publication date: 2009
Field: Physics
Language: English





In this paper we analyse the street network of London both in its primary and dual representation. To understand its properties, we consider three idealised models based on a grid, a static random planar graph and a growing random planar graph. Comparing the models and the street network, we find that the streets of London form a self-organising system whose growth is characterised by a strict interaction between the metrical and informational space. In particular, a principle of least effort appears to create a balance between the physical and the mental effort required to navigate the city.
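As a rough illustration of the idealised models mentioned above (not the paper's own code or data), the Python sketch below builds a regular grid and a static random planar graph obtained by Delaunay-triangulating uniformly random points, one common construction for random planar graphs, and compares their mean degree; a growing model would insert points sequentially instead. networkx and scipy are assumed to be available.

```python
import networkx as nx
import numpy as np
from scipy.spatial import Delaunay

def delaunay_graph(points):
    """Planar graph obtained by Delaunay-triangulating a point set."""
    g = nx.Graph()
    for a, b, c in Delaunay(points).simplices:
        g.add_edges_from([(a, b), (b, c), (c, a)])
    return g

rng = np.random.default_rng(0)

# Grid model: the fully regular benchmark.
grid = nx.grid_2d_graph(20, 20)

# Static random planar model: drop all points at once, then triangulate.
static = delaunay_graph(rng.random((400, 2)))

# (A growing random planar model would insert points one at a time, keeping
# only the edges created at insertion time -- omitted here for brevity.)

for name, g in [("grid", grid), ("static random planar", static)]:
    degrees = [d for _, d in g.degree()]
    print(f"{name}: nodes={g.number_of_nodes()}, mean degree={np.mean(degrees):.2f}")
```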




Previous studies demonstrated empirically that human mobility exhibits Lévy flight behaviour. However, our knowledge of the mechanisms governing this Lévy flight behaviour remains limited. Here we analyze over 72,000 people's moving trajectories, obtained from 50 taxicabs during a six-month period in a large street network, and show that the human mobility pattern, or the Lévy flight behaviour, is mainly attributable to the underlying street network. In other words, the goal-directed nature of human movement has little effect on the overall traffic distribution. We further simulate the mobility of a large number of random walkers, and find that (1) the simulated random walkers reproduce the same human mobility pattern, and (2) the simulated mobility rate of the random walkers correlates well (an R-squared of up to 0.87) with the observed human mobility rate.
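The random-walker simulation can be sketched as follows (a minimal toy version, not the authors' pipeline): unbiased walkers are dropped on a street graph and the number of traversals of each segment is recorded, which could then be compared with observed traffic. The grid graph, walker count and step count below are placeholders.

```python
import random
from collections import Counter
import networkx as nx

def simulate_walkers(g, n_walkers=2000, steps=100, seed=1):
    """Unbiased random walkers on a street graph; returns segment visit counts."""
    random.seed(seed)
    nodes = list(g.nodes())
    traffic = Counter()
    for _ in range(n_walkers):
        v = random.choice(nodes)
        for _ in range(steps):
            u = random.choice(list(g.neighbors(v)))
            traffic[frozenset((v, u))] += 1   # count traversals of each segment
            v = u
    return traffic

# Toy stand-in for a street network; the study used a real large street graph.
g = nx.grid_2d_graph(30, 30)
traffic = simulate_walkers(g)
print("busiest simulated segments:", traffic.most_common(3))
```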
Sherief Abdallah, 2009
Several important complex-network measures that have helped discover common patterns across real-world networks ignore edge weights, which carry important information in real-world networks. We propose a new methodology for generalizing measures of unweighted networks through a generalization of the concept of cardinality to a set of weighted edges. The key observation is that many measures of unweighted networks use the cardinality (the size) of some subset of edges in their computation; for example, the node degree is the number of edges incident to a node. We define the effective cardinality, a new metric that quantifies how many edges are effectively being used, assuming that an edge's weight reflects the amount of interaction across that edge. We prove that a generalized measure, using our method, reduces to the original unweighted measure if there is no disparity between weights, which ensures that the laws that govern the original unweighted measure also govern the generalized measure when the weights are equal. We also prove that our generalization ensures a partial ordering (among sets of weighted edges) that is consistent with the original unweighted measure, unlike previously developed generalizations. We illustrate the applicability of our method by generalizing four unweighted network measures. As a case study, we analyze four real-world weighted networks using our generalized degree and clustering coefficient. The analysis shows that the generalized degree distribution is consistent with the power-law hypothesis but with a steeper decline, and that there is a common pattern governing the ratio between the generalized degree and the traditional degree. The analysis also shows that nodes with more uniform weights tend to cluster with nodes that also have more uniform weights among themselves.
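The abstract does not give the exact formula for the effective cardinality, but one standard way to obtain an "effective number of edges" that reduces to the plain degree when all weights are equal is the exponential of the Shannon entropy of the normalised edge weights; the sketch below uses that assumption purely for illustration and may differ from the paper's definition.

```python
import math

def effective_degree(weights):
    """Effective number of edges used by a node, given its edge weights.

    Defined here (as an assumption) as exp(entropy) of the normalised weights;
    it equals the ordinary degree when all weights are equal.
    """
    total = sum(weights)
    if total == 0:
        return 0.0
    ps = [w / total for w in weights if w > 0]
    entropy = -sum(p * math.log(p) for p in ps)
    return math.exp(entropy)

print(effective_degree([1, 1, 1, 1]))   # 4.0  -- matches the unweighted degree
print(effective_degree([10, 1, 1, 1]))  # ~2.2 -- one edge dominates the interaction
```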
Starting from inhomogeneous time scaling and linear decorrelation between successive price returns, Baldovin and Stella recently proposed a way to build a model describing the time evolution of a financial index. We first make it fully explicit by using Student distributions instead of power-law-truncated Lévy distributions; we also show that the analytic tractability of the model extends to the larger class of symmetric generalized hyperbolic distributions and provide a full computation of their multivariate characteristic functions; more generally, the stochastic processes arising in this framework are representable as mixtures of Wiener processes. The Baldovin and Stella model, while mimicking well volatility relaxation phenomena such as the Omori law, fails to reproduce other stylized facts such as the leverage effect or some time-reversal asymmetries. We discuss how to modify the dynamics of this process in order to reproduce real data more accurately.
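A minimal sketch of the mixture-of-Wiener-processes representation, with an assumed parameterisation rather than the authors' exact construction: each path draws a single random variance from an inverse-gamma law (which makes the marginal returns Student-distributed) and runs a Gaussian process on an inhomogeneous clock proportional to t^(2D).

```python
import numpy as np

def simulate_paths(n_paths=5, n_steps=500, nu=4.0, D=0.5, seed=0):
    """Index paths as a variance mixture of Wiener processes (assumed form)."""
    rng = np.random.default_rng(seed)
    # One variance per path; inverse-gamma mixing gives Student-t marginal
    # returns with nu degrees of freedom.
    sigma2 = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=(n_paths, 1))
    t = np.arange(1, n_steps + 1, dtype=float)
    # Inhomogeneous clock: variance accumulated up to step i grows as t**(2*D).
    dt_eff = np.diff(np.concatenate(([0.0], t ** (2 * D))))
    increments = rng.standard_normal((n_paths, n_steps)) * np.sqrt(sigma2 * dt_eff)
    return np.cumsum(increments, axis=1)

paths = simulate_paths()
print(paths[:, -1])   # terminal values of the simulated index paths
```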
It is shown that price changes of the US dollar - German Mark exchange rate over different delay times can be regarded as a Markovian stochastic process. Furthermore, we show that the Kramers-Moyal coefficients can be estimated from the empirical data. Finally, we present an explicit Fokker-Planck equation which models the empirical probability distributions very precisely.
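A hedged sketch of how the first two Kramers-Moyal coefficients (drift and diffusion) can be estimated from a time series via binned conditional moments of the increments; the binning procedure and the synthetic Ornstein-Uhlenbeck test series below are assumptions, not the authors' setup.

```python
import numpy as np

def kramers_moyal(x, tau=1, n_bins=30):
    """Estimate drift D1(x) and diffusion D2(x) from binned conditional moments."""
    dx = x[tau:] - x[:-tau]          # increments over the delay tau
    base = x[:-tau]                  # value the increment is conditioned on
    edges = np.linspace(base.min(), base.max(), n_bins + 1)
    idx = np.digitize(base, edges) - 1
    centers, d1, d2 = [], [], []
    for b in range(n_bins):
        mask = idx == b
        if mask.sum() < 10:          # skip sparsely populated bins
            continue
        centers.append(0.5 * (edges[b] + edges[b + 1]))
        d1.append(dx[mask].mean() / tau)               # first KM coefficient
        d2.append((dx[mask] ** 2).mean() / (2 * tau))  # second KM coefficient
    return np.array(centers), np.array(d1), np.array(d2)

# Synthetic Ornstein-Uhlenbeck series as a stand-in for FX price changes.
rng = np.random.default_rng(0)
x = np.zeros(50_000)
for i in range(1, len(x)):
    x[i] = x[i - 1] - 0.05 * x[i - 1] + 0.1 * rng.standard_normal()
centers, drift, diffusion = kramers_moyal(x)
print(drift[:3], diffusion[:3])
```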
I. Grabec, 2007
The normalized radial basis function neural network emerges in the statistical modeling of natural laws that relate components of multivariate data. The modeling is based on the kernel estimator of the joint probability density function pertaining to given data. From this function, a governing law is extracted by the conditional average estimator. The corresponding nonparametric regression represents a normalized radial basis function neural network and can be related to the multi-layer perceptron equation. In this article, an exact equivalence of the two paradigms is demonstrated for a one-dimensional case with symmetric triangular basis functions. The transformation provides a simple interpretation of perceptron parameters in terms of statistical samples of multivariate data.
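A minimal one-dimensional sketch of the conditional average estimator with symmetric triangular basis functions, i.e. a normalized radial basis function network (equivalently, a Nadaraya-Watson regression with a triangular kernel); the toy data and bandwidth are placeholders.

```python
import numpy as np

def conditional_average(x_query, x_data, y_data, width):
    """Normalized triangular-basis regression (conditional average estimator)."""
    dist = np.abs(x_query[:, None] - x_data[None, :])
    basis = np.clip(1.0 - dist / width, 0.0, None)       # triangular basis
    norm = basis.sum(axis=1, keepdims=True) + 1e-12      # avoid division by zero
    weights = basis / norm                                # normalization layer
    return weights @ y_data

# Toy usage: recover y = sin(x) from noisy samples.
rng = np.random.default_rng(1)
xs = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
ys = np.sin(xs) + 0.1 * rng.standard_normal(200)
xq = np.linspace(0.5, 5.5, 5)
print(conditional_average(xq, xs, ys, width=0.5))
```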
