We study the popular centrality measure known as effective conductance or, in some circles, as information centrality. This is an important notion of centrality for undirected networks, with many applications, e.g., to random walks, electrical resistor networks, and epidemic spreading. In this paper, we first reinterpret this measure in terms of the modulus (energy) of families of walks on the network. This modulus centrality measure coincides with effective conductance on simple undirected networks and extends it to much more general settings, such as directed networks. Second, we study a variation of this modulus approach in the egocentric network paradigm. Egonetworks are networks formed around a focal node (the ego) with a specified order of neighborhoods. We propose efficient analytical and approximate methods for computing these measures on both undirected and directed networks. Finally, we describe a simple method inspired by the modulus point of view, called shell degree, which proves to be a useful tool for network science.
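For reference, the classical formulation that the modulus measure coincides with on simple undirected graphs can be sketched as follows: effective-conductance (current-flow closeness, a.k.a. information) centrality computed from the pseudoinverse of the graph Laplacian. This minimal sketch does not implement the modulus-of-walk-families generalization or the egocentric variant described above; the function name and the karate-club example graph are illustrative choices only.

```python
import numpy as np
import networkx as nx

def effective_conductance_centrality(G):
    """Effective-conductance (information) centrality of a simple undirected
    graph, computed from the Moore-Penrose pseudoinverse of the Laplacian.
    This is the classical formulation, not the modulus-based one."""
    nodes = list(G.nodes())
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    Lp = np.linalg.pinv(L)          # Laplacian pseudoinverse
    n = len(nodes)
    centrality = {}
    for i, u in enumerate(nodes):
        # effective resistance between u and every other node
        r = [Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for j in range(n) if j != i]
        # harmonic-style aggregation: (n-1) divided by the total resistance
        centrality[u] = (n - 1) / sum(r)
    return centrality

if __name__ == "__main__":
    G = nx.karate_club_graph()                       # illustrative example graph
    c = effective_conductance_centrality(G)
    print(sorted(c, key=c.get, reverse=True)[:5])    # five most central nodes
```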
We provide a method to correct the observed azimuthal anisotropy in heavy-ion collisions for the event plane resolution in a wide centrality bin. This new procedure is especially useful for rare particles, such as Omega baryons and J/psi mesons, whic
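For context, the textbook single-bin correction simply divides the observed anisotropy by the event-plane resolution; the toy sketch below illustrates that baseline only, not the wide-centrality-bin procedure proposed in the abstract. The resolution value is assumed to be supplied externally (e.g., from a sub-event method), and the synthetic azimuths are isotropic, so the result should be consistent with zero.

```python
import numpy as np

def corrected_vn(phi, psi_n, resolution, n=2):
    """Standard event-plane flow measurement: the observed anisotropy
    v_n^obs = <cos(n*(phi - Psi_n))> is divided by the event-plane
    resolution R_n.  `resolution` is assumed known from elsewhere."""
    phi = np.asarray(phi)
    vn_obs = np.mean(np.cos(n * (phi - psi_n)))
    return vn_obs / resolution

# toy usage: isotropic particle azimuths around an assumed event-plane angle
rng = np.random.default_rng(0)
psi2 = 0.3                                     # assumed event-plane angle (radians)
phi = rng.uniform(-np.pi, np.pi, 10_000)
print(corrected_vn(phi, psi2, resolution=0.6))  # expect a value near zero
```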
The ideal gas law of physics and chemistry states that PV = nRT. This law is a statement of the relationship between four variables (P, V, n, and T) that reflect properties of a quantity of gas in a container. The law enables us to make accurate predic
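As a quick worked example of the law (not part of the original abstract): solving PV = nRT for pressure, one mole of an ideal gas at 273.15 K confined to 22.4 L gives roughly one atmosphere.

```python
R = 8.314  # universal gas constant, J / (mol K)

def pressure(n_mol, T_kelvin, V_m3):
    """Ideal gas law PV = nRT solved for pressure, in pascals."""
    return n_mol * R * T_kelvin / V_m3

# 1 mol at 273.15 K in 0.0224 m^3 (22.4 L): about 101 kPa, i.e. ~1 atm
print(pressure(1.0, 273.15, 0.0224))
```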
We introduce a generalization of Higuchi's estimator of the fractal dimension as a new way to characterize the multifractal spectrum of univariate time series. The resulting multifractal Higuchi dimension analysis (MF-HDA) method considers the order-$
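For orientation, the classic (monofractal) Higuchi estimator that MF-HDA generalizes is sketched below: the fractal dimension is the slope of log L(k) versus log(1/k), where L(k) is the mean curve length at scale k. This is the textbook algorithm, not the multifractal order-q extension introduced in the abstract, and the choice of k_max is illustrative.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Classic Higuchi fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    log_k, log_L = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, N, k)          # subsequence starting at offset m
            if len(idx) < 2:
                continue
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)   # Higuchi normalization
            lengths.append(diff * norm / k)
        log_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_k, log_L, 1)    # slope of log L(k) vs log(1/k)
    return slope

rng = np.random.default_rng(1)
print(higuchi_fd(rng.normal(size=5000)))      # white noise: expect a value near 2
```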
Uncovering factors underlying the network formation is a long-standing challenge for data mining and network analysis. In particular, the microscopic organizing principles of directed networks are less understood than those of undirected networks. Th
Deep neural networks, when optimized with sufficient data, provide accurate representations of high-dimensional functions; in contrast, function approximation techniques that have predominated in scientific computing do not scale well with dimensiona