We investigate supervised learning in neural networks. We consider a multi-layered feed-forward network trained with backpropagation. We find that a network with small-world connectivity reduces the learning error and learning time compared to networks with regular or random connectivity. Our study has potential applications in data mining, image processing, speech recognition, and pattern recognition.
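The small-world connectivity mentioned above can be illustrated with a Watts-Strogatz-style construction: start from a regular band of nearest-neighbour connections and rewire each with some probability. The sketch below builds such a sparse connectivity mask for one feed-forward layer; it is an illustrative stand-in, not necessarily the paper's exact construction, and the function name and parameters are hypothetical.

```python
import random

def small_world_mask(n_in, n_out, k=4, p=0.1, seed=0):
    """Sparse 0/1 connectivity mask via Watts-Strogatz-style rewiring:
    each input unit starts connected to its k nearest output units
    (regular band), and each such edge is rewired to a uniformly random
    output unit with probability p. An illustrative sketch, not the
    paper's exact recipe."""
    rng = random.Random(seed)
    mask = [[0] * n_out for _ in range(n_in)]
    for i in range(n_in):
        for d in range(1, k + 1):
            j = (i + d) % n_out          # regular nearest-neighbour target
            if rng.random() < p:         # rewire this edge at random
                j = rng.randrange(n_out)
            mask[i][j] = 1
    return mask
```

Multiplying a weight matrix elementwise by such a mask (and keeping the mask fixed during backpropagation) is one simple way to train a network with the three connectivity regimes the abstract compares: p = 0 gives regular, p = 1 essentially random, and intermediate p small-world.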
We study the effective resistance of small-world resistor networks. Utilizing recent analytic results for the propagator of the Edwards-Wilkinson process on small-world networks, we obtain the asymptotic behavior of the disorder-averaged two-point re
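The two-point effective resistance discussed in this abstract has a standard closed form in terms of the graph Laplacian: R_ij = L⁺_ii + L⁺_jj − 2 L⁺_ij, where L⁺ is the Moore-Penrose pseudoinverse of L. A minimal sketch for unit resistors on an arbitrary edge list (a generic calculation, not the paper's small-world ensemble or disorder average):

```python
import numpy as np

def effective_resistance(edges, n):
    """Return a function R(i, j) giving the two-point effective
    resistance of a network of unit resistors, via the Laplacian
    pseudoinverse: R_ij = L+_ii + L+_jj - 2 L+_ij."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1   # degree terms
        L[i, j] -= 1; L[j, i] -= 1   # adjacency terms
    Lp = np.linalg.pinv(L)           # pseudoinverse (L is singular)
    return lambda i, j: Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]

# sanity check: three unit resistors in series give R(0, 3) = 3
R = effective_resistance([(0, 1), (1, 2), (2, 3)], 4)
```

Averaging R(i, j) over realizations of random shortcut edges added to such a base graph is the kind of disorder average the abstract refers to.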
As its width tends to infinity, a deep neural network's behavior under gradient descent can become simplified and predictable (e.g., given by the Neural Tangent Kernel (NTK)), provided it is parametrized appropriately (e.g., the NTK parametrization). However,
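The NTK parametrization mentioned here keeps activations O(1) as width grows by drawing weights i.i.d. N(0, 1) and scaling each matrix-vector product by 1/√fan_in, rather than absorbing that factor into the initialization. A minimal forward-pass sketch of this scaling (illustrative only; computing the NTK itself additionally requires parameter Jacobians):

```python
import numpy as np

def ntk_forward(x, widths, seed=0):
    """Forward pass of a ReLU MLP in the NTK parametrization: weights
    are i.i.d. standard normal and each layer's preactivation is scaled
    by 1/sqrt(fan_in), so activation magnitudes remain O(1) at any
    width. Hypothetical helper, not a specific paper's code."""
    rng = np.random.default_rng(seed)
    h = x
    for w_out in widths:
        W = rng.standard_normal((w_out, h.shape[0]))
        h = np.maximum(W @ h / np.sqrt(h.shape[0]), 0.0)  # ReLU
    return h

out = ntk_forward(np.ones(64), [512, 512, 1])
```

Under this scaling, the network's training dynamics linearize around initialization in the infinite-width limit, which is what makes the NTK description possible.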
The transition to turbulence via spatiotemporal intermittency is investigated in the context of coupled maps defined on small-world networks. The local dynamics is given by the Chaté-Manneville minimal map previously used in studies of spatiotemporal
Graphical models are widely used in science to represent joint probability distributions with an underlying conditional dependence structure. The inverse problem of learning a discrete graphical model given i.i.d. samples from its joint distribution c
We study the thermodynamic properties of spin systems with bond-disorder on small-world hypergraphs, obtained by superimposing a one-dimensional Ising chain onto a random Bethe graph with p-spin interactions. Using transfer-matrix techniques, we deri
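The transfer-matrix technique named in this abstract is standard for the one-dimensional chain component: the free energy per spin follows from the largest eigenvalue of a 2×2 matrix. A minimal sketch for the plain Ising chain (the textbook calculation, not the paper's small-world hypergraph derivation):

```python
import numpy as np

def ising_free_energy(beta, J, h):
    """Free energy per spin of the 1D Ising chain with coupling J and
    field h, from the symmetric transfer matrix
    T_{ss'} = exp(beta * (J*s*s' + h*(s + s')/2)).
    In the thermodynamic limit, f = -ln(lambda_max) / beta, where
    lambda_max is the largest eigenvalue of T."""
    spins = np.array([1.0, -1.0])
    T = np.exp(beta * (J * np.outer(spins, spins)
                       + h * (spins[:, None] + spins[None, :]) / 2))
    lam = np.linalg.eigvalsh(T).max()   # T is symmetric
    return -np.log(lam) / beta
```

At h = 0 the eigenvalues are e^{βJ} ± e^{−βJ}, recovering the exact f = −ln(2 cosh βJ)/β; superimposing the chain onto a random Bethe graph, as in the abstract, adds the p-spin disorder on top of this building block.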