
Entropy bifurcation of neural networks on Cayley trees

Published by: Chih-Hung Chang Lucius
Publication date: 2017
Language: English





Since it has been demonstrated that excitable media with a tree structure perform better than other network topologies, it is natural to consider neural networks defined on Cayley trees. The investigation of a symbolic space called a tree-shift of finite type is important for the discussion of the equilibrium solutions of neural networks on Cayley trees. Entropy is a frequently used invariant for measuring the complexity of a system, and constant entropy on an open set of coupling weights between neurons means that the specific network is stable. This paper gives a complete characterization of the entropy spectrum of neural networks on Cayley trees and reveals whether the entropy bifurcates when the coupling weights change.
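As a rough illustration of the entropy invariant discussed above (this is a generic sketch, not the paper's construction): for a Markov tree-shift on the binary Cayley tree, admissible patterns can be counted recursively from the leaves, and the pattern count grows doubly exponentially in the height, so a double-logarithmic normalization yields the entropy. The golden-mean constraint used here is a standard textbook example, not taken from the paper.

```python
from math import log

# Illustrative sketch: entropy of a Markov tree-shift on the binary
# Cayley tree, estimated by recursive pattern counting.
# A[i][j] = 1 iff symbol j may label a child of a vertex labeled i;
# here the golden-mean rule forbids a 1 directly below a 1.
A = [[1, 1],
     [1, 0]]

def entropy_estimate(depth):
    # N[a] = number of admissible patterns on the height-n subtree whose
    # root carries symbol a; each root has two independent subtrees.
    N = [1, 1]
    for _ in range(depth):
        N = [sum(A[a][b] * N[b] for b in range(2)) ** 2 for a in range(2)]
    # Patterns grow doubly exponentially in the height, so the entropy
    # normalization takes the logarithm twice and divides by the height.
    return log(log(sum(N))) / depth

print(entropy_estimate(18))  # approaches log 2 ~ 0.6931 as depth grows
```

The recursion squares the child count because the two subtrees below a root are filled independently; this is the tree analogue of the transfer-matrix method for one-dimensional shifts of finite type.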




Read also

This paper investigates the coloring problem on the Fibonacci-Cayley tree, which is a Cayley graph whose vertex set is the Fibonacci sequence. More precisely, we elucidate the complexity of shifts of finite type defined on the Fibonacci-Cayley tree via an invariant called entropy. It turns out that computing the entropy of a Fibonacci tree-shift of finite type is equivalent to studying a nonlinear recursive system. After proposing an algorithm for the computation of entropy, we apply the result to neural networks defined on the Fibonacci-Cayley tree, which reflect those neural systems with neuronal dysfunction. Aside from demonstrating the surprising phenomenon that there are only two possibilities of entropy for neural networks on the Fibonacci-Cayley tree, we derive a formula for the boundary in the parameter space.
Let $\Lambda$ be a complex manifold and let $(f_\lambda)_{\lambda\in\Lambda}$ be a holomorphic family of rational maps of degree $d\geq 2$ of $\mathbb{P}^1$. We define a natural notion of entropy of bifurcation, mimicking the classical definition of entropy, by the parametric growth rate of critical orbits. We also define a notion of measure-theoretic bifurcation entropy for which we prove a variational principle: the measure of bifurcation is a measure of maximal entropy. We rely crucially on a generalization of Yomdin's bound of the volume of the image of a dynamical ball. Applying our techniques to complex dynamics in several variables, we notably define and compute the entropy of the trace measure of the Green currents of a holomorphic endomorphism of $\mathbb{P}^k$.
Neuronal morphology is an essential element for brain activity and function. We take advantage of the current availability of brain-wide neuron digital reconstructions of the pyramidal cells from a mouse brain, and analyze several emergent features of brain-wide neuronal morphology. We observe that axonal trees are self-affine while dendritic trees are self-similar. We also show that tree sizes appear to be random, independent of the number of dendrites within single neurons. Moreover, we consider an inhomogeneous branching model which stochastically generates rooted 3-Cayley trees for the brain-wide neuron topology. Based on order-dependent branching probabilities estimated from actual axonal and dendritic trees, our inhomogeneous model quantitatively captures a number of topological features, including the size and shape of both axons and dendrites. This sheds light on a universal mechanism behind the topological formation of brain-wide axonal and dendritic trees.
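The order-dependent branching idea in the abstract above can be sketched as follows. This is a toy stand-in, not the paper's fitted model: every vertex of a rooted 3-Cayley tree has up to two children, and the branching probability decays with centrifugal order. The parameters `p0` and `decay` are made-up illustrative values, not estimates from neuron reconstructions.

```python
import random

# Toy inhomogeneous branching model generating rooted 3-Cayley trees:
# a vertex at centrifugal order k produces each of its two potential
# children independently with probability p0 * decay**k.
def sample_size(order=0, p0=0.9, decay=0.7, max_order=12):
    if order >= max_order:
        return 1  # truncate the recursion at a maximal order
    n_children = sum(random.random() < p0 * decay ** order for _ in range(2))
    return 1 + sum(sample_size(order + 1, p0, decay, max_order)
                   for _ in range(n_children))

random.seed(1)
sizes = [sample_size() for _ in range(1000)]
print(sum(sizes) / len(sizes))  # mean tree size under these toy parameters
```

Because the branching probability decays geometrically with order, the expected tree size stays finite even without the truncation, mirroring the observation that real axonal and dendritic trees have bounded depth.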
Ran J. Tessler, 2018
We prove a weighted generalization of the formula for the number of plane vertex-labeled trees.
The basin of attraction is the set of initial points that will eventually converge to some attracting set. Knowledge of it is important for understanding the dynamical behavior of a given dynamical system of interest. In this work, we address the problem of reconstructing the basins of attraction of a multistable system, using only labeled data. To this end, we view this problem as a classification task and use a deep neural network as a classifier for predicting the attractor that corresponds to any given initial condition. Additionally, we provide a method for obtaining an approximation of the basin boundary of the underlying system, using the trained classification model. Finally, we provide evidence relating the complexity of the structure of the basins of attraction to the quality of the obtained reconstructions, via the concept of basin entropy. We demonstrate the application of the proposed method on the Lorenz system in a bistable regime.
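The basins-as-classification idea above can be sketched on a much simpler system than the Lorenz flow. In this hedged toy version, the bistable gradient flow x' = x - x^3 (attractors at -1 and +1, basin boundary at x = 0) supplies labeled data, and a k-nearest-neighbour classifier stands in for the paper's deep network; the boundary is then located where the prediction flips.

```python
import random
from math import copysign

# Label an initial condition by the attractor it reaches under Euler
# integration of the bistable gradient flow x' = x - x^3.
def attractor(x0, steps=200, dt=0.05):
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return int(copysign(1, x))  # -1 or +1

random.seed(0)
train = [(x, attractor(x)) for x in (random.uniform(-2, 2) for _ in range(500))]

def predict(x, k=5):
    # 1-D k-nearest-neighbour classifier standing in for a deep network
    votes = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return 1 if sum(lbl for _, lbl in votes) > 0 else -1

# Approximate the basin boundary: scan a grid and record sign flips.
grid = [i / 100 for i in range(-200, 201)]
preds = [predict(x) for x in grid]
flips = [x for x, a, b in zip(grid[1:], preds, preds[1:]) if a != b]
print(flips[0])  # should land near the true boundary x = 0
```

The same scan generalizes to higher dimensions by evaluating the classifier on a mesh of initial conditions, which is essentially how the trained model yields an approximate basin boundary.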