
Storage capacity in symmetric binary perceptrons

Added by Benjamin Aubin
Publication date: 2019
Field: Physics
Language: English





We study the problem of determining the capacity of the binary perceptron for two variants of the problem where the corresponding constraint is symmetric. We call these variants the rectangle-binary-perceptron (RBP) and the $u$-function-binary-perceptron (UBP). We show that, unlike for the usual step-function-binary-perceptron, the critical capacity in these symmetric cases is given by the annealed computation in a large region of parameter space (for all rectangular constraints and for narrow enough $u$-function constraints, $K < K^*$). We prove this fact (under two natural assumptions) using the first and second moment methods. We further use the second moment method to conjecture that solutions of the symmetric binary perceptrons are organized in a so-called frozen-1RSB structure, without using the replica method. We then use the replica method to estimate the capacity threshold for the UBP case when the $u$-function is wide, $K > K^*$. We conclude that full-step-replica-symmetry breaking would have to be evaluated in order to obtain the exact capacity in this case.
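The annealed bound mentioned above is easy to evaluate numerically. The sketch below is a rough illustration rather than the paper's computation: for a rectangular constraint $|w \cdot \xi / \sqrt{n}| \le K$, a random Gaussian pattern is satisfied with probability $p(K) = \mathbb{P}(|Z| \le K)$ for standard Gaussian $Z$, so the first-moment (annealed) capacity is $\alpha_a(K) = -1/\log_2 p(K)$. The function names and sample values of $K$ are illustrative assumptions.

```python
# Annealed (first-moment) capacity estimate for a rectangular symmetric constraint.
# Sketch only: E[#solutions] = 2^n * p(K)^(alpha n) vanishes above
# alpha_a(K) = -1 / log2 p(K), with p(K) = P(|Z| <= K) = erf(K / sqrt(2)).

import math

def p_rect(K: float) -> float:
    """Probability that a standard Gaussian falls in [-K, K]."""
    return math.erf(K / math.sqrt(2.0))

def annealed_capacity(K: float) -> float:
    """First-moment (annealed) capacity alpha_a(K) = -1 / log2 p(K)."""
    return -1.0 / math.log2(p_rect(K))

if __name__ == "__main__":
    for K in (0.5, 1.0, 1.5, 2.0):
        print(f"K = {K:.1f}:  p = {p_rect(K):.4f},  alpha_a = {annealed_capacity(K):.4f}")
```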


146 - G. S. Dhesi, M. Ausloos 2016
Nowadays, strict finite-size effects must be taken into account in condensed matter problems treated through models based on lattices or graphs. On the other hand, the cases of directed bonds or links are known to be highly relevant in topics ranging from ferroelectrics to quotation networks. Combining these two points leads to the examination of finite-size random matrices. To obtain basic materials properties, the Green function associated with the matrix has to be calculated. In order to obtain the first finite-size correction, a perturbative scheme is developed within the framework of the replica method. The averaged eigenvalue spectrum and the corresponding Green function of Wigner random-sign real symmetric $N \times N$ matrices are thereby obtained analytically to order $1/N$. Related simulation results are also presented. The comparison between the analytical formulae and the numerical diagonalization of finite-size matrices exhibits excellent agreement, confirming the correctness of the first-order finite-size expression.
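As a rough companion to the comparison described above, the following sketch diagonalizes random-sign real symmetric matrices and compares the empirical eigenvalue density with the Wigner semicircle law. The $1/\sqrt{N}$ normalization, matrix size, and histogram settings are assumptions, and the paper's analytical $1/N$ correction is not reproduced here.

```python
# Finite-size check of random sign (+/-1) real symmetric N x N matrices
# against the Wigner semicircle law.  Entries are scaled by 1/sqrt(N) so the
# limiting spectrum lives on [-2, 2]; all parameters are illustrative.

import numpy as np

def random_sign_symmetric(N: int, rng: np.random.Generator) -> np.ndarray:
    """Symmetric matrix with i.i.d. +/-1 entries, scaled by 1/sqrt(N)."""
    A = rng.choice([-1.0, 1.0], size=(N, N))
    A = np.triu(A) + np.triu(A, 1).T
    return A / np.sqrt(N)

def semicircle(x: np.ndarray) -> np.ndarray:
    """Wigner semicircle density on [-2, 2] for unit-variance entries."""
    return np.sqrt(np.clip(4.0 - x**2, 0.0, None)) / (2.0 * np.pi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, samples = 200, 50
    eigs = np.concatenate([np.linalg.eigvalsh(random_sign_symmetric(N, rng))
                           for _ in range(samples)])
    hist, edges = np.histogram(eigs, bins=40, range=(-2.5, 2.5), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    for c, h in zip(centers[::8], hist[::8]):
        print(f"lambda = {c:+.2f}:  empirical {h:.3f}  vs  semicircle {semicircle(np.array([c]))[0]:.3f}")
```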
50 - Do-Hyun Kim, Jinha Park, 2016
The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of $O(N)$, where $N$ is the system size. Beyond the threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been further developed toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which is inconsistent with the recent experimental discovery that the number of connections of each neuron seems to be heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity becomes tremendously enhanced, but with some error in the memory retrieval, which appears as the heterogeneity of the connections is increased. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.
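For readers unfamiliar with the baseline being generalized above, here is a minimal sketch of Hebbian storage and zero-temperature retrieval in a fully connected Hopfield network. The system size, load, and noise level are illustrative assumptions; the scale-free construction of the paper is not implemented.

```python
# Minimal fully connected Hopfield network: Hebbian storage plus
# zero-temperature synchronous retrieval from a corrupted cue.

import numpy as np

def hebbian_weights(patterns: np.ndarray) -> np.ndarray:
    """J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def retrieve(J: np.ndarray, state: np.ndarray, sweeps: int = 20) -> np.ndarray:
    """Zero-temperature synchronous updates s_i <- sign(sum_j J_ij s_j)."""
    for _ in range(sweeps):
        state = np.where(J @ state >= 0, 1, -1)
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    N, P = 500, 25                       # load P/N = 0.05, below the classic ~0.138 threshold
    patterns = rng.choice([-1, 1], size=(P, N))
    J = hebbian_weights(patterns)
    noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])  # flip ~10% of spins
    recovered = retrieve(J, noisy)
    print("overlap with stored pattern:", float(recovered @ patterns[0]) / N)
```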
146 - D. Bolle, T. Verbeiren 1999
The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.
193 - F. Gerl 1996
Within a Kuhn-Tucker cavity method introduced in a former paper, we study optimal stability learning for situations where, in the replica formalism, the replica symmetry may be broken, namely (i) the case of a simple perceptron above the critical loading, and (ii) the case of two-layer AND-perceptrons, if one learns with maximal stability. We find that the deviation of our cavity solution from the replica-symmetric one in these cases is a clear indication of the necessity of replica symmetry breaking. In any case, the cavity solution tends to underestimate the storage capabilities of the networks.
We study a class of Markov chains that describe the reversible stochastic dynamics of a large class of disordered mean-field models at low temperatures. Our main purpose is to give a precise relation between the metastable time scales in the problem and the properties of the rate functions of the corresponding Gibbs measures. We derive the analog of the Wentzell-Freidlin theory in this case, showing that any transition can be decomposed, with probability exponentially close to one, into a deterministic sequence of "admissible" transitions. For these admissible transitions we give upper and lower bounds on the expected transition times that differ only by a constant. The rescaled transition times are shown to converge in distribution to the exponential distribution. We exemplify our results in the context of the random field Curie-Weiss model.
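A crude way to see the exponential law for transition times described above is to simulate single-spin-flip Glauber dynamics of the random field Curie-Weiss model and compare the mean and standard deviation of the measured exit times (for an exponential law they coincide). The temperature, field strength, system size, and exit criterion below are assumptions chosen only so the sketch runs quickly; it is not the paper's construction.

```python
# Glauber dynamics of the random field Curie-Weiss model,
# H = -(1/2N) (sum_i s_i)^2 - sum_i h_i s_i, started in the "all minus" well.
# We record the number of single-spin-flip steps until the magnetization
# crosses +0.5 and compare mean vs. standard deviation of these exit times.

import numpy as np

def exit_time(N: int, beta: float, h: np.ndarray, rng: np.random.Generator,
              max_steps: int = 2_000_000) -> int:
    """Glauber steps until magnetization exceeds +0.5, starting from all -1."""
    s = -np.ones(N)
    M = s.sum()
    for t in range(1, max_steps + 1):
        i = rng.integers(N)
        dH = (2.0 * s[i] * M - 2.0) / N + 2.0 * h[i] * s[i]   # energy change of flipping s_i
        if rng.random() < 1.0 / (1.0 + np.exp(beta * dH)):     # heat-bath acceptance
            M -= 2.0 * s[i]
            s[i] = -s[i]
        if M / N > 0.5:
            return t
    return max_steps

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    N, beta = 24, 1.2
    h = 0.1 * rng.choice([-1.0, 1.0], size=N)   # quenched random fields
    times = np.array([exit_time(N, beta, h, rng) for _ in range(100)], dtype=float)
    print(f"mean exit time {times.mean():.0f}, std {times.std():.0f}  (exponential law => ratio ~ 1)")
```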