
How we are leading a 3-XORSAT challenge: from the energy landscape to the algorithm and its efficient implementation on GPUs

Posted by Federico Ricci-Tersenghi
Publication date: 2021
Research field: Physics
Paper language: English





A recent 3-XORSAT challenge required minimizing a very complex and rough energy function, typical of glassy models with a random first-order transition and a golf-course-like energy landscape. We present the ideas behind the quasi-greedy algorithm and its very efficient implementation on GPUs, which are allowing us to rank first in this competition. We also suggest a better protocol for comparing algorithmic performances, and we provide analytical predictions for the exponential growth of the time to find a solution in terms of free-energy barriers.
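The abstract does not spell out the quasi-greedy rule or the GPU implementation, so the following is only a minimal, generic sketch of a quasi-greedy single-flip search on a random 3-XORSAT instance, assuming WalkSAT-style moves on a single serial chain (plain NumPy, no GPU parallelism). All function names and the noise parameter p_walk are illustrative assumptions, not the authors' method.

```python
import numpy as np

def random_3xorsat(n_vars, n_clauses, rng):
    """Random 3-XORSAT instance: each clause is x_i XOR x_j XOR x_k = b,
    with three distinct variable indices per clause."""
    clauses = np.array([rng.choice(n_vars, size=3, replace=False)
                        for _ in range(n_clauses)])
    parities = rng.integers(0, 2, size=n_clauses)
    return clauses, parities

def energy(x, clauses, parities):
    """Number of violated clauses (the cost function to minimize)."""
    return int(np.sum(x[clauses].sum(axis=1) % 2 != parities))

def quasi_greedy_search(clauses, parities, n_vars, max_steps=10**6,
                        p_walk=0.1, seed=0):
    """Quasi-greedy local search: pick a violated clause at random, then
    flip the variable in it that yields the lowest energy; with small
    probability p_walk flip a random variable of that clause instead.
    (Generic WalkSAT-style sketch, not the paper's exact rule.)"""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=n_vars)
    e = energy(x, clauses, parities)
    for step in range(max_steps):
        if e == 0:
            return x, step                      # solution found
        violated = np.flatnonzero(x[clauses].sum(axis=1) % 2 != parities)
        c = clauses[rng.choice(violated)]       # a random violated clause
        if rng.random() < p_walk:
            v = rng.choice(c)                   # noisy (random) move
        else:
            trial_energies = []                 # greedy move: best of 3 flips
            for v_try in c:
                x[v_try] ^= 1
                trial_energies.append(energy(x, clauses, parities))
                x[v_try] ^= 1
            v = c[int(np.argmin(trial_energies))]
        x[v] ^= 1
        e = energy(x, clauses, parities)
    return x, max_steps                         # best effort, may be unsolved
```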


Read also

Hajime Yoshino, 2019
We develop a statistical mechanical approach based on the replica method to study the design space of deep and wide neural networks constrained to meet a large number of training data. Specifically, we analyze the configuration space of the synaptic weights and neurons in the hidden layers of a simple feed-forward perceptron network for two scenarios: a setting with random inputs/outputs and a teacher-student setting. As the strength of the constraints increases, i.e. as the number of training data grows, successive 2nd-order glass transitions (random inputs/outputs) or 2nd-order crystalline transitions (teacher-student setting) take place layer by layer, starting next to the input/output boundaries and moving deeper into the bulk, with the thickness of the solid phase growing logarithmically with the data size. This implies that the typical storage capacity of the network grows exponentially fast with the depth. In a deep enough network, the central part remains in the liquid phase. We argue that in systems of finite width $N$, a weak bias field can remain in the center and play the role of a symmetry-breaking field that connects the opposite sides of the system. The successive glass transitions bring about a hierarchical free-energy landscape with ultrametricity, which evolves in space: it is most complex close to the boundaries but becomes renormalized into progressively simpler ones in deeper layers. These observations provide clues to understanding why deep neural networks operate efficiently. Finally, we present some numerical simulations of learning which reveal spatially heterogeneous glassy dynamics truncated by a finite-width $N$ effect.
We carefully investigate the two fundamental assumptions in the Stillinger-Weber analysis of the inherent structures (ISs) in the energy landscape and conclude that they cannot be validated. This explains some of the conflicting results between their conclusions and some recent rigorous and exact results. Our analysis shows that basin free energies, and not ISs, are useful for understanding glasses.
Using the potential energy landscape formalism, we show that, in the temperature range in which the dynamics of a glass-forming system is thermally activated, there exists a unique set of basis glass states, each of which is confined to a single metabasin of the energy landscape of a glass-forming system. These basis glass states tile the entire configuration space of the system, exhibit only secondary relaxation, and are solid-like. Any macroscopic state of the system (whether liquid or glass) can be represented as a superposition of basis glass states and can be described by a probability distribution over these states. During cooling of a liquid from a high temperature, the probability distribution freezes at sufficiently low temperatures, describing the process of the liquid-to-glass transition. The time evolution of the probability distribution towards the equilibrium distribution during subsequent aging describes the primary relaxation of a glass.
Ulisse Ferrari, 2015
Maximum entropy models provide the least constrained probability distributions that reproduce statistical properties of experimental datasets. In this work we characterize the learning dynamics that maximizes the log-likelihood in the case of large but finite datasets. We first show how the steepest descent dynamics is not optimal, as it is slowed down by the inhomogeneous curvature of the model parameter space. We then provide a way of rectifying this space which relies only on dataset properties and does not require large computational effort. We conclude by solving the long-time limit of the parameter dynamics, including the randomness generated by the systematic use of Gibbs sampling. In this stochastic framework, rather than converging to a fixed point, the dynamics reaches a stationary distribution, which for the rectified dynamics reproduces the posterior distribution of the parameters. We sum up all these insights in a rectified data-driven algorithm that is fast and, by sampling from the posterior over the parameters, avoids both under- and over-fitting along all directions of the parameter space. Through the learning of pairwise Ising models from the recordings of a large population of retinal neurons, we show how our algorithm outperforms the steepest descent method.
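As a concrete illustration of the kind of learning dynamics described above, here is a minimal sketch of plain steepest-ascent (Boltzmann-learning) updates for a pairwise Ising maximum-entropy model, with Metropolis sweeps standing in for the Gibbs sampling mentioned in the abstract. It deliberately omits the rectification of parameter space that the paper proposes, and all names and parameters (metropolis_samples, fit_ising_max_entropy, lr, n_sweeps) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def metropolis_samples(h, J, n_samples, n_sweeps=50, rng=None):
    """Approximate samples from P(s) ~ exp(h.s + 0.5 s^T J s), s_i = +/-1,
    via single-spin Metropolis sweeps (J symmetric, zero diagonal)."""
    if rng is None:
        rng = np.random.default_rng()
    n = h.size
    s = rng.choice([-1, 1], size=(n_samples, n))
    for _ in range(n_sweeps):
        for i in range(n):
            field = h[i] + s @ J[i]              # local field on spin i
            dE = 2 * s[:, i] * field             # cost of flipping spin i
            flip = rng.random(n_samples) < np.exp(-np.clip(dE, 0, None))
            s[flip, i] *= -1
    return s

def fit_ising_max_entropy(data, n_iter=200, lr=0.05, n_samples=2000, seed=0):
    """Plain steepest ascent on the log-likelihood: move h and J so that the
    model moments <s_i> and <s_i s_j> match the empirical ones (data is +/-1)."""
    rng = np.random.default_rng(seed)
    n = data.shape[1]
    m_data = data.mean(axis=0)                   # empirical magnetizations
    C_data = data.T @ data / data.shape[0]       # empirical correlations
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(n_iter):
        s = metropolis_samples(h, J, n_samples, rng=rng)
        m_model = s.mean(axis=0)
        C_model = s.T @ s / n_samples
        h += lr * (m_data - m_model)             # log-likelihood gradient in h
        dJ = lr * (C_data - C_model)             # ... and in J (keep symmetric)
        np.fill_diagonal(dJ, 0.0)
        J += dJ
    return h, J
```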
Landau's theory of phase transitions is adapted to treat independently relaxing regions in complex systems using nanothermodynamics. The order parameter we use governs the thermal fluctuations, not a specific static structure. We find that the entropy term dominates the thermal behavior, as is reasonable for disordered systems. Consequently, thermal equilibrium occurs at the internal-energy maximum, so that the minima in a potential-energy landscape have negligible influence on the dynamics. Instead, the dynamics involves normal thermal fluctuations about the free-energy minimum, with a time scale that is governed by the internal-energy maximum. The temperature dependence of the fluctuations yields VTF-like relaxation rates and approximate time-temperature superposition, consistent with the WLF procedure for analyzing the dynamics of complex fluids; while the size dependence of the fluctuations provides an explanation for the distribution of relaxation times and the heterogeneity found in glass-forming liquids, thus providing a unified picture of several features in the dynamics of disordered materials.