
Emergence of Compositional Representations in Restricted Boltzmann Machines

Posted by Jérôme Tubiana
Publication date: 2016
Research field: Physics
Paper language: English
Author: Jérôme Tubiana





Extracting automatically the complex set of features composing real high-dimensional data is crucial for achieving high performance in machine-learning tasks. Restricted Boltzmann Machines (RBMs) are empirically known to be efficient for this purpose, and to be able to generate distributed and graded representations of the data. We characterize the structural conditions (sparsity of the weights, low effective temperature, nonlinearities in the activation functions of hidden units, and adaptation of fields maintaining the activity in the visible layer) allowing RBMs to operate in such a compositional phase. Evidence is provided by the replica analysis of an adequate statistical ensemble of random RBMs and by RBMs trained on the handwritten digits dataset MNIST.
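As a rough illustration of the model class discussed above, the following is a minimal sketch of a binary-binary RBM trained with one-step contrastive divergence (CD-1) on binarized data. It does not reproduce the paper's specific setup (sparse weights, ReLU hidden units, replica ensemble); the toy data, layer sizes, and learning rate are illustrative assumptions.

```python
# Minimal sketch: binary-binary RBM trained with CD-1 on toy binary data
# standing in for binarized MNIST digits. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.a = np.zeros(n_visible)   # visible biases ("fields")
        self.b = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        ph0, h0 = self.sample_h(v0)
        pv1, _ = self.sample_v(h0)
        ph1, _ = self.sample_h(pv1)
        # log-likelihood gradient approximated by one Gibbs step (CD-1)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
        self.a += self.lr * (v0 - pv1).mean(axis=0)
        self.b += self.lr * (ph0 - ph1).mean(axis=0)

# toy binary data standing in for binarized MNIST digits
data = (rng.random((256, 64)) < 0.2).astype(float)
rbm = RBM(n_visible=64, n_hidden=16)
for epoch in range(20):
    for batch in np.array_split(data, 8):
        rbm.cd1_step(batch)
```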


Read also

In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals which do not possess a writable correlation model. Assuming that a generative statistical model, such as a Boltzmann machine, can be trained in an unsupervised manner on example signals, we demonstrate how this signal model can be used within a Bayesian framework of signal reconstruction. By deriving a message-passing inference for general distribution restricted Boltzmann machines, we are able to integrate these inferred signal models into approximate message passing for compressed sensing reconstruction. Finally, we show for the MNIST dataset that this approach can be very effective, even for $M < K$.
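For orientation, here is a simplified sketch of an approximate message passing (AMP) iteration for sparse recovery, using a soft-thresholding denoiser in place of the learned Boltzmann-machine prior described in the abstract. The dimensions, threshold schedule, and variable names are illustrative assumptions, not the paper's general-distribution message passing.

```python
# Sketch: AMP for y = A x, with a soft-thresholding denoiser standing in
# for the learned RBM prior. Problem sizes and threshold are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 500, 250, 25                 # signal length, measurements, sparsity
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x_true

def soft(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(N)
z = y.copy()
alpha = 2.0                            # threshold scale (illustrative)
for _ in range(50):
    r = x + A.T @ z                    # pseudo-data passed to the denoiser
    tau = alpha * np.linalg.norm(z) / np.sqrt(M)
    x_new = soft(r, tau)
    # Onsager correction: for soft thresholding, the denoiser derivative
    # sums to the number of surviving (non-zero) components
    z = y - A @ x_new + (np.count_nonzero(x_new) / M) * z
    x = x_new

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```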
Calculating the spectral function of two dimensional systems is arguably one of the most pressing challenges in modern computational condensed matter physics. While efficient techniques are available in lower dimensions, two dimensional systems present insurmountable hurdles, ranging from the sign problem in quantum Monte Carlo (MC), to the entanglement area law in tensor network based methods. We hereby present a variational approach based on a Chebyshev expansion of the spectral function and a neural network representation for the wave functions. The Chebyshev moments are obtained by recursively applying the Hamiltonian and projecting on the space of variational states using a modified natural gradient descent method. We compare this approach with a modified approximation of the spectral function which uses a Krylov subspace constructed from the Chebyshev wave-functions. We present results for the one-dimensional and two-dimensional Heisenberg model on the square lattice, and compare to those obtained by other methods in the literature.
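To make the Chebyshev-moment idea concrete, the sketch below computes moments of a spectral density by recursively applying a rescaled Hamiltonian, using dense exact linear algebra on a tiny Heisenberg chain instead of the neural-network variational states and natural-gradient projection described in the abstract. The system size, moment count, and starting vector are illustrative assumptions.

```python
# Sketch: Chebyshev (kernel polynomial) expansion of a spectral density for a
# small spin-1/2 Heisenberg chain, using dense matrices for illustration.
import numpy as np

def heisenberg_chain(L):
    """Dense Hamiltonian of an open spin-1/2 Heisenberg chain (J = 1)."""
    sx = np.array([[0, 1], [1, 0]]) / 2
    sy = np.array([[0, -1j], [1j, 0]]) / 2
    sz = np.array([[1, 0], [0, -1]]) / 2
    def op(o, i):
        mats = [np.eye(2)] * L
        mats[i] = o
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out
    H = sum(op(s, i) @ op(s, i + 1) for i in range(L - 1) for s in (sx, sy, sz))
    return H.real

L, n_moments = 8, 100
H = heisenberg_chain(L)
evals = np.linalg.eigvalsh(H)
scale = (evals.max() - evals.min()) / 2 * 1.01   # map the spectrum into [-1, 1]
shift = (evals.max() + evals.min()) / 2
Ht = (H - shift * np.eye(H.shape[0])) / scale

# Chebyshev moments mu_n = <psi| T_n(Ht) |psi> from the three-term recursion
rng = np.random.default_rng(0)
psi = rng.standard_normal(H.shape[0])
psi /= np.linalg.norm(psi)
t0, t1 = psi, Ht @ psi
mu = [psi @ t0, psi @ t1]
for _ in range(n_moments - 2):
    t0, t1 = t1, 2 * Ht @ t1 - t0
    mu.append(psi @ t1)
mu = np.array(mu)

# reconstruct the local spectral density on a grid, damped by a Jackson kernel
n = np.arange(n_moments)
jackson = ((n_moments - n + 1) * np.cos(np.pi * n / (n_moments + 1))
           + np.sin(np.pi * n / (n_moments + 1)) / np.tan(np.pi / (n_moments + 1))) / (n_moments + 1)
x = np.linspace(-0.99, 0.99, 400)
T = np.cos(np.outer(np.arccos(x), n))            # T_n(x) evaluated on the grid
weights = np.where(n == 0, 1.0, 2.0)
rho = (T @ (weights * jackson * mu)) / (np.pi * np.sqrt(1 - x ** 2))
```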
Yusuke Nomura (2020)
The variational wave functions based on neural networks have recently started to be recognized as a powerful ansatz to represent quantum many-body states accurately. In order to show the usefulness of the method among all available numerical methods, it is imperative to investigate the performance in challenging many-body problems for which the exact solutions are not available. Here, we construct a variational wave function with one of the simplest neural networks, the restricted Boltzmann machine (RBM), and apply it to a fundamental but unsolved quantum spin Hamiltonian, the two-dimensional $J_1$-$J_2$ Heisenberg model on the square lattice. We supplement the RBM wave function with quantum-number projections, which restores the symmetry of the wave function and makes it possible to calculate excited states. Then, we perform a systematic investigation of the performance of the RBM. We show that, with the help of the symmetry, the RBM wave function achieves state-of-the-art accuracy both in ground-state and excited-state calculations. The study provides a practical guideline on how to achieve accuracy in a controlled manner.
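For readers unfamiliar with the ansatz, the following is a minimal sketch of the RBM variational wave function for a spin-1/2 configuration: the (unnormalized) amplitude is an exponential of visible fields times a product of hyperbolic cosines over hidden units. The parameter shapes and values are illustrative; the quantum-number projections and the optimization procedure used in the paper are not shown.

```python
# Sketch: RBM variational amplitude for a spin configuration sigma (entries +/-1):
#   psi(sigma) = exp(sum_i a_i sigma_i) * prod_j 2 cosh(b_j + sum_i W_ij sigma_i)
import numpy as np

rng = np.random.default_rng(0)
n_spins, n_hidden = 16, 32
a = 0.01 * rng.standard_normal(n_spins)              # visible biases
b = 0.01 * rng.standard_normal(n_hidden)             # hidden biases
W = 0.01 * rng.standard_normal((n_spins, n_hidden))  # couplings

def log_psi(sigma):
    """Log-amplitude of the (unnormalized) RBM wave function."""
    theta = b + sigma @ W
    return a @ sigma + np.sum(np.log(2 * np.cosh(theta)))

sigma = rng.choice([-1.0, 1.0], size=n_spins)
print(log_psi(sigma))
```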
A Boltzmann machine is a stochastic neural network that has been extensively used in the layers of deep architectures for modern machine learning applications. In this paper, we develop a Boltzmann machine that is capable of modelling thermodynamic observables for physical systems in thermal equilibrium. Through unsupervised learning, we train the Boltzmann machine on data sets constructed with spin configurations importance-sampled from the partition function of an Ising Hamiltonian at different temperatures using Monte Carlo (MC) methods. The trained Boltzmann machine is then used to generate spin states, for which we compare thermodynamic observables to those computed by direct MC sampling. We demonstrate that the Boltzmann machine can faithfully reproduce the observables of the physical system. Further, we observe that the number of neurons required to obtain accurate results increases as the system is brought close to criticality.
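The sketch below shows how the kind of training data described in the abstract can be generated: a plain Metropolis Monte Carlo sampler for the 2D Ising model whose equilibrated spin configurations serve as training examples. Lattice size, temperature, and sweep counts are illustrative assumptions, not the paper's settings.

```python
# Sketch: Metropolis Monte Carlo sampling of the 2D Ising model (J = 1),
# producing spin configurations usable as Boltzmann-machine training data.
import numpy as np

rng = np.random.default_rng(0)
L, T, n_sweeps = 16, 2.5, 500          # lattice side, temperature, total sweeps
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins, T):
    """One Metropolis sweep: L*L single-spin-flip proposals."""
    for _ in range(spins.size):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

samples = []
for s in range(n_sweeps):
    sweep(spins, T)
    if s > 100 and s % 10 == 0:        # discard burn-in, thin the chain
        samples.append(spins.copy().ravel())
samples = np.array(samples)            # rows are training configurations
```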
Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as the building blocks of deep neural architectures. In this work, we derive a deterministic framework for the training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer (TAP) mean-field approximation of widely-connected systems with weak interactions coming from spin-glass theory. While the TAP approach has been extensively studied for fully-visible binary spin systems, our construction is generalized to latent-variable models, as well as to arbitrarily distributed real-valued spin systems with bounded support. In our numerical experiments, we demonstrate the effective deterministic training of our proposed models and are able to show interesting features of unsupervised learning which could not be directly observed with sampling. Additionally, we demonstrate how to utilize our TAP-based framework for leveraging trained RBMs as joint priors in denoising problems.
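As a rough illustration of the deterministic mean-field idea, the sketch below iterates second-order TAP self-consistency equations for a binary-binary RBM (mean-field term plus an Onsager reaction correction). This is a simplification relative to the paper's construction for arbitrarily distributed real-valued units; the weights, damping factor, and initialization are illustrative assumptions.

```python
# Sketch: damped fixed-point iteration of second-order TAP equations for the
# magnetizations of a binary-binary RBM. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_v, n_h = 20, 10
W = 0.1 * rng.standard_normal((n_v, n_h))
a = np.zeros(n_v)                      # visible biases
b = np.zeros(n_h)                      # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

mv = np.full(n_v, 0.5) + 0.01 * rng.standard_normal(n_v)
mh = np.full(n_h, 0.5) + 0.01 * rng.standard_normal(n_h)
W2 = W ** 2
damp = 0.5
for _ in range(200):
    # mean-field term plus Onsager reaction term for each layer
    mv_new = sigmoid(a + W @ mh - (mv - 0.5) * (W2 @ (mh - mh ** 2)))
    mh_new = sigmoid(b + W.T @ mv_new - (mh - 0.5) * (W2.T @ (mv_new - mv_new ** 2)))
    mv = damp * mv + (1 - damp) * mv_new
    mh = damp * mh + (1 - damp) * mh_new
```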
