Calculating the spectral function of two-dimensional systems is arguably one of the most pressing challenges in modern computational condensed matter physics. While efficient techniques are available in lower dimensions, two-dimensional systems present formidable hurdles, ranging from the sign problem in quantum Monte Carlo to the entanglement area law in tensor-network-based methods. We present a variational approach based on a Chebyshev expansion of the spectral function and a neural-network representation of the wave functions. The Chebyshev moments are obtained by recursively applying the Hamiltonian and projecting onto the space of variational states using a modified natural gradient descent method. We compare this approach with an alternative approximation of the spectral function that uses a Krylov subspace constructed from the Chebyshev wave functions. We present results for the Heisenberg model in one dimension and on the two-dimensional square lattice, and compare them with those obtained by other methods in the literature.
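To make the recursion concrete, the following NumPy sketch implements the underlying kernel-polynomial scheme with exact linear algebra on a small Hamiltonian whose spectrum has been rescaled into $(-1,1)$; in the abstract's method, the variational neural-network projection replaces these exact matrix-vector products. The function name and the choice of a Jackson damping kernel are illustrative assumptions, not the paper's code.

```python
import numpy as np

def chebyshev_spectral_function(H, psi0, n_moments=200, n_omega=400):
    """Kernel-polynomial estimate of A(w) = <psi0| delta(w - H) |psi0>.

    H must be rescaled so its spectrum lies in (-1, 1); psi0 is the
    (already normalized) real state whose spectral weight we resolve.
    """
    t_prev = psi0.copy()              # |t_0> = |psi0>
    t_curr = H @ psi0                 # |t_1> = H |psi0>
    mu = np.empty(n_moments)
    mu[0] = psi0 @ t_prev
    mu[1] = psi0 @ t_curr
    for m in range(2, n_moments):     # Chebyshev recursion: T_{m+1} = 2 H T_m - T_{m-1}
        t_next = 2.0 * (H @ t_curr) - t_prev
        mu[m] = psi0 @ t_next
        t_prev, t_curr = t_curr, t_next
    # Jackson kernel damps the Gibbs oscillations caused by truncating the series.
    n = np.arange(n_moments)
    N = n_moments
    g = ((N - n + 1) * np.cos(np.pi * n / (N + 1))
         + np.sin(np.pi * n / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)
    omega = np.linspace(-0.99, 0.99, n_omega)
    T = np.cos(np.outer(n, np.arccos(omega)))   # T_n(w) = cos(n arccos w)
    A = g[0] * mu[0] * T[0] + 2.0 * (g[1:, None] * mu[1:, None] * T[1:]).sum(axis=0)
    return omega, A / (np.pi * np.sqrt(1.0 - omega**2))
```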
Variational wave functions based on neural networks have recently been recognized as a powerful ansatz for representing quantum many-body states accurately. To establish the usefulness of this method among the available numerical approaches, it is imperative to assess its performance on challenging many-body problems for which exact solutions are not available. Here, we construct a variational wave function from one of the simplest neural networks, the restricted Boltzmann machine (RBM), and apply it to a fundamental but unsolved quantum spin Hamiltonian, the two-dimensional $J_1$-$J_2$ Heisenberg model on the square lattice. We supplement the RBM wave function with quantum-number projections, which restore the symmetry of the wave function and make it possible to calculate excited states. We then perform a systematic investigation of the performance of the RBM. We show that, with the help of the symmetry projections, the RBM wave function achieves state-of-the-art accuracy in both ground-state and excited-state calculations. The study provides a practical guideline on how to achieve accuracy in a controlled manner.
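As a minimal illustration of the ansatz (assuming the standard RBM parameterization and a translation quantum-number projection on a 1D chain; all names and parameters below are hypothetical), this sketch evaluates a momentum-projected RBM amplitude. The variational Monte Carlo optimization itself is not shown.

```python
import numpy as np

def rbm_amplitude(sigma, a, b, W):
    """Unnormalized RBM amplitude after tracing out the hidden units:
    psi(sigma) = exp(a . sigma) * prod_j 2 cosh(b_j + sum_i W_ij sigma_i)."""
    theta = b + sigma @ W
    return np.exp(a @ sigma) * np.prod(2.0 * np.cosh(theta))

def momentum_projected_amplitude(sigma, a, b, W, k):
    """Quantum-number (momentum) projection on a ring of L sites:
    psi_k(sigma) = sum_R exp(-i k R) psi(T_R sigma), restoring translation symmetry."""
    L = len(sigma)
    return sum(np.exp(-1j * k * r) * rbm_amplitude(np.roll(sigma, r), a, b, W)
               for r in range(L))

# Example: total-momentum k = pi sector of a 6-site chain (random toy parameters).
rng = np.random.default_rng(0)
L, M = 6, 12
a, b = 0.1 * rng.normal(size=L), 0.1 * rng.normal(size=M)
W = 0.1 * rng.normal(size=(L, M))
sigma = rng.choice([-1, 1], size=L)
print(momentum_projected_amplitude(sigma, a, b, W, k=np.pi))
```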
In this work, we consider compressed sensing reconstruction from $M$ measurements of $K$-sparse structured signals that do not possess a writable correlation model. Assuming that a generative statistical model, such as a Boltzmann machine, can be trained in an unsupervised manner on example signals, we demonstrate how this signal model can be used within a Bayesian framework of signal reconstruction. By deriving message-passing inference for general-distribution restricted Boltzmann machines, we are able to integrate these inferred signal models into approximate message passing for compressed sensing reconstruction. Finally, we show for the MNIST dataset that this approach can be very effective, even for $M < K$.
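For orientation, here is a generic approximate message passing (AMP) loop with a plain soft-thresholding denoiser; the approach in the abstract replaces that separable denoising step with posterior estimates obtained from the trained RBM, so this sketch only shows the scaffolding into which an RBM prior would plug. All names and the fixed threshold are illustrative assumptions.

```python
import numpy as np

def amp_soft_threshold(y, F, n_iter=30, lam=0.1):
    """Generic AMP iteration for y = F x + noise with a separable sparsity prior.

    The denoiser here is soft-thresholding; an RBM-prior variant would instead
    compute posterior means of x via message passing through the RBM.
    """
    M, N = F.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        r = x + F.T @ z                                   # effective noisy estimate
        x_new = np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)
        # Onsager correction term: this is what distinguishes AMP from
        # plain iterative soft thresholding.
        onsager = z * np.count_nonzero(x_new) / M
        z = y - F @ x_new + onsager
        x = x_new
    return x
```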
Automatically extracting the complex set of features composing real high-dimensional data is crucial for achieving high performance in machine-learning tasks. Restricted Boltzmann machines (RBMs) are empirically known to be efficient for this purpose and to be able to generate distributed and graded representations of the data. We characterize the structural conditions (sparsity of the weights, low effective temperature, nonlinearities in the activation functions of the hidden units, and adaptation of the fields maintaining the activity in the visible layer) that allow RBMs to operate in such a compositional phase. Evidence is provided by a replica analysis of an appropriate statistical ensemble of random RBMs and by RBMs trained on the MNIST handwritten-digit dataset.
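A toy numerical probe of these conditions (not the replica calculation) can be written in a few lines: sample a Bernoulli RBM with diluted weights at low effective temperature and count how many hidden units are strongly activated, which is one crude signature of a compositional regime where each sample is composed of a small, graded subset of features. All parameters here are illustrative assumptions.

```python
import numpy as np

def sample_sparse_rbm(n_vis=100, n_hid=50, sparsity=0.1, beta=2.0,
                      n_steps=200, seed=0):
    """Block Gibbs sampling of a Bernoulli RBM whose weights are diluted to the
    given sparsity, at inverse temperature beta (low effective temperature).
    Returns the number of strongly activated hidden units for the final sample,
    a crude probe of compositionality."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_vis, n_hid)) * (rng.random((n_vis, n_hid)) < sparsity)
    v = (rng.random(n_vis) < 0.5).astype(float)
    for _ in range(n_steps):
        p_h = 1.0 / (1.0 + np.exp(-beta * (v @ W)))   # hidden given visible
        h = (rng.random(n_hid) < p_h).astype(float)
        p_v = 1.0 / (1.0 + np.exp(-beta * (W @ h)))   # visible given hidden
        v = (rng.random(n_vis) < p_v).astype(float)
    return int((p_h > 0.9).sum())

print(sample_sparse_rbm())
```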
We develop two approaches to construct deep neural networks representing the purified finite-temperature states of quantum many-body systems. Both methods aim to represent the Gibbs state by a highly expressive neural-network wave function, exemplifying the idea of purification. The first method is an entirely deterministic construction of deep Boltzmann machines that represent the purified Gibbs state exactly; it demonstrates the remarkable flexibility of the ansatz, which can fully exploit the quantum-to-classical mapping. The second method employs stochastic sampling to optimize the network parameters such that the imaginary-time evolution is well approximated within the expressive power of the neural networks. Numerical demonstrations for transverse-field Ising models and Heisenberg models show that our methods are powerful enough to investigate the finite-temperature properties of strongly correlated quantum many-body systems, even when the problematic effect of frustration is present.
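The exact statement behind the purification idea can be checked by brute force on a small system: the infinite-temperature purified state is a reshape of the identity matrix, and imaginary-time evolution by $e^{-\beta H/2}$ turns it into the Gibbs state. The dense-matrix sketch below (a small transverse-field Ising chain, not the neural-network construction itself) makes this explicit; the Hamiltonian helper and its parameters are illustrative.

```python
import numpy as np
from scipy.linalg import expm

def tfim_hamiltonian(L, h=1.0, J=1.0):
    """Dense H = -J sum_i Z_i Z_{i+1} - h sum_i X_i on an open chain of L spins."""
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.array([[1., 0.], [0., -1.]])
    I = np.eye(2)
    def op(site_ops):
        out = np.array([[1.]])
        for o in site_ops:
            out = np.kron(out, o)
        return out
    H = np.zeros((2**L, 2**L))
    for i in range(L - 1):
        H -= J * op([Z if j in (i, i + 1) else I for j in range(L)])
    for i in range(L):
        H -= h * op([X if j == i else I for j in range(L)])
    return H

# The infinite-temperature purified state is (a reshape of) the identity matrix.
# Imaginary-time evolution by exp(-beta H / 2) yields the Gibbs state exactly;
# this is the role the neural-network ansatz plays variationally.
L, beta = 6, 1.0
H = tfim_hamiltonian(L)
Psi = expm(-0.5 * beta * H)       # columns index the ancilla (purifying) copy
rho = Psi @ Psi.T.conj()          # tracing out the ancilla gives exp(-beta H)
rho /= np.trace(rho)
print("energy at beta=1:", np.trace(rho @ H).real)
```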
The search for novel entangled phases of matter has led to the recent discovery of a new class of ``entanglement transitions,'' exemplified by random tensor networks and monitored quantum circuits. Most known examples can be understood as classical ordering transitions in an underlying statistical mechanics model, where entanglement maps onto the free-energy cost of inserting a domain wall. In this paper, we study the possibility of entanglement transitions driven by physics beyond such statistical mechanics mappings. Motivated by recent applications of neural-network-inspired variational Ansätze, we investigate under what conditions on the variational parameters these Ansätze can capture an entanglement transition. We study the entanglement scaling of short-range restricted Boltzmann machine (RBM) quantum states with random phases. For uncorrelated random phases, we analytically demonstrate the absence of an entanglement transition and reveal subtle finite-size effects in numerical simulations. Introducing phases with correlations decaying as $1/r^{\alpha}$ in real space, we observe three regions with different scalings of the entanglement entropy, depending on the exponent $\alpha$. We study the nature of the transitions between these regions, finding numerical evidence for critical behavior. Our work establishes the presence of long-range correlated phases in RBM-based wave functions as a required ingredient for entanglement transitions.
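As a sketch of one ingredient of this setup (an assumption for illustration, not the paper's code), long-range correlated Gaussian phases with covariance decaying as $1/r^{\alpha}$ can be generated by Fourier-filtering white noise; one simple way to attach such a field to a wave function is as a site-local phase factor $\psi(\sigma)\, e^{i\sum_r \phi_r \sigma_r}$.

```python
import numpy as np

def correlated_phases(L, alpha, seed=0):
    """Gaussian random phases on a ring of L sites whose covariance decays
    as ~ 1/r^alpha, built by filtering white noise with a power spectrum
    S(k) ~ |k|^(alpha - 1) (valid for 0 < alpha < 1 in one dimension)."""
    rng = np.random.default_rng(seed)
    k = 2 * np.pi * np.fft.rfftfreq(L)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** ((alpha - 1) / 2.0)    # square root of the power spectrum
    noise = rng.normal(size=k.shape) + 1j * rng.normal(size=k.shape)
    phi = np.fft.irfft(amp * noise, n=L)
    return phi / phi.std()                     # normalized phase field

phi = correlated_phases(L=256, alpha=0.5)
```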