Machine learning models are a powerful theoretical tool for analyzing data from quantum simulators, in which the results of experiments are sets of snapshots of many-body states. Recently, they have been successfully applied to distinguish between snapshots that cannot be identified using traditional one- and two-point correlation functions. Thus far, however, the complexity of these models has inhibited new physical insights from this approach. Here, using a novel set of nonlinearities, we develop a network architecture that discovers features in the data which are directly interpretable in terms of physical observables. In particular, our network can be understood as uncovering high-order correlators that differ significantly between the datasets studied. We demonstrate this new architecture on sets of simulated snapshots produced by two candidate theories approximating the doped Fermi-Hubbard model, which is realized in state-of-the-art quantum gas microscopy experiments. From the trained networks, we uncover that the key distinguishing features are fourth-order spin-charge correlators, providing a means to compare experimental data to theoretical predictions. Our approach lends itself well to the construction of simple, end-to-end interpretable architectures and is applicable to arbitrary lattice data, thus paving the way for new physical insights from machine learning studies of experimental as well as numerical data.
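A minimal sketch of how such correlator-style features can be formed (an illustrative simplification under my own assumptions, not the authors' exact architecture; the class name CorrelatorCNN and all hyperparameters are hypothetical): powers of a learned convolutional filter response, spatially averaged, act as features of increasing correlation order, and a linear layer weighs them to classify snapshots.

```python
import torch
import torch.nn as nn

class CorrelatorCNN(nn.Module):
    def __init__(self, in_channels=2, n_filters=4, kernel_size=3, max_order=4, n_classes=2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, n_filters, kernel_size, bias=False)
        self.max_order = max_order
        self.classifier = nn.Linear(n_filters * max_order, n_classes)

    def forward(self, x):
        c = self.conv(x)                          # (B, F, H', W') filter responses
        feats, prod = [], torch.ones_like(c)
        for _ in range(self.max_order):
            prod = prod * c                       # c, c^2, ..., c^max_order
            feats.append(prod.mean(dim=(2, 3)))   # spatial average acts as a correlator-like feature
        return self.classifier(torch.cat(feats, dim=1))

# Example: classify 8 two-channel (e.g. spin and charge) 16x16 snapshots
model = CorrelatorCNN()
snapshots = torch.randint(0, 2, (8, 2, 16, 16)).float()
logits = model(snapshots)
```

Because the final layer is linear in these per-order features, inspecting its weights gives a direct handle on which correlation orders drive the classification, which is the kind of interpretability the abstract describes.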
Variational methods have proven to be excellent tools for approximating ground states of complex many-body Hamiltonians. Generic tools like neural networks are extremely powerful, but their parameters are not necessarily physically motivated; thus, an efficient parametrization of the wave function can become challenging. In this Letter we introduce a neural-network-based variational ansatz that retains the flexibility of these generic methods while allowing it to be tuned with respect to the relevant correlations governing the physics of the system. We illustrate the success of this approach on topological, long-range-correlated and frustrated models. Additionally, we introduce compatible variational optimization methods for the exploration of low-lying excited states without symmetries, which preserve the interpretability of the ansatz.
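One way such a physically tunable ansatz could look in code (a sketch under my own assumptions, not the Letter's actual parametrization; JastrowPlusNet and its hyperparameters are hypothetical) is to add an explicit two-body Jastrow-like term, whose couplings directly expose pairwise correlations, to a generic feed-forward correction.

```python
import torch
import torch.nn as nn

class JastrowPlusNet(nn.Module):
    def __init__(self, n_sites, hidden=32):
        super().__init__()
        self.jastrow = nn.Parameter(torch.zeros(n_sites, n_sites))   # explicit pairwise couplings
        self.net = nn.Sequential(                                     # generic neural correction
            nn.Linear(n_sites, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def log_amplitude(self, spins):   # spins: (batch, n_sites) with entries +-1
        jastrow = torch.einsum('bi,ij,bj->b', spins, self.jastrow, spins)
        return jastrow + self.net(spins).squeeze(-1)

# Example: evaluate log-amplitudes of random spin configurations on 10 sites
psi = JastrowPlusNet(n_sites=10)
spins = torch.randint(0, 2, (4, 10)).float() * 2 - 1
print(psi.log_amplitude(spins).shape)   # torch.Size([4])
```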
Over the past few years, machine learning has emerged as a powerful computational tool for tackling complex problems across a broad range of scientific disciplines. In particular, artificial neural networks have been successfully deployed to mitigate the exponential complexity often encountered in quantum many-body physics, the study of properties of quantum systems built out of a large number of interacting particles. In this Article, we overview some applications of machine learning in condensed matter physics and quantum information, with particular emphasis on hands-on tutorials serving as a quick start for newcomers to the field. We present supervised machine learning with convolutional neural networks to learn a phase transition, unsupervised learning with restricted Boltzmann machines to perform quantum tomography, and variational Monte Carlo with recurrent neural networks for approximating the ground state of a many-body Hamiltonian. We briefly review the key ingredients of each algorithm and their corresponding neural-network implementations, and show numerical experiments for a system of interacting Rydberg atoms in two dimensions.
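As a taste of the third of these tutorials, the following is a minimal recurrent-neural-network wavefunction that autoregressively samples spin configurations site by site; it is a simplified sketch under my own assumptions, not the Article's accompanying code, and RNNWavefunction is a hypothetical name.

```python
import torch
import torch.nn as nn

class RNNWavefunction(nn.Module):
    def __init__(self, n_sites, hidden=16):
        super().__init__()
        self.n_sites = n_sites
        self.cell = nn.GRUCell(input_size=2, hidden_size=hidden)   # one-hot spin input
        self.head = nn.Linear(hidden, 2)                            # P(spin_i | earlier spins)

    def sample(self, batch):
        h = torch.zeros(batch, self.cell.hidden_size)
        x = torch.zeros(batch, 2)                                   # "start" token
        spins, log_prob = [], torch.zeros(batch)
        for _ in range(self.n_sites):
            h = self.cell(x, h)
            probs = torch.softmax(self.head(h), dim=-1)
            s = torch.multinomial(probs, 1).squeeze(-1)             # sample 0 or 1
            log_prob = log_prob + torch.log(probs.gather(1, s[:, None]).squeeze(-1))
            x = torch.nn.functional.one_hot(s, 2).float()           # feed the sample to the next step
            spins.append(s)
        return torch.stack(spins, dim=1), log_prob

# Example: draw 5 configurations of a 12-site chain together with their log-probabilities
samples, logp = RNNWavefunction(n_sites=12).sample(batch=5)
```

In variational Monte Carlo, such exact samples and log-probabilities are then used to estimate and minimize the energy of a target Hamiltonian.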
Neural networks are a promising tool for simulating quantum many-body systems. Recently, it has been shown that neural-network-based models describe quantum many-body systems more accurately when they are constrained to have the correct symmetry properties. In this paper, we show how to create maximally expressive models for quantum states with specific symmetry properties by drawing on literature from the machine learning community. We implement group equivariant convolutional networks (G-CNNs) \cite{cohen2016group} and demonstrate that performance improvements can be achieved without increasing memory use. We show that G-CNNs achieve very good accuracy for Heisenberg quantum spin models in both ordered and spin-liquid regimes, and improve the ground-state accuracy on the triangular lattice over other variational Monte Carlo methods.
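A full G-CNN builds the symmetry into every layer; as a much cruder stand-in (my own simplification, not the paper's implementation, with TranslationAveragedState a hypothetical name), the sketch below makes a variational state exactly translation invariant by averaging its amplitudes over the translation group of a periodic spin chain.

```python
import torch
import torch.nn as nn

class TranslationAveragedState(nn.Module):
    def __init__(self, length, hidden=32):
        super().__init__()
        self.length = length
        self.net = nn.Sequential(nn.Linear(length, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def log_amplitude(self, spins):          # spins: (batch, length), entries +-1
        amps = []
        for shift in range(self.length):     # loop over the translation group
            amps.append(self.net(torch.roll(spins, shifts=shift, dims=1)))
        # average the amplitudes (not the log-amplitudes) over the group
        return torch.logsumexp(torch.cat(amps, dim=1), dim=1) - torch.log(
            torch.tensor(float(self.length)))

# Example: translation-invariant log-amplitudes for an 8-site periodic chain
state = TranslationAveragedState(length=8)
spins = torch.randint(0, 2, (3, 8)).float() * 2 - 1
print(state.log_amplitude(spins).shape)      # torch.Size([3])
```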
Gauge symmetries play a key role in physics, appearing in areas such as quantum field theories of the fundamental particles and emergent degrees of freedom in quantum materials. Motivated by the desire to efficiently simulate many-body quantum systems with exact local gauge invariance, gauge equivariant neural-network quantum states are introduced, which exactly satisfy the local Hilbert space constraints necessary for the description of quantum lattice gauge theory with $\mathbb{Z}_d$ gauge group on different geometries. Focusing on the special case of the $\mathbb{Z}_2$ gauge group on a periodically identified square lattice, the equivariant architecture is analytically shown to contain the loop-gas solution as a special case. Gauge equivariant neural-network quantum states are used in combination with variational quantum Monte Carlo to obtain compact descriptions of the ground-state wavefunction for the $\mathbb{Z}_2$ theory away from the exactly solvable limit, and to demonstrate the confining/deconfining phase transition via the Wilson loop order parameter.
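As a cartoon of how local gauge invariance can be built in (a deliberate simplification on my part, not the gauge equivariant architecture itself; PlaquetteState and the link-indexing convention are assumptions), the sketch below feeds a network only the plaquette products of $\mathbb{Z}_2$ link variables on a periodic square lattice, so its output cannot depend on any gauge-variant combination of links.

```python
import torch
import torch.nn as nn

class PlaquetteState(nn.Module):
    def __init__(self, L, hidden=32):
        super().__init__()
        self.L = L
        self.net = nn.Sequential(nn.Linear(L * L, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def log_amplitude(self, links):   # links: (batch, 2, L, L), +-1 on x- and y-directed links
        sx, sy = links[:, 0], links[:, 1]
        # product of the four links around each plaquette, with periodic boundaries
        plaq = sx * torch.roll(sy, shifts=-1, dims=2) * torch.roll(sx, shifts=-1, dims=1) * sy
        return self.net(plaq.flatten(1)).squeeze(-1)

# Example: gauge-invariant log-amplitudes for random link configurations on a 4x4 lattice
L = 4
state = PlaquetteState(L)
links = torch.randint(0, 2, (2, 2, L, L)).float() * 2 - 1
print(state.log_amplitude(links).shape)   # torch.Size([2])
```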
Gauge invariance plays a crucial role in quantum mechanics, from condensed matter physics to high-energy physics. We develop an approach to constructing gauge invariant autoregressive neural networks for quantum lattice models. These networks can be efficiently sampled and explicitly obey gauge symmetries. We variationally optimize our gauge invariant autoregressive neural networks for ground states as well as real-time dynamics for a variety of models. We exactly represent the ground and excited states of the 2D and 3D toric codes, and the X-cube fracton model. We simulate the dynamics of the quantum link model of $\text{U(1)}$ lattice gauge theory, obtain the phase diagram for the 2D $\mathbb{Z}_2$ gauge theory, determine the phase transition and the central charge of the $\text{SU(2)}_3$ anyonic chain, and also compute the ground-state energy of the $\text{SU(2)}$-invariant Heisenberg spin chain. Our approach provides powerful tools for exploring condensed matter physics, high-energy physics and quantum information science.
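The toy example below (my own illustration of the general principle, not the networks constructed in this work; ConstrainedAutoregressive is a hypothetical name) shows how autoregressive sampling can be made to produce only symmetry-obeying configurations: the last spin is fixed by the product of the preceding ones, so every sample satisfies a global $\mathbb{Z}_2$ constraint exactly while remaining efficiently and exactly samplable.

```python
import torch
import torch.nn as nn

class ConstrainedAutoregressive(nn.Module):
    def __init__(self, n_sites, hidden=16):
        super().__init__()
        self.n_sites = n_sites
        self.cell = nn.GRUCell(1, hidden)
        self.head = nn.Linear(hidden, 1)          # logit of P(spin_i = +1 | earlier spins)

    def sample(self, batch):
        h = torch.zeros(batch, self.cell.hidden_size)
        x = torch.zeros(batch, 1)
        spins = []
        for _ in range(self.n_sites - 1):          # all but the last spin are sampled freely
            h = self.cell(x, h)
            p_up = torch.sigmoid(self.head(h)).squeeze(-1)
            s = torch.where(torch.rand(batch) < p_up, torch.ones(batch), -torch.ones(batch))
            spins.append(s)
            x = s[:, None]
        # the last spin is fixed so that the product of all spins equals +1
        spins.append(torch.stack(spins, dim=1).prod(dim=1))
        return torch.stack(spins, dim=1)

# Example: every sampled configuration obeys the constraint by construction
samples = ConstrainedAutoregressive(n_sites=8).sample(batch=4)
print(samples.prod(dim=1))   # all entries are +1
```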