
Tensor networks and efficient descriptions of classical data

Added by Sirui Lu
Publication date: 2021
Field: Physics
Language: English





We investigate the potential of tensor-network-based machine learning methods to scale to large image and text data sets. To that end, we study how the mutual information between a subregion and its complement scales with the subsystem size $L$, in analogy with such studies in quantum many-body physics. We find that for text, the mutual information scales as a power law $L^\nu$ with an exponent close to a volume law, indicating that text cannot be efficiently described by 1D tensor networks. For images, the scaling is close to an area law, hinting that 2D tensor networks such as PEPS could have adequate expressibility. For the numerical analysis, we introduce a mutual information estimator based on autoregressive networks, and we also use convolutional neural networks in a neural estimation method.
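
The estimator can be summarized in a few lines. Below is a minimal sketch, assuming one already has per-sample negative log-likelihoods from three separately trained autoregressive models (on a subregion A, on its complement B, and on the joint data AB); all array names and numbers are placeholders for illustration, not results from the paper, and the model cross-entropies only approximate the true entropies.

```python
import numpy as np

def mutual_information(nll_A, nll_B, nll_AB):
    """Estimate I(A:B) = H(A) + H(B) - H(AB) (in nats) from average
    negative log-likelihoods of three autoregressive models."""
    return np.mean(nll_A) + np.mean(nll_B) - np.mean(nll_AB)

# Placeholder: one MI estimate per subsystem size L (illustrative numbers only).
subsystem_sizes = np.array([4, 8, 16, 32, 64])
mi_estimates = np.array([0.9, 1.7, 3.2, 6.1, 11.8])

# Fit I(L) ~ L^nu on a log-log scale; the slope of the fit is the exponent nu.
nu, log_prefactor = np.polyfit(np.log(subsystem_sizes), np.log(mi_estimates), 1)
print(f"fitted exponent nu = {nu:.2f}")
```
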

Related research

Tensor network states provide an efficient class of states that faithfully capture strongly correlated quantum models and systems in classical statistical mechanics. While tensor networks are becoming standard tools in the description of such complex many-body systems, close-to-optimal variational principles based on such states are less obvious to come by. In this work, we generalize a recently proposed variational uniform matrix product state algorithm for capturing one-dimensional quantum lattices in the thermodynamic limit, to the study of regular two-dimensional tensor networks with a non-trivial unit cell. A key property of the algorithm is a computational effort that scales linearly rather than exponentially in the size of the unit cell. We demonstrate the performance of our approach on the computation of the classical partition functions of the antiferromagnetic Ising model and interacting dimers on the square lattice, as well as of a quantum doped resonating valence bond state.
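
As a concrete reference point for what such networks contract, here is a minimal sketch (not the variational algorithm of the abstract above) that writes the partition function of the 2D classical Ising model as a network of rank-4 tensors and contracts a small periodic lattice exactly, checking against a brute-force sum; the lattice size, inverse temperature, and variable names are our own choices.

```python
import numpy as np
from itertools import product

beta = 0.4
spins = [1, -1]

# Boltzmann weight on a bond and its symmetric square root (W @ W.T == M).
M = np.array([[np.exp(beta * s * t) for t in spins] for s in spins])
evals, evecs = np.linalg.eigh(M)
W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

# Local tensor T[up, right, down, left]: split every bond weight between its two sites.
T = np.einsum('su,sr,sd,sl->urdl', W, W, W, W)

Lx = Ly = 3                                            # 3x3 torus
h = {(i, j): i * Lx + j for i in range(Ly) for j in range(Lx)}            # horizontal bonds
v = {(i, j): Lx * Ly + i * Lx + j for i in range(Ly) for j in range(Lx)}  # vertical bonds

operands = []
for i in range(Ly):
    for j in range(Lx):
        up, right = v[(i - 1) % Ly, j], h[i, j]
        down, left = v[i, j], h[i, (j - 1) % Lx]
        operands += [T, [up, right, down, left]]
Z_tn = np.einsum(*operands, [], optimize='greedy')

# Brute-force check: sum over all spin configurations of the 3x3 torus.
Z_bf = 0.0
for config in product(spins, repeat=Lx * Ly):
    s = np.array(config).reshape(Ly, Lx)
    E = sum(s[i, j] * (s[i, (j + 1) % Lx] + s[(i + 1) % Ly, j])
            for i in range(Ly) for j in range(Lx))
    Z_bf += np.exp(beta * E)

print(Z_tn, Z_bf)   # the two values agree
```
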
Cluster expansions for the exponential of local operators are constructed using tensor networks. In contrast to other approaches, the cluster expansion does not break any spatial or internal symmetries and exhibits a very favourable prefactor to the error scaling versus bond dimension. This is illustrated by time evolving a matrix product state using very large time steps, and by constructing a novel robust algorithm for finding ground states of 2-dimensional Hamiltonians using projected entangled pair states as fixed points of 2-dimensional transfer matrices.
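
For orientation, the sketch below contrasts a plain first-order Trotter splitting of the propagator with the exact exponential on a small, exactly diagonalizable chain; this is the kind of baseline such cluster expansions improve upon, not the expansion itself, and the model, system size, and step sizes are arbitrary illustrative choices.

```python
import numpy as np
from functools import reduce
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def two_site_term(n_sites, j):
    """Heisenberg coupling on bond (j, j+1), embedded in the full chain."""
    term = np.zeros((2**n_sites, 2**n_sites), dtype=complex)
    for P in (X, Y, Z):
        ops = [I2] * n_sites
        ops[j], ops[j + 1] = P, P
        term += reduce(np.kron, ops)
    return term

n = 4
terms = [two_site_term(n, j) for j in range(n - 1)]
H = sum(terms)

for dt in (0.05, 0.2, 0.5):
    U_exact = expm(-1j * H * dt)
    U_trotter = reduce(np.dot, [expm(-1j * h * dt) for h in terms])
    err = np.linalg.norm(U_exact - U_trotter, 2)   # error grows ~ dt^2 per step
    print(f"dt = {dt:4.2f}   operator-norm error = {err:.3e}")
```
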
We characterize the variational power of quantum circuit tensor networks in the representation of physical many-body ground-states. Such tensor networks are formed by replacing the dense block unitaries and isometries in standard tensor networks by local quantum circuits. We explore both quantum circuit matrix product states and the quantum circuit multi-scale entanglement renormalization ansatz, and introduce an adaptive method to optimize the resulting circuits to high fidelity with more than $10^4$ parameters. We benchmark their expressiveness against standard tensor networks, as well as other common circuit architectures, for both the energy and correlation functions of the 1D Heisenberg and Fermi-Hubbard models in the gapless regime. We find quantum circuit tensor networks to be substantially more expressive than other quantum circuits for these problems, and that they can even be more compact than standard tensor networks. Extrapolating to circuit depths which can no longer be emulated classically, this suggests a region of quantum advantage with respect to expressiveness in the representation of physical ground-states.
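
A minimal illustration of the idea, under our own assumptions rather than the exact construction of the abstract: a staircase of two-qubit gates applied to the all-zero state prepares a bond-dimension-2 matrix product state, the simplest quantum circuit MPS. The sketch evaluates the Heisenberg energy of such a random (unoptimized) circuit state by brute-force statevector simulation on a few qubits; the optimization loop is omitted.

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)
n = 6                                     # number of qubits

def random_unitary(dim):
    """Random unitary from the QR decomposition of a complex Gaussian matrix."""
    A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    Q, R = np.linalg.qr(A)
    return Q * (np.diag(R) / np.abs(np.diag(R)))

def apply_two_qubit_gate(psi, U, q):
    """Apply a 4x4 gate U to qubits (q, q+1) of a statevector of shape (2,)*n."""
    U4 = U.reshape(2, 2, 2, 2)
    psi = np.tensordot(U4, psi, axes=[[2, 3], [q, q + 1]])
    return np.moveaxis(psi, [0, 1], [q, q + 1])

# Staircase circuit: one two-qubit gate per neighbouring pair, applied left to right.
psi = np.zeros((2,) * n, dtype=complex)
psi[(0,) * n] = 1.0
for q in range(n - 1):
    psi = apply_two_qubit_gate(psi, random_unitary(4), q)

# Heisenberg Hamiltonian (open boundary) as a dense matrix, for evaluation only.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
H = sum(reduce(np.kron, [P if k in (j, j + 1) else I2 for k in range(n)])
        for j in range(n - 1) for P in (X, Y, Z))

vec = psi.reshape(-1)
energy = np.real(vec.conj() @ H @ vec)
print(f"<H> for a random staircase circuit state: {energy:.4f}")
```
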
Glen Evenbly (2019)
Tensor network methods are a conceptually elegant framework for encoding complicated datasets, where high-order tensors are approximated as networks of low-order tensors. In practice, however, the numerical implementation of tensor network algorithms is often a labor-intensive and error-prone task, even for experienced researchers in this area. TensorTrace is an application designed to alleviate the burden of contracting tensor networks: it provides a graphical drawing interface specifically tailored for the construction of tensor network diagrams, from which the code for their optimal contraction can then be automatically generated (in the user's choice of MATLAB, Python, or Julia). TensorTrace is freely available at https://www.tensortrace.com.
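
For context, the contraction code such a tool emits can be as simple as a single einsum call. The sketch below contracts the norm of a small three-tensor MPS with open boundaries; the diagram, tensor names, and index labels are ours and are not TensorTrace output.

```python
import numpy as np

chi, d = 4, 2                                  # bond and physical dimensions
rng = np.random.default_rng(1)
A1 = rng.normal(size=(d, chi))                 # (physical, right)
A2 = rng.normal(size=(chi, d, chi))            # (left, physical, right)
A3 = rng.normal(size=(chi, d))                 # (left, physical)

# <psi|psi> as one contraction over the ket tensors and their conjugates.
norm = np.einsum('pa,aqb,br,pc,cqd,dr->',
                 A1, A2, A3, A1.conj(), A2.conj(), A3.conj(),
                 optimize='optimal')

# Cross-check against the explicit state vector.
psi = np.einsum('pa,aqb,br->pqr', A1, A2, A3)
print(norm, np.vdot(psi, psi).real)            # the two numbers agree
```
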
We study the realization of anyon-permuting symmetries of topological phases on the lattice using tensor networks. Working on the virtual level of a projected entangled pair state, we find matrix product operators (MPOs) that realize all unitary topological symmetries for the toric and color codes. These operators act as domain walls that enact the symmetry transformation on anyons as they cross. By considering open boundary conditions for these domain wall MPOs, we show how to introduce symmetry twists and defect lines into the state.
