Cluster expansions for the exponential of local operators are constructed using tensor networks. In contrast to other approaches, the cluster expansion does not break any spatial or internal symmetries and exhibits a very favourable prefactor to the error scaling versus bond dimension. This is illustrated by time evolving a matrix product state using very large time steps, and by constructing a novel robust algorithm for finding ground states of 2-dimensional Hamiltonians using projected entangled pair states as fixed points of 2-dimensional transfer matrices.
We introduce a graphical user interface for constructing arbitrary tensor networks and specifying common operations like contractions or splitting, denoted GuiTeNet. Tensors are represented as nodes with attached legs, corresponding to the ordered di
In many cases, neural networks can be mapped onto tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural network states with their mapped tensor network counterparts for studying the ground state
Tensor network methods are a conceptually elegant framework for encoding complicated datasets, where high-order tensors are approximated as networks of low-order tensors. In practice, however, the numeric implementation of tensor network algorithms i
We study the realization of anyon-permuting symmetries of topological phases on the lattice using tensor networks. Working on the virtual level of a projected entangled pair state, we find matrix product operators (MPOs) that realize all unitary topo
We investigate the potential of tensor network based machine learning methods to scale to large image and text data sets. For that, we study how the mutual information between a subregion and its complement scales with the subsystem size $L$, similar