
Symmetric cluster expansions with tensor networks

Posted by Bram Vanhecke
Published: 2019
Research field: Physics
Language: English





Cluster expansions for the exponential of local operators are constructed using tensor networks. In contrast to other approaches, the cluster expansion does not break any spatial or internal symmetries and exhibits a very favourable prefactor to the error scaling versus bond dimension. This is illustrated by time evolving a matrix product state using very large time steps, and by constructing a novel robust algorithm for finding ground states of 2-dimensional Hamiltonians using projected entangled pair states as fixed points of 2-dimensional transfer matrices.
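At its core, the paper's approach refines the familiar idea of truncating a series expansion of the exponential of an operator. As a rough, hypothetical illustration of the error-scaling question only (this is a plain Taylor truncation on a random two-site operator, not the paper's symmetric tensor-network construction), one can check numerically that a second-order truncation of exp(τh) carries an error of order τ³, so halving the step reduces the error roughly eightfold:

```python
# Hypothetical sketch, NOT the paper's cluster expansion: compare a
# 2nd-order Taylor truncation of exp(tau*h) with the exact exponential
# of a random Hermitian "two-site" operator h, for two step sizes.
import numpy as np

rng = np.random.default_rng(0)
h = rng.standard_normal((4, 4))
h = (h + h.T) / 2  # Hermitian operator acting on two qubits

def exact_exp(h, tau):
    # Exact matrix exponential via eigendecomposition (h is Hermitian).
    w, v = np.linalg.eigh(h)
    return v @ np.diag(np.exp(tau * w)) @ v.T

def taylor_exp(h, tau, order=2):
    # Truncated Taylor series: sum_{k<=order} (tau*h)^k / k!
    result = np.eye(h.shape[0])
    term = np.eye(h.shape[0])
    for k in range(1, order + 1):
        term = term @ (tau * h) / k
        result = result + term
    return result

errs = [np.linalg.norm(exact_exp(h, tau) - taylor_exp(h, tau))
        for tau in (0.1, 0.05)]
print(errs)  # halving tau shrinks the 2nd-order error by roughly 8x
```

The abstract's claim of a "very favourable prefactor" concerns exactly the constant multiplying this power-law error, which is what allows the very large time steps mentioned above.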




Read also

We introduce a graphical user interface for constructing arbitrary tensor networks and specifying common operations like contractions or splitting, denoted GuiTeNet. Tensors are represented as nodes with attached legs, corresponding to the ordered dimensions of the tensor. GuiTeNet visualizes the current network and instantly generates Python/NumPy source code for the sequence of user actions performed so far. Support for additional programming languages is planned for the future. We discuss the elementary operations on tensor networks used by GuiTeNet, together with high-level optimization strategies. The software runs directly in web browsers and is available online at http://guitenet.org.
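For context, the Python/NumPy code such a tool emits for a contraction boils down to an `einsum` call over the shared legs. A hypothetical example (the tensor names and shapes here are invented for illustration, not actual GuiTeNet output):

```python
# Contract a small three-tensor network: A and B share leg 'a',
# B and C share leg 'b'; open legs (i, j, k) remain in order.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))      # legs: (i, a)
B = rng.standard_normal((3, 3, 4))   # legs: (a, b, j)
C = rng.standard_normal((3, 2))      # legs: (b, k)

T = np.einsum('ia,abj,bk->ijk', A, B, C)
print(T.shape)  # (2, 4, 2)
```

The point of a GUI layer is that the index string `'ia,abj,bk->ijk'` and the leg ordering are derived automatically from the drawn diagram rather than written by hand.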
In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural network states with their mapped tensor network counterparts for studying the ground state of short-range Hamiltonians. We show that when mapping a neural network, the resulting tensor network is highly constrained, and thus neural network states do not, in general, deliver the naively expected drastic improvement over state-of-the-art tensor network methods. We explicitly show this result in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model, addressing the lack of a detailed comparison of the expressiveness of these increasingly popular variational ansätze.
Glen Evenbly (2019)
Tensor network methods are a conceptually elegant framework for encoding complicated datasets, where high-order tensors are approximated as networks of low-order tensors. In practice, however, the numeric implementation of tensor network algorithms is often a labor-intensive and error-prone task, even for experienced researchers in this area. TensorTrace is an application designed to alleviate the burden of contracting tensor networks: it provides a graphic drawing interface specifically tailored for the construction of tensor network diagrams, from which the code for their optimal contraction can then be automatically generated (in the user's choice of the MATLAB, Python or Julia languages). TensorTrace is freely available at https://www.tensortrace.com.
We study the realization of anyon-permuting symmetries of topological phases on the lattice using tensor networks. Working on the virtual level of a projected entangled pair state, we find matrix product operators (MPOs) that realize all unitary topological symmetries for the toric and color codes. These operators act as domain walls that enact the symmetry transformation on anyons as they cross. By considering open boundary conditions for these domain wall MPOs, we show how to introduce symmetry twists and defect lines into the state.
We investigate the potential of tensor network based machine learning methods to scale to large image and text data sets. For that, we study how the mutual information between a subregion and its complement scales with the subsystem size $L$, similarly to how it is done in quantum many-body physics. We find that for text, the mutual information scales as a power law $L^\nu$ with a close to volume-law exponent, indicating that text cannot be efficiently described by 1D tensor networks. For images, the scaling is close to an area law, hinting that 2D tensor networks such as PEPS could have adequate expressibility. For the numerical analysis, we introduce a mutual information estimator based on autoregressive networks, and we also use convolutional neural networks in a neural estimator method.
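The quantity being scaled in that abstract is the classical mutual information $I(A;B) = \sum_{a,b} p(a,b)\log\frac{p(a,b)}{p(a)\,p(b)}$. As an illustrative toy computation only (a hand-picked joint distribution, not the paper's autoregressive estimator):

```python
# Mutual information of a small joint distribution p(a, b) in nats:
# I(A;B) = sum_{a,b} p(a,b) * log[ p(a,b) / (p(a) * p(b)) ]
import numpy as np

p = np.array([[0.4, 0.1],
              [0.1, 0.4]])   # joint distribution p(a, b), sums to 1
pa = p.sum(axis=1)           # marginal p(a)
pb = p.sum(axis=0)           # marginal p(b)
mi = np.sum(p * np.log(p / np.outer(pa, pb)))
print(f"I(A;B) = {mi:.4f} nats")
```

For two binary variables, $I(A;B)$ is bounded by $\log 2 \approx 0.693$ nats; how this number grows as the subregion $A$ grows is what distinguishes area-law from volume-law scaling.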