We design generative neural networks that produce Monte Carlo configurations free of autocorrelation, from which physical observables can be measured directly, whether the system sits at a classical critical point, in a fermionic Mott insulator or Dirac semimetal phase, or at a quantum critical point. We further propose a generic parallel-chain Monte Carlo scheme based on such networks, which provides independent samples and accelerates the simulations by shortening the thermalization process. We demonstrate the performance of our approach on the two-dimensional Ising and fermion Hubbard models.
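As a rough illustration of how configurations drawn from a generative network can be used without introducing autocorrelation, the sketch below runs an independence-Metropolis chain for the two-dimensional Ising model. The `FactorizedSampler`, its parameters, and all function names are illustrative stand-ins, not the networks or interface used in the paper; any trained generative model with a tractable likelihood could play the same role.

```python
import numpy as np

def ising_energy(spins, J=1.0):
    """Nearest-neighbour energy of a 2D Ising configuration with periodic boundaries."""
    return -J * np.sum(spins * (np.roll(spins, 1, axis=0) + np.roll(spins, 1, axis=1)))

class FactorizedSampler:
    """Stand-in generative model: independent Bernoulli spins with fixed probabilities.
    A real implementation would use a trained autoregressive or flow-based network."""
    def __init__(self, L, p_up=0.5):
        self.p = np.full((L, L), p_up)

    def sample(self, rng):
        return np.where(rng.random(self.p.shape) < self.p, 1, -1)

    def log_prob(self, spins):
        up = (spins + 1) / 2
        return np.sum(up * np.log(self.p) + (1 - up) * np.log(1 - self.p))

def neural_metropolis_chain(model, beta, n_steps, seed=0):
    """Independence Metropolis: every proposal is a fresh draw from the generative model,
    accepted with the ratio that keeps the Boltzmann distribution stationary."""
    rng = np.random.default_rng(seed)
    current = model.sample(rng)
    log_w = -beta * ising_energy(current) - model.log_prob(current)
    samples = []
    for _ in range(n_steps):
        prop = model.sample(rng)
        log_w_prop = -beta * ising_energy(prop) - model.log_prob(prop)
        if np.log(rng.random()) < log_w_prop - log_w:  # Metropolis-Hastings acceptance
            current, log_w = prop, log_w_prop
        samples.append(current.copy())
    return samples

if __name__ == "__main__":
    L, beta = 8, 0.4
    model = FactorizedSampler(L)
    samples = neural_metropolis_chain(model, beta, n_steps=1000)
    print("mean |magnetization|:", np.mean([abs(s.mean()) for s in samples]))
```

Because each proposal is generated from scratch rather than by a local update, accepted configurations are decorrelated, and several such chains can be launched in parallel from independent network draws, which is the idea behind the parallel-chain scheme described in the abstract.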