We propose a minimal generalization of the celebrated Markov-chain Monte Carlo algorithm which allows an arbitrary number of configurations to be visited at every Monte Carlo step. This is advantageous when a parallel computing machine is available, or when many biased configurations can be evaluated at little additional computational cost. As an example of the former case, we report a significant reduction of the thermalization time for the paradigmatic Sherrington-Kirkpatrick spin-glass model. For the latter case, we show that, by leveraging the exponential number of biased configurations automatically computed by Diagrammatic Monte Carlo, we can speed up computations in the Fermi-Hubbard model by two orders of magnitude.
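As a hedged illustration of the multi-proposal idea, the sketch below implements a standard multiple-try Metropolis step, in which several candidate configurations are weighted at once and one is selected per Monte Carlo step; the paper's exact acceptance rule may differ, and the target log_pi, the proposal width step, and the candidate count n_prop are placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):
    """Log of an unnormalized target density (standard Gaussian here)."""
    return -0.5 * x**2

def mtm_step(x, n_prop=8, step=1.0):
    """One multiple-try Metropolis step with a symmetric Gaussian proposal.

    All n_prop candidate evaluations are independent, so they could be
    dispatched to parallel workers at every Monte Carlo step.
    """
    # Propose n_prop candidate configurations around the current state.
    ys = x + step * rng.standard_normal(n_prop)
    wy = np.exp(log_pi(ys))
    # Select one candidate with probability proportional to its weight.
    y = rng.choice(ys, p=wy / wy.sum())
    # Reference set: n_prop - 1 fresh draws around y, plus the current state.
    xs = np.append(y + step * rng.standard_normal(n_prop - 1), x)
    wx = np.exp(log_pi(xs))
    # Generalized acceptance ratio; valid because the proposal is symmetric.
    return y if rng.random() < min(1.0, wy.sum() / wx.sum()) else x

chain = [0.0]
for _ in range(5000):
    chain.append(mtm_step(chain[-1]))
print("mean %.3f  variance %.3f" % (np.mean(chain), np.var(chain)))
```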
Differentiable programming has emerged as a key programming paradigm empowering rapid developments of deep learning, yet its applications to important computational methods such as Monte Carlo remain largely unexplored. Here we present the general theory enabling automatic differentiation of expectations computed by Monte Carlo with unnormalized probability distributions.
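As a minimal sketch of what differentiating a Monte Carlo estimator can mean, the snippet below applies the generic score-function (REINFORCE) identity to a toy Gaussian expectation; the model, observable, and sample size are illustrative assumptions, not the formulation of the abstract above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Differentiate a Monte Carlo expectation E_{x ~ N(theta,1)}[x^2] w.r.t. theta
# using the score-function identity
#   d/dtheta E[f(x)] = E[f(x) * d/dtheta log p_theta(x)].
# Exact reference: E[x^2] = theta^2 + 1, so the gradient is 2 * theta.
theta = 1.5
x = theta + rng.standard_normal(200_000)  # draws from p_theta = N(theta, 1)
score = x - theta                         # d/dtheta log N(x; theta, 1)

grad_estimate = np.mean(x**2 * score)     # unbiased; a baseline would cut variance
print("estimated %.3f  exact %.3f" % (grad_estimate, 2 * theta))
```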
Quantum Monte Carlo is among the most accurate simulation techniques for quantum many-particle systems. However, for fermions these simulations are hampered by the sign problem, which prohibits simulations in the regime of strong degeneracy.
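The practical face of the sign problem is the reweighting identity <A> = <A*s> / <s>, where both averages are taken over configurations sampled from the absolute weights and s is the sign of each weight: the estimate stays correct, but its statistical error grows like 1/<s> as the average sign decays toward zero. The toy below uses artificial signs and an artificial observable purely to illustrate the identity.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sign reweighting on fake data: sample configurations from |w|, carry the
# sign s = w/|w| along, and divide it out at the end.  In a real fermionic
# simulation the signs are determined by the configurations themselves.
n = 100_000
x = rng.standard_normal(n)                      # configurations drawn from |w|
A = x**2                                        # some observable, here <A> = 1
s = np.where(rng.random(n) < 0.75, 1.0, -1.0)   # hypothetical signs

avg_sign = s.mean()
estimate = (A * s).mean() / avg_sign
print("average sign %.3f  sign-reweighted <A> %.3f" % (avg_sign, estimate))
```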
An important task in machine learning and statistics is the approximation of a probability measure by an empirical measure supported on a discrete point set. Stein Points are a class of algorithms for this task, which proceed by sequentially minimising a Stein discrepancy between the empirical measure and the target measure.
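A minimal sketch of the greedy construction, assuming a one-dimensional standard Gaussian target, a Gaussian base kernel, and the Langevin Stein operator; a grid search over candidates stands in for the non-convex optimisation each step requires.

```python
import numpy as np

def stein_kernel(x, y, score, ell=1.0):
    """Stein (Langevin) kernel k0 built from a Gaussian base kernel k.

    k0(x,y) = dxdy k + s(x) dy k + s(y) dx k + s(x) s(y) k,
    where s is the score (gradient of the log target density).
    """
    d = x - y
    k = np.exp(-0.5 * d**2 / ell**2)
    dxk = -d / ell**2 * k
    dyk = d / ell**2 * k
    dxdyk = (1.0 / ell**2 - d**2 / ell**4) * k
    return dxdyk + score(x) * dyk + score(y) * dxk + score(x) * score(y) * k

score = lambda x: -x                  # score of a standard Gaussian target
cand = np.linspace(-4, 4, 801)        # candidate grid replacing the optimiser
points = []
for _ in range(20):
    # Greedy step: add the candidate that most reduces the kernel Stein
    # discrepancy of the current point set.
    obj = 0.5 * stein_kernel(cand, cand, score)
    for p in points:
        obj += stein_kernel(np.full_like(cand, p), cand, score)
    points.append(cand[np.argmin(obj)])
print(np.sort(np.round(points, 2)))
```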
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers. Like related methods, iPMCMC is a Markov chain Monte Carlo sampler on an extended space.
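The distinguishing ingredient of iPMCMC is the interaction between workers. The sketch below shows only that node-switching step, with the expensive SMC and conditional SMC sweeps replaced by hypothetical marginal-likelihood estimates z_hat: each conditional node index is redrawn among itself and the currently unconditional nodes, with probability proportional to those estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

def switch_conditional_nodes(z_hat, cond):
    """One iPMCMC node-switching step (sketch).

    z_hat : marginal-likelihood estimates from the M samplers (standard SMC
            for unconditional nodes, conditional SMC for conditional ones).
    cond  : indices of the P conditional nodes.

    Each conditional index is redrawn among itself and the unconditional
    nodes; this lets retained particle lineages move between workers.
    """
    cond = list(cond)
    for j in range(len(cond)):
        pool = [m for m in range(len(z_hat)) if m not in cond or m == cond[j]]
        w = np.array([z_hat[m] for m in pool])
        cond[j] = pool[rng.choice(len(pool), p=w / w.sum())]
    return cond

# Hypothetical estimates from M = 8 workers with P = 2 conditional nodes.
z_hat = rng.lognormal(mean=0.0, sigma=1.0, size=8)
print(switch_conditional_nodes(z_hat, cond=[0, 1]))
```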
A novel class of non-reversible Markov chain Monte Carlo schemes relying on continuous-time piecewise-deterministic Markov processes has recently emerged. In these algorithms, the state of the Markov process evolves according to a deterministic dynamics which is modified using a Markov transition kernel at random event times.
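A minimal sketch of one such scheme, assuming a one-dimensional Zig-Zag process targeting a standard Gaussian; for this target the integrated event rate can be inverted in closed form, so no thinning is needed.

```python
import numpy as np

rng = np.random.default_rng(4)

# One-dimensional Zig-Zag sampler for a standard Gaussian target: the state
# moves ballistically with velocity v = +/-1 and the velocity flips at random
# event times with rate lambda(t) = max(0, v * x(t)), the positive part of
# v * dU/dx for the potential U(x) = x^2 / 2.
x, v = 0.0, 1.0
T, t = 10_000.0, 0.0
samples, next_sample = [], 1.0   # record the position once per unit time

while t < T:
    # Along the flow, v * x(s) = a + s with a = v * x, so the first event
    # time comes from inverting the integrated rate against an Exp(1) draw.
    a = v * x
    tau = np.sqrt(max(a, 0.0) ** 2 + 2.0 * rng.exponential()) - a
    while next_sample < t + tau and next_sample < T:
        samples.append(x + v * (next_sample - t))   # position at sample time
        next_sample += 1.0
    x, v, t = x + v * tau, -v, t + tau              # move, then flip velocity

samples = np.array(samples)
print("mean %.3f  variance %.3f" % (samples.mean(), samples.var()))
```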