Geometrically frustrated spin-chain compounds such as Ca3Co2O6 exhibit extremely slow relaxation under a changing magnetic field. Consequently, both low-temperature laboratory experiments and Monte Carlo simulations have shown peculiar out-of-equilibrium magnetization curves, which arise from trapping in metastable configurations. In this work we simulate this phenomenon in a superconducting quantum annealing processor, allowing us to probe the impact of quantum fluctuations on both equilibrium and dynamics of the system. Increasing the quantum fluctuations with a transverse field reduces the impact of metastable traps in out-of-equilibrium samples, and aids the development of three-sublattice ferrimagnetic (up-up-down) long-range order. At equilibrium we identify a finite-temperature shoulder in the 1/3-to-saturated phase transition, promoted by quantum fluctuations but with entropic origin. This work demonstrates the viability of dynamical as well as equilibrium studies of frustrated magnetism using large-scale programmable quantum systems, and is therefore an important step toward programmable simulation of dynamics in materials using quantum hardware.
Optimization or sampling of arbitrary pairwise Ising models, in a quantum annealing protocol of constrained interaction topology, can be enabled by a minor-embedding procedure. The logical problem of interest is transformed to a physical (device programmable) problem, where one binary variable is represented by a logical qubit consisting of multiple physical qubits. In this paper we discuss tuning of this transformation for the cases of clique, biclique, and cubic lattice problems on the D-Wave 2000Q quantum computer. We demonstrate parameter tuning protocols in spin glasses and channel communication problems, focusing on anneal duration, chain strength, and mapping from the result on physical qubits back to the logical space. Inhomogeneities in effective coupling strength arising from minor-embedding are shown to be mitigated by an efficient reweighting of programmed couplings, accounting for logical qubit topology.
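Two of the ingredients above can be illustrated in a few lines: mapping a physical-qubit sample back to logical variables by majority vote over each chain, and reweighting a logical coupling over the physical edges that realize it. This is a minimal sketch under assumed data structures (dicts of chains and edges), not the tuned protocol or a real device API:

```python
import random

random.seed(0)

def unembed(sample, chains):
    """Map a physical-qubit sample back to logical variables by majority
    vote over each chain; ties on even-length chains are broken randomly.
    (Function names and signatures here are illustrative assumptions.)"""
    logical = {}
    for var, qubits in chains.items():
        total = sum(sample[q] for q in qubits)
        logical[var] = (1 if total > 0 else -1) if total else random.choice([-1, 1])
    return logical

def split_coupling(J_ij, physical_edges):
    """Toy reweighting: divide a logical coupling uniformly over the
    physical edges realizing it, so chain pairs connected by many edges
    do not receive an inflated effective coupling."""
    w = J_ij / len(physical_edges)
    return {edge: w for edge in physical_edges}

# Two logical variables, embedded as chains of 3 and 2 physical qubits
chains = {"a": [0, 1, 2], "b": [3, 4]}
sample = {0: +1, 1: +1, 2: -1, 3: -1, 4: -1}
logical = unembed(sample, chains)        # {'a': 1, 'b': -1}
```

The paper's reweighting accounts for logical qubit topology more carefully; the uniform split above is only the simplest instance of the idea.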
The promise of quantum computing lies in harnessing programmable quantum devices for practical applications such as efficient simulation of quantum materials and condensed matter systems. One important task is the simulation of geometrically frustrated magnets in which topological phenomena can emerge from competition between quantum and thermal fluctuations. Here we report on experimental observations of relaxation in such simulations, measured on up to 1440 qubits with microsecond resolution. By initializing the system in a state with topological obstruction, we observe quantum annealing (QA) relaxation timescales in excess of one microsecond. Measurements indicate a dynamical advantage in the quantum simulation over the classical approach of path-integral Monte Carlo (PIMC) fixed-Hamiltonian relaxation with multiqubit cluster updates. The advantage increases with both system size and inverse temperature, exceeding a million-fold speedup over a CPU. This is an important piece of experimental evidence that in general, PIMC does not mimic QA dynamics for stoquastic Hamiltonians. The observed scaling advantage, for simulation of frustrated magnetism in quantum condensed matter, demonstrates that near-term quantum devices can be used to accelerate computational tasks of practical relevance.
We investigate the computational hardness of spin-glass instances on a square lattice, generated via a recently introduced tunable and scalable approach for planting solutions. The method relies on partitioning the problem graph into edge-disjoint subgraphs, and planting frustrated, elementary subproblems that share a common local ground state, which guarantees that the ground state of the entire problem is known a priori. Using population annealing Monte Carlo, we compare the typical hardness of problem classes over a large region of the multi-dimensional tuning parameter space. Our results show that the problems have a wide range of tunable hardness. Moreover, we observe multiple transitions in the hardness phase space, which we further corroborate using simulated annealing and simulated quantum annealing. By investigating thermodynamic properties of these planted systems, we demonstrate that the harder samples undergo magnetic ordering transitions which are also ultimately responsible for the observed hardness transitions on changing the sample composition.
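Population annealing Monte Carlo, used above as the reference solver, maintains a population of replicas that is resampled by Boltzmann weight at each annealing step and then equilibrated with Metropolis sweeps. A minimal sketch on a toy ferromagnetic chain (system, schedule, and population size are assumptions for illustration, not the paper's setup):

```python
import math
import random

random.seed(0)

N = 8                                     # spins in an open ferromagnetic chain
EDGES = [(i, i + 1) for i in range(N - 1)]
NBRS = {i: [] for i in range(N)}
for i, j in EDGES:
    NBRS[i].append(j)
    NBRS[j].append(i)

def energy(s):
    # H = -sum_<ij> s_i s_j ; the two aligned configurations reach E = -(N - 1)
    return -sum(s[i] * s[j] for i, j in EDGES)

def sweep(s, beta):
    # One Metropolis sweep over all spins
    for i in range(N):
        dE = 2 * s[i] * sum(s[j] for j in NBRS[i])   # energy cost of flipping i
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            s[i] = -s[i]

def population_annealing(pop_size=100, n_levels=10, beta_max=2.0, sweeps=5):
    betas = [beta_max * k / n_levels for k in range(n_levels + 1)]
    pop = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(pop_size)]
    for b_prev, b in zip(betas, betas[1:]):
        # Resample replicas in proportion to exp(-(b - b_prev) * E),
        # then equilibrate each survivor at the new inverse temperature
        weights = [math.exp(-(b - b_prev) * energy(s)) for s in pop]
        pop = [list(c) for c in random.choices(pop, weights=weights, k=pop_size)]
        for s in pop:
            for _ in range(sweeps):
                sweep(s, b)
    return pop

pop = population_annealing()
best = min(energy(s) for s in pop)        # ground-state energy is -(N - 1)
```

In practice the resampling step would use stratified schemes and the free energy would be accumulated from the weight normalizations; the version above keeps only the core resample-and-sweep loop.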
We propose the Wishart planted ensemble, a class of zero-field Ising models with tunable algorithmic hardness and specifiable (or planted) ground state. The problem class arises from a simple procedure for generating a family of random integer programming problems with specific statistical symmetry properties, but turns out to have intimate connections to a sign-inverted variant of the Hopfield model. The Hamiltonian contains only 2-spin interactions, with the coupler matrix following a type of Wishart distribution. The class exhibits a classical first-order phase transition in temperature. For some parameter settings the model has a locally-stable paramagnetic state, a feature which correlates strongly with difficulty in finding the ground state and suggests an extremely rugged energy landscape. We analytically probe the ensemble thermodynamic properties by deriving the Thouless-Anderson-Palmer equations and free energy and corroborate the results with a replica and annealed approximation analysis; extensive Monte Carlo simulations confirm our predictions of the first-order transition temperature. The class exhibits a wide variation in algorithmic hardness as a generation parameter is varied, with a pronounced easy-hard-easy profile and peak in solution time towering many orders of magnitude over that of the easy regimes. By deriving the ensemble-averaged energy distribution and taking into account finite-precision representation, we propose an analytical expression for the location of the hardness peak and show that at fixed precision, the number of constraints in the integer program must increase with system size to yield truly hard problems. The Wishart planted ensemble is interesting for its peculiar physical properties and provides a useful and analytically-transparent set of problems for benchmarking optimization algorithms.
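The planting mechanism admits a compact sketch: draw Gaussian columns, project each orthogonal to the planted spin vector t, and form couplings from the resulting Wishart-type matrix, so that H(s) is (up to a constant) ||W^T s||^2/2 and s = ±t is a guaranteed global minimum. The sizes and normalization below are assumptions for illustration, not the paper's precise parametrization:

```python
import random

random.seed(1)

def wishart_planted(n, m):
    """Generate couplings J with a planted ground state t: each of the m
    columns of W is a Gaussian vector projected orthogonal to t, and
    J_ij = sum_k W_ik W_jk for i != j."""
    t = [random.choice([-1, 1]) for _ in range(n)]
    cols = []
    for _ in range(m):
        g = [random.gauss(0, 1) for _ in range(n)]
        a = sum(ti * gi for ti, gi in zip(t, g)) / n
        cols.append([gi - a * ti for gi, ti in zip(g, t)])  # column now orthogonal to t
    J = [[sum(col[i] * col[j] for col in cols) for j in range(n)]
         for i in range(n)]
    return J, t

def H(J, s):
    # H(s) = sum_{i<j} J_ij s_i s_j = (||W^T s||^2 - trace(W W^T)) / 2,
    # so any s with W^T s = 0 (in particular s = +/-t) is a global minimum.
    n = len(s)
    return sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))

J, t = wishart_planted(8, 6)
```

The tunable hardness in the paper comes from varying the ratio m/n (and the coupler precision); this sketch only demonstrates why the planted state is provably optimal.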
Jack Raymond, David Saad (2017)
Sparse Code Division Multiple Access (CDMA), a variation on the standard CDMA method in which the spreading (signature) matrix contains only a relatively small number of non-zero elements, is presented and analysed using methods of statistical physics. The analysis provides results on the performance of maximum likelihood decoding for sparse spreading codes in the large system limit. We present results for both cases of regular and irregular spreading matrices for the binary-input additive white Gaussian noise (BIAWGN) channel, with a comparison to the canonical (dense) random spreading code.
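The setting can be made concrete with a toy instance: a sparse signature matrix with a few non-zero chips per user, a noisy superposition of BPSK symbols, and maximum-likelihood decoding by exhaustive search. All parameters below (user count, chip count, sparsity, noise level) are assumptions chosen so brute force is feasible; the paper's statistical-physics analysis addresses the large-system limit where this is impossible:

```python
import itertools
import math
import random

random.seed(2)

K, N_CHIPS, NONZERO = 4, 8, 2   # users, chips, non-zeros per signature (assumed)

# Sparse signature matrix: each user's spreading sequence has only a few +/-1 chips
S = []
for _ in range(K):
    col = [0.0] * N_CHIPS
    for idx in random.sample(range(N_CHIPS), NONZERO):
        col[idx] = random.choice([-1.0, 1.0]) / math.sqrt(NONZERO)
    S.append(col)

bits = [random.choice([-1, 1]) for _ in range(K)]
sigma = 0.1
# Received chips: superposition of all users' signatures plus Gaussian noise
y = [sum(S[k][c] * bits[k] for k in range(K)) + random.gauss(0, sigma)
     for c in range(N_CHIPS)]

def ml_decode(y, S):
    """Maximum-likelihood decoding: minimize the Euclidean distance to the
    received vector over all bit patterns (feasible only for tiny K)."""
    best, best_d = None, float("inf")
    for cand in itertools.product([-1, 1], repeat=len(S)):
        d = sum((yc - sum(S[k][c] * cand[k] for k in range(len(S)))) ** 2
                for c, yc in enumerate(y))
        if d < best_d:
            best, best_d = list(cand), d
    return best

decoded = ml_decode(y, S)
```

By construction the decoder's output is at least as close to the received vector as the transmitted bits are, which is the defining property of ML decoding under Gaussian noise.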
A recent Google study [Phys. Rev. X, 6:031015 (2016)] compared a D-Wave 2X quantum processing unit (QPU) to two classical Monte Carlo algorithms: simulated annealing (SA) and quantum Monte Carlo (QMC). The study showed the D-Wave 2X to be up to 100 million times faster than the classical algorithms. The Google inputs are designed to demonstrate the value of collective multiqubit tunneling, a resource available to D-Wave QPUs but not to simulated annealing. But the computational hardness in these inputs is highly localized in gadgets, with only a small amount of complexity coming from global interactions, meaning that the relevance to real-world problems is limited. In this study we provide a new synthetic problem class that addresses the limitations of the Google inputs while retaining their strengths. We use simple clusters instead of more complex gadgets, and place more emphasis on creating computational hardness through frustrated global interactions like those seen in interesting real-world inputs. The logical problems used to generate these inputs can be solved in polynomial time [J. Phys. A, 15:10 (1982)]. However, for general heuristic algorithms that are unaware of the planted problem class, the frustration creates meaningful difficulty in a controlled environment ideal for study. We use these inputs to evaluate the new 2000-qubit D-Wave QPU. We include the HFS algorithm---the best performer in a broader analysis of Google inputs---and we include state-of-the-art GPU implementations of SA and QMC. The D-Wave QPU solidly outperforms the software solvers: when we consider pure annealing time (computation time), the D-Wave QPU reaches ground states up to 2600 times faster than the competition. In the task of zero-temperature Boltzmann sampling from challenging multimodal inputs, the D-Wave QPU holds a similar advantage, as quantum sampling bias does not appear significant.
Inference methods are often formulated as variational approximations: these approximations allow easy evaluation of statistics by marginalization or linear response, but the two estimates can be inconsistent with each other. We show that by introducing constraints on covariance, one can ensure consistency of linear response with the variational parameters, and in doing so improve the inference of marginal probability distributions. For the Bethe approximation and its generalizations, improvements are achieved with simple choices of the constraints. The approximations are presented as variational frameworks; iterative procedures related to message passing are provided for finding the minima.
Sampling from a Boltzmann distribution is NP-hard and so requires heuristic approaches. Quantum annealing is one promising candidate. The failure of annealing dynamics to equilibrate on practical time scales is a well-understood limitation, but it does not always prevent a heuristically useful distribution from being generated. In this paper we evaluate several methods for determining a useful operational temperature range for annealers. We show that, even where distributions deviate from the Boltzmann distribution due to ergodicity breaking, these estimates can be useful. We introduce the concepts of local and global temperatures that are captured by different estimation methods. We argue that, for practical application, it often makes sense to analyze annealers subject to post-processing, in order to isolate the macroscopic distribution deviations that are a practical barrier to their application.
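One simple instance of temperature estimation is to match the mean energy of the samples to the exact Boltzmann mean energy of the model, exploiting the fact that the mean energy decreases monotonically with inverse temperature. A toy sketch on a 4-spin ring small enough for exact enumeration (the model, the synthetic "device" samples, and the bisection tolerance are all assumptions; a real annealer's samples would replace `samples`):

```python
import itertools
import math
import random

random.seed(3)

# Toy model: 4-spin ferromagnetic ring, H = -sum_<ij> s_i s_j
N = 4
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]
STATES = list(itertools.product([-1, 1], repeat=N))

def energy(s):
    return -sum(s[i] * s[j] for i, j in EDGES)

def boltzmann_mean_energy(beta):
    # Exact <E>_beta by enumeration (only feasible for tiny systems)
    w = [math.exp(-beta * energy(s)) for s in STATES]
    Z = sum(w)
    return sum(wi * energy(s) for wi, s in zip(w, STATES)) / Z

# Synthetic "device" output: exact Boltzmann samples at an unknown beta
true_beta = 1.3
w = [math.exp(-true_beta * energy(s)) for s in STATES]
samples = random.choices(STATES, weights=w, k=5000)
e_obs = sum(energy(s) for s in samples) / len(samples)

# Estimate beta by bisection on the monotone map beta -> <E>_beta
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    if boltzmann_mean_energy(mid) > e_obs:
        lo = mid          # observed energy is lower: temperature is colder
    else:
        hi = mid
beta_est = (lo + hi) / 2
```

When ergodicity breaking distorts the sampled distribution, an energy-matching estimate of this kind captures a "global" temperature; the local estimators discussed in the paper probe conditional statistics instead.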
Variational inference is a powerful concept that underlies many iterative approximation algorithms; expectation propagation, mean-field methods and belief propagation, all central themes at the school, can be viewed within this unifying framework. The lectures of Manfred Opper introduce the archetypal example of Expectation Propagation, before establishing the connection with the other approximation methods. Corrections by expansion about the expectation propagation solution are then explained. Finally, some advanced inference topics and applications are explored in the closing sections.