
Quantum Natural Gradient

Added by James Stokes
Publication date: 2019
Language: English

A quantum generalization of Natural Gradient Descent is presented as part of a general-purpose optimization framework for variational quantum circuits. The optimization dynamics is interpreted as moving in the steepest descent direction with respect to the Quantum Information Geometry, corresponding to the real part of the Quantum Geometric Tensor (QGT), also known as the Fubini-Study metric tensor. An efficient algorithm is presented for computing a block-diagonal approximation to the Fubini-Study metric tensor for parametrized quantum circuits, which may be of independent interest.
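To make the update concrete, here is a minimal numerical sketch, not the paper's shot-efficient block-diagonal algorithm: it estimates the Fubini-Study metric of a toy two-parameter single-qubit state by finite differences and takes one natural-gradient step. The circuit, Hamiltonian, and step size are illustrative assumptions.

```python
# Minimal sketch (toy circuit, finite differences): estimate the Fubini-Study
# metric g_ij = Re<d_i psi|d_j psi> - <d_i psi|psi><psi|d_j psi> and take one
# quantum-natural-gradient step theta <- theta - eta * g^+ grad.
import numpy as np

def state(theta):
    # toy parametrized state: RY(theta[0]) followed by RZ(theta[1]) on |0>
    c, s = np.cos(theta[0] / 2), np.sin(theta[0] / 2)
    phase = np.exp(-1j * theta[1] / 2)
    return np.array([phase * c, np.conj(phase) * s])

def fubini_study_metric(theta, eps=1e-6):
    psi = state(theta)
    # numerical derivatives |d_i psi>
    d = [(state(theta + eps * np.eye(len(theta))[i]) - psi) / eps
         for i in range(len(theta))]
    g = np.zeros((len(theta), len(theta)))
    for i in range(len(theta)):
        for j in range(len(theta)):
            g[i, j] = np.real(np.vdot(d[i], d[j])
                              - np.vdot(d[i], psi) * np.vdot(psi, d[j]))
    return g

def energy(theta, H):
    psi = state(theta)
    return np.real(np.vdot(psi, H @ psi))

H = np.array([[1.0, 0.5], [0.5, -1.0]])          # illustrative Hamiltonian
theta, eps, eta = np.array([0.4, 0.2]), 1e-6, 0.1
grad = np.array([(energy(theta + eps * np.eye(2)[i], H)
                  - energy(theta, H)) / eps for i in range(2)])
g = fubini_study_metric(theta)
theta_new = theta - eta * np.linalg.pinv(g) @ grad  # natural-gradient step
print(theta_new)
```

The pseudo-inverse is used because the metric can be singular at particular parameter values, for example when a parameter has no effect on the state.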




Related Research

Variational Quantum Eigensolvers (VQEs) have recently attracted considerable attention. Yet, in practice, they still suffer from the cost of estimating cost-function gradients for large parameter sets or from resource-demanding reinforcement strategies. Here, we therefore consider recent advances in weight-agnostic learning and propose a strategy that addresses the trade-off between finding appropriate circuit architectures and parameter tuning. We investigate the use of NEAT-inspired algorithms which evaluate circuits via genetic competition and thus circumvent issues caused by excessive numbers of parameters. Our methods are tested both via simulation and on real quantum hardware and are used to solve the transverse Ising Hamiltonian and the Sherrington-Kirkpatrick spin model.
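As a rough illustration of the idea, and not the authors' NEAT implementation, a genetic loop over circuit architectures might look like the following sketch; the gate alphabet, mutation rule, and fitness function are placeholders.

```python
# Highly simplified sketch: evolve circuit architectures by genetic
# competition instead of tuning a fixed parameter set.
import random

GATES = ["RX", "RY", "RZ", "CNOT"]               # hypothetical gate alphabet

def random_circuit(max_len=6):
    return [random.choice(GATES) for _ in range(random.randint(1, max_len))]

def mutate(circuit):
    child = list(circuit)
    if child and random.random() < 0.5:
        child[random.randrange(len(child))] = random.choice(GATES)
    else:
        child.append(random.choice(GATES))       # NEAT-style structural growth
    return child

def fitness(circuit):
    # placeholder: in practice this would be the estimated energy of the
    # candidate circuit on the target Hamiltonian (simulator or hardware)
    return -abs(len(circuit) - 4) - 0.1 * random.random()

population = [random_circuit() for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)   # genetic competition
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(population[0])
```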
Naoki Yamamoto (2019)
The variational quantum eigensolver is a hybrid algorithm composed of quantum state driving and classical parameter optimization, for finding the ground state of a given Hamiltonian. The natural gradient method is an optimization method that takes into account the geometric structure of the parameter space. Very recently, Stokes et al. developed a general method for employing the natural gradient in the variational quantum eigensolver. This paper gives some simple case studies of this optimization method, to show in detail how the natural gradient optimizer makes use of the geometric structure to modify and improve the ordinary gradient method.
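For reference, the two update rules being compared differ only in the metric applied to the gradient; with step size $\eta$, cost $\mathcal{L}$, and $g^{+}$ a pseudo-inverse of the Fubini-Study metric tensor:

```latex
% Ordinary gradient descent versus the quantum natural gradient update.
\begin{align}
  \theta_{t+1} &= \theta_t - \eta\, \nabla \mathcal{L}(\theta_t)
      && \text{(ordinary gradient descent)} \\
  \theta_{t+1} &= \theta_t - \eta\, g^{+}(\theta_t)\, \nabla \mathcal{L}(\theta_t)
      && \text{(quantum natural gradient)}
\end{align}
```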
Variational quantum circuits are promising tools whose efficacy depends on their optimisation method. For noise-free unitary circuits, the quantum generalisation of natural gradient descent was recently introduced. The method can be shown to be equivalent to imaginary time evolution, and is highly effective due to a metric tensor reconciling the classical parameter space to the device's Hilbert space. Here we generalise quantum natural gradient to consider arbitrary quantum states (both mixed and pure) via completely positive maps; thus our circuits can incorporate both imperfect unitary gates and fundamentally non-unitary operations such as measurements. Whereas the unitary variant relates to classical Fisher information, here we find that quantum Fisher information defines the core metric in the space of density operators. Numerical simulations indicate that our approach can outperform other variational techniques when circuit noise is present. We finally assess the practical feasibility of our implementation and argue that its scalability is only limited by the number and quality of imperfect gates and not by the number of qubits.
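For readers unfamiliar with it, the quantum Fisher information referred to here is conventionally defined via symmetric logarithmic derivatives $L_i$; this is the textbook definition, quoted up to a normalization convention, not a formula from this abstract.

```latex
% SLD quantum Fisher information on density operators rho(theta);
% L_i is defined implicitly by the first equation.
\begin{align}
  \partial_i \rho &= \tfrac{1}{2}\left(L_i \rho + \rho L_i\right), &
  F_{ij}(\theta) &= \mathrm{Re}\,\mathrm{Tr}\!\left[\rho(\theta)\, L_i L_j\right].
\end{align}
```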
Many machine learning problems can be expressed as the optimization of some cost functional over a parametric family of probability distributions. It is often beneficial to solve such optimization problems using natural gradient methods. These methods are invariant to the parametrization of the family, and thus can yield more effective optimization. Unfortunately, computing the natural gradient is challenging as it requires inverting a high dimensional matrix at each iteration. We propose a general framework to approximate the natural gradient for the Wasserstein metric, by leveraging a dual formulation of the metric restricted to a Reproducing Kernel Hilbert Space. Our approach leads to an estimator for the gradient direction that can trade off accuracy and computational cost, with theoretical guarantees. We verify its accuracy on simple examples, and empirically show the advantage of using such an estimator in classification tasks on CIFAR-10 and CIFAR-100.
Within the context of hybrid quantum-classical optimization, gradient descent based optimizers typically require the evaluation of expectation values with respect to the outcome of parameterized quantum circuits. In this work, we explore the consequences of the prior observation that estimation of these quantities on quantum hardware results in a form of stochastic gradient descent optimization. We formalize this notion, which allows us to show that in many relevant cases, including VQE, QAOA and certain quantum classifiers, estimating expectation values with $k$ measurement outcomes results in optimization algorithms whose convergence properties can be rigorously well understood, for any value of $k$. In fact, even using single measurement outcomes for the estimation of expectation values is sufficient. Moreover, in many settings the required gradients can be expressed as linear combinations of expectation values -- originating, e.g., from a sum over local terms of a Hamiltonian, a parameter shift rule, or a sum over data-set instances -- and we show that in these cases $k$-shot expectation value estimation can be combined with sampling over terms of the linear combination, to obtain doubly stochastic gradient descent optimizers. For all algorithms we prove convergence guarantees, providing a framework for the derivation of rigorous optimization results in the context of near-term quantum devices. Additionally, we explore numerically these methods on benchmark VQE, QAOA and quantum-enhanced machine learning tasks and show that treating the stochastic settings as hyper-parameters allows for state-of-the-art results with significantly fewer circuit executions and measurements.
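A minimal sketch of the doubly stochastic idea under toy assumptions (a single-qubit RY ansatz, a two-term Hamiltonian, and analytically simulated shot noise): each step samples one Hamiltonian term and uses only k measurement shots per expectation value in the parameter-shift rule.

```python
# Doubly stochastic gradient sketch: sample one term of H = sum_m c_m O_m
# per step and estimate each shifted expectation value from k shots.
import numpy as np

rng = np.random.default_rng(0)

def exact_expval(theta, op):
    # <psi|O|psi> for |psi> = RY(theta)|0>: <Z> = cos(theta), <X> = sin(theta)
    return np.cos(theta) if op == "Z" else np.sin(theta)

def shot_expval(theta, op, k):
    # simulate k single-shot +/-1 measurement outcomes of O
    p_plus = (1 + exact_expval(theta, op)) / 2
    outcomes = rng.choice([1.0, -1.0], size=k, p=[p_plus, 1 - p_plus])
    return outcomes.mean()

coeffs, ops = [0.7, 0.3], ["Z", "X"]             # toy H = 0.7 Z + 0.3 X
theta, eta, k = 0.9, 0.2, 1
probs = np.array(coeffs) / sum(coeffs)           # sample terms prop. to c_m

for _ in range(200):
    m = rng.choice(len(ops), p=probs)            # sample one term of H
    scale = sum(coeffs)                          # c_m / p_m reweighting
    # parameter-shift rule with k-shot estimates of each expectation value
    grad = scale * (shot_expval(theta + np.pi / 2, ops[m], k)
                    - shot_expval(theta - np.pi / 2, ops[m], k)) / 2
    theta -= eta * grad                          # stochastic gradient step

print(theta, sum(c * exact_expval(theta, o) for c, o in zip(coeffs, ops)))
```

The c_m / p_m reweighting keeps the sampled gradient unbiased for positive coefficients; with k = 1 this uses a single shot per shifted circuit, mirroring the single-outcome setting the abstract describes.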
