
Quantum Analytic Descent

Added by Balint Koczor
Publication date: 2020
Fields: Physics
Language: English





Variational algorithms have particular relevance for near-term quantum computers but require non-trivial parameter optimisations. Here we propose Analytic Descent: Given that the energy landscape must have a certain simple form in the local region around any reference point, it can be efficiently approximated in its entirety by a classical model -- we support these observations with rigorous, complexity-theoretic arguments. One can classically analyse this approximate function in order to directly 'jump' to the (estimated) minimum, before determining a more refined function if necessary. We derive an optimal measurement strategy and generally prove that the asymptotic resource cost of a 'jump' corresponds to only a single gradient vector evaluation.
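The abstract does not spell out the surrogate construction, but a minimal sketch of the analytic-descent loop it describes is given below, assuming a rotation-gate ansatz so that the energy is a trigonometric function of each parameter. The surrogate here is a simplified additive trigonometric model rather than the full product-form model of the paper, and `energy` stands in for whatever (noisy) quantum evaluation of the cost is available.

```python
# Sketch of an analytic-descent loop under the assumptions stated above.
# The per-parameter trigonometric form a + b*cos(d_i) + c*sin(d_i) is fitted
# from three circuit evaluations; the resulting classical model is then
# minimised classically to 'jump' towards its estimated minimum.
import numpy as np
from scipy.optimize import minimize

def build_local_model(energy, theta_ref, shift=np.pi / 2):
    """Fit a simplified additive trigonometric surrogate around theta_ref."""
    n = len(theta_ref)
    e0 = energy(theta_ref)
    coeffs = []
    for i in range(n):
        step = np.zeros(n)
        step[i] = shift
        ep, em = energy(theta_ref + step), energy(theta_ref - step)
        a = (ep + em) / 2.0                      # constant offset
        coeffs.append((a, e0 - a, (ep - em) / 2.0))

    def model(theta):
        d = np.asarray(theta) - theta_ref
        total = sum(a + b * np.cos(d[i]) + c * np.sin(d[i])
                    for i, (a, b, c) in enumerate(coeffs))
        return total - (n - 1) * e0              # equals e0 at the reference point

    return model

def analytic_descent(energy, theta0, iterations=10):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iterations):
        surrogate = build_local_model(energy, theta)
        theta = minimize(surrogate, theta).x     # classical 'jump' to the model minimum
    return theta
```

Each outer iteration spends quantum resources only on building the local model; the 'jump' itself is purely classical minimisation of that model, which is the resource saving the abstract emphasises.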




Related research

Variational quantum algorithms (VQAs) are promising methods that leverage noisy quantum computers and classical computing techniques for practical applications. In VQAs, classical optimizers such as gradient-based optimizers are used to adjust the parameters of the quantum circuit so that the objective function is minimized. However, they often suffer from the so-called vanishing-gradient or barren-plateau issue. On the other hand, the normalized gradient descent (NGD) method, which employs the normalized gradient vector to update the parameters, has been successfully applied to several optimization problems. Here, we study the performance of NGD methods in the optimization of VQAs for the first time. Our goal is two-fold. The first is to examine the effectiveness of NGD and its variants for overcoming the vanishing-gradient problem. The second is to propose a new NGD that attains faster convergence than the ordinary NGD. We performed numerical simulations of these gradient-based optimizers in the context of quantum chemistry, where VQAs are used to find the ground state of a given Hamiltonian. The results show the effective convergence of the NGD methods in VQAs compared to the relevant optimizers without normalization. Moreover, we make use of normalized gradient vectors from past iteration steps to propose a novel historical NGD that has a theoretical guarantee of accelerated convergence, which is also observed in the numerical experiments.
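For reference, the ordinary NGD update this abstract builds on is easy to state. The sketch below shows it with a hypothetical `grad_fn` gradient estimator supplied by the VQA; it deliberately does not reproduce the paper's historical variant, whose exact update rule is not given in the abstract.

```python
# Sketch of the plain normalized-gradient-descent (NGD) update: parameters
# move a fixed step length along -g/||g||, so the step size does not shrink
# in flat (barren-plateau-like) regions the way vanilla gradient descent does.
# `grad_fn` is a hypothetical stand-in for the VQA's gradient estimator.
import numpy as np

def ngd_step(theta, grad_fn, lr=0.1, eps=1e-12):
    g = np.asarray(grad_fn(theta), dtype=float)
    return theta - lr * g / (np.linalg.norm(g) + eps)  # unit-length direction

def ngd(theta0, grad_fn, lr=0.1, steps=200):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        theta = ngd_step(theta, grad_fn, lr)
    return theta
```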
Within the context of hybrid quantum-classical optimization, gradient descent based optimizers typically require the evaluation of expectation values with respect to the outcome of parameterized quantum circuits. In this work, we explore the consequences of the prior observation that estimation of these quantities on quantum hardware results in a form of stochastic gradient descent optimization. We formalize this notion, which allows us to show that in many relevant cases, including VQE, QAOA and certain quantum classifiers, estimating expectation values with $k$ measurement outcomes results in optimization algorithms whose convergence properties can be rigorously well understood, for any value of $k$. In fact, even using single measurement outcomes for the estimation of expectation values is sufficient. Moreover, in many settings the required gradients can be expressed as linear combinations of expectation values -- originating, e.g., from a sum over local terms of a Hamiltonian, a parameter shift rule, or a sum over data-set instances -- and we show that in these cases $k$-shot expectation value estimation can be combined with sampling over terms of the linear combination, to obtain doubly stochastic gradient descent optimizers. For all algorithms we prove convergence guarantees, providing a framework for the derivation of rigorous optimization results in the context of near-term quantum devices. Additionally, we explore numerically these methods on benchmark VQE, QAOA and quantum-enhanced machine learning tasks and show that treating the stochastic settings as hyper-parameters allows for state-of-the-art results with significantly fewer circuit executions and measurements.
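A rough sketch of the doubly stochastic estimator described above, under stated assumptions: every circuit parameter admits the standard two-point parameter-shift rule, and `shot_estimate(theta, j, shots)` is a hypothetical callable returning a k-shot estimate of the expectation value of the j-th term of H = sum_j c_j H_j.

```python
# Doubly stochastic gradient sketch: sample one Hamiltonian term per step,
# estimate its parameter-shift gradient from only k shots, and reweight so
# the estimator stays unbiased for the full Hamiltonian in expectation.
import numpy as np

def doubly_stochastic_grad(theta, coeffs, shot_estimate, k=1, shift=np.pi / 2):
    coeffs = np.asarray(coeffs, dtype=float)
    probs = np.abs(coeffs) / np.sum(np.abs(coeffs))
    j = np.random.choice(len(coeffs), p=probs)            # sample one term H_j
    scale = np.sign(coeffs[j]) * np.sum(np.abs(coeffs))   # importance weight
    grad = np.zeros(len(theta))
    for i in range(len(theta)):
        step = np.zeros(len(theta))
        step[i] = shift
        plus = shot_estimate(theta + step, j, shots=k)    # k-shot estimates
        minus = shot_estimate(theta - step, j, shots=k)
        grad[i] = 0.5 * (plus - minus)                    # parameter-shift rule
    return scale * grad

def stochastic_descent(theta0, coeffs, shot_estimate, lr=0.05, steps=500, k=1):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        theta -= lr * doubly_stochastic_grad(theta, coeffs, shot_estimate, k)
    return theta
```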
We present pulse sequences for two-qubit gates acting on encoded qubits for exchange-only quantum computation. Previous work finding such sequences has always required numerical methods due to the large search space of unitary operators acting on the space of the encoded qubits. By contrast, our construction can be understood entirely in terms of three-dimensional rotations of effective spin-1/2 pseudospins which allows us to use geometric intuition to determine the required sequence of operations analytically. The price we pay for this simplification is that, at 39 pulses, our sequences are significantly longer than the best numerically obtained sequences.
We generalize past work on quantum sensor networks to show that, for $d$ input parameters, entanglement can yield a factor $\mathcal{O}(d)$ improvement in mean squared error when estimating an analytic function of these parameters. We show that the protocol is optimal for qubit sensors, and conjecture an optimal protocol for photons passing through interferometers. Our protocol is also applicable to continuous variable measurements, such as one quadrature of a field operator. We outline a few potential applications, including calibration of laser operations in trapped ion quantum computing.
It has recently been discovered that the optical analogue of a gradient echo in an optically thick material could form the basis of an optical memory that is both completely efficient and noise free. Here we present analytical calculations showing that this is the case. There is a close analogy between the operation of the memory and an optical system with two beam splitters. We can use this analogy to calculate efficiencies as a function of optical depth for a number of quantum memory schemes based on controlled inhomogeneous broadening. In particular, we show that multiple switching leads to a net 100% retrieval efficiency for the optical gradient echo even in the optically thin case.