
Fundamentals In Quantum Algorithms: A Tutorial Series Using Qiskit Continued

Added by: Dr. Daniel Koch
Publication date: 2020
Field: Physics
Language: English





With the rise of publicly available high-level quantum computing languages, the field of quantum computing has reached an important milestone: the separation of software from hardware. Consequently, the study of quantum algorithms is beginning to emerge in university courses and disciplines around the world, spanning physics, math, and computer science departments alike. As a continuation of its predecessor, Introduction to Coding Quantum Algorithms: A Tutorial Series Using Qiskit, this tutorial series aims to help readers understand several of the most promising quantum algorithms to date, including Phase Estimation, Shor's, QAOA, VQE, and several others. Accompanying each algorithm's theoretical foundations are coding examples utilizing IBM's Qiskit, demonstrating the strengths and challenges of implementing each algorithm in gate-based quantum computing.
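As a flavor of the kind of coding example the series pairs with each algorithm, the following is a minimal sketch of quantum phase estimation in Qiskit. It is illustrative only and not taken from the tutorials themselves: the three-qubit counting register, the choice of a single-qubit phase gate as U, and the eigenphase theta = 1/4 are assumptions made for the example.

# Minimal quantum phase estimation sketch (illustrative, not from the series).
# Estimates theta for U = P(2*pi*theta) acting on its eigenstate |1>.
from math import pi
from qiskit import QuantumCircuit
from qiskit.circuit.library import QFT

n_count = 3        # counting qubits; resolution is 1 / 2**n_count
theta = 1 / 4      # example eigenphase, chosen purely for illustration

qc = QuantumCircuit(n_count + 1, n_count)
qc.x(n_count)                        # prepare the eigenstate |1> of P(phi)
qc.h(range(n_count))                 # uniform superposition on the counting register
for k in range(n_count):
    # controlled-U^(2^k) reduces to a controlled phase rotation here
    qc.cp(2 * pi * theta * (2 ** k), k, n_count)
qc.compose(QFT(n_count, inverse=True), qubits=list(range(n_count)), inplace=True)
qc.measure(range(n_count), range(n_count))
# Sampling this circuit should return '010', i.e. theta ~ 2 / 2**3 = 1/4.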



Related research

The quantum circuit model is an abstraction that hides the underlying physical implementation of gates and measurements on a quantum computer. For precise control of real quantum hardware, the ability to execute pulse- and readout-level instructions is required. To that end, we introduce Qiskit Pulse, a pulse-level programming paradigm implemented as a module within Qiskit Terra. To demonstrate the capabilities of Qiskit Pulse, we calibrate both un-echoed and echoed variants of the cross-resonance entangling gate with a pair of qubits on an IBM Quantum system accessible through the cloud, performing Hamiltonian characterization of both the single- and two-pulse variants at varying amplitudes. We then transform these calibrated sequences into a high-fidelity CNOT gate by applying pre- and post-local rotations to the qubits, achieving average gate fidelities of $F=0.981$ and $F=0.979$ for the un-echoed and echoed variants, respectively. This is comparable to the standard backend CNOT fidelity of $F_{CX}=0.984$. Furthermore, to illustrate how users can access their results at different levels of the readout chain, we build a custom discriminator to investigate qubit readout correlations. Qiskit Pulse allows users to explore advanced control schemes, such as optimal control theory, dynamical decoupling, and error mitigation, that are not available within the circuit model.
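To make the pulse-level paradigm concrete, here is a minimal sketch of building a schedule with the Qiskit Pulse builder interface; the Gaussian duration, amplitude, and sigma are placeholder values chosen for illustration, not calibrated cross-resonance parameters.

# Minimal Qiskit Pulse builder sketch (placeholder parameters, not a
# calibrated cross-resonance sequence).
from qiskit import pulse
from qiskit.pulse.library import Gaussian

d0 = pulse.DriveChannel(0)           # drive channel of qubit 0

with pulse.build(name="drive_sketch") as sched:
    # Play a Gaussian drive pulse; duration is in units of the backend's dt
    # and amp is a dimensionless drive strength -- both arbitrary here.
    pulse.play(Gaussian(duration=160, amp=0.1, sigma=40), d0)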
Yi-Kai Liu (2009)
The curvelet transform is a directional wavelet transform over R^n, which is used to analyze functions that have singularities along smooth surfaces (Candes and Donoho, 2002). I demonstrate how this can lead to new quantum algorithms. I give an efficient implementation of a quantum curvelet transform, together with two applications: a single-shot measurement procedure for approximately finding the center of a ball in R^n, given a quantum-sample over the ball; and, a quantum algorithm for finding the center of a radial function over R^n, given oracle access to the function. I conjecture that these algorithms succeed with constant probability, using one quantum-sample and O(1) oracle queries, respectively, independent of the dimension n -- this can be interpreted as a quantum speed-up. To support this conjecture, I prove rigorous bounds on the distribution of probability mass for the continuous curvelet transform. This shows that the above algorithms work in an idealized continuous model.
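Schematically, and assuming the standard Candès-Donoho conventions rather than anything specific to this paper, the continuous curvelet coefficients referenced above are inner products of $f$ with curvelets $\gamma_{ab\theta}$ indexed by scale $a$, location $b$, and orientation $\theta$:

$$\Gamma_f(a,b,\theta) \;=\; \langle f, \gamma_{ab\theta} \rangle \;=\; \int_{\mathbb{R}^n} f(x)\,\overline{\gamma_{ab\theta}(x)}\,\mathrm{d}x .$$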
Over the past years, machine learning has emerged as a powerful computational tool to tackle complex problems over a broad range of scientific disciplines. In particular, artificial neural networks have been successfully deployed to mitigate the exponential complexity often encountered in quantum many-body physics, the study of properties of quantum systems built out of a large number of interacting particles. In this Article, we overview some applications of machine learning in condensed matter physics and quantum information, with particular emphasis on hands-on tutorials serving as a quick-start for a newcomer to the field. We present supervised machine learning with convolutional neural networks to learn a phase transition, unsupervised learning with restricted Boltzmann machines to perform quantum tomography, and variational Monte Carlo with recurrent neural networks for approximating the ground state of a many-body Hamiltonian. We briefly review the key ingredients of each algorithm and their corresponding neural-network implementation, and show numerical experiments for a system of interacting Rydberg atoms in two dimensions.
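As a concrete illustration of the first of these techniques, the following is a minimal sketch of a convolutional phase classifier in PyTorch; the framework, the 16x16 input size, the two-phase output, and the random toy data are all assumptions made for the example, not details from the Article.

# Minimal CNN phase-classifier sketch (PyTorch assumed; architecture,
# lattice size, and toy data are illustrative, not from the Article).
import torch
import torch.nn as nn

class PhaseCNN(nn.Module):
    def __init__(self, L=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(8 * (L // 2) * (L // 2), 2)

    def forward(self, x):            # x: (batch, 1, L, L) spin configurations
        return self.classifier(self.features(x).flatten(1))

# Toy usage with random +/-1 "configurations" and random phase labels.
model = PhaseCNN(L=16)
x = torch.randint(0, 2, (4, 1, 16, 16)).float() * 2 - 1
loss = nn.CrossEntropyLoss()(model(x), torch.randint(0, 2, (4,)))
loss.backward()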
In this tutorial, we introduce basic conceptual elements to understand and build a gate-based superconducting quantum computing system.
Logical entropy gives a measure, in the sense of measure theory, of the distinctions of a given partition of a set, an idea that can be naturally generalized to classical probability distributions. Here, we analyze how fundamental concepts of this entropy and other related definitions can be applied to the study of quantum systems, leading to the introduction of the quantum logical entropy. Moreover, we prove several properties of this entropy for generic density matrices that may be relevant to various areas of quantum mechanics and quantum information. Furthermore, we extend the notion of quantum logical entropy to post-selected systems.
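For concreteness, a common form of these definitions is the classical logical entropy $h(p) = 1 - \sum_i p_i^2$ and the quantum logical entropy $L(\rho) = 1 - \mathrm{Tr}(\rho^2)$; the sketch below assumes these standard expressions, which the paper may refine, particularly for post-selected systems.

# Minimal sketch assuming the common definitions h(p) = 1 - sum_i p_i^2
# and L(rho) = 1 - Tr(rho^2); the paper's generalizations are not reproduced.
import numpy as np

def classical_logical_entropy(p):
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def quantum_logical_entropy(rho):
    rho = np.asarray(rho, dtype=complex)
    return 1.0 - np.real(np.trace(rho @ rho))

print(classical_logical_entropy([0.5, 0.5]))   # 0.5 for a fair coin
print(quantum_logical_entropy(np.eye(2) / 2))  # 0.5 for a maximally mixed qubit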
