The quantum Fourier transform (QFT) is a key building block for a wealth of quantum algorithms. Despite its proven efficiency, only a few proof-of-principle demonstrations have been reported. Here we use the QFT to enhance the performance of a quantum sensor. We implement the QFT algorithm in a hybrid quantum register consisting of a nitrogen-vacancy (NV) center electron spin and three nuclear spins. The QFT runs on the nuclear spins and serves to process the signal of the sensor, the NV electron spin. We demonstrate the QFT for quantum (spin) and classical (radio-frequency, RF) signals with near-Heisenberg-limited precision scaling. We further show the application of the QFT to demultiplexing the nuclear magnetic resonance (NMR) signal of two distinct target nuclear spins. Our results mark the application of a complex quantum algorithm in sensing, which is of particular interest for high-dynamic-range quantum sensing and nanoscale NMR spectroscopy experiments.
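As background for the abstract above, a minimal sketch of the QFT as a unitary matrix is given below. This is an illustrative construction of the textbook transform on $n$ qubits, not the NV-register implementation described in the paper; the function name `qft_matrix` is our own.

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n_qubits.

    Entry (j, k) is omega**(j*k) / sqrt(N) with omega = exp(2*pi*i/N),
    where N = 2**n_qubits is the Hilbert-space dimension.
    """
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return omega ** (j * k) / np.sqrt(N)

# The QFT is unitary: U @ U^dagger equals the identity.
U = qft_matrix(3)
assert np.allclose(U @ U.conj().T, np.eye(8))
```

For a single qubit the matrix reduces to the Hadamard gate, which is one way to check the convention.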
Quantum computers will allow calculations beyond the reach of existing classical computers. However, current technology is still too noisy and imperfect to construct a universal digital quantum computer with quantum error correction. Inspired by the evolution of
Blind quantum computation (BQC) allows a client with limited quantum technologies to delegate her quantum computing to a server with fully advanced quantum computers. But the privacy of the client's quantum inputs, algorithms and outp
The quantum Fourier transform (QFT) is a key ingredient of many quantum algorithms, where a considerable number of ancilla qubits and gates are often needed to form a Hilbert space large enough for high-precision results. Qubit recycling reduces the numbe
Fourier-transform spectroscopy with classical interferometry corresponds, from the viewpoint of the particle nature of light, to the measurement of a single-photon intensity spectrum. In contrast, the Fourier transform of two-photon quantum interferenc
The self-learning Metropolis-Hastings algorithm is a powerful Monte Carlo method that, with the help of machine learning, adaptively generates an easy-to-sample probability distribution for approximating a given hard-to-sample distribution. This pape
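The abstract above concerns the self-learning variant; as background, here is a minimal sketch of the plain Metropolis-Hastings step with a symmetric proposal. This is an illustrative toy (function names and the Gaussian target are our own), not the paper's machine-learned proposal scheme.

```python
import math
import random

def metropolis_hastings(log_target, proposal, x0, n_steps, rng):
    """Metropolis-Hastings sampler with a symmetric proposal.

    log_target: unnormalized log-density of the hard-to-sample distribution.
    proposal:   draws a candidate from the current state; assumed symmetric,
                so the acceptance ratio needs no proposal-density correction.
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        cand = proposal(x, rng)
        # Accept with probability min(1, target(cand) / target(x)).
        if math.log(rng.random()) < log_target(cand) - log_target(x):
            x = cand
        samples.append(x)
    return samples

# Toy usage: sample a standard normal via Gaussian random-walk proposals.
rng = random.Random(0)
chain = metropolis_hastings(
    log_target=lambda x: -0.5 * x * x,
    proposal=lambda x, r: x + r.gauss(0.0, 1.0),
    x0=0.0, n_steps=20000, rng=rng,
)
mean = sum(chain) / len(chain)
```

The self-learning variant replaces the fixed `proposal` with a distribution fitted by a machine-learning model to the samples collected so far, which raises the acceptance rate while the accept/reject step keeps the chain exact.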