
Stable quantum memories with limited measurement

Added by Daniel Freeman
Publication date: 2017
Field: Physics
Language: English





We demonstrate the existence of a finite temperature threshold for a 1D stabilizer code under an error correcting protocol that requires only a fraction of the syndrome measurements. Below the threshold temperature, encoded states have exponentially long lifetimes, as demonstrated by numerical and analytical arguments. We sketch how this algorithm generalizes to higher dimensional stabilizer codes with string-like excitations, like the toric code.
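As a rough illustration of the idea (a toy classical analogue, not the paper's actual protocol), the sketch below simulates a 1D repetition code under bit-flip noise in which only a random fraction `f` of the neighbor-parity syndromes is measured each round; all parameter values are illustrative assumptions.

```python
import random

def simulate(n=31, p=0.01, f=0.5, rounds=50, trials=200, seed=1):
    """Toy 1D repetition code under bit-flip noise, corrected using
    only a fraction f of the syndrome (neighbor-parity) measurements.
    Returns the fraction of trials whose majority vote still decodes
    to the encoded logical 0 after all rounds."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        spins = [0] * n                       # encoded logical 0
        for _ in range(rounds):
            # stochastic bit-flip noise on every spin
            for i in range(n):
                if rng.random() < p:
                    spins[i] ^= 1
            # measure only a random subset of the n-1 parity checks
            measured = {i for i in range(n - 1) if rng.random() < f}
            syndrome = {i for i in measured if spins[i] != spins[i + 1]}
            # naive local correction: flip any spin flanked by two
            # violated (and measured) checks
            for i in range(1, n - 1):
                if i - 1 in syndrome and i in syndrome:
                    spins[i] ^= 1
        if sum(spins) * 2 < n:                # majority vote succeeds
            survived += 1
    return survived / trials

print(simulate())
```

Sweeping the noise strength `p` in such a toy model is one way to see a threshold-like crossover between long-lived and quickly scrambled encoded states.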



Related research

Central to the success of adaptive systems is their ability to interpret signals from their environment and respond accordingly -- they act as agents interacting with their surroundings. Such agents typically perform better when able to execute increasingly complex strategies. This comes with a cost: the more information the agent must recall from its past experiences, the more memory it will need. Here we investigate the power of agents capable of quantum information processing. We uncover the most general form a quantum agent need adopt to maximise memory compression advantages, and provide a systematic means of encoding their memory states. We show these encodings can exhibit extremely favourable scaling advantages relative to memory-minimal classical agents when information must be retained about events increasingly far into the past.
To use quantum systems for technological applications we first need to preserve their coherence for macroscopic timescales, even at finite temperature. Quantum error correction has made it possible to actively correct errors that affect a quantum memory. An attractive scenario is the construction of passive storage of quantum information with minimal active support. Indeed, passive protection is the basis of robust and scalable classical technology, physically realized in the form of the transistor and the ferromagnetic hard disk. The discovery of an analogous quantum system is a challenging open problem, plagued with a variety of no-go theorems. Several approaches have been devised to overcome these theorems by taking advantage of their loopholes. Here we review the state-of-the-art developments in this field in an informative and pedagogical way. We give the main principles of self-correcting quantum memories and we analyze several milestone examples from the literature of two-, three- and higher-dimensional quantum memories.
As the miniaturization of temperature-sensitive electronic devices grows apace, sensing temperature with ever smaller probes is more important than ever. Genuinely quantum mechanical schemes of thermometry are thus expected to be crucial to future technological progress. We propose a new method to measure the temperature of a bath using the weak measurement scheme with a finite-dimensional probe. The precision offered by the present scheme not only shows similar qualitative features to the usual Quantum Fisher Information based thermometric protocols, but also allows flexibility in setting the optimal thermometric window through a judicious choice of post-selection measurements.
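The notion of an optimal thermometric window can be made concrete for the simplest probe. The sketch below computes the temperature Quantum Fisher Information of a thermal two-level probe (for a diagonal thermal state it reduces to the classical Fisher information of the excited-state population); units with k_B = hbar = 1 and the gap omega = 1 are illustrative assumptions, not taken from the paper.

```python
import math

def qfi_temperature(T, omega=1.0):
    """Quantum Fisher information for estimating temperature T from a
    thermal two-level probe with energy gap omega (k_B = hbar = 1)."""
    p = 1.0 / (1.0 + math.exp(omega / T))                  # excited-state population
    dp_dT = (omega / T**2) * math.exp(omega / T) * p**2    # d p / d T
    return dp_dT**2 / (p * (1.0 - p))

# scan for the temperature at which the probe is most sensitive
Ts = [0.05 * k for k in range(1, 41)]
best = max(Ts, key=qfi_temperature)
print(f"thermometric window peaks near T = {best:.2f} (omega = 1)")
```

The sharp peak of this function is the "thermometric window": far above or below it the probe's population barely responds to temperature changes.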
We review the use of an external auxiliary detector for measuring the full distribution of the work performed on or extracted from a quantum system during a unitary thermodynamic process. We first illustrate two paradigmatic schemes that allow one to measure the work distribution: a Ramsey technique to measure the characteristic function and a positive operator valued measure (POVM) scheme to directly measure the work probability distribution. Then, we show that these two ideas can be understood in a unified framework for assessing work fluctuations through a generic quantum detector and describe two protocols that are able to yield complementary information. This also allows us to highlight how quantum work is affected by the presence of coherences in the system's initial state. Finally, we describe physical implementations and experimental realisations of the first two schemes.
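The two-point-measurement work distribution that such detector schemes are designed to reproduce can be computed directly for a small system. A minimal sketch for a qubit quench follows; the Hamiltonians, coupling, and inverse temperature below are illustrative assumptions.

```python
import numpy as np

# Toy two-point-measurement (TPM) work distribution for a qubit:
# measure H_i, quench to H_f, measure H_f; W is the energy difference.
w0, g, beta = 1.0, 0.5, 1.0                       # assumed parameters
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
Hi = 0.5 * w0 * sz                                # initial Hamiltonian
Hf = 0.5 * w0 * sz + 0.5 * g * sx                 # quenched Hamiltonian

Ei, Ui = np.linalg.eigh(Hi)
Ef, Uf = np.linalg.eigh(Hf)
pops = np.exp(-beta * Ei)
pops /= pops.sum()                                # thermal initial state

P = {}                                            # work value -> probability
for n in range(2):
    for m in range(2):
        w = round(Ef[m] - Ei[n], 6)
        overlap = abs(Uf[:, m].conj() @ Ui[:, n]) ** 2
        P[w] = P.get(w, 0.0) + pops[n] * overlap

print(P)
```

A useful sanity check on any measured distribution of this kind is the Jarzynski equality, ⟨e^{-βW}⟩ = Z_f/Z_i, which the TPM distribution satisfies exactly.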
Single photons are a vital resource for optical quantum information processing. Efficient and deterministic single photon sources do not yet exist, however. To date, experimental demonstrations of quantum processing primitives have been implemented using non-deterministic sources combined with heralding and/or postselection. Unfortunately, even for eight photons, the data rates are already so low as to make most experiments impracticable. It is well known that quantum memories, capable of storing photons until they are needed, are a potential solution to this `scaling catastrophe'. Here, we analyze in detail the benefits of quantum memories for producing multiphoton states, showing how the production rates can be enhanced by many orders of magnitude. We identify the quantity $\eta B$ as the most important figure of merit in this connection, where $\eta$ and $B$ are the efficiency and time-bandwidth product of the memories, respectively.
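The scale of the enhancement can be estimated with a simple toy model (not the paper's full analysis): without memories, all n heralded sources must fire in the same clock cycle; with memories, each source only needs to fire once within B cycles, and the stored photon is retrieved with efficiency eta. The parameter values below are illustrative assumptions.

```python
def nfold_rate(p, n, eta=1.0, B=1):
    """Toy model: probability per synchronisation window that all n
    heralded sources (firing probability p per clock cycle) supply a
    photon.  Without memories (B=1, eta=1) every source must fire in
    the same cycle; with memories each source only needs to fire once
    in B cycles, and the photon is kept with efficiency eta."""
    return ((1 - (1 - p) ** B) * eta) ** n

p, n = 0.01, 8                    # assumed source probability, photon number
direct = nfold_rate(p, n)
with_mem = nfold_rate(p, n, eta=0.8, B=1000)
print(f"enhancement ~ {with_mem / direct:.3g}")
```

Even with lossy memories (eta < 1), the (1 - (1-p)^B) factor per source compounds across all n sources, which is why the product $\eta B$ controls the achievable speed-up.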