
Quantum Error Correction and Holographic Information from Bilocal Holography

Published by Robert de Mello Koch
Publication date: 2021
Research language: English





Bilocal holography is a constructive approach to the higher spin theory holographically dual to $O(N)$ vector models. In contrast to other approaches to bulk reconstruction, bilocal holography does not take input from the dual gravitational theory. The resulting map is a complete bulk/boundary mapping in that it maps the complete set of $O(N)$ invariant degrees of freedom in the CFT to the complete set of higher spin degrees of freedom. After restricting to a suitable code subspace, we demonstrate that bilocal holography naturally reproduces the quantum error-correcting properties of holography and gives a robust bulk (entanglement wedge) reconstruction. A gauge invariant entangled pair of CFT degrees of freedom is naturally smeared over a semicircle in the bulk spacetime, which is highly suggestive of bit threads. Finally, we argue that finite $N$ relations in the CFT, when interpreted in the dual AdS spacetime, can provide relations between degrees of freedom located near the boundary and degrees of freedom deep in the bulk.
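For orientation, a minimal sketch of the object underlying the construction (standard in the bilocal approach to vector models, though not spelled out in the abstract): the complete set of $O(N)$ invariant degrees of freedom of the vector model is packaged in the bilocal field

$$\sigma(x_1,x_2)\;=\;\sum_{a=1}^{N}\phi^a(x_1)\,\phi^a(x_2),$$

and bilocal holography is a change of variables identifying the coordinates and modes of $\sigma(x_1,x_2)$ with those of the tower of higher spin fields in the bulk. The smearing of an entangled pair over a bulk semicircle mentioned above is a statement about this map.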


Read also

We study thermalization in a holographic (1+1)-dimensional CFT after the simultaneous creation of two high-energy excitations at antipodal points of the circle. The holographic picture of such a quantum quench is the formation of a BTZ black hole from the collision of two massless particles. We perform a holographic computation of the entanglement entropy and mutual information in the boundary theory and analyze their evolution with time. We show that equilibration of the entanglement in regions that contained one of the initial excitations is broadly similar to that in other holographic quench models, but with some important distinctions. We observe that entanglement propagates along a sharp effective light cone from the points of the initial excitations on the boundary. The characteristics of entanglement propagation in global quench models, such as the entanglement velocity and the light-cone velocity, also have meaning in the bilocal quench scenario. We also observe a loss of memory about the initial state during the equilibration process. We find that this memory loss shows up in the time behavior of the entanglement much as in the global quench case, and that it is related to the universal linear growth of entanglement coming from the interior of the forming black hole. Finally, we analyze general two-point correlation functions in the geodesic approximation, focusing on their late-time behavior.
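For concreteness, the quantities tracked here take the usual holographic form (a sketch under standard assumptions, not a result specific to this paper): the entanglement entropy of a boundary region $A$ is computed from the length of the bulk extremal curve $\gamma_A$, and the mutual information is built from it,

$$S_A=\frac{\mathrm{Length}(\gamma_A)}{4G_N},\qquad I(A{:}B)=S_A+S_B-S_{A\cup B},$$

while the universal linear growth mentioned above is the regime $S_A(t)\simeq S_A(0)+2\,v_E\,s_{\mathrm{eq}}\,t$ for an interval with two endpoints, with $v_E$ the entanglement velocity and $s_{\mathrm{eq}}$ the equilibrium entropy density.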
Daniel Harlow, 2016
I argue that a version of the quantum-corrected Ryu-Takayanagi formula holds in any quantum error-correcting code. I present this result as a series of theorems of increasing generality, with the final statement expressed in the language of operator-algebra quantum error correction. In AdS/CFT this gives a purely boundary interpretation of the formula. I also extend a recent theorem, which established entanglement-wedge reconstruction in AdS/CFT when interpreted as a subsystem code, to the more general, and I argue more physical, case of subalgebra codes. For completeness, I include a self-contained presentation of the theory of von Neumann algebras on finite-dimensional Hilbert spaces, as well as the algebraic definition of entropy. The results confirm a close relationship between bulk gauge transformations, edge-modes/soft-hair on black holes, and the Ryu-Takayanagi formula. They also suggest a new perspective on the homology constraint, which basically is to get rid of it in a way that preserves the validity of the formula but removes any tension with the linearity of quantum mechanics. Moreover, they suggest a boundary interpretation of the bit threads recently introduced by Freedman and Headrick.
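A hedged transcription of the final statement (my paraphrase of the operator-algebra form; the symbols are mine): for a code subspace on which a boundary region $A$ has complementary recovery with respect to a bulk subalgebra $\mathcal{M}_A$ whose center contains an "area operator" $\mathcal{L}_A$,

$$S(\rho_A)\;=\;\mathrm{tr}\!\left(\rho\,\mathcal{L}_A\right)\;+\;S(\rho,\mathcal{M}_A),$$

where $S(\rho,\mathcal{M}_A)$ is the algebraic entropy of the encoded state on $\mathcal{M}_A$. The first term plays the role of $\mathrm{Area}(\gamma_A)/4G_N$ and the second of the bulk entanglement entropy.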
T. Banks, 2020
The formalism of Holographic Space-time (HST) is a translation of the principles of Lorentzian geometry into the language of quantum information. Intervals along time-like trajectories, and their associated causal diamonds, completely characterize a Lorentzian geometry. The Bekenstein-Hawking-Gibbons-'t Hooft-Jacobson-Fischler-Susskind-Bousso Covariant Entropy Principle equates the logarithm of the dimension of the Hilbert space associated with a diamond to one quarter of the area of the diamond's holographic screen, measured in Planck units. The most convincing argument for this principle is Jacobson's derivation of Einstein's equations as the hydrodynamic expression of this entropy law. In that context, the null energy condition (NEC) is seen to be the analog of the local law of entropy increase. The quantum version of Einstein's relativity principle is a set of constraints on the mutual quantum information shared by causal diamonds along different time-like trajectories. The implementation of this constraint for trajectories in relative motion is the greatest unsolved problem in HST. The other key feature of HST is its claim that, for non-negative cosmological constant, or for causal diamonds much smaller than the asymptotic radius of curvature for negative c.c., the degrees of freedom localized in the bulk of a diamond are constrained states of variables defined on the holographic screen. This principle gives a simple explanation of otherwise puzzling features of black hole entropy formulae, and resolves the firewall problem for black holes in Minkowski space. It motivates a covariant version of the CKN bound on the regime of validity of quantum field theory (QFT) and a detailed picture of the way in which QFT emerges as an approximation to the exact theory.
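In formulas, the entropy principle quoted above reads (with the area in Planck units, as stated; the notation is mine):

$$\ln\dim\mathcal{H}_D\;=\;\frac{A(\partial D)}{4},$$

where $\mathcal{H}_D$ is the Hilbert space associated with a causal diamond $D$ and $A(\partial D)$ is the area of its holographic screen.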
The typical model for measurement noise in quantum error correction is to randomly flip the binary measurement outcome. In experiments, measurements yield much richer information (e.g., continuous current values or discrete photon counts), which is then mapped into binary outcomes by discarding some of this information. In this work, we consider methods to incorporate all of this richer information, typically called soft information, into the decoding of quantum error correction codes, and in particular the surface code. We describe how to modify both the Minimum Weight Perfect Matching and Union-Find decoders to leverage soft information, and demonstrate that these soft decoders outperform the standard (hard) decoders that can only access the binary measurement outcomes. Moreover, we observe that the soft decoder achieves a threshold 25% higher than any hard decoder for phenomenological noise with Gaussian soft measurement outcomes. We also introduce a soft measurement error model with amplitude damping, in which measurement time leads to a trade-off between measurement resolution and additional disturbance of the qubits. Under this model we observe that the performance of the surface code is very sensitive to the choice of the measurement time: for a distance-19 surface code, a five-fold increase in measurement time can lead to a thousand-fold increase in logical error rate. Moreover, the measurement time that minimizes the physical error rate is distinct from the one that minimizes the logical error rate, pointing to the benefits of jointly optimizing the physical and quantum error correction layers.
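A minimal sketch of how soft information can enter a matching decoder (my illustration, not the authors' code; the Gaussian outcome model follows the phenomenological noise discussed above, and all names and parameters are hypothetical):

```python
import math

def soft_flip_probability(x: float, sigma: float, p_prior: float) -> float:
    """Posterior probability that a measurement outcome was flipped,
    given the observed analog value x, assuming Gaussian outcome
    distributions centered at +1 (no flip) and -1 (flip)."""
    like_no_flip = math.exp(-(x - 1.0) ** 2 / (2 * sigma ** 2))
    like_flip = math.exp(-(x + 1.0) ** 2 / (2 * sigma ** 2))
    num = p_prior * like_flip
    return num / (num + (1.0 - p_prior) * like_no_flip)

def edge_weight(p: float) -> float:
    """Log-likelihood edge weight of the kind used by Minimum Weight
    Perfect Matching and weighted Union-Find decoders."""
    return math.log((1.0 - p) / p)

# A hard decoder thresholds x at 0 and gives every measurement edge the
# same weight; a soft decoder lets confident outcomes (|x| large) carry
# heavier weights and ambiguous ones (x near 0) lighter weights.
for x in (0.9, 0.3, -0.1):
    p = soft_flip_probability(x, sigma=0.6, p_prior=0.01)
    print(f"x={x:+.1f}  p_flip={p:.3f}  weight={edge_weight(p):.2f}")
```

The point of the example: the measurement-error edges of the matching graph are reweighted per outcome rather than with a single flip probability, which is what lets a soft decoder outperform a hard one.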
Holographic quantum error-correcting codes have been proposed as toy models that describe key aspects of the AdS/CFT correspondence. In this work, we introduce a versatile framework of Majorana dimers capturing the intersection of stabilizer and Gaussian Majorana states. This picture allows for an efficient contraction with a simple diagrammatic interpretation and is amenable to analytical study of holographic quantum error-correcting codes. Equipped with this framework, we revisit the recently proposed hyperbolic pentagon code (HyPeC). Relating its logical code basis to Majorana dimers, we efficiently compute boundary state properties even for the non-Gaussian case of generic logical input. The dimers characterizing these boundary states coincide with discrete bulk geodesics, leading to a geometric picture from which properties of entanglement, quantum error correction, and bulk/boundary operator mapping immediately follow. We also elaborate upon the emergence of the Ryu-Takayanagi formula from our model, which realizes many of the properties of the recent bit thread proposal. Our work thus elucidates the connection between bulk geometry, entanglement, and quantum error correction in AdS/CFT, and lays the foundation for new models of holography.
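A hedged sketch of the counting behind the geometric picture (assuming the standard contribution of $\tfrac{1}{2}\ln 2$ from each Majorana dimer cut by the entangling surface; the notation is mine):

$$S_A\;=\;\frac{\ln 2}{2}\,\big|\{\text{dimers connecting }A\text{ to }\bar{A}\}\big|,$$

so when the dimers of the HyPeC boundary states follow discrete bulk geodesics, $S_A$ counts geodesic crossings of the cut, and a discrete Ryu-Takayanagi formula follows.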