
Bounding the convergence time of local probabilistic evolution

Published by: Simon Apers
Publication date: 2017
Research field: Informatics Engineering
Language: English





Isoperimetric inequalities form a very intuitive yet powerful characterization of the connectedness of a state space, one that has proven successful in obtaining convergence bounds. Since the seventies they have formed an essential tool in differential geometry, graph theory and Markov chain analysis. In this paper we use isoperimetric inequalities to construct a bound on the convergence time of any local probabilistic evolution that leaves its limit distribution invariant. We illustrate how this general result leads to new bounds on convergence times beyond the explicit Markovian setting, among others on quantum dynamics.
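
To make the Markov-chain instance of this idea concrete, here is a minimal sketch (illustrative only, not code from the paper) that computes the conductance Phi, a standard isoperimetric quantity, of a small reversible chain by brute force, and evaluates the classical Cheeger-type mixing-time bound $t_{mix}(\epsilon) \le (2/\Phi^2)\ln(1/(\pi_{min}\epsilon))$ for lazy chains. The example chain (a lazy walk on a 6-cycle) and all names are assumptions chosen for illustration.

# Illustrative sketch: conductance (an isoperimetric quantity) and the
# classical Cheeger-type bound on the mixing time of a lazy reversible chain.
from itertools import combinations
import numpy as np

n = 6
P = np.zeros((n, n))
for x in range(n):
    P[x, x] = 0.5                       # laziness
    P[x, (x - 1) % n] = 0.25
    P[x, (x + 1) % n] = 0.25
pi = np.full(n, 1.0 / n)                # uniform stationary distribution

# Conductance: Phi = min over S with pi(S) <= 1/2 of Q(S, S^c) / pi(S),
# where Q(S, S^c) is the stationary probability flow leaving S.
phi = min(
    sum(pi[x] * P[x, y] for x in S for y in range(n) if y not in S)
    / sum(pi[x] for x in S)
    for k in range(1, n // 2 + 1)
    for S in combinations(range(n), k)
)

eps = 0.25
t_mix_bound = (2.0 / phi**2) * np.log(1.0 / (pi.min() * eps))
print(f"conductance Phi = {phi:.4f}, Cheeger-type mixing bound ~ {t_mix_bound:.1f} steps")

The point mirrors the abstract: a purely geometric property of the state space controls how fast the evolution can converge.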


Read also

We develop tools for analyzing focused stochastic local search algorithms. These are algorithms which search a state space probabilistically by repeatedly selecting a constraint that is violated in the current state and moving to a random nearby state which, hopefully, addresses the violation without introducing many new ones. A large class of such algorithms arise from the algorithmization of the Lovász Local Lemma, a non-constructive tool for proving the existence of satisfying states. Here we give tools that provide a unified analysis of such algorithms and of many more, expressing them as instances of a general framework.
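
As a concrete instance of the class of algorithms described above, the following is a minimal sketch, in the spirit of the Moser-Tardos algorithmization of the Lovász Local Lemma, of a focused stochastic local search for SAT: repeatedly pick a violated clause and resample its variables. The toy instance and all names are assumptions, not the paper's framework.

# Illustrative sketch of focused stochastic local search for SAT:
# pick a violated clause, resample the variables occurring in it.
import random

def focused_search(clauses, n_vars, max_steps=100_000, seed=0):
    rng = random.Random(seed)
    # literals are nonzero ints: +v means variable v, -v its negation
    assignment = [rng.choice([False, True]) for _ in range(n_vars + 1)]

    def satisfied(clause):
        return any((lit > 0) == assignment[abs(lit)] for lit in clause)

    for step in range(max_steps):
        violated = [c for c in clauses if not satisfied(c)]
        if not violated:
            return assignment, step          # all constraints addressed
        clause = rng.choice(violated)        # focus on a violated constraint
        for lit in clause:                   # resample its variables
            assignment[abs(lit)] = rng.choice([False, True])
    return None, max_steps

clauses = [(1, 2, -3), (-1, 3, 4), (2, -4, 5), (-2, -5, 3)]
result, steps = focused_search(clauses, n_vars=5)
print("satisfying assignment found" if result else "gave up", "after", steps, "steps")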
Eric Goles, 2000
This paper is devoted to the study of the dynamics of a discrete system related to a self-stabilizing protocol on a ring of processors.
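
For readers unfamiliar with self-stabilization on rings, the sketch below simulates Dijkstra's classic K-state token-ring protocol, a standard example of this kind of system; it is an illustration under that assumption, not necessarily the protocol studied in the paper above. From an arbitrary initial state, the ring converges to a configuration with exactly one privileged processor.

# Illustrative sketch: Dijkstra's K-state self-stabilizing token ring.
import random

def dijkstra_ring(n, steps=500, seed=1):
    rng = random.Random(seed)
    K = n + 1                                     # K > n guarantees stabilization
    state = [rng.randrange(K) for _ in range(n)]  # arbitrary (possibly corrupt) start
    for t in range(steps):
        # processor 0 is privileged iff it equals its left neighbor;
        # every other processor is privileged iff it differs from it
        privileged = [i for i in range(n)
                      if (i == 0 and state[0] == state[-1])
                      or (i > 0 and state[i] != state[i - 1])]
        if len(privileged) == 1:
            return t                              # legitimate: exactly one token
        i = rng.choice(privileged)                # scheduler fires one enabled node
        if i == 0:
            state[0] = (state[-1] + 1) % K
        else:
            state[i] = state[i - 1]
    return None

print("stabilized after", dijkstra_ring(8), "steps")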
Many experiments in the field of quantum foundations seek to adjudicate between quantum theory and speculative alternatives to it. This requires one to analyze the experimental data in a manner that does not presume the correctness of the quantum formalism. The mathematical framework of generalized probabilistic theories (GPTs) provides a means of doing so. We present a scheme for determining which GPTs are consistent with a given set of experimental data. It proceeds by performing tomography on the preparations and measurements in a self-consistent manner, i.e., without presuming a prior characterization of either. We illustrate the scheme by analyzing experimental data for a large set of preparations and measurements on the polarization degree of freedom of a single photon. We find that the smallest and largest GPT state spaces consistent with our data are a pair of polytopes, each approximating the shape of the Bloch sphere and having a volume ratio of $0.977 \pm 0.001$, which provides a quantitative bound on the scope for deviations from quantum theory. We also demonstrate how our scheme can be used to bound the extent to which nature might be more nonlocal than quantum theory predicts, as well as the extent to which it might be more or less contextual. Specifically, we find that the maximal violation of the CHSH inequality can be at most $1.3\% \pm 0.1\%$ greater than the quantum prediction, and the maximal violation of a particular inequality for universal noncontextuality cannot differ from the quantum prediction by more than this factor on either side. The most significant loophole in this sort of analysis is that the set of preparations and measurements one implements might fail to be tomographically complete for the system of interest.
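
As a worked reading of the CHSH figure, assuming the quantum prediction referred to is the Tsirelson bound $S_{quantum} = 2\sqrt{2} \approx 2.828$, the stated $1.3\%$ margin corresponds to

$S_{max} \le 1.013 \cdot 2\sqrt{2} \approx 2.865,$

still well below the algebraic (no-signaling, PR-box) maximum of $4$.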
We consider the variation of von Neumann entropy of subsystem reduced states of general many-body lattice spin systems due to local quantum quenches. We obtain Lieb-Robinson-like bounds that are independent of the subsystem volume. The main assumptions are that the Hamiltonian satisfies a Lieb-Robinson bound and that the volume of spheres on the lattice grows at most exponentially with their radius. More specifically, the bound increases exponentially with time but decreases exponentially with the distance between the subsystem and the region where the quench takes place. The fact that the bound is independent of the subsystem volume leads to stronger constraints (than previously known) on the propagation of information throughout many-body systems. In particular, it shows that bipartite entanglement satisfies an effective light cone, regardless of system size. Further implications for density-matrix renormalization-group simulations of quantum spin chains and limitations on the propagation of information are discussed.
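
Schematically, a bound of the kind described, written here in an assumed illustrative form rather than as the paper's exact statement, reads

$|\Delta S_A(t)| \le C\, e^{(v t - d)/\xi},$

where $d$ is the distance between the subsystem $A$ and the quench region, $v$ plays the role of a Lieb-Robinson velocity, and $C$, $\xi$ are constants independent of the volume of $A$. The exponential growth in $t$ together with the exponential decay in $d$ is what produces the effective light cone for bipartite entanglement.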
Non-idempotent intersection types are used to give a bound on the length of the beta-reduction sequence normalizing a lambda term: namely, the bound is expressed as a function of the size of the term.
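
To illustrate the quantity being bounded (the step count, not the intersection-type technique itself), here is a minimal sketch that counts beta-reduction steps while fully normalizing a lambda term. The term representation and names are assumptions, and bound variable names are taken to be distinct so capture-avoiding substitution is not needed.

# Illustrative sketch: counting beta steps during normalization.
# Terms: ('var', name) | ('lam', name, body) | ('app', fun, arg)

def subst(term, name, value):
    tag = term[0]
    if tag == 'var':
        return value if term[1] == name else term
    if tag == 'lam':
        # assumes all bound names are distinct (no capture handling)
        return term if term[1] == name else ('lam', term[1], subst(term[2], name, value))
    return ('app', subst(term[1], name, value), subst(term[2], name, value))

def normalize(term, steps=0):
    # normalizes the function part first, contracting each redex it finds;
    # assumes the input term is normalizing (otherwise this does not terminate)
    tag = term[0]
    if tag == 'app':
        fun, steps = normalize(term[1], steps)
        if fun[0] == 'lam':                       # contract the redex, count a step
            return normalize(subst(fun[2], fun[1], term[2]), steps + 1)
        arg, steps = normalize(term[2], steps)
        return ('app', fun, arg), steps
    if tag == 'lam':
        body, steps = normalize(term[2], steps)
        return ('lam', term[1], body), steps
    return term, steps

# (\x. x x) (\y. y) normalizes to \y. y in 2 beta steps
term = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))),
        ('lam', 'y', ('var', 'y')))
print(normalize(term))   # (('lam', 'y', ('var', 'y')), 2)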