
Incompatibility in general probabilistic theories, generalized spectrahedra, and tensor norms

 Added by Ion Nechita
 Publication date: 2020
 Field: Physics
 Language: English





In this work, we investigate measurement incompatibility in general probabilistic theories (GPTs). We show several equivalent characterizations of compatible measurements. The first is in terms of the positivity of associated maps. The second relates compatibility to the inclusion of certain generalized spectrahedra. For this, we extend the theory of free spectrahedra to ordered vector spaces. The third characterization connects the compatibility of dichotomic measurements to the ratio of tensor crossnorms of Banach spaces. We use these characterizations to study the amount of incompatibility present in different GPTs, i.e., their compatibility regions. For centrally symmetric GPTs, we show that the compatibility degree is given as the ratio of the injective and the projective norm of the tensor product of associated Banach spaces. This allows us to completely characterize the compatibility regions of several GPTs, and to obtain optimal universal bounds on the compatibility degree in terms of the 1-summing constants of the associated Banach spaces. Moreover, we find new bounds on the maximal incompatibility present in collections of more than three qubit measurements.
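For the simplest quantum instance of the compatibility regions discussed above, there is a well-known closed-form test: two unbiased noisy qubit dichotomic observables with Bloch vectors a and b are jointly measurable iff |a + b| + |a - b| <= 2 (the Busch criterion). The criterion is not stated in the abstract; this is a minimal numerical sketch, with function names of our own choosing:

```python
import numpy as np

def compatible(a, b):
    """Busch criterion: two unbiased noisy qubit dichotomic observables
    with Bloch vectors a, b (|a|, |b| <= 1) are jointly measurable
    iff |a + b| + |a - b| <= 2."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.linalg.norm(a + b) + np.linalg.norm(a - b) <= 2 + 1e-12

# Sharp X and Z measurements are incompatible ...
x, z = np.array([1.0, 0, 0]), np.array([0, 0, 1.0])
print(compatible(x, z))              # False: |x+z| + |x-z| = 2*sqrt(2) > 2
# ... but become compatible once each is smeared by noise eta <= 1/sqrt(2),
# matching the qubit compatibility degree of 1/sqrt(2).
eta = 1 / np.sqrt(2)
print(compatible(eta * x, eta * z))  # True: boundary case, 2*sqrt(2)*eta = 2
```

The noise threshold 1/sqrt(2) recovered here is the standard quantum-mechanical data point against which the GPT compatibility regions of the paper can be compared.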



Related research

We introduce and study a class of entanglement criteria based on the idea of applying local contractions to an input multipartite state and then computing the projective tensor norm of the output. More precisely, we apply to a mixed quantum state a tensor product of contractions from the Schatten class $S_1$ to the Euclidean space $\ell_2$, which we call entanglement testers. We analyze the performance of this type of criterion on bipartite and multipartite systems, for general pure and mixed quantum states, as well as on some important classes of symmetric quantum states. We also show that previously studied entanglement criteria, such as the realignment and the SIC POVM criteria, can be viewed inside this framework. This allows us to answer in the positive two conjectures of Shang, Asadian, Zhu, and Gühne by deriving systematic relations between the performance of these two criteria.
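The realignment criterion mentioned above has a concrete form: a separable state has realignment trace norm at most 1, so a value above 1 certifies entanglement. A minimal numpy sketch, using our own indexing conventions rather than the testers formalism of the paper:

```python
import numpy as np

def realign(rho, dA, dB):
    """Realignment map: R(rho)_{(i,j),(k,l)} = rho_{(i,k),(j,l)},
    where i, j index system A and k, l index system B."""
    r = rho.reshape(dA, dB, dA, dB)          # axes: (i, k, j, l)
    return r.transpose(0, 2, 1, 3).reshape(dA * dA, dB * dB)

def ccnr_value(rho, dA, dB):
    """Trace (nuclear) norm of the realigned matrix; a value > 1
    signals entanglement (realignment / CCNR criterion)."""
    return np.linalg.norm(realign(rho, dA, dB), ord='nuc')

# Bell state (|00> + |11>)/sqrt(2): realigned matrix is I/2, norm 2 > 1.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
bell = np.outer(psi, psi.conj())
print(ccnr_value(bell, 2, 2))    # 2.0 -> entangled

# Maximally mixed two-qubit state: norm 0.5 <= 1, consistent with separability.
mixed = np.eye(4) / 4
print(ccnr_value(mixed, 2, 2))   # 0.5
```

The value 2 achieved by the Bell state is the maximum for two qubits, which is why the realignment norm also serves as a rough entanglement quantifier.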
In this note we lay some groundwork for the resource theory of thermodynamics in general probabilistic theories (GPTs). We consider theories satisfying a purely convex abstraction of the spectral decomposition of density matrices: that every state has a decomposition, with unique probabilities, into perfectly distinguishable pure states. The spectral entropy, and analogues using other Schur-concave functions, can be defined as the entropy of these probabilities. We describe additional conditions under which the outcome probabilities of a fine-grained measurement are majorized by those for a spectral measurement, and therefore the spectral entropy is the measurement entropy (and therefore concave). These conditions are (1) projectivity, which abstracts aspects of the Lüders-von Neumann projection postulate in quantum theory, in particular that every face of the state space is the positive part of the image of a certain kind of projection operator called a filter; and (2) symmetry of transition probabilities. The conjunction of these, as shown earlier by Araki, is equivalent to a strong geometric property of the unnormalized state cone known as perfection: that there is an inner product according to which every face of the cone, including the cone itself, is self-dual. Using some assumptions about the thermodynamic cost of certain processes that are partially motivated by our postulates, especially projectivity, we extend von Neumann's argument that the thermodynamic entropy of a quantum system is its spectral entropy to generalized probabilistic systems satisfying spectrality.
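In the quantum special case the majorization statement above is concrete: the spectrum of a density matrix majorizes the outcome distribution of any fine-grained projective measurement, so by Schur concavity the spectral entropy is the minimal measurement entropy. A small sketch (function names are ours, standard quantum mechanics rather than the GPT setting):

```python
import numpy as np

def majorizes(p, q):
    """True if probability vector p majorizes q: sorted partial sums
    of p dominate those of q."""
    p, q = np.sort(p)[::-1], np.sort(q)[::-1]
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - 1e-12))

def spectral_entropy(rho):
    """Shannon entropy of the spectral probabilities of a density
    matrix -- the quantum instance of the GPT spectral entropy."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

rho = np.array([[0.75, 0.25], [0.25, 0.25]])
spec = np.sort(np.linalg.eigvalsh(rho))[::-1]
diag = np.diag(rho)  # outcome distribution of a computational-basis measurement
print(majorizes(spec, diag))   # True: spectrum majorizes the diagonal
# Hence the basis-measurement entropy is at least the spectral entropy.
print(spectral_entropy(rho) <= -np.sum(diag * np.log2(diag)))  # True
```

The first check is the Schur-Horn fact that the diagonal of a Hermitian matrix is majorized by its spectrum, which is exactly the mechanism the abstract abstracts to GPTs.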
Resource theories provide a general framework for the characterization of properties of physical systems in quantum mechanics and beyond. Here, we introduce methods for the quantification of resources in general probabilistic theories (GPTs), focusing in particular on the technical issues associated with infinite-dimensional state spaces. We define a universal resource quantifier based on the robustness measure, and show it to admit a direct operational meaning: in any GPT, it quantifies the advantage that a given resource state enables in channel discrimination tasks over all resourceless states. We show that the robustness acts as a faithful and strongly monotonic measure in any resource theory described by a convex and closed set of free states, and can be computed through a convex conic optimization problem. Specializing to continuous-variable quantum mechanics, we obtain additional bounds and relations, allowing an efficient computation of the measure and comparison with other monotones. We demonstrate applications of the robustness to several resources of physical relevance: optical nonclassicality, entanglement, genuine non-Gaussianity, and coherence. In particular, we establish exact expressions for various classes of states, including Fock states and squeezed states in the resource theory of nonclassicality and general pure states in the resource theory of entanglement, as well as tight bounds applicable in general cases.
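For pure states in the resource theory of coherence, one of the settings the robustness is applied to above, the measure is known in the literature to admit a closed form: for amplitudes c_i in the incoherent basis, the robustness equals (sum_i |c_i|)^2 - 1. This formula is a known result from the coherence literature, not derived in the abstract; a minimal sketch under that assumption:

```python
import numpy as np

def robustness_of_coherence_pure(psi):
    """Closed form for the robustness of coherence of a PURE state
    (known result in the coherence literature, assumed here):
    R(|psi>) = (sum_i |c_i|)^2 - 1 for amplitudes c_i in the
    incoherent basis. Not valid for general mixed states."""
    psi = np.asarray(psi, complex)
    psi = psi / np.linalg.norm(psi)
    return float(np.sum(np.abs(psi)) ** 2 - 1)

# |+> = (|0> + |1>)/sqrt(2): robustness 1.
print(robustness_of_coherence_pure([1, 1]))   # 1.0
# An incoherent basis state carries no resource.
print(robustness_of_coherence_pure([0, 1]))   # 0.0
```

For mixed states, and for the infinite-dimensional resources treated in the paper, the robustness is instead obtained from the convex conic optimization problem the abstract describes.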
In this paper, we investigate a characterization of quantum mechanics by two physical principles based on general probabilistic theories. We first give an operationally motivated definition of the physical equivalence of states and consider the principle of the physical equivalence of pure states, which turns out to be equivalent to the symmetric structure of the state space. We further consider another principle, the decomposability into distinguishable pure states. We give classification theorems of the state spaces for each principle, and derive the Bloch ball in 2- and 3-dimensional systems from these principles.
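The Bloch ball derived above can be checked numerically on the quantum side: writing a qubit state as rho = (I + r.sigma)/2, positivity holds exactly when |r| <= 1, so the state space is the unit ball. A minimal sketch of that standard fact (not the GPT derivation itself):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], complex)
sy = np.array([[0, -1j], [1j, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)

def qubit_state(r):
    """rho = (I + r.sigma)/2; eigenvalues are (1 +/- |r|)/2, so this
    is a valid density matrix iff |r| <= 1 (the Bloch ball)."""
    r = np.asarray(r, float)
    return (np.eye(2) + r[0] * sx + r[1] * sy + r[2] * sz) / 2

def is_state(rho, tol=1e-9):
    """Unit trace and positive semidefiniteness."""
    evals = np.linalg.eigvalsh(rho)
    return abs(np.trace(rho).real - 1) < tol and evals.min() >= -tol

print(is_state(qubit_state([0, 0, 1])))        # True: pure state on the sphere
print(is_state(qubit_state([0.8, 0.6, 0.5])))  # False: |r| = sqrt(1.25) > 1
```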
Matthias Kleinmann, 2014
We define a simple rule that allows one to describe sequences of projective measurements for a broad class of generalized probabilistic models. This class embraces quantum mechanics and classical probability theory but also, for example, the hypothetical Popescu-Rohrlich box. For quantum mechanics, the definition yields the established Lüders rule, the standard rule for updating the quantum state after a measurement. In the general case it can be seen as the least disturbing, or most coherent, way to perform sequential measurements. As an example we show that Spekkens' toy model is an instance of our definition. We also demonstrate the possibility of strong post-quantum correlations as well as the existence of triple-slit correlations for certain non-quantum toy models.
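In the quantum case the update rule recovered above is the familiar Lüders projection, rho -> P rho P / tr(P rho P) upon obtaining the outcome with projector P. A minimal sketch (helper names are ours):

```python
import numpy as np

def luders_update(rho, P):
    """Lueders rule: given outcome projector P, the post-measurement
    state is P rho P / tr(P rho P); also returns the outcome
    probability tr(P rho P)."""
    post = P @ rho @ P
    p = np.trace(post).real
    if p <= 0:
        raise ValueError("outcome has probability zero")
    return post / p, p

# Measure the projector |+><+| on the maximally mixed qubit.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
rho = np.eye(2) / 2
post, prob = luders_update(rho, plus)
print(prob)                      # 0.5
print(np.allclose(post, plus))   # True: the state collapses onto |+>
```

The generalized rule of the paper reduces to this map for quantum models while remaining meaningful for non-quantum state spaces such as the Popescu-Rohrlich box.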
