
Sequences of projective measurements in generalized probabilistic models

Added by: Matthias Kleinmann
Publication date: 2014
Field: Physics
Language: English





We define a simple rule that allows one to describe sequences of projective measurements for a broad class of generalized probabilistic models. This class embraces quantum mechanics and classical probability theory, but also, for example, the hypothetical Popescu-Rohrlich box. For quantum mechanics, the definition yields the established Lüders rule, the standard prescription for updating the quantum state after a measurement. In the general case it can be seen as the least disturbing, or most coherent, way to perform sequential measurements. As an example, we show that Spekkens' toy model is an instance of our definition. We also demonstrate the possibility of strong post-quantum correlations as well as the existence of triple-slit correlations for certain non-quantum toy models.
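For orientation, the Lüders rule mentioned above has the standard quantum-mechanical form (general background, not quoted from the paper): when a projective measurement with projectors \Pi_i yields outcome i, which occurs with probability \operatorname{tr}(\Pi_i \rho), the state is updated as

    \rho \mapsto \frac{\Pi_i \, \rho \, \Pi_i}{\operatorname{tr}(\Pi_i \rho)} .

The paper's rule reduces to this update when the generalized probabilistic model is quantum mechanics.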




Related research

Standard projective measurements represent a subset of all possible measurements in quantum physics, which are described by positive-operator-valued measures (POVMs). We study which quantum measurements are projective-simulable, that is, can be simulated by using projective measurements and classical randomness. We first prove that every measurement on a given quantum system can be realised by classical processing of projective measurements on the system plus an ancilla of the same dimension. Then, given a general measurement in dimension two or three, we show that whether it is projective-simulable can be decided by means of semidefinite programming. We also establish conditions for the simulation of measurements using projective ones that are valid for any dimension. As an application of our formalism, we improve the range of visibilities for which two-qubit Werner states do not violate any Bell inequality for all measurements. From an implementation point of view, our work provides bounds on the amount of noise a measurement tolerates before losing any advantage over projective ones.
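As a loose illustration of the simulability question, and not the paper's actual semidefinite program, the following sketch discretizes the rank-1 projective effects of a qubit and asks, via a linear-programming feasibility problem, whether a noisy dichotomic POVM {E0, I - E0} is a convex mixture of projective measurements and trivial (relabelled) ones. The cvxpy dependency, the grid of 64 angles, and the visibility v = 0.8 are assumptions of this sketch.

    import numpy as np
    import cvxpy as cp

    I2 = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])

    def projector(theta):
        # Rank-1 projector with Bloch vector (sin(theta), 0, cos(theta)).
        return 0.5 * (I2 + np.sin(theta) * X + np.cos(theta) * Z)

    # Candidate outcome-0 effects: a grid of projectors, plus the trivial
    # effects 0 and I (projective measurements followed by relabelling).
    thetas = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    cands = [projector(t) for t in thetas] + [np.zeros((2, 2)), I2]

    # Target: a depolarized projective effect E0 = v |0><0| + (1 - v)/2 I.
    v = 0.8
    E0 = v * np.diag([1.0, 0.0]) + (1.0 - v) / 2.0 * I2

    # Feasibility LP: a probability distribution w over the candidates
    # with sum_k w_k P_k = E0 certifies that {E0, I - E0} is simulable
    # within this discretized family of projective measurements.
    w = cp.Variable(len(cands), nonneg=True)
    mix = sum(w[k] * cands[k] for k in range(len(cands)))
    prob = cp.Problem(cp.Minimize(0), [cp.sum(w) == 1, mix == E0])
    prob.solve()
    print(prob.status)  # 'optimal' -> a simulating mixture exists

Every dichotomic qubit POVM is in fact projective-simulable, so this toy problem is always feasible; the paper's semidefinite programs treat general multi-outcome measurements in dimensions two and three.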
In this work, we investigate measurement incompatibility in general probabilistic theories (GPTs). We show several equivalent characterizations of compatible measurements. The first is in terms of the positivity of associated maps. The second relates compatibility to the inclusion of certain generalized spectrahedra. For this, we extend the theory of free spectrahedra to ordered vector spaces. The third characterization connects the compatibility of dichotomic measurements to the ratio of tensor crossnorms of Banach spaces. We use these characterizations to study the amount of incompatibility present in different GPTs, i.e. their compatibility regions. For centrally symmetric GPTs, we show that the compatibility degree is given as the ratio of the injective and the projective norm of the tensor product of associated Banach spaces. This allows us to completely characterize the compatibility regions of several GPTs, and to obtain optimal universal bounds on the compatibility degree in terms of the 1-summing constants of the associated Banach spaces. Moreover, we find new bounds on the maximal incompatibility present in more than three qubit measurements.
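For orientation, in the familiar quantum special case the compatibility of two dichotomic measurements has a standard characterization (general background, not quoted from the paper): effects A and B with 0 \le A, B \le \mathbb{1} are compatible if and only if there exists an operator G with

    G \ge 0, \qquad G \le A, \qquad G \le B, \qquad A + B - \mathbb{1} \le G ,

in which case G_{11} = G, G_{10} = A - G, G_{01} = B - G, and G_{00} = \mathbb{1} - A - B + G form a joint measurement with A and B as marginals. The paper's characterizations generalize conditions of this kind from operators to arbitrary ordered vector spaces.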
Bartosz Regula, 2021
The difficulty in manipulating quantum resources deterministically often necessitates the use of probabilistic protocols, but the characterization of their capabilities and limitations has been lacking. Here, we develop two general approaches to this problem. First, we introduce a new resource monotone based on the Hilbert projective metric and we show that it obeys a very strong type of monotonicity: it can rule out all transformations, probabilistic or deterministic, between states in any quantum resource theory. This allows us to place fundamental limitations on state transformations and restrict the advantages that probabilistic protocols can provide over deterministic ones, significantly strengthening previous findings and extending recent no-go theorems. We apply our results to obtain a substantial improvement in lower bounds for the errors and overheads of probabilistic distillation protocols, directly applicable to tasks such as entanglement or magic state distillation, and computable through convex optimization. In broad classes of resources, we show that no better restrictions on probabilistic protocols are possible -- our monotone can provide a necessary and sufficient condition for probabilistic resource transformations, thus allowing us to quantify exactly the highest fidelity achievable in resource distillation tasks by means of any probabilistic manipulation protocol. Complementing this approach, we introduce a general method for bounding achievable probabilities in resource transformations through a family of convex optimization problems. We show it to tightly characterize single-shot probabilistic distillation in broad types of resource theories, allowing an exact analysis of the trade-offs between the probabilities and errors in distilling maximally resourceful states.
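For reference, the Hilbert projective metric underlying the monotone is standardly defined, for positive operators \rho and \sigma with equal supports, as (general background, not quoted from the paper)

    d_H(\rho, \sigma) = \ln\bigl( \inf\{\lambda > 0 : \rho \le \lambda\sigma\} \cdot \inf\{\mu > 0 : \sigma \le \mu\rho\} \bigr) ,

and the analogous definition makes sense relative to any proper cone, not only the cone of positive semidefinite operators.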
In this note we lay some groundwork for the resource theory of thermodynamics in general probabilistic theories (GPTs). We consider theories satisfying a purely convex abstraction of the spectral decomposition of density matrices: that every state has a decomposition, with unique probabilities, into perfectly distinguishable pure states. The spectral entropy, and analogues using other Schur-concave functions, can be defined as the entropy of these probabilities. We describe additional conditions under which the outcome probabilities of a fine-grained measurement are majorized by those of a spectral measurement, so that the spectral entropy coincides with the measurement entropy (and is therefore concave). These conditions are (1) projectivity, which abstracts aspects of the Lüders-von Neumann projection postulate in quantum theory, in particular that every face of the state space is the positive part of the image of a certain kind of projection operator called a filter; and (2) symmetry of transition probabilities. The conjunction of these, as shown earlier by Araki, is equivalent to a strong geometric property of the unnormalized state cone known as perfection: that there is an inner product according to which every face of the cone, including the cone itself, is self-dual. Using some assumptions about the thermodynamic cost of certain processes, partially motivated by our postulates (especially projectivity), we extend von Neumann's argument that the thermodynamic entropy of a quantum system is its spectral entropy to generalized probabilistic systems satisfying spectrality.
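Concretely, in the setting of the abstract: if a state \omega decomposes as \omega = \sum_i p_i \omega_i into perfectly distinguishable pure states \omega_i, with the probabilities p_i unique by assumption, the spectral entropy is

    S(\omega) = -\sum_i p_i \log p_i ,

which in quantum theory reduces to the von Neumann entropy -\operatorname{tr}(\rho \log \rho) via the eigendecomposition of the density matrix.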
In this paper, we investigate a characterization of quantum mechanics by two physical principles based on general probabilistic theories. We first give an operationally motivated definition of the physical equivalence of states and consider the principle of the physical equivalence of pure states, which turns out to be equivalent to the symmetric structure of the state space. We further consider another principle, decomposability with distinguishable pure states. We give classification theorems of the state spaces for each principle, and derive the Bloch ball in 2- and 3-dimensional systems from these principles.
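For reference, the Bloch ball mentioned above is the standard qubit state space (general background, not a formula from the paper): every state of a two-level quantum system can be written as

    \rho = \tfrac{1}{2}\bigl(\mathbb{1} + \vec{r} \cdot \vec{\sigma}\bigr) , \qquad |\vec{r}| \le 1 ,

with the pure states on the boundary sphere; the 3-dimensional ball corresponds to the complex qubit, while the 2-dimensional disc arises for a real (rebit) system.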
