Operational frameworks are widely used to study the foundations of quantum mechanics, and are sometimes invoked to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments that defend an antirealist reading of quantum physics on the basis of developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. These arguments are examined and shown to be unconvincing. Instead, it is argued that no argument based solely on features of a formalism can favour realist or antirealist attitudes towards quantum mechanics. In particular, both realist and antirealist views can be accommodated within operational formulations of the theory. The reason is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by the formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims to clarify the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent discoveries in quantum foundations. It also presents various realist strategies to account for those developments.
We introduce a framework for simulating quantum measurements based on classical processing of a set of accessible measurements. Well-known concepts such as joint measurability and projective simulability naturally emerge as particular cases of our framework, but our study also leads to novel results and questions. First, a generalisation of joint measurability is derived, which yields a hierarchy for the incompatibility of sets of measurements. A similar hierarchy is defined based on the number of outcomes used to perform the simulation of a given measurement. This general approach also allows us to identify connections between different types of simulability and, in particular, we characterise the qubit measurements that are projective-simulable in terms of joint measurability. Finally, we discuss how our framework can be interpreted in the context of resource theories.
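As a sketch of the simulability relation described above (the notation below is ours, not necessarily the paper's): a target measurement with POVM elements $\{B_b\}$ is simulable by a set of accessible measurements $\{A^{(i)}_a\}$ when it can be recovered by classically choosing which accessible measurement to perform and post-processing its outcome,
\[
B_b = \sum_{i} p(i) \sum_{a} q(b \mid a, i)\, A^{(i)}_a ,
\]
where $p(i)$ is a probability distribution and $q(b \mid a, i)$ are conditional probabilities. Joint measurability arises as the special case in which a single parent measurement simulates every member of a given set, and projective simulability as the case in which the accessible measurements are restricted to projective ones.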
The prevalent modus operandi within the framework of quantum resource theories has been to characterise and harness the resources within single objects, in what we can call single-object quantum resource theories. One can wonder, however, whether the resources contained within multiple different types of objects, now in a multi-object quantum resource theory, can simultaneously be exploited for the benefit of an operational task. In this work, we introduce examples of such multi-object operational tasks in the form of subchannel discrimination and subchannel exclusion games, in which the player harnesses the resources contained within a state-measurement pair. We prove that for any state-measurement pair in which either member is resourceful, there exist discrimination and exclusion games for which such a pair outperforms any possible free state-measurement pair. These results hold for arbitrary convex resources of states, and for arbitrary convex resources of measurements for which classical post-processing is a free operation. Furthermore, we prove that the advantage in these multi-object operational tasks is determined, in a multiplicative manner, by the resource quantifiers of the generalised robustness of resource of both state and measurement for discrimination games, and the weight of resource of both state and measurement for exclusion games.
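For orientation, the generalised robustness mentioned above is standardly defined, for a state $\rho$ and a free set $\mathcal{F}$, as
\[
R(\rho) = \min\Bigl\{ r \ge 0 : \frac{\rho + r\sigma}{1+r} \in \mathcal{F} \ \text{for some state } \sigma \Bigr\},
\]
with an analogous definition for measurements. The multiplicative result stated in the abstract then takes, schematically (our notation, a sketch rather than the paper's precise statement), the form
\[
\max_{\text{games}} \frac{P_{\mathrm{succ}}(\rho, \mathbb{M})}{\max_{\text{free } \sigma, \mathbb{N}} P_{\mathrm{succ}}(\sigma, \mathbb{N})} = \bigl(1 + R(\rho)\bigr)\bigl(1 + R(\mathbb{M})\bigr),
\]
with the weight of resource playing the corresponding role for exclusion games.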
As quantum computers improve in the number of qubits and in fidelity, the question of when they surpass state-of-the-art classical computation for a well-defined computational task is attracting much attention. The leading candidate task for this milestone entails sampling from the output distribution defined by a random quantum circuit. We develop a massively parallel simulation tool, Rollright, that requires neither inter-process communication (IPC) nor proprietary hardware. We also develop two ways to trade circuit fidelity for computational speedups, so as to match the fidelity of a given quantum computer, a task previously thought impossible. We report massive speedups for the sampling task over prior software from Microsoft, IBM, Alibaba and Google, as well as over supercomputer and GPU-based simulations. By using publicly available Google Cloud Computing, we price such simulations and enable comparisons by total cost across hardware platforms. We simulate approximate sampling from the output of a circuit with 7x8 qubits and depth 1+40+1 by producing one million bitstring probabilities with fidelity 0.5%, at an estimated cost of $35,184. The simulation costs scale linearly with fidelity, and using this scaling we estimate that extending the circuit depth to 1+48+1 increases the cost to one million dollars. Scaling the simulation to the 10M bitstring probabilities needed for sampling 1M bitstrings facilitates comparison of simulations with quantum computers. We describe refinements in benchmarks that slow down leading simulators, halving the circuit depth that can be simulated within the same time.
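A minimal back-of-the-envelope sketch of the linear cost-fidelity scaling quoted above, using only the figures stated in the abstract ($35,184 at 0.5% fidelity for the 7x8-qubit, depth 1+40+1 circuit); the script and its names are illustrative and not part of the Rollright tool:

# Cost extrapolation sketch: the abstract reports that simulation cost
# scales linearly with the target fidelity.
REF_FIDELITY = 0.005    # reference fidelity (0.5%) quoted in the abstract
REF_COST_USD = 35184.0  # quoted cost at that fidelity, in US dollars

def estimated_cost_usd(target_fidelity: float) -> float:
    """Linear-in-fidelity extrapolation of the simulation cost."""
    return REF_COST_USD * (target_fidelity / REF_FIDELITY)

if __name__ == "__main__":
    for f in (0.005, 0.01, 0.05):
        print(f"fidelity {f:.1%}: estimated cost ~${estimated_cost_usd(f):,.0f}")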
The term measurement in quantum theory (as well as in other physical theories) is ambiguous: It is used to describe both an experience, e.g., an observation in an experiment, and an interaction with the system under scrutiny. If doing physics is regarded as a creative activity to develop a meaningful description of the world, then one has to carefully discriminate between the two notions: An observer's account of experience, constitutive of meaning, is hardly expressed exhaustively by the formal framework of an interaction within one particular theory. We develop a corresponding perspective on central terms in quantum mechanics in general, and on the measurement problem in particular.
Entangled K0 anti-K0 pairs are shown to be suitable for discussing extensions and tests of Bohr's complementarity principle through the quantum marking and quantum erasure techniques suggested by M. O. Scully and K. Druehl [Phys. Rev. A 25, 2208 (1982)]. Strangeness oscillations play the role of the traditional interference pattern linked to wave-like behaviour, whereas the distinct propagation in free space of the K_S and K_L components mimics the two possible interferometric paths taken by particle-like objects.
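For concreteness, the strangeness oscillations referred to above follow the standard single-kaon expression (neglecting CP violation; notation is ours): the probability of finding a K0 again after a proper time t is
\[
P\bigl(K^0 \to K^0, t\bigr) = \tfrac{1}{4}\Bigl[ e^{-\Gamma_S t} + e^{-\Gamma_L t} + 2\, e^{-(\Gamma_S + \Gamma_L)t/2} \cos(\Delta m\, t) \Bigr],
\]
where $\Gamma_S$ and $\Gamma_L$ are the K_S and K_L decay widths and $\Delta m$ their mass difference. The $\cos(\Delta m\, t)$ term supplies the interference pattern whose visibility is traded against which-path (K_S versus K_L) information in the quantum marking and erasure scheme.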