We present a source of entangled photons that violates a Bell inequality free of the fair-sampling assumption by more than 7 standard deviations. This is the first photonic experiment to close the detection loophole, and we demonstrate enough efficiency overhead to eventually perform a fully loophole-free test of local realism. The entanglement quality is verified by maximally violating additional Bell tests, probing the upper limit of quantum correlations. Finally, we use the source to generate secure private quantum random numbers at rates more than 4 orders of magnitude beyond previous experiments.
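Neither the additional Bell tests nor the random-number protocol is spelled out in the abstract; purely as a minimal illustration of the "upper limit of quantum correlations" being probed, the following sketch evaluates the CHSH expression for a maximally entangled state at standard settings and compares it with the Tsirelson bound $2\sqrt{2}$ (the state, angles, and helper names are ours, not taken from the paper).

```python
import numpy as np

# Pauli operators and the maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
Z = np.array([[1, 0], [0, -1]])
X = np.array([[0, 1], [1, 0]])
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

def observable(theta):
    """Dichotomic spin observable cos(theta)*Z + sin(theta)*X with eigenvalues +/-1."""
    return np.cos(theta) * Z + np.sin(theta) * X

def correlation(a, b, psi):
    """Quantum prediction E(a, b) = <psi| A(a) (x) B(b) |psi>."""
    op = np.kron(observable(a), observable(b))
    return float(np.real(psi.conj() @ op @ psi))

# Standard CHSH-optimal settings (illustrative choice); E(a, b) = cos(a - b) here.
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, -np.pi / 4

S = (correlation(a1, b1, phi_plus) + correlation(a1, b2, phi_plus)
     + correlation(a2, b1, phi_plus) - correlation(a2, b2, phi_plus))

print(f"CHSH value S = {S:.4f}")  # ~2.828
print(f"Tsirelson bound = {2 * np.sqrt(2):.4f}, local-realistic bound = 2")
```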
We discuss the problem of finding the most favorable conditions for closing the detection loophole in a test of local realism with a Bell inequality. For a generic non-maximally entangled two-qubit state and two alternative measurement bases, we apply Hardy's proof of non-locality without inequalities and derive an Eberhard-like inequality. For an infinite class of non-maximally entangled states we find that it is possible to refute local realism by requiring perfect detection efficiency for only one of the two measurements: the test is free of the detection loophole for any value of the detection efficiency associated with the other measurement. The maximum tolerable noise in a loophole-free test is also evaluated.
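The paper's own Eberhard-like inequality is not reproduced in the abstract; the sketch below uses one common CH/Eberhard-type combination, $J = P(++|a_1 b_1) - P(+0|a_1 b_2) - P(0+|a_2 b_1) - P(++|a_2 b_2) \le 0$ under local realism, where "0" bins the orthogonal outcome together with non-detections, and numerically maximizes $J$ for a non-maximally entangled state $\cos\theta\,|HH\rangle + \sin\theta\,|VV\rangle$ at equal detector efficiencies of 90% (a simplified symmetric scenario, not the paper's asymmetric-setting analysis).

```python
import numpy as np
from scipy.optimize import differential_evolution

def state(theta):
    """Non-maximally entangled state cos(theta)|HH> + sin(theta)|VV>."""
    psi = np.zeros(4)
    psi[0], psi[3] = np.cos(theta), np.sin(theta)
    return psi

def projector(angle):
    """Projector onto the linear polarization cos(angle)|H> + sin(angle)|V>."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

def p_pass(psi, a=None, b=None):
    """Probability that the photon(s) pass the analyzer(s) set to angle a and/or b."""
    PA = projector(a) if a is not None else np.eye(2)
    PB = projector(b) if b is not None else np.eye(2)
    return float(psi @ np.kron(PA, PB) @ psi)

def eberhard_J(params, eta):
    """CH/Eberhard-type combination; '+' means a detector click, '0' anything else."""
    theta, a1, a2, b1, b2 = params
    psi = state(theta)
    p_pp = lambda a, b: eta**2 * p_pass(psi, a, b)            # both click
    p_p0 = lambda a, b: eta * p_pass(psi, a=a) - p_pp(a, b)   # Alice clicks, Bob does not
    p_0p = lambda a, b: eta * p_pass(psi, b=b) - p_pp(a, b)   # Bob clicks, Alice does not
    return p_pp(a1, b1) - p_p0(a1, b2) - p_0p(a2, b1) - p_pp(a2, b2)

# Local realism requires J <= 0.  With non-maximally entangled states, Eberhard's
# bound can be violated for symmetric efficiencies above 2/3; here we check 90%.
eta = 0.9
bounds = [(0.0, np.pi / 4)] + [(-np.pi / 2, np.pi / 2)] * 4
opt = differential_evolution(lambda p: -eberhard_J(p, eta), bounds, seed=1)
print(f"max J at eta = {eta}: {-opt.fun:.4f} (theta = {opt.x[0]:.3f} rad)")
```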
We propose a feasible optical setup allowing for a loophole-free Bell test with efficient homodyne detection. A non-Gaussian entangled state is generated from a two-mode squeezed vacuum by subtracting a single photon from each mode, using beam splitters and standard low-efficiency single-photon detectors. A Bell violation exceeding 1% is achievable with 6-dB squeezed light and a homodyne efficiency around 95%. A detailed feasibility analysis, based upon the recent generation of single-mode non-Gaussian states, confirms that this method opens a promising avenue towards a complete experimental Bell test.
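The quoted 6 dB of squeezing maps onto a squeezing parameter $r \approx 0.69$; the short helper below performs that standard conversion and gives the corresponding mean photon number per mode of the two-mode squeezed vacuum (the function names are ours, for illustration only).

```python
import numpy as np

def squeezing_db_to_r(db):
    """Convert a squeezing level in dB to the squeezing parameter r,
    using V_squeezed / V_vacuum = exp(-2r), i.e. dB = 10*log10(exp(2r))."""
    return db * np.log(10) / 20

def tmsv_mean_photons(r):
    """Mean photon number per mode of a two-mode squeezed vacuum."""
    return np.sinh(r) ** 2

r = squeezing_db_to_r(6.0)
print(f"6 dB of squeezing corresponds to r = {r:.3f}")        # ~0.691
print(f"mean photons per mode = {tmsv_mean_photons(r):.3f}")  # ~0.56
```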
We present a loophole-free violation of local realism using entangled photon pairs. We ensure that all relevant events in our Bell test are spacelike separated by placing the parties far enough apart and by using fast random number generators and high-speed polarization measurements. A high-quality polarization-entangled source of photons, combined with high-efficiency, low-noise, single-photon detectors, allows us to make measurements without requiring any fair-sampling assumptions. Using a hypothesis test, we compute p-values as small as $5.9\times 10^{-9}$ for our Bell violation while maintaining the spacelike separation of our events. We estimate the degree to which a local realistic system could predict our measurement choices. Accounting for this predictability, our smallest adjusted p-value is $2.3\times 10^{-7}$. We therefore reject the hypothesis that local realism governs our experiment.
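The paper's actual statistical analysis is not described in the abstract; purely as an illustration of attaching a hypothesis-test p-value to a Bell violation, the sketch below bounds the probability that an i.i.d. local-realistic strategy, which wins each round of the CHSH game with probability at most 3/4, produces at least the observed number of wins. The trial counts are hypothetical, and a rigorous analysis must also handle memory effects and setting predictability, which this simple binomial tail does not.

```python
from scipy.stats import binom

def chsh_game_p_value(n_trials, n_wins, p_lr=0.75):
    """Upper-tail p-value: probability that an i.i.d. local-realistic strategy,
    winning each CHSH-game round with probability at most p_lr = 3/4,
    produces at least n_wins wins in n_trials rounds."""
    return binom.sf(n_wins - 1, n_trials, p_lr)

# Hypothetical numbers, for illustration only: 10,000 rounds with a 77% win rate.
print(chsh_game_p_value(10_000, 7_700))   # ~1e-6
```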
We provide a detailed analysis of the recently proposed setup for a loophole-free test of a Bell inequality using conditionally generated non-Gaussian states of light and balanced homodyning. In the proposed scheme, a two-mode squeezed vacuum state is de-Gaussified by subtracting a single photon from each mode with the use of an unbalanced beam splitter and a standard low-efficiency single-photon detector. We thoroughly discuss the dependence of the achievable Bell violation on the various relevant experimental parameters, such as the detector efficiencies, the electronic noise and the mixedness of the initial Gaussian state. We also consider several alternative schemes involving squeezed states, linear optical elements, conditional photon subtraction and homodyne detection.
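One standard way to model the imperfections that such an analysis scans over is to treat homodyne inefficiency as a beamsplitter admixing vacuum into the measured quadrature and electronic noise as an independent additive Gaussian; the snippet below applies that model to simulated quadrature samples (the efficiency and noise figures are placeholders, not values from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def measured_quadrature(x_signal, eta, electronic_noise_var, rng):
    """Imperfect homodyne measurement of the quadrature samples x_signal:
    efficiency eta acts as a beamsplitter admixing vacuum (variance 1/2 in
    hbar = 1 units), and electronic noise adds an independent Gaussian."""
    vacuum = rng.normal(0.0, np.sqrt(0.5), size=np.shape(x_signal))
    electronic = rng.normal(0.0, np.sqrt(electronic_noise_var), size=np.shape(x_signal))
    return np.sqrt(eta) * x_signal + np.sqrt(1 - eta) * vacuum + electronic

# Example: 10^5 samples of a squeezed quadrature (variance 0.5*exp(-2r)) seen
# through a 95%-efficient detector with a small amount of electronic noise.
r = 0.69
x_ideal = rng.normal(0.0, np.sqrt(0.5 * np.exp(-2 * r)), size=100_000)
x_meas = measured_quadrature(x_ideal, eta=0.95, electronic_noise_var=0.01, rng=rng)
expected = 0.95 * 0.5 * np.exp(-2 * r) + 0.05 * 0.5 + 0.01
print(f"ideal variance    = {x_ideal.var():.4f}")
print(f"measured variance = {x_meas.var():.4f}  (model prediction {expected:.4f})")
```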
We show unambiguous violations of different macrorealist inequalities, namely the Leggett-Garg inequality (LGI) and the Wigner form of the LGI (WLGI), using a heralded single-photon experimental setup comprising a Mach-Zehnder interferometer followed by a displaced Sagnac interferometer. Negative-result measurements (NRMs) are implemented in order to validate the assumption of non-invasive measurability used in defining macrorealism. Among all the experiments to date testing macrorealism, the present experiment stands out in comprehensively addressing the relevant loopholes. The clumsiness loophole is addressed through precision testing of any classical invasiveness involved in the implementation of the NRMs. This is done by suitably choosing the experimental parameters so that the quantum-mechanically predicted validity of all the relevant two-time no-signalling-in-time (NSIT) conditions is maintained in all three pairwise experiments performed to show the LGI/WLGI violation. Further, and importantly, the detection-efficiency loophole is addressed by adopting suitable modifications in the measurement strategy, enabling the demonstration of the violation of the LGI/WLGI for any non-zero detection efficiency. We also show how other relevant loopholes, such as the multiphoton-emission loophole, the coincidence loophole, and the preparation-state loophole, are closed in the present experiment. We report an LGI violation of $1.32\pm 0.04$ and a WLGI violation of $0.10\pm 0.02$, where the magnitudes of violation are, respectively, 8 and 5 times the corresponding errors, in full agreement with the ranges of the quantum-mechanically predicted values of the LGI and WLGI expressions estimated by taking into account the non-idealities of the actual experiment. Simultaneously, the experimentally observed probabilities satisfy all the two-time NSIT conditions up to the order of $10^{-2}$, which ensures non-invasiveness in the implemented NRMs.
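The specific LGI and WLGI expressions are not given in the abstract; as a minimal textbook illustration of a Leggett-Garg test, the sketch below evaluates the three-time combination $K_3 = C_{12} + C_{23} - C_{13}$ for a coherently oscillating two-level system, where macrorealism requires $K_3 \le 1$ and quantum mechanics allows up to $3/2$ (this standard form may differ from the inequalities actually used in the experiment).

```python
import numpy as np

# Qubit precessing under H = (omega/2) * sigma_x, dichotomic observable Q = sigma_z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
proj = {+1: np.array([[1, 0], [0, 0]], dtype=complex),
        -1: np.array([[0, 0], [0, 1]], dtype=complex)}

def U(phase):
    """Unitary exp(-i * phase * sigma_x / 2) for the evolution between measurements."""
    return np.cos(phase / 2) * np.eye(2) - 1j * np.sin(phase / 2) * sx

def correlator(rho0, phase):
    """Two-time correlator <Q(t) Q(t + tau)> with projective sigma_z measurements,
    where phase = omega * tau."""
    u = U(phase)
    total = 0.0
    for q1, P1 in proj.items():
        p1 = np.trace(P1 @ rho0 @ P1).real
        if p1 == 0:
            continue
        rho1 = u @ ((P1 @ rho0 @ P1) / p1) @ u.conj().T   # post-measurement state, evolved
        for q2, P2 in proj.items():
            p2 = np.trace(P2 @ rho1 @ P2).real
            total += q1 * q2 * p1 * p2
    return total

rho0 = np.eye(2) / 2                  # maximally mixed initial state
phase = np.pi / 3                     # omega * tau chosen at the quantum optimum
C12 = C23 = correlator(rho0, phase)   # the mixed state is stationary, so C12 = C23
C13 = correlator(rho0, 2 * phase)
K3 = C12 + C23 - C13
print(f"K3 = {K3:.3f}  (macrorealist bound: 1, quantum maximum: 1.5)")
```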