Phase sensitivity of gain-unbalanced nonlinear interferometers


Abstract

The phase uncertainty of an unseeded nonlinear interferometer, in which the output of one nonlinear crystal is transmitted to a second crystal that analyzes it, is commonly said to lie below the shot-noise level while depending strongly on detection and internal loss. Unbalancing the gains of the first (source) and second (analyzer) crystal yields a configuration that is tolerant to detection loss. In terms of sensitivity, however, there is no advantage in choosing a stronger analyzer over a stronger source, so the comparison to a shot-noise level is not straightforward. Internal loss breaks this symmetry: it then becomes crucial whether the source or the analyzer dominates. In light of these results, claiming a Heisenberg scaling of the sensitivity is more subtle than in a balanced setup.
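
The lossless symmetry claim can be illustrated with a minimal numerical sketch, assuming the standard error-propagation estimate Δφ = ΔN / |∂⟨N⟩/∂φ| for an unseeded SU(1,1) interferometer with squeeze parameters r1 (source) and r2 (analyzer) and vacuum input; the parameter values and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mean_photons_per_mode(phi, r1, r2):
    """Mean photon number per output mode for vacuum input:
    <N_a> = |e^{i phi} sinh(r1) cosh(r2) - cosh(r1) sinh(r2)|^2,
    which vanishes at phi = 0 when r1 = r2 (dark fringe)."""
    return ((np.sinh(r1) * np.cosh(r2)) ** 2
            + (np.cosh(r1) * np.sinh(r2)) ** 2
            - 0.5 * np.sinh(2 * r1) * np.sinh(2 * r2) * np.cos(phi))

def phase_uncertainty(phi, r1, r2):
    """Error-propagation estimate Delta phi = Delta N / |d<N>/d phi| for the
    total photon number N = N_a + N_b.  Photons are created in pairs, so
    N_a = N_b and Var(N) = 4 n (n + 1) with n = <N_a> (thermal per-mode
    statistics of a two-mode squeezed vacuum)."""
    n = mean_photons_per_mode(phi, r1, r2)
    dN_dphi = np.sinh(2 * r1) * np.sinh(2 * r2) * np.sin(phi)  # d<N>/d phi
    return 2.0 * np.sqrt(n * (n + 1.0)) / np.abs(dN_dphi)

phi = 0.1  # hypothetical working point near the dark fringe
print(phase_uncertainty(phi, r1=0.5, r2=2.0))  # strong analyzer
print(phase_uncertainty(phi, r1=2.0, r2=0.5))  # strong source: same value
```

Both ⟨N⟩ and ∂⟨N⟩/∂φ are symmetric under exchanging r1 and r2, so in this lossless sketch a strong analyzer and a strong source give identical Δφ; only once internal loss is included does it matter which crystal dominates, as stated above.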
