Computing the size and credibility of Bayesian credible regions for certifying the reliability of any point estimator of an unknown parameter (such as a quantum state, channel, phase, \emph{etc.}) relies on rejection sampling from the entire parameter space, which is practically infeasible for large datasets. We reformulate the Bayesian credible-region theory to show that both properties can be obtained solely from the average of the log-likelihood over the region itself, which is computable with direct region sampling. Neither rejection sampling nor any geometrical knowledge about the whole parameter space is necessary, so that general error certification now becomes feasible. We take this region-average theory further by generalizing size to the average $l_p$-norm distance $(p>0)$ between a random region point and the estimator, and present analytical formulas for $p=2$ that estimate the distance-induced size and credibility for any physical system and large datasets, implying that asymptotic Bayesian error certification is possible without any Monte~Carlo computation. All results are discussed in the context of quantum-state tomography.
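As a rough illustration of direct region sampling (a sketch only, not the paper's actual estimators), the Python snippet below approximates the region-average log-likelihood by drawing points uniformly from an $l_2$ ball of chosen radius around the point estimator; the ball-shaped region, the Gaussian toy log-likelihood, and all function names are hypothetical stand-ins.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def sample_ball(center, radius, n):
    # Uniform samples from an l2 ball: uniform directions on the
    # sphere, radii scaled as U^(1/d) for uniform volume density.
    d = center.size
    dirs = rng.normal(size=(n, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    radii = radius * rng.random(n) ** (1.0 / d)
    return center + radii[:, None] * dirs

def region_average_loglik(log_likelihood, center, radius, n=10000):
    # Monte Carlo estimate of the log-likelihood averaged over the
    # region, using direct region sampling (no rejection step).
    pts = sample_ball(center, radius, n)
    return np.mean([log_likelihood(x) for x in pts])

# Toy usage: Gaussian log-likelihood around a hypothetical estimator.
loglik = lambda x: -0.5 * np.sum((x - 0.1) ** 2)
print(region_average_loglik(loglik, np.zeros(3), radius=0.05))
\end{verbatim}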
In standard Bayesian credible-region theory, constructing an error region around the unique estimator of an unknown state in general quantum-state tomography, and calculating its size and credibility, relies on heavy Monte~Carlo sampling of the state space followed by sample rejection. This conventional method typically gives negligible yield for the very small error regions that originate from large datasets. We propose an operational reformulated theory that computes both size and credibility from region-average quantities, which in principle convey information about the behavior of these two properties as the credible region changes. We next suggest accelerated hit-and-run Monte~Carlo sampling, customized to the construction of Bayesian error regions, to efficiently compute region-average quantities, and provide its complexity estimates for quantum states. Finally, by understanding size as the region-average distance between two states in the region (measured, for instance, with the Hilbert-Schmidt, trace-class, or Bures distance), we derive approximation formulas to analytically estimate both the distance-induced size and credibility under the pseudo-Bloch parametrization without resorting to any Monte~Carlo computation.
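A minimal sketch of the basic (non-accelerated) hit-and-run step, assuming only a membership oracle for a bounded convex region, is shown below; the bisection bracketing, tolerances, and function names are illustrative choices rather than the customized sampler proposed in the paper.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

def chord_end(inside, x, d, tol=1e-8):
    # Largest t >= 0 with x + t*d still inside the (bounded) region,
    # found by doubling followed by bisection on the oracle.
    t = 1.0
    while inside(x + t * d):
        t *= 2.0
    lo, hi = 0.0, t
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if inside(x + mid * d):
            lo = mid
        else:
            hi = mid
    return lo

def hit_and_run(inside, x0, n_steps=1000):
    # Each step: random direction, bracket the chord through the
    # current point, jump to a uniform point on that chord.
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)
        t = rng.uniform(-chord_end(inside, x, -d),
                        chord_end(inside, x, d))
        x = x + t * d
        samples.append(x.copy())
    return np.array(samples)

# Toy usage: sample the unit ball through its membership oracle.
pts = hit_and_run(lambda y: np.linalg.norm(y) < 1.0, np.zeros(3))
\end{verbatim}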
Encoding a qubit in logical quantum states with wavefunctions characterized by disjoint support and robust energies can offer simultaneous protection against relaxation and pure dephasing. Using a circuit-quantum-electrodynamics architecture, we experimentally realize a superconducting $0-\pi$ qubit, which hosts protected states suitable for quantum-information processing. Multi-tone spectroscopy measurements reveal the energy level structure of the system, which can be precisely described by a simple two-mode Hamiltonian. We find that the parity symmetry of the qubit results in charge-insensitive levels connecting the protected states, allowing for logical operations. The measured relaxation (1.6 ms) and dephasing times (25 $\mu$s) demonstrate that our implementation of the $0-\pi$ circuit not only broadens the family of superconducting qubits, but also represents a promising candidate for the building block of a fault-tolerant quantum processor.
Rather than point estimators, states of a quantum system that represent one's best guess for the given data, we consider optimal regions of estimators. As the natural counterpart of the popular maximum-likelihood point estimator, we introduce the maximum-likelihood region---the region of largest likelihood among all regions of the same size. Here, the size of a region is its prior probability. Another concept is the smallest credible region---the smallest region with pre-chosen posterior probability. For both optimization problems, the optimal region has constant likelihood on its boundary. We discuss criteria for assigning prior probabilities to regions, and illustrate the concepts and methods with several examples.
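Assuming one can sample from the prior and evaluate the likelihood, the constant-likelihood-boundary property suggests a simple numerical construction of the smallest credible region: keep the highest-likelihood prior samples until their share of the (sample-estimated) posterior mass reaches the chosen credibility. The Python sketch below is only an illustration of this idea; all names are hypothetical.
\begin{verbatim}
import numpy as np

def smallest_credible_region(prior_samples, likelihood, credibility=0.95):
    # prior_samples: array of points drawn from the prior.
    # Returns the region's samples, the bounding likelihood value,
    # and the region's size (its estimated prior probability).
    prior_samples = np.asarray(prior_samples)
    L = np.array([likelihood(x) for x in prior_samples])
    order = np.argsort(L)[::-1]              # highest likelihood first
    post_mass = np.cumsum(L[order]) / L.sum()
    k = min(int(np.searchsorted(post_mass, credibility)) + 1, len(L))
    return prior_samples[order[:k]], L[order[k - 1]], k / len(L)
\end{verbatim}
The returned likelihood value approximates the constant likelihood on the optimal region's boundary, and the fraction of retained prior samples approximates the region's size in the sense above.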
An important problem in quantum information processing is the certification of the dimension of quantum systems without making assumptions about the devices used to prepare and measure them, that is, in a device-independent manner. A crucial question is whether such certification is experimentally feasible for high-dimensional quantum systems. Here we experimentally witness in a device-independent manner the generation of six-dimensional quantum systems encoded in the orbital angular momentum of single photons and show that the same method can be scaled up to at least dimension 13.
We investigate the frequentist coverage properties of Bayesian credible sets in a general, adaptive, nonparametric framework. It is well known that the construction of adaptive and honest confidence sets is not possible in general. To overcome this problem, we introduce an extra assumption on the functional parameters, the so-called general polished tail condition. We then show that, under standard assumptions, both the hierarchical and empirical Bayes methods result in honest confidence sets for sieve-type priors in general settings, and we characterize their size. We apply the derived abstract results to various examples, including the nonparametric regression model, density estimation using exponential families of priors, density estimation using histogram priors, and the nonparametric classification model, for which we show that the resulting sets are near-minimax adaptive in size with respect to the considered semi-metrics.
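One common formulation of the polished tail condition requires $\sum_{i \ge N} \theta_i^2 \le L \sum_{i=N}^{\rho N} \theta_i^2$ for all $N \ge N_0$ and fixed constants $\rho$ and $L$. The Python sketch below numerically checks this inequality on a finite truncation of the coefficient sequence; it is an illustrative stand-in, not part of the paper's analysis.
\begin{verbatim}
import numpy as np

def satisfies_polished_tail(theta, rho=2, L=2.0, N0=2):
    # Check: full tail mass beyond N must be within a factor L of the
    # mass in the block [N, rho*N], for every N >= N0. `theta` is a
    # finite truncation, so the check is only meaningful for N well
    # below len(theta).
    sq = np.asarray(theta, dtype=float) ** 2
    tail = np.cumsum(sq[::-1])[::-1]         # tail[i] = sum_{j>=i} sq[j]
    n = len(sq)
    for N in range(N0, n // rho):
        end = rho * N + 1                    # block covers indices N..rho*N
        block = tail[N] - (tail[end] if end < n else 0.0)
        if tail[N] > L * block + 1e-15:
            return False
    return True

# Example: polynomially decaying coefficients pass the check,
# while sequences with long runs of exact zeros typically fail.
print(satisfies_polished_tail(1.0 / np.arange(1, 200) ** 2))
\end{verbatim}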