
Quantum Anomaly Detection with Density Estimation and Multivariate Gaussian Distribution

Added by Shu-Qian Shen
Publication date: 2019
Field: Physics
Language: English





We study quantum anomaly detection with density estimation and with the multivariate Gaussian distribution. Both algorithms are constructed in the standard gate-based model of quantum computing. Compared with the corresponding classical algorithms, the resource complexities of our quantum algorithms are logarithmic in the dimensionality of the quantum states and in the number of training quantum states. We also present a quantum procedure for efficiently estimating the determinant of any Hermitian operator $\mathcal{A}\in\mathcal{R}^{N\times N}$ with time complexity $O(\mathrm{poly}\log N)$, which forms an important subroutine in our quantum anomaly detection with the multivariate Gaussian distribution. Finally, our results also include a modified quantum kernel principal component analysis (PCA) and a quantum one-class support vector machine (SVM) for detecting classical data.
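The classical baseline being sped up here is density-estimation anomaly detection with a fitted multivariate Gaussian: estimate the mean and covariance from training data, then flag test points whose density falls below a threshold. A minimal NumPy sketch (the threshold `log_eps` and the Cholesky-based log-determinant are illustrative choices, not taken from the paper):

```python
import numpy as np

def fit_gaussian(X):
    """Fit mean and covariance of training data X (n_samples x d)."""
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    return mu, Sigma

def log_density(x, mu, Sigma):
    """Log of the multivariate Gaussian density at x.
    The log-determinant is obtained from a Cholesky factorisation;
    this is the classical analogue of the determinant-estimation
    subroutine highlighted in the abstract."""
    d = len(mu)
    L = np.linalg.cholesky(Sigma)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    z = np.linalg.solve(L, x - mu)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + z @ z)

def is_anomaly(x, mu, Sigma, log_eps):
    """Flag x as anomalous when its log-density falls below log_eps."""
    return log_density(x, mu, Sigma) < log_eps
```

Classically the covariance fit alone costs time polynomial in the dimension and the number of samples, which is where the claimed logarithmic quantum resource scaling comes in.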




53 - Alessandro Roggero 2020
The spectral density operator $\hat{\rho}(\omega)=\delta(\omega-\hat{H})$ plays a central role in linear response theory, as its expectation value, the dynamical response function, can be used to compute scattering cross-sections. In this work, we describe a near-optimal quantum algorithm providing an approximation to the spectral density with energy resolution $\Delta$ and error $\epsilon$ using $\mathcal{O}\left(\sqrt{\log(1/\epsilon)\left(\log(1/\Delta)+\log(1/\epsilon)\right)}/\Delta\right)$ operations. This is achieved without using expensive approximations to the time-evolution operator, exploiting instead qubitization to implement an approximate Gaussian Integral Transform (GIT) of the spectral density. We also describe appropriate error metrics to assess the quality of spectral function approximations more generally.
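As a classical illustration of the Gaussian Integral Transform idea: each delta spike in the spectral density is replaced by a normalised Gaussian of width $\Delta$, turning the singular density into a smooth function at resolution $\Delta$. A sketch, with exact diagonalisation of a small Hermitian matrix standing in for the quantum algorithm (which never needs the full spectrum):

```python
import numpy as np

def gaussian_spectral_density(energies, omega, delta):
    """Approximate rho(omega) = sum_k delta(omega - E_k) by replacing
    each delta spike with a normalised Gaussian of width delta, i.e. a
    classical Gaussian Integral Transform of the spectral density."""
    omega = np.asarray(omega, dtype=float)[..., None]
    g = np.exp(-0.5 * ((omega - np.asarray(energies)) / delta) ** 2)
    return g.sum(axis=-1) / (np.sqrt(2.0 * np.pi) * delta)
```

The smoothed density still integrates to the number of states, so integrated observables are preserved up to the resolution $\Delta$.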
This paper revisits the problem of computing empirical cumulative distribution functions (ECDF) efficiently on large, multivariate datasets. Computing an ECDF at one evaluation point requires $\mathcal{O}(N)$ operations on a dataset composed of $N$ data points. Therefore, a direct evaluation of ECDFs at $N$ evaluation points requires a quadratic $\mathcal{O}(N^2)$ operations, which is prohibitive for large-scale problems. Two fast and exact methods are proposed and compared. The first one is based on fast summation in lexicographical order, with a $\mathcal{O}(N\log N)$ complexity and requires the evaluation points to lie on a regular grid. The second one is based on the divide-and-conquer principle, with a $\mathcal{O}(N\log(N)^{(d-1)\vee 1})$ complexity and requires the evaluation points to coincide with the input points. The two fast algorithms are described and detailed in the general $d$-dimensional case, and numerical experiments validate their speed and accuracy. Secondly, the paper establishes a direct connection between cumulative distribution functions and kernel density estimation (KDE) for a large class of kernels. This connection paves the way for fast exact algorithms for multivariate kernel density estimation and kernel regression. Numerical tests with the Laplacian kernel validate the speed and accuracy of the proposed algorithms. A broad range of large-scale multivariate density estimation, cumulative distribution estimation, survival function estimation and regression problems can benefit from the proposed numerical methods.
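For intuition on the complexity gap, here is the one-dimensional case: direct ECDF evaluation costs $\mathcal{O}(NM)$ comparisons for $M$ evaluation points, while a sort plus binary search costs $\mathcal{O}((N+M)\log N)$. This 1-D sketch only illustrates the gap; the paper's fast methods handle the general $d$-dimensional case:

```python
import numpy as np

def ecdf_direct(points, queries):
    """Direct evaluation: O(N*M) comparisons."""
    return np.array([(points <= q).mean() for q in queries])

def ecdf_sorted(points, queries):
    """Fast 1-D evaluation: sort once, then binary-search each query."""
    s = np.sort(points)
    return np.searchsorted(s, queries, side="right") / len(points)
```

Both return, for each query $q$, the fraction of data points $\le q$; only the cost differs.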
We describe a simple multivariate technique of likelihood ratios for improved discrimination of signal and background in multi-dimensional quantum target detection. The technique combines two independent variables, time difference and summed energy, of a photon pair from the spontaneous parametric down-conversion source into an optimal discriminant. The discriminant performance was studied in experimental data and in Monte-Carlo modelling with clear improvement shown compared to previous techniques. As novel detectors become available, we expect this type of multivariate analysis to become increasingly important in multi-dimensional quantum optics.
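The likelihood-ratio combination described above can be sketched as follows. Because the two variables are independent, the joint likelihood ratio factorises into a sum of per-variable log-ratios; the Gaussian per-variable models and their parameters below are illustrative stand-ins, since the abstract does not give the actual distributions of the time-difference and summed-energy variables:

```python
import numpy as np

def gauss_pdf(mu, sigma):
    """1-D Gaussian pdf, used here as a stand-in per-variable model."""
    return lambda v: (np.exp(-0.5 * ((v - mu) / sigma) ** 2)
                      / (np.sqrt(2.0 * np.pi) * sigma))

def log_likelihood_ratio(x, signal_pdfs, background_pdfs):
    """Combine independent variables into one discriminant: the sum of
    per-variable log signal/background likelihood ratios."""
    return sum(np.log(ps(v) / pb(v))
               for v, ps, pb in zip(x, signal_pdfs, background_pdfs))
```

A cut on this single number is then the optimal discriminant in the Neyman-Pearson sense: large positive values are signal-like, large negative values background-like.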
209 - Yang Gao, Hwang Lee 2014
We investigate the quantum Cramér-Rao bounds on joint multiple-parameter estimation with a Gaussian state as the probe. We derive the explicit right logarithmic derivative and symmetric logarithmic derivative operators in this situation. We compute the corresponding quantum Fisher information matrices, and find that they can be fully expressed in terms of the mean displacement and covariance matrix of the Gaussian state. Finally, we give some examples to show the utility of our analytical results.
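For comparison, the classical Fisher information matrix of a multivariate Gaussian with parameter-dependent mean $\mu(\theta)$ and covariance $\Sigma(\theta)$ has exactly this structure, depending only on the mean and the covariance (a standard result, stated here as the classical analogue rather than taken from the paper):

```latex
F_{ij} \;=\; \frac{\partial \mu^{\top}}{\partial \theta_i}\, \Sigma^{-1}\, \frac{\partial \mu}{\partial \theta_j}
\;+\; \frac{1}{2}\,\mathrm{tr}\!\left( \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_i}\, \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_j} \right)
```

The first term captures information carried by the displacement, the second the information carried by the covariance, mirroring the decomposition found for the quantum Fisher information matrices above.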
116 - Edmondo Trentin 2020
Albeit worryingly underrated in the recent literature on machine learning in general (and on deep learning in particular), multivariate density estimation is a fundamental task in many applications, at least implicitly, and still an open issue. With a few exceptions, deep neural networks (DNNs) have seldom been applied to density estimation, mostly due to the unsupervised nature of the estimation task, and (especially) due to the need for constrained training algorithms that end up realizing proper probabilistic models satisfying Kolmogorov's axioms. Moreover, in spite of the well-known improvement in modeling capabilities yielded by mixture models over plain single-density statistical estimators, no proper mixtures of multivariate DNN-based component densities have been investigated so far. The paper fills this gap by extending our previous work on Neural Mixture Densities (NMMs) to multivariate DNN mixtures. A maximum-likelihood (ML) algorithm for estimating Deep NMMs (DNMMs) is presented, which numerically satisfies a combination of hard and soft constraints aimed at ensuring satisfaction of Kolmogorov's axioms. The class of probability density functions that can be modeled to any degree of precision via DNMMs is formally defined. A procedure for the automatic selection of the DNMM architecture, as well as of the hyperparameters for its ML training algorithm, is presented (exploiting the probabilistic nature of the DNMM). Experimental results on univariate and multivariate data are reported, corroborating the effectiveness of the approach and its superiority to the most popular statistical estimation techniques.
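The constraint the training algorithm must enforce, that the learned mixture is a proper density, can be illustrated with a tiny mixture whose components are plain Gaussians instead of DNNs (an illustrative simplification of the paper's DNMM): parameterising the mixing weights through a softmax keeps them nonnegative and summing to one, so Kolmogorov's axioms hold by construction for any unconstrained logits.

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax: nonnegative weights summing to one."""
    e = np.exp(a - np.max(a))
    return e / e.sum()

def mixture_pdf(x, logits, means, sigmas):
    """Mixture density with 1-D Gaussian components standing in for the
    DNN-based components of a DNMM. The softmax parameterisation of the
    weights guarantees a proper probability density."""
    w = softmax(np.asarray(logits, dtype=float))
    means = np.asarray(means, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    comps = (np.exp(-0.5 * ((x - means) / sigmas) ** 2)
             / (np.sqrt(2.0 * np.pi) * sigmas))
    return float(w @ comps)
```

Because each component is itself normalised and the weights form a convex combination, the mixture is nonnegative and integrates to one, which is precisely what the paper's hard and soft constraints enforce for DNN components.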
