
Kernel-Matrix Determinant Estimates from stopped Cholesky Decomposition

Posted by Simon Bartels
Publication date: 2021
Research field: Mathematical Statistics
Paper language: English





Algorithms involving Gaussian processes or determinantal point processes typically require computing the determinant of a kernel matrix. Frequently, the latter is computed from the Cholesky decomposition, an algorithm of cubic complexity in the size of the matrix. We show that, under mild assumptions, it is possible to estimate the determinant from only a sub-matrix, with a probabilistic guarantee on the relative error. We present an augmentation of the Cholesky decomposition that stops under certain conditions before processing the whole matrix. Experiments demonstrate that this can save a considerable amount of time while incurring an overhead of less than $5\%$ when not stopping early. More generally, we present a probabilistic stopping strategy for the approximation of a sum of known length whose addends are revealed sequentially. We do not assume independence between addends, only that they are bounded from below and decrease in conditional expectation.
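As a rough illustration of the idea (a heuristic sketch only, not the authors' algorithm or its probabilistic guarantee), the code below runs an unblocked Cholesky decomposition, accumulates the log-determinant addends $\log d_j$ given by the pivots, and stops once crude bounds on the remaining contribution, built from the last addend and an assumed lower bound `log_lower`, are within a tolerance. The function name, the stopping rule, and the bound construction are illustrative assumptions.

```python
# Heuristic sketch only -- not the paper's algorithm or guarantee.
# Estimate log|det(K)| from a partially processed Cholesky decomposition,
# assuming the addends log(d_j) are non-increasing in expectation and
# bounded below by a known constant `log_lower` (e.g. from a jitter term).
import numpy as np

def stopped_logdet(K, log_lower, tol=1e-2):
    K = np.array(K, dtype=float, copy=True)
    n = K.shape[0]
    partial = 0.0                          # sum of processed addends log(d_j)
    for j in range(n):
        d = K[j, j]                        # pivot = j-th Schur-complement diagonal
        addend = np.log(d)                 # contribution 2*log(L[j, j])
        partial += addend
        # Right-looking Cholesky step: update the trailing sub-matrix.
        K[j+1:, j+1:] -= np.outer(K[j+1:, j], K[j+1:, j]) / d
        remaining = n - j - 1
        upper = partial + remaining * addend      # addends only decrease (heuristic)
        lower = partial + remaining * log_lower   # addends are bounded below
        estimate = 0.5 * (upper + lower)
        if upper - lower <= tol * max(abs(estimate), 1.0):
            return estimate, j + 1         # estimate and number of columns processed
    return partial, n
```

For a kernel matrix with an additive noise term $\sigma^2$ on the diagonal, `log_lower = log(sigma**2)` is a natural choice, since every pivot of the decomposition is at least $\sigma^2$.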




Read also

Spatial statistics often involves Cholesky decomposition of covariance matrices. To ensure scalability to high dimensions, several recent approximations have assumed a sparse Cholesky factor of the precision matrix. We propose a hierarchical Vecchia approximation, whose conditional-independence assumptions imply sparsity in the Cholesky factors of both the precision and the covariance matrix. This remarkable property is crucial for applications to high-dimensional spatio-temporal filtering. We present a fast and simple algorithm to compute our hierarchical Vecchia approximation, and we provide extensions to non-linear data assimilation with non-Gaussian data based on the Laplace approximation. In several numerical comparisons, our methods strongly outperformed alternative approaches.
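For orientation, the sketch below shows a basic (non-hierarchical) Vecchia approximation of a zero-mean Gaussian log-likelihood, in which each observation conditions only on its m nearest previously ordered neighbours. The function names, the neighbour rule, and the zero-mean assumption are illustrative choices, not the hierarchical construction proposed in the paper.

```python
# Illustrative sketch of a basic Vecchia approximation: factor the joint
# Gaussian density into conditionals with small conditioning sets.
import numpy as np

def vecchia_loglik(y, locs, cov_fn, m=10):
    n = len(y)
    ll = 0.0
    for i in range(n):
        # Neighbours: up to m closest points among those already processed.
        if i == 0:
            nbrs = np.array([], dtype=int)
        else:
            d = np.linalg.norm(locs[:i] - locs[i], axis=1)
            nbrs = np.argsort(d)[:m]
        K_ii = cov_fn(locs[i:i+1], locs[i:i+1])[0, 0]
        if nbrs.size == 0:
            mean, var = 0.0, K_ii
        else:
            K_nn = cov_fn(locs[nbrs], locs[nbrs])
            k_in = cov_fn(locs[i:i+1], locs[nbrs])[0]
            w = np.linalg.solve(K_nn, k_in)
            mean = w @ y[nbrs]             # conditional mean given neighbours
            var = K_ii - k_in @ w          # conditional variance
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mean) ** 2 / var)
    return ll

# Example covariance (illustrative squared-exponential kernel, unit lengthscale):
# cov_fn = lambda A, B: np.exp(-0.5 * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
```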
We introduce two equations expressing the inverse determinant of a full rank matrix $\mathbf{A} \in \mathbb{R}^{n \times n}$ in terms of expectations over matrix-vector products. The first relationship is $|\mathrm{det}(\mathbf{A})|^{-1} = \mathbb{E}_{\mathbf{s} \sim \mathcal{S}^{n-1}}\bigl[\,\Vert \mathbf{A}\mathbf{s}\Vert^{-n}\,\bigr]$, where expectations are over vectors drawn uniformly on the surface of an $n$-dimensional radius one hypersphere. The second relationship is $|\mathrm{det}(\mathbf{A})|^{-1} = \mathbb{E}_{\mathbf{x} \sim q}[\,p(\mathbf{A}\mathbf{x})\,/\,q(\mathbf{x})\,]$, where $p$ and $q$ are smooth distributions, and $q$ has full support.
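The first identity translates directly into a simple Monte Carlo estimator. The sketch below draws directions uniformly on the unit sphere by normalizing standard Gaussian vectors and averages $\Vert \mathbf{A}\mathbf{s}\Vert^{-n}$; the function name and sample count are chosen here for illustration.

```python
# Monte Carlo estimate of |det(A)|^{-1} via the hypersphere identity:
# |det(A)|^{-1} = E[ ||A s||^{-n} ] for s uniform on the unit (n-1)-sphere.
import numpy as np

def inv_abs_det_estimate(A, num_samples=100_000, seed=None):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Uniform directions on the sphere: normalize standard Gaussian vectors.
    x = rng.standard_normal((num_samples, n))
    s = x / np.linalg.norm(x, axis=1, keepdims=True)
    return np.mean(np.linalg.norm(s @ A.T, axis=1) ** (-n))

# Quick check: A = np.diag([2.0, 3.0]) gives an estimate close to 1/6.
```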
P. Deift, A. Its, I. Krasovsky, 2006
The authors use Riemann-Hilbert methods to compute the constant that arises in the asymptotic behavior of the Airy-kernel determinant of random matrix theory.
Riemann manifold Hamiltonian Monte Carlo (RMHMC) has the potential to produce high-quality Markov chain Monte Carlo output even for very challenging target distributions. To this end, a symmetric positive definite scaling matrix for RMHMC is proposed, which derives, via a modified Cholesky factorization, from the potentially indefinite negative Hessian of the target log-density. The methodology is able to exploit the sparsity of the Hessian, stemming from conditional-independence modeling assumptions, and thus admits fast implementation of RMHMC even for high-dimensional target distributions. Moreover, the methodology can exploit log-concave conditional target densities, often encountered in Bayesian hierarchical models, for faster sampling and more straightforward tuning. The proposed methodology is compared to alternatives for some challenging targets and is illustrated by applying a state space model to real data.
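As a rough illustration of the general step of turning a potentially indefinite negative Hessian into a symmetric positive definite scaling matrix, the sketch below uses a simple eigenvalue flip-and-floor rather than the sparsity-preserving modified Cholesky factorization the paper proposes; the function name and floor value are illustrative assumptions.

```python
# Illustrative only: build an SPD scaling matrix from a possibly indefinite
# negative Hessian by flipping/flooring its eigenvalues.  The paper instead
# uses a modified Cholesky factorization that preserves Hessian sparsity.
import numpy as np

def spd_metric_from_hessian(neg_hessian, floor=1e-6):
    S = 0.5 * (neg_hessian + neg_hessian.T)       # symmetrize against round-off
    eigvals, eigvecs = np.linalg.eigh(S)
    eigvals = np.maximum(np.abs(eigvals), floor)  # make every mode positive
    return (eigvecs * eigvals) @ eigvecs.T        # V diag(lambda) V^T
```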
P. Deift, I. Krasovsky, 2010
We obtain large gap asymptotics for a Fredholm determinant with a confluent hypergeometric kernel. We also obtain asymptotics for determinants with two types of Bessel kernels which appeared in random matrix theory.