
SNR-enhanced diffusion MRI with structure-preserving low-rank denoising in reproducing kernel Hilbert spaces

Publication date: 2020. Language: English.





Purpose: To introduce, develop, and evaluate a novel denoising technique for diffusion MRI (dMRI) that leverages non-linear redundancy in the data to boost the SNR while preserving signal information.

Methods: We exploit non-linear redundancy of the dMRI data by means of Kernel Principal Component Analysis (KPCA), a non-linear generalization of PCA to reproducing kernel Hilbert spaces. By mapping the signal to a high-dimensional space, better redundancy is achieved despite non-linearities in the data, thereby enabling better denoising than linear PCA. We implement KPCA with a Gaussian kernel, with parameters automatically selected from knowledge of the noise statistics, and validate it on realistic Monte Carlo simulations as well as on in-vivo human brain submillimeter-resolution dMRI data. We also demonstrate KPCA denoising on multi-coil dMRI data.

Results: SNR improvements of up to 2.7x were obtained in real in-vivo datasets denoised with KPCA, compared to SNR gains of up to 1.8x with state-of-the-art PCA denoising, e.g., Marchenko-Pastur PCA (MPPCA). Against gold-standard references created from averaged data, KPCA achieved lower normalized root mean squared error (NRMSE) than MPPCA. Statistical analysis of the residuals shows that only noise is removed. Improvements in the estimation of diffusion model parameters such as fractional anisotropy, mean diffusivity, and fiber orientation distribution functions (fODFs) were also demonstrated.

Conclusion: Non-linear redundancy of the dMRI signal can be exploited with KPCA, which yields superior noise reduction and SNR improvement over state-of-the-art PCA methods, without loss of signal information.
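The core KPCA-denoising idea can be sketched in a few lines of numpy: build a Gaussian kernel matrix over the noisy samples, keep only the leading kernel principal components, and map the truncated feature-space representation back with a pre-image approximation. This is a minimal illustration, not the paper's method: the kernel width, the number of retained components, and the one-step pre-image approximation below are hypothetical choices, whereas the paper selects parameters automatically from the noise statistics.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Pairwise Gaussian (RBF) kernel between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kpca_denoise(X, sigma, n_components):
    """Denoise the rows of X with Gaussian-kernel PCA.

    Projects each sample onto the leading kernel principal components,
    then approximates its pre-image by a kernel-weighted average of the
    input samples (a crude, single-step approximation).
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    J = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    Kc = J @ K @ J                                 # double-centered kernel
    w, V = np.linalg.eigh(Kc)                      # ascending eigenvalues
    top = np.argsort(w)[::-1][:n_components]
    A = V[:, top] / np.sqrt(np.maximum(w[top], 1e-12))  # dual coefficients
    gamma = (Kc @ A) @ A.T        # expansion of each truncated feature point
    weights = gamma * K           # single-step pre-image weights
    den = np.maximum(weights.sum(axis=1, keepdims=True), 1e-12)
    return (weights @ X) / den

# Toy usage: noisy samples on a circle, a stand-in for the redundant
# dMRI signal samples the paper denoises.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 80)
clean = np.column_stack([np.cos(t), np.sin(t)])
X = clean + 0.05 * rng.normal(size=clean.shape)
X_denoised = kpca_denoise(X, sigma=1.0, n_components=4)
```

Because the circle is a non-linear one-dimensional structure, linear PCA cannot compress it to one component without signal loss, while a few kernel components capture it; this is the non-linear redundancy the abstract refers to.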


The geometry of spaces with indefinite inner product, also known as Krein spaces, is a basic tool for developing Operator Theory therein. In the present paper we establish a link between this geometry and the algebraic theory of *-semigroups. It goes via positive definite functions and the reproducing kernel Hilbert spaces related to them. Our concern is to describe properties of elements of the semigroup which determine shift operators that serve as Pontryagin fundamental symmetries.
Let $G$ be a locally compact abelian group with a Haar measure, and $Y$ be a measure space. Suppose that $H$ is a reproducing kernel Hilbert space of functions on $G \times Y$, such that $H$ is naturally embedded into $L^2(G \times Y)$ and is invariant under the translations associated with the elements of $G$. Under some additional technical assumptions, we study the W*-algebra $\mathcal{V}$ of translation-invariant bounded linear operators acting on $H$. First, we decompose $\mathcal{V}$ into the direct integral of the W*-algebras of bounded operators acting on the reproducing kernel Hilbert spaces $\widehat{H}_\xi$, $\xi \in \widehat{G}$, generated by the Fourier transform of the reproducing kernel. Second, we give a constructive criterion for the commutativity of $\mathcal{V}$. Third, in the commutative case, we construct a unitary operator that simultaneously diagonalizes all operators belonging to $\mathcal{V}$, i.e., converts them into some multiplication operators. Our scheme generalizes many examples previously studied by Nikolai Vasilevski and other authors.
Sneh Lata, Vern I. Paulsen (2010)
We prove two new equivalences of the Feichtinger conjecture that involve reproducing kernel Hilbert spaces. We prove that if for every Hilbert space, contractively contained in the Hardy space, each Bessel sequence of normalized kernel functions can be partitioned into finitely many Riesz basic sequences, then a general bounded Bessel sequence in an arbitrary Hilbert space can be partitioned into finitely many Riesz basic sequences. In addition, we examine some of these spaces and prove that for these spaces bounded Bessel sequences of normalized kernel functions are finite unions of Riesz basic sequences.
The Gaussian kernel plays a central role in machine learning, uncertainty quantification and scattered data approximation, but has received relatively little attention from a numerical analysis standpoint. The basic problem of finding an algorithm for efficient numerical integration of functions reproduced by Gaussian kernels has not been fully solved. In this article we construct two classes of algorithms that use $N$ evaluations to integrate $d$-variate functions reproduced by Gaussian kernels and prove the exponential or super-algebraic decay of their worst-case errors. In contrast to earlier work, no constraints are placed on the length-scale parameter of the Gaussian kernel. The first class of algorithms is obtained via an appropriate scaling of the classical Gauss-Hermite rules. For these algorithms we derive lower and upper bounds on the worst-case error of the forms $\exp(-c_1 N^{1/d}) N^{1/(4d)}$ and $\exp(-c_2 N^{1/d}) N^{-1/(4d)}$, respectively, for positive constants $c_1 > c_2$. The second class of algorithms we construct is more flexible and uses worst-case optimal weights for points that may be taken as a nested sequence. For these algorithms we derive upper bounds of the form $\exp(-c_3 N^{1/(2d)})$ for a positive constant $c_3$.
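The scaled Gauss-Hermite idea is easy to demonstrate in one dimension: numpy's `hermgauss` rule targets the weight $e^{-u^2}$, and the substitution $x = \sqrt{2}\,u$ converts it into a rule for integrals against the standard normal density. The test function below, a single Gaussian-kernel translate $k(x, a) = \exp(-(x-a)^2/(2\ell^2))$, has a closed-form integral to compare against; the parameter values $a$ and $\ell$ are illustrative, and this sketch does not reproduce the paper's worst-case analysis.

```python
import numpy as np

def gauss_weighted_integral(f, n):
    """Approximate the integral of f against N(0, 1) with an n-point
    Gauss-Hermite rule, rescaled from the e^{-u^2} weight via x = sqrt(2) u."""
    u, w = np.polynomial.hermite.hermgauss(n)
    return (w / np.sqrt(np.pi)) @ f(np.sqrt(2.0) * u)

# Gaussian-kernel translate k(x, a) = exp(-(x - a)^2 / (2 l^2)); its
# integral against N(0, 1) is l / sqrt(1 + l^2) * exp(-a^2 / (2 (1 + l^2))).
a, l = 0.7, 1.3
approx = gauss_weighted_integral(lambda x: np.exp(-(x - a) ** 2 / (2 * l ** 2)), 30)
exact = l / np.sqrt(1 + l ** 2) * np.exp(-a ** 2 / (2 * (1 + l ** 2)))
```

For a smooth Gaussian integrand like this one the error of the 30-point rule is already far below single precision, consistent with the exponential worst-case decay the abstract proves.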
In this paper, we introduce the notion of reproducing kernel Hilbert spaces for graphs and the Gram matrices associated with them. Our aim is to investigate the Gram matrices of reproducing kernel Hilbert spaces. We provide several bounds on the entries of the Gram matrices of reproducing kernel Hilbert spaces and characterize the graphs which attain our bounds.
