Motion Artifact Reduction in Quantitative Susceptibility Mapping using Deep Neural Network


Abstract in English

An approach to reducing motion artifacts in Quantitative Susceptibility Mapping (QSM) using deep learning is proposed. We use an affine motion model with randomly generated motion profiles to simulate motion-corrupted QSM images. Each simulated image is paired with its motion-free reference to train a neural network in a supervised manner. The trained network is tested on unseen simulated motion-corrupted QSM images, as well as on data from healthy volunteers and Parkinson's disease patients. The results show that motion artifacts, such as ringing and ghosting, were successfully suppressed.
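The abstract does not give implementation details, but a common way to realize an affine-motion simulation of this kind is to assign successive k-space segments to randomly transformed copies of the motion-free image. The sketch below illustrates that idea for a 2-D image; the segment-wise k-space combination, the parameter ranges, and the function names (random_affine, simulate_motion_corruption) are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import affine_transform

    def random_affine(max_rot_deg=5.0, max_shift_px=3.0, rng=None):
        # Draw one random motion state: a small 2-D rotation plus translation
        # (an assumed, simplified instance of an affine motion profile).
        rng = np.random.default_rng() if rng is None else rng
        theta = np.deg2rad(rng.uniform(-max_rot_deg, max_rot_deg))
        rot = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
        shift = rng.uniform(-max_shift_px, max_shift_px, size=2)
        return rot, shift

    def simulate_motion_corruption(image, n_segments=8, rng=None):
        # Simulate motion corruption by acquiring each k-space segment
        # while the object is in a different random motion state.
        rng = np.random.default_rng() if rng is None else rng
        ny = image.shape[0]
        kspace = np.zeros(image.shape, dtype=complex)
        bounds = np.linspace(0, ny, n_segments + 1).astype(int)
        for lo, hi in zip(bounds[:-1], bounds[1:]):
            rot, shift = random_affine(rng=rng)
            moved = affine_transform(image, rot, offset=shift,
                                     order=1, mode="nearest")
            k = np.fft.fftshift(np.fft.fft2(moved))
            kspace[lo:hi, :] = k[lo:hi, :]  # rows acquired in this motion state
        corrupted = np.fft.ifft2(np.fft.ifftshift(kspace)).real
        return corrupted  # paired with `image` as (input, target) for supervised training

In this toy setup, each (corrupted, image) pair would serve as one training example for the network; the actual work presumably applies the motion model to the underlying MRI acquisition and QSM reconstruction rather than directly to 2-D susceptibility maps.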
