Quality control (QC) of MR images is essential to ensure that downstream analyses such as segmentation can be performed successfully. Currently, QC is predominantly performed visually and subjectively, at significant time and operator cost. We aim to automate the process using a probabilistic network that estimates segmentation uncertainty through a heteroscedastic noise model, providing a measure of task-specific quality. By augmenting training images with k-space artefacts, we propose a novel CNN architecture to decouple sources of uncertainty related to the task and different k-space artefacts in a self-supervised manner. This enables the prediction of separate uncertainties for different types of data degradation. While the uncertainty predictions reflect the presence and severity of artefacts, the network provides more robust and generalisable segmentation predictions given the quality of the data. We show that models trained with artefact augmentation provide informative measures of uncertainty on both simulated artefacts and problematic real-world images identified by human raters, both qualitatively and quantitatively in the form of error bars on volume measurements. Relating artefact uncertainty to segmentation Dice scores, we observe that our uncertainty predictions provide a better estimate of MRI quality from the point of view of the task (gray matter segmentation) compared to commonly used metrics of quality including signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR), hence providing a real-time quality metric indicative of segmentation quality.
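To illustrate the kind of k-space artefact augmentation described above, the following minimal sketch corrupts a 2-D slice by perturbing randomly chosen phase-encode lines in k-space. The function name `add_kspace_artefact` and the specific artefact model (random phase errors on a handful of lines) are assumptions for illustration only, not the authors' exact augmentation pipeline.

```python
import numpy as np

def add_kspace_artefact(image, n_lines=8, seed=None):
    """Corrupt a 2-D MR slice by perturbing random phase-encode lines in k-space.

    Illustrative artefact model (random phase errors, mimicking motion);
    not the exact augmentation used in the paper.
    """
    rng = np.random.default_rng(seed)
    # Forward FFT into k-space, centred for convenience.
    kspace = np.fft.fftshift(np.fft.fft2(image))
    # Pick a few phase-encode lines and apply random phase errors.
    rows = rng.choice(image.shape[0], size=n_lines, replace=False)
    phases = np.exp(1j * rng.uniform(-np.pi, np.pi, size=n_lines))
    kspace[rows, :] *= phases[:, None]
    # Back to image space; keep the magnitude image.
    corrupted = np.fft.ifft2(np.fft.ifftshift(kspace))
    return np.abs(corrupted)

# Usage example with a random array standing in for an MR slice.
slice_2d = np.random.rand(128, 128)
augmented = add_kspace_artefact(slice_2d, n_lines=8, seed=0)
```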
Quality control (QC) in medical image analysis is time-consuming and laborious, leading to increased interest in automated methods. However, what is deemed suitable quality for algorithmic processing may differ from human-perceived measures of visual quality. In this work, we pose MR image quality assessment from an image reconstruction perspective. We train Bayesian CNNs using a heteroscedastic uncertainty model to recover clean images from noisy data, providing measures of uncertainty over the predictions. This framework enables us to divide data corruption into learnable and non-learnable components and leads us to interpret the predictive uncertainty as an estimation of the achievable recovery of an image. Thus, we argue that quality control for visual assessment cannot be equated to quality control for algorithmic processing. We validate this statement in a multi-task experiment combining artefact recovery with uncertainty prediction and grey matter segmentation. Recognising this distinction between visual and algorithmic quality means that, depending on the downstream task, less data need be excluded on the basis of "visual quality" alone.
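To make the heteroscedastic uncertainty model concrete, here is a minimal sketch of the standard per-pixel Gaussian negative log-likelihood in which a network predicts both a reconstruction and a log-variance map. The function name and the dummy tensors standing in for network outputs are assumptions for illustration; architectural details and hyperparameters are not taken from the paper.

```python
import torch

def heteroscedastic_nll(pred_mean, pred_log_var, target):
    """Per-pixel Gaussian NLL with learned (heteroscedastic) variance.

    pred_mean, pred_log_var, target: tensors of the same shape.
    Pixels assigned high predicted variance contribute less to the
    reconstruction term but pay a penalty through the log-variance term.
    """
    inv_var = torch.exp(-pred_log_var)
    return (0.5 * inv_var * (target - pred_mean) ** 2
            + 0.5 * pred_log_var).mean()

# Usage example with dummy tensors in place of network outputs and a clean target.
mean = torch.randn(1, 1, 64, 64, requires_grad=True)
log_var = torch.zeros(1, 1, 64, 64, requires_grad=True)
target = torch.randn(1, 1, 64, 64)
loss = heteroscedastic_nll(mean, log_var, target)
loss.backward()
```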
J. Richard Shaw (2013)
In this paper we describe the spherical harmonic transit telescope, a novel formalism for the analysis of transit radio telescopes. This all-sky approach bypasses the curved-sky complications of traditional interferometry and so is particularly well suited to the analysis of wide-field radio interferometers. It enables compact and computationally efficient representations of the data and its statistics that allow new ways of approaching important problems like map-making and foreground removal. In particular, we show how it enables the use of the Karhunen-Loève transform as a highly effective foreground filter, suppressing realistic foreground residuals for our fiducial example by at least a factor of twenty below the 21cm signal, even in highly contaminated regions of the sky. This is despite the presence of the angle-frequency mode mixing inherent in real-world instruments with frequency-dependent beams. We show, using Fisher forecasting, that foreground cleaning has little effect on power spectrum constraints compared to hypothetical foreground-free measurements. Beyond providing a natural real-world data analysis framework for 21cm telescopes now under construction and future experiments, this formalism allows accurate power spectrum forecasts to be made that include the interplay of design constraints and realistic experimental systematics with twenty-first century 21cm science.
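The Karhunen-Loève foreground filter amounts to a generalised eigenvalue problem between the signal and foreground-plus-noise covariances, keeping only modes with high signal-to-contaminant ratio. The sketch below shows that core step under toy assumptions; the covariance matrices, threshold, and function name are placeholders, not the paper's fiducial model.

```python
import numpy as np
from scipy.linalg import eigh

def kl_foreground_filter(S, FN, data, threshold=1.0):
    """Karhunen-Loève foreground filter sketch.

    S    : signal covariance (n x n)
    FN   : foreground + noise covariance (n x n), positive definite
    data : measured data vector of length n
    Keeps KL modes whose signal-to-(foreground+noise) eigenvalue
    exceeds `threshold`.
    """
    # Generalised eigenvalue problem  S v = lambda (F + N) v.
    evals, evecs = eigh(S, FN)
    keep = evals > threshold
    # In the KL basis the contaminant covariance is the identity and the
    # signal covariance is diag(evals); project data onto retained modes.
    projected = evecs[:, keep].T @ data
    return evals[keep], projected

# Toy example with random positive-definite covariances.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 20))
B = rng.normal(size=(20, 20))
S = A @ A.T
FN = B @ B.T + 1e-3 * np.eye(20)
vals, proj = kl_foreground_filter(S, FN, rng.normal(size=20))
```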
The study of extragalactic planetary nebulae (EPN) is a rapidly expanding field. The advent of powerful new instrumentation such as the PN spectrograph has led to an avalanche of new EPN discoveries both within and between galaxies. We now have thousands of EPN detections in a heterogeneous selection of nearby galaxies and their local environments, dwarfing the combined galactic detection efforts of the last century. Key scientific motivations driving this rapid growth in EPN research and discovery have been the use of the PNLF as a standard candle, the use of PN as dynamical tracers of their host galaxies and dark matter, and their use as probes of galactic evolution. This is coupled with the basic utility of PN as laboratories of nebular physics and the consequent comparison with theory, where population differences, abundance variations and star formation history within and between stellar systems inform both stellar and galactic evolution. Here we pose some of the burning questions, discuss some of the observational challenges and outline some of the future prospects of this exciting, relatively new research area as we strive to go fainter, image finer, see further and survey faster than ever before, and over a wider wavelength regime.
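For reference, the widely used empirical form of the planetary nebula luminosity function (PNLF), expressed in [O III] λ5007 absolute magnitude M, is

N(M) \propto e^{0.307\,M}\left[1 - e^{3\,(M^{*} - M)}\right],

where the bright-end cutoff M^{*} \approx -4.5 provides the standard candle. The exact cutoff value varies slightly between calibrations, so the number quoted here should be taken as indicative rather than definitive.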