
Monte Carlo tomographic reconstruction in SPECT: impact of bootstrapping and number of generated events

Posted by: Jeanine Pellet
Publication date: 2005
Research field: Physics
Paper language: English
Author: Z. El Bitar

In Single Photon Emission Computed Tomography (SPECT), 3D images are usually reconstructed by performing a set of bidimensional (2D) analytical or iterative reconstructions, but they can also be reconstructed with an iterative algorithm involving a 3D projector. Accurate Monte Carlo (MC) simulations modeling all the physical effects that affect the imaging process can be used to estimate this projector. However, the accuracy of the projector is affected by the stochastic nature of MC simulations. In this paper, we study the accuracy of the reconstructed images with respect to the number of simulated histories used to estimate the MC projector. Furthermore, we study the impact of applying the bootstrapping technique when estimating the projector.
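As a rough illustration of the idea, the sketch below (Python, with invented array names and toy dimensions, not the authors' code) estimates a projector from a list of simulated histories and then forms bootstrap replicates by resampling those histories with replacement; the spread of the replicates reflects the statistical noise in the estimated projector.

```python
# Minimal sketch (not the paper's code): estimating a SPECT projector from
# Monte Carlo histories and forming bootstrap replicates of it.
# Array names, shapes and dimensions are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_voxels, n_bins = 64, 128          # toy dimensions
n_events = 100_000                  # number of simulated histories

# Each simulated history records the emission voxel and the detection bin
# (detection bin = -1 if the photon was not detected).
emit_voxel = rng.integers(0, n_voxels, n_events)
detect_bin = rng.integers(-1, n_bins, n_events)

def estimate_projector(emit, detect):
    """Projector element R[j, i] ~ P(detected in bin j | emitted in voxel i)."""
    R = np.zeros((n_bins, n_voxels))
    counts = np.bincount(emit, minlength=n_voxels).astype(float)
    detected = detect >= 0
    np.add.at(R, (detect[detected], emit[detected]), 1.0)
    return R / np.maximum(counts, 1.0)

R_hat = estimate_projector(emit_voxel, detect_bin)

# Bootstrap: resample the histories with replacement and re-estimate the
# projector; the variability across replicates quantifies the MC noise in R_hat.
n_boot = 20
replicates = []
for _ in range(n_boot):
    idx = rng.integers(0, n_events, n_events)
    replicates.append(estimate_projector(emit_voxel[idx], detect_bin[idx]))
R_boot_mean = np.mean(replicates, axis=0)
```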


Read also

Image reconstruction in Single Photon Emission Computed Tomography (SPECT) is affected by physical effects such as photon attenuation, Compton scatter and detector response. These effects can be compensated for by modeling the corresponding spread of photons in 3D within the system matrix used for tomographic reconstruction. The fully 3D Monte Carlo (F3DMC) reconstruction technique consists in calculating this system matrix using Monte Carlo simulations. The inverse problem of tomographic reconstruction is then solved using conventional iterative algorithms such as maximum likelihood expectation maximization (MLEM). Although F3DMC has already shown promising results, its use is currently limited by two major issues: huge size of the fully 3D system matrix and long computation time required for calculating a robust and accurate system matrix. To address these two issues, we propose to calculate the F3DMC system matrix using a spatial sampling matching the functional regions to be reconstructed. In this approach, different regions of interest can be reconstructed with different spatial sampling. For instance, a single value is reconstructed for a functional region assumed to contain uniform activity. To assess the value of this approach, Monte Carlo simulations have been performed using GATE. Results suggest that F3DMC reconstruction using functional regions improves quantitative accuracy compared to the F3DMC reconstruction method proposed so far. In addition, it considerably reduces disk space requirement and duration of the simulations needed to estimate the system matrix. The concept of functional regions might therefore make F3DMC reconstruction practically feasible.
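A minimal numeric sketch of the functional-region idea is given below (Python; the system matrix, region labels and projection data are randomly generated toy objects, not outputs of GATE): voxel columns of the system matrix are summed within each region, and standard MLEM iterations are then run on the much smaller set of region-level unknowns.

```python
# Toy illustration of region-aggregated MLEM, not the F3DMC implementation.
import numpy as np

rng = np.random.default_rng(1)

n_bins, n_voxels, n_regions = 200, 50, 5

# Voxel-level system matrix A[j, i] = P(detection in bin j | emission in voxel i)
A = rng.random((n_bins, n_voxels)) * 0.01

# Assign each voxel to a functional region assumed to contain uniform activity;
# summing columns over each region yields a much smaller system matrix.
regions = rng.integers(0, n_regions, n_voxels)
A_reg = np.zeros((n_bins, n_regions))
for r in range(n_regions):
    A_reg[:, r] = A[:, regions == r].sum(axis=1)

true_activity = rng.random(n_regions)
p = rng.poisson(A_reg @ true_activity * 1e4) / 1e4   # noisy projection data

# Standard MLEM iterations on the region-level unknowns.
x = np.ones(n_regions)
sens = A_reg.sum(axis=0)                             # sensitivity per region
for _ in range(50):
    fp = A_reg @ x                                   # forward projection
    x *= (A_reg.T @ (p / np.maximum(fp, 1e-12))) / np.maximum(sens, 1e-12)
```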
Monte Carlo algorithms have a growing impact on nuclear medicine reconstruction processes. One of the main limitations of myocardial perfusion imaging (MPI) is the effective mitigation of the scattering component, which is particularly challenging in Single Photon Emission Computed Tomography (SPECT). In SPECT, no timing information can be retrieved to locate the primary source photons. Monte Carlo methods allow an event-by-event simulation of the scattering kinematics, which can be incorporated into a model of the imaging system response. This approach has been adopted by several authors since the late Nineties and has recently taken advantage of the increased computational power made available by high-performance CPUs and GPUs. These developments enable fast image reconstruction with improved image quality compared to deterministic approaches, which are based on energy-windowing of the detector response and on the cumulative estimate and subtraction of the scattering component. In this paper, we review the main strategies and algorithms to correct for the scattering effect in SPECT and focus on Monte Carlo developments, which nowadays allow the three-dimensional reconstruction of SPECT cardiac images in a few seconds.
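For concreteness, the snippet below sketches the classic dual-energy-window subtraction that the deterministic, energy-windowing approaches rely on; the counts and the window scaling factor k are illustrative assumptions, not values from any of the reviewed works.

```python
# Toy dual-energy-window (DEW) scatter subtraction sketch.
import numpy as np

photopeak = np.array([120., 135., 150.])   # counts in the photopeak window (assumed)
scatter_win = np.array([40., 48., 55.])    # counts in a lower scatter window (assumed)

k = 0.5                                    # window-width scaling factor (assumed)
scatter_estimate = k * scatter_win
primary_estimate = np.clip(photopeak - scatter_estimate, 0.0, None)
```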
We introduce a variant of the Hybrid Monte Carlo (HMC) algorithm to address large-deviation statistics in stochastic hydrodynamics. Based on the path-integral approach to stochastic (partial) differential equations, our HMC algorithm samples space-time histories of the dynamical degrees of freedom under the influence of random noise. First, we validate and benchmark the HMC algorithm by reproducing multiscale properties of the one-dimensional Burgers equation driven by Gaussian and white-in-time noise. Second, we show how to implement an importance sampling protocol to significantly enhance, by orders of magnitude, the probability of sampling extreme and rare events, making it possible to estimate moments of field variables of extremely high order (up to 30 and more). By employing reweighting techniques, we map the biased configurations back to the original probability measure in order to probe their statistical importance. Finally, we show that by biasing the system towards very intense negative gradients, the HMC algorithm is able to explore the statistical fluctuations around instanton configurations. Our results will also be interesting and relevant in lattice gauge theory since they provide insight into reweighting techniques.
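The following is a generic Hamiltonian/Hybrid Monte Carlo sketch on a toy Gaussian target, showing the leapfrog-plus-Metropolis structure the abstract refers to; it is not the space-time path-integral sampler used for the stochastic Burgers equation, and the target, step size and trajectory length are arbitrary choices.

```python
# Generic HMC sketch: leapfrog integration followed by a Metropolis test.
import numpy as np

rng = np.random.default_rng(2)

def U(q):            # potential = -log target density (toy isotropic Gaussian)
    return 0.5 * np.dot(q, q)

def grad_U(q):
    return q

def hmc_step(q, eps=0.1, n_leapfrog=20):
    p = rng.standard_normal(q.shape)          # resample momenta
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of Hamilton's equations
    p_new -= 0.5 * eps * grad_U(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)
    # Metropolis accept/reject on the change in total energy
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    return q_new if np.log(rng.random()) < -dH else q

q = np.zeros(10)
samples = []
for _ in range(1000):
    q = hmc_step(q)
    samples.append(q.copy())
```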
We describe the Monte Carlo (MC) simulation package of the Borexino detector and discuss the agreement of its output with data. The Borexino MC ab initio simulates the energy loss of particles in all detector components and generates the resulting scintillation photons and their propagation within the liquid scintillator volume. The simulation accounts for absorption, reemission, and scattering of the optical photons and tracks them until they either are absorbed or reach the photocathode of one of the photomultiplier tubes. Photon detection is followed by a comprehensive simulation of the readout electronics response. The algorithm proceeds with a detailed simulation of the electronics chain. The MC is tuned using data collected with radioactive calibration sources deployed inside and around the scintillator volume. The simulation reproduces the energy response of the detector, its uniformity within the fiducial scintillator volume relevant to neutrino physics, and the time distribution of detected photons to better than 1% between 100 keV and several MeV. The techniques developed to simulate the Borexino detector and their level of refinement are of possible interest to the neutrino community, especially for current and future large-volume liquid scintillator experiments such as Kamland-Zen, SNO+, and Juno.
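As a loose illustration of this kind of optical-photon tracking, the toy loop below propagates photons with exponentially sampled absorption and scattering distances inside a spherical volume; the lengths, geometry and termination conditions are invented and bear no relation to the actual Borexino optical model.

```python
# Toy optical-photon transport loop (illustrative only).
import numpy as np

rng = np.random.default_rng(3)

ABS_LEN, SCAT_LEN = 10.0, 5.0      # assumed absorption / scattering lengths (m)
DETECTOR_RADIUS = 6.0              # assumed sensitive radius (m)

def track_photon(pos, direction):
    """Follow one photon until it is absorbed or leaves the sensitive volume."""
    while True:
        # Sample distances to the next absorption and scattering events
        d_abs = rng.exponential(ABS_LEN)
        d_sca = rng.exponential(SCAT_LEN)
        step = min(d_abs, d_sca)
        pos = pos + step * direction
        if np.linalg.norm(pos) > DETECTOR_RADIUS:
            return "detected", pos            # reached the boundary (toy "photocathode")
        if d_abs < d_sca:
            return "absorbed", pos
        # Isotropic re-scattering: new random unit direction
        direction = rng.standard_normal(3)
        direction /= np.linalg.norm(direction)

results = [track_photon(np.zeros(3), np.array([0., 0., 1.]))[0] for _ in range(1000)]
```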
M. Antonello (2018)
SABRE (Sodium-iodide with Active Background REjection) is a direct dark matter search experiment based on an array of radio-pure NaI(Tl) crystals surrounded by a liquid scintillator veto. Twin SABRE experiments in the Northern and Southern Hemispheres will differentiate a dark matter signal from seasonal and local effects. The experiment is currently in a Proof-of-Principle (PoP) phase, whose goal is to demonstrate that the background rate is low enough to carry out an independent search for a dark matter signal, with sufficient sensitivity to confirm or refute the DAMA result during the following full-scale experimental phase. The impact of background radiation from the detector materials and the experimental site needs to be carefully investigated, including both intrinsic and cosmogenically activated radioactivity. Based on the best knowledge of the most relevant sources of background, we have performed a detailed Monte Carlo study evaluating the expected background in the dark matter search spectral region. The simulation model described in this paper guides the design of the full-scale experiment and will be fundamental for the interpretation of the measured background and hence for the extraction of a possible dark matter signal.