
Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

Added by: Sergei Zavgorodni
Publication date: 2009
Field: Physics
Language: English





Monte Carlo (MC) methods provide the most accurate dose calculations to date in heterogeneous media and complex geometries, which has spawned increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying the radiation beam and patient geometries need to be transferred to MC codes, such as BEAMnrc and DOSXYZnrc. Extracting these parameters from DICOM files is not a trivial task, and one that has previously been performed mostly using Matlab-based software. This paper describes the DICOM tags that contain the information required for MC modeling of conformal and IMRT plans, and reports the development of an in-house DICOM interface: a library (named Vega) of platform-independent, object-oriented C++ code. The Vega library is small and succinct, offering just the fundamental functions for reading, modifying, and writing DICOM files from a C++ program. The library is nevertheless flexible enough to extract all MC-required data from DICOM files, and to write MC-produced dose distributions into DICOM files that can then be processed in a treatment planning system environment. The library can be made available upon request to the authors.
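
To make concrete what "reading DICOM files from a C++ program" involves at the lowest level, here is a minimal sketch that walks the data elements of a DICOM Part 10 file, assuming the explicit-VR little-endian transfer syntax throughout. It is a hypothetical illustration of the underlying file format, not the Vega API, which is only available on request from the authors.

// Minimal walker over the data elements of a DICOM Part 10 file.
// Assumes explicit-VR little endian; a hypothetical sketch, not the Vega API.
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <fstream>

static uint16_t read_u16(std::istream& in) {
    unsigned char b[2];
    in.read(reinterpret_cast<char*>(b), 2);
    return static_cast<uint16_t>(b[0] | (b[1] << 8));   // little endian
}

static uint32_t read_u32(std::istream& in) {
    unsigned char b[4];
    in.read(reinterpret_cast<char*>(b), 4);
    return b[0] | (b[1] << 8) | (b[2] << 16) | (static_cast<uint32_t>(b[3]) << 24);
}

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s file.dcm\n", argv[0]); return 1; }
    std::ifstream in(argv[1], std::ios::binary);
    in.seekg(128);                        // skip the 128-byte preamble
    char magic[4];
    in.read(magic, 4);
    if (!in || std::memcmp(magic, "DICM", 4) != 0) {
        std::fprintf(stderr, "not a DICOM Part 10 file\n");
        return 1;
    }
    while (in.peek() != EOF) {
        uint16_t group = read_u16(in), element = read_u16(in);
        char vr[3] = {};
        in.read(vr, 2);                   // two-character value representation
        uint32_t len;
        // These VRs carry 2 reserved bytes and a 4-byte length; others a 2-byte length.
        if (!std::strcmp(vr, "OB") || !std::strcmp(vr, "OW") || !std::strcmp(vr, "OF") ||
            !std::strcmp(vr, "SQ") || !std::strcmp(vr, "UT") || !std::strcmp(vr, "UN")) {
            read_u16(in);                 // reserved bytes
            len = read_u32(in);
        } else {
            len = read_u16(in);
        }
        if (len == 0xFFFFFFFF) break;     // undefined-length sequences: out of scope here
        std::printf("(%04X,%04X) %s length=%u\n", group, element, vr, len);
        in.seekg(len, std::ios::cur);     // skip the value field
    }
    return 0;
}

A real interface layers value decoding (strings, numbers, nested sequences) and tag lookup on top of this element walk; that is the part a library such as Vega encapsulates.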



Related research

The International Electrotechnical Commission (IEC) has previously defined standard rotation operators for positive gantry, collimator and couch rotations in the radiotherapy DICOM coordinate system that is commonly used by treatment planning systems. Coordinate transformations to the coordinate systems of commonly used Monte Carlo (MC) codes (BEAMnrc/DOSXYZnrc and VMC++) have been derived and published in the literature. However, these coordinate transformations disregard patient orientation during the computed tomography (CT) scan, and assume the most commonly used head-first, supine orientation. While less common, other patient orientations are used in clinics; Monte Carlo verification of such treatments can be problematic due to the lack of appropriate coordinate transformations. In this work, a solution has been obtained by correcting the CT-derived phantom orientation and deriving generalized coordinate transformations for field angles in the DOSXYZnrc and VMC++ codes. The rotation operator that accommodates any possible patient treatment orientation was determined using the DICOM Image Orientation tag (0020,0037). The derived transformations of the patient image and beam direction angles were verified by comparing MC dose distributions with those of the Eclipse treatment planning system, calculated for each of the eight possible patient orientations.
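As a concrete illustration of the starting point of such a derivation, the sketch below builds the image-to-patient rotation matrix from the six direction cosines stored in Image Orientation (Patient), tag (0020,0037). This is generic DICOM geometry handling, not the paper's full beam-angle transformation; the HFS cosine values shown are one of the eight possible orientations.

// Build the image-to-patient rotation matrix from tag (0020,0037):
// six direction cosines, row direction first, then column direction.
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a[1]*b[2] - a[2]*b[1],
             a[2]*b[0] - a[0]*b[2],
             a[0]*b[1] - a[1]*b[0] };
}

int main() {
    // Example: head-first supine (HFS); the other seven patient orientations
    // permute and flip the signs of these cosines.
    double iop[6] = {1, 0, 0, 0, 1, 0};   // as stored in (0020,0037)
    Vec3 row = {iop[0], iop[1], iop[2]};  // direction of increasing column index
    Vec3 col = {iop[3], iop[4], iop[5]};  // direction of increasing row index
    Vec3 nrm = cross(row, col);           // slice normal completes the basis
    // Columns of R map the image (i, j, k) axes into the DICOM patient system
    // (+x: patient left, +y: posterior, +z: superior).
    double R[3][3] = {
        { row[0], col[0], nrm[0] },
        { row[1], col[1], nrm[1] },
        { row[2], col[2], nrm[2] },
    };
    for (int r = 0; r < 3; ++r)
        std::printf("% .1f % .1f % .1f\n", R[r][0], R[r][1], R[r][2]);
    return 0;
}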
Statistical Data Assimilation (SDA) is the transfer of information from field or laboratory observations to a user-selected model of the dynamical system producing those observations. The data are noisy and the model has errors; the information transfer addresses properties of the conditional probability distribution of the states of the model conditioned on the observations. The quantities of interest in SDA are the conditional expected values of functions of the model state, and these require the approximate evaluation of high-dimensional integrals. We introduce a conditional probability distribution and use the Laplace method with annealing to identify the maxima of the conditional probability distribution. The annealing method slowly increases the precision term of the model as it enters the Laplace method. In this paper, we extend the idea of precision annealing (PA) to Monte Carlo calculations of conditional expected values using Metropolis-Hastings methods.
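A minimal sketch of the idea, assuming a toy one-dimensional model-error term and a symmetric random-walk proposal (neither of which comes from the paper itself), might look like this: the model precision Rf is raised slowly, and at each precision a Metropolis-Hastings chain estimates a conditional expected value.

// Toy precision-annealing Metropolis-Hastings in one dimension.
// Hypothetical illustration of the idea in the abstract, not the authors' code.
#include <cmath>
#include <cstdio>
#include <random>

// Negative log of the (unnormalized) conditional density: a measurement
// term tying x to the observation y, plus a model term with precision Rf.
static double neg_log_p(double x, double y, double Rf) {
    double meas  = 0.5 * (x - y) * (x - y);              // measurement error
    double model = 0.5 * Rf * std::pow(x * x - 1.0, 2);  // toy model constraint
    return meas + model;
}

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> step(0.0, 0.5);
    std::uniform_real_distribution<double> unif(0.0, 1.0);

    const double y = 0.8;   // a noisy observation
    double x = 0.0;
    for (double Rf = 0.01; Rf <= 100.0; Rf *= 2.0) {      // annealing schedule
        double sum = 0.0;
        const int n = 20000;
        for (int i = 0; i < n; ++i) {
            double xp = x + step(rng);                    // symmetric proposal
            double dE = neg_log_p(xp, y, Rf) - neg_log_p(x, y, Rf);
            if (dE < 0.0 || unif(rng) < std::exp(-dE)) x = xp;  // MH accept
            sum += x;                                     // accumulate <x>
        }
        std::printf("Rf=%8.2f  <x> = %+.3f\n", Rf, sum / n);
    }
    return 0;
}

At small Rf the chain explores freely; as Rf grows, the model term dominates and the estimate of <x> settles toward the model-consistent maximum nearest the data.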
The REST-for-Physics (Rare Event Searches Toolkit for Physics) framework is a ROOT-based solution providing the means to process and analyze experimental or Monte Carlo event data. Special care has been taken over the traceability of the code and the validation of the results produced within the framework, together with the connectivity between the code and the stored data, registered through specific version metadata members. The framework development was originally motivated by the needs of rare event search experiments (experiments looking for phenomena with extremely low occurrence probability, such as dark matter or neutrino interactions or rare nuclear decays), and its components naturally implement tools to address the challenges in these kinds of experiments; the integration of a detector physics response, the implementation of signal processing routines, and topological algorithms for physical event identification are some examples. Despite this specialization, the framework was conceived with scalability in mind, and other event-oriented applications could benefit from the data processing routines and/or metadata descriptions implemented in REST, since the generic framework tools are completely decoupled from the dedicated libraries. REST-for-Physics is a consolidated piece of software already serving the needs of different physics experiments, which use gaseous Time Projection Chambers (TPCs) as their detection technology, for background data analysis and detector characterization, as well as generic detector R&D. Even though REST has been exploited mainly with gaseous TPCs, the code could easily be applied or adapted to other detection technologies. We present in this work an overview of REST-for-Physics, providing a broad perspective on the infrastructure and organization of the project as a whole. The framework and its different components are described in the text.
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature, to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general-purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time, to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the ones that best reproduce experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding compatibility with experiment for K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
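The validation described here rests on goodness-of-fit statistics between calculated and measured cross sections. A generic sketch of such a chi-squared comparison, with purely illustrative numbers rather than the paper's data, is:

// Generic chi-squared goodness-of-fit between a model's cross sections and
// experimental measurements with uncertainties; a sketch of the kind of test
// used in validation studies, not the paper's actual analysis code.
#include <cstdio>
#include <vector>

struct Point { double energy_keV, sigma_exp, err_exp, sigma_model; };

int main() {
    // Hypothetical illustrative data: cross sections in barns/atom.
    std::vector<Point> data = {
        {10.0, 5.200, 0.150, 5.050},
        {20.0, 0.740, 0.030, 0.770},
        {50.0, 0.052, 0.004, 0.049},
    };
    double chi2 = 0.0;
    for (const auto& p : data) {
        double r = (p.sigma_exp - p.sigma_model) / p.err_exp;  // normalized residual
        chi2 += r * r;
    }
    int ndf = static_cast<int>(data.size());  // no fitted parameters here
    std::printf("chi2 = %.2f, ndf = %d, chi2/ndf = %.2f\n", chi2, ndf, chi2 / ndf);
    return 0;
}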
We present a new Markov Chain Monte Carlo algorithm for CMB analysis in the low signal-to-noise regime. This method builds on and complements the previously described CMB Gibbs sampler, and effectively solves the low signal-to-noise inefficiency problem of the direct Gibbs sampler. The new algorithm is a simple Metropolis-Hastings sampler with a general proposal rule for the power spectrum, C_l, followed by a particular deterministic rescaling operation on the sky signal. The acceptance probability for this joint move depends on the sky map only through the difference in chi-squared between the original and proposed sky samples, and it is close to unity in the low signal-to-noise regime. The algorithm is completed by alternating this move with a standard Gibbs move. Together, these two proposals constitute a computationally efficient algorithm for mapping out the full joint CMB posterior in both the high and low signal-to-noise regimes.
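Read against the abstract, the joint move plausibly takes the following form (a hedged reconstruction, not the paper's stated equations; d is the data map, s the sky sample, N the noise covariance, and the power spectrum proposal is assumed symmetric so that it cancels in the ratio):

% Hedged reconstruction of the joint (C_l, s) move described in the abstract.
% Propose C'_l, rescale the sky sample deterministically, and accept with
% a probability that depends on the map only through the chi-squared difference.
\[
  s'_{\ell m} = \sqrt{\frac{C'_\ell}{C_\ell}}\, s_{\ell m},
  \qquad
  \chi^2(s) = (d - s)^{\mathsf{T}} N^{-1} (d - s),
  \qquad
  A = \min\!\left\{ 1,\; \exp\!\left( -\tfrac{1}{2}\left[ \chi^2(s') - \chi^2(s) \right] \right) \right\}.
\]

In the low signal-to-noise regime s is small compared with the noise, so the rescaling barely changes chi-squared and A stays near unity, which is what makes the move efficient exactly where the direct Gibbs sampler stalls.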
