
Refactoring, reengineering and evolution: paths to Geant4 uncertainty quantification and performance improvement

Added by Maria Grazia Pia
Publication date: 2012
Field: Physics
Language: English





Ongoing investigations into the improvement of Geant4 accuracy and computational performance resulting from refactoring and reengineering parts of the code are discussed. Issues in refactoring that are specific to the domain of physics simulation are identified and their impact is elucidated. Preliminary quantitative results are reported.




Related research

An accurate description of interactions between thermal neutrons (below 4 eV) and materials is key to simulating the transport of neutrons in a wide range of applications such as criticality safety, reactor physics, compact accelerator-driven neutron sources, radiological shielding, and nuclear instrumentation, to name a few. While the Monte Carlo transport code Geant4 was initially developed to simulate particle physics experiments, with a large emphasis on modeled cross-sections for all known particles at all conceivable energies, its use has spread to neutronics applications, requiring evaluated cross-sections for neutrons and gammas between $0$ and $20$ MeV (the so-called neutron High Precision, HP, package), as well as a proper offline or on-the-fly treatment of these cross-sections. In this paper we point out limitations affecting the Geant4 (version 10.07.p01) thermal neutron treatment and the associated nuclear data libraries, using comparisons with the reference Monte Carlo neutron transport code TRIPOLI, version 11, and we present the results of various modifications of the Geant4 neutron-HP package required to overcome these limitations. In addition, to broaden the support of nuclear data libraries compatible with Geant4, a nuclear data processing tool has been developed and validated, allowing the code to be used with, for example, the ENDF/B-VIII.0 and JEFF-3.3 libraries. These changes should be taken into account in an upcoming Geant4 release.
The macroscopic behavior of many materials is complex and the end result of mechanisms that operate across a broad range of disparate scales. An imperfect knowledge of material behavior across scales is a source of epistemic uncertainty of the overall material behavior. However, assessing this uncertainty is difficult due to the complex nature of material response and the prohibitive computational cost of integral calculations. In this paper, we exploit the multiscale and hierarchical nature of material response to develop an approach to quantify the overall uncertainty of material response without the need for integral calculations. Specifically, we bound the uncertainty at each scale and then combine the partial uncertainties in a way that provides a bound on the overall or integral uncertainty. The bound provides a conservative estimate on the uncertainty. Importantly, this approach does not require integral calculations that are prohibitively expensive. We demonstrate the framework on the problem of ballistic impact of a polycrystalline magnesium plate. Magnesium and its alloys are of current interest as promising light-weight structural and protective materials. Finally, we remark that the approach can also be used to study the sensitivity of the overall response to particular mechanisms at lower scales in a materials-by-design approach.
The forward problems of pattern formation have been greatly empowered by extensive theoretical studies and simulations; the inverse problem, however, is less well understood. It remains unclear how accurately one can use images of pattern formation to learn the functional forms of the nonlinear and nonlocal constitutive relations in the governing equation. We use PDE-constrained optimization to infer the governing dynamics and constitutive relations, and use Bayesian inference and linearization to quantify their uncertainties in different systems, operating conditions, and imaging conditions. We discuss the conditions that reduce the uncertainty of the inferred functions and the correlation between them, such as state-dependent free energy and reaction kinetics (or diffusivity). We present the inversion algorithm and illustrate its robustness and uncertainties under limited spatiotemporal resolution, unknown boundary conditions, blurry initial conditions, and other non-ideal situations. In certain situations, prior physical knowledge can be included to constrain the result. Phase-field, reaction-diffusion, and phase-field-crystal models are used as model systems. The approach developed here can find applications in inferring unknown physical properties of complex pattern-forming systems and in guiding their experimental design.
M. Augelli (2009)
The Geant4 toolkit offers a rich variety of electromagnetic physics models; so far the evaluation of this Geant4 domain has been mostly focused on its physics functionality, while the features of its design and their impact on simulation accuracy, computational performance and facilities for verification and validation have not been the object of comparable attention yet, despite the critical role they play in many experimental applications. A new project is in progress to study the application of new design concepts and software techniques in Geant4 electromagnetic physics, and to evaluate how they can improve on the current simulation capabilities. The application of a policy-based class design is investigated as a means to achieve the objective of granular decomposition of processes; this design technique offers various advantages in terms of flexibility of configuration and computational performance. The current Geant4 physics models have been re-implemented according to the new design as a pilot project. The main features of the new design and first results of performance improvement and testing simplification are presented; they are relevant to many Geant4 applications, where computational speed and the containment of resources invested in simulation production and quality assurance play a critical role.
We present a simple and robust strategy for the selection of sampling points in Uncertainty Quantification. The goal is to achieve the fastest possible convergence in the cumulative distribution function of a stochastic output of interest. We assume that the output of interest is the outcome of a computationally expensive nonlinear mapping of an input random variable, whose probability density function is known. We use a radial function basis to construct an accurate interpolant of the mapping. This strategy enables adding new sampling points one at a time, adaptively. This takes into full account the previous evaluations of the target nonlinear function. We present comparisons with a stochastic collocation method based on the Clenshaw-Curtis quadrature rule, and with an adaptive method based on hierarchical surplus, showing that the new method often results in a large computational saving.
