MCViNE (Monte-Carlo VIrtual Neutron Experiment) is a versatile Monte Carlo (MC) neutron ray-tracing program that provides researchers with tools for performing computer modeling and simulations that mirror real neutron scattering experiments. By adopting modern software engineering practices such as using composite and visitor design patterns for representing and accessing neutron scatterers, and using recursive algorithms for multiple scattering, MCViNE is flexible enough to handle sophisticated neutron scattering problems including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can take advantage of simulation components in linear-chain-based MC ray tracing packages widely used in instrument design and optimization, as well as NumPy-based components that make prototypes useful and easy to develop. These developments have enabled us to carry out detailed simulations of neutron scattering experiments with non-trivial samples in time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
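The combination of composite and visitor patterns described above can be illustrated with a minimal sketch. This is an illustrative example only, not MCViNE's actual API: the class names (`Scatterer`, `CompositeScatterer`, `NameCollector`) and the tree layout are assumptions chosen to show how a visitor can traverse a nested scatterer assembly (e.g. a sample inside a can inside a cryostat) without the traversal logic living in the scatterer classes themselves.

```python
class Scatterer:
    """Leaf node: a single homogeneous scattering element."""
    def __init__(self, name):
        self.name = name

    def accept(self, visitor):
        return visitor.visit_leaf(self)


class CompositeScatterer(Scatterer):
    """Composite node: a scatterer assembled from sub-scatterers."""
    def __init__(self, name, children):
        super().__init__(name)
        self.children = list(children)

    def accept(self, visitor):
        return visitor.visit_composite(self)


class NameCollector:
    """Visitor that walks the scatterer tree, standing in for operations
    such as intersection tests or multiple-scattering dispatch."""
    def visit_leaf(self, leaf):
        return [leaf.name]

    def visit_composite(self, composite):
        names = [composite.name]
        for child in composite.children:
            names.extend(child.accept(self))
        return names


# A nested assembly: sample inside a can, both inside a cryostat,
# plus a separate shield element at the top level.
assembly = CompositeScatterer(
    "cryostat",
    [CompositeScatterer("can", [Scatterer("sample")]), Scatterer("shield")],
)
print(NameCollector().visit_composite(assembly))
```

New scattering operations can then be added as new visitor classes, leaving the scatterer hierarchy untouched; this separation is what makes the pattern attractive for handling diverse samples and sample environments.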
Monte Carlo simulations are widely used in many areas, including particle accelerators. In this lecture, after a short introduction and a review of some statistical background, we discuss methods for sampling a probability distribution function, such as direct inversion, the rejection method, and Markov chain Monte Carlo, as well as variance-reduction methods for evaluating numerical integrals with Monte Carlo simulation. We also briefly introduce quasi-Monte Carlo sampling at the end of the lecture.
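The rejection method mentioned above can be sketched in a few lines. The target distribution here, $p(x) = \sin(x)/2$ on $[0, \pi]$ with a uniform proposal, is an illustrative choice and not taken from the lecture: a point $x$ drawn from the proposal is accepted with probability $p(x)/(M\,q(x))$, where $M$ bounds $p/q$.

```python
import math
import random

def rejection_sample(n, seed=0):
    """Sample n points from p(x) = sin(x)/2 on [0, pi] by rejection."""
    rng = random.Random(seed)
    samples = []
    # Proposal q is uniform on [0, pi], so q = 1/pi; with M = pi/2 the
    # acceptance probability p(x)/(M q(x)) simplifies to sin(x).
    while len(samples) < n:
        x = rng.uniform(0.0, math.pi)   # draw from the proposal q
        u = rng.uniform(0.0, 1.0)       # uniform variate for the test
        if u <= math.sin(x):            # accept with probability sin(x)
            samples.append(x)
    return samples

xs = rejection_sample(50_000)
mean = sum(xs) / len(xs)
print(f"sample mean ~ {mean:.3f} (exact: pi/2 = {math.pi/2:.3f})")
```

By symmetry the exact mean of this distribution is $\pi/2$, so the sample mean provides a quick sanity check of the sampler.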
Multi-object adaptive optics (MOAO) has been demonstrated by the CANARY instrument on the William Herschel Telescope. However, for proposed MOAO systems on the next generation Extremely Large Telescopes, such as EAGLE, many challenges remain. Here we investigate requirements that MOAO operation places on deformable mirrors (DMs) using a full end-to-end Monte-Carlo AO simulation code. By taking into consideration a prior global ground-layer (GL) correction, we show that actuator density for the MOAO DMs can be reduced with little performance loss. We note that this reduction is only possible with the addition of a GL DM, whose order is greater than or equal to that of the original MOAO mirrors. The addition of a GL DM of lesser order does not affect system performance (if tip/tilt star sharpening is ignored). We also quantify the maximum mechanical DM stroke requirements (3.5 $\mu$m desired) and provide tolerances for the DM alignment accuracy, both lateral (to within an eighth of a sub-aperture) and rotational (to within 0.2$^\circ$). By presenting results over a range of laser guide star asterism diameters, we ensure that these results are equally applicable for laser tomographic AO systems. We provide the opportunity for significant cost savings to be made in the implementation of MOAO systems, resulting from the lower requirement for DM actuator density.
In this work we demonstrate the use of the VegasFlow library in multi-device settings: multi-GPU within a single node and multi-node across a cluster. VegasFlow is a new software package for the fast evaluation of highly parallelizable integrals based on Monte Carlo integration. It is inspired by the Vegas algorithm, often used as the driver of cross-section integrations, and is built on Google's powerful TensorFlow library. In these proceedings we consider a typical multi-GPU configuration to benchmark how different batch sizes can increase (or decrease) performance on a leading-order example integration.
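The batched Monte Carlo integration being benchmarked can be sketched in plain Python. This is a minimal stand-in, not VegasFlow itself and not the Vegas adaptive algorithm: it estimates $\int_0^1 x^2\,dx = 1/3$ by averaging the integrand over uniform draws, accumulating the sum in fixed-size batches the way a batched (e.g. GPU-oriented) integrator would; the function, sample count, and batch size are all illustrative.

```python
import random

def mc_integrate(f, n_total, batch_size, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0, 1],
    accumulated batch by batch."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    while count < n_total:
        n = min(batch_size, n_total - count)
        # One batch of uniform samples on [0, 1]
        total += sum(f(rng.random()) for _ in range(n))
        count += n
    return total / count

estimate = mc_integrate(lambda x: x * x, n_total=200_000, batch_size=10_000)
print(f"estimate ~ {estimate:.4f} (exact 1/3 ~ 0.3333)")
```

In a real multi-device setup each batch would be evaluated as one device-sized workload, which is why batch size trades off device utilization against memory, the effect the proceedings benchmark.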
The issue of how epistemic uncertainties affect the outcome of Monte Carlo simulation is discussed by means of a concrete use case: the simulation of the longitudinal energy deposition profile of low energy protons. A variety of electromagnetic and hadronic physics models is investigated, and their effects are analyzed. Possible systematic effects are highlighted. The results identify requirements for experimental measurements capable of reducing epistemic uncertainties in the physics models.
Several models for the Monte Carlo simulation of Compton scattering on electrons are quantitatively evaluated against a large collection of experimental data retrieved from the literature. Some of these models are currently implemented in general-purpose Monte Carlo systems; others have been implemented and evaluated in this study for the first time, for possible use in Monte Carlo particle transport. Here we present first, preliminary results concerning total and differential Compton scattering cross sections.