Pathways-reduced analysis is one of the techniques used by the Fispact-II nuclear activation and transmutation software to study the sensitivity of the computed inventories to uncertainties in reaction cross-sections. Although deciding which pathways are most important is very helpful, for example in determining which nuclear data would benefit from further refinement, pathways-reduced analysis does not necessarily identify the most critical reaction, since one reaction may contribute to several different pathways. This work examines three different techniques for ranking reactions in order of their importance in determining the final inventory, comparing the pathways-based metric (PBM), the direct method, and one based on the Pearson correlation coefficient. Reasons why the PBM is to be preferred are presented.
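As an illustration of the correlation-based ranking mentioned above, the following is a minimal sketch, assuming a toy linear surrogate in place of a full Fispact-II inventory calculation; the reaction weights, uncertainty widths, and sample sizes are hypothetical.

```python
# Hypothetical sketch: rank reactions by the Pearson correlation between
# sampled cross-section scalings and a (toy) final inventory response.
# The linear surrogate below stands in for repeated inventory calculations
# and is purely illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_reactions, n_samples = 5, 1000

# Fractional cross-section scalings, sampled independently per reaction.
scalings = rng.normal(loc=1.0, scale=0.05, size=(n_samples, n_reactions))

# Toy response: final inventory as a weighted combination of the scalings.
weights = np.array([0.1, 0.7, 0.05, 0.4, 0.02])
inventory = scalings @ weights

# Pearson correlation of each reaction's scaling with the inventory.
corr = np.array([np.corrcoef(scalings[:, j], inventory)[0, 1]
                 for j in range(n_reactions)])
ranking = np.argsort(-np.abs(corr))
print("reactions ranked by |Pearson r|:", ranking, corr[ranking])
```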
The paper explores the numerical stability and computational efficiency of a direct method for unfolding the resolution function from measurements of neutron-induced reactions. A detailed resolution-function formalism is laid out, followed by an overview of the challenges present in a practical implementation of the method. A special matrix storage scheme is developed both to facilitate the memory management of the resolution function matrix and to increase the computational efficiency of the matrix multiplication and decomposition procedures. Owing to its favourable computational properties, a Cholesky decomposition is at the heart of the unfolding procedure. With the smallest necessary modification of the matrix to be decomposed, the method is successfully applied to a system of size $10^5 \times 10^5$. However, the amplification of uncertainties during the direct inversion procedure limits the applicability of the method to high-precision measurements of neutron-induced reactions.
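A minimal sketch of the Cholesky-based solve with a small diagonal modification, using SciPy on a toy Gaussian resolution matrix; the matrix size, jitter values, and retry policy are illustrative assumptions, not details from the paper.

```python
# Sketch of the unfolding step: solve R x = y for a (near) symmetric
# positive-definite resolution matrix R via Cholesky, adding a small
# diagonal "jitter" when the decomposition fails numerically.
import numpy as np
from scipy.linalg import cho_factor, cho_solve, LinAlgError

def unfold(R, y, jitter=1e-10, max_tries=5):
    """Return x solving R x = y via Cholesky, regularizing if needed."""
    eps = jitter * np.trace(R) / R.shape[0]
    for _ in range(max_tries):
        try:
            c, low = cho_factor(R, lower=True)
            return cho_solve((c, low), y)
        except LinAlgError:
            # Smallest necessary modification: bump the diagonal and retry.
            R = R + eps * np.eye(R.shape[0])
            eps *= 10.0
    raise RuntimeError("matrix could not be made positive definite")

# Toy resolution matrix: Gaussian smearing on a coarse grid (ill-conditioned,
# so the jitter fallback is actually exercised).
n = 500
i = np.arange(n)
R = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)
truth = np.sin(i / 25.0) ** 2
x = unfold(R, R @ truth)
```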
We present a new method for sampling rare and large fluctuations in a non-equilibrium system governed by a stochastic partial differential equation (SPDE) with additive forcing. To this end, we deploy the so-called instanton formalism, which corresponds to a saddle-point approximation of the action in the path-integral formulation of the underlying SPDE. The crucial step in our approach is the formulation of an alternative SPDE that incorporates knowledge of the instanton solution, such that we are able to constrain the dynamical evolution to the vicinity of extreme flow configurations only. Finally, a reweighting procedure based on the Girsanov theorem is applied to recover the full distribution function of the original system. The entire procedure is demonstrated for the one-dimensional Burgers equation. Furthermore, we compare our method to conventional direct numerical simulations as well as to Hybrid Monte Carlo methods. We show that the instanton-based sampling method outperforms both approaches and allows for an accurate quantification of the whole probability density function of velocity gradients, from the core to the very far tails.
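The Girsanov reweighting step can be illustrated on a much simpler system. The sketch below applies a constant drift tilt to a 1D Ornstein-Uhlenbeck process with additive noise (standing in for the SPDE) and accumulates the exponential Girsanov weight that recovers unbiased tail probabilities under the original dynamics; all parameter values are illustrative.

```python
# Girsanov reweighting for dX = -X dt + sigma dW: simulate under the tilted
# drift b + sigma*u, then weight each path by exp(-int u dW - 0.5 int u^2 dt)
# to recover expectations under the original measure.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, dt, sigma, u = 20_000, 200, 0.01, 0.5, 2.0

x = np.zeros(n_paths)
log_w = np.zeros(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    x += (-x + sigma * u) * dt + sigma * dW    # tilted (biased) dynamics
    log_w += -u * dW - 0.5 * u**2 * dt         # Girsanov log-weight

# Unbiased estimate of a rare-tail probability under the *original* dynamics.
p_rare = np.mean((x > 1.0) * np.exp(log_w))
print(f"P(X_T > 1) = {p_rare:.3e} (importance-weighted)")
```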
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as those best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding compatibility with experiment for K- and L-shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similarly extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
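The core of such a validation step can be sketched as follows: interpolate a tabulated cross section model (EPDL-style energy/cross-section pairs) onto the measured energies and apply a chi-squared compatibility test. The arrays below are placeholders, not real tabulations or experimental data.

```python
# Hypothetical model-vs-experiment comparison for a total photoionization
# cross section, using log-log interpolation and a chi-squared test.
import numpy as np
from scipy import stats

# Tabulated model: energies (keV) and total cross sections (barns).
e_tab = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
xs_tab = np.array([8.0e3, 1.2e3, 90.0, 12.0, 1.6, 0.11])

# Experimental points with 1-sigma uncertainties (placeholders).
e_exp = np.array([1.5, 4.0, 8.0, 30.0])
xs_exp = np.array([3.1e3, 1.7e2, 2.3e1, 4.5e-1])
dxs_exp = 0.05 * xs_exp

# Log-log interpolation, since these cross sections span many decades.
xs_model = np.exp(np.interp(np.log(e_exp), np.log(e_tab), np.log(xs_tab)))

chi2 = np.sum(((xs_exp - xs_model) / dxs_exp) ** 2)
p_value = stats.chi2.sf(chi2, df=len(e_exp))
print(f"chi2/ndf = {chi2 / len(e_exp):.2f}, p = {p_value:.3f}")
```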
Bayesian inference involves two main computational challenges. First, in estimating the parameters of some model for the data, the posterior distribution may well be highly multi-modal: a regime in which the convergence to stationarity of traditional Markov Chain Monte Carlo (MCMC) techniques becomes prohibitively slow. Second, in selecting between a set of competing models, the necessary estimation of the Bayesian evidence for each is, by definition, a (possibly high-dimensional) integration over the entire parameter space; again this can be a daunting computational task, although new Monte Carlo (MC) integration algorithms offer solutions of ever increasing efficiency. Nested sampling (NS) is one such contemporary MC strategy targeted at calculation of the Bayesian evidence, but which also enables posterior inference as a by-product, thereby allowing simultaneous parameter estimation and model selection. The widely-used MultiNest algorithm presents a particularly efficient implementation of the NS technique for multi-modal posteriors. In this paper we discuss importance nested sampling (INS), an alternative summation of the MultiNest draws, which can calculate the Bayesian evidence at up to an order of magnitude higher accuracy than 'vanilla' NS with no change in the way MultiNest explores the parameter space. This is accomplished by treating as a (pseudo-)importance sample the totality of points collected by MultiNest, including those previously discarded under the constrained likelihood sampling of the NS algorithm. We apply this technique to several challenging test problems and compare the accuracy of Bayesian evidences obtained with INS against those from vanilla NS.
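For orientation, a toy sketch of 'vanilla' nested sampling is given below, with a uniform prior on $[0,1]^2$ and a normalized Gaussian likelihood (so the analytic evidence is $\log Z \approx 0$); MultiNest's ellipsoidal sampling is replaced by naive rejection from the prior, which is viable only in low dimensions.

```python
# Toy vanilla nested sampling: shrink the prior volume geometrically and
# accumulate Z = sum L_i * dX_i over the discarded (lowest-likelihood) points.
import numpy as np

rng = np.random.default_rng(1)

def log_like(theta, s=0.1):
    # Normalized 2D Gaussian centered on (0.5, 0.5).
    return -0.5 * np.sum(((theta - 0.5) / s) ** 2) - np.log(2 * np.pi * s**2)

n_live, n_iter = 200, 1600
live = rng.uniform(size=(n_live, 2))
live_logL = np.array([log_like(t) for t in live])

log_Z, log_X = -np.inf, 0.0
for i in range(n_iter):
    worst = np.argmin(live_logL)
    log_X_new = -(i + 1) / n_live                     # E[log prior volume]
    log_w = np.log(np.exp(log_X) - np.exp(log_X_new))
    log_Z = np.logaddexp(log_Z, live_logL[worst] + log_w)
    log_X = log_X_new
    # Replace the worst live point by a prior draw above the constraint.
    while True:
        cand = rng.uniform(size=2)
        if log_like(cand) > live_logL[worst]:
            live[worst], live_logL[worst] = cand, log_like(cand)
            break

# Remaining live-point contribution at termination.
log_Z = np.logaddexp(log_Z, np.log(np.mean(np.exp(live_logL))) + log_X)
print(f"log Z = {log_Z:.2f} (analytic: ~0)")
```

INS differs from this sketch by also keeping the rejected candidate draws and recombining all points as a pseudo-importance sample, rather than using only the discarded worst points.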
The REST-for-Physics (Rare Event Searches Toolkit for Physics) framework is a ROOT-based solution providing the means to process and analyze experimental or Monte Carlo event data. Special care has been taken over the traceability of the code and the validation of the results produced within the framework, together with the connectivity between code and stored data, registered through specific version metadata members. The framework development was originally motivated by the needs of rare event search experiments (experiments looking for phenomena with extremely low occurrence probability, such as dark matter or neutrino interactions, or rare nuclear decays), and its components naturally implement tools to address the challenges in these kinds of experiments; the integration of a detector physics response, the implementation of signal processing routines, and topological algorithms for physical event identification are some examples. Despite this specialization, the framework was conceived with scalability in mind, and other event-oriented applications could benefit from the data processing routines and/or metadata descriptions implemented in REST, the generic framework tools being completely decoupled from the dedicated libraries. REST-for-Physics is a consolidated piece of software already serving the needs of different physics experiments - using gaseous Time Projection Chambers (TPCs) as detection technology - for background data analysis and detector characterization, as well as generic detector R&D. Even though REST has been exploited mainly with gaseous TPCs, the code could easily be applied or adapted to other detection technologies. We present in this work an overview of REST-for-Physics, providing a broad perspective on the infrastructure and organization of the project as a whole. The framework and its different components are described in the text.