Geant4Reweight is an open-source C++ framework that allows users to (1) weight tracks produced by the GEANT4 particle transport Monte Carlo simulation according to hadron interaction cross section variations and (2) estimate uncertainties in GEANT4 interaction models by comparing the simulation's hadron interaction cross section predictions to data. The ability to weight hadron transport as simulated by GEANT4 is crucial to the propagation of systematic uncertainties related to secondary hadronic interactions in current and upcoming neutrino oscillation experiments, including MicroBooNE, NOvA, and DUNE, as well as hadron test beam experiments such as ProtoDUNE. We provide motivation for weighting hadron tracks in GEANT4 in the context of systematic uncertainty propagation, a description of GEANT4's transport simulation technique, and a description of our weighting technique and fitting framework in the momentum range 0--10 GeV/c, which is typical for the hadrons produced by neutrino interactions in these experiments.
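As a rough illustration of the reweighting idea, the sketch below computes a per-track weight for a varied interaction cross section from the survival probability over each simulated step and, if the track interacts, the ratio of interaction probabilities. It assumes a single inelastic process in liquid argon with constant density; the struct, constants, and function names are illustrative and are not the Geant4Reweight API.

```cpp
// Minimal sketch of cross-section reweighting for a single hadron track,
// assuming one interaction process and a constant-density medium.
// All names here are illustrative, not the Geant4Reweight interface.
#include <cmath>
#include <vector>

struct Step {
  double length_cm;     // step length through the material
  double sigma_nom_mb;  // nominal cross section at this step's momentum
  double sigma_var_mb;  // varied cross section (e.g. nominal times a fit parameter)
};

// Number density of target nuclei (liquid argon, 1.396 g/cm^3), nuclei per cm^3.
constexpr double kNumberDensity = 1.396 / 39.948 * 6.022e23;
constexpr double kMbToCm2 = 1.0e-27;

double trackWeight(const std::vector<Step>& steps, bool interactedAtEnd) {
  double weight = 1.0;
  for (const auto& s : steps) {
    const double invLambdaNom = kNumberDensity * s.sigma_nom_mb * kMbToCm2;
    const double invLambdaVar = kNumberDensity * s.sigma_var_mb * kMbToCm2;
    // Ratio of survival probabilities over this step (varied / nominal).
    weight *= std::exp(-(invLambdaVar - invLambdaNom) * s.length_cm);
    // If the track interacts at the end of its last step, also weight the
    // interaction probability itself by the cross-section ratio.
    if (interactedAtEnd && &s == &steps.back()) {
      weight *= s.sigma_var_mb / s.sigma_nom_mb;
    }
  }
  return weight;
}
```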
Recent statistical evaluations of High-Energy Physics measurements, in particular those at the Large Hadron Collider, require the careful, simultaneous evaluation of many sources of systematic uncertainty. While the fundamental aspects of the statistical treatment are now consolidated, whether a frequentist or a Bayesian approach is adopted, managing many sources of uncertainty and their corresponding nuisance parameters in analyses that combine multiple control regions and decay channels can, in practice, pose challenging implementation issues. These make the analysis infrastructure complex and hard to maintain, and may eventually force simplifications in the treatment of systematics and limit the interpretation of the results. Typical cases are discussed with the most popular implementation tool, RooStats, in mind, together with possible ideas for improving the management of such cases in future software implementations.
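To make the bookkeeping concrete, the following minimal sketch (plain C++, not RooStats code) combines two counting channels that share a single nuisance parameter constrained by a unit Gaussian; real analyses multiply this structure by hundreds of parameters and regions, which is where the management issues discussed here arise. All names and quantities are illustrative.

```cpp
// Combined negative log-likelihood for several counting channels that share
// one constrained nuisance parameter. Illustrative structure only.
#include <cmath>

struct Channel {
  double n_obs;      // observed events
  double s_nominal;  // nominal signal prediction
  double b_nominal;  // nominal background prediction
  double b_slope;    // fractional background shift per unit of nuisance theta
};

// Poisson term, dropping the constant log(n!) piece.
double nllPoisson(double n, double mu) { return mu - n * std::log(mu); }

double combinedNLL(double mu_signal, double theta,
                   const Channel* channels, int nChannels) {
  double nll = 0.0;
  for (int i = 0; i < nChannels; ++i) {
    const Channel& c = channels[i];
    const double expected = mu_signal * c.s_nominal
                          + c.b_nominal * (1.0 + c.b_slope * theta);
    nll += nllPoisson(c.n_obs, expected);
  }
  nll += 0.5 * theta * theta;  // unit-Gaussian constraint (auxiliary measurement)
  return nll;
}
```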
\texttt{GooStats} is a software framework that provides a flexible environment and common tools for implementing multi-variate statistical analyses. The framework is built upon the \texttt{CERN ROOT}, \texttt{MINUIT} and \texttt{GooFit} packages. Running a multi-variate analysis in parallel on graphics processing units yields a large performance boost and opens new possibilities. The design and benchmarks of \texttt{GooStats} are presented in this article, along with illustrations of its application to statistical problems.
GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel, modular, and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available to the GERDA Collaboration and is used to provide the reference analysis of the experiment's data.
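The sketch below illustrates, under assumptions of our own, the kind of modular interface such a framework can expose: a module consumes per-channel waveforms and publishes condensed parameters to the event record. The class and method names are hypothetical and do not reflect the actual GELATIO interface.

```cpp
// Hypothetical module interface for a multi-channel, multi-level analysis:
// each module turns lower-level data into named condensed parameters.
#include <algorithm>
#include <map>
#include <string>
#include <vector>

using Waveform = std::vector<double>;

class AnalysisModule {
 public:
  virtual ~AnalysisModule() = default;
  // Process one event (one waveform per detector channel) and append
  // condensed parameters to the event record.
  virtual void Process(const std::vector<Waveform>& channels,
                       std::map<std::string, double>& parameters) = 0;
};

// Example module: baseline-subtracted pulse amplitude per channel.
class AmplitudeModule : public AnalysisModule {
 public:
  void Process(const std::vector<Waveform>& channels,
               std::map<std::string, double>& parameters) override {
    for (std::size_t ch = 0; ch < channels.size(); ++ch) {
      const Waveform& w = channels[ch];
      if (w.empty()) continue;
      // Baseline from the first samples of the trace.
      const std::size_t nBase = std::min<std::size_t>(100, w.size());
      double baseline = 0.0;
      for (std::size_t i = 0; i < nBase; ++i) baseline += w[i];
      baseline /= static_cast<double>(nBase);
      double maxVal = w[0];
      for (double v : w) maxVal = std::max(maxVal, v);
      parameters["amplitude_ch" + std::to_string(ch)] = maxVal - baseline;
    }
  }
};
```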
This article presents the motivation for developing a comprehensive modeling framework in which different models and parameter inputs can be compared and evaluated for a large range of jet-quenching observables measured in relativistic heavy-ion collisions at RHIC and the LHC. The concept of a framework is discussed within the context of recent efforts by the JET Collaboration, the authors of JEWEL, and the JETSCAPE Collaboration. The framework ingredients for each of these approaches are presented with a sample of important results from each. The role of advanced statistical tools in comparing models to data is also discussed, along with the need for a more detailed accounting of correlated errors in experimental results.
Evaluated nuclear data uncertainties are often perceived as unrealistic, most often because they are thought to be too small. The impact of this issue in applied nuclear science has been discussed widely in recent years. Commonly suggested causes are: poor estimates of specific error components, neglect of uncertainty correlations, and overlooked known error sources. However, instances have been reported where very careful, objective assessments of all known error sources have been made with realistic error magnitudes and correlations provided, yet the resulting evaluated uncertainties still appear to be inconsistent with observed scatter of predicted mean values. These discrepancies might be attributed to significant unrecognized sources of uncertainty (USU) that limit the accuracy to which these physical quantities can be determined. The objective of our work has been to develop procedures for revealing and including USU estimates in nuclear data evaluations involving experimental input data. We conclude that the presence of USU may be revealed, and estimates of magnitudes made, through quantitative analyses. This paper identifies several specific clues that can be explored by evaluators in identifying the existence of USU. It then describes numerical procedures to generate quantitative estimates of USU magnitudes. Key requirements for these procedures to be viable are that sufficient numbers of data points be available, for statistical reasons, and that additional supporting information about the measurements be provided by the experimenters. Realistic examples are described to illustrate these procedures and demonstrate their outcomes as well as limitations. Our work strongly supports the view that USU is an important issue in nuclear data evaluation, with significant consequences for applications, and that this topic warrants further investigation by the nuclear science community.
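As a simplified, uncorrelated-error illustration of how excess scatter can be turned into a USU estimate, the sketch below inflates each quoted uncertainty by a common term added in quadrature until the reduced chi-square of repeated measurements about their weighted mean reaches unity. This is only a schematic stand-in for the procedures described in the paper; the function names are ours.

```cpp
// Estimate an extra, unrecognized uncertainty s from the excess scatter of N
// measurements of the same quantity relative to their quoted uncertainties.
#include <cmath>
#include <vector>

double reducedChi2(const std::vector<double>& x, const std::vector<double>& sig,
                   double s_extra) {
  // Weighted mean with uncertainties inflated to sqrt(sig_i^2 + s_extra^2).
  double sumW = 0.0, sumWX = 0.0;
  std::vector<double> w(x.size());
  for (std::size_t i = 0; i < x.size(); ++i) {
    w[i] = 1.0 / (sig[i] * sig[i] + s_extra * s_extra);
    sumW += w[i];
    sumWX += w[i] * x[i];
  }
  const double mean = sumWX / sumW;
  double chi2 = 0.0;
  for (std::size_t i = 0; i < x.size(); ++i)
    chi2 += w[i] * (x[i] - mean) * (x[i] - mean);
  return chi2 / static_cast<double>(x.size() - 1);
}

// Bisect on s_extra until the reduced chi-square reaches 1 (if it starts above 1).
double estimateUSU(const std::vector<double>& x, const std::vector<double>& sig) {
  if (reducedChi2(x, sig, 0.0) <= 1.0) return 0.0;  // no excess scatter observed
  double lo = 0.0, hi = 1.0;
  while (reducedChi2(x, sig, hi) > 1.0) hi *= 2.0;  // bracket the root
  for (int it = 0; it < 60; ++it) {
    const double mid = 0.5 * (lo + hi);
    (reducedChi2(x, sig, mid) > 1.0 ? lo : hi) = mid;
  }
  return 0.5 * (lo + hi);
}
```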