
swordfish: Efficient Forecasting of New Physics Searches without Monte Carlo

Posted by: Thomas Edwards
Publication date: 2017
Research field: Physics
Paper language: English





We introduce swordfish, a Monte-Carlo-free Python package to predict expected exclusion limits, the discovery reach and expected confidence contours for a large class of experiments relevant for particle- and astrophysics. The tool is applicable to any counting experiment, supports general correlated background uncertainties, and gives exact results in both the signal- and systematics-limited regimes. Instead of time-intensive Monte Carlo simulations and likelihood maximization, it internally utilizes new approximation methods that are built on information geometry. Out of the box, swordfish provides straightforward methods for accurately deriving many of the common sensitivity measures. In addition, it allows one to examine experimental abilities in great detail by employing the notion of information flux. This new concept generalizes signal-to-noise ratios to situations where background uncertainties and component mixing cannot be neglected. The user interface of swordfish is designed with ease-of-use in mind, which we demonstrate by providing typical examples from indirect and direct dark matter searches as Jupyter notebooks.
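To illustrate the kind of calculation the abstract describes, the following is a minimal sketch of a Fisher-information (information-geometry) estimate of an expected exclusion limit for a Poisson counting experiment with correlated background uncertainties. It is not the swordfish API; the function and array names are illustrative, and the formula is the standard profiled-Fisher approximation rather than the package's exact implementation.

```python
import numpy as np

def expected_upper_limit(S, B, K, Z=1.645):
    """Approximate median expected upper limit on the signal normalization
    theta for a Poisson counting experiment with expected signal counts S,
    expected background counts B, and Gaussian background covariance K
    (systematics), using the Fisher information profiled over the
    background nuisance parameters. Illustrative sketch only."""
    S = np.asarray(S, dtype=float)
    B = np.asarray(B, dtype=float)
    # Effective data covariance: Poisson variance plus background systematics.
    D = np.diag(B) + np.asarray(K, dtype=float)
    # Profiled Fisher information on theta at the background-only point.
    info = S @ np.linalg.solve(D, S)
    # Gaussian-limit projected limit at significance Z (1.645 ~ 95% one-sided).
    return Z / np.sqrt(info)

# Toy example: 10 bins, a small flat signal over a large background
# with a 10% fully correlated background systematic.
S = np.full(10, 3.0)
B = np.full(10, 100.0)
K = np.outer(0.1 * B, 0.1 * B)
print(expected_upper_limit(S, B, K))
```

The correlated term in K is what degrades the limit relative to the naive signal-to-noise estimate; this is the regime where the information-flux concept mentioned above becomes relevant.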




Read also

Louis Lyons (2014)
Given the cost, both financial and, even more importantly, in terms of human effort, of building High Energy Physics accelerators and detectors and running them, it is important to use good statistical techniques in analysing data. Some of the statistical issues that arise in searches for New Physics are discussed briefly. They include topics such as: Should we insist on the 5 sigma criterion for discovery claims? The probability of A, given B, is not the same as the probability of B, given A. The meaning of p-values. What is Wilks' Theorem and when does it not apply? How should we deal with the 'Look Elsewhere Effect'? Dealing with systematics such as background parametrisation. Coverage: what is it and does my method have the correct coverage? The use of p0 versus p1 plots.
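As a concrete reminder of the scale of the 5 sigma discovery criterion mentioned in this abstract, the short snippet below (illustrative only, not taken from the paper) converts one-sided Gaussian significances to p-values and back with scipy.

```python
from scipy.stats import norm

# One-sided p-value for a Z-sigma excess, and the inverse mapping.
for Z in (3.0, 5.0):
    p = norm.sf(Z)  # survival function: P(X > Z) for X ~ N(0, 1)
    print(f"{Z} sigma -> p = {p:.2e}, back to Z = {norm.isf(p):.2f}")
```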
The REST-for-Physics (Rare Event Searches Toolkit for Physics) framework is a ROOT-based solution providing the means to process and analyze experimental or Monte Carlo event data. Special care has been taken over the traceability of the code and the validation of the results produced within the framework, together with the connectivity between the code and the stored data, registered through specific version metadata members. The framework development was originally motivated by the needs of rare event search experiments (experiments looking for phenomena with extremely low occurrence probability, such as dark matter or neutrino interactions or rare nuclear decays), and its components naturally implement tools to address the challenges of these kinds of experiments; the integration of a detector physics response, the implementation of signal processing routines, and topological algorithms for physical event identification are some examples. Despite this specialization, the framework was conceived with scalability in mind, and other event-oriented applications could benefit from the data processing routines and/or metadata description implemented in REST, as the generic framework tools are completely decoupled from dedicated libraries. REST-for-Physics is a consolidated piece of software already serving the needs of different physics experiments - using gaseous Time Projection Chambers (TPCs) as the detection technology - for background data analysis and detector characterization, as well as generic detector R&D. Even though REST has been exploited mainly with gaseous TPCs, the code could easily be applied or adapted to other detection technologies. We present in this work an overview of REST-for-Physics, providing a broad perspective on the infrastructure and organization of the project as a whole. The framework and its different components are described in the text.
Selection among alternative theoretical models given an observed data set is an important challenge in many areas of physics and astronomy. Reversible-jump Markov chain Monte Carlo (RJMCMC) is an extremely powerful technique for performing Bayesian model selection, but it suffers from a fundamental difficulty: it requires jumps between model parameter spaces, but cannot efficiently explore both parameter spaces at once. Thus, a naive jump between parameter spaces is unlikely to be accepted in the MCMC algorithm and convergence is correspondingly slow. Here we demonstrate an interpolation technique that uses samples from single-model MCMCs to propose inter-model jumps from an approximation to the single-model posterior of the target parameter space. The interpolation technique, based on a kD-tree data structure, is adaptive and efficient in modest dimensionality. We show that our technique leads to improved convergence over naive jumps in an RJMCMC, and compare it to other proposals in the literature to improve the convergence of RJMCMCs. We also demonstrate the use of the same interpolation technique as a way to construct efficient global proposal distributions for single-model MCMCs without prior knowledge of the structure of the posterior distribution, and discuss improvements that permit the method to be used in higher-dimensional spaces efficiently.
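The core ingredient described in this abstract is approximating a single-model posterior from its stored MCMC samples with a kD-tree, so that inter-model jumps can be proposed from it and their proposal density evaluated. The sketch below shows one simple way to do this, a k-nearest-neighbour density estimate built on scipy's cKDTree; it is an illustrative building block under assumed names, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def knn_log_density(samples, x, k=10):
    """k-nearest-neighbour estimate of the log posterior density at x,
    built from MCMC samples of a single model. Such an estimate can be
    used both to draw inter-model proposals (by resampling stored points)
    and to evaluate the proposal density entering the RJMCMC acceptance
    ratio."""
    samples = np.atleast_2d(samples)
    n, d = samples.shape
    tree = cKDTree(samples)
    # Distance from x to its k-th nearest stored sample.
    r_k = tree.query(np.atleast_2d(x), k=k)[0][:, -1]
    # Log volume of a d-dimensional ball of radius r_k.
    log_vol = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1)) + d * np.log(r_k)
    # Density estimate: k points inside a ball of volume V around x.
    return np.log(k) - np.log(n) - log_vol

# Toy check: samples from a 2D standard normal.
rng = np.random.default_rng(0)
samples = rng.normal(size=(5000, 2))
print(knn_log_density(samples, [0.0, 0.0]))  # close to log(1/(2*pi)) ~ -1.84
```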
Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the ones best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surpassed regarding the compatibility with experiment of K and L shell photoionization cross sections either, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.
M. Augelli (2010)
The role of data libraries as a collaborative tool across Monte Carlo codes is discussed. Some new contributions in this domain are presented; they concern a data library of proton and alpha ionization cross sections, the ongoing development of a data library of electron ionization cross sections, and proposed improvements to the EADL (Evaluated Atomic Data Library), the latter resulting from an extensive data validation process.