
Modelling theoretical uncertainties in phenomenological analyses for particle physics

Posted by: S. Descotes-Genon
Publication date: 2016
Paper language: English





The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding $p$-values. In the case of the nuisance approach, where theoretical uncertainties are modelled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of an adaptive $p$-value, obtained by adjusting the range of variation for the bias according to the significance considered; this allows us to tackle metrology and exclusion tests with a single, well-defined unified tool that exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavour physics.
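As a rough illustration of the nuisance approach described above, the sketch below computes a conservative $p$-value for a single Gaussian measurement by taking the supremum over a fixed theory-bias range. The function name, the Gaussian assumption, and the scanning strategy are illustrative choices, not the paper's exact construction.

```python
import numpy as np
from scipy.stats import norm

def nuisance_p_value(x_obs, mu, sigma_stat, delta_max, n_scan=201):
    """Conservative p-value for H0: true value = mu, when the prediction
    carries a theory bias delta confined to [-delta_max, delta_max].
    For each fixed bias the measurement is taken as Gaussian; the
    supremum over the bias range gives the quoted p-value."""
    deltas = np.linspace(-delta_max, delta_max, n_scan)
    z = np.abs(x_obs - mu - deltas) / sigma_stat   # pull for each fixed bias
    p = 2.0 * norm.sf(z)                           # two-sided Gaussian p-value
    return p.max()

# e.g. nuisance_p_value(x_obs=1.2, mu=0.0, sigma_stat=0.5, delta_max=0.3)
```

The adaptive variant discussed in the abstract would additionally let delta_max depend on the significance under consideration, rather than fixing it in advance.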


Read also

Our predictions for particle physics processes are realized in a chain of complex simulators. They allow us to generate high-fidelity simulated data, but they are not well-suited for inference on the theory parameters with observed data. We explain why the likelihood function of high-dimensional LHC data cannot be explicitly evaluated, why this matters for data analysis, and reframe what the field has traditionally done to circumvent this problem. We then review new simulation-based inference methods that let us directly analyze high-dimensional data by combining machine learning techniques and information from the simulator. Initial studies indicate that these techniques have the potential to substantially improve the precision of LHC measurements. Finally, we discuss probabilistic programming, an emerging paradigm that lets us extend inference to the latent process of the simulator.
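One core idea behind several such simulation-based inference methods is the "likelihood-ratio trick": a classifier trained to separate samples generated at two parameter points implicitly estimates their likelihood ratio. The toy below, with illustrative Gaussian stand-ins for the simulator, is a minimal sketch of that idea, not any specific method from the review.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Stand-ins for the simulator run at two parameter points theta0, theta1
x0 = rng.normal(0.0, 1.0, size=(5000, 1))   # samples ~ p(x | theta0)
x1 = rng.normal(0.5, 1.2, size=(5000, 1))   # samples ~ p(x | theta1)

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])

# Classifier trained to separate the two simulated samples
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500).fit(X, y)

def likelihood_ratio(x):
    """An optimal classifier output s(x) approximates p1/(p0 + p1),
    so the ratio r(x) = p1/p0 is recovered as s/(1 - s)."""
    s = clf.predict_proba(np.atleast_2d(x))[:, 1]
    s = np.clip(s, 1e-6, 1 - 1e-6)   # guard against division by zero
    return s / (1.0 - s)
```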
E. Lisi (2015)
Theoretical estimates for the half life of neutrinoless double beta decay in candidate nuclei are affected by both particle and nuclear physics uncertainties, which may complicate the interpretation of decay signals or limits. We study such uncertainties and their degeneracies in the following context: three nuclei of great interest for large-scale experiments (76-Ge, 130-Te, 136-Xe), two representative particle physics mechanisms (light and heavy Majorana neutrino exchange), and a large set of nuclear matrix elements (NME), computed within the quasiparticle random phase approximation (QRPA). It turns out that the main theoretical uncertainties, associated with the effective axial coupling g_A and with the nucleon-nucleon potential, can be parametrized in terms of NME rescaling factors, up to small residuals. From this parametrization, the following QRPA features emerge: (1) the NME dependence on g_A is milder than quadratic; (2) in each of the two mechanisms, the relevant lepton number violating parameter is largely degenerate with the NME rescaling factors; and (3) the light and heavy neutrino exchange mechanisms are basically degenerate in the above three nuclei. We comment on the challenging theoretical and experimental improvements required to reduce such particle and nuclear physics uncertainties and their degeneracies.
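The degeneracy between the lepton-number-violating parameter and the NME rescaling factors can be seen in the standard factorisation of the half life for light Majorana exchange (standard notation, assumed here rather than quoted from the paper):

```latex
\left[T_{1/2}^{0\nu}\right]^{-1}
  = G_{0\nu}\,\bigl|M_{0\nu}\bigr|^{2}
    \left(\frac{m_{\beta\beta}}{m_e}\right)^{2}
% Rescaling the NME, M_{0\nu} \to k\,M_{0\nu}, leaves the half life
% unchanged if m_{\beta\beta} \to m_{\beta\beta}/k: the LNV parameter
% and the NME rescaling factor are degenerate.
```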
A new paradigm for data-driven, model-agnostic new physics searches at colliders is emerging, and aims to leverage recent breakthroughs in anomaly detection and machine learning. In order to develop and benchmark new anomaly detection methods within this framework, it is essential to have standard datasets. To this end, we have created the LHC Olympics 2020, a community challenge accompanied by a set of simulated collider events. Participants in these Olympics have developed their methods using an R&D dataset and then tested them on black boxes: datasets with an unknown anomaly (or not). This paper will review the LHC Olympics 2020 challenge, including an overview of the competition, a description of methods deployed in the competition, lessons learned from the experience, and implications for data analyses with future datasets as well as future colliders.
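For a concrete, if much simpler, flavour of the task such a challenge poses, the sketch below trains a generic off-the-shelf anomaly detector on background-like events and scores a test set containing a small injected anomaly. The dataset shapes, the injected signal, and the choice of IsolationForest are all illustrative and unrelated to the methods actually deployed in the LHC Olympics.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Illustrative "background" events with 4 features each
background = rng.normal(0.0, 1.0, size=(10000, 4))
# "Black-box" test set: mostly background, plus a small injected anomaly
test_set = np.vstack([
    rng.normal(0.0, 1.0, size=(1000, 4)),
    rng.normal(3.0, 0.5, size=(20, 4)),
])

model = IsolationForest(random_state=0).fit(background)
scores = -model.score_samples(test_set)              # higher = more anomalous
top_candidates = test_set[np.argsort(scores)[-20:]]  # flag the most anomalous events
```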
Matteo Cacciari (2015)
We review the history of jets in high energy physics, and describe in more detail the developments of the past ten years, discussing new algorithms for jet finding and their main characteristics, and summarising the status of perturbative calculations for jet cross sections in hadroproduction. We also describe the emergence of jet grooming and tagging techniques and their application to boosted-jet analyses.
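As one example of the jet-finding algorithms alluded to above, here is a naive O(N^3) sketch of the generalized-kt sequential-recombination family (p = -1 reproduces anti-kt). The simplified pt-weighted recombination is an illustrative shortcut; real implementations add four-momenta and use far faster geometry (e.g. FastJet).

```python
import numpy as np

def generalized_kt(particles, R=0.4, p=-1):
    """Naive O(N^3) sequential-recombination clustering.
    p = -1: anti-kt, p = 0: Cambridge/Aachen, p = 1: kt.
    particles: iterable of (pt, rapidity, phi) tuples."""
    parts = [tuple(x) for x in particles]
    jets = []
    while parts:
        n = len(parts)
        # Beam distances d_iB = pt^(2p); start with the smallest one
        diB = [pt ** (2 * p) for pt, _, _ in parts]
        i_min = min(range(n), key=lambda i: diB[i])
        dmin, best = diB[i_min], ("beam", i_min)
        for i in range(n):
            for j in range(i + 1, n):
                pti, yi, phii = parts[i]
                ptj, yj, phij = parts[j]
                dphi = abs(phii - phij)
                dphi = min(dphi, 2 * np.pi - dphi)
                dR2 = (yi - yj) ** 2 + dphi ** 2
                dij = min(pti ** (2 * p), ptj ** (2 * p)) * dR2 / R ** 2
                if dij < dmin:
                    dmin, best = dij, ("pair", i, j)
        if best[0] == "beam":
            jets.append(parts.pop(best[1]))   # promote to final jet
        else:
            _, i, j = best
            pti, yi, phii = parts[i]
            ptj, yj, phij = parts[j]
            # Simplified pt-weighted recombination (ignores phi wrap-around;
            # real implementations sum four-momenta, the E-scheme)
            pt = pti + ptj
            y = (pti * yi + ptj * yj) / pt
            phi = (pti * phii + ptj * phij) / pt
            for k in sorted((i, j), reverse=True):
                parts.pop(k)
            parts.append((pt, y, phi))
    return jets
```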
The problem of estimating the effect of missing higher orders in perturbation theory is analyzed with emphasis on the application to Higgs production in gluon-gluon fusion. Well-known mathematical methods for an approximated completion of the perturbative series are applied with the goal of not truncating the series, but completing it in a well-defined way, so as to increase the accuracy, if not the precision, of theoretical predictions. The uncertainty arising from the use of the completion procedure is discussed and a recipe for constructing a corresponding probability distribution function is proposed.
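To make the idea of "completing" a truncated series concrete, here is a toy sketch that extends the known coefficients under a naive geometric-growth assumption and uses the first estimated missing term as a crude uncertainty. It illustrates the general idea only, not the specific completion procedure or probability distribution proposed in the paper.

```python
import numpy as np

def geometric_completion(coeffs, alpha_s, n_extra=20):
    """Toy 'completion' of a truncated series sum_k c_k * alpha_s^k:
    assume the unknown coefficients keep growing geometrically with the
    last observed ratio r = c_N / c_{N-1} (only sensible if r*alpha_s < 1).
    Returns the truncated sum, the completed sum, and a crude uncertainty
    taken as the size of the first estimated missing term."""
    c = list(coeffs)
    r = c[-1] / c[-2]
    truncated = sum(ck * alpha_s ** k for k, ck in enumerate(c))
    for _ in range(n_extra):
        c.append(c[-1] * r)
    completed = sum(ck * alpha_s ** k for k, ck in enumerate(c))
    uncertainty = abs(c[len(coeffs)] * alpha_s ** len(coeffs))
    return truncated, completed, uncertainty

# e.g. geometric_completion([1.0, 11.0, 120.0], alpha_s=0.03)
```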