
A Statistical Analysis of the COHERENT Data and Applications to New Physics

Added by: Julia Gehrlein
Publication date: 2020
Language: English





The observation of coherent elastic neutrino-nucleus scattering (CE$\nu$NS) by the COHERENT collaboration in 2017 has opened a new window to both test Standard Model predictions at relatively low energies and probe new physics scenarios. Our investigations show, however, that a careful treatment of the statistical methods used to analyze the data is essential to derive correct constraints and bounds on new physics parameters. In this manuscript we perform a detailed analysis of the publicly available COHERENT CsI data, making use of all available background data. We point out that Wilks' theorem is not fulfilled in general and that a calculation of the confidence regions via Monte Carlo simulations following a Feldman-Cousins procedure is necessary. As an example of the necessity of this approach for testing new physics scenarios, we quantify the allowed ranges for several scenarios with neutrino non-standard interactions. Furthermore, we provide accompanying code to enable an easy implementation of other new physics scenarios, as well as data files of our results.
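To make the statistical procedure concrete, the sketch below illustrates a Feldman-Cousins-style construction on a toy counting experiment: for each value of a new-physics parameter, pseudo-experiments build the Monte Carlo distribution of $\Delta\chi^2$, and its quantile replaces the critical value Wilks' theorem would give. The event model, binning, and all numbers are assumptions for illustration only; this is not the paper's accompanying code.

```python
# Minimal sketch of a Feldman-Cousins style Monte Carlo construction on a toy
# counting experiment. The event model, binning, and numbers are assumptions
# for illustration; this is NOT the COHERENT likelihood or the paper's code.
import numpy as np

rng = np.random.default_rng(42)

BINS = 12
BACKGROUND = np.full(BINS, 5.0)            # assumed flat background per bin
SM_SIGNAL = np.linspace(8.0, 1.0, BINS)    # assumed falling CEvNS-like spectrum
EPS_GRID = np.linspace(-2.0, 2.0, 201)     # scan range of the toy NSI-like parameter


def expected(eps):
    """Toy prediction: an NSI-like parameter rescales the signal quadratically."""
    return BACKGROUND + (1.0 + eps) ** 2 * SM_SIGNAL


def chi2(data, eps):
    """Poisson chi-square (twice the log-likelihood ratio to the saturated model)."""
    mu = expected(eps)
    return 2.0 * np.sum(mu - data + data * np.log(np.where(data > 0, data / mu, 1.0)))


def delta_chi2(data, eps):
    """Test statistic: chi2 at eps minus chi2 at the best-fit point on the grid."""
    return chi2(data, eps) - min(chi2(data, e) for e in EPS_GRID)


def fc_critical_value(eps_true, n_toys=500, cl=0.90):
    """Monte Carlo critical value of delta_chi2 under the hypothesis eps_true."""
    stats = [delta_chi2(rng.poisson(expected(eps_true)), eps_true) for _ in range(n_toys)]
    return np.quantile(stats, cl)


if __name__ == "__main__":
    observed = rng.poisson(expected(0.0))   # pseudo-data standing in for real counts
    for eps in np.linspace(-1.5, 0.5, 9):
        t_obs = delta_chi2(observed, eps)
        t_crit = fc_critical_value(eps)     # Wilks' theorem would give 2.71 here instead
        status = "allowed" if t_obs < t_crit else "excluded"
        print(f"eps = {eps:+.2f}  dchi2 = {t_obs:6.2f}  MC 90% crit = {t_crit:5.2f}  -> {status}")
```

The point of the sketch is the comparison in the last loop: when the Monte Carlo critical value differs appreciably from the asymptotic one, the confidence region obtained from Wilks' theorem is unreliable.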



Related research

We investigate the semi-leptonic decays $\bar B \to D^{(*)} \ell\bar\nu$ in terms of the Heavy-Quark-Effective-Theory (HQET) parameterization of the form factors, described with the heavy quark expansion up to $\mathcal{O}(1/m_c^2)$, beyond the simple approximation considered in the original CLN parameterization. An analysis with this setup was first given in the literature; here we extend it to comprehensive analyses including (i) a simultaneous fit of $|V_{cb}|$ and the HQET parameters to the available experimental full distribution data and theory constraints, and (ii) New Physics (NP) contributions of the $V_2$ and $T$ types to the decay distributions and rates. For this purpose, we perform Bayesian fit analyses using Stan, a state-of-the-art public platform for statistical computation. We show that our $|V_{cb}|$ fit results for the SM scenarios are close to the PDG combined average from the exclusive mode and indicate the significance of the angular distribution data. In turn, for the $\text{SM}+\text{NP}$ scenarios, our fits find that a non-zero NP contribution is favored at the best-fit point for both $\text{SM}+V_2$ and $\text{SM}+T$, depending on the HQET parameterization model. A key feature is then realized in the $\bar B \to D^{(*)} \tau\bar\nu$ observables. Our fit of the HQET parameters in the $\text{SM}\,(+T)$ produces a value of $R_D$ consistent with, and a value of $R_{D^*}$ smaller than, the previous SM prediction in the HFLAV report. On the other hand, $\text{SM}+V_2$ points to smaller and larger values of $R_D$ and $R_{D^*}$, respectively, than the SM predictions. In particular, the deviation of $R_{D^*}$ from the experimental measurement becomes smaller, which could be interesting for future improvements of the measurements at the Belle II experiment.
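As a rough illustration of such a Stan-driven Bayesian fit, the sketch below fits a toy normalization and slope to pseudo-data via cmdstanpy. The linearized rate model, priors, parameter names, and numbers are invented for illustration and are not the HQET parameterization or data set used in the paper.

```python
# Toy sketch of a Bayesian fit with Stan driven from Python (requires CmdStan
# to be installed for cmdstanpy). Model, priors, and pseudo-data are invented.
import numpy as np
from cmdstanpy import CmdStanModel

STAN_MODEL = """
data {
  int<lower=1> N;
  vector[N] w;         // recoil variable
  vector[N] rate_obs;  // measured differential rate (arbitrary units)
  vector[N] sigma;     // Gaussian uncertainties
}
parameters {
  real<lower=0> Vcb;   // toy stand-in for |V_cb| times a known normalization
  real rho2;           // toy slope, a stand-in for an HQET parameter
}
model {
  vector[N] pred = square(Vcb) * (1 - rho2 * (w - 1));
  Vcb ~ normal(0.04, 0.02);        // weakly informative priors
  rho2 ~ normal(1.2, 0.5);
  rate_obs ~ normal(pred, sigma);  // Gaussian likelihood bin by bin
}
"""

def main():
    rng = np.random.default_rng(0)
    w = np.linspace(1.0, 1.5, 10)
    truth = 0.041 ** 2 * (1 - 1.1 * (w - 1))
    sigma = 0.05 * truth
    data = {
        "N": len(w),
        "w": w.tolist(),
        "rate_obs": (truth + rng.normal(0, sigma)).tolist(),
        "sigma": sigma.tolist(),
    }
    with open("toy_hqet.stan", "w") as f:
        f.write(STAN_MODEL)
    model = CmdStanModel(stan_file="toy_hqet.stan")
    fit = model.sample(data=data, chains=4, iter_sampling=1000, seed=1)
    print(fit.summary().loc[["Vcb", "rho2"]])  # posterior means, sd, R-hat

if __name__ == "__main__":
    main()
```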
We provide a comprehensive and pedagogical introduction to the MadAnalysis 5 framework, with a particular focus on its usage for reinterpretation studies. To this end, we first review the main features of the normal mode of the program and how a detector simulation can be handled. We then detail, step-by-step, how to implement and validate an existing LHC analysis in the MadAnalysis 5 framework and how to use this reimplementation, possibly together with other recast codes available from the MadAnalysis 5 Public Analysis Database, for reinterpreting ATLAS and CMS searches in the context of a new model. Each of these points is illustrated by concrete examples. Moreover, complete reference cards for the normal and expert modes of MadAnalysis 5 are provided in two technical appendices.
Q. H. Liu (2021)
For a discrete function $f(x)$ on a discrete set, the finite difference can be either forward or backward. However, we observe that if $f(x)$ is a sum of two functions defined on the discrete set, $f(x) = f_1(x) + f_2(x)$, the first-order difference $\Delta f(x)$ is equivocal, for we may have $\Delta^{f} f_1(x) + \Delta^{b} f_2(x)$, where $\Delta^{f}$ and $\Delta^{b}$ denote the forward and backward difference, respectively. Thus, the first-order variation equation for this function $f(x)$ gives many solutions, which include both true and false ones. A proper formalism of the discrete calculus of variations is proposed to single out the true one by examination of the second-order variations, and it is capable of yielding the exact form of the distributions for Boltzmann, Bose, and Fermi systems without requiring the number of particles to be infinitely large. The advantage and peculiarity of our formalism are explicitly illustrated by the derivation of the Bose distribution.
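To spell out the ambiguity (in our own notation for a unit-spaced set; this worked example is ours, not taken from the paper): the forward and backward differences are $\Delta^{f} f(x) = f(x+1) - f(x)$ and $\Delta^{b} f(x) = f(x) - f(x-1)$, so for $f = f_1 + f_2$ the mixed combination $\Delta^{f} f_1(x) + \Delta^{b} f_2(x) = f_1(x+1) - f_1(x) + f_2(x) - f_2(x-1)$ in general coincides with neither $\Delta^{f} f(x)$ nor $\Delta^{b} f(x)$; examining the second-order variations is what singles out the true solution.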
We revisit the status of the new-physics interpretations of the anomalies in semileptonic $B$ decays in light of the new data reported by Belle on the lepton-universality ratios $R_{D^{(*)}}$ using the semileptonic tag and on the longitudinal polarization of the $D^*$ in $B\to D^*\tau\nu$, $F_L^{D^*}$. The preferred solutions involve new left-handed currents or tensor contributions. Interpretations with pure right-handed currents are disfavored by the LHC data, while pure scalar models are disfavored by the upper limits derived either from the LHC or from the $B_c$ lifetime. The observable $F_L^{D^*}$ also gives an important constraint, leading to the exclusion of large regions of parameter space. Finally, we investigate the sensitivity of different observables to the various scenarios and conclude that a measurement of the tau polarization in the decay mode $B\to D\tau\nu$ would effectively discriminate among them.
In this paper we describe a novel, model-independent technique of rectangular aggregations for mining the LHC data for hints of new physics. A typical (CMS) search now has hundreds of signal regions, which can obscure potentially interesting anomalies. Applying our technique to the two CMS jets+MET SUSY searches, we identify a set of previously overlooked $\sim 3\sigma$ excesses. Among these, four excesses survive tests of inter- and intra-search compatibility, and two are especially interesting: they are largely overlapping between the jets+MET searches and are characterized by low jet multiplicity, zero $b$-jets, and low MET and $H_T$. We find that resonant color-triplet production decaying to a quark plus an invisible particle provides an excellent fit to these two excesses and all other data, including the ATLAS jets+MET search, which actually sees a correlated excess. We discuss the additional constraints coming from dijet resonance searches, monojet searches, and pair production. Based on these results, we believe the widespread view that the LHC data contain no interesting excesses is greatly exaggerated.
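As a rough illustration of the aggregation idea (not the authors' code: the grid of signal regions, the counts, and the simple significance formula below are invented), one can scan all rectangular unions of adjacent signal regions and rank the aggregated excesses:

```python
# Toy sketch of "rectangular aggregation": scan every axis-aligned rectangle in
# a 2D grid of signal regions (e.g. binned in jet multiplicity x H_T), sum the
# observed and expected counts, and rank the aggregated excesses by a crude
# Gaussian z-score. Grid, counts, and significance formula are invented.
import itertools
import math

import numpy as np

rng = np.random.default_rng(7)

N_JET_BINS, N_HT_BINS = 4, 6
expected = rng.uniform(5.0, 40.0, size=(N_JET_BINS, N_HT_BINS))
observed = rng.poisson(expected).astype(float)
observed[0, 1:3] += 12.0  # planted excess at low jet multiplicity and low H_T


def z_score(obs, exp):
    """Crude significance (obs - exp) / sqrt(exp); a real analysis would use a
    profile likelihood with correlated background uncertainties."""
    return (obs - exp) / math.sqrt(exp)


def rectangular_aggregations(obs, exp):
    """Yield (z, rectangle) for every axis-aligned rectangle of signal regions."""
    n_rows, n_cols = obs.shape
    for r0, r1 in itertools.combinations(range(n_rows + 1), 2):
        for c0, c1 in itertools.combinations(range(n_cols + 1), 2):
            o = obs[r0:r1, c0:c1].sum()
            e = exp[r0:r1, c0:c1].sum()
            yield z_score(o, e), (r0, r1, c0, c1)


if __name__ == "__main__":
    ranked = sorted(rectangular_aggregations(observed, expected), reverse=True)
    for z, (r0, r1, c0, c1) in ranked[:5]:
        print(f"jet bins [{r0},{r1})  H_T bins [{c0},{c1})  z = {z:.2f}")
    # Note: these local z-scores ignore the look-elsewhere effect, which the
    # inter- and intra-search compatibility tests mentioned above address.
```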