Bayesian modeling techniques enable sensitivity analyses that incorporate detailed expectations regarding future experiments. A model-based approach also allows one to evaluate inferences and predicted outcomes by calibrating (or measuring) the consequences incurred when certain results are reported. We present procedures for calibrating predictions of an experiment's sensitivity to both continuous and discrete parameters. Using these procedures and a new Bayesian model of the $\beta$-decay spectrum, we assess a high-precision $\beta$-decay experiment's sensitivity to the neutrino mass scale and ordering, for one assumed design scenario. We find that such an experiment could measure the electron-weighted neutrino mass within $\sim 40\,$meV after 1 year (90% credibility). Neutrino masses $>500\,$meV could be measured within $\approx 5\,$meV. Using only $\beta$-decay and external reactor neutrino data, we find that next-generation $\beta$-decay experiments could potentially constrain the mass ordering using a two-neutrino spectral model analysis. By calibrating mass-ordering results, we identify reporting criteria that can be tuned to suppress false ordering claims. In some cases, a two-neutrino analysis can reveal that the mass ordering is inverted, a result unobtainable with the traditional one-neutrino analysis approach.
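As a purely illustrative sketch of the calibration idea (not the analysis pipeline used here), one can simulate repeated pseudo-experiments under an assumed true ordering and measure how often a given reporting threshold yields a false ordering claim. In the minimal Python toy below, the Gaussian summary statistic, the ordering splitting delta, the resolution sigma, and the odds thresholds are placeholder assumptions, not values from the study.

import numpy as np

rng = np.random.default_rng(0)

def false_claim_rate(odds_cut, sigma=0.01, delta=0.02, n_trials=100_000):
    # Toy summary statistic x: true value 0 under the normal ordering,
    # delta under the inverted ordering, measured with resolution sigma.
    x = rng.normal(loc=0.0, scale=sigma, size=n_trials)  # truth: normal ordering
    # Gaussian log-likelihood ratio (inverted vs. normal ordering).
    log_odds = (x - delta / 2.0) * delta / sigma**2
    # Fraction of pseudo-experiments that would wrongly be reported as inverted.
    return np.mean(log_odds > np.log(odds_cut))

for cut in (1.0, 3.0, 10.0, 100.0):
    print(f"odds cut {cut:>5}: false inverted-ordering rate = {false_claim_rate(cut):.4f}")

Tightening the odds cut suppresses false inverted-ordering claims at the cost of reporting a conclusive result less often, which is the trade-off such a calibration makes explicit.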
The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they have many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the object-oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both GERDA and Majorana.
We perform a statistical analysis with the prospective results of future experiments on neutrinoless double-beta decay, direct searches for neutrino mass (KATRIN) and cosmological observations. Realistic errors are used, and the nuclear matrix element uncertainty for neutrinoless double-beta decay is also taken into account. Three benchmark scenarios are introduced, corresponding to quasi-degenerate neutrinos, inverted-hierarchical neutrinos, and an intermediate case. We investigate to what extent these scenarios can be reconstructed. Furthermore, we check the compatibility of the scenarios with the claimed evidence of neutrinoless double-beta decay.
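For reference, the half-life probed by neutrinoless double-beta decay relates, at leading order and in generic notation (not necessarily that of this analysis), to the effective Majorana mass as
$$
\bigl(T_{1/2}^{0\nu}\bigr)^{-1} = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,\frac{m_{\beta\beta}^{2}}{m_e^{2}},
\qquad
m_{\beta\beta} = \Bigl|\sum_{i} U_{ei}^{2}\, m_i\Bigr|,
$$
where $G^{0\nu}$ is the phase-space factor and $M^{0\nu}$ the nuclear matrix element whose uncertainty is propagated above; the direct-search and cosmological probes constrain $m_\beta$ and $\sum_i m_i$, respectively.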
We quantify the extent to which future experiments will test the existence of neutrinoless double-beta decay mediated by light neutrinos with inverted-ordered masses. While it remains difficult to compare measurements performed with different isotopes, we find that future searches will fully test the inverted ordering scenario, as a global, multi-isotope endeavor. They will also test other possible mechanisms driving the decay, including a large uncharted region of the allowed parameter space assuming that neutrino masses follow the normal ordering.
Past and current direct neutrino mass experiments set limits on the so-called effective neutrino mass, which is an incoherent sum of neutrino masses and lepton mixing matrix elements. An electron energy spectrum that neglects relativistic and nuclear recoil effects is often assumed. Alternative definitions of effective masses exist, and an exact relativistic spectrum is calculable. We quantitatively compare the validity of those different approximations as a function of energy resolution and exposure for tritium beta decay in the KATRIN, Project 8 and PTOLEMY experiments. Furthermore, adopting the Bayesian approach, we present the posterior distributions of the effective neutrino mass by including current experimental information from neutrino oscillations, beta decay, neutrinoless double-beta decay and cosmological observations. Both linear and logarithmic priors for the smallest neutrino mass are assumed.
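For concreteness, the incoherent sum referred to here is, in the standard convention (generic notation),
$$
m_\beta \equiv \Bigl(\sum_{i=1}^{3} |U_{ei}|^{2}\, m_i^{2}\Bigr)^{1/2},
$$
which arises at leading order when the spectrum is expanded near the endpoint; the alternative definitions mentioned above differ in how the sum over mass eigenstates enters the spectrum.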
The objective of the Karlsruhe Tritium Neutrino (KATRIN) experiment is to determine the effective electron neutrino mass $m(\nu_\text{e})$ with an unprecedented sensitivity of $0.2\,\text{eV}$ (90% C.L.) by precision electron spectroscopy close to the endpoint of the $\beta$ decay of tritium. We present a consistent theoretical description of the $\beta$ electron energy spectrum in the endpoint region, an accurate model of the apparatus response function, and the statistical approaches suited to interpret and analyze tritium $\beta$ decay data observed with KATRIN with the envisaged precision. In addition to providing detailed analytical expressions, with the necessary derivations, for all formulae used in the presented model framework, we discuss and quantify the impact of theoretical and experimental corrections on the measured $m(\nu_\text{e})$. Finally, we outline the statistical methods for parameter inference and the construction of confidence intervals that are appropriate for a neutrino mass measurement with KATRIN. In this context, we briefly discuss the choice of the $\beta$ energy analysis interval and the distribution of measuring time within that range.
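For orientation, the schematic leading-order form of the spectrum near the endpoint, before the relativistic, nuclear-recoil, and apparatus-response corrections treated here and in generic notation, reads
$$
\frac{d\Gamma}{dE} \;\propto\; F(Z,E)\, p_e\,(E+m_e)\,(E_0-E)\,
\sqrt{(E_0-E)^{2}-m^{2}(\nu_\text{e})}\;\Theta\bigl(E_0-E-m(\nu_\text{e})\bigr),
$$
with $E$ the electron kinetic energy, $p_e$ its momentum, $F(Z,E)$ the Fermi function, and $E_0$ the endpoint energy; the neutrino mass enters only through the phase-space factor, which is why the choice of analysis interval close to the endpoint drives the sensitivity.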