We present a novel technique, called DSVP (Discrimination through Singular Vectors Projections), to discriminate spurious events within a dataset. The purpose of this paper is to lay out a general procedure that can be tailored to a broad variety of applications. After describing the general concept, we apply the algorithm to the problem of identifying nearly coincident events in low-temperature microcalorimeters, in order to push the time resolution close to its intrinsic limit. On simulated datasets, it was possible to achieve an effective time resolution even shorter than the sampling time of the system considered. The results are contextualized in the framework of the HOLMES experiment, which aims at directly measuring the neutrino mass with the calorimetric approach, and would allow a significant improvement of its statistical sensitivity.
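The core idea of projecting records onto leading singular vectors can be sketched as follows. This is a minimal illustration, not the DSVP algorithm itself: the pulse shapes, noise levels, and the threshold-free comparison are all assumptions made for the toy example.

```python
import numpy as np

def svd_basis(training, k=3):
    """Leading k right singular vectors of the (records x samples) matrix."""
    _, _, vt = np.linalg.svd(training, full_matrices=False)
    return vt[:k]

def residual(record, basis):
    """Norm of the record component outside the singular-vector subspace;
    large values indicate a spurious (e.g. piled-up) record."""
    recon = basis.T @ (basis @ record)
    return np.linalg.norm(record - recon)

# toy data: exponential pulses with varying amplitude plus noise
rng = np.random.default_rng(0)
t = np.arange(500.0)
pulse = np.exp(-t / 100.0)
train = np.array([a * pulse + rng.normal(0, 0.01, t.size)
                  for a in rng.uniform(0.5, 2.0, 200)])
basis = svd_basis(train)

clean = 1.2 * pulse + rng.normal(0, 0.01, t.size)
shifted = np.where(t >= 60, np.exp(-(t - 60) / 100.0), 0.0)
piled = clean + 0.8 * shifted   # second pulse arriving 60 samples later
```

A clean record lies almost entirely inside the training subspace, so its residual stays at the noise level, while the piled-up record leaves a large out-of-subspace component.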
The OLYMPUS experiment measured the cross-section ratio of positron-proton elastic scattering relative to electron-proton elastic scattering to look for evidence of hard two-photon exchange. To make this measurement, the experiment alternated between electron beam and positron beam running modes, with the relative integrated luminosities of the two running modes providing the crucial normalization. For this reason, OLYMPUS had several redundant luminosity monitoring systems, including a pair of electromagnetic calorimeters positioned downstream from the target to detect symmetric Møller and Bhabha scattering from atomic electrons in the hydrogen gas target. Though this system was designed to monitor the rate of events with single Møller/Bhabha interactions, we found that a more accurate determination of relative luminosity could be made by additionally considering the rate of events with both a Møller/Bhabha interaction and a concurrent elastic $ep$ interaction. This method was improved by small corrections for the variance of the current within bunches in the storage ring and for the probability of three interactions occurring within a bunch. After accounting for systematic effects, we estimate that the method is accurate in determining the relative luminosity to within 0.36%. This precise technique can be employed in future electron-proton and positron-proton scattering experiments to monitor relative luminosity between different running modes.
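Why the bunch-current variance correction matters can be seen with a toy Monte Carlo: when per-bunch yields scale with the bunch current $I$, the same-bunch coincidence probability tracks $\langle I^2\rangle$ rather than $\langle I\rangle^2$. All distributions and yields below are invented for illustration and are not OLYMPUS parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bunch = 2_000_000
# assumed bunch-current distribution, normalized to mean 1 (gamma spread)
I = rng.gamma(shape=4.0, scale=0.25, size=n_bunch)
mu_mb, mu_ep = 0.05, 0.02       # assumed mean yields per unit current

n_mb = rng.poisson(mu_mb * I)   # Moller/Bhabha counts per bunch
n_ep = rng.poisson(mu_ep * I)   # elastic ep counts per bunch

coinc = np.mean((n_mb >= 1) & (n_ep >= 1))   # observed coincidence prob.
naive = mu_mb * mu_ep                        # ignores the current spread
corrected = mu_mb * mu_ep * np.mean(I**2)    # first order in the yields
```

With a 25% current variance the naive product underestimates the coincidence probability by the full $\mathrm{Var}(I)/\langle I\rangle^2$ factor, which is exactly the kind of small correction the abstract describes.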
The paper presents novel instrumentation for rare-event selection, tested in our research on the production and detection of short-lived superheavy elements. The instrumentation includes a multi-element active-catcher system and dedicated electronics. The active catcher, located in the forward hemisphere, is composed of 63 scintillator detection modules. Reaction products of damped collisions between heavy-ion projectiles and heavy target nuclei are implanted in the fast plastic scintillators of the active-catcher modules. The acquisition-system trigger, delivered by the logical branch of the electronics, allows recording of the reaction products that decay via alpha-particle emission or spontaneous fission between beam bursts. A one-microsecond waveform signal from the FADCs contains information on the implanted heavy nucleus as well as on its decays.
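The between-burst selection amounts to a periodic time gate on event timestamps. The sketch below illustrates the logic only; the beam period and burst length are hypothetical numbers, not the values of the actual accelerator.

```python
def in_beam_off_window(t_us, period_us=5000.0, burst_us=500.0):
    """True if timestamp t_us (microseconds) falls between beam bursts,
    assuming a periodic beam with the given (hypothetical) structure."""
    return (t_us % period_us) > burst_us

# keep only decay candidates recorded while the beam is off
events = [120.0, 700.0, 4999.0, 5200.0, 5600.0]
decays = [t for t in events if in_beam_off_window(t)]
```

Events at 120 us and 5200 us fall inside a burst and are rejected as prompt reaction products; the survivors are candidate alpha or fission decays of an implanted nucleus.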
Random coincidences of nuclear events can be one of the main background sources in low-temperature calorimetric experiments searching for neutrinoless double-beta decay, especially in searches based on scintillating bolometers embedding the promising double-beta candidate $^{100}$Mo, because of the relatively short half-life of the two-neutrino double-beta decay of this nucleus. We show in this work that randomly coinciding events of the two-neutrino double-beta decay of $^{100}$Mo in enriched Li$_2$$^{100}$MoO$_4$ detectors can be effectively discriminated by pulse-shape analysis in the light channel if the scintillating bolometer is provided with a Neganov-Luke light detector, which can improve the signal-to-noise ratio by a large factor, assumed here to be $\sim 750$ on the basis of preliminary experimental results obtained with these devices. The achieved pile-up rejection efficiency results in a very low contribution, of the order of $\sim 6\times10^{-5}$ counts/(keV$\cdot$kg$\cdot$y), to the background counting rate in the region of interest for a large-volume ($\sim 90$ cm$^3$) Li$_2$$^{100}$MoO$_4$ detector. This background level is very encouraging in view of a possible use of Li$_2$$^{100}$MoO$_4$ in a next-generation tonne-scale bolometric experiment such as the one proposed in the CUPID project.
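A common pulse-shape test for pile-up rejection is a least-squares fit of a single-pulse template: a piled-up light-channel record leaves a large fit residual. This is a generic sketch with an assumed double-exponential pulse shape, not the specific analysis of the paper.

```python
import numpy as np

def template_fit(record, template):
    """Best-fit amplitude of a single-pulse template and the residual
    sum of squares; pile-up leaves a large residual."""
    a = np.dot(template, record) / np.dot(template, template)
    resid = record - a * template
    return a, np.dot(resid, resid)

t = np.arange(1000.0)
template = np.exp(-t / 300.0) - np.exp(-t / 30.0)   # assumed pulse shape

single = 2.0 * template
pileup = template + np.where(t >= 50,
                             np.exp(-(t - 50) / 300.0)
                             - np.exp(-(t - 50) / 30.0), 0.0)

a1, chi2_single = template_fit(single, template)
a2, chi2_pileup = template_fit(pileup, template)
```

The single pulse is fit perfectly with amplitude 2, while the record with a second pulse delayed by 50 samples cannot be absorbed by any amplitude rescaling.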
For experiments with high event arrival rates, reliable identification of nearly coincident events can be crucial. In calorimetric experiments that aim to measure the neutrino mass directly, such as HOLMES, unidentified pulse pile-ups are expected to be a leading source of experimental error. Although Wiener filtering can be used to recognize pile-up, it suffers errors due to pulse-shape variation from detector nonlinearity, readout dependence on sub-sample arrival times, and stability issues from the ill-posed deconvolution problem of recovering Dirac delta-functions from smooth data. We have therefore developed a processing method that exploits singular value decomposition to (1) separate single-pulse records from piled-up records in training data and (2) construct a model of single-pulse records that accounts for varying pulse shape with amplitude, arrival time, and baseline level, suitable for detecting nearly coincident events. We show that the resulting processing advances can reduce the required performance specifications of the detectors and readout system or, equivalently, enable larger sensor arrays and better constraints on the neutrino mass.
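One way to model single-pulse records with baseline and sub-sample arrival-time freedom, in the spirit of step (2), is a linear fit whose basis includes the template's derivative: to first order, a pulse shifted by a fraction of a sample is the template plus that fraction times its derivative. The pulse shape and amplitudes below are assumptions for the sketch, not the HOLMES pipeline.

```python
import numpy as np

t = np.arange(500.0)
template = np.exp(-t / 200.0) - np.exp(-t / 20.0)   # assumed pulse shape

# design matrix: baseline, template, and its derivative; the derivative
# column absorbs small sub-sample arrival-time shifts to first order
M = np.column_stack([np.ones_like(t), template, np.gradient(template)])

def fit_rss(record):
    """Least-squares fit to the single-pulse model; returns the residual
    sum of squares, which is large only for piled-up records."""
    coeffs, *_ = np.linalg.lstsq(M, record, rcond=None)
    return np.sum((record - M @ coeffs) ** 2)

shift = lambda dt: np.interp(t - dt, t, template)   # delayed pulse
single = 0.2 + 3.0 * shift(0.3)                     # sub-sample shift
piled = single + 1.5 * shift(40.0)                  # near coincidence

rss_single = fit_rss(single)
rss_piled = fit_rss(piled)
```

The 0.3-sample shift is absorbed almost entirely by the derivative term, while the 40-sample-delayed second pulse lies far outside the three-column model and stands out in the residual.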
Pulse-shape-discriminating scintillator materials in many cases allow the user to identify two basic kinds of pulses, arising from two kinds of particles: neutrons and gammas. An uncomplicated way to build a classifier is a two-component mixture model learned from a collection of pulses from neutrons and gammas over a range of energies. Depending on the conditions under which the data to be classified were gathered, multiple classes of events besides neutrons and gammas may occur, most notably pileup events. All such events are anomalous, and in cases where the particle class is in doubt it is preferable to remove them from the analysis. This study compares the performance of several machine learning and analytical methods for using the scores from the two-component model to identify anomalous events, and in particular to remove pileup events. A specific outcome of this study is a novel anomaly score, denoted G, derived from an unsupervised two-component model and conveniently distributed on the interval [-1,1].
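For intuition, here is one illustrative way scores from a two-component model can be used: the difference of the two posterior probabilities naturally lives on [-1, 1], and a low mixture log-density flags events (such as pileup) that fit neither component. This is a generic sketch with invented Gaussian parameters, and is not the paper's definition of G.

```python
import numpy as np

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

def component_scores(x, w, mu, sig):
    """Posterior-difference score in [-1, 1] (+1 -> component 0, here
    'neutron'; -1 -> component 1, 'gamma') and the mixture log-density,
    whose low values flag anomalous events such as pileup."""
    p0 = w[0] * gauss(x, mu[0], sig[0])
    p1 = w[1] * gauss(x, mu[1], sig[1])
    tot = p0 + p1
    return (p0 - p1) / tot, np.log(tot)

# assumed, already-fitted two-component parameters on a PSD variable
w, mu, sig = (0.5, 0.5), (0.0, 5.0), (1.0, 1.0)

s_n, ll_n = component_scores(0.0, w, mu, sig)   # neutron-like event
s_mid, _ = component_scores(2.5, w, mu, sig)    # ambiguous event
_, ll_out = component_scores(9.0, w, mu, sig)   # far from both components
```

An event at a component center scores near +1 or -1, an event halfway between the components scores 0, and an event far outside both has a sharply lower log-density and would be removed as anomalous.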