
Signal estimation in On/Off measurements including event-by-event variables

Added by Giacomo D'Amico
Publication date: 2021
Fields: Physics
Language: English





Signal estimation in the presence of background noise is a common problem in several scientific disciplines. An On/Off measurement is performed when the background itself is not known and must be estimated from a background control sample. The frequentist and Bayesian approaches to signal estimation in On/Off measurements are reviewed and compared, focusing on the weaknesses of the former and on the advantages of the latter in correctly addressing the Poissonian nature of the problem. In this work, we devise a novel reconstruction method, dubbed BASiL (Bayesian Analysis including Single-event Likelihoods), for estimating the signal rate within the Bayesian formalism. It uses information on individual, event-by-event parameters and their distributions for the signal and background populations. Events are thereby weighted according to their likelihood of being a signal or a background event, and background suppression can be achieved without fixed fiducial cuts. Throughout the work we maintain a general notation, so that the method can be applied generically, and we provide a performance test using real data and simulations of observations with the MAGIC telescopes, as a demonstration of its performance for Cherenkov telescopes. BASiL allows the signal to be estimated more precisely, avoiding the loss of exposure caused by signal-extraction cuts. We expect its application to similar cases to be straightforward.
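To make the idea concrete, the following is a minimal numerical sketch, not the authors' BASiL code: a Bayesian On/Off estimate in which each On event contributes a per-event mixture term $s\,f_{\rm sig}(x_i) + b\,f_{\rm bkg}(x_i)$ to the extended likelihood, so that signal-like events are weighted up rather than selected by a cut. The event variable x, its assumed signal and background densities, and every numerical value below are hypothetical placeholders.

```python
# Illustrative sketch only (not the authors' BASiL implementation):
# Bayesian signal-rate posterior for an On/Off measurement in which each
# On event carries a single-event likelihood term s*f_sig(x) + b*f_bkg(x).
# The event variable x, its densities and all numbers are placeholders.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# --- Toy data: an On sample with signal plus background, and an Off sample ---
alpha = 1.0 / 3.0                      # On/Off exposure ratio
true_s, true_b = 20.0, 50.0            # expected signal / background counts in the On region
x_on = np.concatenate([rng.normal(0.0, 1.0, rng.poisson(true_s)),      # signal-like events
                       rng.uniform(-5.0, 5.0, rng.poisson(true_b))])   # background-like events
n_off = rng.poisson(true_b / alpha)    # counts in the background control region

f_sig = stats.norm(0.0, 1.0).pdf       # assumed density of x for signal events
f_bkg = stats.uniform(-5.0, 10.0).pdf  # assumed density of x for background events
fs, fb = f_sig(x_on), f_bkg(x_on)      # per-event densities, computed once

# --- Extended likelihood: Poisson terms plus per-event mixture weights ---
def log_like(s, b):
    return (-(s + b) + np.log(s * fs + b * fb).sum()      # On region, event by event
            + stats.poisson.logpmf(n_off, b / alpha))      # Off region constrains b

# --- Posterior for s with flat priors, marginalising b on a grid ---
s_grid = np.linspace(0.1, 80.0, 200)
b_grid = np.linspace(0.1, 120.0, 200)
ll = np.array([[log_like(s, b) for b in b_grid] for s in s_grid])
post = np.exp(ll - ll.max()).sum(axis=1)                   # marginalise over b
post /= post.sum() * (s_grid[1] - s_grid[0])               # normalise

print(f"posterior mode of the signal rate: {s_grid[post.argmax()]:.1f} (true value {true_s})")
```

In this toy setup, background-like events are automatically down-weighted through the mixture term instead of being removed by a fiducial cut, which is the qualitative behaviour the abstract attributes to BASiL.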



Related research


Oscillation probability calculations are becoming increasingly CPU intensive in modern neutrino oscillation analyses. Because individual events in a Monte Carlo sample can be reweighted independently, the problem lends itself to parallel implementation on a Graphics Processing Unit. The library Prob3++ was ported to the GPU using the CUDA C API, allowing large-scale parallelized calculations of neutrino oscillation probabilities through matter of constant density and decreasing the execution time by a factor of 75 compared to performance on a single CPU.
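As a side note on why such reweighting maps well onto a GPU: each event's oscillation weight depends only on that event's own baseline and energy, so events can be processed independently. The sketch below illustrates this with a simplified two-flavour vacuum-oscillation formula vectorised in NumPy; it is not Prob3++ nor the full three-flavour constant-density matter calculation, and the parameter values are merely indicative.

```python
# Simplified illustration of per-event oscillation reweighting being
# embarrassingly parallel: each weight depends only on that event's (L, E).
# Two-flavour vacuum oscillation is used for brevity; this is not the
# three-flavour constant-density calculation performed by Prob3++.

import numpy as np

def numu_survival(L_km, E_GeV, sin2_2theta=0.95, dm2_eV2=2.4e-3):
    """P(nu_mu -> nu_mu) = 1 - sin^2(2*theta) * sin^2(1.267 * dm^2 * L / E)."""
    return 1.0 - sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

rng = np.random.default_rng(1)
baseline = np.full(1_000_000, 295.0)              # km, one entry per Monte Carlo event
energy = rng.uniform(0.2, 3.0, baseline.size)     # GeV, per-event neutrino energy

weights = numu_survival(baseline, energy)         # no cross-event dependence: trivially parallel
print(weights[:5])
```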
The new event generator TWOPEG for the channel $e p \rightarrow e p \pi^{+} \pi^{-}$ has been developed. It uses an advanced method of event generation with weights and employs the five-fold differential structure functions from the recent …
Quantifying synchronization phenomena based on the timing of events has recently attracted a great deal of interest in various disciplines such as neuroscience or climatology. A multitude of similarity measures has been proposed for this purpose, including Event Synchronization (ES) and Event Coincidence Analysis (ECA) as two widely applicable examples. While ES defines synchrony in a data-adaptive, local way that does not distinguish between different time scales, ECA requires selecting a specific scale for analysis. In this paper, we use slightly modified …
We have studied the distribution of traffic flow $q$ for the Nagel-Schreckenberg model by computer simulations. We applied a large-deviation approach, which allowed us to obtain the distribution $P(q)$ over more than one hundred decades in probability, down to probabilities like $10^{-140}$. This allowed us to characterize the flow distribution over a large range of the support and identify the characteristics of rare and even very rare traffic situations. We observe a change of the distribution shape when increasing the density of cars from the free flow to the congestion phase. Furthermore, we characterize typical and rare traffic situations by measuring correlations of $q$ to other quantities like density of standing cars or number and size of traffic jams.
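For orientation, the sketch below implements the standard Nagel-Schreckenberg update rules (accelerate, brake to the gap, random slow-down, move) and measures the average flow q by plain simulation; the large-deviation sampling that resolves probabilities down to about $10^{-140}$ is well beyond this snippet, and the parameter values are illustrative.

```python
# Minimal sketch of the Nagel-Schreckenberg update rules and the flow
# observable q (cars passing a given site per time step).  Parameters are
# illustrative; the paper's large-deviation sampling is not reproduced.

import numpy as np

rng = np.random.default_rng(2)
n_sites, n_cars, v_max, p_slow, n_steps = 1000, 200, 5, 0.25, 2000

pos = np.sort(rng.choice(n_sites, size=n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)

total_movement = 0
for _ in range(n_steps):
    gaps = (np.roll(pos, -1) - pos - 1) % n_sites     # empty cells in front of each car
    vel = np.minimum(vel + 1, v_max)                  # 1. accelerate
    vel = np.minimum(vel, gaps)                       # 2. brake to avoid the car ahead
    vel -= (rng.random(n_cars) < p_slow) & (vel > 0)  # 3. random slow-down
    pos = (pos + vel) % n_sites                       # 4. move
    total_movement += vel.sum()

q = total_movement / (n_steps * n_sites)              # flow: cars passing a site per time step
print(f"average flow q = {q:.3f}")
```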
We provide a method to correct the observed azimuthal anisotropy in heavy-ion collisions for the event-plane resolution in a wide centrality bin. This new procedure is especially useful for rare particles, such as $\Omega$ baryons and $J/\psi$ mesons, which are difficult to measure in small intervals of centrality. Based on a Monte Carlo calculation with simulated $v_2$ and multiplicity, we show that some of the commonly used methods have a bias of up to 15%.
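For context, the sketch below encodes only the standard narrow-bin relation $v_2 = v_2^{\rm obs}/R$, where $R$ is the event-plane resolution; how to apply this correction consistently over a wide centrality bin, which is the subject of the paper, is not attempted here.

```python
# Standard narrow-bin event-plane correction (context only, not the paper's
# wide-centrality-bin method): the observed second harmonic is diluted by
# the event-plane resolution R, so v2 = v2_obs / R.

import numpy as np

def v2_event_plane(phi, psi_ep, resolution):
    """phi: particle azimuths; psi_ep: event-plane angle of each particle's event;
    resolution: R = <cos(2*(Psi_EP - Psi_RP))>, estimated e.g. from sub-events."""
    v2_obs = np.mean(np.cos(2.0 * (np.asarray(phi) - np.asarray(psi_ep))))
    return v2_obs / resolution
```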
