
Sequential Analysis Techniques for Correlation Studies in Particle Astronomy

Published by Brian Connolly M.
Publication date: 2008
Research field: Physics
Paper language: English





Searches for statistically significant correlations between arrival directions of ultra-high energy cosmic rays and classes of astrophysical objects are common in astroparticle physics. We present a method to test potential correlation signals of a priori unknown strength and evaluate their statistical significance sequentially, i.e., after each incoming new event in a running experiment. The method can be applied to data taken after the test has concluded, allowing for further monitoring of the signal significance. It adheres to the likelihood principle and rigorously accounts for our ignorance of the signal strength.
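One way to picture such a sequential test is a likelihood ratio marginalised over the unknown signal strength, evaluated after every new event. The sketch below is illustrative only and is not the paper's method: it assumes a toy model in which each event either falls within the angular window of a catalogue object (a "hit", with chance probability p0 under isotropy) or not, and the unknown signal fraction f is integrated out with a flat prior. The function names and the decision threshold are hypothetical.

```python
import numpy as np

def marginal_lr(hits, p0, n_grid=201):
    """Likelihood ratio for 'some correlation' vs. 'background only',
    marginalised over an unknown signal fraction f with a flat prior.

    hits : sequence of booleans, True if an event falls within the
           angular window of a catalogue object
    p0   : probability of a chance correlation under isotropy
    """
    hits = np.asarray(hits, dtype=bool)
    n, k = hits.size, int(hits.sum())
    f = np.linspace(0.0, 1.0, n_grid)        # signal-fraction grid
    p1 = p0 + f * (1.0 - p0)                 # correlation prob. given f
    lik_sig = p1**k * (1.0 - p1)**(n - k)    # likelihood at each f
    lik_bkg = p0**k * (1.0 - p0)**(n - k)    # background-only likelihood
    return lik_sig.mean() / lik_bkg          # flat-prior average / background

def sequential_test(event_hits, p0, threshold=1e3):
    """Re-evaluate the marginal LR after every incoming event and stop
    when it first crosses the decision threshold."""
    lr = 1.0
    for n in range(1, len(event_hits) + 1):
        lr = marginal_lr(event_hits[:n], p0)
        if lr >= threshold:
            return n, lr                     # correlation claimed after n events
    return None, lr                          # no decision yet
```

Because the ratio depends on the data only through the likelihood, the test respects the likelihood principle, and it can keep running on data taken after a decision to continue monitoring the significance.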


Read also

248 - A.W. Jones 1999
The data from Cosmic Microwave Background (CMB) experiments are becoming more complex with each new experiment. A consistent way of analysing these data sets is required so that direct comparison is possible between the various experimental results. This thesis presents several techniques that can be used to analyse CMB data.
We present an analysis technique that uses the timing information of Cherenkov images from extensive air showers (EAS). Our emphasis is on distant, or large core distance, gamma-ray induced showers at multi-TeV energies. Specifically, combining pixel timing information with an improved direction reconstruction algorithm leads to improvements in angular and core resolution as large as ~40% and ~30%, respectively, when compared with the same algorithm without the use of timing. Above 10 TeV, this results in an angular resolution approaching 0.05 degrees, together with a core resolution better than ~15 m. The off-axis post-cut gamma-ray acceptance is energy dependent and its full width at half maximum ranges from 4 degrees to 8 degrees. For shower directions that are up to ~6 degrees off-axis, the angular resolution achieved by using timing information is comparable, around 100 TeV, to the on-axis angular resolution. The telescope specifications and layout we describe here are geared towards energies above 10 TeV. However, the methods can in principle be applied to other energies, given suitable telescope parameters. The 5-telescope cell investigated in this study could initially pave the way for a larger array of sparsely spaced telescopes in an effort to push the collection area to >10 km². These results highlight the potential of a 'sparse array' approach in effectively opening up the energy range above 10 TeV.
The analysis of gravitational wave data involves many model selection problems. The most important example is the detection problem of selecting between the data being consistent with instrument noise alone, or with instrument noise and a gravitational wave signal. The analysis of data from ground-based gravitational wave detectors is mostly conducted using classical statistics, and methods such as the Neyman-Pearson criteria are used for model selection. Future space-based detectors, such as the Laser Interferometer Space Antenna (LISA), are expected to produce rich data streams containing the signals from many millions of sources. Determining the number of sources that are resolvable, and the most appropriate description of each source, poses a challenging model selection problem that may best be addressed in a Bayesian framework. An important class of LISA sources are the millions of low-mass binary systems within our own galaxy, tens of thousands of which will be detectable. Not only is the number of sources unknown, but so is the number of parameters required to model the waveforms. For example, a significant subset of the resolvable galactic binaries will exhibit orbital frequency evolution, while a smaller number will have measurable eccentricity. In the Bayesian approach to model selection one needs to compute the Bayes factor between competing models. Here we explore various methods for computing Bayes factors in the context of determining which galactic binaries have measurable frequency evolution. The methods explored include a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, Savage-Dickey density ratios, the Schwarz-Bayes Information Criterion (BIC), and the Laplace approximation to the model evidence. We find good agreement between all of the approaches.
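Of the methods the abstract lists, the BIC route to an approximate Bayes factor is the simplest to sketch. The snippet below is a generic illustration, not the paper's code: function names are hypothetical, and the two models stand in for a monochromatic binary versus one with an extra frequency-evolution parameter.

```python
import numpy as np

def bic(max_log_like, n_params, n_data):
    """Schwarz-Bayes Information Criterion: -2 ln L_max + k ln N."""
    return -2.0 * max_log_like + n_params * np.log(n_data)

def approx_bayes_factor(max_logL_1, k1, max_logL_0, k0, n_data):
    """Approximate Bayes factor B_10 between model 1 (e.g. a binary with
    frequency evolution, one extra parameter) and model 0 (monochromatic),
    via B_10 ~ exp(-(BIC_1 - BIC_0) / 2)."""
    dbic = bic(max_logL_1, k1, n_data) - bic(max_logL_0, k0, n_data)
    return np.exp(-0.5 * dbic)
```

If the extra parameter does not improve the maximum likelihood, the k ln N penalty drives B_10 below 1, favouring the simpler model; a large enough gain in fit overcomes the penalty and flags measurable frequency evolution.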
61 - Ji Qiang 2020
The nonlinear space-charge effects play an important role in high intensity/high brightness accelerators. These effects can be self-consistently studied using multi-particle simulations. In this lecture, we will discuss the particle-in-cell method and the symplectic tracking model for self-consistent multi-particle simulations.
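The core of a particle-in-cell step is depositing particle charge onto a grid before solving for the self-consistent field. The sketch below shows only that deposition step, using standard cloud-in-cell (linear) weighting on a periodic 1D grid; it is an illustrative example, not the lecture's implementation, and the function name is hypothetical.

```python
import numpy as np

def deposit_charge(x, n_cells, dx=1.0, q=1.0):
    """Cloud-in-cell (linear) deposition of particle charge onto a
    periodic 1D grid: each particle shares its charge q between the
    two nearest grid nodes, in proportion to its distance from them."""
    rho = np.zeros(n_cells)
    g = np.asarray(x, dtype=float) / dx      # position in grid units
    i = np.floor(g).astype(int)              # index of the left node
    w = g - i                                # weight for the right node
    # np.add.at accumulates correctly even when indices repeat
    np.add.at(rho, i % n_cells, q * (1.0 - w) / dx)
    np.add.at(rho, (i + 1) % n_cells, q * w / dx)
    return rho
```

The deposited density would then feed a Poisson solve for the space-charge field, whose force is interpolated back to the particles and applied in a symplectic (e.g. leapfrog) push; linear weighting conserves total charge exactly.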
Astronomy is increasingly encountering two fundamental truths: (1) The field is faced with the task of extracting useful information from extremely large, complex, and high dimensional datasets; (2) The techniques of astroinformatics and astrostatistics are the only way to make this tractable, and bring the required level of sophistication to the analysis. Thus, an approach which provides these tools in a way that scales to these datasets is not just desirable, it is vital. The expertise required spans not just astronomy, but also computer science, statistics, and informatics. As a computer scientist and expert in machine learning, Alex's contribution of expertise and a large number of fast algorithms designed to scale to large datasets is extremely welcome. We focus in this discussion on the questions raised by the practical application of these algorithms to real astronomical datasets. That is, what is needed to maximally leverage their potential to improve the science return? This is not a trivial task. While computing and statistical expertise are required, so is astronomical expertise. Precedent has shown that, to date, the collaborations most productive in producing astronomical science results (e.g., the Sloan Digital Sky Survey) have either involved astronomers expert in computer science and/or statistics, or astronomers involved in close, long-term collaborations with experts in those fields. This does not mean that the astronomers are giving the most important input, but simply that their input is crucial in guiding the effort in the most fruitful directions, and coping with the issues raised by real data. Thus, the tools must be usable and understandable by those whose primary expertise is not computing or statistics, even though they may have quite extensive knowledge of those fields.