In gamma-ray astronomy, irregular and noisy datasets make it difficult to characterize light-curve features in terms of statistical significance while properly accounting for the trial factors associated with searching for variability at different times and over different timescales. To address these difficulties, we propose a method based on the Haar wavelet decomposition of the data. It allows possible variability, embedded in a white-noise background, to be characterized statistically in terms of a confidence level. The method is applied to artificially generated data for characterization, as well as to the very-high-energy M87 light curve recorded with VERITAS in 2008, which serves here as a realistic application example.
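The idea of the abstract above can be illustrated with a minimal sketch: a Haar decomposition is orthonormal, so for pure white noise the detail coefficients at every scale keep the input variance, and a coefficient far beyond the noise level at some scale flags variability. This toy example (the flare amplitude, bin count, and 3-sigma threshold are illustrative choices, not values from the paper) shows the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_detail_coeffs(x):
    """Full Haar decomposition: return detail coefficients per scale.

    Assumes len(x) is a power of two. With the 1/sqrt(2) normalization the
    transform is orthonormal, so white noise of variance sigma^2 yields
    detail coefficients of variance sigma^2 at every scale.
    """
    details = []
    approx = np.asarray(x, dtype=float)
    while len(approx) > 1:
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))  # detail (difference)
        approx = (even + odd) / np.sqrt(2.0)         # approximation (average)
    return details

# Toy light curve: unit-variance white noise plus a flare in the middle.
n = 256
flux = rng.normal(0.0, 1.0, n)
flux[120:136] += 5.0

details = haar_detail_coeffs(flux)
sigma = 1.0
max_abs = max(np.max(np.abs(d)) for d in details)
# The flare edges produce detail coefficients of order 10 sigma at the
# matching scale, far beyond the white-noise expectation.
print(max_abs > 3 * sigma)
```

A full significance statement as in the paper would additionally account for the number of coefficients tested across all scales (the trial factor), not just a single-coefficient threshold.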
Several tools have been developed in the past few years for the statistical analysis of exoplanet search surveys, mostly using Monte Carlo simulations or a Bayesian approach. Here we present Quick-MESS, a grid-based, non-Monte
The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunders
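For reference, the quantity discussed above is the Shannon entropy H = -Σ p_i log₂ p_i (in bits). A minimal sketch of the definition and its two limiting cases:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i log2(p_i), in bits.

    Zero-probability outcomes contribute nothing, since p log p -> 0
    as p -> 0.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty; four equally
# likely outcomes carry two bits; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))   # → 1.0
print(shannon_entropy([0.25] * 4))   # → 2.0
```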
X-ray Charge Coupled Devices (CCDs) have been the workhorse for soft X-ray astronomical instruments for the past quarter century. They provide broad energy response, extremely low electronic read noise, and good energy resolution in soft X-rays. Thes
In this paper we investigate the performance of the likelihood ratio method as a tool for identifying optical and infrared counterparts to proposed radio continuum surveys with SKA precursor and pathfinder telescopes. We present a comparison of the i
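The likelihood ratio method referred to above is, in its standard form (Sutherland & Saunders 1992), LR = q(m) f(r) / n(m), where f(r) is the positional probability density, q(m) the magnitude distribution of true counterparts, and n(m) the background surface density. A hedged sketch assuming Gaussian positional errors; the q(m) and n(m) values below are toy placeholders, not survey measurements:

```python
import math

def likelihood_ratio(r, sigma, q_m, n_m):
    """Likelihood ratio LR = q(m) f(r) / n(m) for counterpart matching.

    r     : radial offset between radio source and candidate counterpart
    sigma : combined positional uncertainty (same units as r)
    q_m   : magnitude distribution of true counterparts, q(m)
    n_m   : surface density of background sources at magnitude m, n(m)
    f(r) is the 2-D Gaussian positional probability density.
    """
    f_r = math.exp(-r**2 / (2.0 * sigma**2)) / (2.0 * math.pi * sigma**2)
    return q_m * f_r / n_m

# Toy comparison in arcseconds: a close candidate vs a distant one,
# identical magnitudes, so only the positional term differs.
close = likelihood_ratio(r=0.5, sigma=1.0, q_m=0.2, n_m=0.001)
far = likelihood_ratio(r=4.0, sigma=1.0, q_m=0.2, n_m=0.001)
print(close > far)  # nearer candidates get a higher LR, all else equal
```

In practice a reliability is then derived from the LR values of all candidates around each source, which is where the survey-to-survey comparisons in the paper come in.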
Because of recent technological advances, the key technologies needed for precision space optical astrometry are now in hand. The Microarcsecond Astrometry Probe (MAP) mission concept is designed to find 1 Earth-mass planets in 1 AU orbits (scaled