
Haar wavelets as a tool for the statistical characterization of variability

Published by: Stephan LeBohec
Publication date: 2011
Research field: Physics
Paper language: English





In the field of gamma-ray astronomy, irregular and noisy datasets make it difficult to characterize light-curve features in terms of statistical significance while properly accounting for the trial factors associated with searching for variability at different times and over different timescales. To address these difficulties, we propose a method based on the Haar wavelet decomposition of the data. It allows the statistical characterization of possible variability, embedded in a white noise background, in terms of a confidence level. The method is applied to artificially generated data for characterization, as well as to the very high energy M87 light curve recorded with VERITAS in 2008, which serves here as a realistic application example.
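To make the approach concrete, here is a minimal sketch, not the authors' implementation, of an orthonormal Haar decomposition of a binned light curve of length 2^n. Under a white-noise background with known per-bin standard deviation, every detail coefficient is Gaussian with that same standard deviation, so coefficients at any timescale can be tested against a significance threshold. The flare position, noise level, and 3-sigma threshold below are illustrative assumptions.

```python
import numpy as np

def haar_decompose(x):
    """Orthonormal Haar decomposition of a length-2**n light curve.

    Returns a list of detail-coefficient arrays, from the finest
    timescale (pairwise differences) to the coarsest.
    """
    x = np.asarray(x, dtype=float)
    details = []
    while x.size > 1:
        pairs = x.reshape(-1, 2)
        # Sums carry the smoothed curve to the next scale; differences
        # are the detail coefficients at the current timescale.
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
        details.append(detail)
        x = approx
    return details

# Toy light curve: white noise plus a flare over bins 20-23 (assumed).
rng = np.random.default_rng(0)
flux = rng.normal(loc=10.0, scale=1.0, size=64)
flux[20:24] += 5.0

sigma = 1.0  # assumed known white-noise level per bin
for scale, d in enumerate(haar_decompose(flux)):
    # For pure white noise each detail coefficient is N(0, sigma**2)
    # at every scale; flag excursions beyond 3 sigma.  A complete
    # analysis would raise this threshold to account for the trial
    # factor (the total number of coefficients tested).
    n_hits = int(np.sum(np.abs(d) > 3.0 * sigma))
    if n_hits:
        print(f"timescale 2**{scale + 1} bins: {n_hits} coefficient(s) > 3 sigma")
```

Because the flare here spans a block of four aligned bins, it shows up most strongly in the coefficients probing the matching timescale, illustrating how the decomposition localizes variability in both time and timescale.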


Read also

Several tools have been developed in the past few years for the statistical analysis of exoplanet search surveys, mostly using Monte Carlo simulations or a Bayesian approach. Here we present Quick-MESS, a grid-based, non-Monte Carlo tool designed to perform statistical analyses of the results of direct imaging surveys and to help plan them. Quick-MESS uses the (expected) contrast curves of direct imaging surveys to assess, for each target, the probability that a planet of a given mass and semi-major axis can be detected. By using a grid-based approach, Quick-MESS is typically more than an order of magnitude faster than tools based on Monte Carlo sampling of the planet distribution. In addition, Quick-MESS is extremely flexible, enabling the study of a large range of parameter space for the mass and semi-major axis distributions without the need to re-simulate the planet distribution. To illustrate the capabilities of Quick-MESS, we present the analysis of the Gemini Deep Planet Survey and predictions for upcoming surveys with extreme-AO instruments.
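The grid-based idea can be sketched as follows. This is a simplified illustration in the spirit of the description above, not the Quick-MESS code itself; the contrast curve, the mass-to-contrast model, and the distance are invented placeholders, and face-on circular orbits are assumed for brevity (a real analysis must handle orbital geometry and projection).

```python
import numpy as np

def detection_map(masses, smas, curve_sep, curve_dmag, mass_to_dmag,
                  distance_pc):
    """Grid-based detectability map: True where a (mass, sma) planet
    lies above the survey contrast curve for one target.

    masses       : 1-D grid of planet masses
    smas         : 1-D grid of semi-major axes [au]
    curve_sep, curve_dmag : the target's contrast curve (arcsec, delta-mag)
    mass_to_dmag : assumed mass -> contrast model (e.g. evolutionary tracks)
    """
    # Face-on circular orbits: projected separation in arcsec = au / pc.
    sep_arcsec = np.asarray(smas) / distance_pc
    dmag_limit = np.interp(sep_arcsec, curve_sep, curve_dmag)
    dmag_planet = mass_to_dmag(np.asarray(masses))
    # Detectable where the planet's contrast is brighter (smaller
    # delta-mag) than the achievable limit at its separation.
    return dmag_planet[:, None] <= dmag_limit[None, :]

# Toy inputs (all hypothetical):
curve_sep = np.array([0.1, 0.5, 1.0, 2.0, 5.0])       # arcsec
curve_dmag = np.array([8.0, 11.0, 13.0, 14.0, 14.5])  # achievable contrast
det = detection_map(masses=np.linspace(1, 20, 40),    # Mjup
                    smas=np.linspace(5, 200, 100),    # au
                    curve_sep=curve_sep, curve_dmag=curve_dmag,
                    mass_to_dmag=lambda m: 16.0 - 0.3 * m,  # toy model
                    distance_pc=20.0)
print(det.mean())  # detection fraction under uniform grid priors
```

Changing the assumed mass or semi-major-axis distribution only reweights this precomputed boolean grid, which is why the grid-based approach avoids re-simulating the planet population.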
The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstanding. We will show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We will see that it is the only indicator that is continuous and linear, that it quantifies the number of yes/no questions (i.e. bits) that are needed to identify an element within the distribution, and we will see how applying this concept to statistical mechanics in different ways leads to the Boltzmann, Gibbs and von Neumann entropies.
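As a concrete illustration of the yes/no-question interpretation, here is a minimal sketch (not from the paper itself); the example distributions are illustrative assumptions.

```python
import numpy as np

def shannon_entropy_bits(p):
    """H(p) = -sum_i p_i * log2(p_i), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 -> 0 by the usual convention
    return float(-np.sum(p * np.log2(p)))

# A uniform distribution over 8 outcomes takes exactly 3 yes/no
# questions (3 bits) to single out one element:
print(shannon_entropy_bits(np.full(8, 1 / 8)))  # 3.0
# A sharply peaked distribution has little variability, hence few bits:
print(shannon_entropy_bits([0.9, 0.05, 0.05]))  # ~0.57
```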
X-ray Charge Coupled Devices (CCDs) have been the workhorse for soft X-ray astronomical instruments for the past quarter century. They provide broad energy response, extremely low electronic read noise, and good energy resolution in soft X-rays. These properties, along with the large arrays and small pixel sizes available with modern-day CCDs, make them a potential candidate for next-generation astronomical X-ray missions equipped with large collecting areas, high angular resolutions and wide fields of view, enabling observation of the faint, diffuse and high-redshift X-ray universe. However, such a high collecting area (about 30 times that of Chandra) requires these detectors to have an order of magnitude faster readout than current CCDs to avoid saturation and pile-up effects. In this context, Stanford University and MIT have initiated the development of fast readout X-ray cameras. As a tool for this development, we have designed a fast readout, low noise electronics board (intended to work at a 5 megapixel-per-second data rate) coupled with an STA Archon controller to read out a 512 x 512 CCD (from MIT Lincoln Laboratory). This versatile setup allows us to study a number of parameters and operating conditions, including the option for digital shaping. In this paper, we describe the characterization test stand, the concept and development of the readout electronics, and simulation results. We also report the first measurements of read noise, energy resolution and other parameters from this setup. While this is very much a prototype, we plan to use larger, multi-node CCD devices in the future with dedicated ASIC readout systems to enable faster, parallel readout of the CCDs.
In this paper we investigate the performance of the likelihood ratio method as a tool for identifying optical and infrared counterparts to proposed radio continuum surveys with SKA precursor and pathfinder telescopes. We present a comparison of the infrared counterparts identified by the likelihood ratio in the VISTA Deep Extragalactic Observations (VIDEO) survey to radio observations with 6, 10 and 15 arcsec resolution. We cross-match a deep radio catalogue consisting of radio sources with peak flux density $> 60\,\mu$Jy with deep near-infrared data limited to $K_{\mathrm{s}} \lesssim 22.6$. Comparing the infrared counterparts from this procedure to those obtained when cross-matching a set of simulated lower-resolution radio catalogues indicates that degrading the resolution from 6 arcsec to 10 and 15 arcsec decreases the completeness of the cross-matched catalogue by approximately 3 and 7 percent respectively. When matching against shallower infrared data, comparable to that achieved by the VISTA Hemisphere Survey, the fraction of radio sources with reliably identified counterparts drops from $\sim 89$% at $K_{\mathrm{s}} \lesssim 22.6$ to 47% at $K_{\mathrm{s}} \lesssim 20.0$. Decreasing the resolution at this shallower infrared limit does not result in any further decrease in the completeness produced by the likelihood ratio matching procedure. However, we note that radio continuum surveys with MeerKAT, and eventually the SKA, will require long baselines in order to ensure that the resulting maps are not limited by instrumental confusion noise.
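For illustration, here is a minimal sketch of the likelihood ratio statistic in its standard Sutherland & Saunders (1992) form, not the paper's exact pipeline; the callables q_of_m and n_of_m and all numbers in the usage example are hypothetical placeholders.

```python
import numpy as np

def likelihood_ratio(r_arcsec, mag, sigma_pos, q_of_m, n_of_m):
    """LR = q(m) * f(r) / n(m) for one candidate counterpart.

    f(r) : Gaussian distribution of positional offsets for a combined
           positional uncertainty sigma_pos [arcsec].
    q(m) : expected magnitude distribution of true counterparts.
    n(m) : surface density of background sources [mag^-1 arcsec^-2].
    """
    f_r = np.exp(-r_arcsec**2 / (2.0 * sigma_pos**2)) / (2.0 * np.pi * sigma_pos**2)
    return q_of_m(mag) * f_r / n_of_m(mag)

# Hypothetical example: at coarser radio resolution, sigma_pos grows,
# so the positional term discriminates less against chance alignments.
lr = likelihood_ratio(r_arcsec=1.2, mag=21.0, sigma_pos=0.6,
                      q_of_m=lambda m: 0.05,   # placeholder q(m)
                      n_of_m=lambda m: 3e-4)   # placeholder n(m)
print(lr)
```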
Thanks to recent technological advances, the key technologies needed for precision space optical astrometry are now in hand. The Microarcsecond Astrometry Probe (MAP) mission concept is designed to find 1 Earth-mass planets in 1 au orbits (scaled to solar luminosity) around the nearest ~90 FGK stars. The MAP payload includes i) a single three-mirror anastigmatic telescope with a 1-m primary mirror and metrology subsystems, and ii) a camera. The camera focal plane consists of 42 detectors, providing a Nyquist-sampled FOV of 0.4 deg. Its metrology subsystems ensure that MAP can achieve the 0.8 μas astrometric precision in 1 hr that is required to detect Earth-like exoplanets in our stellar neighborhood. The MAP mission could provide ~10 specific targets for a much larger coronagraphic mission that would measure their spectra. We argue for the development of space astrometric missions capable of finding Earth 2.0. Given the current technology readiness, such missions relying on precision astrometry could be flown in the next decade, perhaps in collaboration with other national space agencies.