
A comparative analysis of denoising algorithms for extragalactic imaging surveys

Published by Valerio Roscani
Publication date: 2020
Research field: Physics
Paper language: English





We present a comprehensive analysis of the performance of noise-reduction ("denoising") algorithms to determine whether they provide advantages in source detection on extragalactic survey images. The methods under analysis are Perona-Malik filtering, the Bilateral filter, Total Variation denoising, Structure-Texture image decomposition, Non-local means, Wavelets, and Block-matching. We test the algorithms on simulated images of extragalactic fields with resolution and depth typical of the Hubble, Spitzer, and Euclid Space Telescopes, and of ground-based instruments. After choosing the best internal parameter configuration for each method, we assess their performance as a function of resolution, background level, and image type, also testing their ability to preserve object fluxes and shapes. We analyze, in terms of completeness and purity, the catalogs extracted after applying denoising algorithms to a simulated Euclid Wide Survey VIS image and to real H160 (HST) and K-band (HAWK-I) observations of the CANDELS GOODS-South field. Denoising algorithms often outperform the standard approach of filtering with the Point Spread Function (PSF) of the image. Applying Structure-Texture image decomposition, Perona-Malik filtering, the Total Variation method by Chambolle, or Bilateral filtering to the Euclid-VIS image, we obtain catalogs that are both purer and more complete by 0.2 magnitudes than those based on the standard approach. The same result is achieved with the Structure-Texture image decomposition algorithm applied to the H160 image. The advantage of denoising techniques over PSF filtering increases with increasing depth. Moreover, these techniques preserve the shapes of the detected objects better than PSF smoothing. Denoising algorithms therefore provide significant improvements in the detection of faint objects and enhance the scientific return of current and future extragalactic surveys.
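
As a rough illustration of the kind of comparison described above, the sketch below applies a few of the listed denoisers (as implemented in scikit-image) and PSF-like Gaussian smoothing to a toy simulated field, then counts detections with a simple threshold. All parameter values, the noise model, and the toy detector are placeholders chosen for illustration; this is not the tuned configuration or the extraction pipeline used in the paper.

```python
# Minimal, illustrative sketch (not the authors' pipeline): denoise a simulated
# image with a few of the discussed algorithms and compare simple threshold
# detections against PSF-like Gaussian smoothing. All parameters are placeholders.
import numpy as np
from scipy import ndimage
from skimage.restoration import denoise_tv_chambolle, denoise_nl_means, denoise_wavelet

rng = np.random.default_rng(42)

# Simulate a small field: point-like sources convolved with a Gaussian "PSF",
# plus flat background noise (arbitrary units).
ny = nx = 256
image = np.zeros((ny, nx))
n_src = 40
ys = rng.integers(10, ny - 10, n_src)
xs = rng.integers(10, nx - 10, n_src)
image[ys, xs] = rng.uniform(2.0, 10.0, n_src)        # source amplitudes
image = ndimage.gaussian_filter(image, sigma=1.5)     # instrumental PSF
noisy = image + rng.normal(0.0, 0.3, image.shape)     # background noise

def count_detections(img, nsigma=3.0):
    """Count connected pixel groups above an n-sigma threshold (toy detector)."""
    sigma = np.std(img)
    _, n = ndimage.label(img > nsigma * sigma)
    return n

candidates = {
    "PSF (Gaussian) smoothing": ndimage.gaussian_filter(noisy, sigma=1.5),
    "TV-Chambolle":             denoise_tv_chambolle(noisy, weight=0.15),
    "Non-local means":          denoise_nl_means(noisy, h=0.4, patch_size=5, patch_distance=6),
    "Wavelet":                  denoise_wavelet(noisy),
}

for name, img in candidates.items():
    print(f"{name:26s}: {count_detections(img)} detections")
```

In a real survey pipeline the detection step would use a dedicated source extractor and the completeness/purity would be measured against the input source list, but the structure of the comparison is the same.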




Read also

Roger P. Deane (2017)
The past decade has seen significant advances in cm-wave VLBI extragalactic observations due to a wide range of technical successes, including the increase in processed field-of-view and bandwidth. The future inclusion of MeerKAT into global VLBI networks would provide further enhancement, particularly the dramatic sensitivity boost to >7000 km baselines. This will not be without its limitations, however, considering the incomplete MeerKAT band overlap with current VLBI arrays and the small (real-time) field-of-view afforded by the phased-up MeerKAT array. We provide a brief overview of the significant contributions MeerKAT-VLBI could make, with an emphasis on the scientific output of several MeerKAT extragalactic Large Survey Projects.
Current and future continuum surveys being undertaken by the new generation of radio telescopes are now poised to address many important science questions, ranging from the earliest galaxies to the physics of nearby AGN, as well as potentially providing new and unexpected discoveries. However, how to efficiently analyse the large quantities of data collected by these studies in order to maximise their scientific output remains an open question. In these proceedings we present details of the surveys module for the Broadband Radio Astronomy Tools (BRATS) software package, which will combine new observations with existing multi-frequency data in order to automatically analyse and select sources based on their spectrum. We show how these methods can be applied to investigate objects observed on a variety of spatial scales, and suggest a pathway for how this can be used in the wider context of surveys and large samples.
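
As a purely conceptual sketch of spectrum-based selection (not the BRATS code itself), the snippet below fits a power-law spectral index to multi-frequency flux densities and selects steep-spectrum sources; the frequencies, fluxes, and the spectral-index cut are invented for illustration.

```python
# Conceptual sketch of automated spectrum-based selection (not BRATS itself):
# fit a power law S ~ nu^alpha to multi-frequency fluxes and keep steep-spectrum
# sources. All numbers below are placeholders.
import numpy as np

def spectral_index(freqs_ghz, fluxes_mjy):
    """Least-squares power-law fit S ~ nu^alpha in log-log space."""
    alpha, _ = np.polyfit(np.log10(freqs_ghz), np.log10(fluxes_mjy), 1)
    return alpha

freqs = np.array([0.15, 0.61, 1.4, 5.0])                # GHz
catalog = {
    "srcA": np.array([820.0, 232.0, 110.0, 35.0]),      # mJy, steep spectrum
    "srcB": np.array([35.0, 33.0, 34.0, 36.0]),         # mJy, flat spectrum
}

indices = {name: spectral_index(freqs, s) for name, s in catalog.items()}
steep = {name: a for name, a in indices.items() if a < -0.7}   # example cut
print(steep)
```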
Several tools have been developed in the past few years for the statistical analysis of exoplanet search surveys, mostly using a combination of Monte-Carlo simulations or a Bayesian approach. Here we present Quick-MESS, a grid-based, non-Monte-Carlo tool aimed at performing statistical analyses on results from direct imaging surveys and at helping with their planning. Quick-MESS uses the (expected) contrast curves for direct imaging surveys to assess, for each target, the probability that a planet of a given mass and semi-major axis can be detected. By using a grid-based approach, Quick-MESS is typically more than an order of magnitude faster than tools based on Monte-Carlo sampling of the planet distribution. In addition, Quick-MESS is extremely flexible, enabling the study of a large range of parameter space for the mass and semi-major-axis distributions without the need to re-simulate the planet distribution. To show examples of the capabilities of Quick-MESS, we present the analysis of the Gemini Deep Planet Survey and predictions for upcoming surveys with extreme-AO instruments.
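
The snippet below is a minimal, hedged sketch of a grid-based detectability calculation in the spirit of Quick-MESS, not the tool itself: for each (mass, semi-major axis) cell it compares an assumed planet contrast with a hypothetical survey contrast curve over random circular orbits. The mass-to-contrast relation, the contrast curve, and the target distance are all placeholders.

```python
# Hedged sketch of a grid-based detection-probability map (not Quick-MESS itself).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5-sigma contrast curve: separation (arcsec) -> contrast limit (delta mag)
sep_grid = np.array([0.2, 0.4, 0.8, 1.5, 3.0])
dmag_lim = np.array([8.0, 10.0, 12.0, 13.0, 13.5])
distance_pc = 30.0

def contrast_of_mass(mass_mjup):
    """Placeholder mass-to-contrast relation: brighter (smaller delta mag) for higher mass."""
    return 16.0 - 4.0 * np.log10(mass_mjup + 1.0)

def detection_probability(mass_mjup, sma_au, n_orbits=2000):
    """Fraction of random circular orbits whose projected separation is detectable."""
    cos_i = rng.uniform(0, 1, n_orbits)                 # isotropic inclinations
    phase = rng.uniform(0, 2 * np.pi, n_orbits)         # random orbital phases
    r_proj = sma_au * np.sqrt(np.cos(phase)**2 + (cos_i * np.sin(phase))**2)
    sep_arcsec = r_proj / distance_pc
    limit = np.interp(sep_arcsec, sep_grid, dmag_lim, left=0.0, right=dmag_lim[-1])
    return np.mean(contrast_of_mass(mass_mjup) < limit)

masses = np.array([1.0, 5.0, 10.0])    # M_Jup
smas = np.array([5.0, 20.0, 50.0])     # au
prob = np.array([[detection_probability(m, a) for a in smas] for m in masses])
print(prob)
```

Because the map is built on a fixed (mass, semi-major axis) grid, changing the assumed planet distribution only re-weights the grid cells, which is what makes the grid-based approach fast compared to re-running Monte-Carlo draws.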
We present a series of new, publicly available mock catalogs of X-ray selected active galactic nuclei (AGNs), non-active galaxies, and clusters of galaxies. They are based on up-to-date observational results on the demographics of extragalactic X-ray sources and their extrapolations. These mocks reach fluxes below 10^-20 erg s^-1 cm^-2 in the 0.5-2 keV band, i.e., more than an order of magnitude below the predicted limits of future deep fields, and therefore represent an important tool for simulating extragalactic X-ray surveys with both current and future telescopes. We use our mocks to perform a set of end-to-end simulations of X-ray surveys with the forthcoming Athena mission and with the AXIS probe, a sub-arcsecond resolution X-ray mission concept proposed to the Astro 2020 Decadal Survey. We find that these proposed, next-generation surveys may transform our knowledge of the deep X-ray Universe. As an example, in a total observing time of 15 Ms, AXIS would detect ~225,000 AGNs and ~50,000 non-active galaxies, reaching a flux limit f_0.5-2 ~ 5x10^-19 erg s^-1 cm^-2 in the 0.5-2 keV band, an improvement of over an order of magnitude with respect to surveys with current X-ray facilities. Consequently, 90% of these sources would be detected for the first time in the X-rays. Furthermore, we show that deep and wide X-ray surveys with instruments like AXIS and Athena are expected to detect ~20,000 z>3 AGNs and ~250 sources at redshift z>6, thus opening a new window on the evolution of AGNs over cosmic time and putting strong constraints on the predictions of theoretical models of black hole seed accretion in the early universe.
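
As an illustration of the most basic ingredient of such mocks, the sketch below draws source fluxes from an assumed power-law number-counts distribution by inverse-transform sampling; the slope, flux range, and detection threshold are invented placeholders, not the calibrated counts behind the published catalogs.

```python
# Illustrative sketch (not the published mock catalogs): sample X-ray fluxes from
# an assumed cumulative number-counts law N(>S) ~ S^-alpha truncated at a faint
# limit. Slope, flux range, and the quoted threshold are placeholders.
import numpy as np

rng = np.random.default_rng(1)

alpha = 1.5          # assumed slope of the cumulative counts N(>S) ~ S^-alpha
s_min = 1e-20        # erg s^-1 cm^-2, faint-end cut (below future deep-field limits)
s_max = 1e-13        # erg s^-1 cm^-2, bright-end cut
n_src = 100_000

# Inverse-transform sampling of the truncated power law on [s_min, s_max]
u = rng.uniform(0, 1, n_src)
fluxes = (s_min**(-alpha) - u * (s_min**(-alpha) - s_max**(-alpha)))**(-1.0 / alpha)

n_detected = np.sum(fluxes > 5e-19)   # e.g. a deep-survey flux limit like the one quoted
print(f"{n_detected} of {n_src} mock sources above 5e-19 erg/s/cm2")
```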
We study the problem of periodicity detection in massive data sets of photometric or radial-velocity time series, as presented by ESA's Gaia mission. Periodicity detection hinges on the estimation of the false alarm probability (FAP) of the extremum of the periodogram of the time series. We consider the problem of its estimation with two main issues in mind. First, for a given number of observations and signal-to-noise ratio, the rate of correct periodicity detections should be constant for all realized cadences of observations regardless of the observational time patterns, in order to avoid sky biases that are difficult to assess. Second, the computational load should be kept feasible even for millions of time series. Using the Gaia case, we compare the $F^M$ method (Paltani 2004, Schwarzenberg-Czerny 2012), the Baluev method (Baluev 2008) and the GEV method (Suveges 2014), as well as a method for the direct estimation of a threshold. Three of the methods involve unknown parameters, which are obtained by fitting a regression-type predictive model using easily obtainable covariates derived from the observational time series. We conclude that the GEV and the Baluev methods both provide good solutions to the issues posed by large-scale processing. The first of these yields the best scientific quality at the price of some moderately costly pre-processing. When this pre-processing is impossible for some reason (e.g. the computational costs are prohibitive or good regression models cannot be constructed), the Baluev method provides a computationally inexpensive alternative with slight biases in regions where time samplings exhibit strong aliases.
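
For readers who want to experiment with the Baluev FAP in practice, the short sketch below uses astropy's Lomb-Scargle implementation, which exposes the Baluev (2008) analytical bound; the simulated cadence, period, and noise level are arbitrary, and this is not the Gaia processing code.

```python
# Hedged illustration of a Baluev-type FAP estimate via astropy (not the Gaia pipeline).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(3)

# Irregularly sampled time series, as in survey data
t = np.sort(rng.uniform(0, 500, 120))                    # days
period_true = 17.3
y = 0.05 * np.sin(2 * np.pi * t / period_true) + rng.normal(0, 0.04, t.size)

ls = LombScargle(t, y)
frequency, power = ls.autopower()
fap = ls.false_alarm_probability(power.max(), method="baluev")
print(f"Peak period: {1 / frequency[np.argmax(power)]:.2f} d, Baluev FAP: {fap:.3g}")
```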