
Bad pixel modified interpolation for astronomical images

Posted by: Aleksander Kurek
Publication date: 2013
Research field: Physics
Paper language: English





We present a new interpolation method for estimating pixel brightness in astronomical images. The method is simple and easy to implement. We compare it with the widely used linear interpolation and with other interpolation algorithms on one thousand astronomical images obtained from the Sloan Digital Sky Survey. The comparison shows that our method estimates bad-pixel brightness with a mean error four times lower than the currently most popular linear interpolation, and outperforms every other method examined. The presented idea is flexible and can also be applied to current and future interpolation methods. The proposed method is especially useful for image reduction in large sky surveys, but can also be applied to single-image correction.
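As context for the comparison above, a minimal sketch of the linear-interpolation baseline (the approach the paper improves on) might look like the following. The function name and the row/column-averaging scheme are illustrative assumptions; this is not the paper's modified algorithm:

```python
import numpy as np

def interpolate_bad_pixels(image, bad_mask):
    """Replace flagged pixels with the average of linear interpolations
    along the row and the column, using the immediate good neighbours.
    A common baseline sketch, not the paper's modified method."""
    fixed = image.astype(float).copy()
    rows, cols = np.nonzero(bad_mask)
    for r, c in zip(rows, cols):
        estimates = []
        # midpoint of the two good neighbours along the row
        if 0 < c < image.shape[1] - 1 and not bad_mask[r, c - 1] and not bad_mask[r, c + 1]:
            estimates.append(0.5 * (image[r, c - 1] + image[r, c + 1]))
        # midpoint of the two good neighbours along the column
        if 0 < r < image.shape[0] - 1 and not bad_mask[r - 1, c] and not bad_mask[r + 1, c]:
            estimates.append(0.5 * (image[r - 1, c] + image[r + 1, c]))
        if estimates:
            fixed[r, c] = np.mean(estimates)
    return fixed
```

A mean-error benchmark like the paper's can then be built by masking known-good pixels, interpolating them, and comparing against the true values.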




Read also

Eran O. Ofek, 2019
I present a software tool for solving the astrometry of astronomical images. The code puts emphasis on robustness against failures in matching the sources in the image to a reference catalog, and on the stability of the solutions over the field of view (e.g., using orthogonal polynomials for the fitted transformation). The code was tested on over 50,000 images from various sources, including the Palomar Transient Factory (PTF) and the Zwicky Transient Facility (ZTF). The tested images equally represent low and high Galactic latitude fields and exhibit a failure/bad-solution rate of <2x10^-5. Running on PTF 60-s integration images, and using GAIA-DR2 as a reference catalog, the typical two-axes-combined astrometric root mean square (RMS) is 14 mas at the bright end, presumably due to astrometric scintillation noise and systematic errors. I discuss the effects of seeing, airmass and the order of the transformation on the astrometric accuracy. The software, available online, is developed in MATLAB as part of an astronomical image processing environment, and it can also be run as stand-alone code.
The world's astronomical image archives represent a huge opportunity for time-domain astronomy and other hot topics such as space defense, and astronomical observatories should curate this wealth and make it more accessible in the big data era. In 2010 we introduced the Mega-Archive database and the Mega-Precovery server for data mining images containing Solar system bodies, with a focus on near-Earth asteroids (NEAs). This paper presents the improvements and introduces some new related data mining tools developed during the last five years. Currently, the Mega-Archive has indexed 15 million images available from six major collections (CADC, ESO, ING, LCOGT, NVO and SMOKA) and other instrument archives and surveys. This meta-data index collection has been updated daily (since 2014) by a crawler which performs automated queries of five major collections. Since 2016, these data mining tools run on the new dedicated EURONEAR server, and the database migrated to an SQL engine which supports robust and fast queries. To constrain the area in which to search for moving or fixed objects in images taken by large mosaic cameras, we built the graphical tools FindCCD and FindCCD for Fixed Objects, which overlay the targets across one of seven mosaic cameras (Subaru-SuprimeCam, VST-OmegaCam, INT-WFC, VISTA-VIRCAM, CFHT-MegaCam, Blanco-DECam and Subaru-HSC), also plotting the uncertainty ellipse for poorly observed NEAs. In 2017 we improved Mega-Precovery, which now offers two options for computing the ephemerides and three options for the input (objects defined by designation, orbit or observations). Additionally, we developed Mega-Archive for Fixed Objects (MASFO) and Mega-Archive Search for Double Stars (MASDS). We believe that the huge potential of science imaging archives is still insufficiently exploited.
ASTRONIRCAM is an infrared camera-spectrograph installed at the 2.5-meter telescope of the CMO SAI. The instrument is equipped with a HAWAII-2RG array. A bad-pixel classification for the ASTRONIRCAM detector is proposed. The classification is based on histograms of the difference of consecutive non-destructive readouts of a flat field. Bad pixels are classified into 5 groups: hot (saturated on the first readout), warm (the signal accumulation rate is above the mean value by more than 5 standard deviations), cold (the rate is below the mean value by more than 5 standard deviations), dead (no signal accumulation), and inverse (negative signal accumulation in the first readouts). Normal pixels of the ASTRONIRCAM detector account for 99.6% of the total. We investigated the dependence between the number of bad pixels and the number of cooldown cycles of the instrument. While hot pixels remain the same, bad pixels of other types may migrate between groups. The number of pixels in each group stays roughly constant. We found that the mean and variance of the number of bad pixels in each group, and the transitions between groups, do not differ noticeably between normal and slow cooldowns.
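The 5-sigma classification scheme described above can be sketched as follows. The function name, the label strings, and the explicit mean/std arguments are illustrative assumptions; the actual instrument pipeline works on histograms of readout differences:

```python
import numpy as np

def classify_pixels(rate, saturated_first, mean=None, std=None):
    """Label detector pixels from their flat-field signal-accumulation
    rate using the 5-sigma scheme: hot, warm, cold, dead, inverse,
    or normal. A simplified sketch of the classification idea."""
    mean = float(np.mean(rate)) if mean is None else mean
    std = float(np.std(rate)) if std is None else std
    labels = np.full(rate.shape, "normal", dtype=object)
    labels[rate > mean + 5 * std] = "warm"   # accumulates too fast
    labels[rate < mean - 5 * std] = "cold"   # accumulates too slowly
    labels[rate < 0] = "inverse"             # negative accumulation
    labels[rate == 0] = "dead"               # no accumulation at all
    labels[saturated_first] = "hot"          # saturated on first readout
    return labels
```

Later assignments deliberately override earlier ones, so a pixel with zero rate ends up "dead" rather than "cold", and saturation trumps everything else.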
Astronomical images from optical photometric surveys are typically contaminated with transient artifacts such as cosmic rays, satellite trails and scattered light. We have developed and tested an algorithm that removes these artifacts using a deep, artifact-free, static-sky coadd image built up through the median combination of point spread function (PSF) homogenized, overlapping single-epoch images. Transient artifacts are detected and masked in each single-epoch image through comparison with an artifact-free, PSF-matched simulated image that is constructed using the PSF-corrected, model-fitting catalog from the artifact-free coadd image together with the position-variable PSF model of the single-epoch image. This approach works well not only for cleaning single-epoch images with worse seeing than the PSF-homogenized coadd, but also for the traditionally much more challenging problem of cleaning single-epoch images with better seeing. In addition to masking transient artifacts, we have developed an interpolation approach that uses the local PSF and performs well in removing artifacts whose widths are smaller than the PSF full width at half maximum, including cosmic rays, the peaks of saturated stars and bleed trails. We have tested this algorithm on Dark Energy Survey Science Verification data and present performance metrics. More generally, our algorithm can be applied to any survey which images the same part of the sky multiple times.
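The core of the pipeline above — a median coadd as a transient-free reference, then sigma-thresholded residuals against it — can be sketched in a few lines. The function names and the fixed k-sigma threshold are illustrative assumptions; the real method uses PSF homogenization and a model-fitting catalog, omitted here:

```python
import numpy as np

def median_coadd(stack):
    """Median-combine overlapping exposures into a static-sky image.
    Cosmic rays and satellite trails hit only one epoch, so the
    per-pixel median rejects them (sketch; no PSF homogenization)."""
    return np.median(stack, axis=0)

def mask_transients(epoch, reference, noise_sigma, k=5.0):
    """Flag pixels where a single-epoch image exceeds the reference
    by more than k * noise_sigma; a simplified stand-in for the
    PSF-matched comparison described above."""
    return (epoch - reference) > k * noise_sigma
```

Flagged pixels would then be repaired with an interpolation step, such as the local-PSF interpolation the paper describes.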
Astronomical images are essential for exploring and understanding the universe. Optical telescopes capable of deep observations, such as the Hubble Space Telescope, are heavily oversubscribed in the astronomical community. Images also often contain additive noise, which makes de-noising a mandatory step in post-processing the data before further analysis. In order to maximise the efficiency and information gain in the post-processing of astronomical imaging, we turn to machine learning. We propose Astro U-net, a convolutional neural network for image de-noising and enhancement. For a proof of concept, we use Hubble Space Telescope images from the WFC3 instrument's UVIS channel with the F555W and F606W filters. Our network is able to produce images with noise characteristics as if they were obtained with twice the exposure time, and with minimal bias or information loss. From these images, we are able to recover 95.9% of stars with an average flux error of 2.26%. Furthermore, the images have, on average, 1.63 times higher signal-to-noise ratio than the input noisy images, equivalent to stacking at least 3 input images, which means a significant reduction in the telescope time needed for future astronomical imaging campaigns.
