
Data Reduction Pipeline for the CHARIS Integral-Field Spectrograph I: Detector Readout Calibration and Data Cube Extraction

Published by: Timothy Brandt
Publication date: 2017
Research field: Physics
Language: English





We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response, and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or $\chi^2$ fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a $\chi^2$-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the $\chi^2$ extraction allows us to model and remove correlated read noise, dramatically improving CHARIS performance. The $\chi^2$ extraction produces a data cube that has been deconvolved with the line-spread function, and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. The CHARIS software is parallelized, written in Python and Cython, and freely available on GitHub with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
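As a minimal sketch of the $\chi^2$ extraction idea (not the CHARIS implementation itself), the spectrum of a single lenslet can be recovered by solving a weighted linear least-squares problem whose design matrix is built from the measured monochromatic PSFs; all array sizes, PSF shapes, and values below are invented for the demo:

```python
import numpy as np

# Toy chi^2 extraction for one lenslet microspectrum: model the detector
# stamp as a linear combination of monochromatic lenslet PSFs (one per
# wavelength channel) and minimize chi^2 with inverse-variance weights.
rng = np.random.default_rng(0)
nlam, npix = 5, 200                     # wavelength channels, stamp pixels

# P[i] = unit-normalized lenslet PSF for channel i. (Measured with a
# tunable laser in the real pipeline; random placeholders here.)
P = rng.random((nlam, npix))
P /= P.sum(axis=1, keepdims=True)

true_spec = np.array([3.0, 5.0, 4.0, 2.0, 1.0])
var = np.full(npix, 1e-6)               # per-pixel noise variance
data = true_spec @ P                    # noiseless stamp for the demo

# Weighted normal equations: (A^T C^-1 A) a = A^T C^-1 d. The inverse of
# A^T C^-1 A is also the covariance of the extracted spectrum, which is
# how per-channel uncertainties can accompany the data cube.
A = P.T                                 # (npix, nlam) design matrix
w = 1.0 / var
AtCA = A.T @ (A * w[:, None])
spec = np.linalg.solve(AtCA, A.T @ (w * data))
sigma = np.sqrt(np.diag(np.linalg.inv(AtCA)))
```

Because the demo stamp is noiseless and the design matrix is well conditioned, `spec` recovers `true_spec` to machine precision; with real data the same algebra yields the maximum-likelihood spectrum and its covariance.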




Read also

Neil Zimmerman, 2011
Project 1640 is a high-contrast near-infrared instrument probing the vicinities of nearby stars through the unique combination of an integral field spectrograph with a Lyot coronagraph and a high-order adaptive optics system. The extraordinary data reduction demands, similar to those that several new exoplanet imaging instruments will face in the near future, have been met by the novel software algorithms described herein. The Project 1640 Data Cube Extraction Pipeline (PCXP) automates the translation of $3.8\times10^4$ closely packed, coarsely sampled spectra to a data cube. We implement a robust empirical model of the spectrograph focal plane geometry to register the detector image at sub-pixel precision, and map the cube extraction. We demonstrate our ability to accurately retrieve source spectra based on an observation of Saturn's moon Titan.
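An empirical focal-plane geometry model of the kind mentioned above can be sketched as a low-order 2-D polynomial fit mapping lenslet indices to detector coordinates, evaluated at sub-pixel precision. This is an illustrative toy, not the PCXP implementation; the grid geometry and coefficients are invented:

```python
import numpy as np

# Fit a 2-D polynomial mapping lenslet indices (i, j) to detector
# coordinates (x, y) from "measured" spot centroids, then evaluate it
# to locate any lenslet's spectrum at sub-pixel precision.
i, j = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
i, j = i.ravel().astype(float), j.ravel().astype(float)

# Simulated centroids from a rotated, scaled grid with mild distortion.
x_meas = 100.0 + 8.1 * i - 0.9 * j + 0.002 * i * j
y_meas = 50.0 + 1.1 * i + 7.9 * j - 0.001 * i ** 2

# Design matrix of 2-D polynomial terms up to second order.
B = np.stack([np.ones_like(i), i, j, i * j, i ** 2, j ** 2], axis=1)
cx, *_ = np.linalg.lstsq(B, x_meas, rcond=None)
cy, *_ = np.linalg.lstsq(B, y_meas, rcond=None)

def spot(ii, jj):
    """Predicted detector position of lenslet (ii, jj)."""
    b = np.array([1.0, ii, jj, ii * jj, ii ** 2, jj ** 2])
    return b @ cx, b @ cy

x0, y0 = spot(3.0, 4.0)
```

In practice the polynomial order and the centroiding of calibration spots would be tuned to the instrument; the least-squares fit itself is the registration step.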
We present the data reduction pipeline, MEAD, for the Arizona Lenslets for Exoplanet Spectroscopy (ALES), the first thermal infrared integral field spectrograph designed for high-contrast imaging. ALES is an upgrade of LMIRCam, the $1-5\,\mu$m imaging camera for the Large Binocular Telescope, capable of observing astronomical objects in the thermal infrared ($3-5\,\mu$m) to produce simultaneous spatial and spectral data cubes. The pipeline is currently designed to perform $L$-band ($2.8-4.2\,\mu$m) data cube reconstruction, relying on methods used extensively by current near-infrared integral field spectrographs. ALES data cube reconstruction uses an optimal extraction method for each spectrum. The calibration unit comprises a thermal infrared source, a monochromator, and an optical diffuser designed to inject specific wavelengths of light into LBTI to evenly illuminate the pupil plane and the ALES lenslet array with monochromatic light. Not only does the calibration unit facilitate wavelength calibration for ALES and LBTI, but it also provides images of monochromatic point spread functions (PSFs). A linear combination of these monochromatic PSFs can be optimized to fit each spectrum in the least-squares sense via $\chi^2$ fitting.
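The optimal extraction mentioned above follows the general Horne-style recipe: weight each pixel by profile over variance so noisy pixels contribute less, giving the minimum-variance flux estimate for a channel. A hedged toy version, with an invented Gaussian profile and variance, not the MEAD code:

```python
import numpy as np

# Horne-style optimal extraction for one spectral channel.
npix = 21
prof = np.exp(-0.5 * ((np.arange(npix) - 10) / 2.0) ** 2)
prof /= prof.sum()            # unit-normalized spatial profile P

flux_true = 500.0
var = np.full(npix, 4.0)      # per-pixel variance V (read noise, etc.)
data = flux_true * prof       # noiseless column D for the demo

# Optimal estimate: f = sum(P*D/V) / sum(P^2/V), with variance
# 1 / sum(P^2/V) -- each pixel weighted by its profile-to-noise ratio.
f = np.sum(prof * data / var) / np.sum(prof ** 2 / var)
f_err = np.sqrt(1.0 / np.sum(prof ** 2 / var))
```

With noiseless data the estimator returns `flux_true` exactly; with noisy data it beats a simple sum because low-signal wing pixels are down-weighted.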
A fully autonomous data reduction pipeline has been developed for FRODOSpec, an optical fibre-fed integral field spectrograph currently in use at the Liverpool Telescope. This paper details the process required for the reduction of data taken using a n integral field spectrograph and presents an overview of the computational methods implemented to create the pipeline. Analysis of errors and possible future enhancements are also discussed.
OSIRIS is a near-infrared (1.0--2.4 $\mu$m) integral field spectrograph operating behind the adaptive optics system at Keck Observatory, and is one of the first lenslet-based integral field spectrographs. Since its commissioning in 2005, it has been a productive instrument, producing nearly half the laser guide star adaptive optics (LGS AO) papers on Keck. The complexity of its raw data format necessitated a custom data reduction pipeline (DRP) delivered with the instrument in order to iteratively assign flux in overlapping spectra to the proper spatial and spectral locations in a data cube. Other than bug fixes and updates required for hardware upgrades, the bulk of the DRP has not been updated since initial instrument commissioning. We report on the first major comprehensive characterization of the DRP using on-sky and calibration data. We also detail improvements to the DRP including characterization of the flux assignment algorithm; exploration of spatial rippling in the reduced data cubes; and improvements to several calibration files, including the rectification matrix, the bad pixel mask, and the wavelength solution. We present lessons learned from over a decade of OSIRIS data reduction that are relevant to the next generation of integral field spectrograph hardware and data reduction software design.
We describe the new spectroscopic data reduction pipeline for the multi-object MMT/Magellan Infrared Spectrograph. The pipeline is implemented in IDL as a stand-alone package and is publicly available in both stable and development versions.