MUSE, a giant integral field spectrograph, is about to become the newest facility instrument at the VLT. It will see first light in February 2014. Here, we summarize the properties of the instrument as built and outline the functionality of the data reduction system, which transforms the raw data, recorded separately by 4k CCDs in the 24 IFUs, into a fully calibrated, scientifically usable data cube. We then describe recent work on the geometrical calibration of the instrument and on testing of the processing pipeline, before concluding with results of the Preliminary Acceptance in Europe and an outlook to the on-sky commissioning.
SOXS is a dual-arm spectrograph (UV-VIS & NIR) and acquisition camera (AC) due to be mounted on the ESO 3.6 m NTT in La Silla. Designed to simultaneously cover the optical and NIR wavelength range from 350 to 2050 nm, the instrument will be dedicated to the study of transient and variable events, with many Target of Opportunity requests expected. The goal of the SOXS data reduction pipeline is to use calibration data to remove all instrument signatures from the SOXS scientific data frames for each of the supported instrument modes, convert these data into physical units and deliver them, with their associated error bars, to the ESO SAF as Phase 3 compliant science data products, all within 30 minutes. The primary reduced product will be a detrended, wavelength- and flux-calibrated, telluric-corrected 1D spectrum with the UV-VIS and NIR arms stitched together. The pipeline will also generate QC metrics to monitor telescope, instrument and detector health. The pipeline is written in Python 3 and has been built with an agile development philosophy that includes adaptive planning and evolutionary development. It is to be used by the SOXS consortium and by the general user community that may want to perform tailored processing of SOXS data. Test-driven development has been used throughout the build, using 'extreme' mock data. We aim for the pipeline to be easy to install and extensively and clearly documented.
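The arm-stitching step can be sketched as follows. This is illustrative only, not the actual SOXS code: the function name `stitch_arms` and the inverse-variance weighting scheme are assumptions. It resamples two wavelength- and flux-calibrated 1D spectra onto a common grid and averages them with inverse-variance weights where the arms overlap, so the propagated error bars shrink accordingly:

```python
import numpy as np

def stitch_arms(wave_a, flux_a, err_a, wave_b, flux_b, err_b, grid):
    """Stitch two calibrated 1D spectra (e.g. UV-VIS and NIR arms) onto a
    common wavelength grid, inverse-variance weighting them where they
    overlap.  Assumes every grid point is covered by at least one arm."""
    def on_grid(wave, values):
        # NaN outside an arm's wavelength coverage so it drops out below
        return np.interp(grid, wave, values, left=np.nan, right=np.nan)

    f_a, e_a = on_grid(wave_a, flux_a), on_grid(wave_a, err_a)
    f_b, e_b = on_grid(wave_b, flux_b), on_grid(wave_b, err_b)
    # Weight 1/sigma^2 where an arm covers the grid point, 0 elsewhere
    w_a = np.where(np.isnan(f_a), 0.0, 1.0 / e_a**2)
    w_b = np.where(np.isnan(f_b), 0.0, 1.0 / e_b**2)
    flux = (w_a * np.nan_to_num(f_a) + w_b * np.nan_to_num(f_b)) / (w_a + w_b)
    err = 1.0 / np.sqrt(w_a + w_b)   # propagated error of the weighted mean
    return flux, err
```

In the overlap region this reduces to the usual weighted mean, and outside it each arm contributes alone.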
Processing of raw data from modern astronomical instruments is nowadays often carried out using dedicated software, so-called pipelines, which are largely run in automated operation. In this paper we describe the data reduction pipeline of the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph operated at ESO's Paranal Observatory. This spectrograph is a complex machine: it records data of 1152 separate spatial elements on the detectors of its 24 integral field units. Efficiently handling such data requires sophisticated software with a high degree of automation and parallelization. We describe in detail the algorithms of all processing steps that operate on calibrations and science data, and explain how the raw science data are transformed into calibrated datacubes. We finally check the quality of selected procedures and output data products, and demonstrate that the pipeline provides datacubes ready for scientific analysis.
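The final cube-reconstruction step can be illustrated with a crude stand-in. The MUSE pipeline itself works on pixel tables and uses more sophisticated (drizzle-like) resampling, but a simple average of all calibrated samples falling into each cell of a regular (λ, y, x) grid conveys the idea; `make_cube` is a hypothetical helper, not a pipeline recipe:

```python
import numpy as np

def make_cube(x, y, lam, flux, bins):
    """Resample pixel-table samples (x, y, lambda, flux) onto a regular
    3D grid by averaging every sample that lands in a cell.  Cells with
    no samples become NaN.  Axis order of the cube is (lambda, y, x)."""
    sample = np.column_stack([lam, y, x])
    total, _ = np.histogramdd(sample, bins=bins, weights=flux)  # summed flux
    count, _ = np.histogramdd(sample, bins=bins)                # samples/cell
    with np.errstate(divide="ignore", invalid="ignore"):
        cube = total / count          # mean flux per cell; 0/0 -> NaN
    return cube
```

A real implementation would also carry a variance cube and spread each sample over neighbouring cells with interpolation weights rather than nearest-cell binning.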
The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years, and this paper presents its current capabilities and planned development.
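For readers unfamiliar with driving MIRIAD from Python, a minimal sketch of composing a MIRIAD-style `task key=value` command line is given below. This is illustrative only: CADRE's actual wrappers call MIRIAD subroutines directly rather than shelling out, and the task keywords shown are just examples of MIRIAD's calling convention:

```python
import shlex
import subprocess

def miriad_task(task, run=False, **keywords):
    """Compose a MIRIAD-style command line (``task key=value ...``) and
    optionally execute it.  Running requires MIRIAD tasks on $PATH."""
    argv = [task] + [f"{key}={value}" for key, value in keywords.items()]
    if run:
        subprocess.run(argv, check=True)  # raise if the task fails
    return " ".join(shlex.quote(arg) for arg in argv)
```

Wrapping subroutines directly, as CADRE does, avoids the overhead and fragile string-parsing of one subprocess per task.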
Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.
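The parameter-file-driven, staged design can be sketched as follows. The keys, stage names, and on/off convention here are hypothetical — the real e-MERLIN inputs file has its own schema — but the pattern of parsing a plain-text file and dispatching only the requested stages is the same:

```python
STAGES = ["load", "average", "flag", "calibrate", "image"]

def parse_inputs(text):
    """Parse a plain-text parameter file of ``key = value`` lines.
    ``#`` starts a comment; values are kept as strings."""
    params = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments and blanks
        if not line:
            continue
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return params

def run_pipeline(params):
    """Run only the stages the user switched on (``do_<stage> = 1``)."""
    done = []
    for stage in STAGES:
        if params.get(f"do_{stage}", "0") == "1":
            done.append(stage)  # a real stage would drive AIPS via ParselTongue
    return done
```

Keeping the stage list ordered lets a user re-run, say, only calibration and imaging after inspecting the diagnostic plots, without reloading the raw data.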
We present the data reduction procedures used by the GALAH survey, carried out with the HERMES fibre-fed, multi-object spectrograph on the 3.9~m Anglo-Australian Telescope. GALAH is a unique survey, targeting 1 million stars brighter than magnitude V=14 at a resolution of 28,000 with the goal of measuring the abundances of 29 elements. Such a large number of high-resolution spectra necessitates a reduction pipeline optimized for speed, accuracy, and consistency. We outline the design and structure of the IRAF-based reduction pipeline that we developed specifically for GALAH to produce fully calibrated spectra intended for subsequent stellar atmospheric parameter estimation. The pipeline takes advantage of existing IRAF routines and other readily available software so as to be simple to maintain, testable and reliable. A radial velocity and stellar atmospheric parameter estimator code is also presented; it is used for further data analysis and yields a useful verification of the reduction quality. We have used this estimator to quantify the data quality of GALAH in terms of fibre cross-talk level ($\lesssim 0.5$%), scattered light ($\sim 5$ counts in a typical 20-minute exposure), resolution across the field, sky spectrum properties, wavelength solution reliability (better than $1\,\mathrm{km\,s^{-1}}$ accuracy) and radial velocity precision.
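The radial velocity measurement underlying the quoted precision can be illustrated with the standard cross-correlation technique on a log-wavelength grid, where a constant pixel lag corresponds to a constant Doppler shift. This is a textbook sketch, not the GALAH estimator code:

```python
import numpy as np

C_KMS = 299_792.458  # speed of light in km/s

def radial_velocity(loglam, spec, template):
    """Estimate a radial velocity by cross-correlating an observed
    spectrum with a template, both sampled on the same uniform
    log10-wavelength grid ``loglam``.  Integer-pixel precision only;
    a real estimator would fit the correlation peak."""
    s = spec - spec.mean()            # remove the continuum level so the
    t = template - template.mean()    # correlation is driven by the lines
    cc = np.correlate(s, t, mode="full")
    shift = cc.argmax() - (len(t) - 1)      # best pixel lag
    dloglam = loglam[1] - loglam[0]         # grid step in log10(wavelength)
    return (10.0 ** (shift * dloglam) - 1.0) * C_KMS
```

With a typical step of $10^{-4}$ in log10 wavelength, one pixel corresponds to roughly 69 km/s, which is why sub-pixel peak fitting is needed in practice to reach km/s-level precision.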