
The third version of the AMBER data reduction software

Published by: Fabien Malbet
Publication date: 2010
Research field: Physics
Paper language: English





We present the third release of the AMBER data reduction software by the JMMC. This software is based on core algorithms optimized after several years of operation. An optional graphical interface in a high-level language allows the user to control the process step by step or run it completely automatically. An ongoing improvement is the implementation of a robust calibration scheme that makes use of the full calibration sets available during the night. The output products are standard OI-FITS files, which can be used directly in high-level software such as model-fitting or image-reconstruction tools. The software's performance is illustrated on a full data set of calibrators observed with AMBER over 5 years in various instrumental setups.
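Because the products are standard OI-FITS files, they can be inspected with generic FITS tools before being fed to model-fitting or imaging software. A minimal sketch using astropy: the file name below is hypothetical, while OI_VIS2, VIS2DATA and VIS2ERR are standard OI-FITS extension and column names.

    # Minimal sketch: read squared visibilities from an OI-FITS product.
    # "AMBER_calibrated.oifits" is a hypothetical file name; OI_VIS2,
    # VIS2DATA and VIS2ERR are defined by the OI-FITS standard.
    from astropy.io import fits

    with fits.open("AMBER_calibrated.oifits") as hdul:
        vis2 = hdul["OI_VIS2"].data            # one row per baseline/exposure
        for row in vis2:
            print(row["VIS2DATA"], row["VIS2ERR"])  # squared visibility, error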




Read also

F. Schuller (2012)
Together with the development of the Large APEX Bolometer Camera (LABOCA) for the Atacama Pathfinder Experiment (APEX), a new data reduction package, BoA, has been written. This software naturally interfaces with the telescope control system, and provides all functionalities for the reduction, analysis and visualization of bolometer data. It is used at APEX for real-time processing of observations performed with LABOCA and other bolometer arrays, providing feedback to the observer. Written in an easy-to-script language, BoA is also used offline to reduce APEX continuum data. In this paper, the general structure of this software is presented, and its online and offline capabilities are described.
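A central step in bolometer reduction packages of this kind is removing the atmospheric signal that is correlated across the whole array. The numpy sketch below shows median common-mode subtraction as an illustration of that technique; it is not BoA's actual implementation, and the data are synthetic.

    import numpy as np

    def remove_correlated_noise(timestreams):
        """Subtract the per-sample median over all channels.

        timestreams: array of shape (n_channels, n_samples). The median
        traces the atmospheric signal common to the whole array and is
        removed from every channel.
        """
        common_mode = np.median(timestreams, axis=0)
        return timestreams - common_mode[np.newaxis, :]

    # Illustrative fake data: 100 channels, 1000 samples, with a shared
    # "sky" drift plus independent white noise per channel.
    rng = np.random.default_rng(0)
    sky = np.cumsum(rng.normal(0.0, 0.1, 1000))
    data = sky + rng.normal(0.0, 1.0, (100, 1000))
    cleaned = remove_correlated_noise(data)
    print(cleaned.std(), data.std())  # cleaned scatter should be smaller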
The Polarimetric and Helioseismic Imager (PHI) is the first deep-space solar spectropolarimeter, on board the Solar Orbiter (SO) space mission. It faces stringent requirements on science data accuracy, a dynamic environment, and severe limitations on telemetry volume. SO/PHI overcomes these restrictions through on-board instrument calibration and science data reduction, using dedicated firmware in FPGAs. This contribution analyses the accuracy of a data processing pipeline by comparing the results obtained with SO/PHI hardware to a reference from a ground computer. The results show that for the analysed pipeline the error introduced by the firmware implementation is well below the requirements of SO/PHI.
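The comparison described here amounts to running the same processing step in the firmware's reduced-precision arithmetic and in a double-precision ground reference, then bounding the difference. A generic sketch of that methodology (not the SO/PHI code; the scale factor and the dark/flat step are illustrative assumptions):

    import numpy as np

    SCALE = 2 ** 12  # hypothetical on-board fixed-point scale factor

    def correct_float(raw, dark, flat):
        """Double-precision ground reference: dark and flat correction."""
        return (raw - dark) / flat

    def correct_fixed(raw, dark, flat):
        """Emulated fixed-point version of the same step."""
        num = np.round((raw - dark) * SCALE).astype(np.int64)
        den = np.round(flat * SCALE).astype(np.int64)
        return (num * SCALE // den) / SCALE  # back to float for comparison

    rng = np.random.default_rng(1)
    raw = rng.uniform(100.0, 4000.0, (256, 256))
    dark = rng.uniform(90.0, 110.0, (256, 256))
    flat = rng.uniform(0.9, 1.1, (256, 256))

    err = np.abs(correct_fixed(raw, dark, flat) - correct_float(raw, dark, flat))
    print("max abs error:", err.max())  # must stay below the requirement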
SOXS is a dual-arm spectrograph (UV-VIS & NIR) and acquisition camera (AC) due to be mounted on the ESO 3.6m NTT in La Silla. Designed to simultaneously cover the optical and NIR wavelength range from 350-2050 nm, the instrument will be dedicated to the study of transient and variable events, with many Target of Opportunity requests expected. The goal of the SOXS data reduction pipeline is to use calibration data to remove all instrument signatures from the SOXS scientific data frames for each of the supported instrument modes, convert the data into physical units, and deliver them with their associated error bars to the ESO SAF as Phase 3 compliant science data products, all within 30 minutes. The primary reduced product will be a detrended, wavelength- and flux-calibrated, telluric-corrected 1D spectrum with the UV-VIS and NIR arms stitched together. The pipeline will also generate QC metrics to monitor telescope, instrument and detector health. The pipeline is written in Python 3 and has been built with an agile development philosophy that includes adaptive planning and evolutionary development. The pipeline is to be used by the SOXS consortium and the general user community that may want to perform tailored processing of SOXS data. Test-driven development has been used throughout the build, using 'extreme' mock data. We aim for the pipeline to be easy to install and extensively and clearly documented.
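The test-driven workflow with mock data can be pictured with a minimal pytest-style sketch. Everything here is a hypothetical stand-in, not the SOXS pipeline API: detrend_frame is an invented helper, and the frames are synthetic.

    import numpy as np

    def detrend_frame(science, bias, flat):
        """Hypothetical reduction step: bias-subtract and flat-field."""
        return (science - bias) / flat

    def test_detrend_recovers_flat_sky():
        # 'Extreme' mock data: a perfectly uniform sky pushed through a
        # known bias and flat must come out exactly uniform again.
        bias = np.full((64, 64), 200.0)
        flat = np.linspace(0.8, 1.2, 64 * 64).reshape(64, 64)
        science = bias + 1000.0 * flat
        reduced = detrend_frame(science, bias, flat)
        assert np.allclose(reduced, 1000.0)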
We present the data reduction procedures being used by the GALAH survey, carried out with the HERMES fibre-fed, multi-object spectrograph on the 3.9 m Anglo-Australian Telescope. GALAH is a unique survey, targeting 1 million stars brighter than magnitude V=14 at a resolution of 28,000, with a goal to measure the abundances of 29 elements. Such a large number of high-resolution spectra necessitates the development of a reduction pipeline optimized for speed, accuracy, and consistency. We outline the design and structure of the IRAF-based reduction pipeline that we developed specifically for GALAH to produce fully calibrated spectra intended for subsequent stellar atmospheric parameter estimation. The pipeline takes advantage of existing IRAF routines and other readily available software so as to be simple to maintain, testable and reliable. A radial velocity and stellar atmospheric parameter estimator code is also presented, which is used for further data analysis and yields a useful verification of the reduction quality. We have used this estimator to quantify the data quality of GALAH in terms of fibre cross-talk level ($\lesssim 0.5\%$), scattered light ($\sim 5$ counts in a typical 20-minute exposure), resolution across the field, sky spectrum properties, wavelength solution reliability (better than $1\,\mathrm{km\,s^{-1}}$ accuracy) and radial velocity precision.
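Radial-velocity checks of this kind are conventionally done by cross-correlating an observed spectrum with a template on a log-uniform wavelength grid, where one pixel corresponds to a constant velocity step. A minimal numpy sketch of that technique (illustrative, not the GALAH estimator itself):

    import numpy as np

    C_KMS = 299792.458  # speed of light in km/s

    def rv_by_cross_correlation(wave, flux, template_flux, max_shift=50):
        """Estimate RV from the pixel shift maximizing the cross-correlation.

        Assumes wave is log-uniform, so one pixel = one fixed velocity step.
        """
        f = flux - flux.mean()
        t = template_flux - template_flux.mean()
        shifts = np.arange(-max_shift, max_shift + 1)
        ccf = [np.sum(f * np.roll(t, s)) for s in shifts]
        best = shifts[int(np.argmax(ccf))]
        dlnlam = np.log(wave[1] / wave[0])  # per-pixel velocity step / c
        return best * dlnlam * C_KMS

    # Synthetic check: a Gaussian absorption line shifted by 3 pixels,
    # about 10.9 km/s on this grid.
    wave = np.exp(np.linspace(np.log(4000.0), np.log(4100.0), 2048))
    line = 1.0 - 0.5 * np.exp(-0.5 * ((np.arange(2048) - 1024) / 3.0) ** 2)
    obs = np.roll(line, 3)
    print(rv_by_cross_correlation(wave, obs, line))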
D. N. Friedel (2013)
The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral-line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years, and this paper presents its current capabilities and planned development.
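The "runs automatically as data arrive" architecture is essentially an archive watcher that triggers the reduction on each new dataset. A hypothetical sketch of that trigger loop, assuming an archive directory and dataset suffix that are invented for illustration (this is not CADRE's code):

    import time
    from pathlib import Path

    ARCHIVE = Path("/data/carma/archive")  # hypothetical archive location
    done = set()

    def reduce(dataset):
        """Placeholder for the reduction itself (calibration, imaging,
        export to MIRIAD and FITS); hypothetical, not CADRE's code."""
        print("reducing", dataset.name)

    while True:
        for dataset in sorted(ARCHIVE.glob("*.mir")):  # hypothetical suffix
            if dataset.name not in done:
                reduce(dataset)
                done.add(dataset.name)
        time.sleep(60)  # poll the archive once a minute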