
The pipeline for the ExoMars DREAMS scientific data archiving

Posted by: Pietro Schipani
Publication date: 2017
Research field: Physics
Paper language: English





DREAMS (Dust Characterisation, Risk Assessment, and Environment Analyser on the Martian Surface) is a payload accommodated on the Schiaparelli Entry and Descent Module (EDM) of ExoMars 2016, the ESA and Roscosmos mission to Mars (Esposito (2015), Bettanini et al. (2014)). It is a meteorological station with the additional capability to perform measurements of the atmospheric electric fields close to the surface of Mars. The instrument package will make the first measurements of electric fields on Mars, providing data that will be of value in planning the second ExoMars mission in 2020, as well as possible future human missions to the red planet. This paper describes the pipeline to convert the raw telemetry to the final data products for the archive, with associated metadata.
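The conversion from raw telemetry to archive products described above can be sketched in broad strokes. The record layout, field names, and calibration constants below are invented for illustration and do not reflect the actual DREAMS telemetry format or calibration; this is only a minimal sketch of the decode-calibrate-package pattern such a pipeline follows.

```python
import struct

# Hypothetical fixed-size telemetry record:
# timestamp (uint32, seconds), raw ADC counts (int16), sensor id (uint16)
RECORD = struct.Struct(">IhH")

def decode_record(raw: bytes) -> dict:
    """Unpack one telemetry record into named engineering fields."""
    timestamp, counts, sensor_id = RECORD.unpack(raw)
    return {"t": timestamp, "counts": counts, "sensor": sensor_id}

def calibrate(record: dict, gain: float = 0.05, offset: float = -1.0) -> dict:
    """Convert raw counts to a physical value (placeholder linear law)."""
    value = gain * record["counts"] + offset
    return {**record, "value": round(value, 3)}

def to_product(records: list[dict]) -> dict:
    """Bundle calibrated records with archive-style metadata."""
    return {
        "metadata": {
            "instrument": "DREAMS",
            "n_records": len(records),
            "start_time": records[0]["t"],
            "stop_time": records[-1]["t"],
        },
        "data": records,
    }
```

A real archiving pipeline would additionally validate packets, track calibration versions, and emit the metadata in the archive's mandated format rather than a plain dictionary.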


Read also

The Palomar Transient Factory (PTF) is a multi-epochal robotic survey of the northern sky that acquires data for the scientific study of transient and variable astrophysical phenomena. The camera and telescope provide for wide-field imaging in optical bands. In the five years of operation since first light on December 13, 2008, images taken with Mould-R and SDSS-g camera filters have been routinely acquired on a nightly basis (weather permitting), and two different H-alpha filters were installed in May 2011 (656 nm and 663 nm). The PTF image-processing and data-archival program at the Infrared Processing and Analysis Center (IPAC) is tailored to receive and reduce the data, and, from it, generate and preserve astrometrically and photometrically calibrated images, extracted source catalogs, and coadded reference images. Relational databases have been deployed to track these products in operations and the data archive. The fully automated system has benefited from lessons learned from past IPAC projects and comprises advantageous features that are potentially incorporable into other ground-based observatories. Both off-the-shelf and in-house software have been utilized for economy and rapid development. The PTF data archive is curated by the NASA/IPAC Infrared Science Archive (IRSA). A state-of-the-art custom web interface has been deployed for downloading the raw images, processed images, and source catalogs from IRSA. Access to PTF data products is currently limited to an initial public data release (M81, M44, M42, SDSS Stripe 82, and the Kepler Survey Field). It is the intent of the PTF collaboration to release the full PTF data archive when sufficient funding becomes available.
Processing of raw data from modern astronomical instruments is nowadays often carried out using dedicated software, so-called pipelines, which are largely run in automated operation. In this paper we describe the data reduction pipeline of the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph operated at ESO's Paranal observatory. This spectrograph is a complex machine: it records data of 1152 separate spatial elements on detectors in its 24 integral field units. Efficiently handling such data requires sophisticated software, a high degree of automation and parallelization. We describe the algorithms of all processing steps that operate on calibrations and science data in detail, and explain how the raw science data is transformed into calibrated datacubes. We finally check the quality of selected procedures and output data products, and demonstrate that the pipeline provides datacubes ready for scientific analysis.
SOXS is a dual-arm spectrograph (UV-VIS & NIR) and acquisition camera due to be mounted on the ESO 3.6m NTT in La Silla. Designed to simultaneously cover the optical and NIR wavelength range from 350-2050 nm, the instrument will be dedicated to the study of transient and variable events, with many Target of Opportunity requests expected. The goal of the SOXS Data Reduction pipeline is to use calibration data to remove all instrument signatures from the SOXS scientific data frames for each of the supported instrument modes, convert this data into physical units and deliver them with their associated error bars to the ESO SAF as Phase 3 compliant science data products, all within 30 minutes. The primary reduced product will be a detrended, wavelength and flux calibrated, telluric corrected 1D spectrum with UV-VIS + NIR arms stitched together. The pipeline will also generate QC metrics to monitor telescope, instrument and detector health. The pipeline is written in Python 3 and has been built with an agile development philosophy that includes adaptive planning and evolutionary development. The pipeline is to be used by the SOXS consortium and the general user community that may want to perform tailored processing of SOXS data. Test driven development has been used throughout the build using 'extreme' mock data. We aim for the pipeline to be easy to install and extensively and clearly documented.
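The final SOXS product mentioned above is a single 1D spectrum with the UV-VIS and NIR arms stitched together. A minimal sketch of that last stitching step, in Python as the abstract says the pipeline is, might look like the following; the merge rule (simple averaging at overlapping wavelengths) and the data shapes are assumptions for illustration, not the pipeline's actual algorithm.

```python
# Hypothetical stitching of two calibrated (wavelength_nm, flux) spectra.
# Where both arms sample the same wavelength, their fluxes are averaged;
# elsewhere the single available measurement is kept.

def stitch_arms(uvvis: list[tuple[float, float]],
                nir: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Merge two 1D spectra into one, sorted by wavelength."""
    fluxes: dict[float, list[float]] = {}
    for wavelength, flux in uvvis + nir:
        fluxes.setdefault(wavelength, []).append(flux)
    return sorted((w, sum(f) / len(f)) for w, f in fluxes.items())
```

A production pipeline would instead resample both arms onto a common wavelength grid and weight the combination by the per-pixel error bars it carries through the reduction.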
We present the data reduction pipeline for the Hi-GAL survey. Hi-GAL is a key project of the Herschel satellite which is mapping the inner part of the Galactic plane (|l| <= 70° and |b| <= 1°), using 2 PACS and 3 SPIRE frequency bands, from 70 μm to 500 μm. Our pipeline relies only partially on the Herschel Interactive Standard Environment (HIPE) and features several newly developed routines to perform data reduction, including accurate data culling, noise estimation and minimum variance map-making, the latter performed with the ROMAGAL algorithm, a deep modification of the ROMA code already tested on cosmological surveys. We discuss in depth the properties of the Hi-GAL Science Demonstration Phase (SDP) data.
The HIFI data processing pipeline was developed to systematically process diagnostic, calibration and astronomical observations taken with the HIFI science instrument as part of the Herschel mission. The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment, as well as within an interactive environment. A common software framework was developed to best support the use cases required by the instrument teams and by the general astronomers. The HIFI pipeline was built on top of that and was designed with a high degree of modularity. This modular design provided the necessary flexibility and extensibility to deal with the complexity of batch-processing eighteen different observing modes, to support the astronomers in the interactive analysis and to cope with adjustments necessary to improve the pipeline and the quality of the end-products. This approach to the software development and data processing effort was arrived at by coalescing the lessons learned from similar research-based projects with the understanding that a degree of foresight was required given the overall length of the project. In this article, both the successes and challenges of the HIFI software development process are presented. To support future similar projects and retain the experience gained, lessons learned are extracted.