
Data reduction for the MATISSE instrument

Published by: Florentin Millour
Publication date: 2016
Research field: Physics
Paper language: English
Author: Florentin Millour





We present in this paper the general formalism and data processing steps used in the MATISSE data reduction software, as developed by the MATISSE consortium. MATISSE is the new-generation mid-infrared interferometric instrument of the Very Large Telescope Interferometer (VLTI). It is a 2-in-1 instrument with two cryostats and two detectors: a 2k x 2k Rockwell Hawaii-2RG detector for the L and M bands, and a 1k x 1k Raytheon Aquarius detector for the N band, both read at high frame rates of up to 30 frames per second. MATISSE is currently undergoing its first laboratory tests.
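The abstract does not spell out the formalism itself. As a purely illustrative sketch of one generic step common in optical interferometry data reduction (an assumption for illustration, not MATISSE's documented algorithm), the spatial frequency of the fringes in a detector frame can be located from the row-averaged power spectrum:

```python
import numpy as np

def fringe_power_spectrum(frame):
    """Average 1-D power spectrum along the spatial (fringe) axis of a frame."""
    # Subtract the mean of each row to remove the continuum (DC) level.
    rows = frame - frame.mean(axis=1, keepdims=True)
    spectrum = np.abs(np.fft.rfft(rows, axis=1)) ** 2
    return spectrum.mean(axis=0)

# Simulate a 64 x 256 frame with fringes at a known spatial frequency
# (illustrative values only, not MATISSE detector parameters).
npix, freq = 256, 20  # 20 fringe cycles across the window
x = np.arange(npix)
row = 1.0 + 0.3 * np.cos(2 * np.pi * freq * x / npix)
frame = np.tile(row, (64, 1)) + 0.05 * np.random.default_rng(0).normal(size=(64, npix))

spec = fringe_power_spectrum(frame)
peak = np.argmax(spec[1:]) + 1  # skip the residual DC bin
print(peak)  # → 20 (the injected fringe frequency is recovered)
```

Averaging the power spectra over many detector rows and frames is what makes the fringe peak stand out against photon and readout noise at high frame rates.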




Read also

The Polarimetric and Helioseismic Imager (PHI) is the first deep-space solar spectropolarimeter, on board the Solar Orbiter (SO) space mission. It faces stringent requirements on science data accuracy, a dynamic environment, and severe limitations on telemetry volume. SO/PHI overcomes these restrictions through on-board instrument calibration and science data reduction, using dedicated firmware in FPGAs. This contribution analyses the accuracy of a data processing pipeline by comparing the results obtained with SO/PHI hardware to a reference from a ground computer. The results show that, for the analysed pipeline, the error introduced by the firmware implementation is well below the requirements of SO/PHI.
We present the data reduction pipeline for the Hi-GAL survey. Hi-GAL is a key project of the Herschel satellite which is mapping the inner part of the Galactic plane (|l| ≤ 70° and |b| ≤ 1°), using two PACS and three SPIRE frequency bands, from 70 µm to 500 µm. Our pipeline relies only partially on the Herschel Interactive Standard Environment (HIPE) and features several newly developed routines to perform data reduction, including accurate data culling, noise estimation and minimum variance map-making, the latter performed with the ROMAGAL algorithm, a deep modification of the ROMA code already tested on cosmological surveys. We discuss in depth the properties of the Hi-GAL Science Demonstration Phase (SDP) data.
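The minimum-variance map-making mentioned above can be illustrated with a toy sketch (an assumption for illustration only; the actual ROMAGAL code is a generalized least-squares solver that also models correlated detector noise, omitted here). In its simplest form, time-ordered samples are binned into sky pixels with inverse-variance weights:

```python
import numpy as np

def minimum_variance_map(samples, pixels, sigma, npix):
    """Inverse-variance (noise-weighted) binning of time-ordered samples.

    Toy version of the minimum-variance map-making principle for
    uncorrelated noise; real GLS map-makers such as ROMAGAL also
    handle 1/f noise correlations between samples.
    """
    w = 1.0 / sigma**2  # weight of each sample
    num = np.bincount(pixels, weights=w * samples, minlength=npix)
    den = np.bincount(pixels, weights=w, minlength=npix)
    return num / den

# Two passes over the same 3-pixel sky, with symmetric measurement errors.
pixels = np.array([0, 1, 2, 0, 1, 2])          # pixel hit by each sample
samples = np.array([1.1, 2.1, 3.1, 0.9, 1.9, 2.9])
sigma = np.full(6, 0.1)                        # per-sample noise rms
sky_map = minimum_variance_map(samples, pixels, sigma, npix=3)
print(sky_map)  # → [1. 2. 3.]
```

With equal weights this reduces to plain averaging per pixel; with heterogeneous noise levels, noisier samples contribute proportionally less to each map pixel.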
Processing of raw data from modern astronomical instruments is nowadays often carried out using dedicated software, so-called pipelines, which are largely run in automated operation. In this paper we describe the data reduction pipeline of the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph operated at ESO's Paranal observatory. This spectrograph is a complex machine: it records data of 1152 separate spatial elements on detectors in its 24 integral field units. Efficiently handling such data requires sophisticated software, a high degree of automation and parallelization. We describe the algorithms of all processing steps that operate on calibrations and science data in detail, and explain how the raw science data gets transformed into calibrated datacubes. We finally check the quality of selected procedures and output data products, and demonstrate that the pipeline provides datacubes ready for scientific analysis.
Denis V. Shulyak (2010)
Chemically Peculiar (CP) stars have been the subject of systematic research for more than 50 years. With the discovery of pulsation in some of the cool CP stars, and the availability of advanced spectropolarimetric instrumentation and high signal-to-noise, high-resolution spectroscopy, a new era of CP star research emerged about 20 years ago. Together with the success of ground-based observations, new space projects are being developed that will greatly benefit future investigations of these unique objects. In this contribution we give an overview of some interesting results obtained recently from ground-based observations, and discuss the upcoming Gaia space mission and its impact on CP star research.
The HIFI data processing pipeline was developed to systematically process diagnostic, calibration and astronomical observations taken with the HIFI science instrument as part of the Herschel mission. The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment, as well as within an interactive environment. A common software framework was developed to best support the use cases required by the instrument teams and by general astronomers. The HIFI pipeline was built on top of that framework and was designed with a high degree of modularity. This modular design provided the necessary flexibility and extensibility to deal with the complexity of batch-processing eighteen different observing modes, to support astronomers in interactive analysis, and to cope with the adjustments necessary to improve the pipeline and the quality of the end products. This approach to the software development and data processing effort was arrived at by coalescing the lessons learned from similar research-based projects with the understanding that a degree of foresight was required given the overall length of the project. In this article, both the successes and challenges of the HIFI software development process are presented, and the lessons learned are extracted to support similar future projects and retain the experience gained.