
The Data Processing Pipeline for the MUSE Instrument

Posted by Peter M. Weilbacher
Publication date: 2020
Research field: Physics
Paper language: English





Processing of raw data from modern astronomical instruments is nowadays often carried out using dedicated software, so-called pipelines, which are largely run in automated operation. In this paper we describe the data reduction pipeline of the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph operated at ESO's Paranal observatory. This spectrograph is a complex machine: it records data of 1152 separate spatial elements on detectors in its 24 integral field units. Efficiently handling such data requires sophisticated software, a high degree of automation, and parallelization. We describe in detail the algorithms of all processing steps that operate on calibrations and science data, and explain how the raw science data are transformed into calibrated datacubes. Finally, we check the quality of selected procedures and output data products, and demonstrate that the pipeline provides datacubes ready for scientific analysis.
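To make the overall flow concrete, the following is a minimal, hypothetical Python sketch of the general pattern the abstract describes, not the actual MUSE pipeline: reduce the 24 IFU exposures in parallel, then resample the combined samples onto a regular datacube grid. All names, shapes, and the toy resampling are invented for illustration.

# Hypothetical sketch of parallel per-IFU reduction followed by cube building.
from concurrent.futures import ProcessPoolExecutor

import numpy as np

N_IFU = 24  # MUSE records data in 24 integral field units

def reduce_ifu(ifu_id: int) -> np.ndarray:
    """Placeholder per-IFU reduction: bias/flat/wavelength calibration
    would happen here; we just return fake (x, y, wavelength) samples."""
    rng = np.random.default_rng(ifu_id)
    return rng.random((48, 3))  # 48 slices per IFU -> 1152 spatial elements total

def make_cube(samples: np.ndarray, shape=(10, 10, 10)) -> np.ndarray:
    """Toy resampling: histogram the irregular samples onto a regular grid."""
    cube, _ = np.histogramdd(samples, bins=shape, range=[(0, 1)] * 3)
    return cube

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # parallelize over IFUs
        tables = list(pool.map(reduce_ifu, range(N_IFU)))
    cube = make_cube(np.vstack(tables))
    print(cube.shape)  # (10, 10, 10): a stand-in for the calibrated datacube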




Read also

The HIFI data processing pipeline was developed to systematically process diagnostic, calibration, and astronomical observations taken with the HIFI science instrument as part of the Herschel mission. The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment, as well as within an interactive environment. A common software framework was developed to best support the use cases required by the instrument teams and by general astronomers. The HIFI pipeline was built on top of that framework and was designed with a high degree of modularity. This modular design provided the necessary flexibility and extensibility to deal with the complexity of batch-processing eighteen different observing modes, to support astronomers in interactive analysis, and to cope with the adjustments necessary to improve the pipeline and the quality of the end products. This approach to the software development and data processing effort was arrived at by coalescing the lessons learned from similar research-based projects with the understanding that a degree of foresight was required given the overall length of the project. In this article, both the successes and challenges of the HIFI software development process are presented. To support future similar projects and to retain the experience gained, lessons learned are extracted.
Context. The HIFI instrument on the Herschel Space Observatory performed over 9100 astronomical observations, almost 900 of which were calibration observations, in the course of the nearly four-year Herschel mission. The data from each observation had to be converted from raw telemetry into calibrated products and were included in the Herschel Science Archive. Aims. The HIFI pipeline was designed to provide robust conversion from raw telemetry into calibrated data throughout all phases of the HIFI mission. Pre-launch laboratory testing was supported, as were routine mission operations. Methods. A modular software design allowed components to be easily added, removed, amended, and/or extended as the understanding of the HIFI data developed during and after mission operations. Results. The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment as well as within an interactive environment. The same software can be used by the general astronomical community to reprocess any standard HIFI observation. The pipeline also recorded the consistency of processing results and provided automated quality reports. Many pipeline modules have been in use since the HIFI pre-launch instrument-level testing. Conclusions. Processing in steps facilitated data analysis to discover and address instrument artefacts and uncertainties. The availability of the same pipeline components from pre-launch throughout the mission made for well-understood, tested, and stable processing. A smooth transition from one phase to the next significantly enhanced processing reliability and robustness.
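Both HIFI abstracts stress modularity: a pipeline assembled from interchangeable steps that can be added, removed, or amended without touching the rest. As a hedged illustration only (the names and interfaces below are invented for this sketch, not HIFI's actual API), such a design can be expressed in a few lines of Python:

# Invented illustration of a modular step-based pipeline design.
from typing import Callable, Dict, List

Step = Callable[[Dict], Dict]

class Pipeline:
    def __init__(self) -> None:
        self.steps: List[Step] = []

    def add(self, step: Step) -> "Pipeline":
        """Steps can be appended, and just as easily removed or swapped."""
        self.steps.append(step)
        return self

    def run(self, data: Dict) -> Dict:
        for step in self.steps:  # each module only sees the shared data product
            data = step(data)
        return data

def flag_glitches(data: Dict) -> Dict:
    data["flags"] = "glitches flagged"
    return data

def calibrate(data: Dict) -> Dict:
    data["calibrated"] = True
    return data

product = Pipeline().add(flag_glitches).add(calibrate).run({"telemetry": "..."})
print(product)

Because each step shares the same interface, the same chain can serve batch processing and interactive reprocessing alike, which is the flexibility the abstracts describe.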
Shifan Zuo, Jixia Li, Yichao Li (2020)
The Tianlai project is a 21cm intensity mapping experiment aimed at detecting dark energy by measuring the baryon acoustic oscillation (BAO) features in the large scale structure power spectrum. This experiment provides an opportunity to test the data processing methods for cosmological 21cm signal extraction, which is still a great challenge in current radio astronomy research. The 21cm signal is much weaker than the foregrounds and easily affected by the imperfections in the instrumental responses. Furthermore, processing the large volumes of interferometer data poses a practical challenge. We have developed a data processing pipeline software called tlpipe to process the drift scan survey data from the Tianlai experiment. It performs offline data processing tasks such as radio frequency interference (RFI) flagging, array calibration, binning, and map-making, etc. It also includes utility functions needed for the data analysis, such as data selection, transformation, visualization and others. A number of new algorithms are implemented, for example the eigenvector decomposition method for array calibration and the Tikhonov regularization for $m$-mode analysis. In this paper we describe the design and implementation of tlpipe and illustrate its functions with some analysis of real data. Finally, we outline directions for future development of this publicly available code.
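The abstract names Tikhonov regularization for the $m$-mode analysis. As a hedged illustration of that technique in its textbook form (not tlpipe's own implementation; the beam matrix and data below are synthetic stand-ins), one can solve the regularized least-squares problem min ||Ax - b||^2 + lam ||x||^2 in closed form:

# Textbook Tikhonov-regularized least squares, for illustration only.
import numpy as np

def tikhonov_solve(A: np.ndarray, b: np.ndarray, lam: float) -> np.ndarray:
    """Regularized solution x = (A^H A + lam I)^{-1} A^H b."""
    n = A.shape[1]
    return np.linalg.solve(A.conj().T @ A + lam * np.eye(n), A.conj().T @ b)

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 20)) + 1j * rng.normal(size=(100, 20))  # beam-matrix stand-in
x_true = rng.normal(size=20)                                      # sky modes
b = A @ x_true + 0.01 * rng.normal(size=100)                      # noisy visibilities
x_hat = tikhonov_solve(A, b, lam=1e-2)
print(np.linalg.norm(x_hat - x_true))  # small recovery error expected

The regularization parameter lam damps poorly constrained modes that plain inversion would amplify, which is why the technique suits ill-conditioned map-making problems.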
We present the data processing pipeline to generate calibrated data products from the Spectral and Photometric Imaging Receiver (SPIRE) imaging Fourier Transform Spectrometer on the Herschel Space Observatory. The pipeline processes telemetry from SPIRE observations and produces calibrated spectra for all resolution modes. The spectrometer pipeline shares some elements with the SPIRE photometer pipeline, including the conversion of telemetry packets into data timelines and the calculation of bolometer voltages. We present the following fundamental processing steps unique to the spectrometer: temporal and spatial interpolation of the scan mechanism and detector data to create interferograms; Fourier transformation; apodization; and creation of a data cube. We also describe the corrections for various instrumental effects including first- and second-level glitch identification and removal, correction of the effects due to emission from the Herschel telescope and from within the spectrometer instrument, interferogram baseline correction, temporal and spatial phase correction, non-linear response of the bolometers, and variation of instrument performance across the focal plane arrays. Astronomical calibration is based on combinations of observations of standard astronomical sources and regions of space known to contain minimal emission.
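As a hedged illustration of two of the steps named above, apodization and Fourier transformation, the following toy Python sketch tapers a synthetic interferogram with a Hann window and transforms it into a spectrum; the actual SPIRE pipeline's apodization functions and data structures differ:

# Toy FTS step: apodize an interferogram, then Fourier-transform to a spectrum.
import numpy as np

n = 1024
opd = np.linspace(-1.0, 1.0, n)               # optical path difference [cm]
interferogram = np.cos(2 * np.pi * 30 * opd)  # single line at 30 cm^-1
interferogram += 0.05 * np.random.default_rng(1).normal(size=n)  # detector noise

apodized = interferogram * np.hanning(n)      # taper to suppress spectral ringing
spectrum = np.abs(np.fft.rfft(apodized))
wavenumber = np.fft.rfftfreq(n, d=opd[1] - opd[0])  # cm^-1 axis

print(wavenumber[np.argmax(spectrum)])  # ~30, recovering the injected line

Apodization trades a slightly broadened line profile for strongly suppressed sidelobes, which is the standard motivation for applying it before the transform.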
MUSE, a giant integral field spectrograph, is about to become the newest facility instrument at the VLT. It will see first light in February 2014. Here, we summarize the properties of the instrument as built and outline the functionality of the data reduction system, which transforms the raw data, recorded separately in 24 IFUs by 4k CCDs, into a fully calibrated, scientifically usable data cube. We then describe recent work regarding the geometrical calibration of the instrument and the testing of the processing pipeline, before concluding with results of the Preliminary Acceptance in Europe and an outlook to the on-sky commissioning.