
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

Posted by: Marco Frailis
Publication date: 2010
Research field: Physics
Paper language: English





The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping (HK) telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
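To make the housekeeping validation strategy concrete, the following is a minimal sketch of the injection-and-comparison idea. It is not the actual DPC code: the packet representation, the parameter name, and the level1_process stand-in are hypothetical, and real HK packets would require proper telemetry encoding and decoding.

```python
import numpy as np

# Minimal sketch of the HK validation idea (not the actual DPC code):
# known parameter values are written into housekeeping packets, the
# packets are run through Level 1, and the resulting timeline is
# compared with the injected reference values.

def inject_known_values(packets, parameter, values):
    """Overwrite one HK parameter in each packet with a known test value."""
    for packet, value in zip(packets, values):
        packet[parameter] = value  # stand-in for the real packet encoding
    return packets

def timelines_match(reference, timeline, tolerance=0.0):
    """Return True if the Level 1 timeline reproduces the injected values."""
    return np.allclose(np.asarray(reference, dtype=float),
                       np.asarray(timeline, dtype=float),
                       atol=tolerance)

# Usage sketch, with level1_process standing in for the real pipeline:
# packets = inject_known_values(raw_hk_packets, "HK_PARAM", known_values)
# timeline = level1_process(packets)
# assert timelines_match(known_values, timeline)
```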




Read also

Realistic synthetic observations of theoretical source models are essential for our understanding of real observational data. Using synthetic data, one can verify the extent to which source parameters can be recovered and evaluate how various data corruption effects can be calibrated. These studies are important when proposing observations of new sources, when characterizing the capabilities of new or upgraded instruments, and when verifying model-based theoretical predictions against observational data. We present the SYnthetic Measurement creator for long Baseline Arrays (SYMBA), a novel synthetic data generation pipeline for Very Long Baseline Interferometry (VLBI) observations. SYMBA takes into account several realistic atmospheric, instrumental, and calibration effects. We used SYMBA to create synthetic observations for the Event Horizon Telescope (EHT), a mm VLBI array, which has recently captured the first image of a black hole shadow. After testing SYMBA with simple source and corruption models, we study the importance of including all corruption and calibration effects. Based on two example general relativistic magnetohydrodynamics (GRMHD) model images of M87, we performed case studies to assess the attainable image quality with the current and future EHT array for different weather conditions. The results show that the effects of atmospheric and instrumental corruptions on the measured visibilities are significant. Despite these effects, we demonstrate how the overall structure of the input models can be recovered robustly after performing calibration steps. With the planned addition of new stations to the EHT array, images could be reconstructed with higher angular resolution and dynamic range. In our case study, these improvements allowed for a distinction between a thermal and a non-thermal GRMHD model based on salient features in reconstructed images.
We present the development of the End-to-End simulator for the SOXS instrument at the ESO-NTT 3.5-m telescope. SOXS will be a spectroscopic facility, made of two high-efficiency spectrograph arms, able to cover the spectral range 350-2000 nm with resolving power R=4500. The E2E model allows the simulation of the propagation of photons from the scientific target of interest up to the detectors. The outputs of the simulator are synthetic frames, which will mainly be exploited for optimizing the pipeline development and possibly for assisting the alignment and integration phases in the laboratory and at the telescope. In this paper, we detail the architecture of the simulator and the computational model, which are strongly characterized by modularity and flexibility. Synthetic spectral formats, related to different seeing and observing conditions, and calibration frames to be ingested by the pipeline are also presented.
We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on it. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A $\chi^2$ test for the full 70 GHz covariance in the multipole range $\ell=0$-$50$ yields a mean reduced $\chi^2$ of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of the available data. Full destriping leaves substantially less residual noise, but leaves the data sets intercorrelated. We also derive a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properties of high-resolution noise in pixel space through simulations.
MeerKATHI is the current development name for a radio-interferometric data reduction pipeline, assembled by an international collaboration. We create a publicly available end-to-end continuum- and line-imaging pipeline for MeerKAT and other radio telescopes. We implement advanced techniques that are suitable for producing high-dynamic-range continuum images and spectroscopic data cubes. Using containerization, our pipeline is platform-independent. Furthermore, we apply a standardized approach for using a number of different advanced software suites, partly developed within our group. We aim to use distributed computing approaches throughout our pipeline to enable the user to reduce larger data sets like those provided by radio telescopes such as MeerKAT. The pipeline also delivers a set of imaging quality metrics that give the user the opportunity to efficiently assess the data quality.
B. Nord, A. Amara, A. Refregier (2016)
The nature of dark matter, dark energy and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). We discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.