
We describe the processing of data from the Low Frequency Instrument (LFI) used in production of the Planck Early Release Compact Source Catalogue (ERCSC). In particular, we discuss the steps involved in reducing the data from telemetry packets to cleaned, calibrated, time-ordered data (TOD) and frequency maps. Data are continuously calibrated using the modulation of the temperature of the cosmic microwave background radiation induced by the motion of the spacecraft. Noise properties are estimated from TOD from which the sky signal has been removed using a generalized least-squares map-making algorithm. Measured 1/f noise knee frequencies range from 100 mHz at 30 GHz to a few tens of mHz at 70 GHz. A destriping code (Madam) is employed to combine radiometric data and pointing information into sky maps, minimizing the variance of correlated noise. Noise covariance matrices required to compute statistical uncertainties on LFI and Planck products are also produced. Main beams are estimated down to approximately the -10 dB level using Jupiter transits, which are also used for geometrical calibration of the focal plane.
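
To make the destriping idea concrete, here is a minimal Python sketch, not the actual Madam implementation: the TOD is modeled as y = Pm + Fa + n, where P bins samples into sky pixels, F assigns each sample a constant baseline offset, and n is white noise. The code alternates between binning a map estimate and re-estimating the offsets, which suppresses the correlated (1/f) noise component commonly modeled as P(f) = σ²[1 + (f_knee/f)^α]. All names and the toy data below are illustrative assumptions.

    import numpy as np

    def destripe(tod, pixels, npix, baseline_len, niter=20):
        """Naive destriper: alternate map binning and offset estimation.

        tod          -- time-ordered data (length divisible by baseline_len)
        pixels       -- sky pixel index hit by each sample
        npix         -- number of sky pixels
        baseline_len -- samples per constant-offset baseline
        """
        nbase = len(tod) // baseline_len
        offsets = np.zeros(nbase)
        for _ in range(niter):
            # Remove current offsets, then bin the residual into a sky map.
            cleaned = tod - np.repeat(offsets, baseline_len)
            hits = np.bincount(pixels, minlength=npix)
            sky = np.bincount(pixels, weights=cleaned, minlength=npix)
            sky = np.divide(sky, hits, out=np.zeros(npix), where=hits > 0)
            # Re-estimate each offset as the mean sky-subtracted residual.
            resid = (tod - sky[pixels]).reshape(nbase, baseline_len)
            offsets = resid.mean(axis=1)
        # A global constant stays degenerate between the map and offsets.
        return offsets - offsets.mean(), sky

    # Toy usage: a random sky observed with slow baseline drifts.
    rng = np.random.default_rng(0)
    n, blen, npix = 4000, 100, 50
    pix = rng.integers(0, npix, size=n)
    truth = rng.normal(size=npix)
    drift = np.repeat(rng.normal(scale=0.5, size=n // blen), blen)
    tod = truth[pix] + drift + rng.normal(scale=0.1, size=n)
    offsets, sky_map = destripe(tod, pix, npix, blen)

A production destriper such as Madam additionally weights by the noise covariance and solves for the offsets with a conjugate-gradient method; the simple alternation above conveys only the structure of the problem.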
M. Frailis, M. Maris, A. Zacchei (2010)
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must adhere strictly to the project schedule in order to be ready for launch and flight operations. To guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping (HK) telemetry processing, validation software was developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
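
In the same spirit, a toy round-trip check sketches the injection-and-compare approach described above. The packet layout, calibration curve, and values here are hypothetical stand-ins; real LFI housekeeping packets follow the ESA packet telemetry standard, and the real comparison is against timelines produced by the Level 1 pipeline itself.

    import struct

    def encode_hk_packet(timestamp, raw_value):
        # Hypothetical layout: 4-byte unsigned timestamp + 2-byte raw field.
        return struct.pack(">IH", timestamp, raw_value)

    def decode_hk_packet(packet):
        # Stand-in for the ground processing stage under test: unpack the
        # packet and apply a hypothetical linear calibration curve.
        timestamp, raw_value = struct.unpack(">IH", packet)
        return timestamp, 0.01 * raw_value - 50.0

    # Inject a known ramp of parameter values into synthetic packets ...
    injected = [(t, 5000 + 10 * t) for t in range(100)]
    packets = [encode_hk_packet(t, v) for t, v in injected]

    # ... decode them into a timeline ...
    timeline = [decode_hk_packet(p) for p in packets]

    # ... and compare the timeline sample-by-sample with the expectation.
    for (t, raw), (t_out, engineering) in zip(injected, timeline):
        assert t_out == t
        assert abs(engineering - (0.01 * raw - 50.0)) < 1e-9
    print("round-trip check passed for", len(timeline), "samples")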
