
Performance analysis of the SO/PHI software framework for on-board data reduction

Published by: Kinga Albert
Publication date: 2019
Research field: Physics
Paper language: English





The Polarimetric and Helioseismic Imager (PHI) is the first deep-space solar spectropolarimeter, on-board the Solar Orbiter (SO) space mission. It faces stringent requirements on science data accuracy, a dynamic environment, and severe limitations on telemetry volume. SO/PHI overcomes these restrictions through on-board instrument calibration and science data reduction, using dedicated firmware in FPGAs. This contribution analyses the accuracy of a data processing pipeline by comparing the results obtained with SO/PHI hardware to a reference from a ground computer. The results show that for the analysed pipeline the error introduced by the firmware implementation is well below the requirements of SO/PHI.
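The accuracy assessment described above amounts to comparing the fixed-point firmware output against the floating-point ground reference. Below is a minimal sketch of such a comparison in Python, assuming both results are available as numpy arrays; the function names and the threshold are illustrative placeholders, not part of SO/PHI.

import numpy as np

def pipeline_error(onboard: np.ndarray, reference: np.ndarray) -> dict:
    """Pixel-wise deviation of the on-board (fixed-point firmware) result
    from the ground-computer (floating-point) reference."""
    diff = onboard.astype(np.float64) - reference.astype(np.float64)
    # Relative error only where the reference signal is non-zero.
    nonzero = reference != 0
    rel = np.abs(diff[nonzero] / reference[nonzero])
    return {
        "max_abs_error": float(np.max(np.abs(diff))),
        "rms_error": float(np.sqrt(np.mean(diff**2))),
        "max_rel_error": float(np.max(rel)),
    }

# A requirement check would compare these metrics against the instrument's
# accuracy budget (hypothetical loader names, illustrative threshold):
# stats = pipeline_error(load_onboard_frame(), load_reference_frame())
# assert stats["max_rel_error"] < 1e-3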


Read also

The extension of on-board data processing capabilities is an attractive option to reduce telemetry for scientific instruments on deep space missions. The challenges that this presents, however, require a comprehensive software system, which operates on the limited resources a data processing unit in space allows. We implemented such a system for the Polarimetric and Helioseismic Imager (PHI) on-board the Solar Orbiter (SO) spacecraft. It ensures autonomous operation to handle long command-response times, easy changing of the processes after new lessons have been learned, and meticulous book-keeping of all operations to ensure scientific accuracy. This contribution presents the requirements and main aspects of the software implementation, followed by an example of a task implemented in the software framework, and results from running it on SO/PHI. The presented example shows that the different parts of the software framework work well together, and that the system processes data as we expect. The flexibility of the framework makes it possible to use it as a baseline for future applications with similar needs and limitations as SO/PHI.
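As an illustration of two framework properties highlighted here, exchangeable processing steps and book-keeping of every operation, the following Python sketch registers tasks in a table and logs each applied step. All names (task, run_pipeline, dark_correction) are invented for this example; the actual SO/PHI system runs as dedicated software on the instrument's data processing unit.

import logging
from typing import Callable

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")

REGISTRY: dict[str, Callable] = {}

def task(name: str):
    """Register a processing step so it can be replaced after new lessons are learned."""
    def wrap(fn: Callable) -> Callable:
        REGISTRY[name] = fn
        return fn
    return wrap

@task("dark_correction")
def dark_correction(frame, dark):
    # Subtract the dark calibration frame from the science frame.
    return frame - dark

def run_pipeline(steps, frame, **calib):
    """Run registered steps in order, logging each operation for later audit."""
    for name in steps:
        frame = REGISTRY[name](frame, calib[name])
        log.info("applied %s", name)  # book-keeping: every operation is recorded
    return frame

A call such as run_pipeline(["dark_correction"], raw_frame, dark_correction=dark_frame) then leaves an auditable trail of exactly which corrections were applied to the data.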
We present in this paper the general formalism and data processing steps used in the MATISSE data reduction software, as it has been developed by the MATISSE consortium. The MATISSE instrument is the mid-infrared new generation interferometric instrument of the Very Large Telescope Interferometer (VLTI). It is a 2-in-1 instrument with 2 cryostats and 2 detectors: one 2k x 2k Rockwell Hawaii 2RG detector for L&M-bands, and one 1k x 1k Raytheon Aquarius detector for N-band, both read at high framerates, up to 30 frames per second. MATISSE is currently undergoing its first tests in the laboratory.
F. Schuller (2012)
Together with the development of the Large APEX Bolometer Camera (LABOCA) for the Atacama Pathfinder Experiment (APEX), a new data reduction package, BoA, has been written. This software naturally interfaces with the telescope control system, and provides all functionalities for the reduction, analysis and visualization of bolometer data. It is used at APEX for real-time processing of observations performed with LABOCA and other bolometer arrays, providing feedback to the observer. Written in an easy-to-script language, BoA is also used offline to reduce APEX continuum data. In this paper, the general structure of this software is presented, and its online and offline capabilities are described.
We present the third release of the AMBER data reduction software by the JMMC. This software is based on core algorithms optimized after several years of operation. An optional graphic interface in a high-level language allows the user to control the process step by step or in a completely automatic manner. An ongoing improvement is the implementation of a robust calibration scheme, making use of the full calibration sets available during the night. The output products are standard OI-FITS files, which can be used directly in high-level software such as model fitting or image reconstruction tools. The software's performance is illustrated on a full data set of calibrators observed with AMBER over 5 years in various instrumental setups.
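The night-wide calibration idea mentioned here is, in generic terms, the estimation of the instrumental transfer function from all calibrator observations of a night and its interpolation to the science epochs. The sketch below illustrates this standard interferometric scheme in Python; it is not the actual AMBER/JMMC implementation, and all array names are assumptions.

import numpy as np

def calibrate_visibilities(t_sci, v2_sci_raw,
                           t_cal, v2_cal_raw, v2_cal_expected):
    """Divide raw squared visibilities by the interpolated transfer function.
    Generic illustration of night-wide calibration, not the amdlib code.
    t_cal must be sorted in increasing time for np.interp."""
    # Transfer function at each calibrator epoch: measured / expected V^2.
    transfer = v2_cal_raw / v2_cal_expected
    # Interpolate the transfer function over the night to the science times.
    tf_at_sci = np.interp(t_sci, t_cal, transfer)
    return v2_sci_raw / tf_at_sci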
M. Baak, G.J. Besjes, D. Cote (2014)
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple data models at once, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication-quality style through a simple command-line interface.
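To make the control/signal-region concept concrete, the following sketch fits a background normalisation in a control region and a signal strength in a signal region with a simple Poisson likelihood. It uses plain numpy/scipy rather than HistFitter's actual (ROOT/RooFit-based) interface, and all event counts are invented for illustration.

import numpy as np
from scipy.optimize import minimize

# Observed counts and nominal expectations (illustrative values only).
n_cr, n_sr = 120, 15            # observed events in control / signal region
b_cr, b_sr = 100.0, 10.0        # nominal background expectation per region
s_sr = 5.0                      # nominal signal expectation in the signal region

def nll(params):
    """Poisson negative log-likelihood over both regions (constants dropped).
    params = (mu, mu_b): signal strength and background normalisation."""
    mu, mu_b = params
    exp_cr = mu_b * b_cr                 # control region: background only
    exp_sr = mu * s_sr + mu_b * b_sr     # signal region: signal + background
    ll = (n_cr * np.log(exp_cr) - exp_cr) + (n_sr * np.log(exp_sr) - exp_sr)
    return -ll

fit = minimize(nll, x0=[1.0, 1.0], bounds=[(0.0, 10.0), (0.1, 10.0)])
mu_hat, mu_b_hat = fit.x
print(f"fitted signal strength mu = {mu_hat:.2f}, background norm = {mu_b_hat:.2f}")

The design point this mirrors is that the background normalisation is constrained where the signal is absent and then extrapolated into the region where the signal hypothesis is tested.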