
The digital data processing concepts of the LOFT mission

Posted by Enrico Bozzo
Publication date: 2014
Research field: Physics
Paper language: English
Author: C. Tenzer





The Large Observatory for X-ray Timing (LOFT) is one of the five mission candidates that were considered by ESA for an M3 mission (with a launch opportunity in 2022 - 2024). LOFT features two instruments: the Large Area Detector (LAD) and the Wide Field Monitor (WFM). The LAD is a 10 m²-class instrument with approximately 15 times the collecting area of the largest timing mission so far (RXTE), for the first time combined with CCD-class spectral resolution. The WFM will continuously monitor the sky, recognise changes in source states, and detect transient and bursting phenomena, allowing the mission to respond to such events. Observing the brightest X-ray sources with the effective area of the LAD leads to enormous data rates that need to be processed on several levels, filtered, and compressed in real time already on board. The WFM data processing, on the other hand, places rather low constraints on the data rate, but requires algorithms to find the photon interaction location on the detector and then to deconvolve the detector image in order to obtain the sky coordinates of the observed transient sources. In the following, we give an overview of the data handling concepts that were developed during the study phase.
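The WFM image-reconstruction step mentioned above can be illustrated with a minimal sketch of coded-mask deconvolution by balanced cross-correlation; the mask pattern, detector size, and source offset below are hypothetical placeholders rather than the actual WFM design, and the mission's on-board algorithms are not reproduced here.

```python
import numpy as np
from scipy.signal import correlate2d

# Hypothetical coded-mask pattern (1 = open, 0 = opaque); the real WFM
# mask geometry and detector pixelisation differ from this toy setup.
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(64, 64)).astype(float)

# Simulate a detector image: a point source projects a shifted copy of
# the mask onto the detector, on top of a flat background.
true_shift = (5, -8)                                  # hypothetical source offset in pixels
detector = np.roll(mask, true_shift, axis=(0, 1)) * 200.0
detector += rng.poisson(10.0, size=detector.shape)    # background counts

# Balanced cross-correlation: correlating with (mask - mean) makes a flat
# background cancel in expectation, so a real source shows up as a peak.
sky = correlate2d(detector, mask - mask.mean(), mode="same", boundary="wrap")

# The peak of the reconstructed image gives the source position on the
# detector grid, which would then be mapped to sky coordinates using the
# spacecraft attitude.
peak = np.unravel_index(np.argmax(sky), sky.shape)
print("reconstructed peak (pixels):", peak)
```

The balanced correlation is used here simply because it suppresses the uniform background; any real flight implementation would also have to handle mask self-support structure, detector non-uniformity, and multiple overlapping sources.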




Read also

The Herschel Space Observatory is the fourth cornerstone mission in the ESA science programme and performs photometry and spectroscopy in the 55 - 672 micron range. The development of the Herschel Data Processing System started in 2002 to support the data analysis for Instrument Level Tests. The Herschel Data Processing System was used for the pre-flight characterisation of the instruments and during various ground segment test campaigns. Following the successful launch of Herschel on 14 May 2009, the Herschel Data Processing System demonstrated its maturity when the first PACS preview observation of M51 was processed within 30 minutes of reception of the first science data after launch. The first HIFI observations of DR21 were also successfully reduced to high-quality spectra, followed by SPIRE observations of M66 and M74. A fast turn-around cycle between data retrieval and the production of science-ready products was demonstrated during the Herschel Science Demonstration Phase Initial Results Workshop held 7 months after launch, clear proof that the system had reached a good level of maturity. We summarise the scope, the management, and the development methodology of the Herschel Data Processing System, present some key software elements, and give an overview of the current status and future development milestones.
E. Bozzo 2014
LOFT, the Large Observatory For X-ray Timing, was one of the ESA M3 mission candidates that completed their assessment phase at the end of 2013. LOFT is equipped with two instruments, the Large Area Detector (LAD) and the Wide Field Monitor (WFM). The LAD performs pointed observations of several targets per orbit (~90 minutes), providing roughly 80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for about 100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT Burst Alert System additionally identifies bright impulsive events (e.g., Gamma-ray Bursts, GRBs) on board and broadcasts the corresponding position and trigger time to the ground using a dedicated system of ~15 VHF receivers. All WFM data are planned to be made public immediately. In this contribution we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book. We describe the expected GS contributions from ESA and the LOFT consortium. A review is provided of the planned LOFT data products and the details of the data flow, archiving, and distribution. Although LOFT was not selected for launch within the M3 call, its long assessment phase (> 2 years) led to a very solid mission design and efficient planning of its ground operations.
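To put the quoted ~80 GB/day figure in context, the back-of-the-envelope estimate below converts an assumed average event rate and event-record size into a daily data volume; both input numbers are illustrative assumptions, not LOFT design values.

```python
# Rough daily telemetry estimate for an event-based instrument.
# All input numbers are illustrative assumptions, not LOFT design values.
avg_event_rate_hz = 60_000        # assumed average event rate over a day (events/s)
bits_per_event = 128              # assumed size of one compressed event record

seconds_per_day = 86_400
bits_per_day = avg_event_rate_hz * bits_per_event * seconds_per_day
gigabytes_per_day = bits_per_day / 8 / 1e9

print(f"~{gigabytes_per_day:.0f} GB/day")   # ~83 GB/day with these assumptions
```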
Euclid is a Europe-led cosmology space mission dedicated to a visible and near-infrared survey of the entire extra-galactic sky. Its purpose is to deepen our knowledge of the dark content of our Universe. After an overview of the Euclid mission and science, this contribution describes how the community is getting organized to face the data analysis challenges, both in software development and in operational data processing matters. It ends with a more specific account of some of the main contributions of the Swiss Science Data Center (SDC-CH).
The second Gaia data release is based on 22 months of mission data with an average of 0.9 billion individual CCD observations per day. A data volume of this size and granularity requires a robust and reliable but still flexible system to achieve the demanding accuracy and precision constraints that Gaia is capable of delivering. The internal Gaia photometric system was initialised using an iterative process that is solely based on Gaia data. A set of calibrations was derived for the entire Gaia DR2 baseline and then used to produce the final mean source photometry. The photometric catalogue contains 2.5 billion sources comprised of three different grades depending on the availability of colour information and the procedure used to calibrate them: 1.5 billion gold, 144 million silver, and 0.9 billion bronze. These figures reflect the results of the photometric processing; the content of the data release will be different due to the validation and data quality filters applied during the catalogue preparation. The photometric processing pipeline, PhotPipe, implements all the processing and calibration workflows in terms of Map/Reduce jobs based on the Hadoop platform. This is the first example of a processing system for a large astrophysical survey project to make use of these technologies. The improvements in the generation of the integrated G-band fluxes, in the attitude modelling, in the cross-matching, and in the identification of spurious detections led to a much cleaner input stream for the photometric processing. This, combined with the improvements in the definition of the internal photometric system and calibration flow, produced high-quality photometry. Hadoop proved to be an excellent platform choice for the implementation of PhotPipe in terms of overall performance, scalability, downtime, and manpower required for operations and maintenance.
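The map/reduce organisation of the photometric processing described above can be sketched in a few lines of plain Python; the record layout and the unweighted mean used here are simplifications for illustration and do not reproduce the actual PhotPipe calibration flow.

```python
from collections import defaultdict

# Minimal map/reduce sketch: calibrated per-CCD flux measurements are
# mapped to (source_id, flux) pairs and reduced to a mean flux per source.
# Record layout and the plain mean are simplified assumptions.

observations = [
    {"source_id": 101, "flux": 1520.0},
    {"source_id": 101, "flux": 1498.0},
    {"source_id": 205, "flux": 310.5},
    {"source_id": 205, "flux": 305.2},
    {"source_id": 205, "flux": 298.7},
]

def map_phase(records):
    """Emit (key, value) pairs, as a mapper would in a Hadoop-style job."""
    for rec in records:
        yield rec["source_id"], rec["flux"]

def reduce_phase(pairs):
    """Group values by key and combine them into mean source photometry."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in grouped.items()}

mean_photometry = reduce_phase(map_phase(observations))
print(mean_photometry)   # {101: 1509.0, 205: 304.8}
```

In a real Hadoop deployment the grouping step is performed by the framework's shuffle phase across many nodes; the sketch only shows the logical data flow from per-observation records to per-source means.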
Eugene A. Magnier 2016
The Pan-STARRS Data Processing System is responsible for the steps needed to download, archive, and process all images obtained by the Pan-STARRS telescopes, including real-time detection of transient sources such as supernovae and moving objects, including potentially hazardous asteroids. With a nightly data volume of up to 4 terabytes and an archive of over 4 petabytes of raw imagery, Pan-STARRS is solidly in the realm of Big Data astronomy. The full data processing system consists of several subsystems covering the wide range of necessary capabilities. This article describes the Image Processing Pipeline and its connections to both the summit data systems and the outward-facing systems downstream. The latter include the Moving Object Processing System (MOPS) and the public database, the Published Science Products Subsystem (PSPS).