
DAS: a data management system for instrument tests and operations

Published by: Stefano Sartor
Publication date: 2014
Language: English
Author: Marco Frailis





The Data Access System (DAS) is a metadata and data management software system, providing a reusable solution for the storage of data acquired both from telescopes and from auxiliary data sources during the instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing and quick-look analysis of the data acquired from scientific instruments. The DAS provides a data access layer mainly targeted at software applications: quick-look displays, pre-processing pipelines and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), automatically adding classes and methods supporting the DDL data types, and providing an object-oriented query language; and a data management component, which maps the metadata of the DDL data types into a relational Database Management System (DBMS) and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data-model-specific one in C, C++ and Python. The mapping of metadata in the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle and PostgreSQL.
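The core idea of the abstract, mapping XML-defined data types to relational metadata tables while the bulk data lives on a shared file system, can be sketched in a few lines. The fragment below is a hypothetical illustration only: the element names (`ddl`, `type`, `metadata`, `attribute`) and the generated schema are assumptions, not the actual DAS DDL or API, and SQLite stands in for the MySQL/Oracle/PostgreSQL back ends mentioned above.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Toy DDL fragment in the spirit of the DAS DDL described above.
# Element and attribute names are illustrative, not the real schema.
DDL = """
<ddl>
  <type name="Image">
    <metadata>
      <attribute name="startTime" type="TEXT"/>
      <attribute name="exposure" type="REAL"/>
    </metadata>
    <data format="binary"/>
  </type>
</ddl>
"""

def ddl_to_sql(ddl_xml):
    """Map each DDL type's metadata attributes to a CREATE TABLE statement."""
    root = ET.fromstring(ddl_xml)
    statements = []
    for t in root.findall("type"):
        cols = ["id INTEGER PRIMARY KEY",
                "file_path TEXT"]  # bulk data stays on the shared file system
        for attr in t.findall("./metadata/attribute"):
            cols.append(f"{attr.get('name')} {attr.get('type')}")
        statements.append(
            f"CREATE TABLE {t.get('name')} ({', '.join(cols)})")
    return statements

# Create the schema in an in-memory SQLite database and store one record:
# only the metadata and a path enter the DBMS, never the binary data itself.
conn = sqlite3.connect(":memory:")
for stmt in ddl_to_sql(DDL):
    conn.execute(stmt)
conn.execute("INSERT INTO Image (file_path, startTime, exposure) "
             "VALUES (?, ?, ?)",
             ("/shared/img_0001.dat", "2014-01-01T00:00:00", 1.5))
row = conn.execute("SELECT exposure FROM Image WHERE startTime >= ?",
                   ("2014-01-01",)).fetchone()
print(row[0])  # prints 1.5: a metadata query that never touches the data file
```

In the real system this mapping is generated automatically from the DDL, and applications go through the DAS API rather than issuing SQL directly; the sketch only shows why separating searchable metadata (in the DBMS) from bulk data (on disk) keeps queries cheap.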




Read also

Alberto Accomazzi (2012)
Assessing the impact of astronomical facilities rests upon an evaluation of the scientific discoveries which their data have enabled. Telescope bibliographies, which link data products with the literature, provide a way to use bibliometrics as an impact measure for the underlying data. In this paper we argue that the creation and maintenance of telescope bibliographies should be considered an integral part of an observatory's operations. We review the existing tools, services, and workflows which support these curation activities, giving an estimate of the effort and expertise required to maintain an archive-based telescope bibliography.
The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field, ground-based survey system that will image the sky in six optical bands from 320 to 1050 nm, uniformly covering approximately $18,000$ deg$^2$ of the sky over 800 times. The LSST is currently under construction on Cerro Pachon in Chile, and expected to enter operations in 2022. Once operational, the LSST will explore a wide range of astrophysical questions, from discovering killer asteroids to examining the nature of Dark Energy. The LSST will generate on average 15 TB of data per night, and will require a comprehensive Data Management system to reduce the raw data to scientifically useful catalogs and images with minimum human intervention. These reductions will result in a real-time alert stream, and eleven data releases over the 10-year duration of LSST operations. To enable this processing, the LSST project is developing a new, general-purpose, high-performance, scalable, well documented, open source data processing software stack for O/IR surveys. Prototypes of this stack are already capable of processing data from existing cameras (e.g., SDSS, DECam, MegaCam), and form the basis of the Hyper-Suprime Cam (HSC) Survey data reduction pipeline.
Yang Xu, Liping Xin, Xuhui Han (2020)
GWAC will have an integrated field of view of 5,000 deg$^2$, of which 1,800 deg$^2$ has already been built. The limiting magnitude of a 10-second exposure image on a moonless night is 16R. In each observation night, GWAC produces about 0.7 TB of raw data, and the data processing pipeline generates millions of single-frame alerts. We describe the GWAC Data Processing and Management System (GPMS), including hardware architecture, database, detection-filtering-validation of transient candidates, data archiving, and user interfaces for checking transients and monitoring the system. GPMS combines general technology and software from the astronomy and computer fields, and uses advanced technologies such as deep learning. Practical results show that GPMS fully meets the scientific data processing requirements of GWAC. It can accomplish online detection, filtering and validation of millions of transient candidates, and feed back the final results to astronomers in real time. During observations from October 2018 to December 2019, we found 102 transients.
The very demanding requirements of the SKA-low instrument call for a challenging antenna design capable of delivering excellent performance in radiation patterns, impedance matching, polarization purity, cost, longevity, etc. This paper is devoted to the development (design and test of first prototypes) of an active ultra-wideband antenna element for the low-frequency instrument of the SKA radio telescope. The antenna element and differential low noise amplifier described here were originally designed to cover the former SKA-low band (70-450 MHz), but it is now aimed to cover the re-defined SKA-low band (50-350 MHz); furthermore, the antenna is capable of performing up to 650 MHz with the current design. The design is focused on maximum sensitivity in a wide field of view (+/- 45 deg from zenith) and low cross-polarization ratios. Furthermore, the size and cost of the element have to be kept to a minimum, as millions of these antennas will need to be deployed for the full SKA in very compact configurations. The primary focus of this paper is therefore to discuss various design implications for the SKA-low telescope.
The Dark Energy Survey (DES) is a project with the goal of building, installing and exploiting a new 74 CCD-camera at the Blanco telescope, in order to study the nature of cosmic acceleration. It will cover 5000 square degrees of the southern hemisphere sky and will record the positions and shapes of 300 million galaxies up to redshift 1.4. The survey will be completed using 525 nights during a 5-year period starting in 2012. About O(1 TB) of raw data will be produced every night, including science and calibration images. The DES data management system has been designed for the processing, calibration and archiving of these data. It is being developed by collaborating DES institutions, led by NCSA. In this contribution, we describe the basic functions of the system, what kind of scientific codes are involved and how the Data Challenge process works, to improve simultaneously the Data Management system algorithms and the Science Working Group analysis codes.