Daisy (Data Analysis Integrated Software System) has been designed for the analysis and visualization of X-ray experiments. To address the Chinese radiation facilities community's requirements, which range from purely algorithmic problems to scientific computing infrastructure, Daisy provides a cloud-native platform that supports on-site data analysis services with fast feedback and interaction. Its plug-in-based application design is well suited to processing, in parallel, the high-throughput data flow expected at next-generation facilities such as the High Energy Photon Source (HEPS). The objectives, functionality and architecture of Daisy are described in this article.
We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple data models at once, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication-quality style through a simple command-line interface.
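The control-region/signal-region strategy at the heart of HistFitter can be illustrated with a toy counting experiment: the background normalization is constrained by a control region and extrapolated to the signal region, where a signal strength is fitted. This is a generic sketch of the statistical concept, not HistFitter's interface (which is built on ROOT/RooFit); the function names and the grid-scan minimizer are illustrative assumptions.

```python
# Toy two-region Poisson fit: a control region (CR) constrains the
# background b via a transfer factor tau, and the signal region (SR)
# determines the signal strength mu. Not HistFitter's API.
import math

def nll(mu, b, n_sr, n_cr, s, tau):
    """Negative log-likelihood of two independent Poisson counts
    (constant terms dropped): SR expects mu*s + b, CR expects tau*b."""
    def poisson_nll(n, lam):
        return lam - n * math.log(lam)
    return poisson_nll(n_sr, mu * s + b) + poisson_nll(n_cr, tau * b)

def fit(n_sr, n_cr, s, tau):
    """Crude grid-scan maximum-likelihood estimate of (mu, b)."""
    return min(((mu / 100.0, b / 10.0)
                for mu in range(0, 301) for b in range(1, 501)),
               key=lambda p: nll(p[0], p[1], n_sr, n_cr, s, tau))

# 100 CR events with tau = 5 imply b ~ 20; the 25 SR events then
# favor mu ~ 0.5 for an expected signal yield s = 10.
mu_hat, b_hat = fit(n_sr=25, n_cr=100, s=10.0, tau=5.0)
```

In HistFitter this extrapolation and fit is performed on full probability density functions built from the user's object-oriented configuration rather than on bare counts.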
We have undertaken a major enhancement of our IDL-based simulation tools developed earlier for modeling microwave and X-ray emission. The object-based architecture provides an interactive graphical user interface that allows the user to import photospheric magnetic field maps and perform magnetic field extrapolations to almost instantly generate 3D magnetic field models; to investigate the magnetic topology of these models by interactively creating magnetic field lines and associated magnetic flux tubes; to populate the flux tubes with user-defined nonuniform thermal plasma and anisotropic, nonuniform, nonthermal electron distributions; to investigate the spatial and spectral properties of radio and X-ray emission calculated from the model; and to compare the model-derived images and spectra with observational data. The application integrates shared-object libraries containing fast gyrosynchrotron emission codes developed in FORTRAN and C++, soft and hard X-ray codes developed in IDL, a FORTRAN-based potential-field extrapolation routine and an IDL-based linear force-free field extrapolation routine. The interactive interface allows users to add any user-defined radiation code that adheres to our interface standards, as well as user-defined magnetic field extrapolation routines. Here we use this tool to analyze a simple single-loop flare and use the model to constrain the 3D structure of the magnetic flaring loop and the 3D spatial distribution of the fast electrons inside this loop. We iteratively compute multi-frequency microwave and multi-energy X-ray images from realistic magnetic flux tubes obtained from an extrapolation of a magnetogram taken prior to the flare, and compare them with imaging data obtained by the SDO, NoRH, and RHESSI instruments. We use this event to illustrate the use of the tool for general interpretation of solar flares and to address disparate problems in solar physics.
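The field-line tracing step such tools perform on an extrapolated 3D field can be sketched compactly: starting from a seed point, one repeatedly steps along the local field direction. In this sketch a point dipole stands in for the extrapolated field and a fixed-step Euler integrator stands in for the production tracer; both choices, and all names, are illustrative assumptions.

```python
# Sketch of magnetic field-line tracing: a point dipole stands in for an
# extrapolated coronal field, followed with unit-speed Euler steps.
import math

def dipole_B(x, y, z, m=1.0):
    """Field of a point dipole with moment m along z (constants dropped):
    B = (3 (m.r) r - m r^2) / r^5."""
    r2 = x * x + y * y + z * z
    r = math.sqrt(r2)
    mr = m * z / (r2 * r2 * r)          # (m . r) / r^5
    return (3 * mr * x, 3 * mr * y, 3 * mr * z - m / (r2 * r))

def trace_field_line(start, step=0.01, n_steps=100):
    """Follow the field direction with fixed-step Euler integration;
    returns the traced polyline as a list of (x, y, z) points."""
    pts = [start]
    x, y, z = start
    for _ in range(n_steps):
        bx, by, bz = dipole_B(x, y, z)
        norm = math.sqrt(bx * bx + by * by + bz * bz)
        x, y, z = x + step * bx / norm, y + step * by / norm, z + step * bz / norm
        pts.append((x, y, z))
    return pts

# Seed on the equatorial plane; the traced line bends toward the pole,
# staying inside the r <= 1 shell of a dipole field line.
line = trace_field_line((1.0, 0.0, 0.0))
```

Production tools would use an adaptive higher-order integrator and interpolate the field on the extrapolation grid, but the control flow is the same.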
The Solar X-ray Monitor (XSM) instrument of India's Chandrayaan-2 lunar mission carries out broadband spectroscopy of the Sun in soft X-rays. XSM, with its unique features such as low background, high time cadence, and high spectral resolution, provides the opportunity to characterize transient and quiescent X-ray emission from the Sun even during low-activity periods. It records the X-ray spectrum at one-second cadence, and the data recorded on-board are downloaded at regular intervals along with those of other payloads. During ground pre-processing, the XSM data are segregated, and the level-0 data are made available for higher levels of processing at the Payload Operations Center (POC). The XSM Data Analysis Software (XSMDAS) is developed to carry out the processing of the level-0 data to higher levels and to generate calibrated light curves and spectra for user-defined binning parameters, making them suitable for further scientific analysis. A front-end for the XSMDAS named XSM Quick Look Display (XSMQLD) is also developed to facilitate a first look at the data without applying calibration. The XSM Data Management-Monitoring System (XSMDMS) is designed to carry out automated data processing at the POC and to maintain an SQLite database with relevant information on the data sets, along with an internal web application for monitoring data quality and instrument health. All XSM raw and calibrated data products are in FITS format, organized into day-wise files, and the data archive follows Planetary Data System-4 (PDS4) standards. The XSM data will be made available after a lock-in period, along with the XSM Data Analysis Software, from the ISRO Science Data Archive (ISDA) at the Indian Space Science Data Center (ISSDC). Here we discuss the design and implementation of all components of the software for the XSM data processing and the contents of the XSM data archive.
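The user-defined binning XSMDAS supports can be illustrated with a minimal rebinning routine that collapses one-second-cadence counts into wider time bins for a light curve. The function and argument names here are assumptions for illustration, not the actual XSMDAS interface.

```python
# Illustrative sketch of light-curve rebinning: sum one-second-cadence
# counts into user-defined time bins. Not the real XSMDAS interface.

def rebin_light_curve(times, counts, bin_width):
    """Sum per-second counts into bins of `bin_width` seconds.

    Returns (bin_start_times, binned_counts, count_rates)."""
    if bin_width < 1:
        raise ValueError("bin width must be at least one second")
    t0 = times[0]
    bins = {}
    for t, c in zip(times, counts):
        start = t0 + ((t - t0) // bin_width) * bin_width
        bins[start] = bins.get(start, 0) + c
    starts = sorted(bins)
    summed = [bins[s] for s in starts]
    rates = [c / bin_width for c in summed]
    return starts, summed, rates

# Ten seconds of data at 5 counts/s, rebinned to 5-second bins:
# two bins of 25 counts each, i.e. 5.0 counts/s.
starts, summed, rates = rebin_light_curve(list(range(10)), [5] * 10, 5)
```

The real pipeline additionally applies calibration (gain, effective area, dead time) before the products are written out as FITS files.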
The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the object-oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both GERDA and Majorana.
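The pattern of data objects plus general-purpose signal-processing tools can be sketched generically: a waveform container holds the samples and their metadata, and a transform consumes one waveform and produces another. MGDO itself is a C++ library; this Python sketch and its names are illustrative assumptions, not the MGDO API.

```python
# Generic sketch of the "data object + transform" pattern: a waveform
# container and a baseline-subtraction step. Illustrative names only;
# MGDO itself is written in C++.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Waveform:
    """Digitized detector signal: sampling period plus sample values."""
    sampling_period_ns: float
    samples: List[float] = field(default_factory=list)

def subtract_baseline(wf: Waveform, n_baseline: int) -> Waveform:
    """Estimate the baseline from the first n_baseline samples (assumed
    to precede the pulse) and return a baseline-subtracted copy."""
    baseline = sum(wf.samples[:n_baseline]) / n_baseline
    return Waveform(wf.sampling_period_ns,
                    [s - baseline for s in wf.samples])

# A flat baseline of 2.0 followed by a step pulse to 12.0:
raw = Waveform(10.0, [2.0, 2.0, 2.0, 2.0, 12.0, 12.0])
corrected = subtract_baseline(raw, n_baseline=4)
# corrected.samples -> [0.0, 0.0, 0.0, 0.0, 10.0, 10.0]
```

Because the same container holds both simulated and measured signals, the same transform chain can be applied to either, which is the code-reuse property the abstract describes.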
We introduce the MINERvA Analysis Toolkit (MAT), a utility for centralizing the handling of systematic uncertainties in HEP analyses. The fundamental utilities of the toolkit are the MnvHnD, a powerful histogram container class, and the systematic Universe classes, which provide a modular implementation of the many-universes error analysis approach. These products can be used stand-alone or as part of a complete error analysis prescription. They support the propagation of systematic uncertainty through all stages of analysis, and provide flexibility for an arbitrary level of user customization. This extensible solution to error analysis enables the standardization of systematic uncertainty definitions across an experiment and provides a transparent user interface that lowers the barrier to entry for new analyzers.
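The many-universes technique can be sketched in a few lines: every systematic "universe" re-fills the same histogram with its own shifted event weight, and the spread of the universes about the central value gives the per-bin systematic error band. The class and method names below are illustrative assumptions, not the MAT C++ API.

```python
# Minimal sketch of many-universes error propagation: one central-value
# histogram plus one filled copy per systematic universe. Illustrative
# names only, not the MAT C++ API.
import math

class UniverseHistogram:
    """Central-value histogram with one parallel copy per universe."""
    def __init__(self, edges, n_universes):
        self.edges = edges
        self.cv = [0.0] * (len(edges) - 1)
        self.universes = [[0.0] * (len(edges) - 1)
                          for _ in range(n_universes)]

    def _bin(self, x):
        for i in range(len(self.edges) - 1):
            if self.edges[i] <= x < self.edges[i + 1]:
                return i
        return None  # underflow/overflow ignored in this sketch

    def fill(self, x, cv_weight, universe_weights):
        """Fill the central value and every universe in one call."""
        i = self._bin(x)
        if i is None:
            return
        self.cv[i] += cv_weight
        for u, w in enumerate(universe_weights):
            self.universes[u][i] += w

    def band(self):
        """Per-bin RMS deviation of the universes from the central value."""
        n = len(self.universes)
        return [math.sqrt(sum((uni[i] - c) ** 2
                              for uni in self.universes) / n)
                for i, c in enumerate(self.cv)]

h = UniverseHistogram(edges=[0, 1, 2], n_universes=2)
h.fill(0.5, cv_weight=1.0, universe_weights=[0.9, 1.1])
band = h.band()  # ~[0.1, 0.0]: spread in bin 0, nothing in bin 1
```

Filling all universes alongside the central value in a single pass is what lets such a container propagate systematics through every later analysis stage (cuts, unfolding, ratios) without separate bookkeeping.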