The physics goals of the next Large Hadron Collider run include high-precision tests of the Standard Model and searches for new physics. These goals require detailed comparison of data with computational models that simulate the expected behavior of the data. To highlight the role that modeling and simulation play in future scientific discovery, we report on use cases and experience with a unified system built to process both real and simulated data of growing volume and variety.
The ever-increasing volumes of scientific data present new challenges for distributed computing and Grid technologies. The emerging Big Data revolution drives exploration in scientific fields including nanotechnology, astrophysics, and high-energy physics …
The ATLAS experiment at the Large Hadron Collider has implemented a new system for recording information on detector status and data quality, and for transmitting this information to users performing physics analysis. This system revolves around the …
The main purpose of the Baikal-GVD Data Quality Monitoring (DQM) system is to monitor the status of the detector and of the collected data. The system estimates the quality of the recorded signals and performs data validation. The DQM system is integrated with …
This paper describes the design, implementation, and verification of a test-bed for determining the noise temperature of radio antennas operating between 400 and 800 MHz. The requirements for this test-bed were driven by the HIRAX experiment, which uses a …
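Noise-temperature test-beds of this kind conventionally rely on the Y-factor method: the receiver output power is measured against two loads of known physical temperature, and the noise temperature follows from the power ratio. The sketch below illustrates that standard relation only; the load temperatures and powers are illustrative assumptions, not values from the HIRAX test-bed.

```python
def noise_temperature(p_hot, p_cold, t_hot=300.0, t_cold=77.0):
    """Receiver noise temperature via the Y-factor method.

    p_hot, p_cold: measured output powers (any consistent linear unit)
    t_hot, t_cold: physical temperatures of the hot/cold loads in kelvin
    (defaults: ambient and liquid-nitrogen loads, an assumed setup).
    """
    y = p_hot / p_cold                      # Y-factor (power ratio)
    return (t_hot - y * t_cold) / (y - 1.0)

# Example: a Y-factor of 2 with 300 K / 77 K loads
t_rx = noise_temperature(2.0, 1.0)
print(round(t_rx, 1))  # → 146.0  (= 300 - 2*77)
```

A larger Y-factor for the same load pair indicates a lower receiver noise temperature, which is why cold loads well below ambient are preferred for sensitive measurements.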
A robust post-processing technique is mandatory for analysing coronagraphic high-contrast imaging data. Angular Differential Imaging (ADI) and Principal Component Analysis (PCA) are the most widely used approaches to suppress the quasi-static structure in …
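The PCA step common to these pipelines can be sketched minimally: flatten the image cube, find the leading principal components of the frame set, and subtract the resulting low-rank reconstruction so that the quasi-static speckle pattern is removed while frame-to-frame signal survives. The cube shape, component count, and toy data below are assumptions for illustration, not details from the paper (a full ADI pipeline would also derotate and combine the residual frames).

```python
import numpy as np

def pca_subtract(cube, n_comp=3):
    """Subtract a low-rank (PCA) model of the quasi-static structure
    from an image cube of shape (n_frames, ny, nx)."""
    n, ny, nx = cube.shape
    X = cube.reshape(n, ny * nx).astype(float)
    X = X - X.mean(axis=0)                      # remove per-pixel temporal mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    low_rank = (U[:, :n_comp] * s[:n_comp]) @ Vt[:n_comp]
    return (X - low_rank).reshape(n, ny, nx)    # speckle-subtracted residuals

# Toy cube: one static speckle pattern plus small per-frame noise
rng = np.random.default_rng(0)
speckles = rng.normal(size=(16, 16))
cube = np.stack([speckles + 0.01 * rng.normal(size=(16, 16))
                 for _ in range(20)])
res = pca_subtract(cube)
print(res.std() < cube.std())   # static structure largely removed
```

The number of components retained trades speckle suppression against self-subtraction of faint companions, which is why it is a key tuning parameter in PCA-based post-processing.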