Upgrades to the LHCb computing infrastructure during the first long shutdown of the LHC have allowed high-quality decay information to be calculated by the software trigger, making a separate offline event reconstruction unnecessary. Furthermore, the storage space required for the triggered candidate is an order of magnitude smaller than that of the entire raw event which would otherwise need to be persisted. Tesla, named following LHCb's convention of naming applications after renowned physicists, is an application designed to process the information calculated by the trigger, with the resulting output used to perform physics measurements directly.
One of the biggest challenges of the High-Luminosity LHC (HL-LHC) era will be the significantly increased volume of data to be recorded and analyzed from the collisions at the ATLAS and CMS experiments. ServiceX is a software R&D project in the area of Data Organization, Management and Access (DOMA).
Pattern recognition problems in high energy physics are notably different from traditional machine learning applications in computer vision. Reconstruction algorithms identify and measure the kinematic properties of particles produced in high energy collisions.
The upgraded LHCb detector, due to start data taking in 2022, will have to process an average data rate of 4 TB/s in real time. LHCb's physics objectives require that the full detector information for every LHC bunch crossing is read out and made available for real-time processing.
CMOS pixel sensors (CPS) represent a novel technological approach to building charged particle detectors. CMOS processes allow the integration of a sensing volume and readout electronics in a single silicon die, making it possible to build sensors with a small pixel pitch.
Development of optical links with 850 nm multi-mode vertical-cavity surface-emitting lasers (VCSELs) has advanced to 25 Gbps in speed. For applications in high-energy experiments, the transceivers are required to be tolerant to radiation and particle fluence.