
HL-LHC Computing Review: Common Tools and Community Software

Posted by: Graeme Stewart
Publication date: 2020
Research field: Physics
Paper language: English





Common and community software packages, such as ROOT, Geant4 and event generators, have been a key part of the LHC's success so far, and their continued development and optimisation will be critical in the future. The challenges are driven by an ambitious physics programme, notably the LHC accelerator upgrade to high luminosity (HL-LHC) and the corresponding detector upgrades of ATLAS and CMS. In this document we address the issues for software that is used in multiple experiments (usually even more widely than ATLAS and CMS) and maintained by teams of developers who are either not linked to a particular experiment or who contribute to common software within the context of their experiment activity. We also give space to general considerations for future software and for projects that tackle upcoming challenges, no matter who writes the software, which is an area where community convergence on best practice is extremely useful.
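As a hedged illustration of what such common software looks like in practice, the sketch below uses ROOT's Python bindings (PyROOT) to fill and fit a toy histogram; the histogram name, binning and toy data are invented for the example and are not taken from the paper.

```python
# Minimal PyROOT sketch: fill a histogram with toy data and fit it.
# Assumes a ROOT installation with Python bindings; all names and
# binning choices here are illustrative, not from the paper.
import ROOT

h = ROOT.TH1F("h_toy", "Toy Gaussian observable;x;events", 100, -5.0, 5.0)
h.FillRandom("gaus", 10000)   # fill with 10k samples of a built-in Gaussian
h.Fit("gaus", "Q")            # fit a Gaussian in quiet mode

fit = h.GetFunction("gaus")
print("fitted mean  :", fit.GetParameter(1))
print("fitted sigma :", fit.GetParameter(2))
```

The same few calls work unchanged in any experiment's environment, which is exactly the value of a common package: the maintenance burden of histogramming and fitting is shared rather than duplicated per experiment.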


Read also

In modern High Energy Physics (HEP) experiments, visualization of experimental data plays a key role in many activities and tasks across the whole data chain: from detector development to monitoring, from event generation to reconstruction of physics objects, from detector simulation to data analysis, and all the way to outreach and education. In this paper, the definition, status, and evolution of data visualization for HEP experiments are presented. Suggestions for upgrading data visualization tools and techniques in current experiments are outlined, along with guidelines for future experiments. This paper expands on the summary content published in the HSF Roadmap Community White Paper [HSF-CWP-2017-01].
At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.
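As a hedged sketch of the interactive, high-throughput analysis style the report discusses, the example below reads one column of a ROOT file with the community packages uproot and awkward and applies a columnar selection; the file name and branch names ("Events", "Muon_pt") are hypothetical.

```python
# Columnar analysis sketch: any flat ROOT TTree with a jagged float
# branch would work in place of the invented names used here.
import uproot
import awkward as ak

events = uproot.open("data.root:Events")           # open the tree lazily
muon_pt = events["Muon_pt"].array(library="ak")    # read one column

# Select events with at least one muon above 25 GeV, expressed on
# whole arrays instead of an explicit per-event loop.
selected = muon_pt[ak.any(muon_pt > 25.0, axis=1)]
print("events passing:", len(selected))
```

Because the selection is a single array expression, it is trivially re-runnable (reproducibility) and vectorised over the full dataset (throughput), while still usable line by line in a notebook (interactivity).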
To produce the best physics results, high energy physics experiments require access to calibration and other non-event data during event data processing. These conditions data are typically stored in databases that provide versioning functionality, allowing physicists to make improvements while simultaneously guaranteeing the reproducibility of their results. With the increased complexity of modern experiments, and the evolution of computing models that demand large scale access to conditions data, the solutions for managing this access have evolved over time. In this white paper we give an overview of the conditions data access problem, present convergence on a common solution and present some considerations for the future.
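To make the versioning idea concrete, here is a minimal self-contained sketch, not taken from the white paper: conditions payloads are keyed by a tag (version label) and an interval of validity (IoV), so a frozen tag keeps old results reproducible while a new tag carries improved calibrations. All tags, run numbers and payload values are invented.

```python
# Versioned conditions lookup: tag -> sorted list of (first_valid_run, payload).
import bisect

conditions = {
    "calib-v1": [(1, {"gain": 1.00}), (500, {"gain": 1.02})],
    "calib-v2": [(1, {"gain": 1.01}), (500, {"gain": 1.03})],
}

def lookup(tag, run):
    """Return the payload whose interval of validity covers `run` under `tag`."""
    iovs = conditions[tag]
    starts = [start for start, _ in iovs]
    index = bisect.bisect_right(starts, run) - 1
    if index < 0:
        raise KeyError(f"no IoV covers run {run} in tag {tag!r}")
    return iovs[index][1]

# The frozen tag reproduces old results; the new tag improves them.
print(lookup("calib-v1", 600))  # {'gain': 1.02}
print(lookup("calib-v2", 600))  # {'gain': 1.03}
```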
Data processing frameworks are an essential part of HEP experiments' software stacks. Frameworks provide a means by which code developers can undertake the essential tasks of physics data processing, accessing relevant inputs and storing their outputs, in a coherent way without needing to know the details of other domains. Frameworks provide essential core services for developers and help deliver a configurable working application to the experiments' production systems. Modern HEP processing frameworks are in the process of adapting to a new computing landscape dominated by parallel processing and heterogeneity, which pose many questions regarding enhanced functionality and scaling that must be faced without compromising the maintainability of the code. In this paper we identify a program of work that can help further clarify the key concepts of frameworks for HEP and then spawn R&D activities that can focus the community's efforts in the most efficient manner to address the challenges of the upcoming experimental program.
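As a hedged sketch of the core framework idea, not any experiment's actual framework: user modules each implement one processing step, and the framework chains a configured sequence of them over events, here parallelised across events with a process pool. Module names and the toy event content are invented.

```python
# Toy processing framework: modules only see the event store; the
# framework owns configuration, scheduling and parallelism.
from concurrent.futures import ProcessPoolExecutor

def unpack(event):
    event["hits"] = [x * 0.1 for x in range(event["raw"])]
    return event

def reconstruct(event):
    event["energy"] = sum(event["hits"])
    return event

PIPELINE = [unpack, reconstruct]   # configurable module sequence

def process_event(event):
    for module in PIPELINE:
        event = module(event)      # each module reads/writes the event store
    return event

if __name__ == "__main__":
    events = [{"raw": n} for n in range(8)]
    with ProcessPoolExecutor() as pool:   # event-level parallelism
        for done in pool.map(process_event, events):
            print(done["energy"])
```

Separating module code from scheduling is what lets a framework move to parallel or heterogeneous execution without rewriting the physics code inside the modules.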