This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
Long term sustainability of the high energy physics (HEP) research software ecosystem is essential for the field. With upgrades and new facilities coming online throughout the 2020s, this challenge will only become more pressing. Meeting it requires a workforce with a combination of HEP domain knowledge and advanced software skills. The required software skills fall into three broad groups. The first is fundamental and generic software engineering (e.g., Unix, version control, C++, continuous integration). The second is knowledge of domain-specific HEP packages and practices (e.g., the ROOT data format and analysis framework). The third is more advanced knowledge of specialized techniques, including parallel programming, machine learning and data science tools, and techniques for preserving software projects at all scales. This paper discusses the collective software training program in HEP and its activities led by the HEP Software Foundation (HSF) and the Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP). The program equips participants with an array of software skills that serve as ingredients from which solutions to the computing challenges of HEP can be formed. Beyond serving the community by ensuring that members are able to pursue research goals, this program serves individuals by providing intellectual capital and transferable skills that are increasingly important to careers in the realm of software and computing, whether inside or outside HEP.
Meta-software for data acquisition (DAQ) is a new approach to designing DAQ systems for experimental setups in high energy physics (HEP). It abstracts away experiment-specific data processing logic while still capturing that logic through configuration. It is also intended to replace highly integrated DAQ software with a swarm of single-function components orchestrated by universal meta-software.
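The paper is summarized here only at the level of its abstract; as a rough, purely illustrative sketch of the configuration-driven idea it describes, the following Python snippet wires a few invented single-function components (Readout, Builder, Writer and all their parameters are hypothetical, not taken from the paper) through a generic orchestrator, so that everything experiment-specific lives in the configuration rather than in the orchestrating code.

# Hypothetical sketch: the generic "meta-software" only builds and runs the
# component swarm; experiment-specific choices sit entirely in CONFIG.
CONFIG = {
    "pipeline": ["readout", "builder", "writer"],
    "components": {
        "readout": {"type": "Readout", "params": {"channels": 4}},
        "builder": {"type": "Builder", "params": {"events_per_block": 2}},
        "writer":  {"type": "Writer",  "params": {"path": "run001.txt"}},
    },
}

class Readout:
    def __init__(self, channels):
        self.channels = channels
    def process(self, _):
        # emit one fragment per channel (stand-in for hardware readout)
        return [f"frag-ch{c}" for c in range(self.channels)]

class Builder:
    def __init__(self, events_per_block):
        self.n = events_per_block
    def process(self, fragments):
        # group fragments into fixed-size "events"
        return [fragments[i:i + self.n] for i in range(0, len(fragments), self.n)]

class Writer:
    def __init__(self, path):
        self.path = path
    def process(self, events):
        # persist the built events, one event per line
        with open(self.path, "w") as f:
            for event in events:
                f.write(" ".join(event) + "\n")
        return events

REGISTRY = {"Readout": Readout, "Builder": Builder, "Writer": Writer}

def run(config, trigger=None):
    """Generic orchestrator: instantiate the configured components and
    pass data along the configured pipeline."""
    stages = [REGISTRY[c["type"]](**c["params"])
              for c in (config["components"][name] for name in config["pipeline"])]
    data = trigger
    for stage in stages:
        data = stage.process(data)
    return data

if __name__ == "__main__":
    print(run(CONFIG))

In this toy setup, changing the pipeline order or any component parameter requires editing only CONFIG, which is meant to mirror the abstract's point that the experiment-specific logic is reflected through configuration rather than baked into integrated DAQ code.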
LHCb is the experiment at the Large Hadron Collider devoted to studies of new phenomena in CP violation and in rare decays. This review summarizes the status of the experiment on the eve of data taking, the prospects for the first measurements, and highlights of its full physics program.
Setting up the infrastructure to manage a software project can become a task as significant as writing the software itself. A variety of useful open source tools are available, such as Web-based viewers for version control systems, wikis for collaborative discussions, and bug-tracking systems, but their use in high-energy physics, outside large collaborations, is insubstantial. Understandably, physicists would rather do physics than configure project management tools. We introduce the CEDAR HepForge system, which provides a lightweight development environment for HEP software. Services available as part of HepForge include the above-mentioned tools as well as mailing lists, shell accounts, archiving of releases, and low-maintenance Web space. HepForge also exists to promote best-practice software development methods and to provide a central repository for reusable HEP software and phenomenology codes.
The brief history, physics program, and current status of the SVD-2 detector are presented. Future plans for experiments with the upgraded SVD-2M setup are discussed.