We discuss the main issues and R&D required for the Intensity Frontier accelerators, thereby providing input for the 2013 APS/DPF Community Summer Study (Snowmass 2013).
The operation, upgrade, and development of accelerators for the Intensity Frontier face formidable challenges in satisfying both the near-term and long-term particle physics program. Here we discuss the key issues and R&D required for the Intensity Frontier accelerators.
The Intensity Frontier (IF) is a primary focus of the U.S.-based particle physics program. It encompasses a large spectrum of physics, including quark flavor physics, charged lepton processes, neutrinos, baryon number violation, new light weakly-coupled particles, and nucleons, nuclei and atoms. There are many experiments, a range of scales in data output and throughput, and a wide range in the number of experimenters. The experiments, projects and theory in this area all require demanding computing capabilities and technologies. The IF experiments have significant computing requirements for simulation, theory and modeling, beam line and experiment design, triggers and DAQ, online monitoring, event reconstruction and processing, and physics analysis. We have conducted a qualitative survey of the current and near-term future experiments in the IF to understand the computing demands of this area and their expected evolution. This report details the expected computing requirements for the IF in the context of the Snowmass Community Summer Study 2013.
Experimental results and simulation models show that crystals could play an important role in the development of new generations of high-energy and high-intensity particle accelerators, and could open innovative possibilities at existing ones. In this paper we describe the most advanced manufacturing techniques for crystals suitable for operation at ultra-high-energy and ultra-high-intensity particle accelerators, taking as an example of a potential application the collimation of the particle beams circulating in the Large Hadron Collider at CERN, which will be upgraded with bent crystals within the framework of the High-Luminosity Large Hadron Collider project.
The ever-increasing demands placed upon machine performance have created a need for more comprehensive particle accelerator modeling. Computer simulations are key to the success of particle accelerators. Many aspects of particle accelerators rely on computer modeling at some point, sometimes requiring complex simulation tools and massively parallel supercomputing. Examples include the modeling of beams at extreme intensities and densities (toward the quantum degeneracy limit) and with ultra-fine control (down to the level of individual particles). In the future, adaptively tuned models might also be relied upon to provide beam measurements beyond the resolution of existing diagnostics. Much time and effort has been put into creating accelerator software tools, some of which are highly successful. However, there are also shortcomings, such as the general inability of existing software to be easily modified to meet changing simulation needs. In this paper, possible mitigating strategies are discussed for issues faced by the accelerator community as it endeavors to produce better and more comprehensive modeling tools, including the lack of coordination between code developers, the lack of standards to make codes portable and/or reusable, and the lack of documentation, among others.
In this Snowmass whitepaper, we describe the impact of ongoing and proposed intensity frontier experiments on the parameter space of the Minimal Supersymmetric Standard Model (MSSM). We extend a set of phenomenological MSSM (pMSSM) models to include non-zero CP-violating phases and study the sensitivity of various flavor observables in these scenarios. Future electric dipole moment and rare meson decay experiments can have a strong impact on the viability of these models that is relatively independent of the detailed superpartner spectrum. In particular, we find that these experiments have the potential to probe models that are expected to escape searches at the high-luminosity LHC.