
Beam-based aperture measurements with movable collimator jaws as performance booster of the CERN Large Hadron Collider

Publication date: 2021
Field: Physics
Language: English





The beam aperture of a particle accelerator defines the clearance available for the circulating beams and is a parameter of paramount importance for the accelerator performance. At the CERN Large Hadron Collider (LHC), knowledge and control of the available aperture are crucial because the nominal proton beams carry a stored energy of 362 MJ in a superconducting environment. The loss of even a tiny fraction of the beam could quench the superconducting magnets or cause severe material damage. Furthermore, in a circular collider, the peak-luminosity performance depends to a large extent on the aperture of the inner triplet quadrupoles, which focus the beams at the interaction points. In the LHC, this is the smallest aperture at top energy with squeezed beams and determines the maximum potential reach of the peak luminosity. Beam-based aperture measurements in these conditions are challenging. In this paper, we present different methods developed over the years for precise beam-based aperture measurements in the LHC, highlighting applications and results that contributed to boosting the operational LHC performance in Run 1 (2010-2013) and Run 2 (2015-2018).
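Aperture measurements like these are usually quoted in units of the local transverse beam size sigma, where sigma = sqrt(beta * eps_geom) and the geometric emittance follows from the normalised emittance divided by the Lorentz factor. A minimal sketch of that conversion, with purely illustrative numbers that are not taken from the paper:

```python
import math

def aperture_in_sigma(half_gap_m, beta_m, eps_norm_m, gamma_rel):
    """Express a physical half-aperture in units of the local beam size.

    sigma = sqrt(beta * eps_geom), with eps_geom = eps_norm / gamma_rel
    (ultra-relativistic approximation).
    """
    eps_geom = eps_norm_m / gamma_rel
    sigma = math.sqrt(beta_m * eps_geom)
    return half_gap_m / sigma

# Illustrative values (assumed, not from the paper): a 30 mm half-aperture
# at beta = 4 km, normalised emittance 3.5 um, 6.5 TeV protons.
gamma = 6500e9 / 938.272e6          # Lorentz factor for 6.5 TeV protons
n_sigma = aperture_in_sigma(30e-3, 4000.0, 3.5e-6, gamma)
print(f"aperture = {n_sigma:.1f} sigma")  # prints: aperture = 21.1 sigma
```

The same conversion works in reverse: a collimator jaw set at a known number of sigma defines the physical gap against which the measured aperture is compared.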





This paper presents a review of the recent Machine Learning activities carried out on beam measurements performed at the CERN Large Hadron Collider. It has been accepted for publication in IEEE Instrumentation and Measurement Magazine; in the published version, no abstract is provided.
The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During its first, very successful running period in 2010--2013, the LHC routinely stored protons at 3.5--4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence, a multi-stage collimation system has been installed in order to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte-Carlo simulations of particle-matter interactions both in collimators and in machine elements hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complexity of the simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning over 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are also used for the design of future accelerators.
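The quoted stored energy follows directly from the fill parameters: number of bunches times protons per bunch times energy per proton. A back-of-envelope check with illustrative Run-1-like values (assumed here, not taken from the paper):

```python
# Back-of-envelope estimate of the stored beam energy for a
# Run-1-like fill (illustrative parameters, not from the paper).
E_PROTON_EV = 4.0e12                # proton energy: 4 TeV
N_BUNCHES = 1380                    # bunches per beam
PROTONS_PER_BUNCH = 1.6e11
EV_TO_J = 1.602176634e-19           # elementary charge in coulombs

stored_energy_J = N_BUNCHES * PROTONS_PER_BUNCH * E_PROTON_EV * EV_TO_J
print(f"stored energy ~ {stored_energy_J / 1e6:.1f} MJ")  # prints: stored energy ~ 141.5 MJ
```

The result lands in the same ballpark as the 146 MJ figure in the abstract, the exact value depending on the actual bunch count and intensity of a given fill.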
Machine learning entails a broad range of techniques that have been widely used in science and engineering for decades. High-energy physics has also profited from the power of these tools for advanced analysis of collider data. Only recently has Machine Learning started to be applied successfully in the domain of Accelerator Physics, as testified by the intense efforts deployed in this domain by several laboratories worldwide. This is also the case at CERN, where focused efforts have recently been devoted to the application of Machine Learning techniques to beam dynamics studies at the Large Hadron Collider (LHC). This implies a wide spectrum of applications, from beam measurements and machine performance optimisation to the analysis of numerical data from tracking simulations of non-linear beam dynamics. In this paper, the LHC-related applications currently pursued are presented and discussed in detail, with attention also paid to future developments.
A good understanding of the luminosity performance in a collider, as well as reliable tools to analyse, predict, and optimise the performance, are of great importance for the successful planning and execution of future runs. In this article, we present two different models for the evolution of the beam parameters and the luminosity in heavy-ion colliders. The first, Collider Time Evolution (CTE), is a particle tracking code, while the second, the Multi-Bunch Simulation (MBS), is based on the numerical solution of ordinary differential equations for the beam parameters. As a benchmark, we compare simulations and data for a large number of physics fills in the 2018 Pb-Pb run at the CERN Large Hadron Collider (LHC), finding excellent agreement for most parameters, both between the simulations and with the measured data. Both codes are then used independently to predict the performance in future heavy-ion operation, with both Pb-Pb and p-Pb collisions, at the LHC and its upgrade, the High-Luminosity LHC. The use of two independent codes based on different principles gives increased confidence in the results.
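To make the ODE-based approach concrete, here is a deliberately minimal sketch of one ingredient such models contain: intensity burn-off from collisions alone, dN/dt = -n_IP * sigma_bo * L with L scaling as the square of the intensity. The actual MBS model is far richer (emittance growth, intrabeam scattering, debunching, etc.), and all numbers below are assumed for illustration, not taken from the paper:

```python
# Minimal sketch of luminosity-driven intensity burn-off:
#   dN/dt = -n_ip * sigma_bo * L,   L = L0 * (N / N0)**2
# integrated with a simple forward-Euler step.
def burn_off(N0, L0, sigma_bo, n_ip, t_end, dt):
    """Return (final intensity, final luminosity) after t_end seconds."""
    N, t = N0, 0.0
    while t < t_end:
        L = L0 * (N / N0) ** 2
        N -= n_ip * sigma_bo * L * dt
        t += dt
    return N, L0 * (N / N0) ** 2

# Illustrative fill (assumed): 2e14 particles, peak luminosity
# 1e38 m^-2 s^-1, burn-off cross-section 1e-29 m^2 (~100 mb),
# 2 interaction points, 10-hour fill, 10 s time step.
N_end, L_end = burn_off(2.0e14, 1.0e38, 1.0e-29, 2, 10 * 3600, 10.0)
```

With these inputs, roughly a quarter of the beam burns off over the fill; a particle-tracking code such as CTE arrives at comparable quantities by following macro-particles instead of integrating averaged equations.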
C. M. Bhat, S. Bhat (2017)
Increasing proton beam power on neutrino production targets is one of the major goals of the Fermilab long-term accelerator programs. In this effort, the Fermilab 8 GeV Booster synchrotron plays a critical role for at least the next two decades. Therefore, understanding the Booster in great detail is important as we continue to improve its performance. For example, it is important to know accurately the available RF power in the Booster by carrying out beam-based measurements, in order to specify the needed upgrades to the Booster RF system. Since the Booster magnetic field is changing continuously, measuring and calibrating the RF voltage is not a trivial task. Here, we present a beam-based method for RF voltage measurements. Data analysis is carried out using computer programs developed in Python and MATLAB. The method presented here is applicable to any RCS that does not have a flat-bottom and flat-top in its acceleration magnetic ramp. We have also carried out longitudinal beam tomography at injection and extraction energies with the data used for the RF voltage measurements. Beam-based RF voltage measurements and beam tomography had never before been performed for the Fermilab Booster. The results from these investigations will be very useful in future intensity upgrades.
