
Performance and luminosity models for heavy-ion operation at the CERN Large Hadron Collider

Added by Roderik Bruce
Publication date: 2021
Field: Physics
Language: English





A good understanding of the luminosity performance in a collider, as well as reliable tools to analyse, predict, and optimise the performance, are of great importance for the successful planning and execution of future runs. In this article, we present two different models for the evolution of the beam parameters and the luminosity in heavy-ion colliders. The first, Collider Time Evolution (CTE) is a particle tracking code, while the second, the Multi-Bunch Simulation (MBS), is based on the numerical solution of ordinary differential equations for beam parameters. As a benchmark, we compare simulations and data for a large number of physics fills in the 2018 Pb-Pb run at the CERN Large Hadron Collider (LHC), finding excellent agreement for most parameters, both between the simulations and with the measured data. Both codes are then used independently to predict the performance in future heavy-ion operation, with both Pb-Pb and p-Pb collisions, at the LHC and its upgrade, the High-Luminosity LHC. The use of two independent codes based on different principles gives increased confidence in the results.
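As an illustration of the second approach, the sketch below integrates a toy system of ordinary differential equations for the bunch intensity and normalized emittance of a single pair of colliding Pb bunches. It is not the MBS code itself: the cross section, optics values, and the constant emittance-growth time standing in for intrabeam scattering are assumptions chosen only to make the example run.

```python
# Minimal sketch of an ODE-based luminosity-evolution model (not the actual
# MBS code): one pair of identical, round, Gaussian Pb bunches colliding in a
# single interaction point, with losses from luminosity burn-off only and a
# constant emittance-growth time standing in for intrabeam scattering.
# All numerical values below are assumptions chosen for illustration.
import numpy as np
from scipy.integrate import solve_ivp

F_REV = 11245.0          # LHC revolution frequency [Hz]
SIGMA_BO = 515e-28       # assumed total burn-off cross section, ~515 b [m^2]
BETA_STAR = 0.5          # assumed beta function at the interaction point [m]
GAMMA_REL = 2700.0       # assumed relativistic gamma of the Pb beam
TAU_EPS = 10.0 * 3600.0  # assumed emittance growth time constant [s]

def luminosity(n_bunch, eps_n):
    """Single-bunch luminosity for round beams, no crossing angle or hourglass."""
    sigma2 = eps_n / GAMMA_REL * BETA_STAR   # geometric beam size squared [m^2]
    return F_REV * n_bunch**2 / (4.0 * np.pi * sigma2)

def rhs(t, y):
    n_bunch, eps_n = y
    lumi = luminosity(n_bunch, eps_n)
    return [-SIGMA_BO * lumi,                # burn-off losses
            eps_n / TAU_EPS]                 # crude stand-in for IBS blow-up

y0 = [2.3e8, 1.65e-6]                        # ions per bunch, eps_n [m rad]
sol = solve_ivp(rhs, (0.0, 8.0 * 3600.0), y0, max_step=60.0)
print(f"ions per bunch after 8 h: {sol.y[0, -1]:.3e}")
```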

Related research

We have studied the time evolution of the heavy ion luminosity and bunch intensities in the Relativistic Heavy Ion Collider (RHIC), at BNL, and in the Large Hadron Collider (LHC), at CERN. First, we present measurements from a large number of RHIC stores (from Run 7), colliding 100 GeV/nucleon Au beams without stochastic cooling. These are compared with two different calculation methods. The first is a simulation based on multi-particle tracking taking into account collisions, intrabeam scattering, radiation damping, and synchrotron and betatron motion. In the second, faster, method, a system of ordinary differential equations with terms describing the corresponding effects on emittances and bunch populations is solved numerically. Results of the tracking method agree very well with the RHIC data. With the faster method, significant discrepancies are found since the losses of particles diffusing out of the RF bucket due to intrabeam scattering are not modeled accurately enough. Finally, we use both methods to make predictions of the time evolution of the future Pb beams in the LHC at injection and collision energy. For this machine, the two methods agree well.
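A minimal sketch of the complementary tracking view is given below: instead of integrating ODEs, each macro-particle carries a weight and is removed with a per-step probability derived from the instantaneous luminosity. It is not the actual tracking code used in these studies; the cross section, beam size, and intensity are assumptions, and intrabeam scattering, radiation damping, and RF-bucket losses are deliberately left out.

```python
# Minimal macro-particle sketch (not the actual tracking code) of collision
# burn-off: each macro-particle is removed with a probability proportional to
# the instantaneous single-bunch luminosity, assuming round Gaussian beams
# and neglecting IBS, radiation damping, and debunching losses.
import numpy as np

rng = np.random.default_rng(0)

F_REV, SIGMA_BO = 11245.0, 515e-28       # rev. frequency [Hz], assumed cross section [m^2]
SIGMA_BEAM2 = 3.0e-10                    # assumed transverse beam size squared [m^2]
N_REAL0, N_MACRO = 2.3e8, 20000          # assumed real ions per bunch, macro-particles
weight = N_REAL0 / N_MACRO               # real ions represented by one macro-particle

alive = np.ones(N_MACRO, dtype=bool)
dt = 60.0                                # time step [s]
for step in range(8 * 60):               # 8 hours in 1-minute steps
    n_real = alive.sum() * weight
    lumi = F_REV * n_real**2 / (4.0 * np.pi * SIGMA_BEAM2)
    # probability for a given ion to collide (and be lost) during dt
    p_loss = SIGMA_BO * lumi * dt / n_real
    alive &= rng.random(N_MACRO) >= p_loss
print("surviving fraction after 8 h:", alive.sum() / N_MACRO)
```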
This paper presents a review of the recent Machine Learning activities carried out on beam measurements performed at the CERN Large Hadron Collider. It has been accepted for publication in IEEE Instrumentation and Measurement Magazine; no abstract is provided in the published version.
Machine learning entails a broad range of techniques that have been widely used in science and engineering for decades. High-energy physics has also profited from the power of these tools for advanced analysis of collider data. Only recently has Machine Learning started to be applied successfully in the domain of accelerator physics, as testified by the intense efforts deployed in this domain by several laboratories worldwide. This is also the case at CERN, where focused efforts have recently been devoted to the application of Machine Learning techniques to beam dynamics studies at the Large Hadron Collider (LHC). This implies a wide spectrum of applications, from beam measurements and machine performance optimisation to the analysis of numerical data from tracking simulations of non-linear beam dynamics. In this paper, the LHC-related applications currently pursued are presented and discussed in detail, with attention also paid to future developments.
M. Petrovici, A. Lindner, A. Pop (2018)
Based on the recent RHIC and LHC experimental results, the $\langle p_T\rangle$ dependence of identified light-flavour charged hadrons on $\sqrt{(\frac{dN}{dy})/S_{\perp}}$, the relevant scale in the gluon saturation picture, is studied from $\sqrt{s_{NN}}=7.7$ GeV up to 5.02 TeV. This study is extended to the slopes of the $\langle p_T\rangle$ dependence on the particle mass and the $\langle\beta_T\rangle$ parameter from Boltzmann-Gibbs Blast Wave (BGBW) fits of the $p_T$ spectra. A systematic decrease of the slope of the $\langle p_T\rangle$ dependence on $\sqrt{(\frac{dN}{dy})/S_{\perp}}$ from BES to LHC energies is evidenced. While at the RHIC energies, within the experimental errors, $\langle p_T\rangle/\sqrt{(\frac{dN}{dy})/S_{\perp}}$ does not depend on centrality, at the LHC energies a deviation from a linear behaviour is observed towards the most central collisions. The influence of the corona contribution to the observed trends is discussed. The slopes of the $\langle p_T\rangle$ particle-mass dependence and the $\langle\beta_T\rangle$ parameter from BGBW fits scale well with $\sqrt{(\frac{dN}{dy})/S_{\perp}}$. Similar systematic trends for pp at $\sqrt{s}=7$ TeV are in good agreement with those corresponding to Pb-Pb collisions at $\sqrt{s_{NN}}=2.76$ TeV and 5.02 TeV, pointing to a system-size-independent behaviour.
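As a small illustration of the scaling variable used above, the snippet below computes $\sqrt{(\frac{dN}{dy})/S_{\perp}}$ for a few centrality classes and extracts the slope of $\langle p_T\rangle$ against it with a linear fit. All numbers are hypothetical placeholders, not measured values from the paper.

```python
# Minimal sketch of the scaling study described above: compute the
# gluon-saturation variable sqrt((dN/dy)/S_perp) and extract the slope of
# <p_T> against it by a linear fit. The input arrays are placeholders,
# not measured data.
import numpy as np

dn_dy   = np.array([100.0, 400.0, 900.0, 1600.0])  # charged-particle dN/dy (hypothetical)
s_perp  = np.array([10.0, 18.0, 24.0, 28.0])       # transverse overlap area [fm^2] (hypothetical)
mean_pt = np.array([0.45, 0.52, 0.58, 0.64])       # <p_T> [GeV/c] (hypothetical)

x = np.sqrt(dn_dy / s_perp)                         # scaling variable [fm^-1]
slope, intercept = np.polyfit(x, mean_pt, 1)
print(f"slope of <p_T> vs sqrt((dN/dy)/S_perp): {slope:.3f} GeV/c per fm^-1")
```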
The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the first very successful running period in 2010–2013, the LHC routinely stored protons at 3.5–4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence, a multi-stage collimation system has been installed to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte Carlo simulations of particle-matter interactions both in collimators and in machine elements hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complexity of the simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are also used for the design of future accelerators.
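The sketch below gives a highly simplified picture of the leakage mechanism discussed above: halo protons diffuse in normalized transverse amplitude, a primary collimator intercepts those above its half gap, and a small assumed single-passage leakage probability lets some escape to be lost elsewhere in the ring. It is not the multi-turn tracking and particle-matter simulation used in the article; the gap, diffusion step, and leakage probability are placeholder assumptions.

```python
# Minimal sketch (not the actual collimation simulation) of halo protons
# diffusing in normalized transverse amplitude and being intercepted by a
# primary collimator; an assumed single-passage leakage probability lets
# some protons escape and be lost elsewhere in the ring.
import numpy as np

rng = np.random.default_rng(1)

N_P       = 100_000      # halo protons (macro-particles)
CUT_SIGMA = 6.0          # assumed collimator half gap in beam sigmas
P_LEAK    = 0.01         # assumed probability to scatter out instead of being absorbed
DIFF_STEP = 0.02         # assumed rms amplitude growth per turn [sigma]

amp = rng.uniform(4.0, 6.0, N_P)   # start the halo just inside the collimator
absorbed = leaked = 0
for turn in range(1000):
    amp += rng.normal(0.0, DIFF_STEP, N_P)
    hit = amp > CUT_SIGMA                       # protons touching the collimator this turn
    leak_mask = hit & (rng.random(N_P) < P_LEAK)
    leaked += leak_mask.sum()
    absorbed += hit.sum() - leak_mask.sum()
    amp[hit] = -np.inf                          # remove intercepted protons from tracking
print("leakage fraction:", leaked / max(absorbed + leaked, 1))
```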