During the beam commissioning of the Large Hadron Collider (LHC) with 150, 75, 50 and 25-ns bunch spacing, significant electron-cloud effects, such as pressure rise, cryogenic heat load, beam instabilities and emittance growth, were observed. A method has been developed to infer key beam-pipe surface parameters by benchmarking simulations against the pressure rise observed in the machine. This method allows us to monitor the scrubbing process (i.e. the reduction of the secondary emission yield as a function of time) in the regions where the vacuum-pressure gauges are located, in order to decide on the most appropriate strategies for machine operation. In this paper we present the methodology and first results from applying this technique to the LHC.
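The inference described above can be sketched as a parameter scan: simulate the expected pressure rise for a range of secondary emission yield (SEY) values and pick the value that best reproduces the gauge reading. The analytic stand-in for the build-up simulation below, the multipacting threshold of 1.3, and all numerical values are illustrative assumptions, not the paper's actual model.

```python
import math

def simulated_pressure_rise(sey):
    """Stand-in for a build-up simulation (PyECLOUD-style): the
    predicted pressure rise grows steeply once the SEY exceeds the
    multipacting threshold. Illustrative toy model only."""
    threshold = 1.3                     # assumed multipacting threshold
    if sey <= threshold:
        return 1e-10                    # baseline pressure, mbar
    return 1e-10 * (1.0 + 5e3 * (sey - threshold) ** 2)

def infer_sey(measured_pressure, sey_grid):
    """Return the SEY on the grid whose simulated pressure rise is
    closest (in log scale) to the measured one."""
    return min(sey_grid,
               key=lambda s: abs(math.log(simulated_pressure_rise(s))
                                 - math.log(measured_pressure)))

sey_grid = [1.0 + 0.01 * i for i in range(101)]   # scan SEY from 1.0 to 2.0
estimate = infer_sey(measured_pressure=5e-8, sey_grid=sey_grid)
print(f"inferred SEY = {estimate:.2f}")
```

Repeating this fit as the machine is scrubbed yields the SEY-versus-time curve that the abstract refers to.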
After a successful scrubbing run at the beginning of 2011, the LHC can now be operated with high-intensity proton beams with 50 ns bunch spacing. However, strong electron cloud effects were observed during machine studies with the nominal beam with 25 ns bunch spacing. In particular, fast transverse instabilities were observed when attempting to inject trains of 48 bunches into the LHC for the first time. An analysis of the turn-by-turn, bunch-by-bunch data from the transverse damper pick-ups during these injection studies is presented, showing a clear signature of the electron cloud effect. These experimental observations are reproduced using numerical simulations: the electron distribution before each bunch passage is generated with PyECLOUD and used as input for a set of HEADTAIL simulations. This paper describes the simulation method as well as the sensitivity of the results to the initial conditions for the electron build-up. Both the potential and the clear limitations of this type of simulation are discussed.
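The two-step simulation chain described above can be sketched as follows: a build-up stage (PyECLOUD in the paper, stubbed here by a simple saturation curve) supplies the electron density seen by each bunch, and a beam-dynamics stage (HEADTAIL in the paper, reduced here to a single-particle toy model) then tracks the bunch turn by turn with the cloud acting as an additional focusing kick. Every number and the physics model below are illustrative assumptions.

```python
import math

def build_up_density(bunch_index, sey=1.8):
    """Stand-in for the PyECLOUD build-up stage: the electron density
    saturates along the bunch train (illustrative curve, not a real
    build-up model)."""
    rho_sat = 1e12 * max(sey - 1.3, 0.0)         # e-/m^3, illustrative
    return rho_sat * (1.0 - math.exp(-bunch_index / 10.0))

def track_centroid(rho_e, n_turns=200, tune=0.31):
    """Stand-in for the HEADTAIL stage: linear betatron motion plus a
    cloud kick proportional to the electron density. In this toy linear
    model the kick shifts the coherent tune rather than driving the
    full instability seen in the real simulations."""
    x, xp = 1e-4, 0.0                            # initial offset, m
    k_cloud = 1e-16 * rho_e                      # toy coupling constant
    c, s = math.cos(2 * math.pi * tune), math.sin(2 * math.pi * tune)
    history = []
    for _ in range(n_turns):
        x, xp = c * x + s * xp, -s * x + c * xp  # one-turn betatron rotation
        xp += k_cloud * x                        # electron-cloud kick
        history.append(abs(x))
    return history

rho = build_up_density(bunch_index=47)           # last bunch of a 48-bunch train
amps = track_centroid(rho)
print(f"density = {rho:.2e} e-/m^3, final |x| = {amps[-1]:.2e} m")
```

The key point of the chain is the handoff: the density (in the paper, the full electron distribution) computed per bunch passage by the build-up code becomes the initial condition for the tracking code.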
Several indicators have pointed to the presence of an Electron Cloud (EC) in some of the CERN accelerators when operating with closely spaced bunched beams. In particular, spurious signals on the pick-ups used for beam detection, pressure rise and beam instabilities were observed at the Proton Synchrotron (PS) during the last stage of preparation of the beams for the Large Hadron Collider (LHC), as well as at the Super Proton Synchrotron (SPS). Since the LHC started operation in 2009, typical electron cloud phenomena have also appeared in this machine when running with trains of closely packed bunches (i.e. with spacings below 150 ns). Besides the above-mentioned indicators, other typical signatures were seen in this machine (due to its operation mode and/or more refined detection possibilities), such as heat load in the cold dipoles, bunch-dependent emittance growth and degraded lifetime in store, and a bunch-by-bunch stable phase shift to compensate for the energy loss due to the electron cloud. An overview of the electron cloud status in the different CERN machines (PS, SPS, LHC) will be presented in this paper, with a special emphasis on the dangers for future operation with more intense beams and the necessary countermeasures to mitigate or suppress the effect.
In the beam pipe of the positron damping ring of the Next Linear Collider, electrons will be created by beam interaction with the surrounding vacuum chamber wall and give rise to an electron cloud. Several solutions are possible for avoiding the electron cloud, without changing the bunch structure or the diameter of the vacuum chamber. Some of the currently available solutions for preventing this spurious electron load include reducing residual gas ionization by the beam, minimizing beam photon-induced electron production, and lowering the secondary electron yield (SEY) of the chamber wall. We will report on recent SEY measurements performed at SLAC on TiN coatings and TiZrV non-evaporable getter thin films.
A precise determination of absolute luminosity, using the bremsstrahlung process, at the future Electron-Ion Collider (EIC) will be very demanding, and its three major challenges are discussed herein. First, the bremsstrahlung rate suppression due to the so-called beam size effect has to be well controlled. Secondly, the impact of huge synchrotron radiation fluxes should be mitigated. Thirdly, enormous bremsstrahlung event rates, in excess of 10 GHz, have to be coped with. A basic layout of the luminosity measurement setup at the EIC, addressing these issues, is proposed, including preliminary detector technology choices. Finally, the uncertainties of three proposed methods are also discussed.
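The quoted event rate in excess of 10 GHz implies heavy pile-up of bremsstrahlung events within each bunch crossing, which a quick Poisson estimate makes concrete. The 100 MHz crossing rate below is an illustrative assumption for this sketch, not an EIC design value.

```python
import math

event_rate = 10e9        # bremsstrahlung rate, Hz (from the text)
crossing_rate = 100e6    # assumed bunch-crossing rate, Hz (illustrative)

mu = event_rate / crossing_rate      # mean events per bunch crossing
p_empty = math.exp(-mu)              # Poisson probability of an empty crossing
print(f"mean events/crossing = {mu:.0f}, P(no event) = {p_empty:.1e}")
```

With tens of events per crossing on average, counting individual bremsstrahlung photons is impractical, which motivates calorimetric (energy-integrating) luminosity methods.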
The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the first very successful running period in 2010--2013, the LHC was routinely storing protons at 3.5--4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence a multi-stage collimation system has been installed in order to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte-Carlo simulations of particle-matter interactions both in collimators and machine elements being hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complexity of the simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are also used for the design of future accelerators.
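The factor-of-2 agreement quoted above amounts to a simple multiplicative check applied location by location over the loss map (beam-loss signal per ring position). The location names and loss values below are invented placeholders, not LHC data.

```python
def within_factor(sim, meas, factor=2.0):
    """True if the simulated and measured loss at one location agree
    within the given multiplicative factor."""
    return meas / factor <= sim <= meas * factor

# Placeholder loss maps (arbitrary units); real comparisons use
# measured beam-loss-monitor signals around the full 27 km ring.
simulated = {"IR7_coll": 1.0e-4, "IR3_coll": 2.0e-5, "arc_81": 3.0e-7}
measured  = {"IR7_coll": 1.6e-4, "IR3_coll": 1.1e-5, "arc_81": 5.0e-7}

agreement = {loc: within_factor(simulated[loc], measured[loc])
             for loc in simulated}
print(agreement)
```

Because the losses span some 7 orders of magnitude around the ring, a multiplicative (rather than additive) tolerance is the natural figure of merit.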