
Expected Performance of the ATLAS Experiment - Detector, Trigger and Physics

Added by Thomas Kluge
Publication date: 2008
Research language: English





A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN.
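As a rough illustration of one of the quantities named above, missing transverse energy can be thought of as the magnitude of the negative vector sum of the transverse momenta of the reconstructed objects in an event. The short Python sketch below is only a toy under that simplified definition; the object list, the values, and the function name are invented here and do not reflect the actual ATLAS reconstruction.

import math

def missing_et(objects):
    """objects: list of (pt [GeV], phi [rad]) for the reconstructed objects in an event."""
    px = -sum(pt * math.cos(phi) for pt, phi in objects)
    py = -sum(pt * math.sin(phi) for pt, phi in objects)
    return math.hypot(px, py)

# Hypothetical event: two jets and a muon; the transverse-momentum imbalance
# is what the missing-energy reconstruction is sensitive to.
event = [(120.0, 0.3), (95.0, 2.9), (40.0, -1.2)]
print(f"toy MET = {missing_et(event):.1f} GeV")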



Related research

Many theoretical models, like the Standard Model or SUSY at large tan(beta), predict Higgs bosons or new particles which decay more abundantly to final states including tau leptons than to other leptons. At the energy scale of the LHC, the identification of tau leptons, in particular in the hadronic decay mode, will be a challenging task due to an overwhelming QCD background which gives rise to jets of particles that can be hard to distinguish from hadronic tau decays. Equipped with excellent tracking and calorimetry, the ATLAS experiment has developed tau identification tools capable of working at the trigger level. This contribution presents tau trigger algorithms which exploit the main features of hadronic tau decays and describes the current tau trigger commissioning activities.
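As a toy illustration of the two features mentioned above (low charged-track multiplicity and a narrow, isolated energy deposit), the sketch below applies invented cuts to a hypothetical tau candidate; the class name, thresholds, and data layout are assumptions for the example, and the real ATLAS identification and trigger algorithms are considerably more sophisticated.

from dataclasses import dataclass

@dataclass
class TauCandidate:
    n_core_tracks: int    # charged tracks in a narrow core cone
    core_et: float        # transverse energy in the core cone [GeV]
    iso_et: float         # transverse energy in the surrounding isolation annulus [GeV]

def passes_toy_tau_id(cand):
    # Hadronic taus decay mostly to 1 or 3 charged hadrons ("prongs")...
    prongs_ok = cand.n_core_tracks in (1, 3)
    # ...and leave a narrow, isolated deposit; QCD jets tend to be broader and busier.
    isolated = cand.iso_et < 0.1 * cand.core_et   # invented threshold
    return prongs_ok and isolated

print(passes_toy_tau_id(TauCandidate(1, 45.0, 2.0)))    # tau-like candidate: True
print(passes_toy_tau_id(TauCandidate(7, 45.0, 15.0)))   # QCD-jet-like candidate: False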
59 - Catrin Bernius 2017
The ATLAS trigger has been used very successfully for the online event selection during the first part of the second LHC run (Run-2) in 2015/16 at a center-of-mass energy of 13 TeV. The trigger system is composed of a hardware Level-1 trigger and a software-based high-level trigger; it reduces the event rate from the bunch-crossing rate of 40 MHz to an average recording rate of about 1 kHz. The excellent performance of the ATLAS trigger has been vital for the ATLAS physics program of Run-2, selecting interesting collision events for a wide variety of physics signatures with high efficiency. The trigger selection capabilities have been significantly improved compared to Run-1, in order to cope with the higher event rates and pile-up during Run-2. At the Level-1 trigger, the improvements resulted in more pile-up-robust selection efficiencies and event rates and in a reduction of fake candidate particles. A new hardware system, designed to analyze event topologies, supports a more refined event selection at Level-1. A hardware-based high-rate track reconstruction is currently being commissioned. Together with a redesign of the high-level trigger to deploy more offline-like reconstruction techniques, these changes bring the turn-on and efficiency of the trigger selection close to those of the offline reconstruction. In order to prepare for the anticipated further luminosity increase of the LHC in 2017/18, improving the trigger performance remains an ongoing endeavor. The large number of pile-up events is one of the most prominent challenges. This report gives a short review of the ATLAS trigger system and its performance in 2015/16 before describing the significant improvements in selection sensitivity and pile-up robustness implemented in preparation for the highest luminosities ever expected from the LHC in 2017/18.
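As a back-of-the-envelope view of the rate reduction quoted above, the sketch below computes the rejection factor at each level; the intermediate Level-1 output rate of roughly 100 kHz is an assumption used for illustration and is not stated in the text.

bunch_crossing_rate_hz = 40e6    # LHC bunch-crossing rate quoted above
l1_output_rate_hz      = 100e3   # assumed hardware Level-1 accept rate (illustrative)
hlt_output_rate_hz     = 1e3     # average recording rate after the high-level trigger

print(f"Level-1 keeps ~1 in {bunch_crossing_rate_hz / l1_output_rate_hz:.0f} events")
print(f"HLT keeps     ~1 in {l1_output_rate_hz / hlt_output_rate_hz:.0f} of those")
print(f"Overall:      ~1 in {bunch_crossing_rate_hz / hlt_output_rate_hz:.0f}")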
In this paper, we present the physics performance of the ESSnuSB experiment in the standard three-flavor scenario using the updated neutrino flux calculated specifically for the ESSnuSB configuration and updated migration matrices for the far detector. Taking conservative systematic uncertainties corresponding to a normalization error of $5\%$ for signal and $10\%$ for background, we find that there is $10\sigma$ $(13\sigma)$ CP violation discovery sensitivity for the baseline option of 540 km (360 km) at $\delta_{\rm CP} = \pm 90^\circ$. The corresponding fraction of $\delta_{\rm CP}$ values for which CP violation can be discovered at more than $5\sigma$ is $70\%$. Regarding CP precision measurements, the $1\sigma$ error associated with $\delta_{\rm CP} = 0^\circ$ is around $5^\circ$, and with $\delta_{\rm CP} = -90^\circ$ it is around $14^\circ$ $(7^\circ)$ for the baseline option of 540 km (360 km). For hierarchy sensitivity, one can have $3\sigma$ sensitivity for the 540 km baseline, except at $\delta_{\rm CP} = \pm 90^\circ$, and $5\sigma$ sensitivity for the 360 km baseline for all values of $\delta_{\rm CP}$. The octant of $\theta_{23}$ can be determined at $3\sigma$ for $\theta_{23} > 51^\circ$ ($\theta_{23} < 42^\circ$ and $\theta_{23} > 49^\circ$) for the baseline of 540 km (360 km). Regarding the measurement precision of the atmospheric mixing parameters, the allowed values at $3\sigma$ are $40^\circ < \theta_{23} < 52^\circ$ ($42^\circ < \theta_{23} < 51.5^\circ$) and $2.485 \times 10^{-3}$ eV$^2 < \Delta m^2_{31} < 2.545 \times 10^{-3}$ eV$^2$ ($2.49 \times 10^{-3}$ eV$^2 < \Delta m^2_{31} < 2.54 \times 10^{-3}$ eV$^2$) for the baseline of 540 km (360 km).
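For context, sensitivities like those quoted above are conventionally expressed as $n\sigma = \sqrt{\Delta\chi^2}$ for one degree of freedom; a common definition of the CP-violation discovery test statistic, assumed here rather than taken from the text, is

$$\Delta\chi^2_{\rm CPV}\left(\delta_{\rm CP}^{\rm true}\right) = \min_{\delta_{\rm CP}^{\rm test} \in \{0,\,\pi\}} \left[ \chi^2\!\left(\delta_{\rm CP}^{\rm test}\right) - \chi^2\!\left(\delta_{\rm CP}^{\rm true}\right) \right], \qquad n\sigma \simeq \sqrt{\Delta\chi^2_{\rm CPV}}.$$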
128 - A. Hamilton 2010
The ATLAS trigger has been used very successfully to collect collision data during the 2009 and 2010 LHC running at centre-of-mass energies of 900 GeV, 2.36 TeV, and 7 TeV. This paper presents the ongoing work to commission the ATLAS trigger with proton collisions, including an overview of the trigger performance based on extensive online running. We describe how the trigger has evolved with increasing LHC luminosity and give a brief overview of plans for forthcoming LHC running.
Given the extremely high output rate foreseen at the LHC and the general-purpose nature of the ATLAS experiment, an efficient and flexible way to select events in the High Level Trigger is needed. An extremely flexible solution is proposed that allows for early rejection of unwanted events and provides an easily configurable way to choose algorithms and to specify the criteria for trigger decisions. It is implemented in the standard ATLAS object-oriented software framework, Athena. The early rejection is achieved by breaking the decision process down into sequential steps. The configuration of each step defines the sequences of algorithms used to process the data and the trigger menus that define which physics signatures must be satisfied to continue on to the next step, and ultimately to accept the event. A navigation system has been built on top of the standard Athena transient store (StoreGate) to link the event data together in a tree-like structure. This is fundamental to the seeding mechanism, by which data from one step are presented to the next. The design makes it straightforward to reuse existing offline reconstruction data classes and algorithms where they are suitable.
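A minimal sketch of the stepwise early-rejection idea described above follows. The class and function names are invented for illustration, as is the toy calorimeter-cell event; the real Athena steering, trigger menus, and StoreGate navigation are far richer than this.

class Step:
    def __init__(self, name, algorithm, signature):
        self.name = name
        self.algorithm = algorithm    # seeded reconstruction: (event, seed) -> feature
        self.signature = signature    # "menu" requirement on this step's features

def run_chain(event, initial_seeds, steps):
    """Run the steps in order; reject the event as soon as a signature fails."""
    seeds = initial_seeds             # e.g. Level-1 regions of interest
    for step in steps:
        features = [step.algorithm(event, s) for s in seeds]
        if not step.signature(features):
            return False              # early rejection: later steps never execute
        seeds = features              # this step's output seeds the next (navigation)
    return True                       # every signature satisfied: accept the event

# Toy usage: two steps refining a crude energy estimate around each seed.
event = {"cells": {(1, 1): 30.0, (1, 2): 25.0, (5, 5): 3.0}}
rois = [(1, 1), (5, 5)]
steps = [
    Step("fast_calo", lambda ev, roi: ev["cells"].get(roi, 0.0),
         lambda feats: any(e > 20.0 for e in feats)),
    Step("precision", lambda ev, e: e * 0.98,     # pretend refined calibration
         lambda feats: any(e > 25.0 for e in feats)),
]
print(run_chain(event, rois, steps))  # True: one region of interest passes both steps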
