
Monotop phenomenology at the Large Hadron Collider

Added by Benjamin Fuks
Publication date: 2013
Language: English





We investigate new physics scenarios where systems comprised of a single top quark accompanied by missing transverse energy, dubbed monotops, can be produced at the LHC. Following a simplified model approach, we describe all possible monotop production modes via an effective theory and estimate the sensitivity of the LHC, assuming 20 fb$^{-1}$ of collisions at a center-of-mass energy of 8 TeV, to the observation of a monotop state. Considering both leptonic and hadronic top quark decays, we show that large fractions of the parameter space are reachable and that new physics particles with masses ranging up to 1.5 TeV can leave hints within the 2012 LHC dataset, assuming moderate new physics coupling strengths.
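As a rough, purely illustrative sketch of the kind of counting-experiment sensitivity estimate described above, the Python snippet below converts an assumed monotop signal cross section into an expected event yield for 20 fb$^{-1}$ and a naive $S/\sqrt{B}$ significance. All numerical inputs (cross section, efficiency, background yield) are placeholders, not values from the paper.

import math

# Toy counting-experiment sensitivity estimate for a monotop-like search.
# All numbers are illustrative placeholders, not results from the paper.
lumi_fb = 20.0        # integrated luminosity in fb^-1 (2012 dataset, 8 TeV)
sigma_sig_fb = 50.0   # assumed signal cross section times branching ratio, in fb
eff_sig = 0.10        # assumed signal selection efficiency
n_bkg = 40.0          # assumed background yield after selection

n_sig = sigma_sig_fb * lumi_fb * eff_sig   # expected signal events
significance = n_sig / math.sqrt(n_bkg)    # naive S/sqrt(B)

print(f"expected signal events: {n_sig:.1f}")
print(f"naive significance:     {significance:.1f} sigma")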



Related research

We present a new calculation of the energy distribution of high-energy neutrinos from the decay of charm and bottom hadrons produced at the Large Hadron Collider (LHC). In the kinematical region of very forward rapidities, heavy-flavor production and decay is a source of tau neutrinos that leads to thousands of charged-current tau neutrino events in a 1 m long, 1 m radius lead neutrino detector at a distance of 480 m from the interaction region. In our computation, next-to-leading order QCD radiative corrections are accounted for in the production cross-sections. Non-perturbative intrinsic-$k_T$ effects are approximated by a simple phenomenological model introducing a Gaussian $k_T$-smearing of the parton distribution functions, which might also mimic perturbative effects due to multiple initial-state soft-gluon emissions. The transition from partonic to hadronic states is described by phenomenological fragmentation functions. To study the effect of various input parameters, theoretical predictions for $D_s^\pm$ production are compared with LHCb data on double-differential cross-sections in transverse momentum and rapidity. The uncertainties related to the choice of the input parameter values, ultimately affecting the predictions of the tau neutrino event distributions, are discussed. We consider a 3+1 neutrino mixing scenario to illustrate the potential for a neutrino experiment to constrain the 3+1 parameter space using tau neutrinos and antineutrinos. We find large theoretical uncertainties in the predictions of the neutrino fluxes in the far-forward region. Untangling the effects of tau neutrino oscillations into sterile neutrinos and distinguishing a 3+1 scenario from the standard scenario with three active neutrino flavours will be challenging due to the large theoretical uncertainties from QCD.
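The Gaussian intrinsic-$k_T$ smearing mentioned above can be illustrated with a minimal Monte Carlo sketch in Python; the width and the implementation details below are assumptions for illustration only, not the parameters used in the calculation.

import numpy as np

# Minimal sketch of a Gaussian intrinsic-kT smearing of an initial-state parton.
# sigma_kT is a placeholder width, not the value adopted in the paper.
rng = np.random.default_rng(42)
sigma_kT = 1.0  # GeV, assumed Gaussian width of the intrinsic transverse momentum

def smear_parton(px, py):
    """Add a random transverse kick drawn from a 2D Gaussian of width sigma_kT."""
    kx, ky = rng.normal(0.0, sigma_kT, size=2)
    return px + kx, py + ky

# Example: a parton initially collinear with the beam acquires a small kT.
px, py = smear_parton(0.0, 0.0)
print(f"kT = {np.hypot(px, py):.2f} GeV")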
For the foreseeable future, the exploration of the high-energy frontier will be the domain of the Large Hadron Collider (LHC). Of particular significance will be its high-luminosity upgrade (HL-LHC), which will operate until the mid-2030s. In this endeavour, an improved understanding of the parton distribution functions (PDFs) of the proton is critical for the full exploitation of the HL-LHC physics potential. The HL-LHC program would be uniquely complemented by the proposed Large Hadron electron Collider (LHeC), a high-energy lepton-proton and lepton-nucleus collider based at CERN. In this work, we build on our recent PDF projections for the HL-LHC to assess the constraining power of the LHeC measurements of inclusive and heavy quark structure functions. We find that the impact of the LHeC would be significant, reducing PDF uncertainties by up to an order of magnitude in comparison to state-of-the-art global fits. In comparison to the HL-LHC projections, the PDF constraints from the LHeC are in general more significant for small and intermediate values of the momentum fraction x. At higher values of x, the impact of the LHeC and HL-LHC data is expected to be of a comparable size, with the HL-LHC constraints being more competitive in some cases, and the LHeC ones in others. Our results illustrate the encouraging complementarity of the HL-LHC and the LHeC in terms of charting the quark and gluon structure of the proton.
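As a schematic of the mechanism behind such projections, where adding precise pseudo-data shrinks the fitted uncertainties, the toy Gaussian combination below (Python, with invented uncertainties that have nothing to do with an actual PDF fit) shows how a new, more precise constraint reduces the uncertainty on a single parameter.

import math

# Toy illustration: a precise new measurement tightens a fitted parameter.
# The uncertainties are invented; this is only the Gaussian-combination idea.
sigma_existing = 0.10   # relative uncertainty from existing data (assumed)
sigma_new = 0.02        # relative uncertainty of the new pseudo-data (assumed)

sigma_combined = 1.0 / math.sqrt(1.0 / sigma_existing**2 + 1.0 / sigma_new**2)
print(f"before: {sigma_existing:.3f}  after: {sigma_combined:.3f}  "
      f"reduction: x{sigma_existing / sigma_combined:.1f}")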
B.C. Allanach (2017)
We examine the phenomenology of the production, at the 13 TeV Large Hadron Collider (LHC), of a heavy resonance $X$, which decays via other new on-shell particles $n$ into multi- (i.e. three or more) photon final states. In the limit that $n$ has a much smaller mass than $X$, the multi-photon final state may dominantly appear as a two-photon final state because the $\gamma$s from the $n$ decay are highly collinear and remain unresolved. We discuss how to discriminate this scenario from $X \rightarrow \gamma\gamma$: rather than discarding non-isolated photons, it is better to relax the isolation criterion and instead form photon-jet substructure variables. The spins of $X$ and $n$ leave their imprint upon the distribution of the pseudorapidity gap $\Delta\eta$ between the apparent two-photon states. Depending on the total integrated luminosity, this can be used in many cases to claim discrimination between the possible spin choices of $X$ and $n$, although the case where $X$ and $n$ are both scalar particles cannot be discriminated from the direct $X \rightarrow \gamma\gamma$ decay in this manner. Information on the mass of $n$ can be gained by considering the mass of each photon jet.
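A minimal sketch of the two quantities at play: the opening angle of the collinear photon pair from a light $n$, which controls whether the pair is resolved, and the pseudorapidity gap $\Delta\eta$ between the two photon-jet candidates. The kinematic inputs below are invented, and the small-angle estimate $\theta_{min} \approx 2 m_n / E_n$ is a standard approximation rather than a formula taken from the paper.

import math

# Illustrative kinematics for collinear photon pairs and the photon-jet |Delta eta|.
# All numbers are invented for demonstration.
m_n = 1.0      # GeV, assumed mass of the light intermediate particle n
E_n = 500.0    # GeV, assumed energy of the boosted n

# Minimum opening angle of a symmetric two-body decay of a highly boosted particle.
theta_min = 2.0 * m_n / E_n
print(f"minimum gamma-gamma opening angle: {theta_min * 1e3:.1f} mrad")

# Pseudorapidity gap between the two apparent photon (photon-jet) candidates.
eta1, eta2 = 0.8, -0.4   # invented pseudorapidities of the two photon jets
print(f"|Delta eta| between photon jets: {abs(eta1 - eta2):.2f}")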
A radion in a scenario with a warped extra dimension can be lighter than the Higgs boson, even if the Kaluza-Klein excitation modes of the graviton turn out to be in the multi-TeV region. The discovery of such a light radion would be a gateway to new physics. We show how the two-photon decay mode can enable us to probe a radion in the mass range 60-110 GeV. We take into account the diphoton background, including fragmentation effects, and include cuts designed to suppress the background to the maximum possible extent. Our conclusion is that, with an integrated luminosity of 3000 $\mathrm{fb}^{-1}$ or less, the next run of the Large Hadron Collider should be able to detect a radion in this mass range, with a significance of 5 standard deviations or more.
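To make the luminosity statement concrete, the toy Python snippet below shows how a simple $S/\sqrt{B}$ counting significance grows with the square root of the integrated luminosity; the per-fb$^{-1}$ signal and background rates are invented placeholders, not numbers from the analysis.

import math

# Toy scaling of a counting significance with integrated luminosity.
sig_rate = 0.5    # signal events per fb^-1 after cuts (assumed)
bkg_rate = 20.0   # background events per fb^-1 after cuts (assumed)

for lumi in (300.0, 1000.0, 3000.0):   # fb^-1
    s = sig_rate * lumi
    b = bkg_rate * lumi
    print(f"L = {lumi:6.0f} fb^-1  ->  S/sqrt(B) = {s / math.sqrt(b):.1f}")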
We consider a minimal extension of the standard model where a real, gauge-singlet scalar field is added to the standard spectrum. Introducing the Ansatz of universality of scalar couplings, we are led to a scenario which has a set of very distinctive and testable predictions: (i) the mixing between the standard model Higgs and the new state is near maximal, (ii) the ratio of the two Higgs mass eigenstates is fixed ($\sim\sqrt{3}$), (iii) the decay modes of each of the two eigenstates are standard-model-like. We also study how electroweak precision tests constrain this scenario. We predict the lighter Higgs to lie in the range of 114 to 145 GeV, and hence the heavier one between 198 and 250 GeV. The predictions of the model can be tested at the upcoming LHC.
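The quoted mass ranges follow directly from the fixed eigenstate ratio: with $m_H \simeq \sqrt{3}\, m_h$ and the lighter state between 114 and 145 GeV, one finds $\sqrt{3} \times 114~\text{GeV} \approx 197~\text{GeV}$ and $\sqrt{3} \times 145~\text{GeV} \approx 251~\text{GeV}$, consistent with the quoted 198-250 GeV window for the heavier eigenstate (the ratio is only approximately $\sqrt{3}$, hence the small offsets).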