
Big Science and the Large Hadron Collider

Published by: Gian Francesco Giudice
Publication date: 2011
Research field: Physics
Paper language: English





The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question here by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.




Read also

We investigate new physics scenarios where systems comprised of a single top quark accompanied by missing transverse energy, dubbed monotops, can be produced at the LHC. Following a simplified model approach, we describe all possible monotop production modes via an effective theory and estimate the sensitivity of the LHC, assuming 20 fb$^{-1}$ of collisions at a center-of-mass energy of 8 TeV, to the observation of a monotop state. Considering both leptonic and hadronic top quark decays, we show that large fractions of the parameter space are reachable and that new physics particles with masses ranging up to 1.5 TeV can leave hints within the 2012 LHC dataset, assuming moderate new physics coupling strengths.
115 - Henry J. Frisch 2018
This is a personal and admittedly US-centric attempt to summarize the foundational impact of the Pisa CDF Group on the conceptual design, construction, and early operation of the CDF Detector at Fermilab. I have tried to go back to original documents where possible.
We present a new calculation of the energy distribution of high-energy neutrinos from the decay of charm and bottom hadrons produced at the Large Hadron Collider (LHC). In the kinematical region of very forward rapidities, heavy-flavor production and decay is a source of tau neutrinos that leads to thousands of charged-current tau neutrino events in a 1 m long, 1 m radius lead neutrino detector at a distance of 480 m from the interaction region. In our computation, next-to-leading order QCD radiative corrections are accounted for in the production cross-sections. Non-perturbative intrinsic-$k_T$ effects are approximated by a simple phenomenological model introducing a Gaussian $k_T$-smearing of the parton distribution functions, which might also mimic perturbative effects due to multiple initial-state soft-gluon emissions. The transition from partonic to hadronic states is described by phenomenological fragmentation functions. To study the effect of various input parameters, theoretical predictions for $D_s^\pm$ production are compared with LHCb data on double-differential cross-sections in transverse momentum and rapidity. The uncertainties related to the choice of the input parameter values, ultimately affecting the predictions of the tau neutrino event distributions, are discussed. We consider a 3+1 neutrino mixing scenario to illustrate the potential for a neutrino experiment to constrain the 3+1 parameter space using tau neutrinos and antineutrinos. We find large theoretical uncertainties in the predictions of the neutrino fluxes in the far-forward region. Untangling the effects of tau neutrino oscillations into sterile neutrinos and distinguishing a 3+1 scenario from the standard scenario with three active neutrino flavours will be challenging due to the large theoretical uncertainties from QCD.
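The Gaussian $k_T$-smearing mentioned in the abstract above can be sketched numerically. This is a minimal illustration of sampling intrinsic transverse momenta from a two-dimensional Gaussian; the width parameter `sigma_kT` and the sampling setup are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def sample_intrinsic_kT(sigma_kT, n, rng):
    """Draw n intrinsic transverse momenta from a 2D Gaussian.

    Each Cartesian component kx, ky is drawn with variance
    sigma_kT**2 / 2, so the mean squared transverse momentum
    <kT^2> equals sigma_kT**2 (illustrative convention).
    """
    kx = rng.normal(0.0, sigma_kT / np.sqrt(2.0), n)
    ky = rng.normal(0.0, sigma_kT / np.sqrt(2.0), n)
    return np.hypot(kx, ky)

rng = np.random.default_rng(0)
# sigma_kT = 1 GeV is a purely illustrative choice of smearing width.
kT = sample_intrinsic_kT(1.0, 200_000, rng)
```

In an event-generation context, each sampled `kT` would be added to the parton's transverse momentum before computing the hard-scattering kinematics; the shape of the smearing function is a modeling choice, as the abstract notes.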
137 - S.Y. Choi 2008
The next-generation high-energy facilities, the CERN Large Hadron Collider (LHC) and the prospective $e^+e^-$ International Linear Collider (ILC), are expected to unravel new structures of matter and forces from the electroweak scale to the TeV scale. In this report we review the complementary role of LHC and ILC in drawing a comprehensive and high-precision picture of the mechanism breaking the electroweak symmetries and generating mass, and the unification of forces in the frame of supersymmetry.
For the foreseeable future, the exploration of the high-energy frontier will be the domain of the Large Hadron Collider (LHC). Of particular significance will be its high-luminosity upgrade (HL-LHC), which will operate until the mid-2030s. In this endeavour, an improved understanding of the parton distribution functions (PDFs) of the proton is critical for the full exploitation of the HL-LHC physics potential. The HL-LHC program would be uniquely complemented by the proposed Large Hadron electron Collider (LHeC), a high-energy lepton-proton and lepton-nucleus collider based at CERN. In this work, we build on our recent PDF projections for the HL-LHC to assess the constraining power of the LHeC measurements of inclusive and heavy quark structure functions. We find that the impact of the LHeC would be significant, reducing PDF uncertainties by up to an order of magnitude in comparison to state-of-the-art global fits. In comparison to the HL-LHC projections, the PDF constraints from the LHeC are in general more significant for small and intermediate values of the momentum fraction x. At higher values of x, the impact of the LHeC and HL-LHC data is expected to be of a comparable size, with the HL-LHC constraints being more competitive in some cases, and the LHeC ones in others. Our results illustrate the encouraging complementarity of the HL-LHC and the LHeC in terms of charting the quark and gluon structure of the proton.