
Proton-nucleus (p+A) collisions have long been recognized as a crucial component of the physics programme with nuclear beams at high energies, in particular for their role as a reference to interpret and understand nucleus-nucleus data, as well as for their potential to elucidate the partonic structure of matter at low parton fractional momenta (small-x). Here, we summarize the main motivations that make a proton-nucleus run a decisive ingredient for a successful heavy-ion programme at the Large Hadron Collider (LHC), and we present the unique scientific opportunities arising from these collisions. We also review the status of ongoing discussions about operation plans for the p+A mode at the LHC.
We discuss the longitudinal structure function in nuclear DIS at small $x$. We work within the framework of universal parton densities obtained in DGLAP analyses at NLO. We show that the nuclear effects on the longitudinal structure function closely follow those on the gluon distribution. The error analyses available from the newest sets of nuclear PDFs also allow us to propagate the uncertainties from present data. In this way, we evaluate the minimal sensitivity required in future experiments for this observable to improve the knowledge of the nuclear glue. We further discuss the uncertainties in the extraction of $F_2$ off nuclear targets introduced by the usual assumption that the ratio $F_L/F_2$ is independent of the nuclear size. We focus on the kinematical regions relevant for future lepton-ion colliders.
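The link between $F_L$ and the glue invoked above can be made explicit through the leading-order Altarelli-Martinelli relation, quoted here in one common convention as a reference point rather than as part of the analysis itself:
$$ F_L(x,Q^2) \;=\; \frac{\alpha_s(Q^2)}{4\pi}\, x^2 \int_x^1 \frac{dz}{z^3}\left[\frac{16}{3}\,F_2(z,Q^2) + 8\sum_q e_q^2\left(1-\frac{x}{z}\right) z\,g(z,Q^2)\right]. $$
At small $x$ the gluonic term dominates, which is why nuclear modifications of $F_L$ track those of the nuclear gluon distribution.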
We present a Monte Carlo implementation, within PYTHIA, of medium-induced gluon radiation in the final-state branching process. Medium effects are introduced through an additive term in the splitting functions, computed in the multiple-soft scattering approximation. The observable effects of this modification are studied for different quantities, such as fragmentation functions, the hump-backed plateau, and transverse momentum and angular distributions. The anticipated increase of intra-jet multiplicities, the energy loss of the leading particle and jet broadening are observed, as well as modifications of naive expectations based solely on analytical calculations. This shows the adequacy of a Monte Carlo simulator for jet analyses. Hadronization is found to wash out the medium effects in the soft region, while the main features remain. To show the performance of the implementation and the feasibility of our approach in realistic experimental situations, we provide some examples: fragmentation functions, nuclear suppression factors, jet shapes and jet multiplicities. The package containing the modified routines is available for public use. This code, which is not an official PYTHIA release, is called Q-PYTHIA. We also include a short manual for performing simulations of jet quenching.
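As an illustration of the additive modification described above, the sketch below (in Python, not part of the Q-PYTHIA package) samples the momentum fraction z of a single q -> qg splitting from a vacuum kernel plus a medium term via rejection sampling; the functional form of delta_p and all parameter values are placeholders chosen only to show the mechanics, not the actual medium-induced spectrum.

import random

def p_vac(z):
    # Vacuum q -> qg splitting kernel, C_F (1 + z^2)/(1 - z), unregularized.
    CF = 4.0 / 3.0
    return CF * (1.0 + z * z) / (1.0 - z)

def delta_p(z, qhat=1.0, length=4.0, energy=50.0):
    # Hypothetical additive medium term (placeholder shape only): larger for
    # soft emissions and for a denser/longer medium. The actual Q-PYTHIA term
    # is built from the medium-induced gluon spectrum in the multiple-soft
    # scattering approximation.
    return qhat * length / (z * energy)

def sample_z(zmin=0.01, zmax=0.99):
    # Sample the momentum fraction z from P_vac(z) + Delta P(z) by rejection.
    grid = [zmin + (zmax - zmin) * i / 1000.0 for i in range(1001)]
    fmax = max(p_vac(z) + delta_p(z) for z in grid)
    while True:
        z = random.uniform(zmin, zmax)
        if random.uniform(0.0, fmax) < p_vac(z) + delta_p(z):
            return z

if __name__ == "__main__":
    zs = [sample_z() for _ in range(5000)]
    print("mean z with medium term:", sum(zs) / len(zs))

Switching delta_p off recovers the vacuum splitting kernel, so the same sampler exhibits the softening of the z-distribution that drives the increase of intra-jet multiplicities mentioned above.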
In this talk, we introduce our recently completed next-to-leading order (NLO) global analysis of nuclear parton distribution functions (nPDFs), called EPS09 - a higher-order successor to the well-known leading-order (LO) analysis EKS98 and to our previous LO work EPS08. As an extension of similar global analyses carried out by other groups, we complement the data from deep inelastic $l+A$ scattering and Drell-Yan dilepton measurements in p+$A$ collisions with inclusive midrapidity pion production data from d+Au collisions at RHIC, which results in better-constrained gluon distributions than before. The most important new ingredient, however, is the detailed error analysis, which employs the Hessian method and allows us to map out the parameter-space vicinity of the best fit into a collection of nPDF error sets. These error sets provide the end-user with a way to compute how the PDF uncertainties propagate into the cross sections of interest. The EPS09 package, to be released soon, will contain both the NLO and LO results for the best fits and the uncertainty sets.
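As a minimal sketch of how such error sets are meant to be used, the snippet below applies the standard symmetric Hessian master formula to an observable evaluated with the central fit and with the plus/minus error sets; it assumes the per-set values are supplied by the user and does not read the actual EPS09 grids.

import math

def hessian_uncertainty(central, plus_sets, minus_sets):
    # Symmetric Hessian uncertainty for an observable X:
    #   delta_X = 0.5 * sqrt( sum_k ( X(S_k^+) - X(S_k^-) )^2 )
    # central    : X evaluated with the best-fit set
    # plus_sets  : X evaluated with each S_k^+ error set
    # minus_sets : X evaluated with each S_k^- error set
    assert len(plus_sets) == len(minus_sets)
    delta = 0.5 * math.sqrt(
        sum((xp - xm) ** 2 for xp, xm in zip(plus_sets, minus_sets))
    )
    return central, delta

# Hypothetical usage, with cross sections already computed set by set:
# x0 = 1.00
# xp = [1.03, 0.99, 1.01]   # values with the S_k^+ sets
# xm = [0.97, 1.02, 0.99]   # values with the S_k^- sets
# print(hessian_uncertainty(x0, xp, xm))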
We present a next-to-leading order (NLO) global DGLAP analysis of nuclear parton distribution functions (nPDFs) and their uncertainties. Carrying out an NLO nPDF analysis for the first time with three different types of experimental input -- deep inelastic $\ell$+A scattering, Drell-Yan dilepton production in p+$A$ collisions, and inclusive pion production in d+Au and p+p collisions at RHIC -- we find that these data can be well described in a conventional collinear factorization framework. Although pion production has not traditionally been included in the global analyses, we find that the shape of the nuclear modification factor $R_{\rm dAu}$ of the pion $p_T$-spectrum at midrapidity retains sensitivity to the gluon distributions, providing evidence for shadowing and an EMC effect in the nuclear gluons. We use the Hessian method to quantify the nPDF uncertainties which originate from the uncertainties in the data. In this method, the sensitivity of $\chi^2$ to variations of the fitting parameters is mapped out into orthogonal error sets, which provide a user-friendly way to calculate how the nPDF uncertainties propagate to any factorizable nuclear cross section. The obtained NLO and LO nPDFs and the corresponding error sets are collected in our new release called {\ttfamily EPS09}. These results should find applications in precision analyses of the signatures and properties of QCD matter at the LHC and RHIC.
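For reference, the nuclear modification factor discussed above is conventionally defined as the ratio of the d+Au yield to the binary-collision-scaled p+p yield,
$$ R_{\rm dAu}(p_T) \;=\; \frac{1}{\langle N_{\rm coll}\rangle}\,\frac{d^2N^{\rm dAu}/dp_T\,d\eta}{d^2N^{\rm pp}/dp_T\,d\eta}, $$
with $\langle N_{\rm coll}\rangle$ the average number of binary nucleon-nucleon collisions, so that $R_{\rm dAu}=1$ in the absence of nuclear effects; its departure from unity at midrapidity is what carries the sensitivity to the nuclear gluons.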
We present an improved leading-order global DGLAP analysis of nuclear parton distribution functions (nPDFs), supplementing the traditionally used data from deep inelastic lepton-nucleus scattering and Drell-Yan dilepton production in proton-nucleus collisions with inclusive high-$p_T$ hadron production data measured at RHIC in d+Au collisions. With the help of an extended definition of the $\chi^2$ function, we can now more efficiently exploit the constraints the different data sets offer, for gluon shadowing in particular, and account for the overall data normalization uncertainties during the automated $\chi^2$ minimization. The very good simultaneous fit to the nuclear hard-process data used demonstrates the feasibility of a universal set of nPDFs, but limitations also become visible. The high-$p_T$ forward-rapidity hadron data of BRAHMS add a crucial new constraint to the analysis by offering a direct probe of the nuclear gluon distributions -- a sector of the nPDFs which has traditionally been very poorly constrained. We obtain a strikingly stronger gluon shadowing than has been estimated in previous global analyses. The obtained nPDFs are released as a parametrization called EPS08.
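A generic illustration of a $\chi^2$ of this type, with a fitted normalization factor $f_N$ for each data set weighted by its quoted normalization uncertainty $\sigma_N$ (a schematic form, not necessarily the exact definition adopted in the analysis), is
$$ \chi^2 \;=\; \sum_{\rm sets}\left[\left(\frac{1-f_N}{\sigma_N}\right)^2 + \sum_{i\in{\rm set}}\left(\frac{f_N D_i - T_i}{\sigma_i}\right)^2\right], $$
where $D_i$ are the data points, $T_i$ the corresponding theory values and $\sigma_i$ the point-to-point uncertainties; the normalization factors $f_N$ are determined during the minimization.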
Medium-induced gluon radiation is usually identified as the dominant dynamical mechanism underlying the {\it jet quenching} phenomenon observed in heavy-ion collisions. In its usual implementation, multiple medium-induced gluon emissions are assumed to be independent, leading, in the eikonal approximation, to a Poisson distribution. Here, we introduce a medium term in the splitting probabilities so that both medium and vacuum contributions are included on the same footing in a DGLAP approach. The improvements include energy-momentum conservation at each individual splitting, medium-modified virtuality evolution, and a coherent implementation of vacuum and medium splitting probabilities. Notably, the usual formalism is recovered when the virtuality and the energy of the parton are very large. This leads to a similar description of the suppression observed in heavy-ion collisions, with values of the transport coefficient of the same order as those obtained using the {\it quenching weights}.
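In such a DGLAP-based approach, the additive modification enters the no-emission (Sudakov) probability that drives the virtuality evolution; schematically, and in a generic notation rather than the authors' exact one,
$$ \Delta(t) \;=\; \exp\!\left[-\int_{t_0}^{t}\frac{dt'}{t'}\int dz\,\frac{\alpha_s}{2\pi}\Big(P^{\rm vac}(z)+\Delta P(z,t';\hat q,L,E)\Big)\right], $$
so that energy-momentum is conserved at each splitting; in the limit of very large parton energy and virtuality mentioned above, this reduces to the usual treatment in which medium-induced emissions are generated independently of the vacuum shower.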