
Analysis of possible systematic errors in the Oslo method

Published by Ann-Cecilie Larsen
Publication date: 2012
Paper language: English





In this work, we have reviewed the Oslo method, which enables the simultaneous extraction of the level density and the gamma-ray transmission coefficient from a set of particle-gamma coincidence data. Possible errors and uncertainties have been investigated. Typical data sets from various mass regions, as well as simulated data, have been tested against the assumptions underlying the data analysis.
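
To make the factorization concrete: the Oslo method assumes the first-generation matrix obeys $P(E_x, E_\gamma) \propto \rho(E_x - E_\gamma)\,T(E_\gamma)$ and determines $\rho$ and $T$ by a least-squares fit (Schiller et al., Nucl. Instrum. Methods A 447, 498 (2000)). The fit fixes the two functions only up to the transformations $\rho \rightarrow A e^{\alpha E_f}\rho$ and $T \rightarrow B e^{\alpha E_\gamma}T$, and the normalization needed to resolve this ambiguity is one of the systematic-error sources examined in the paper. The Python sketch below is a toy illustration only: it uses a crude alternating multiplicative update rather than the published minimization, and the test matrix, grid, and "true" functions are invented for the demonstration.

import numpy as np

# Toy Oslo-method decomposition: P[i, j] ~ rho[i - j] * T[j] on a common
# energy grid (row i = excitation energy Ex, column j = gamma energy Eg,
# so i - j indexes the final energy Ef = Ex - Eg).
def fit_rho_T(P, n_iter=200):
    n = P.shape[0]
    rho, T = np.ones(n), np.ones(n)
    for _ in range(n_iter):
        th = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1):
                th[i, j] = rho[i - j] * T[j]
            s = th[i].sum()
            if s > 0:
                th[i] /= s                       # normalize rows like the data
        for j in range(n):                       # multiplicative update of T
            den = th[j:, j].sum()
            if den > 0:
                T[j] *= P[j:, j].sum() / den
        for k in range(n):                       # multiplicative update of rho
            i_idx = np.arange(k, n)
            den = th[i_idx, i_idx - k].sum()
            if den > 0:
                rho[k] *= P[i_idx, i_idx - k].sum() / den
    return rho, T

# Build a noise-free test matrix from invented "true" functions.
n = 40
true_rho = np.exp(0.15 * np.arange(n))           # constant-temperature-like NLD
true_T = (np.arange(n) + 1.0) ** 3               # dipole-like ~ Eg^3 transmission
P = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1):
        P[i, j] = true_rho[i - j] * true_T[j]
    P[i] /= P[i].sum()

rho_fit, T_fit = fit_rho_T(P)
# rho_fit and T_fit agree with the inputs only up to A*exp(alpha*E) scalings,
# which is exactly the normalization ambiguity noted above.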




Read also

F. Zeiser, G. Potel, G.M. Tveten (2019)
In this paper we present the first systematic analysis of the impact of the populated vs. intrinsic spin distribution on the nuclear level density and $\gamma$-ray strength function retrieved through the Oslo Method. We illustrate the effect of the spin distribution on the recently performed $^{239}\mathrm{Pu}(\mathrm{d},\mathrm{p}\gamma)^{240}\mathrm{Pu}$ experiment, which used a 12 MeV deuteron beam at the Oslo Cyclotron Laboratory. In the analysis we couple state-of-the-art calculations of the populated spin distributions with the Monte Carlo nuclear decay code RAINIER to compare Oslo Method results to the known input. We find that good knowledge of the populated spin distribution is crucial, and show that the populated distribution has a significant impact on the extracted nuclear level density and $\gamma$-ray strength function for the $^{239}\mathrm{Pu}(\mathrm{d},\mathrm{p}\gamma)^{240}\mathrm{Pu}$ case.
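
The spin distributions at issue can be illustrated with the standard Ericson form $g(J;\sigma) = \frac{2J+1}{2\sigma^2}\exp[-(J+1/2)^2/(2\sigma^2)]$, where $\sigma$ is the spin-cutoff parameter. A light-ion transfer reaction such as (d,p) typically populates a narrower distribution than the intrinsic one, while the Oslo Method implicitly assumes the two agree. The sketch below only illustrates the shape comparison; both $\sigma$ values are hypothetical and are not taken from the $^{240}\mathrm{Pu}$ analysis.

import numpy as np

# Ericson spin distribution:
#   g(J; sigma) = (2J+1)/(2 sigma^2) * exp(-(J + 1/2)^2 / (2 sigma^2))
def spin_dist(J, sigma):
    return (2 * J + 1) / (2 * sigma**2) * np.exp(-((J + 0.5) ** 2) / (2 * sigma**2))

J = np.arange(0, 20)
intrinsic = spin_dist(J, sigma=7.0)   # hypothetical intrinsic spin cutoff
populated = spin_dist(J, sigma=3.5)   # hypothetical narrower (d,p) population
intrinsic /= intrinsic.sum()
populated /= populated.sum()

# Overlap of the two normalized distributions: 1.0 would mean the assumption
# of identical populated and intrinsic distributions holds exactly.
print("overlap:", np.minimum(intrinsic, populated).sum())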
Unknown neutron-capture reaction rates remain a significant source of uncertainty in state-of-the-art $r$-process nucleosynthesis reaction network calculations. As the $r$-process involves highly neutron-rich nuclei for which direct $(n,\gamma)$ cross-section measurements are virtually impossible, indirect methods are called for to constrain the $(n,\gamma)$ cross sections used as input to the $r$-process nuclear network. Here we discuss the newly developed $\beta$-Oslo method, which is capable of providing experimental input for calculating $(n,\gamma)$ rates of neutron-rich nuclei. The $\beta$-Oslo method represents a first step towards constraining neutron-capture rates of importance to the $r$-process.
V. W. Ingeberg (2018)
The $\gamma$-ray strength function ($\gamma$SF) and nuclear level density (NLD) have been extracted for the first time from inverse-kinematics reactions with the Oslo Method. This novel technique allows measurements of these properties across a wide range of previously inaccessible nuclei. Proton-$\gamma$ coincidence events from the $\mathrm{d}(^{86}\mathrm{Kr},\mathrm{p}\gamma)^{87}\mathrm{Kr}$ reaction were measured at iThemba LABS, and the $\gamma$SF and NLD in $^{87}\mathrm{Kr}$ were obtained. The low-energy region of the $\gamma$SF is compared to shell-model calculations, which suggest that this region is dominated by M1 strength. The $\gamma$SF and NLD are used as input parameters to Hauser-Feshbach calculations to constrain $(\mathrm{n},\gamma)$ cross sections of nuclei using the TALYS reaction code. These results are compared to $^{86}\mathrm{Kr}(\mathrm{n},\gamma)$ data from direct measurements.
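
The link between the measured quantities and the Hauser-Feshbach input is the standard relation $T_{XL}(E_\gamma) = 2\pi E_\gamma^{2L+1} f_{XL}(E_\gamma)$ between the $\gamma$-ray strength function and the $\gamma$-ray transmission coefficient; for dipole radiation ($L=1$) this reduces to $T = 2\pi E_\gamma^3 f(E_\gamma)$. The sketch below applies that conversion to a hypothetical strength function; the functional form of $f$ is invented for illustration and is not the $^{87}\mathrm{Kr}$ result.

import numpy as np

# Convert a gamma-ray strength function f_XL (units MeV^-3 for dipoles) into
# the transmission coefficient consumed by Hauser-Feshbach codes like TALYS:
#   T_XL(Eg) = 2*pi * Eg^(2L+1) * f_XL(Eg)
def transmission(Eg_MeV, f_XL, L=1):
    return 2 * np.pi * Eg_MeV ** (2 * L + 1) * f_XL

Eg = np.linspace(0.5, 8.0, 16)        # gamma-ray energies in MeV
f = 1e-8 * np.exp(0.5 * Eg)           # hypothetical dipole strength, MeV^-3
print(transmission(Eg, f))            # dimensionless transmission coefficients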
Gaussian Quantum Monte Carlo (GQMC) is a stochastic phase-space method for fermions with positive weights. In the example of the Hubbard model close to half filling, it fails to reproduce all the symmetries of the ground state, leading to systematic errors at low temperatures. In a previous work [Phys. Rev. B {\bf 72}, 224518 (2005)] we proposed to restore the broken symmetries by projecting the density matrix obtained from the simulation onto the ground-state symmetry sector. For ground-state properties, the accuracy of this method depends on a {\it large overlap} between the GQMC and exact density matrices. Thus, the method is not rigorously exact. We present the limits of the approach through a systematic study of the method for 2- and 3-leg Hubbard ladders at different fillings and on-site repulsion strengths. We show several indications that the systematic errors stem from non-vanishing boundary terms in the partial-integration step in the derivation of the Fokker-Planck equation. Checking for spiking trajectories and slowly decaying probability distributions provides an important test of the reliability of the results. Possible solutions to avoid boundary terms are discussed. Furthermore, we compare results obtained from two different sampling methods: reconfiguration of walkers and the Metropolis algorithm.
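
A minimal version of the "spiking trajectory" check mentioned above is to flag weights whose distribution develops a heavy tail relative to a robust scale estimate; the cut value and the test data here are arbitrary illustrations, not the diagnostic used in the paper.

import numpy as np

# Flag stochastic trajectories whose weights spike far outside the bulk,
# using the median absolute deviation (MAD) as a robust scale.
def has_spikes(weights, z_cut=10.0):
    w = np.asarray(weights, dtype=float)
    med = np.median(w)
    mad = np.median(np.abs(w - med)) + 1e-12
    return bool(np.any(np.abs(w - med) / mad > z_cut))

rng = np.random.default_rng(0)
tame = rng.normal(1.0, 0.1, 10_000)              # well-behaved weights
spiky = np.concatenate([tame, [50.0]])           # one rogue trajectory
print(has_spikes(tame), has_spikes(spiky))       # False True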
M. J. Tannenbaum (2018)
Centrality definition in A$+$A collisions at colliders such as RHIC and the LHC suffers from a correlated systematic uncertainty caused by the efficiency of detecting a p$+$p collision ($50\pm 5\%$ for PHENIX at RHIC). In A$+$A collisions where centrality is measured by the number of nucleon collisions, $N_{\rm coll}$, the number of nucleon participants, $N_{\rm part}$, or the number of constituent-quark participants, $N_{\rm qp}$, the error in the efficiency of the primary interaction trigger (Beam-Beam Counters) for a p$+$p collision leads to a correlated systematic uncertainty in $N_{\rm part}$, $N_{\rm coll}$, or $N_{\rm qp}$ which reduces binomially as the A$+$A collisions become more central. If this is not correctly accounted for in projections of A$+$A to p$+$p collisions, mistaken conclusions can result. A recent example concerns whether the mid-rapidity charged multiplicity per constituent-quark participant, $(dN_{\rm ch}/d\eta)/N_{\rm qp}$, in Au$+$Au at RHIC is the same as the value in p$+$p collisions.
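
The binomial reduction can be seen in a toy model: if each of $N$ elementary collisions independently fires the trigger with efficiency $\varepsilon$, the A$+$A event is detected with probability $p = 1 - (1-\varepsilon)^N$, so the $\pm 10\%$ relative error on $\varepsilon = 0.50$ barely moves $p$ once $N$ is large. The numbers below are illustrative and ignore the Glauber modeling used in the actual analysis.

# Detection probability when each of N elementary collisions can fire the
# trigger independently with single-collision efficiency eps.
def detect_prob(eps, N):
    return 1.0 - (1.0 - eps) ** N

for N in (1, 2, 5, 10, 50):
    lo, mid, hi = (detect_prob(e, N) for e in (0.45, 0.50, 0.55))
    print(f"N={N:3d}  p={mid:.4f}  relative spread={(hi - lo) / mid:+.2%}")
# The relative spread shrinks from 20% at N=1 to well below 1% by N=10,
# i.e., the trigger-efficiency uncertainty matters least for central events.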