
Examining the Geometric Mean Method for the Extraction of Spatial Resolution

Posted by: Mr. Stefanos Leontsinis
Publication date: 2013
Research field: Physics
Paper language: English





The spatial resolution of a detector, probed with a reference detector telescope, can be measured by applying the geometric mean method: tracks are reconstructed from the hits of all detectors, both including ($\sigma_\mathrm{in}$) and excluding ($\sigma_\mathrm{ex}$) the hit from the detector under study. The geometric mean of the two measured resolution values, $\sigma=\sqrt{\sigma_\mathrm{ex}\sigma_\mathrm{in}}$, is proposed to provide a more accurate estimate of the intrinsic detector resolution. This method has been tested with a Monte Carlo algorithm and is shown to give accurate results, independently of the distance between the detectors used for the track fitting. The method does not give meaningful results unless all detectors share the same characteristics.
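The relation $\sigma=\sqrt{\sigma_\mathrm{ex}\sigma_\mathrm{in}}$ is easy to reproduce with a toy simulation. The Python sketch below is a minimal illustration, not the paper's code: it assumes straight tracks through five identical, equally spaced planes with purely Gaussian hit smearing and no multiple scattering, performs the exclusive and inclusive straight-line fits for the probed plane, and compares the geometric mean of the residual widths to the generated resolution. Plane positions, resolution, and track counts are illustrative values.

```python
# Toy Monte Carlo sketch of the geometric-mean method (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(1)

z = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # plane positions [cm], probed plane at index 2
true_sigma = 0.010                            # intrinsic resolution of every plane [cm]
probe = 2
n_tracks = 20_000

def fit_line(zs, xs):
    """Least-squares straight-line fit x = a + b*z; returns (a, b)."""
    A = np.vstack([np.ones_like(zs), zs]).T
    coeff, *_ = np.linalg.lstsq(A, xs, rcond=None)
    return coeff

res_ex, res_in = [], []
for _ in range(n_tracks):
    a, b = rng.uniform(-1, 1), rng.uniform(-0.01, 0.01)   # random straight track
    hits = a + b * z + rng.normal(0.0, true_sigma, size=z.size)

    # exclusive fit: probed plane left out of the fit
    mask = np.arange(z.size) != probe
    a_ex, b_ex = fit_line(z[mask], hits[mask])
    res_ex.append(hits[probe] - (a_ex + b_ex * z[probe]))

    # inclusive fit: probed plane included in the fit
    a_in, b_in = fit_line(z, hits)
    res_in.append(hits[probe] - (a_in + b_in * z[probe]))

sigma_ex = np.std(res_ex)
sigma_in = np.std(res_in)
sigma_gm = np.sqrt(sigma_ex * sigma_in)
print(f"sigma_ex = {sigma_ex*1e4:.1f} um, sigma_in = {sigma_in*1e4:.1f} um, "
      f"geometric mean = {sigma_gm*1e4:.1f} um (true {true_sigma*1e4:.1f} um)")
```

With identical planes, the exclusive residual width overestimates and the inclusive one underestimates the intrinsic resolution, while their geometric mean lands close to the generated value.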




Read also

Using simulations and analytical approaches, we have studied single-hit resolutions obtained with a binary readout, which is often proposed for high-granularity detectors to reduce the generated data volume. Our simulations, covering several parameters (e.g. strip pitch), show that the detector geometry and an electronics parameter of the binary readout chips could be optimized so that a binary readout offers a spatial resolution equivalent to that obtained with an analogue readout. To understand the behavior as a function of the simulation parameters, we developed analytical models that reproduce the simulation results with a few parameters. The models can be used to optimize detector designs and operating conditions with regard to the spatial resolution.
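To make the comparison concrete, the hedged Python sketch below contrasts a binary (threshold-only) position estimate with an analogue centre-of-gravity estimate under a deliberately simple linear charge-sharing model; the pitch, noise level, and threshold are invented for illustration and are not the parameters studied by the authors, and pitch/√12 is quoted only as the familiar single-strip reference value.

```python
# Toy comparison of binary vs analogue readout position resolution (illustrative model only).
import numpy as np

rng = np.random.default_rng(2)
pitch = 0.050          # strip pitch [mm] (illustrative)
noise = 0.05           # noise as a fraction of the total signal (illustrative)
threshold = 0.3        # binary comparator threshold, fraction of total signal (illustrative)
n = 200_000

x_true = rng.uniform(0.0, pitch, size=n)          # true position between two strip centres
frac_right = x_true / pitch                       # simple linear charge-sharing model
q_left = (1.0 - frac_right) + rng.normal(0, noise, n)
q_right = frac_right + rng.normal(0, noise, n)

# Binary readout: each strip only reports "over threshold"; the reconstructed
# position is the centre of the firing strip(s) at x = 0, pitch/2, or pitch.
left_hit = q_left > threshold
right_hit = q_right > threshold
x_bin = np.where(left_hit & right_hit, pitch / 2,
                 np.where(right_hit, pitch, 0.0))

# Analogue readout: centre of gravity of the two strip charges.
x_cog = pitch * q_right / (q_left + q_right)

print(f"binary:   sigma = {np.std(x_bin - x_true)*1e3:.1f} um")
print(f"analogue: sigma = {np.std(x_cog - x_true)*1e3:.1f} um")
print(f"pitch/sqrt(12)  = {pitch/np.sqrt(12)*1e3:.1f} um")
```

In this naive model, charge sharing plus a suitable threshold already pulls the binary estimate well below pitch/√12, which is the qualitative point of the optimization described above.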
M. Boronat, C. Marinas, A. Frey (2014)
In this paper we explore the effect of $\delta$-ray emission, i.e. fluctuations in the signal deposition, on the detection of charged particles in silicon-based detectors. We show that these two effects ultimately limit the resolution that can be achieved by interpolation of the signal in finely segmented position-sensitive solid-state devices.
The geometric-mean method is often used to estimate the spatial resolution of a position-sensitive detector probed by tracks. It calculates the resolution solely from measured track data without using a detailed tracking simulation and without considering multiple Coulomb scattering effects. Two separate linear track fits are performed on the same data, one excluding and the other including the hit from the probed detector. The geometric mean of the widths of the corresponding exclusive and inclusive residual distributions for the probed detector is then taken as a measure of the intrinsic spatial resolution of the probed detector: $\sigma=\sqrt{\sigma_{ex}\cdot\sigma_{in}}$. The validity of this method is examined for a range of resolutions with a stand-alone Geant4 Monte Carlo simulation that specifically takes multiple Coulomb scattering in the tracking detector materials into account. Using simulated as well as actual tracking data from a representative beam test scenario, we find that the geometric-mean method gives systematically inaccurate spatial resolution results. Good resolutions are estimated as poor and vice versa. The more the resolutions of reference detectors and probed detector differ, the larger the systematic bias. An attempt to correct this inaccuracy by statistically subtracting multiple-scattering effects from geometric-mean results leads to resolutions that are typically too optimistic by 10-50%. This supports an earlier critique of this method based on simulation studies that did not take multiple scattering into account.
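The statistical subtraction mentioned above amounts to removing the multiple-scattering contribution in quadrature, $\sigma_\mathrm{corr}=\sqrt{\sigma^2-\sigma_\mathrm{MS}^2}$. The Python sketch below illustrates this with the Highland approximation for the scattering angle; the momentum, material budget, lever arm, and geometric-mean input are illustrative numbers rather than the beam-test values, and as the study above notes, the corrected result still tends to be too optimistic.

```python
# Sketch of the quadrature subtraction of multiple scattering from a geometric-mean result.
import numpy as np

def highland_theta0(p_gev, beta, x_over_x0):
    """Highland approximation of the RMS multiple-scattering angle [rad]."""
    return (0.0136 / (beta * p_gev)) * np.sqrt(x_over_x0) * \
           (1.0 + 0.038 * np.log(x_over_x0))

p = 4.0                 # beam momentum [GeV/c] (illustrative)
x_over_x0 = 0.005       # material per plane in radiation lengths (illustrative)
lever_arm = 50.0        # distance between scatterer and probed plane [mm] (illustrative)

sigma_ms = highland_theta0(p, 1.0, x_over_x0) * lever_arm   # scattering displacement [mm]

sigma_gm = 0.030        # geometric-mean result [mm] (illustrative)
sigma_corr = np.sqrt(max(sigma_gm**2 - sigma_ms**2, 0.0))
print(f"sigma_MS = {sigma_ms*1e3:.1f} um, corrected resolution = {sigma_corr*1e3:.1f} um")
# Note: the study above finds that even this corrected value remains
# systematically too optimistic by roughly 10-50%.
```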
We describe an algorithm which has been developed to extract fine-granularity information from an electromagnetic calorimeter with strip-based readout. Such a calorimeter, based on scintillator strips, is being developed to apply particle flow reconstruction to future experiments in high energy physics. Tests of this algorithm in full detector simulations, using strips of size 45 x 5 mm^2, show that the performance is close to that of a calorimeter with true 5 x 5 mm^2 readout granularity. The performance can be further improved by the use of 10 x 10 mm^2 tile-shaped layers interspersed between strip layers.
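As a rough picture of how strip and orthogonal-layer information can be combined, the Python sketch below redistributes a strip's energy over virtual square cells using the energies of the overlapping strips in the neighbouring orthogonal layer as weights. This is an assumption about the general idea behind such strip-splitting approaches, not the referenced algorithm's actual implementation.

```python
# Heavily simplified strip-splitting sketch (general idea only, not the actual algorithm).
import numpy as np

def split_strip(strip_energy, orthogonal_energies):
    """Distribute one strip's energy over its virtual square cells.

    orthogonal_energies: energies of the orthogonal-layer strips that overlap
    the virtual cells of this strip (one value per virtual cell).
    """
    weights = np.asarray(orthogonal_energies, dtype=float)
    total = weights.sum()
    if total <= 0.0:                       # no information: share uniformly
        return np.full(weights.size, strip_energy / weights.size)
    return strip_energy * weights / total

# Example: a 45 x 5 mm^2 strip split into nine 5 x 5 mm^2 virtual cells,
# weighted by nine overlapping orthogonal strips (invented numbers).
print(split_strip(90.0, [0, 1, 5, 30, 40, 10, 3, 1, 0]))
```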
The DArk Matter Particle Explorer (DAMPE) is a space-borne high-energy cosmic-ray and $\gamma$-ray detector which has been operating smoothly since its launch on December 17, 2015. The bismuth germanium oxide (BGO) calorimeter is one of the key sub-detectors of DAMPE, used for energy measurement and electron/proton identification. For events with a total energy deposit higher than a few tens of TeV, the readouts of the PMTs coupled to the BGO crystals become saturated, which results in an underestimation of the measured energy. Based on detailed simulations, we develop a correction method for the saturation effect according to the shower development topologies and the energies measured by the neighbouring BGO crystals. Verification with simulated and on-orbit events shows that this method reliably reconstructs the energy deposit in the saturated BGO crystal.
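Purely as an illustration of the general idea of a neighbour-based correction (the actual DAMPE method is derived from detailed shower-shape simulations and is not reproduced here), the hypothetical Python sketch below calibrates the average ratio between a central crystal's deposit and the sum of its neighbours on unsaturated showers and uses it to estimate the deposit in a saturated crystal.

```python
# Hypothetical neighbour-based saturation correction (illustration only, not DAMPE's method).
import numpy as np

def fit_ratio(unsaturated_centre, neighbour_sums):
    """Calibrate the mean ratio centre/neighbours from unsaturated showers."""
    return np.mean(np.asarray(unsaturated_centre) / np.asarray(neighbour_sums))

def correct_saturated(neighbour_sum, ratio):
    """Estimate the energy deposit in a saturated crystal from its neighbours."""
    return ratio * neighbour_sum

# Toy calibration sample (GeV, invented numbers for illustration):
centre = [120.0, 80.0, 150.0, 95.0]
neighbours = [60.0, 41.0, 74.0, 48.0]
r = fit_ratio(centre, neighbours)
print(f"calibrated ratio = {r:.2f}, corrected deposit = {correct_saturated(70.0, r):.1f} GeV")
```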