
Study of a data analysis method for the angle resolving silicon telescope

Posted by Petar Žugec
Publication date: 2020
Research field: Physics
Paper language: English





A new data analysis method is developed for the angle resolving silicon telescope introduced at the neutron time of flight facility n_TOF at CERN. The telescope has already been used in measurements of several neutron induced reactions with charged particles in the exit channel. The development of a highly detailed method is necessitated by the latest joint measurement of the $^{12}$C($n,p$) and $^{12}$C($n,d$) reactions from n_TOF. The reliable analysis of these data must account for the challenging nature of the involved reactions, as they are affected by the multiple excited states in the daughter nuclei and characterized by the anisotropic angular distributions of the reaction products. The unabridged analysis procedure aims at the separate reconstruction of all relevant reaction parameters - the absolute cross section, the branching ratios and the angular distributions - from the integral number of the coincidental counts detected by the separate pairs of silicon strips. This procedure is tested under the specific conditions relevant for the $^{12}$C($n,p$) and $^{12}$C($n,d$) measurements from n_TOF, in order to assess its direct applicability to these experimental data. Based on the reached conclusions, the original method is adapted to a particular level of uncertainties in the input data.
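The procedure above reconstructs the cross section, the branching ratios and the angular distributions from the coincidence counts registered by individual pairs of silicon strips. Purely as an illustration of this kind of reconstruction (not the paper's actual procedure), the sketch below assumes that the expected counts are linear in the per-state yields and Legendre coefficients, and that the geometric response of every strip pair to every excited state and Legendre order is already known, e.g. from a Monte Carlo simulation; all names and numbers are hypothetical.

    # Minimal sketch: recover per-state yields and angular-distribution coefficients
    # from strip-pair coincidence counts by linear least squares.
    # Assumption (hypothetical): R[p, k, l] is the acceptance-weighted integral of
    # P_l(cos theta) for strip pair p and excited state k, known from simulation.
    import numpy as np

    def fit_reaction_parameters(R, N):
        # Unknowns x[k, l] = (yield of state k) * (Legendre coefficient a_l), with a_0 = 1,
        # so x[k, 0] is the integral yield of state k (cross section times branching ratio,
        # up to the common normalization absorbed into R).
        n_pairs, n_states, n_orders = R.shape
        A = R.reshape(n_pairs, n_states * n_orders)
        x, *_ = np.linalg.lstsq(A, N, rcond=None)
        x = x.reshape(n_states, n_orders)
        yields = x[:, 0]
        a_coeffs = x[:, 1:] / x[:, [0]]     # normalized angular-distribution coefficients
        return yields, a_coeffs

    # Toy example: 20 strip pairs, 3 excited states, Legendre orders 0..2.
    rng = np.random.default_rng(0)
    R = rng.random((20, 3, 3))
    true_x = np.array([[100.0, 30.0, -10.0],
                       [ 60.0, -5.0,  12.0],
                       [ 25.0,  8.0,   3.0]])
    N = R.reshape(20, 9) @ true_x.ravel() + rng.normal(0.0, 1.0, 20)
    yields, a_coeffs = fit_reaction_parameters(R, N)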


Read also

Dimitri Bourilkov, 2004
The Collaborative Analysis Versioning Environment System (CAVES) project concentrates on the interactions between users performing data and/or computing intensive analyses on large data sets, as encountered in many contemporary scientific disciplines. In modern science increasingly larger groups of researchers collaborate on a given topic over extended periods of time. The logging and sharing of knowledge about how analyses are performed or how results are obtained is important throughout the lifetime of a project. Here is where virtual data concepts play a major role. The ability to seamlessly log, exchange and reproduce results and the methods, algorithms and computer programs used in obtaining them enhances in a qualitative way the level of collaboration in a group or between groups in larger organizations. The CAVES project takes a pragmatic approach in assessing the needs of a community of scientists by building a series of prototypes with increasing sophistication. By extending the functionality of existing data analysis packages with virtual data capabilities, these prototypes provide an easy and habitual entry point for researchers to explore virtual data concepts in real life applications and to provide valuable feedback for refining the system design. The architecture is modular, based on Web, Grid and other services which can be plugged in as desired. As a proof of principle we build a first system by extending the very popular data analysis framework ROOT, widely used in high energy physics and other fields, making it virtual data enabled.
A new method for the determination of electric signal time-shifts is introduced. Like the Kolmogorov-Smirnov test, it is based on the comparison of the cumulative distribution function of the reference signal with that of the test signal. This method is very fast and thus well suited for on-line applications. It is robust to noise and its performance in terms of precision is excellent for time-shifts ranging from a fraction of a sample duration to several sample durations. PACS: 29.40.Gx (Tracking and position-sensitive detectors), 29.30.Kv (X- and γ-ray spectroscopy), 07.50.Qx (Signal processing electronics).
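A rough sketch of the idea described above, under my own simplifying assumptions rather than the paper's exact estimator: for a pure delay d the cumulative distribution of the test signal is that of the reference shifted by d, so the area between the two normalized cumulative sums equals d, which directly gives sub-sample precision.

    # Sketch (assumption, not necessarily the published estimator): estimate the delay
    # of a test pulse relative to a reference pulse from the area between their
    # normalized cumulative sums. For test(t) = ref(t - d), that area equals d.
    import numpy as np

    def cdf_time_shift(reference, test, dt=1.0):
        f_ref = np.cumsum(reference) / np.sum(reference)
        f_test = np.cumsum(test) / np.sum(test)
        return dt * np.sum(f_ref - f_test)   # positive when the test pulse arrives later

    # Toy example: Gaussian-like pulse delayed by 2.3 samples (a fractional shift).
    n = np.arange(200)
    def pulse(t0):
        return np.exp(-0.5 * ((n - t0) / 8.0) ** 2)
    shift = cdf_time_shift(pulse(80.0), pulse(82.3))   # expected to be close to 2.3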
The Shape method, a novel approach to obtain the functional form of the $\gamma$-ray strength function ($\gamma$SF) in the absence of neutron resonance spacing data, is introduced. When used in connection with the Oslo method, the slope of the Nuclear Level Density (NLD) is obtained simultaneously. The foundation of the Shape method lies in the primary $\gamma$-ray transitions, which preserve information on the functional form of the $\gamma$SF. The Shape method has been applied to $^{56}$Fe, $^{92}$Zr, $^{164}$Dy, and $^{240}$Pu, which are representative cases for the variety of situations encountered in typical NLD and $\gamma$SF studies. The comparisons of results from the Shape method to those from the Oslo method demonstrate that the functional form of the $\gamma$SF is retained regardless of nuclear structure details or $J^\pi$ values of the states fed by the primary transitions.
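A heavily simplified illustration of the ratio idea behind a Shape-method-like analysis, under the assumption of dominant dipole radiation so that the primary intensity to a discrete final level scales as $E_\gamma^3 f(E_\gamma)$; the function and variable names below are hypothetical and this is not the published implementation.

    # Sketch: within one excitation-energy bin, the counts of primary transitions to two
    # known discrete levels fix the relative gamma-ray strength f at the two transition
    # energies; chaining ("sewing") neighbouring bins then traces the functional form.
    import numpy as np

    def relative_gsf_points(ex_bins, counts_to_level1, counts_to_level2, e_level1, e_level2):
        eg1 = ex_bins - e_level1
        eg2 = ex_bins - e_level2
        f1 = counts_to_level1 / eg1 ** 3    # f(E_g1), up to a bin-dependent constant
        f2 = counts_to_level2 / eg2 ** 3    # f(E_g2), same constant within the bin
        return eg1, f1, eg2, f2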
We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present the Bayes theorem and its applications. In particular we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to uniquely determine the spectrum model. For these two studies, we implement the program Nested fit to calculate the different probability distributions and other related quantities. Nested fit is a Fortran90/Python code developed over the past years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
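To make the evidence-based model comparison concrete, the sketch below estimates the Bayesian evidence of a "single line" model versus a "line plus satellite" model by brute-force Monte Carlo integration over illustrative uniform priors; it is only a toy stand-in for nested sampling and not the Nested fit implementation, and all line shapes, priors and parameter ranges are assumptions.

    # Sketch: evidence Z = integral of likelihood over the prior, estimated by averaging
    # the likelihood at points drawn from the prior (inefficient, but shows the concept).
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(-5.0, 5.0, 200)
    data = 10.0 * np.exp(-0.5 * (x / 0.5) ** 2) + rng.normal(0.0, 1.0, x.size)  # toy spectrum

    def log_like(params, with_satellite):
        pos, amp = params[:2]
        model = amp * np.exp(-0.5 * ((x - pos) / 0.5) ** 2)
        if with_satellite:
            s_pos, s_amp = params[2:]
            model = model + s_amp * np.exp(-0.5 * ((x - s_pos) / 0.5) ** 2)
        return -0.5 * np.sum((data - model) ** 2)          # unit Gaussian noise assumed

    def log_evidence(with_satellite, n=20000):
        dim = 4 if with_satellite else 2
        lows = np.array([-3.0, 0.0, -3.0, 0.0])[:dim]      # uniform priors on position
        highs = np.array([3.0, 20.0, 3.0, 20.0])[:dim]     # and amplitude of each line
        samples = rng.uniform(lows, highs, size=(n, dim))
        logl = np.array([log_like(s, with_satellite) for s in samples])
        m = logl.max()
        return m + np.log(np.mean(np.exp(logl - m)))       # log of prior-averaged likelihood

    # The Bayes factor quantifies how much the data favour the satellite-line hypothesis.
    bayes_factor = np.exp(log_evidence(True) - log_evidence(False))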
The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both GERDA and Majorana.
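MGDO itself is a C++ library, so the fragment below is only a language-agnostic sketch (written in Python for brevity, with entirely hypothetical names, not the MGDO API) of the common-data-object idea described above: one waveform container shared by simulation and data acquisition, and an analysis tool that depends only on that container.

    # Sketch of a shared waveform object plus a generic pulse-shaping tool; the analysis
    # code does not care whether the trace came from a digitizer or from a simulation.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Waveform:
        samples: np.ndarray          # digitized or simulated trace
        sampling_period_ns: float
        detector_id: str
        provenance: str = "daq"      # "daq" or "montecarlo"

    def trapezoidal_filter(wf: Waveform, rise: int, flat: int) -> np.ndarray:
        """Generic energy filter operating only on the common container."""
        s = np.cumsum(wf.samples)
        out = np.zeros(len(s))
        for i in range(2 * rise + flat, len(s)):
            out[i] = (s[i] - s[i - rise]) - (s[i - rise - flat] - s[i - 2 * rise - flat])
        return out / rise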