
The Nested_fit data analysis program

Added by Martino Trassinelli
Publication date: 2019
Field: Physics
Language: English





We present here Nested_fit, a Bayesian data analysis code developed for investigations of atomic spectra and other physical data. It is based on the nested sampling algorithm, with the implementation of an upgraded lawn mower robot method for finding new live points. For a given data set and a chosen model, the program provides the Bayesian evidence, for the comparison of different hypotheses/models, and the probability distributions of the model parameters. A large database of spectral profiles is already available (Gaussian, Lorentzian, Voigt, log-normal, etc.) and additional ones can easily be added. The code is written in Fortran for optimized parallel computation and is accompanied by a Python library for visualizing the results.
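To make the core idea concrete, here is a minimal, hedged sketch of the nested sampling loop that underlies codes like Nested_fit: a set of live points is drawn from the prior, the lowest-likelihood point is repeatedly discarded with its weight added to the evidence, and a replacement is drawn under a hard likelihood constraint. The toy problem (one Gaussian peak on a uniform prior) and the plain rejection sampling used for the replacement step are illustrative assumptions; Nested_fit's actual search uses its upgraded lawn mower robot method.

```python
# Minimal nested sampling sketch (toy, not Nested_fit's implementation).
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(x):
    # Toy model: a Gaussian peak at x = 0.5 on a uniform prior over [0, 1]
    return -0.5 * ((x - 0.5) / 0.05) ** 2

n_live, n_iter = 100, 600
live = rng.uniform(0.0, 1.0, n_live)   # live points drawn from the prior
logL = log_likelihood(live)

logZ = -np.inf                         # running log-evidence
for i in range(1, n_iter + 1):
    worst = np.argmin(logL)
    # Prior volume shrinks as X_i ~ exp(-i/n_live); weight w_i = X_{i-1} - X_i
    log_w = np.log(np.exp(-(i - 1) / n_live) - np.exp(-i / n_live))
    logZ = np.logaddexp(logZ, logL[worst] + log_w)
    # Replace the worst point with a prior draw satisfying L > L_worst;
    # plain rejection sampling is used here purely for clarity.
    while True:
        x_new = rng.uniform(0.0, 1.0)
        if log_likelihood(x_new) > logL[worst]:
            break
    live[worst], logL[worst] = x_new, log_likelihood(x_new)

# Add the contribution of the remaining live points at the final prior volume
logZ = np.logaddexp(logZ,
                    np.logaddexp.reduce(logL) - np.log(n_live) - n_iter / n_live)
print("log-evidence estimate:", logZ)  # analytic value is about -2.08 here
```

The shrinkage relation X_i ≈ exp(-i/n_live) is what turns the sorted likelihood values into an evidence integral; more live points give finer shrinkage and a smaller uncertainty on the log-evidence.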



Related research


Tianhao Liu (2021)
Measuring the size of permanent electric dipole moments (EDMs) of a particle or system provides a powerful tool to test physics beyond the Standard Model. The diamagnetic $^{129}$Xe atom is one of the promising candidates for EDM experiments due to its attainable high nuclear polarization and its long spin-coherence time in a homogeneous magnetic field. By measuring the spin precession frequencies of polarized $^{129}$Xe and $^{3}$He, a new upper limit on the $^{129}$Xe atomic EDM $d_\mathrm{A}(^{129}\mathrm{Xe})$ was reported in Phys. Rev. Lett. 123, 143003 (2019). This writeup proposes a new evaluation method based on global phase fitting (GPF) for analyzing the continuous phase development of the $^{3}$He-$^{129}$Xe comagnetometer signal. The Cramér-Rao lower bound on the $^{129}$Xe EDM for the GPF method is derived theoretically and shows the benefit of achieving high statistical sensitivity without introducing new systematic uncertainties. The robustness of the GPF method is verified with Monte Carlo studies. By optimizing the analysis parameters and adding a few more data sets that could not be analyzed with the former method, a result of \[ d_\mathrm{A}(^{129}\mathrm{Xe}) = (1.1 \pm 3.6_\mathrm{(stat)} \pm 2.0_\mathrm{(syst)}) \times 10^{-28}\ e\,\mathrm{cm} \] is obtained and used to derive the upper limit on the permanent EDM of $^{129}$Xe at 95% C.L.: \[ |d_\mathrm{A}(^{129}\mathrm{Xe})| < 8.3 \times 10^{-28}\ e\,\mathrm{cm}. \] This limit is a factor of 1.7 smaller than the previous result.
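As an illustration of the global-phase-fitting idea, the hedged sketch below fits the continuously accumulated comagnetometer phase over a whole run with a single linear least-squares model, rather than fitting frequencies segment by segment. The signal model, the E-field reversal pattern, and all numbers are hypothetical stand-ins, not the published analysis; the fitted coefficient of the E-correlated column plays the role of an EDM-like frequency shift.

```python
# Illustrative global phase fit; model and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 1000.0, 1.0)                    # time stamps [s]
e_sign = np.where((t // 250) % 2 == 0, 1.0, -1.0)  # E-field reversal pattern

f0, df_edm = 0.1, 2e-6          # carrier frequency and EDM-like shift [Hz]
# Accumulated (unwrapped) phase: 2*pi * integral of f0 + df_edm * e_sign(t)
phase = 2 * np.pi * (f0 * t + df_edm * np.cumsum(e_sign))
phase += rng.normal(0.0, 1e-3, t.size)             # phase read-out noise [rad]

# One global linear least-squares fit over the full run:
# columns = phase offset, carrier frequency, E-correlated frequency shift
A = np.column_stack([np.ones_like(t),
                     2 * np.pi * t,
                     2 * np.pi * np.cumsum(e_sign)])
coef, *_ = np.linalg.lstsq(A, phase, rcond=None)
print("fitted E-correlated shift [Hz]:", coef[2])  # should recover ~2e-6
```

Because the whole run enters one fit, no statistics are lost at segment boundaries, which is the intuition behind the improved Cramér-Rao bound mentioned above.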
To evaluate the effectiveness of containment measures against the epidemic spreading of the new coronavirus disease 2019, we carry out an analysis of the time evolution of the infection in a selected number of different countries, by considering two well-known macroscopic growth laws: the Gompertz law and the logistic law. We also propose a generalization of the Gompertz law. Our data analysis permits an evaluation of the maximum number of infected individuals. The daily data must be compared with the obtained fits to verify whether the spreading is under control. From our analysis it appears that the spreading reached saturation in China, thanks to the strong containment policy of the national government. In Singapore, a recently observed large growth rate suggests the start of a new strong spreading phase. For South Korea and Italy, instead, the next data on new infections will be crucial for understanding whether saturation will be reached at lower or higher numbers of infected individuals.
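A minimal sketch of the kind of fit described above, assuming synthetic cumulative-case data and standard SciPy fitting; with real daily totals, the fitted plateau parameter is the estimate of the maximum number of infected individuals.

```python
# Fit the Gompertz and logistic laws to a (synthetic) cumulative-case curve.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, a, k, t0):
    # a: plateau (maximum number of infected), k: growth rate, t0: midpoint
    return a / (1.0 + np.exp(-k * (t - t0)))

def gompertz(t, a, b, c):
    # a: plateau; b, c: displacement and growth-rate parameters
    return a * np.exp(-b * np.exp(-c * t))

t = np.arange(60.0)
data = logistic(t, 80000, 0.25, 30) \
       + np.random.default_rng(2).normal(0, 500, t.size)  # synthetic totals

p_log, _ = curve_fit(logistic, t, data, p0=[data.max(), 0.1, t.mean()])
p_gom, _ = curve_fit(gompertz, t, data, p0=[data.max(), 10.0, 0.1],
                     maxfev=10000)
print("logistic plateau:", p_log[0], "  Gompertz plateau:", p_gom[0])
```

Comparing the two fitted plateaus (and how they shift as new daily data arrive) is the practical test of whether the spreading is approaching saturation.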
While experiments on fusion plasmas produce high-dimensional data time series of ever-increasing magnitude and velocity, data analysis has been lagging behind this development. For example, many data analysis tasks are often performed in a manual, ad-hoc manner some time after an experiment. In this article we introduce the DELTA framework, which facilitates near real-time streaming analysis of big and fast fusion data. By streaming measurement data from fusion experiments to a high-performance compute center, DELTA makes it possible to perform demanding data analysis tasks in between plasma pulses. This article describes the modular and expandable software architecture of DELTA and presents performance benchmarks of its individual components as well as of entire workflows. Our focus is on the streaming analysis of ECEi data measured at KSTAR on NERSC's supercomputers, and we routinely achieve data transfer rates of about 500 megabytes per second. We show that a demanding turbulence analysis workload can be distributed among multiple GPUs and executes in under 5 minutes. We further discuss how DELTA uses modern database systems and container orchestration services to provide web-based real-time data visualization. For the case of ECEi data we demonstrate how data visualizations can be augmented with outputs from machine learning models. By providing session leaders and physics operators with the results of higher-order data analysis through live visualization, they can monitor the evolution of a long-pulse discharge in near real-time and make more informed decisions on how to configure the machine for the next shot.
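The streaming pattern described above can be sketched in a few lines: a producer yields measurement chunks as they arrive while a pool of workers analyzes them concurrently. The in-process pipeline, the 192-channel ECEi-like array shape, and the power-spectrum analysis kernel are simplifying assumptions; DELTA's actual transport and scheduling layers are considerably more involved.

```python
# Streaming-analysis sketch: producer yields chunks, worker pool analyzes them.
import numpy as np
from multiprocessing import Pool

def analyze(chunk):
    # Stand-in analysis kernel: channel-averaged power spectrum of one chunk
    spectra = np.abs(np.fft.rfft(chunk, axis=1)) ** 2
    return spectra.mean(axis=0)

def stream_chunks(n_chunks=16, n_channels=192, n_samples=1024):
    # Stands in for the network stream from the experiment (random data here)
    rng = np.random.default_rng(3)
    for _ in range(n_chunks):
        yield rng.normal(size=(n_channels, n_samples))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # imap consumes chunks as they arrive, keeping the workers busy
        for i, spectrum in enumerate(pool.imap(analyze, stream_chunks())):
            print(f"chunk {i}: peak spectral power {spectrum.max():.1f}")
```

The key design point is that analysis overlaps with data arrival instead of waiting for a complete file, which is what makes between-pulse turnaround feasible.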
Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals caused by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of oversubtracting the background. Coincidence measurements additionally suffer from false coincidences. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the excited-state spectrum of interest from pump-probe and pump-only measurements. This approach makes it possible to treat background and false coincidences consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach makes it possible to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The application to pump-probe coincidence measurements on acetone molecules enables novel quantitative interpretations of the molecular decay dynamics and fragmentation behavior.
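For a single spectral bin, the hedged toy below contrasts naive subtraction with the Bayesian route: pump-probe counts are modeled as Poisson(s + b), pump-only counts as Poisson(b), positivity of the signal s enters through the prior, and the background b is marginalized on a grid. The counts and grids are invented for illustration; the paper's full method treats whole spectra and false coincidences.

```python
# Single-bin toy: Bayesian signal estimate vs. naive pump-probe subtraction.
import numpy as np

n_pp, n_po = 12, 9                 # pump-probe and pump-only counts (invented)
s = np.linspace(0.0, 40.0, 401)    # signal grid; flat prior with s >= 0
b = np.linspace(0.0, 40.0, 401)    # background grid, marginalized below
ds = s[1] - s[0]

S, B = np.meshgrid(s, b, indexing="ij")
# Log joint posterior: flat positive priors, Poisson likelihoods for both runs
logp = n_pp * np.log(S + B + 1e-300) - (S + B) + n_po * np.log(B + 1e-300) - B
post = np.exp(logp - logp.max()).sum(axis=1)   # marginalize over background
post /= post.sum() * ds

mean = (s * post).sum() * ds
cdf = np.cumsum(post) * ds
lo68, hi68 = s[np.searchsorted(cdf, 0.16)], s[np.searchsorted(cdf, 0.84)]
print("naive subtraction:", n_pp - n_po)
print(f"Bayesian mean: {mean:.1f}, 68% interval [{lo68:.1f}, {hi68:.1f}]")
```

Unlike the naive difference, the posterior can never assign probability to a negative signal, and the interval comes out of the same calculation rather than from error propagation after the fact.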
This paper employs Bayesian probability theory for analyzing data generated in femtosecond pump-probe photoelectron-photoion coincidence (PEPICO) experiments. These experiments make it possible to investigate ultrafast dynamical processes in photoexcited molecules. Bayesian probability theory is consistently applied to the data analysis problems occurring in these types of experiments, such as background subtraction and false coincidences. We previously demonstrated that the Bayesian formalism has many advantages, amongst which are compensation for false coincidences, no overestimation of pump-only contributions, a significantly increased signal-to-noise ratio, and applicability to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, our approach allows experiments to be run at higher ionization rates, resulting in an appreciable reduction of data acquisition times. Extending our previous paper, we here include fluctuating laser intensities, whose straightforward implementation highlights yet another advantage of the Bayesian formalism. Our method is thoroughly tested on challenging mock data, where we find a minor impact of laser fluctuations on false coincidences, yet a noteworthy influence on background subtraction. We apply our algorithm to experimental data and discuss the impact of laser fluctuations on the data analysis.
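A hedged sketch of the one modification this paper adds on top of the Poisson picture: each laser shot k contributes counts drawn from Poisson(I_k r), so the measured shot intensities I_k rescale the expected rate in the likelihood, and the maximum-likelihood rate becomes total counts over total intensity rather than counts per shot. All numbers are illustrative.

```python
# Intensity-scaled Poisson rates: each shot k gives counts ~ Poisson(I_k * r).
import numpy as np

rng = np.random.default_rng(4)
r_true = 0.8                                  # true rate per unit intensity
I = rng.normal(0.9, 0.2, 10000).clip(0.1)     # measured shot intensities
counts = rng.poisson(I * r_true)

r_hat = counts.sum() / I.sum()   # ML rate under the intensity-scaled model
naive = counts.mean()            # per-shot rate, blind to the fluctuations
print(f"intensity-aware rate: {r_hat:.3f}  intensity-blind rate: {naive:.3f}")
```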
