
Optimizing Nuclear Reaction Analysis (NRA) using Bayesian Experimental Design

Added by Udo V. Toussaint
Publication date: 2008
Field: Physics
Language: English





Nuclear Reaction Analysis with ${}^{3}$He holds the promise of measuring Deuterium depth profiles up to large depths. However, the extraction of the depth profile from the measured data is an ill-posed inversion problem. Here we demonstrate how Bayesian Experimental Design can be used to optimize the number of measurements as well as the measurement energies to maximize the information gain. Comparison of the inversion properties of the optimized design with standard settings reveals substantial possible gains. Application of the posterior sampling method allows the experimental settings to be optimized interactively during the measurement process.
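The core idea can be illustrated with a toy sketch (this is not the paper's actual NRA forward model; the exponential attenuation, noise level, and prior range below are made up for illustration): the expected information gain of each candidate measurement setting is estimated by nested Monte Carlo over prior samples, and the most informative setting would be measured next.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(theta, x):
    # Toy forward model: the signal is attenuated as the design
    # setting x grows, so larger x means a less informative measurement.
    return theta * np.exp(-x)

def expected_information_gain(x, prior_samples, sigma=0.1, n_outer=200):
    """Nested Monte Carlo estimate of the expected information gain of a
    measurement at design x, assuming additive Gaussian noise of width
    sigma. The Gaussian normalization constant cancels between terms."""
    thetas = rng.choice(prior_samples, size=n_outer)
    ys = forward(thetas, x) + sigma * rng.normal(size=n_outer)
    # Log-likelihood of each simulated outcome under its own parameter.
    log_lik = -0.5 * ((ys - forward(thetas, x)) / sigma) ** 2
    # Log marginal likelihood: average the likelihood over the prior.
    mu = forward(prior_samples[None, :], x)          # shape (1, N)
    log_marg = np.log(np.mean(
        np.exp(-0.5 * ((ys[:, None] - mu) / sigma) ** 2), axis=1))
    return float(np.mean(log_lik - log_marg))

prior = rng.uniform(0.5, 2.0, size=500)              # flat prior samples
candidates = np.linspace(0.0, 3.0, 7)                # candidate settings
eig = [expected_information_gain(x, prior) for x in candidates]
best = candidates[int(np.argmax(eig))]               # most informative
```

Because the toy signal is attenuated at large `x`, the estimated gain decreases along the candidate list and the unattenuated setting is selected; in the interactive scheme described in the abstract, this selection would be repeated with the current posterior replacing the prior after each measurement.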



Related Research

Beam and machine stability are among the highest priorities in a modern light source. Although many machine parameters can be used to represent beam quality, until recently there has been no single parameter that conveys this global information to machine operators and accelerator physicists. A new parameter has been studied over the last few years as a beam-quality flag at the Shanghai Synchrotron Radiation Facility (SSRF). Calculations, simulations, and detailed analysis of real-time data from the storage ring have been carried out, and the results confirm its feasibility.
We describe how a single-particle tracking experiment should be designed in order for its recorded trajectories to contain the most information about a tracked particle's diffusion coefficient. The precision of estimators for the diffusion coefficient is affected by motion blur, limited photon statistics, and the length of recorded time-series. We demonstrate for a particle undergoing free diffusion that precision is negligibly affected by motion blur in typical experiments, while optimizing photon counts and the number of recorded frames is the key to precision. Building on these results, we describe for a wide range of experimental scenarios how to choose experimental parameters in order to optimize the precision. Generally, one should choose quantity over quality: experiments should be designed to maximize the number of frames recorded in a time-series, even if this means lower information content in individual frames.
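The quantity-over-quality conclusion can be checked numerically with a minimal sketch (a 1-D free-diffusion simulation with illustrative values for the diffusion coefficient and frame time, using a simple single-lag MSD estimator rather than the paper's full analysis): the relative error shrinks as the number of recorded frames grows.

```python
import numpy as np

rng = np.random.default_rng(1)
D_TRUE, DT = 1.0, 0.05        # illustrative diffusion coeff. and frame time

def estimate_D(n_frames):
    """Estimate D from one 1-D free-diffusion trajectory via the
    single-lag mean squared displacement: E[dx^2] = 2 * D * dt."""
    steps = rng.normal(0.0, np.sqrt(2 * D_TRUE * DT), size=n_frames - 1)
    return np.mean(steps ** 2) / (2 * DT)

def relative_error(n_frames, n_rep=2000):
    """Relative standard error of the estimator over many repetitions."""
    est = [estimate_D(n_frames) for _ in range(n_rep)]
    return float(np.std(est) / D_TRUE)

err_short, err_long = relative_error(50), relative_error(800)
# The error follows roughly sqrt(2 / (n_frames - 1)): recording 16x more
# frames improves the precision by about a factor of 4.
```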
Observational data collected during experiments, such as the planned Fire and Smoke Model Evaluation Experiment (FASMEE), are critical for progressing and transitioning coupled fire-atmosphere models like WRF-SFIRE and WRF-SFIRE-CHEM into operational use. Historical meteorological data, representing typical weather conditions for the anticipated burn locations and times, have been processed to initialize and run a set of simulations representing the planned experimental burns. Based on an analysis of these numerical simulations, this paper provides recommendations on the experimental setup, including the ignition procedures, the size and duration of the burns, and optimal sensor placement. New techniques are developed to initialize coupled fire-atmosphere simulations with weather conditions typical of the planned burn locations and time of the year. Analysis of variation and sensitivity analysis of the simulation design with respect to model parameters, via repeated Latin Hypercube Sampling, are used to assess the locations of the sensors. The simulations provide the measurement locations that maximize the expected variation of the sensor outputs with the model parameters.
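The Latin Hypercube Sampling step can be sketched generically (a hand-rolled stratified sampler and a made-up linear "sensor response", not the actual WRF-SFIRE parameter set): each parameter's range is cut into equal strata that are each sampled exactly once, and the variation of outputs across the design exposes which parameters the sensor outputs are sensitive to.

```python
import numpy as np

rng = np.random.default_rng(2)

def latin_hypercube(n_samples, n_dims):
    """Latin Hypercube Sample on [0, 1)^d: each dimension is split into
    n_samples equal strata and every stratum is hit exactly once."""
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):            # decorrelate the dimensions
        rng.shuffle(u[:, d])
    return u

X = latin_hypercube(200, 3)
# Made-up sensor response, dominated by the first model parameter.
y = 5.0 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * X[:, 2]
# Rank parameter influence by absolute correlation with the output.
sens = [abs(np.corrcoef(X[:, d], y)[0, 1]) for d in range(3)]
```

Repeating the design with fresh random strata assignments, as the abstract describes, gives replicate sensitivity estimates from which the variation of the sensor outputs with the model parameters can be assessed.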
The Linac Coherent Light Source changes configurations multiple times per day, necessitating fast tuning strategies to reduce setup time for successive experiments. To this end, we employ a Bayesian approach to transport optics tuning to optimize groups of quadrupole magnets. We use a Gaussian process to provide a probabilistic model of the machine response with respect to control parameters from a modest number of samples. Subsequent samples are selected during optimization using a statistical test combining the model prediction and uncertainty. The model parameters are fit from archived scans, and correlations between devices are added from a simple beam transport model. The result is a sample-efficient optimization routine, which we show significantly outperforms existing optimizers.
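The general approach can be sketched in a self-contained toy (a 1-D quadratic stand-in for the machine response and a hand-rolled RBF Gaussian process; this is not the authors' model, kernel, or the LCLS control interface): fit a GP to the samples taken so far, then choose the next control setting by an upper-confidence-bound rule that combines the model prediction with its uncertainty.

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Gaussian-process posterior mean and std at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.clip(var, 0.0, None))

def objective(x):
    # Stand-in for the measured machine response; optimum at x = 0.7.
    return -(x - 0.7) ** 2

X = np.array([0.1, 0.5, 0.9])          # initial control settings
y = objective(X)
grid = np.linspace(0.0, 1.0, 101)      # candidate settings
for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(mu + 2.0 * sd)]   # UCB acquisition
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))
best_x = X[np.argmax(y)]
```

The sample efficiency comes from the acquisition rule: each new sample is placed where the model is either promising or uncertain, so far fewer machine measurements are spent than in a naive scan.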
Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals produced by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, degrades the signal-to-noise ratio and, in the case of coincidence detection, risks overestimated background subtraction. Coincidence measurements additionally suffer from false coincidences. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection we reconstruct the excited-state spectrum of interest from pump-probe and pump-only measurements. This approach allows background and false coincidences to be treated consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach makes it possible to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The application to pump-probe coincidence measurements on acetone molecules enables novel quantitative interpretations of the molecular decay dynamics and fragmentation behavior.
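To make the idea concrete, here is a single-bin sketch with made-up count numbers and flat priors (far simpler than the paper's full treatment, which covers whole spectra and false coincidences): instead of subtracting the pump-only counts, the background rate is marginalized, which keeps the inferred excited-state signal strictly positive and yields a full posterior rather than a point estimate.

```python
import numpy as np

def signal_posterior(n_pp, n_po, s_grid, b_grid):
    """Posterior p(s | data) for pump-probe counts n_pp ~ Poisson(s + b)
    and pump-only counts n_po ~ Poisson(b), with flat priors on
    s, b >= 0. The background rate b is marginalized on b_grid."""
    log_post = np.empty_like(s_grid)
    for i, s in enumerate(s_grid):
        log_joint = (n_pp * np.log(s + b_grid) - (s + b_grid)
                     + n_po * np.log(b_grid) - b_grid)
        log_post[i] = np.logaddexp.reduce(log_joint)   # sum over b
    log_post -= log_post.max()                         # stable exponent
    p = np.exp(log_post)
    return p / p.sum()

s_grid = np.linspace(0.01, 40.0, 400)
b_grid = np.linspace(0.01, 40.0, 400)
p = signal_posterior(n_pp=15, n_po=8, s_grid=s_grid, b_grid=b_grid)
s_mean = float(np.sum(s_grid * p))     # Bayesian estimate, always >= 0
# Naive subtraction gives 15 - 8 = 7 here, but in low-count bins it can
# go negative; the marginalized posterior cannot.
```

Credible intervals follow directly from the normalized posterior `p`, which is advantage (v) listed in the abstract.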
