
The modular SAXS data correction sequence for solids and dispersions

Added by Brian Pauw
Publication date: 2017
Field: Physics
Language: English





Data correction is probably the least favourite activity among users experimenting with small-angle X-ray scattering (SAXS): if it is not done sufficiently well, this may become evident only during the data analysis stage, necessitating a repetition of the data corrections from scratch. A recommended, comprehensive sequence of elementary data correction steps is presented here to alleviate the difficulties associated with data correction. When applied in the proposed order, the resulting data will provide a high degree of accuracy for both solid samples and dispersions. The sequence can be applied without modification to any pinhole-collimated instrument with a photon-counting, direct-detection area detector.
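As a loose illustration of such a modular sequence, the sketch below chains a few elementary corrections in a fixed order. The choice of steps, function names and parameters are assumptions for illustration only, not the paper's exact sequence.

```python
import numpy as np

def correct_darkcurrent(frame, dark_rate, exposure_s):
    """Subtract the detector dark signal, scaled to the exposure time."""
    return np.asarray(frame, dtype=float) - dark_rate * exposure_s

def normalise(frame, exposure_s, transmission, flux):
    """Normalise to exposure time, sample transmission and incident flux."""
    return frame / (exposure_s * transmission * flux)

def subtract_background(sample, background):
    """Subtract an identically corrected and normalised background frame."""
    return sample - background

def correct(frame, background_frame, meta):
    # Apply the elementary steps as independent modules, in a fixed order.
    frame = correct_darkcurrent(frame, meta["dark_rate"], meta["exposure_s"])
    frame = normalise(frame, meta["exposure_s"], meta["transmission"], meta["flux"])
    return subtract_background(frame, background_frame)
```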




Read More

Brian Richard Pauw (2013)
For obtaining reliable nanostructural details of large amounts of sample --- where it is applicable --- Small-Angle Scattering (SAS) is a prime technique to use. It promises bulk-scale, statistically sound information on the morphological details of the nanostructure, and has thus led many a researcher to invest their time in it over the last eight decades of its development. Due to pressure both from scientists requesting more details on increasingly complex nanostructures, and from ever-improving instrumentation leaving less margin for ambiguity, small-angle scattering methodologies have been evolving at a high pace over the last few decades. As the quality of any result can only be as good as the data that goes into these methodologies, the improvements in data collection and all imaginable data correction steps are reviewed here. This work is intended to provide a comprehensive overview of all data corrections, to aid small-angle scatterers in deciding which are relevant for their measurement and how these corrections are performed. Clear mathematical descriptions of the corrections are provided where feasible. Furthermore, as no quality data exists without a decent estimate of its precision, the error estimation and its propagation through all these steps are provided alongside the corrections. With these data corrections, the collected small-angle scattering pattern can be brought to the highest standard, allowing for authoritative nanostructural characterisation through its analysis. A brief background of small-angle scattering, the instrumentation developments over the years, and pitfalls that may be encountered in data interpretation are provided as well.
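As one minimal sketch of the error propagation described above, the snippet below carries Poisson counting uncertainties through a background subtraction. The assumption that every pixel value is a raw photon count (so its standard deviation is sqrt(N)) is illustrative, not a statement of the review's full treatment.

```python
import numpy as np

def subtract_with_uncertainty(counts_sample, counts_background):
    """Background subtraction with Poisson errors added in quadrature."""
    counts_sample = np.asarray(counts_sample, dtype=float)
    counts_background = np.asarray(counts_background, dtype=float)
    intensity = counts_sample - counts_background
    # var(N) = N for Poisson counts; independent variances add.
    sigma = np.sqrt(counts_sample + counts_background)
    return intensity, sigma
```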
In recent years, several approaches for modelling pedestrian dynamics have been proposed and applied, e.g. for the design of egress routes. So far, however, not much attention has been paid to their quantitative validation. This unsatisfactory situation is due, among other things, to an uncertain and contradictory experimental data base. The fundamental diagram, i.e. the density dependence of the flow or velocity, is probably the most important relation, as it connects the basic parameters describing the dynamics of crowds. But specifications in different handbooks, as well as experimental measurements, differ considerably. The same is true for the bottleneck flow. After a comprehensive review of the experimental data base, we give a survey of a research project including experiments with up to 250 persons performed under well-controlled laboratory conditions. The trajectories of each person are measured with high precision to analyse the fundamental diagram and the flow through bottlenecks. The trajectories allow us to study how the method of measurement influences the resulting relations. Surprisingly, we found large deviations among the methods. These may be responsible for the deviations in the literature mentioned above. The results are of particular importance for the comparison of experimental data gained in different contexts and for the validation of models.
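To make the measurement-method dependence concrete, here is a hedged sketch of one classical way to extract the flow from trajectories: counting crossings of a reference line. The trajectory format is an assumption for illustration; the paper compares several such methods.

```python
import numpy as np

def flow_through_line(trajectories, x0):
    """Estimate flow J = N / Delta_t from crossings of the line x = x0.

    trajectories: iterable of (times, xs) array pairs, one pair per person.
    """
    crossing_times = []
    for times, xs in trajectories:
        # A crossing occurs where consecutive positions straddle x0.
        idx = np.where((xs[:-1] < x0) & (xs[1:] >= x0))[0]
        crossing_times.extend(times[idx])
    if len(crossing_times) < 2:
        return 0.0
    duration = max(crossing_times) - min(crossing_times)
    return (len(crossing_times) - 1) / duration  # persons per second
```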
Thermodynamic fluctuations in mechanical resonators cause uncertainty in their frequency measurement, fundamentally limiting performance of frequency-based sensors. Recently, integrating nanophotonic motion readout with micro- and nano-mechanical resonators allowed practical chip-scale sensors to routinely operate near this limit in high-bandwidth measurements. However, the exact and general expressions for either thermodynamic frequency measurement uncertainty or efficient, real-time frequency estimators are not well established, particularly for fast and weakly-driven resonators. Here, we derive, and numerically validate, the Cramer-Rao lower bound (CRLB) and an efficient maximum-likelihood estimator for the frequency of a classical linear harmonic oscillator subject to thermodynamic fluctuations. For a fluctuating oscillator without external drive, the frequency Allan deviation calculated from simulated resonator motion data agrees with the derived CRLB $\sigma_f = \frac{1}{2\pi}\sqrt{\frac{\Gamma}{2\tau}}$ for averaging times $\tau$ below, as well as above, the relaxation time $1/\Gamma$. The CRLB approach is general and can be extended to driven resonators, non-negligible motion detection imprecision, as well as backaction from a continuous linear quantum measurement.
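For orientation, the bound itself is straightforward to evaluate; the sketch below computes the CRLB-predicted Allan deviation over a range of averaging times. The numerical values of the decay rate and the averaging times are placeholder assumptions.

```python
import numpy as np

def crlb_allan_deviation(gamma, tau):
    """Lower bound sigma_f(tau) = (1 / (2*pi)) * sqrt(gamma / (2*tau))."""
    return np.sqrt(gamma / (2.0 * np.asarray(tau, dtype=float))) / (2.0 * np.pi)

tau = np.logspace(-4, 1, 50)   # averaging times in seconds (assumed)
gamma = 2 * np.pi * 10.0       # oscillator decay rate in rad/s (assumed)
bound = crlb_allan_deviation(gamma, tau)
```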
We present here Nested_fit, a Bayesian data analysis code developed for investigations of atomic spectra and other physical data. It is based on the nested sampling algorithm, with the implementation of an upgraded lawn mower robot method for finding new live points. For a given data set and a chosen model, the program provides the Bayesian evidence, for the comparison of different hypotheses/models, and the probability distributions of the different parameters. A large database of spectral profiles is already available (Gaussian, Lorentz, Voigt, log-normal, etc.) and additional ones can easily be added. It is written in Fortran for optimized parallel computation, and it is accompanied by a Python library for visualization of the results.
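The core nested sampling loop is compact enough to sketch. The toy below is not Nested_fit's actual implementation: new live points are drawn by naive rejection from a uniform prior on [0, 1] rather than by the paper's lawn mower robot method, and the Gaussian likelihood is an assumption.

```python
import numpy as np

def log_likelihood(theta):
    return -0.5 * ((theta - 0.5) / 0.05) ** 2  # toy Gaussian peak (assumed)

def nested_sampling(n_live=100, n_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    live = rng.uniform(size=n_live)                 # live points from the prior
    logl = np.array([log_likelihood(t) for t in live])
    evidence, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = np.argmin(logl)                     # lowest-likelihood live point
        x_i = np.exp(-i / n_live)                   # estimated remaining prior volume
        evidence += np.exp(logl[worst]) * (x_prev - x_i)
        x_prev = x_i
        while True:                                 # replace, constrained to L > L_worst
            cand = rng.uniform()
            if log_likelihood(cand) > logl[worst]:
                break
        live[worst], logl[worst] = cand, log_likelihood(cand)
    return evidence

print(nested_sampling())  # ~0.125 for this toy problem
```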
The REST-for-Physics (Rare Event Searches Toolkit for Physics) framework is a ROOT-based solution providing the means to process and analyze experimental or Monte Carlo event data. Special care has been taken over the traceability of the code and the validation of the results produced within the framework, together with the connectivity between the code and the stored data, registered through specific version metadata members. The framework development was originally motivated by the needs of rare event search experiments (experiments looking for phenomena with extremely low occurrence probability, such as dark matter or neutrino interactions, or rare nuclear decays), and its components naturally implement tools to address the challenges in these kinds of experiments; the integration of a detector physics response, the implementation of signal processing routines, and topological algorithms for physical event identification are some examples. Despite this specialization, the framework was conceived with scalability in mind, and other event-oriented applications could benefit from the data processing routines and/or metadata description implemented in REST, the generic framework tools being completely decoupled from the dedicated libraries. REST-for-Physics is a consolidated piece of software already serving the needs of different physics experiments - using gaseous Time Projection Chambers (TPCs) as the detection technology - for background data analysis and detector characterization, as well as for generic detector R&D. Even though REST has been exploited mainly with gaseous TPCs, the code could easily be applied or adapted to other detection technologies. We present in this work an overview of REST-for-Physics, providing a broad perspective on the infrastructure and organization of the project as a whole. The framework and its different components are described in the text.
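The event-processing-chain idea the abstract describes can be sketched generically. This is a hypothetical illustration, not REST-for-Physics' actual C++/ROOT API: each process transforms an event in turn, and the chain composition is recorded as metadata for traceability.

```python
class Event(dict):
    """Minimal stand-in for an event record (assumed structure)."""

class SignalSmoothing:
    """Toy process: replace the signal by a two-point moving average."""
    name = "signalSmoothing"
    def __call__(self, event):
        s = event["signal"]
        event["signal"] = [(a + b) / 2 for a, b in zip(s, s[1:])]
        return event

def run_chain(events, processes):
    """Run each event through the chain; record the chain as metadata."""
    metadata = {"chain": [p.name for p in processes]}
    processed = []
    for event in events:
        for process in processes:
            event = process(event)
        processed.append(event)
    return processed, metadata

events, meta = run_chain([Event(signal=[1, 2, 4, 8])], [SignalSmoothing()])
print(events, meta)
```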
