
Structure retrieval from 4D-STEM: statistical analysis of potential pitfalls in high-dimensional data

Published by: Xin Li
Publication date: 2019
Paper language: English





Four-dimensional scanning transmission electron microscopy (4D-STEM) is one of the most rapidly growing modes of electron microscopy imaging. The advent of fast pixelated cameras and the associated data infrastructure have greatly accelerated this process. Yet conversion of the 4D datasets into physically meaningful structure images in real-space remains an open issue. In this work, we demonstrate that it is possible to systematically create filters that alter the apparent resolution, or even the qualitative features, of the real-space structure image, reconstructing artificially generated patterns. As an initial effort, we explore statistical model selection algorithms, aiming for robust and reliable filter estimates. This statistical model selection analysis demonstrates the need for regularization and cross-validation of inversion methods to robustly recover structure from high-dimensional diffraction datasets.
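The abstract's central warning is that a filter can be tuned to produce almost any apparent structure, so the filter strength must be chosen by cross-validation rather than by eye. A minimal sketch of that idea, on a synthetic 1-D signal standing in for a real-space structure profile (all names and the moving-average filter are illustrative, not the paper's method):

```python
import math
import random

random.seed(0)

# Synthetic "structure" profile: two Gaussian peaks plus noise.
n = 200
truth = [math.exp(-((i - 60) / 8) ** 2) + math.exp(-((i - 140) / 8) ** 2)
         for i in range(n)]
noisy = [t + random.gauss(0.0, 0.3) for t in truth]

def smooth(signal, half_width):
    """Moving-average filter; a wider filter trades resolution for noise."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half_width), min(len(signal), i + half_width + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def cv_score(signal, half_width):
    """Simple split cross-validation: smooth the odd-indexed samples and
    score how well they predict the held-out even-indexed samples."""
    sm = smooth(signal[1::2], half_width)
    held_out = signal[0::2][:len(sm)]
    return sum((a - b) ** 2 for a, b in zip(held_out, sm)) / len(sm)

# Pick the filter width by held-out error, not by visual appeal.
widths = [0, 1, 2, 4, 8, 16, 32]
scores = {w: cv_score(noisy, w) for w in widths}
best = min(scores, key=scores.get)
print("chosen half-width:", best)
```

The unfiltered signal (width 0) and the heavily smoothed one both score worse on held-out data than an intermediate width, which is the behavior a cross-validated inversion exploits.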


Read also

Interface structures in complex oxides remain one of the active areas of condensed matter physics research, largely enabled by recent advances in scanning transmission electron microscopy (STEM). Yet the nature of the STEM contrast, in which the structure is projected along a given direction, precludes separation of possible structural models. Here, we utilize deep convolutional neural networks (DCNN) trained on simulated 4D-STEM datasets to predict structural descriptors of interfaces. We focus on the widely studied interface between LaAlO3 and SrTiO3, using dynamical diffraction theory and leveraging high performance computing to simulate thousands of possible 4D-STEM datasets to train the DCNN to learn properties of the underlying structures on which the simulations are based. We validate the DCNN on simulated data and show that it is possible (with >95% accuracy) to distinguish a physically rough interface from a chemically diffuse one, and achieve 85% accuracy in determination of buried step positions within the interface. The method shown here is general and can be applied to any inverse imaging problem where forward models are present.
Comparing paleoclimate time series is complicated by a variety of typical features, including irregular sampling, age model uncertainty (e.g., errors due to interpolation between radiocarbon sampling points) and time uncertainty (uncertainty in calibration), which, taken together, result in unequal and uncertain observation times of the individual time series to be correlated. Several methods have been proposed to approximate the joint probability distribution needed to estimate correlations, most of which rely either on interpolation or temporal downsampling. Here, we compare the performance of some popular approximation methods using synthetic data resembling common properties of real world marine sediment records. Correlations are determined by estimating the parameters of a bivariate Gaussian model from the data using Markov Chain Monte Carlo sampling. We complement our pseudoproxy experiments by applying the same methodology to a pair of marine benthic oxygen records from the Atlantic Ocean. We find that methods based upon interpolation yield better results in terms of precision and accuracy than those which reduce the number of observations. In all cases, the specific characteristics of the studied time series are, however, more important than the choice of a particular interpolation method. Relevant features include the number of observations, the persistence of each record, and the imposed coupling strength between the paired series. In most of our pseudoproxy experiments, uncertainty in observation times introduces less additional uncertainty than unequal sampling and errors in observation times do. Thus, it can be reasonable to rely on published time scales as long as calibration uncertainties are not known.
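The interpolation approach the abstract favors can be sketched in a few lines: two coupled, irregularly sampled pseudoproxy records, one linearly interpolated onto the other's observation times, then correlated. This uses a plain Pearson plug-in estimate rather than the paper's MCMC bivariate-Gaussian scheme, and all series and parameters are synthetic stand-ins:

```python
import bisect
import math
import random

random.seed(1)

def ar1(n, phi=0.8, sigma=0.5):
    """AR(1) series as a stand-in for a persistent proxy record."""
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + random.gauss(0.0, sigma))
    return x

# Common underlying signal plus independent noise -> coupled records.
n = 400
common = ar1(n)
rec_a = [c + random.gauss(0.0, 0.3) for c in common]
rec_b = [c + random.gauss(0.0, 0.3) for c in common]

# Irregular, unequal observation times (sorted random subsets).
obs_a = [(t, rec_a[t]) for t in sorted(random.sample(range(n), 150))]
obs_b = [(t, rec_b[t]) for t in sorted(random.sample(range(n), 120))]

def interp(obs, t):
    """Linear interpolation of sorted (time, value) pairs at time t."""
    times = [p[0] for p in obs]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return obs[0][1]
    if i == len(obs):
        return obs[-1][1]
    (t0, v0), (t1, v1) = obs[i - 1], obs[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Interpolate record B onto record A's observation times, then correlate.
b_on_a = [interp(obs_b, t) for t, _ in obs_a]
r = pearson([v for _, v in obs_a], b_on_a)
print("estimated correlation:", round(r, 3))
```

Interpolation keeps all of record A's observations, which is why the abstract finds it more precise than downsampling both records to common times.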
Spatial prediction of weather elements like temperature, precipitation, and barometric pressure is generally based on satellite imagery or data collected at ground stations. None of these data provide information at a more granular or hyper-local resolution. On the other hand, crowdsourced weather data, which are captured by sensors installed on mobile devices and gathered by weather-related mobile apps like WeatherSignal and AccuWeather, can serve as potential data sources for analyzing environmental processes at a hyper-local resolution. However, due to the low quality of the sensors and the non-laboratory environment, the quality of the observations in crowdsourced data is compromised. This paper describes methods to improve hyper-local spatial prediction using this varying-quality noisy crowdsourced information. We introduce a reliability metric, namely Veracity Score (VS), to assess the quality of the crowdsourced observations using a coarser, but high-quality, reference data. A VS-based methodology to analyze noisy spatial data is proposed and evaluated through extensive simulations. The merits of the proposed approach are illustrated through case studies analyzing crowdsourced daily average ambient temperature readings for one day in the contiguous United States.
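The general idea of down-weighting unreliable crowdsourced observations can be illustrated with reliability-weighted inverse-distance interpolation. The Veracity Score construction itself is specific to the paper; here it is replaced by a generic [0, 1] reliability weight on synthetic readings, so everything below is an illustrative sketch:

```python
import random

random.seed(2)

def true_field(x, y):
    """Hypothetical smooth temperature surface over a 10x10 domain."""
    return 20.0 + 0.5 * x - 0.3 * y

# Simulated crowdsourced readings: (x, y, reported_temp, reliability).
# Low-reliability sensors report noisier temperatures.
readings = []
for _ in range(200):
    x, y = random.uniform(0, 10), random.uniform(0, 10)
    vs = random.uniform(0.1, 1.0)      # stand-in for a Veracity Score
    noise_sd = 3.0 * (1.0 - vs)
    readings.append((x, y, true_field(x, y) + random.gauss(0.0, noise_sd), vs))

def predict(x0, y0, readings, power=2.0):
    """Inverse-distance weighting with each weight scaled by the
    observation's reliability, so noisy sensors count for less."""
    num = den = 0.0
    for x, y, temp, vs in readings:
        d2 = (x - x0) ** 2 + (y - y0) ** 2 + 1e-9
        w = vs / d2 ** (power / 2.0)
        num += w * temp
        den += w
    return num / den

est = predict(5.0, 5.0, readings)
print("prediction at (5, 5):", round(est, 2))
```

With the field above, the true value at (5, 5) is 21.0; scaling the distance weights by reliability pulls the estimate toward the better sensors.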
Lucio Anderlini, 2015
Density Estimation Trees can play an important role in exploratory data analysis for multidimensional, multi-modal data models of large samples. I briefly discuss the algorithm, a self-optimization technique based on kernel density estimation, and some applications in High Energy Physics.
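For contrast with the tree-based estimator above, the kernel density estimation it self-optimizes against can be written in a few lines. This is a generic 1-D Gaussian KDE with Silverman's rule-of-thumb bandwidth on synthetic bimodal data, not code from the cited work:

```python
import math
import random
import statistics

random.seed(3)

# Bimodal sample: a simple stand-in for multi-modal physics data.
data = ([random.gauss(-2.0, 0.5) for _ in range(300)]
        + [random.gauss(3.0, 1.0) for _ in range(300)])

def kde(xs, data, h):
    """Gaussian kernel density estimate with bandwidth h."""
    norm = 1.0 / (len(data) * h * math.sqrt(2 * math.pi))
    return [norm * sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)
            for x in xs]

# Silverman's rule of thumb for the bandwidth.
h = 1.06 * statistics.stdev(data) * len(data) ** (-1 / 5)
grid = [x / 10.0 for x in range(-60, 81)]   # -6.0 to 8.0 in steps of 0.1
dens = kde(grid, data, h)
print("bandwidth:", round(h, 3))
```

Both modes survive the smoothing, and the estimate integrates to roughly one over the grid; a density estimation tree would instead adapt its partition to the local sample density.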
Lithium iron phosphate (LixFePO4), a cathode material used in rechargeable Li-ion batteries, phase separates upon de/lithiation under equilibrium. The interfacial structure and chemistry within these cathode materials affect Li-ion transport, and therefore battery performance. Correlative imaging of LixFePO4 was performed using four-dimensional scanning transmission electron microscopy (4D-STEM), scanning transmission X-ray microscopy (STXM), and X-ray ptychography in order to analyze the local structure and chemistry of the same particle set. Over 50,000 diffraction patterns from 10 particles provided measurements of both structure and chemistry at a nanoscale spatial resolution (16.6-49.5 nm) over wide (several micron) fields-of-view with statistical robustness. LixFePO4 particles at varying stages of delithiation were measured to examine the evolution of structure and chemistry as a function of delithiation. In lithiated and delithiated particles, local variations were observed in the degree of lithiation even while local lattice structures remained comparatively constant, and calculation of linear coefficients of chemical expansion suggests pinning of the lattice structures in these populations. Partially delithiated particles broadly displayed core-shell-like structures, albeit with highly variable behavior both locally and between individual particles, exhibiting distinctive intermediate regions at the interface between phases and pockets within the lithiated core that correspond to FePO4 in structure and chemistry. The results provide insight into the LixFePO4 system, subtleties in the scope and applicability of Vegard's law (linear lattice parameter-composition behavior) under local versus global measurements, and demonstrate a powerful new combination of experimental and analytical modalities for bridging the crucial gap between local and statistical characterization.
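The Vegard's law baseline against which the abstract reports deviations is just a linear interpolation between end-member lattice parameters, which can also be inverted to turn a measured lattice parameter into an estimated local Li content. The a-axis values below are approximate, round-number literature figures used purely for illustration:

```python
# Approximate a-axis lattice parameters (angstrom) for the end members;
# illustrative values, not measurements from the cited study.
A_LIFEPO4 = 10.33   # x = 1 (fully lithiated)
A_FEPO4 = 9.82      # x = 0 (fully delithiated)

def vegard_a(x):
    """Vegard's law: lattice parameter varies linearly with composition x."""
    return A_FEPO4 + x * (A_LIFEPO4 - A_FEPO4)

def composition_from_a(a):
    """Invert the linear relation to estimate local Li content from a
    measured lattice parameter (the kind of map 4D-STEM provides)."""
    return (a - A_FEPO4) / (A_LIFEPO4 - A_FEPO4)

print(round(vegard_a(0.5), 3))              # midpoint lattice parameter
print(round(composition_from_a(10.33), 2))  # back to x = 1.0
```

A locally "pinned" lattice, as reported in the abstract, shows up as compositions inferred this way disagreeing with the directly measured chemistry.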