
Correlating Paleoclimate Time Series: Sources of Uncertainty and Potential Pitfalls

Added by Jasper G. Franke
Publication date: 2019
Language: English





Comparing paleoclimate time series is complicated by a variety of typical features, including irregular sampling, age model uncertainty (e.g., errors due to interpolation between radiocarbon sampling points) and time uncertainty (uncertainty in calibration), which, taken together, result in unequal and uncertain observation times of the individual time series to be correlated. Several methods have been proposed to approximate the joint probability distribution needed to estimate correlations, most of which rely on either interpolation or temporal downsampling. Here, we compare the performance of some popular approximation methods using synthetic data resembling common properties of real-world marine sediment records. Correlations are determined by estimating the parameters of a bivariate Gaussian model from the data using Markov Chain Monte Carlo sampling. We complement our pseudoproxy experiments by applying the same methodology to a pair of marine benthic oxygen isotope records from the Atlantic Ocean. We find that methods based on interpolation yield better results in terms of precision and accuracy than those which reduce the number of observations. In all cases, however, the specific characteristics of the studied time series are more important than the choice of a particular interpolation method. Relevant features include the number of observations, the persistence of each record, and the imposed coupling strength between the paired series. In most of our pseudoproxy experiments, uncertainty in observation times introduces less additional uncertainty than unequal sampling and errors in observation times do. Thus, it can be reasonable to rely on published time scales as long as calibration uncertainties are not known.
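As a rough illustration of the comparison the abstract describes, the sketch below correlates two synthetic AR(1) pseudoproxies observed at irregular, unequal times, once after linear interpolation onto a common axis and once after a crude nearest-neighbour downsampling, and estimates the correlation of a bivariate Gaussian model with a simple Metropolis sampler. This is not the paper's code; all parameter choices (series length, persistence phi, coupling rho, sampler settings) are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): interpolation vs. downsampling
# when correlating two irregularly sampled pseudoproxies.
import numpy as np

rng = np.random.default_rng(0)

def coupled_ar1(n=2000, phi=0.8, rho=0.6):
    """Two AR(1) series driven by correlated innovations (coupling rho)."""
    eps = rng.multivariate_normal([0, 0], [[1.0, rho], [rho, 1.0]], size=n)
    x = np.zeros((n, 2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x[:, 0], x[:, 1]

x, y = coupled_ar1()
t = np.arange(len(x))
# Irregular, unequal observation times for the two records
tx = np.sort(rng.choice(t, size=150, replace=False))
ty = np.sort(rng.choice(t, size=120, replace=False))

# Strategy 1: linear interpolation onto a common regular axis
tc = np.linspace(max(tx[0], ty[0]), min(tx[-1], ty[-1]), 100)
xi, yi = np.interp(tc, tx, x[tx]), np.interp(tc, ty, y[ty])

# Strategy 2: downsample y to x's observation times (nearest neighbour,
# one crude variant of the "reduce the observations" family of methods)
nn = np.abs(ty[None, :] - tx[:, None]).argmin(axis=1)
xd, yd = x[tx], y[ty[nn]]

def mcmc_corr(a, b, n_iter=5000, step=0.05):
    """Metropolis sampler for the correlation of a bivariate Gaussian."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    def loglik(r):
        if not -0.999 < r < 0.999:
            return -np.inf
        q = (a**2 - 2 * r * a * b + b**2) / (1 - r**2)
        return -0.5 * (len(a) * np.log(1 - r**2) + q.sum())
    r, samples = 0.0, []
    for _ in range(n_iter):
        prop = r + step * rng.normal()
        if np.log(rng.uniform()) < loglik(prop) - loglik(r):
            r = prop
        samples.append(r)
    return np.array(samples[n_iter // 2:])   # drop burn-in half

for name, (a, b) in {"interpolation": (xi, yi), "downsampling": (xd, yd)}.items():
    s = mcmc_corr(a, b)
    print(f"{name}: r = {s.mean():.2f} +/- {s.std():.2f}")
```

Note that this toy setup ignores the persistence-induced inflation of correlation estimates that the abstract highlights; it only shows the mechanics of the two approximation strategies.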



Related Research

Four-dimensional scanning transmission electron microscopy (4D-STEM) is one of the most rapidly growing modes of electron microscopy imaging. The advent of fast pixelated cameras and the associated data infrastructure have greatly accelerated this process. Yet conversion of the 4D datasets into physically meaningful structure images in real space remains an open issue. In this work, we demonstrate that it is possible to systematically create filters that will affect the apparent resolution or even qualitative features of the real-space structure image, reconstructing artificially generated patterns. As an initial effort, we explore statistical model selection algorithms, aiming for robustness and reliability of the estimated filters. This statistical model selection analysis demonstrates the need for regularization and cross-validation of inversion methods to robustly recover structure from high-dimensional diffraction datasets.
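The closing point about regularization and cross-validation can be illustrated generically. The sketch below selects a Tikhonov regularization parameter for a linear inversion y = A x by k-fold cross-validation; the random forward operator A and all sizes are stand-in assumptions, not a 4D-STEM imaging model.

```python
# Illustrative sketch only: cross-validated Tikhonov-regularized inversion.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_px = 200, 50
A = rng.normal(size=(n_obs, n_px))             # assumed forward operator
x_true = np.zeros(n_px); x_true[10:20] = 1.0   # simple "structure"
y = A @ x_true + 0.5 * rng.normal(size=n_obs)

def ridge_solve(A, y, lam):
    """Minimise ||A x - y||^2 + lam ||x||^2 in closed form."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

def cv_error(lam, k=5):
    """k-fold cross-validated prediction error for one lambda."""
    idx = rng.permutation(n_obs)
    err = 0.0
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        x_hat = ridge_solve(A[train], y[train], lam)
        err += np.mean((A[fold] @ x_hat - y[fold]) ** 2)
    return err / k

lams = np.logspace(-3, 3, 13)
best = min(lams, key=cv_error)                 # model selection by CV
x_hat = ridge_solve(A, y, best)
print(f"selected lambda = {best:.3g}, "
      f"reconstruction error = {np.linalg.norm(x_hat - x_true):.2f}")
```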
Understanding centennial-scale climate variability requires data sets that are accurate, long, continuous, and of broad spatial coverage. Since instrumental measurements are generally only available after 1850, temperature fields must be reconstructed using paleoclimate archives, known as proxies. Various climate field reconstruction (CFR) methods have been proposed to relate past temperature to such proxy networks. In this work, we propose a new CFR method, called GraphEM, based on Gaussian Markov random fields embedded within an EM algorithm. Gaussian Markov random fields provide a natural and flexible framework for modeling high-dimensional spatial fields. At the same time, they provide the parameter reduction necessary for obtaining precise and well-conditioned estimates of the covariance structure, even in the sample-starved setting common in paleoclimate applications. In this paper, we propose and compare the performance of different methods to estimate the graphical structure of climate fields, and demonstrate how the GraphEM algorithm can be used to reconstruct past climate variations. The performance of GraphEM is compared to the widely used CFR method RegEM with regularization via truncated total least squares, using synthetic data. Our results show that GraphEM can yield significant improvements, with uniform gains over space and far better risk properties. We demonstrate that the spatial structure of temperature fields can be well estimated by graphs where each point is connected only to a few geographically close neighbors, and that the increase in performance is directly related to recovering the underlying sparsity in the covariance of the spatial field. Our work demonstrates how significant improvements can be made in climate reconstruction methods by better modeling the covariance structure of the climate field.
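To make the ingredients concrete, here is a heavily simplified sketch: an EM-style loop that imputes missing values of a Gaussian field, with covariance entries outside a nearest-neighbour graph zeroed out. This is not the published GraphEM (which estimates the graph itself and exploits sparsity in the precision matrix of a Gaussian Markov random field); the toy field, the fixed graph, and the mean-only E-step are all illustrative assumptions.

```python
# Minimal sketch, not the published GraphEM code.
import numpy as np

rng = np.random.default_rng(2)

def em_impute(X, neighbors, n_iter=50, ridge=1e-3):
    """EM-style imputation of NaNs in X (time x sites) under a Gaussian,
    with covariance entries outside the neighbor graph zeroed out."""
    n, p = X.shape
    miss = np.isnan(X)
    Xf = np.where(miss, np.nanmean(X, axis=0), X)   # initial mean fill
    for _ in range(n_iter):
        mu = Xf.mean(axis=0)
        S = np.cov(Xf, rowvar=False)
        S = np.where(neighbors, S, 0.0)   # crude graph sparsity (on the
        # covariance; a true GMRF is sparse in the *precision* matrix)
        w = np.linalg.eigvalsh(S)         # thresholding can break positive
        if w.min() < ridge:               # definiteness; shift spectrum back
            S = S + (ridge - w.min()) * np.eye(p)
        for i in range(n):
            m = miss[i]
            if not m.any():
                continue
            o = ~m
            # E-step (mean only, for brevity): conditional expectation
            # of the missing entries given the observed ones
            Xf[i, m] = mu[m] + S[np.ix_(m, o)] @ np.linalg.solve(
                S[np.ix_(o, o)], Xf[i, o] - mu[o])
    return Xf

# Toy "climate field": 20 sites on a line, correlation decaying with distance
p, n = 20, 300
d = np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), 0.7 ** d, size=n)
X[rng.uniform(size=X.shape) < 0.3] = np.nan   # 30% missing "proxy gaps"
X_hat = em_impute(X, d <= 2)                  # link each site to near sites
print("imputed field, NaNs left:", np.isnan(X_hat).sum())
```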
We propose a Bayesian nonparametric approach to modelling and predicting a class of functional time series, with application to energy markets, based on fully observed, noise-free functional data. Traders in such contexts can devise profitable strategies if they can anticipate the impact of their bidding actions on the aggregate demand and supply curves, which in turn need to be predicted reliably. Here we propose a simple Bayesian nonparametric method for predicting such curves, which take the form of monotonic bounded step functions. We borrow ideas from population genetics by defining a class of interacting particle systems to model the functional trajectory, and develop an implementation strategy which uses ideas from Markov chain Monte Carlo and approximate Bayesian computation techniques and allows us to circumvent the intractability of the likelihood. Our approach adapts well to the degree of smoothness of the curves and the volatility of the functional series, proves robust to an increase in the forecast horizon, and yields an uncertainty quantification for the functional forecasts. We illustrate the model and discuss its performance on simulated datasets and on real data from the Italian natural gas market.
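The likelihood-free idea the abstract leans on can be shown in its simplest form, rejection ABC. The toy simulator below (a monotone bounded step curve built from a Poisson number of uniform jumps) is an assumption for illustration, not the paper's interacting particle system.

```python
# Generic rejection-ABC sketch; toy model, not the paper's.
import numpy as np

rng = np.random.default_rng(3)
grid = np.linspace(0.0, 1.0, 100)

def simulate(rate):
    """Toy monotone, bounded step curve: Poisson(rate) many upward jumps
    at uniform locations, with the level capped at 1."""
    jumps = np.sort(rng.uniform(size=rng.poisson(rate)))
    return np.minimum(np.searchsorted(jumps, grid) / 30.0, 1.0)

observed = simulate(20.0)                   # pretend this is the data

def distance(a, b):
    return np.mean(np.abs(a - b))           # L1 distance between curves

# Rejection ABC: draw from the prior, simulate, keep draws whose curve
# lands close to the observed one; no likelihood evaluation needed.
accepted = []
for _ in range(20000):
    rate = rng.uniform(1.0, 50.0)           # prior on the jump intensity
    if distance(simulate(rate), observed) < 0.12:
        accepted.append(rate)

post = np.array(accepted)
print(f"accepted {post.size} draws; posterior mean rate = {post.mean():.1f}")
```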
We study the applicability of the time-dependent variational principle (TDVP) in matrix product state manifolds for the long-time description of quantum interacting systems. By studying integrable and nonintegrable systems for which the long-time dynamics are known, we demonstrate that convergence of long-time observables is subtle and needs to be examined carefully. Remarkably, for the disordered nonintegrable system we consider, the long-time dynamics are in good agreement with the rigorously obtained short-time behavior and with previously obtained numerically exact results, suggesting that, at least in this case, the apparent convergence of this approach is reliable. Our study indicates that while great care must be exercised in establishing the convergence of the method, it may still be asymptotically accurate for a class of disordered nonintegrable quantum systems.
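The benchmarking step this relies on, comparing against systems whose dynamics are known, can be sketched with exact evolution of a small chain. The snippet below evolves a disordered Heisenberg chain exactly and tracks a local observable; such a reference is what an approximate TDVP/MPS result could be checked against. The chain length, disorder strength, and observable are illustrative choices, and no MPS code appears here.

```python
# Exact-dynamics reference for convergence checks (small system only).
import numpy as np
from scipy.sparse import csr_matrix, identity, kron
from scipy.sparse.linalg import expm_multiply

L = 8                                       # chain length; 2^8 states, exact
rng = np.random.default_rng(4)
sz = csr_matrix([[0.5, 0.0], [0.0, -0.5]])
sp = csr_matrix([[0.0, 1.0], [0.0, 0.0]])   # S+ ; S- is its transpose

def site_op(op, i):
    """Embed a single-site operator at site i of an L-site chain."""
    out = identity(1, format="csr")
    for j in range(L):
        out = kron(out, op if j == i else identity(2, format="csr"), format="csr")
    return out

# Heisenberg chain with random on-site fields (disordered, nonintegrable)
H = csr_matrix((2**L, 2**L))
for i in range(L - 1):
    H = H + site_op(sz, i) @ site_op(sz, i + 1)
    H = H + 0.5 * (site_op(sp, i) @ site_op(sp.T, i + 1)
                   + site_op(sp.T, i) @ site_op(sp, i + 1))
for i in range(L):
    H = H + rng.uniform(-1.0, 1.0) * site_op(sz, i)

# Neel-like product state |01010101> as the initial condition
psi = np.zeros(2**L, dtype=complex)
psi[int("01" * (L // 2), 2)] = 1.0

Sz0 = site_op(sz, 0)
for t in [0.5, 2.0, 8.0, 20.0]:
    phi = expm_multiply(-1j * t * H, psi)   # numerically exact evolution
    print(f"t = {t:5.1f}   <Sz_0> = {(phi.conj() @ (Sz0 @ phi)).real:+.4f}")
```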
B. Ravi Kiran, 2017
In the class of streaming anomaly detection algorithms for univariate time series, the size of the sliding window over which various statistics are calculated is an important parameter. To address anomalous variation in the scale of the pseudo-periodicity of a time series, we define a streaming multi-scale anomaly score using a streaming PCA over a multi-scale lag-matrix. We define three methods of aggregating the multi-scale anomaly scores and evaluate their performance on the Yahoo! and Numenta datasets for unsupervised anomaly detection benchmarking. To the best of the authors' knowledge, this is the first time multi-scale streaming anomaly detection has been proposed and systematically studied.
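A rough sketch of the idea (not the paper's code): build lag-matrices at several window scales, score each window by PCA reconstruction error, and aggregate across scales. The scales, the max-aggregation, and the batch PCA are assumptions; a genuinely streaming variant could use sklearn's IncrementalPCA with partial_fit.

```python
# Illustrative multi-scale anomaly score via PCA reconstruction error.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
x[1500:1510] += 3.0                         # injected anomaly

def scale_score(x, w, n_components=3):
    """Anomaly score at window scale w: PCA reconstruction error of the
    lag-matrix. Batch PCA stands in for a streaming PCA here."""
    lags = sliding_window_view(x, w)        # rows are length-w lag vectors
    pca = PCA(n_components=n_components).fit(lags[:1000])  # "history" only
    recon = pca.inverse_transform(pca.transform(lags))
    return np.linalg.norm(lags - recon, axis=1)

scales = [25, 50, 100]                      # assumed multi-scale windows
scores = [scale_score(x, w) for w in scales]
n = min(len(s) for s in scores)
agg = np.max([s[:n] for s in scores], axis=0)   # one of several aggregations
print("anomaly flagged near window index", int(np.argmax(agg)))
```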
