It is demonstrated how to generate time series with tailored nonlinearities by imposing well-defined constraints on the Fourier phases. Correlations between adjacent Fourier phases and (static and dynamic) measures of nonlinearity are established, and their origin is explained. By applying a set of simple constraints to the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power-law character of the intensity distributions, typical of, e.g., turbulence and financial data, can thus be explained in terms of phase correlations.
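The recipe described above can be sketched as follows. The specific phase constraint used here (mixing each phase with its lower-frequency neighbour so that adjacent phases become correlated) is an illustrative assumption, not the paper's exact constraint set; the key invariant is that the amplitude spectrum, and hence the linear correlation structure, is untouched:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
x = rng.standard_normal(n)            # linear, uncorrelated Gaussian series

X = np.fft.rfft(x)
amp, phase = np.abs(X), np.angle(X)

# Illustrative constraint: mix each phase with its lower-frequency
# neighbour so that adjacent Fourier phases become correlated
# (alpha = 0 leaves the original phases, hence a linear series).
alpha = 0.8
new_phase = phase.copy()
for k in range(1, len(new_phase)):
    new_phase[k] = alpha * new_phase[k - 1] + (1 - alpha) * phase[k]

# Resynthesise: same amplitude spectrum (same linear correlations),
# with the nonlinearity now encoded purely in the phases.
y = np.fft.irfft(amp * np.exp(1j * new_phase), n=n)
```

Because only the phases change, any nonlinearity detected in `y` by a surrogate-data test must originate from the imposed phase correlations.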
Financial time series are known to follow fat-tailed distributions. Moreover, empirical probability distributions sometimes show cut-off shapes in their tails. To describe this stylized fact, we incorporate the cut-off effect into superstatistics. We then confirm that the resulting stochastic model is capable of describing the statistical properties of real financial time series. In addition, we present an option pricing formula based on superstatistics.
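A minimal sketch of the superstatistical mechanism with a tail cut-off, assuming the common chi-squared (gamma) distribution for the fluctuating inverse variance; the specific distribution, degrees of freedom, and cut-off value are illustrative choices, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Superstatistics: returns are locally Gaussian with a fluctuating
# inverse variance ("inverse temperature") beta; a chi-squared (gamma)
# distributed beta yields fat-tailed, q-Gaussian-like statistics.
dof = 3.0
beta = rng.gamma(shape=dof / 2, scale=2 / dof, size=n)

# Illustrative cut-off: forbid very small beta, i.e. very large local
# volatility, which truncates the far tails of the distribution.
beta_min = 0.05
beta = np.clip(beta, beta_min, None)

returns = rng.standard_normal(n) / np.sqrt(beta)
```

The sampled returns are leptokurtic (excess kurtosis above the Gaussian value of 3), yet the clipped `beta` keeps the extreme tail finite, mimicking the empirical cut-off.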
Over the last decades there has been a continuing international endeavor to develop realistic space weather prediction tools that forecast conditions on the Sun and in the interplanetary environment. These efforts have led to the need for appropriate metrics to assess the performance of such tools. Metrics are necessary for validating models, comparing different models, and monitoring adjustments or improvements of a certain model over time. In this work, we introduce Dynamic Time Warping (DTW) as an alternative way to validate models and, in particular, to quantify differences between observed and synthetic (modeled) time series for space weather purposes. We present the advantages and drawbacks of this method, as well as applications to WIND observations and EUHFORIA modeled output at L1. We show that DTW is a useful tool that permits the evaluation of both fast and slow solar wind. Its distinctive characteristic is that it warps sequences in time, aiming to align them with minimum cost by using dynamic programming. It can be applied in two different ways for the evaluation of modeled solar wind time series. The first calculates the so-called sequence similarity factor (SSF), a number that quantifies how good the forecast is compared to best- and worst-case prediction scenarios. The second quantifies the time and amplitude differences between the points that are best matched between the two sequences. As a result, DTW can serve as a hybrid metric between continuous measures (such as, e.g., the correlation coefficient) and point-by-point comparisons. We conclude that DTW is a promising technique for the assessment of solar wind profiles, offering functions that other metrics do not, so that it can provide at once the most complete evaluation profile of a model.
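The dynamic-programming core of DTW, and one possible normalisation into an SSF-like score, can be sketched as follows. The flat forecast at the observed mean used as the worst-case baseline is an illustrative stand-in; the paper defines its own best- and worst-case scenarios:

```python
import numpy as np

def dtw_cost(a, b):
    """Minimum-cost alignment of two sequences by dynamic programming,
    using |a_i - b_j| as the local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = abs(a[i - 1] - b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def ssf(obs, model):
    """Sequence-similarity-factor-style score: the model's DTW cost
    normalised by a worst-case reference (illustrative choice: a flat
    forecast at the observed mean). 0 = perfect (best case); values
    near 1 = no better than the worst case."""
    worst = np.full_like(obs, obs.mean())
    return dtw_cost(obs, model) / dtw_cost(obs, worst)

obs = np.sin(np.linspace(0, 2 * np.pi, 50))
model = np.sin(np.linspace(0, 2 * np.pi, 50) + 0.3)  # time-shifted forecast
score = ssf(obs, model)
```

Because DTW warps the time axis, the shifted forecast scores far better than a point-by-point metric would suggest; the second mode of application mentioned above would additionally read the time and amplitude offsets off the optimal warping path.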
In this work we evaluate multi-output (MO) Gaussian Process (GP) models based on the linear model of coregionalization (LMC) for the estimation of biophysical parameter variables in a gap-filling setup. In particular, we focus on LAI and fAPAR over rice areas. We show that this problem cannot be solved with standard single-output (SO) GP models, and that the proposed MO-GP models successfully predict these variables even in high-missing-data regimes by implicitly performing an across-domain transfer of information.
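A minimal sketch of how an LMC-based MO-GP transfers information across outputs in a gap-filling setup; the coregionalization loadings, kernel length-scale, and noise level below are illustrative numbers, not values fitted to the LAI/fAPAR data:

```python
import numpy as np

def rbf(a, b, ell=1.5):
    """Squared-exponential kernel shared by all outputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 40)
n = len(t)

# Linear model of coregionalization for two outputs (think LAI, fAPAR):
# the joint covariance is the Kronecker product of a coregionalization
# matrix B = w w^T + diag(kappa) with a shared kernel over time.
w = np.array([1.0, 0.8])
B = np.outer(w, w) + np.diag([0.05, 0.05])
K = np.kron(B, rbf(t, t))

# Draw a pair of correlated outputs from the joint GP prior.
y = np.linalg.cholesky(K + 1e-6 * np.eye(2 * n)) @ rng.standard_normal(2 * n)

# Gap-fill output 0 over t in (4, 6) using its own remaining samples and
# the fully observed output 1, via the standard GP conditional mean.
gap = (t > 4) & (t < 6)
obs = np.concatenate([~gap, np.full(n, True)])
K_oo = K[np.ix_(obs, obs)] + 1e-4 * np.eye(int(obs.sum()))
K_go = K[np.ix_(~obs, obs)]
y_pred = K_go @ np.linalg.solve(K_oo, y[obs])
```

An SO-GP sees only the gapped series and must interpolate blindly across the hole; the off-diagonal blocks of `K` are what let the MO-GP borrow the co-observed output's shape inside the gap.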
A nonparametric method to predict non-Markovian time series of partially observed dynamics is developed. The prediction problem we consider is a supervised learning task: finding a regression function that maps a delay-embedded observable to the observable at a future time. When delay-embedding theory is applicable, the proposed regression function is a consistent estimator of the flow map induced by the delay embedding. Furthermore, the corresponding Mori-Zwanzig equation governing the evolution of the observable simplifies to only a Markovian term, represented by the regression function. We realize this supervised learning task with a class of kernel-based linear estimators, the kernel analog forecast (KAF), which is consistent in the limit of large data. In a scenario with a high-dimensional covariate space, we employ a Markovian kernel smoothing method that is computationally cheaper than the Nyström projection method for realizing KAF. In addition to the guaranteed theoretical convergence, we numerically demonstrate the effectiveness of this approach on higher-dimensional problems where the relevant kernel features are difficult to capture with the Nyström method. Given noisy training data, we propose a nonparametric smoother as a de-noising method. Numerically, we show that the proposed smoother is more accurate than EnKF and 4D-Var in de-noising signals corrupted by independent (but not necessarily identically distributed) noise, even when the smoother is constructed from a data set corrupted by white noise. We show skillful prediction using the KAF constructed from the de-noised data.
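The regression step can be sketched with the simplest kernel-based linear estimator in this family, a Nadaraya-Watson kernel smoother over delay-embedded covariates: forecast where past analogs of the current delay vector went `lead` steps later. The embedding length, lead time, and bandwidth below are illustrative, and this plain smoother stands in for the paper's full KAF construction:

```python
import numpy as np

def delay_embed(x, q):
    """Rows are length-q delay vectors (x[i], ..., x[i+q-1])."""
    return np.stack([x[i:len(x) - q + 1 + i] for i in range(q)], axis=1)

def kaf_forecast(train, query, lead=5, q=8, eps=0.1):
    """Kernel analog forecast via Nadaraya-Watson smoothing: a
    kernel-weighted average of the training observable 'lead' steps
    after each embedded analog of the query."""
    X = delay_embed(train, q)[:-lead]    # embedded covariates
    y = train[q - 1 + lead:]             # observable 'lead' steps ahead
    w = np.exp(-np.sum((X - query) ** 2, axis=1) / eps)
    return w @ y / w.sum()

train = np.sin(0.2 * np.arange(2000))    # toy observable
query = delay_embed(train, 8)[-1]        # latest delay vector
pred = kaf_forecast(train, query)        # 5-step-ahead forecast
```

Each forecast costs one pass over the training set, with no eigendecomposition, which is the sense in which such smoothing is cheaper than a Nyström projection onto kernel eigenfunctions.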
Convergent Cross-Mapping (CCM) has shown high potential for performing causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength, and even the direction of causality, in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits greater robustness. Finally, we propose that controlled noise injection in intermediate-to-strongly coupled systems could enable more accurate causal inference. Given the inherently noisy nature of real-world systems, our findings allow a more accurate evaluation of CCM's applicability and offer suggestions on how to overcome its weaknesses.
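The experimental setup can be sketched as follows: a pair of unidirectionally coupled logistic maps and the core cross-mapping statistic. The map parameters, embedding dimension, and the simplified exponential weights are illustrative choices (Sugihara-style CCM scales the weights by the nearest-neighbour distance and studies convergence with library length, which is omitted here):

```python
import numpy as np

# Unidirectionally coupled logistic maps: x drives y with strength c
# (parameters intended to give a chaotic, non-synchronized regime).
n, c = 1000, 0.3
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - c * x[t])
x, y = x[100:], y[100:]                       # discard the transient

def cross_map_skill(source, target, E=3, tau=1):
    """Core of CCM: correlate the target with its cross-map estimate,
    built from nearest neighbours in the delay embedding of the source.
    (Simplified exponential weights, for illustration.)"""
    m = len(source) - (E - 1) * tau
    M = np.stack([source[i * tau:i * tau + m] for i in range(E)], axis=1)
    tgt = target[(E - 1) * tau:]
    d = np.linalg.norm(M[:, None] - M[None, :], axis=2)
    np.fill_diagonal(d, np.inf)
    idx = np.argsort(d, axis=1)[:, :E + 1]    # E+1 nearest neighbours
    w = np.exp(-np.take_along_axis(d, idx, axis=1))
    w /= w.sum(axis=1, keepdims=True)
    return np.corrcoef(tgt, (w * tgt[idx]).sum(axis=1))[0, 1]

skill_yx = cross_map_skill(y, x)   # y's history encodes the driver x
skill_xy = cross_map_skill(x, y)   # reverse direction, expected weaker
```

Because x drives y, the attractor reconstructed from y contains information about x, so `skill_yx` is high; sweeping `c` and adding observational noise to `x` and `y` reproduces the regimes where this directional asymmetry, and hence the causal inference, breaks down.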