
Nonlinear analysis of time series of vibration data from a friction brake: SSA, PCA, and MFDFA

Posted by: Nikolay K. Vitanov
Publication date: 2014
Research field: Physics
Paper language: English





We use the methodology of singular spectrum analysis (SSA), principal component analysis (PCA), and multi-fractal detrended fluctuation analysis (MFDFA) to investigate the characteristics of vibration time series data from a friction brake. SSA and PCA are used to study the long time-scale characteristics of the time series, while MFDFA is applied to all time scales down to the smallest recorded one. It turns out that the majority of the long time-scale dynamics, which is presumably dominated by the structural dynamics of the brake system, is confined to only a few active dimensions and can be well understood in terms of low-dimensional chaotic attractors. The multi-fractal analysis shows that the fast dynamical processes originating in the friction interface are, in turn, truly multi-scale in nature.
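As a rough illustration of the MFDFA step named above, the sketch below estimates the generalized Hurst exponents h(q) from the slopes of log F_q(s) versus log s. It is a minimal NumPy sketch under our own assumptions: the brake vibration signal is replaced by white noise, and the function name `mfdfa`, the scales, and the q values are illustrative choices, not the authors' code.

```python
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Minimal multifractal DFA: returns the fluctuation function F_q(s)."""
    profile = np.cumsum(x - np.mean(x))          # step 1: integrated profile
    Fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)   # non-overlapping windows
        t = np.arange(s)
        var = np.empty(n_seg)
        for k, seg in enumerate(segs):
            coeffs = np.polyfit(t, seg, order)   # local polynomial detrending
            var[k] = np.mean((seg - np.polyval(coeffs, t)) ** 2)
        for i, q in enumerate(qs):
            if q == 0:                           # q = 0 uses a logarithmic average
                Fq[i, j] = np.exp(0.5 * np.mean(np.log(var)))
            else:
                Fq[i, j] = np.mean(var ** (q / 2.0)) ** (1.0 / q)
    return Fq

# h(q) is the slope of log F_q(s) vs log s; a q-dependent h(q) signals multifractality
x = np.random.randn(4096)                        # placeholder for the vibration signal
scales = np.array([16, 32, 64, 128, 256])
qs = np.array([-4, -2, 0, 2, 4])
Fq = mfdfa(x, scales, qs)
h = [np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0] for i in range(len(qs))]
print(dict(zip(qs.tolist(), np.round(h, 3))))
```

A clearly q-dependent h(q) is the signature of the multi-scale behaviour the abstract reports for the fast, friction-generated dynamics; for the white-noise placeholder it stays close to 0.5 for all q.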




Read also

Convergent Cross-Mapping (CCM) has shown high potential for performing causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer the correct coupling strength, and even the direction of causality, in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher robustness. Finally, we propose that controlled noise injections in intermediate-to-strongly coupled systems could enable more accurate causal inferences. Given the inherently noisy nature of real-world systems, our findings enable a more accurate evaluation of CCM applicability and advance suggestions on how to overcome its weaknesses.
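For readers unfamiliar with the setup, here is a minimal sketch of cross-mapping on a pair of coupled logistic maps. It assumes the standard Sugihara-style map form and a simplex-like nearest-neighbour cross-map; the parameter values, function names, and library size are illustrative choices, not those used in the paper.

```python
import numpy as np

def coupled_logistic(n, beta_xy=0.0, beta_yx=0.32, rx=3.8, ry=3.5, noise=0.0, seed=0):
    """Two coupled logistic maps; beta_yx is the influence of x on y."""
    rng = np.random.default_rng(seed)
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = 0.4, 0.2
    for t in range(n - 1):
        x[t + 1] = x[t] * (rx - rx * x[t] - beta_xy * y[t]) + noise * rng.normal()
        y[t + 1] = y[t] * (ry - ry * y[t] - beta_yx * x[t]) + noise * rng.normal()
        x[t + 1], y[t + 1] = np.clip(x[t + 1], 0, 1), np.clip(y[t + 1], 0, 1)
    return x, y

def ccm_skill(source, target, E=2, tau=1, lib=400):
    """Cross-map estimate of `source` from the delay embedding of `target`.
    High skill suggests that `source` causally drives `target`."""
    rows = lib - (E - 1) * tau
    M = np.column_stack([target[i * tau: i * tau + rows] for i in range(E)])
    truth = source[(E - 1) * tau: (E - 1) * tau + rows]
    preds = np.empty(rows)
    for i in range(rows):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        nn = np.argsort(d)[:E + 1]                 # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))  # exponential distance weights
        preds[i] = np.sum(w * truth[nn]) / np.sum(w)
    return np.corrcoef(preds, truth)[0, 1]

x, y = coupled_logistic(1000, beta_yx=0.32)        # x drives y, not vice versa
print("x -> y skill:", ccm_skill(x, y))            # should exceed the reverse direction
print("y -> x skill:", ccm_skill(y, x))
```

Repeating this while sweeping `beta_yx`, `noise`, and `lib` reproduces the kind of experiment the abstract describes: the gap between the two skills, and its convergence with library size, is what is read as evidence of causality.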
A. Dekel 2001
We allow for nonlinear effects in the likelihood analysis of peculiar velocities and obtain ~35% lower values for the cosmological density parameter and for the amplitude of mass-density fluctuations. The power spectrum in the linear regime is assumed to be of the flat LCDM model (h=0.65, n=1) with only Om_m free. Since the likelihood is driven by the nonlinear regime, we break the power spectrum at k_b=0.2 h/Mpc and fit a two-parameter power law at k>k_b. This allows for an unbiased fit in the linear regime. Tests using improved mock catalogs demonstrate a reduced bias and a better fit. We find for the Mark III and SFI data Om_m=0.35+-0.09, with sigma_8*Om_m^0.6=0.55+-0.10 (90% errors). When allowing deviations from LCDM, we find an indication of a wiggle in the power spectrum in the form of an excess near k~0.05 and a deficiency at k~0.1 h/Mpc --- a cold flow which may be related to a feature indicated by redshift surveys and the second peak in the CMB anisotropy. A chi^2 test applied to principal modes demonstrates that the nonlinear procedure improves the goodness of fit. Principal Component Analysis (PCA) helps identify spatial features of the data and fine-tune the theoretical and error models. We address the potential for optimal data compression using PCA.
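A hedged reading of the fitting scheme described above, with the amplitude A and slope alpha as our own labels for the two free parameters above the break:

```latex
P(k) =
\begin{cases}
P_{\Lambda\mathrm{CDM}}(k;\,\Omega_m), & k \le k_b,\\
A\,(k/k_b)^{\alpha},                   & k > k_b,
\end{cases}
\qquad k_b = 0.2\,h\,\mathrm{Mpc}^{-1}
```

Only Om_m is free in the linear regime below k_b, so the power law above k_b absorbs the nonlinear regime that actually drives the likelihood, which is what keeps the linear-regime fit unbiased.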
Topological Data Analysis (TDA) is a collection of mathematical tools that capture the structure of shapes in data. Although rooted in computational topology and computational geometry, the use of TDA in time series and signal processing is relatively new. In some recent contributions, TDA has been utilized as an alternative to conventional signal processing methods; in particular, it has been considered for dealing with noisy signals and time series. In these applications, TDA is used to extract the shape of the data as its main property, while other properties are assumed to be much less informative. In this paper, we will review recent developments and contributions in which topological data analysis, especially persistent homology, has been applied to time series analysis, dynamical systems, and signal processing. We will cover problem statements such as stability determination, risk analysis, system behaviour, and the prediction of critical transitions in financial markets.
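As a concrete, hedged example of how persistent homology is typically applied to a time series, the sketch below delay-embeds a 1-D signal and computes persistence diagrams with the `ripser` package (ripser.py), which is assumed to be installed; the toy signal and all parameters are illustrative and not taken from the works reviewed in the paper.

```python
import numpy as np
from ripser import ripser          # assumes the ripser.py package is available

def delay_embed(x, dim=3, tau=5):
    """Takens delay embedding of a 1-D series into `dim` dimensions."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# toy signal: a noisy periodic series; its embedding traces a loop, which
# appears as a long-lived 1-dimensional feature (H1) in the persistence diagram
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * np.random.randn(len(t))
cloud = delay_embed(x, dim=3, tau=25)[::10]       # subsample to keep it fast
dgms = ripser(cloud, maxdim=1)['dgms']            # persistence diagrams H0, H1
lifetimes_h1 = dgms[1][:, 1] - dgms[1][:, 0]      # lifetimes of the loops
print("longest H1 lifetime:", lifetimes_h1.max())
```

The long-lived H1 bar survives the added noise, which is the sense in which the shape of the data is treated as the robust, informative property.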
Devesh K. Jha 2021
Markov models are often used to capture the temporal patterns of sequential data for statistical learning applications. While Hidden Markov modeling-based learning mechanisms are well studied in the literature, we analyze a symbolic-dynamics-inspired approach. Under this umbrella, Markov modeling of time-series data consists of two major steps -- discretization of continuous attributes, followed by estimating the size of the temporal memory of the discretized sequence. These two steps are critical for an accurate and concise representation of time-series data in the discrete space. Discretization governs the information content of the resulting discretized sequence; memory estimation of the symbolic sequence, in turn, helps to extract the predictive patterns in the discretized data. Clearly, the effectiveness of representing a signal as a discrete Markov process depends on both steps. In this paper, we will review the different techniques for discretization and memory estimation for discrete stochastic processes, focusing in particular on the individual problems of discretization and order estimation. We will present some results from the literature on partitioning from dynamical systems theory and on order estimation using concepts of information theory and statistical learning. The paper also presents some related problem formulations that are useful for machine learning and statistical learning applications using the symbolic framework of data analysis. We present some results of a statistical analysis of a complex thermoacoustic instability phenomenon during lean-premixed combustion in jet-turbine engines using the proposed Markov modeling method.
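To make the two steps concrete, here is a minimal sketch of quantile-based discretization followed by a simple memory estimate via conditional block entropies; the symbol count, the toy signal, and the entropy-based stopping heuristic are our illustrative choices, not the specific partitioning or order-estimation methods surveyed in the paper.

```python
import numpy as np
from collections import Counter

def discretize(x, n_symbols=4):
    """Maximum-entropy (quantile) partitioning of a continuous series into symbols."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def conditional_entropy(symbols, order):
    """H(s_t | s_{t-1}, ..., s_{t-order}) estimated from empirical block counts."""
    blocks = [tuple(symbols[i:i + order + 1]) for i in range(len(symbols) - order)]
    joint = Counter(blocks)
    cond = Counter(b[:-1] for b in blocks)
    n = len(blocks)
    return -sum(c / n * np.log2(c / cond[b[:-1]]) for b, c in joint.items())

# the temporal memory is roughly the depth at which extra history stops
# lowering the conditional entropy appreciably
x = np.sin(0.3 * np.arange(5000)) + 0.2 * np.random.randn(5000)
s = discretize(x, n_symbols=4)
for d in range(1, 6):
    print(f"order {d}: H = {conditional_entropy(s, d):.3f} bits")
```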
C. Raeth, I. Laut 2015
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power-law character of the intensity distributions, being typical for e.g. turbulence and financial data, can thus be explained in terms of phase correlations.
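The paper's specific phase constraints are not spelled out in the abstract, so the sketch below only illustrates the general mechanism: keep the amplitude spectrum of a linear, uncorrelated Gaussian series and impose a simple (here AR(1)-like) correlation between adjacent Fourier phases before transforming back. The constraint, the parameter `rho`, and the function name are our own illustrative choices.

```python
import numpy as np

def phase_constrained_series(n=4096, rho=0.8, seed=0):
    """Keep the amplitude spectrum of a linear Gaussian series but impose a
    correlation `rho` between adjacent Fourier phases (an illustrative
    constraint; the paper's specific constraints are not reproduced here)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)                       # linear, uncorrelated Gaussian series
    X = np.fft.rfft(x)
    amp = np.abs(X)
    # build correlated phases: phi_k depends on phi_{k-1}
    eps = rng.uniform(-np.pi, np.pi, size=len(X))
    phi = np.empty(len(X))
    phi[0] = 0.0                                 # keep the DC component real
    for k in range(1, len(X)):
        phi[k] = rho * phi[k - 1] + np.sqrt(1 - rho**2) * eps[k]
    phi[-1] = 0.0                                # keep the Nyquist component real
    return np.fft.irfft(amp * np.exp(1j * phi), n=n)

y = phase_constrained_series()
# the linear (two-point) correlations are fixed by the unchanged amplitudes;
# any nonlinearity in y enters only through the imposed phase correlations
```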