
Causal coupling inference from multivariate time series based on ordinal partition transition networks

Publication date: 2020
Language: English





Identifying causal relationships is a challenging yet crucial problem in many fields of science, such as epidemiology, climatology, ecology, genomics, economics, and neuroscience, to name only a few. Recent studies have demonstrated that ordinal partition transition networks (OPTNs) allow inferring the coupling direction between two dynamical systems. In this work, we generalize this concept to the study of interactions among multiple dynamical systems and propose a new method to detect causality in multivariate observational data. By applying this method to numerical simulations of coupled linear stochastic processes as well as to two examples of interacting nonlinear dynamical systems (coupled Lorenz systems and a network of neural mass models), we demonstrate that our approach can reliably identify the direction of interactions and the associated coupling delays. Finally, we study real-world observational microelectrode-array electrophysiology data from rodent brain slices to identify the causal coupling structures underlying epileptiform activity. Our results, from both simulations and real-world data, suggest that OPTNs provide a complementary and robust approach to inferring causal effect networks from multivariate observational data.
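The OPTN construction at the heart of the abstract can be sketched for a single scalar series: embed the series into ordinal patterns (the permutations that sort successive delay vectors) and count transitions between consecutive patterns. A minimal illustration assuming numpy; the embedding order `d` and delay `tau` are illustrative defaults, and the paper's multivariate coupling analysis goes well beyond this sketch.

```python
import numpy as np

def ordinal_patterns(x, d=3, tau=1):
    """Map a scalar series to its sequence of ordinal patterns: each window
    (x[t], x[t+tau], ..., x[t+(d-1)tau]) becomes the permutation sorting it."""
    n = len(x) - (d - 1) * tau
    return [tuple(np.argsort(x[t : t + (d - 1) * tau + 1 : tau])) for t in range(n)]

def optn_transition_matrix(x, d=3, tau=1):
    """Estimate the OPTN as a row-stochastic matrix of transition frequencies
    between successive ordinal patterns (nodes = visited patterns)."""
    pats = ordinal_patterns(x, d, tau)
    labels = sorted(set(pats))
    idx = {p: i for i, p in enumerate(labels)}
    T = np.zeros((len(labels), len(labels)))
    for a, b in zip(pats[:-1], pats[1:]):
        T[idx[a], idx[b]] += 1
    row = T.sum(axis=1, keepdims=True)
    row[row == 0] = 1.0  # avoid division by zero for patterns with no successor
    return labels, T / row

# A noisy sine visits far fewer patterns and transitions than pure noise would.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
labels, T = optn_transition_matrix(x, d=3)
```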



Related research

Our goal is to estimate causal interactions in multivariate time series. Using vector autoregressive (VAR) models, these can be defined in terms of non-vanishing coefficients belonging to the respective time-lagged instances. As a parsimonious causality structure is assumed in most cases, a promising approach to causal discovery consists in fitting VAR models with an additional sparsity-promoting regularization. Along this line, we propose that sparsity should be enforced for the subgroups of coefficients that belong to each pair of time series, as the absence of a causal relation requires the coefficients for all time lags to become jointly zero. Such behavior can be achieved by means of l1-l2-norm regularized regression, for which an efficient active-set solver has been proposed recently. Our method is shown to outperform standard methods in recovering simulated causality graphs, and its results are on par with those of a second novel approach that uses multiple statistical testing.
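The l1-l2 (group-lasso) idea of jointly shrinking all lag coefficients of one driver series to zero can be illustrated with a simple proximal-gradient solver; this is a sketch only (the abstract's efficient active-set solver is not reproduced here), and the function and parameter names are illustrative.

```python
import numpy as np

def group_lasso_var(X, p=2, lam=0.1, iters=1000):
    """Fit a VAR(p) by proximal gradient descent with a group penalty that
    ties together all lag coefficients from series j to series i.
    X: (T, k) array. Returns B of shape (k, k, p), where B[i, j, l] is the
    coefficient of x_j(t-l-1) in the prediction of x_i(t)."""
    T, k = X.shape
    # Lagged design matrix: block l holds x(t-l-1) for t = p..T-1.
    Z = np.hstack([X[p - l - 1 : T - l - 1] for l in range(p)])
    Y = X[p:]
    B = np.zeros((k, k * p))
    L = np.linalg.norm(Z, 2) ** 2 / (T - p)  # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(iters):
        grad = (B @ Z.T - Y.T) @ Z / (T - p)
        B -= step * grad
        # Proximal step: group soft-thresholding per (target i, driver j) pair.
        for i in range(k):
            for j in range(k):
                g = B[i, j::k]  # all p lag coefficients from driver j
                nrm = np.linalg.norm(g)
                B[i, j::k] = 0.0 if nrm == 0 else max(0.0, 1 - step * lam / nrm) * g
    return B.reshape(k, p, k).transpose(0, 2, 1)

# Illustrative simulation: x2 is driven by x1 at lag 1; x1 is independent noise.
rng = np.random.default_rng(1)
n = 400
x1 = rng.standard_normal(n)
x2 = np.zeros(n)
for t in range(1, n):
    x2[t] = 0.8 * x1[t - 1] + 0.3 * rng.standard_normal()
X = np.column_stack([x1, x2])
B = group_lasso_var(X, p=2, lam=0.05)
# The group B[1, 0] (x1 -> x2) survives; B[0, 1] (x2 -> x1) is shrunk toward zero.
```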
We investigated the topological properties of stock networks by comparing the original stock network with the stock network estimated from the correlation matrix constructed via random matrix theory (RMT). We used individual stocks traded on the market indices of Korea, Japan, Canada, the USA, Italy, and the UK. The results are as follows. As the correlation matrix reflects more of the eigenvalue properties, the stock network estimated from it becomes increasingly consistent with the original stock network. Stocks with different numbers of links to other stocks in the original network respond differently. In particular, the largest eigenvalue is a significant deterministic factor in the formation of a stock network.
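The eigenvalue-based estimation described above can be sketched as follows: reconstruct the correlation matrix from its largest eigenvalue components, then threshold the result into an adjacency matrix. A minimal sketch assuming numpy; `n_eig` and `threshold` are illustrative parameter names, and the paper's exact construction may differ.

```python
import numpy as np

def rmt_filtered_network(R, n_eig, threshold=0.2):
    """Rebuild correlation matrix R from its n_eig largest eigenvalue
    components and threshold the absolute values into an adjacency matrix."""
    w, V = np.linalg.eigh(R)                 # eigenvalues in ascending order
    Vk, wk = V[:, -n_eig:], w[-n_eig:]
    R_f = (Vk * wk) @ Vk.T                   # keep only the largest components
    A = (np.abs(R_f) > threshold).astype(int)
    np.fill_diagonal(A, 0)                   # no self-links
    return A

# Synthetic returns with one common "market" factor: the largest eigenvalue
# alone already recovers the dense co-movement structure.
rng = np.random.default_rng(0)
n_stocks, n_days = 20, 500
market = rng.standard_normal(n_days)
returns = 0.6 * market[None, :] + rng.standard_normal((n_stocks, n_days))
R = np.corrcoef(returns)
A1 = rmt_filtered_network(R, n_eig=1)
```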
In this work, we introduce a new methodology for inferring the interaction structure of discrete-valued time series that are Poisson distributed. While most related methods are premised on continuous-state stochastic processes, discrete and counting-event-oriented stochastic processes, so-called time-point processes (TPPs), are in fact natural and common. An important application that we focus on here is gene expression. Nonparametric methods such as the popular k-nearest neighbors (KNN) estimator converge slowly for discrete processes and are thus data hungry. With the new multivariate Poisson estimator developed here as the core computational engine, the causation entropy (CSE) principle, together with the associated greedy search algorithm optimal CSE (oCSE), allows us to efficiently infer the true network structure for this class of stochastic processes, which was previously not practical. We illustrate the power of our method, first by benchmarking on synthetic data, and then by inferring the genetic factor network from a breast cancer micro-RNA (miRNA) sequence count data set. We show that Poisson oCSE gives the best performance among the tested methods and discovers previously known interactions in the breast cancer data set.
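For illustration, the quantity at the core of causation entropy — a conditional mutual information between count series — can be computed with a plug-in histogram estimator. Note that this is exactly the kind of slow-converging nonparametric estimator the paper replaces with its parametric multivariate Poisson estimator, so the sketch below shows only the quantity being estimated, not the paper's method.

```python
import numpy as np
from collections import Counter

def cmi_discrete(x, y, z=None):
    """Plug-in conditional mutual information I(x; y | z) in nats for
    discrete-valued series (illustrative only; biased for small samples)."""
    n = len(x)
    if z is None:
        z = np.zeros(n, dtype=int)  # empty conditioning set -> plain MI
    xyz = Counter(zip(x, y, z))
    xz = Counter(zip(x, z))
    yz = Counter(zip(y, z))
    zc = Counter(z)
    mi = 0.0
    for (xi, yi, zi), c in xyz.items():
        # p(x,y,z) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ], all plug-in counts
        mi += (c / n) * np.log(c * zc[zi] / (xz[(xi, zi)] * yz[(yi, zi)]))
    return mi

# Dependent counts give large CMI; independent Poisson counts give near zero.
rng = np.random.default_rng(4)
x = rng.poisson(3, 2000)
y_ind = rng.poisson(3, 2000)
c_dep = cmi_discrete(x, x)      # equals the entropy H(X)
c_ind = cmi_discrete(x, y_ind)  # near zero up to plug-in bias
```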
Approaches for mapping time series to networks have become essential tools for dealing with the increasing challenges of characterizing data from complex systems. Among the different algorithms, the recently proposed ordinal networks stand out due to their simplicity and computational efficiency. However, applications of ordinal networks have mainly focused on time series arising from nonlinear dynamical systems, while basic properties of ordinal networks related to simple stochastic processes remain poorly understood. Here, we investigate several properties of ordinal networks emerging from random time series, noisy periodic signals, fractional Brownian motion, and earthquake magnitude series. For ordinal networks of random series, we present an approach for building the exact form of the adjacency matrix, which in turn is useful for detecting non-random behavior in time series and the existence of missing transitions among ordinal patterns. We find that the average value of a local entropy, estimated from transition probabilities among neighboring nodes of ordinal networks, is more robust against noise addition than the standard permutation entropy. We show that ordinal networks can be used for estimating the Hurst exponent of time series with accuracy comparable with state-of-the-art methods. Finally, we argue that ordinal networks can detect sudden changes in Earth seismic activity caused by large earthquakes.
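The comparison between the standard permutation entropy and the average local (node) entropy of an ordinal network can be sketched as follows. Averaging the entropies of each node's outgoing transition distribution, weighted by visitation frequency, is one convention; the paper's exact weighting may differ.

```python
import numpy as np
from collections import Counter, defaultdict

def perm_and_local_entropy(x, d=3, tau=1):
    """Return (permutation entropy, average local node entropy) in nats for
    the ordinal patterns of order d of a scalar series x."""
    pats = [tuple(np.argsort(x[t : t + (d - 1) * tau + 1 : tau]))
            for t in range(len(x) - (d - 1) * tau)]
    # Permutation entropy: Shannon entropy of the pattern frequencies.
    freq = np.array(list(Counter(pats).values()), dtype=float)
    p = freq / freq.sum()
    pe = -np.sum(p * np.log(p))
    # Local entropy: entropy of each node's outgoing transition distribution,
    # averaged over nodes weighted by how often each node is visited.
    out = defaultdict(Counter)
    for a, b in zip(pats[:-1], pats[1:]):
        out[a][b] += 1
    visits = Counter(pats[:-1])
    total = sum(visits.values())
    le = 0.0
    for a, cnts in out.items():
        q = np.array(list(cnts.values()), dtype=float)
        q /= q.sum()
        le += (visits[a] / total) * (-np.sum(q * np.log(q)))
    return pe, le

# For white noise the local entropy is bounded by the d possible successors
# of each pattern, so it sits well below the permutation entropy.
rng = np.random.default_rng(2)
pe, le = perm_and_local_entropy(rng.standard_normal(3000), d=3)
```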
State-space models provide an important body of techniques for analyzing time series, but their use requires estimating unobserved states. The optimal estimate of the state is its conditional expectation given the observation histories, and computing this expectation is hard when there are nonlinearities. Existing filtering methods, including sequential Monte Carlo, tend to be either inaccurate or slow. In this paper, we study a nonlinear filter for nonlinear/non-Gaussian state-space models, which uses Laplace's method, an asymptotic series expansion, to approximate the state's conditional mean and variance, together with a Gaussian conditional distribution. This Laplace-Gaussian filter (LGF) gives fast, recursive, deterministic state estimates, with an error which is set by the stochastic characteristics of the model and is, we show, stable over time. We illustrate the estimation ability of the LGF by applying it to the problem of neural decoding and compare it to sequential Monte Carlo both in simulations and with real data. We find that the LGF can deliver superior results in a small fraction of the computing time.
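A one-dimensional sketch of the Laplace-Gaussian filtering idea, assuming a linear-Gaussian state observed through Poisson counts y_t ~ Poisson(exp(x_t)) (a standard neural-decoding observation model; the paper treats general nonlinear models and higher-order asymptotic corrections). Each update finds the posterior mode by Newton's method and, per Laplace's method, takes the inverse curvature at the mode as the variance.

```python
import numpy as np

def laplace_gaussian_filter(y, a=0.95, q=0.05, m0=0.0, P0=1.0, newton_iters=10):
    """Recursive Gaussian approximation of p(x_t | y_1..y_t) for the model
    x_t = a x_{t-1} + N(0, q),  y_t ~ Poisson(exp(x_t))."""
    m, P = m0, P0
    means, variances = [], []
    for yt in y:
        # Predict step (linear-Gaussian dynamics).
        m_pred = a * m
        P_pred = a * a * P + q
        # Update step: Newton's method on log p(y_t|x) + log N(x; m_pred, P_pred).
        x = m_pred
        for _ in range(newton_iters):
            g = yt - np.exp(x) - (x - m_pred) / P_pred  # gradient of log posterior
            h = -np.exp(x) - 1.0 / P_pred               # second derivative (< 0)
            x -= g / h
        m = x
        P = -1.0 / h  # Laplace approximation: inverse curvature at the mode
        means.append(m)
        variances.append(P)
    return np.array(means), np.array(variances)

# Simulate from the model and filter the spike counts.
rng = np.random.default_rng(3)
n = 200
x_true = np.zeros(n)
for t in range(1, n):
    x_true[t] = 0.95 * x_true[t - 1] + np.sqrt(0.05) * rng.standard_normal()
y = rng.poisson(np.exp(x_true))
m, P = laplace_gaussian_filter(y)
```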
