
Inferring the instability of a dynamical system from the skill of data assimilation exercises

Submitted by: Valerio Lucarini
Publication date: 2021
Research field: Physics
Paper language: English





Data assimilation (DA) aims at optimally merging observational data and model outputs to create a coherent statistical and dynamical picture of the system under investigation. Indeed, DA aims at minimizing the effect of observational and model error, and at distilling the correct ingredients of the system's dynamics. DA is of critical importance for the analysis of systems featuring sensitive dependence on the initial conditions, as chaos wins over any finitely accurate knowledge of the state of the system, even in the absence of model error. Clearly, the skill of DA is guided by the properties of the dynamical system under investigation, as optimally merging observational data and model outputs is harder when strong instabilities are present. In this paper we reverse the usual angle on the problem and show that it is indeed possible to use the skill of DA to infer some basic properties of the tangent space of the system, which may be hard to compute in very high-dimensional systems. Here, we focus our attention on the first Lyapunov exponent and the Kolmogorov-Sinai entropy, and perform numerical experiments on the Vissio-Lucarini 2020 model, a recently proposed generalisation of the Lorenz 1996 model that is able to describe in a simple yet meaningful way the interplay between dynamical and thermodynamical variables.
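
The first Lyapunov exponent mentioned in the abstract can be estimated directly from model trajectories. Below is a minimal sketch (not the authors' code) applying the classic Benettin two-trajectory method to the standard Lorenz 1996 system, which the Vissio-Lucarini 2020 model generalises; the VL2020 equations themselves are not reproduced here, and the forcing F, dimension N, and step sizes are illustrative assumptions.

import numpy as np

F, N = 8.0, 36                     # forcing and dimension (assumed, illustrative)
dt, n_steps = 0.01, 50_000         # integration step and length of the estimate
renorm_every = 10                  # renormalise the perturbation every few steps
d0 = 1e-8                          # reference separation of the twin trajectory

def l96_rhs(x):
    # Lorenz 96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    # one fourth-order Runge-Kutta step
    k1 = l96_rhs(x)
    k2 = l96_rhs(x + 0.5 * dt * k1)
    k3 = l96_rhs(x + 0.5 * dt * k2)
    k4 = l96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

rng = np.random.default_rng(0)
x = F + 0.01 * rng.standard_normal(N)
for _ in range(5000):              # spin-up onto the attractor
    x = rk4_step(x, dt)

v = rng.standard_normal(N)
v /= np.linalg.norm(v)
y = x + d0 * v                     # twin trajectory at distance d0

log_growth, t_total = 0.0, 0.0
for step in range(1, n_steps + 1):
    x, y = rk4_step(x, dt), rk4_step(y, dt)
    if step % renorm_every == 0:
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        t_total += renorm_every * dt
        y = x + (d0 / d) * (y - x)  # rescale the separation back to d0

print("estimated first Lyapunov exponent:", log_growth / t_total)

The average logarithmic growth rate of the repeatedly rescaled separation converges to the first Lyapunov exponent; for Lorenz 96 with F = 8 it is positive, consistent with the chaotic regime the paper relies on.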


Read also

Chaos is ubiquitous in physical systems. The associated sensitivity to initial conditions is a significant obstacle in forecasting the weather and other geophysical fluid flows. Data assimilation is the process whereby the uncertainty in initial conditions is reduced by the astute combination of model predictions and real-time data. This chapter reviews recent findings from investigations on the impact of chaos on data assimilation methods: for the Kalman filter and smoother in linear systems, analytic results are derived; for their ensemble-bas
182 - Wenxu Wang, Ying-Cheng Lai, 2017
The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. In this paper, we review the recent advances in this forefront and rapidly evolving field, aiming to cover topics such as compressive sensing (a novel optimization paradigm for sparse-signal reconstruction), noise-induced dynamical mapping, perturbations, reverse engineering, synchronization, inner composition alignment, global silencing, Granger causality and alternative optimization algorithms. Often, these rely on various concepts from statistical and nonlinear physics such as phase transitions, bifurcation, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks towards the ultimate goal of controlling such systems. Despite recent progress, many challenges remain. A purpose of this Review is then to point out the specific difficulties as they arise from different contexts, so as to stimulate further efforts in this interdisciplinary field.
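
As an illustration of the sparse-reconstruction idea surveyed above, the following sketch recovers the coefficients of a scalar polynomial map from noisy one-step data with L1-regularised regression. The logistic-map example and the use of scikit-learn's Lasso as a stand-in for a compressive-sensing solver are assumptions of this sketch, not the review's own code.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 500)                                # sampled states of a toy map
y = 3.9 * x * (1.0 - x) + 0.001 * rng.standard_normal(500)    # noisy one-step data

# Library of candidate monomials 1, x, x^2, ..., x^5; only x and x^2 are active.
library = np.vstack([x**p for p in range(6)]).T

model = Lasso(alpha=1e-4, fit_intercept=False, max_iter=50_000)
model.fit(library, y)
print("recovered coefficients:", np.round(model.coef_, 3))
# close to [0, 3.9, -3.9, 0, 0, 0], i.e. the map x -> 3.9 x (1 - x)

The L1 penalty drives the coefficients of the inactive library terms to zero, which is the mechanism that makes sparse reconstruction of governing equations possible from limited data.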
We propose a comprehensive framework able to address both the predictability of the first and of the second kind for high-dimensional chaotic models. For this purpose, we analyse the properties of a newly introduced multistable climate toy model cons tructed by coupling the Lorenz 96 model with a zero-dimensional energy balance model. First, the attractors of the system are identified with Monte Carlo Basin Bifurcation Analysis. Additionally, we are able to detect the Melancholia state separating the two attractors. Then, Neural Ordinary Differential Equations are applied in order to predict the future state of the system in both of the identified attractors.
103 - K. J. H. Law, A. M. Stuart, 2011
Data assimilation leads naturally to a Bayesian formulation in which the posterior probability distribution of the system state, given the observations, plays a central conceptual role. The aim of this paper is to use this Bayesian posterior probability distribution as a gold standard against which to evaluate various commonly used data assimilation algorithms. A key aspect of geophysical data assimilation is the high dimensionality and low predictability of the computational model. With this in mind, yet with the goal of allowing an explicit and accurate computation of the posterior distribution, we study the 2D Navier-Stokes equations in a periodic geometry. We compute the posterior probability distribution by state-of-the-art statistical sampling techniques. The commonly used algorithms that we evaluate against this accurate gold standard, as quantified by comparing the relative error in reproducing its moments, are 4DVAR and a variety of sequential filtering approximations based on 3DVAR and on extended and ensemble Kalman filters. The primary conclusions are that: (i) with appropriate parameter choices, approximate filters can perform well in reproducing the mean of the desired probability distribution; (ii) however they typically perform poorly when attempting to reproduce the covariance; (iii) this poor performance is compounded by the need to modify the covariance, in order to induce stability. Thus, whilst filters can be a useful tool in predicting mean behavior, they should be viewed with caution as predictors of uncertainty. These conclusions are intrinsic to the algorithms and will not change if the model complexity is increased, for example by employing a smaller viscosity, or by using a detailed NWP model.
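
For reference, a single 3DVAR analysis step of the kind evaluated above amounts to x_a = x_b + B H^T (H B H^T + R)^{-1} (y - H x_b) with a static background covariance B. The sketch below applies this standard update to a small synthetic state; the dimensions, covariances, and observation operator are illustrative assumptions and do not reflect the paper's 2D Navier-Stokes configuration.

import numpy as np

rng = np.random.default_rng(2)
n, m = 8, 4                                    # state and observation dimensions (assumed)
x_true = rng.standard_normal(n)

H = np.eye(m, n)                               # observe the first m components
R = 0.1**2 * np.eye(m)                         # observation-error covariance
B = 0.5**2 * np.eye(n)                         # static background covariance (3DVAR)

x_b = x_true + 0.5 * rng.standard_normal(n)    # background (forecast) state
y = H @ x_true + 0.1 * rng.standard_normal(m)  # noisy observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain built from the static B
x_a = x_b + K @ (y - H @ x_b)                  # 3DVAR analysis update

print("background error:", np.linalg.norm(x_b - x_true))
print("analysis error:  ", np.linalg.norm(x_a - x_true))

Because B is held fixed, the update corrects only the observed components and cannot track flow-dependent error growth, which is one reason such filters reproduce the posterior mean better than the posterior covariance.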
Data-driven prediction and physics-agnostic machine-learning methods have attracted increased interest in recent years achieving forecast horizons going well beyond those to be expected for chaotic dynamical systems. In a separate strand of research data-assimilation has been successfully used to optimally combine forecast models and their inherent uncertainty with incoming noisy observations. The key idea in our work here is to achieve increased forecast capabilities by judiciously combining machine-learning algorithms and data assimilation. We combine the physics-agnostic data-driven approach of random feature maps as a forecast model within an ensemble Kalman filter data assimilation procedure. The machine-learning model is learned sequentially by incorporating incoming noisy observations. We show that the obtained forecast model has remarkably good forecast skill while being computationally cheap once trained. Going beyond the task of forecasting, we show that our method can be used to generate reliable ensembles for probabilistic forecasting as well as to learn effective model closure in multi-scale systems.
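
The following is a minimal sketch of the general idea of using a random-feature surrogate as the forecast model inside a stochastic ensemble Kalman filter. The scalar logistic-map truth, the offline ridge-regression training, and all numerical settings are assumptions of this illustration; it is not the sequential learning algorithm of the paper, in which the surrogate weights are themselves updated by the filter.

import numpy as np

rng = np.random.default_rng(1)
truth_map = lambda x: 3.9 * x * (1.0 - x)              # chaotic logistic map (assumed truth)

# --- offline training of a random-feature surrogate of the one-step map ---
D, lam = 200, 1e-6                                     # number of features, ridge penalty
W = rng.uniform(-2.0, 2.0, D)                          # random weights, fixed after the draw
b = rng.uniform(-1.0, 1.0, D)                          # random biases, fixed after the draw
phi = lambda x: np.tanh(np.outer(np.atleast_1d(x), W) + b)

x_train = rng.uniform(0.0, 1.0, 2000)
Phi = phi(x_train)                                     # (2000, D) feature matrix
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ truth_map(x_train))
surrogate = lambda x: phi(x) @ beta                    # learned forecast model

# --- stochastic EnKF using the surrogate as forecast model ---
n_ens, obs_err = 20, 0.02
x_true = 0.4
ens = rng.uniform(0.0, 1.0, n_ens)                     # initial ensemble
for _ in range(200):
    x_true = truth_map(x_true)                         # true system evolves
    obs = x_true + obs_err * rng.standard_normal()     # noisy observation
    ens = surrogate(ens) + 0.01 * rng.standard_normal(n_ens)   # forecast + model noise
    P = np.var(ens, ddof=1)                            # ensemble forecast variance
    K = P / (P + obs_err**2)                           # Kalman gain (scalar state, H = 1)
    perturbed_obs = obs + obs_err * rng.standard_normal(n_ens)
    ens = ens + K * (perturbed_obs - ens)              # analysis update

print("truth:", x_true, "analysis mean:", ens.mean())

Once trained, the surrogate forecast costs only a feature evaluation and a dot product per ensemble member, which is what makes this kind of hybrid cheap compared to integrating a physical model.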
