
Higher-order synchronization of a nudging-based algorithm for data assimilation for the 2D NSE: a refined paradigm for global interpolant observables

Added by Vincent R. Martinez
Publication date: 2021
Language: English





This paper considers a nudging-based scheme for data assimilation for the two-dimensional (2D) Navier-Stokes equations (NSE) with periodic boundary conditions and studies the synchronization of the signal produced by this algorithm with the true signal, to which the observations correspond, in all higher-order Sobolev topologies. This work complements previous results in the literature in which conditions were identified under which synchronization is guaranteed either with respect to only the $H^1$-topology, in the case of general observables, or to the analytic Gevrey topology, in the case of spectral observables. To accommodate synchronization in these stronger topologies, the framework of general interpolant observable operators, originally introduced by Azouani, Olson, and Titi, is expanded to a far richer class of operators. A significant portion of the paper is dedicated to developing this expanded framework, specifically, the basic approximation properties of these operators, the identification of subclasses relevant to obtaining synchronization, and the detailed relation between the structure of these operators and the system with respect to the synchronization property. One of the main features of this framework is that it is mesh-free, which allows the observational data itself to dictate the subdivision of the domain. Lastly, estimates for the radius of the absorbing ball of the 2D NSE in all higher-order Sobolev norms are obtained, thus properly generalizing previously known bounds; such estimates are required for establishing the synchronization property of the algorithm in the higher-order topologies.
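For orientation, the nudging scheme in question pairs the 2D NSE with a feedback-controlled copy driven by the interpolated observations. Schematically (the notation here is illustrative and may differ from the paper's), with $u$ the true signal and $v$ the assimilated signal,

$$\partial_t u - \nu\Delta u + (u\cdot\nabla)u + \nabla p = f, \qquad \nabla\cdot u = 0,$$
$$\partial_t v - \nu\Delta v + (v\cdot\nabla)v + \nabla q = f + \mu\big(I_h(u) - I_h(v)\big), \qquad \nabla\cdot v = 0,$$

where $I_h$ denotes an interpolant observable operator at spatial resolution $h$ and $\mu>0$ is the nudging parameter; synchronization in a topology $X$ means $\|u(t)-v(t)\|_{X}\to 0$ as $t\to\infty$.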




Read More

Didier Auroux, 2008
In this paper, we consider the back and forth nudging algorithm that has been introduced for data assimilation purposes. It consists of iteratively and alternately solving the model equation forward and backward in time, with a feedback term to the observations. We consider the case of 1-dimensional transport equations, either viscous or inviscid, linear or not (Burgers equation). Our aim is to prove some theoretical results on the convergence, and the convergence properties, of this algorithm. We show that for inviscid equations (both linear transport and Burgers), convergence of the algorithm holds under observability conditions. Convergence can also be proven for viscous linear transport equations under some strong hypotheses, but not for the viscous Burgers equation. Moreover, the convergence rate is always exponential in time. We also note that the forward and backward system of equations is well posed when no nudging term is considered.
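For context, a schematic form of the back and forth nudging iteration (the notation is illustrative, not taken from the paper): at iteration $k$, one solves forward

$$\partial_t X_k = F(X_k) + K\big(X^{\mathrm{obs}} - H X_k\big), \quad 0 \le t \le T, \qquad X_k(0) = \tilde X_{k-1}(0),$$

and then backward, starting from the forward final state,

$$\partial_t \tilde X_k = F(\tilde X_k) - K'\big(X^{\mathrm{obs}} - H \tilde X_k\big), \quad T \ge t \ge 0, \qquad \tilde X_k(T) = X_k(T),$$

where $H$ maps the state to the observed quantities and $K, K' > 0$ are nudging gains; the sign of the feedback is reversed in the backward solve so that the term remains stabilizing in reversed time.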
The local and global control results for a general higher-order KdV-type operator posed on the unit circle are presented. Using spectral analysis, we are able to prove local results, that is, the equation is locally controllable and exponentially stable. To extend the local results to global ones, we capture the smoothing properties of the Bourgain spaces, the so-called propagation of singularities, which are proved from a new perspective. These propagation results, together with the Strichartz estimates, are the key to extending the local control properties to global ones; precisely, higher-order KdV-type equations are globally controllable and exponentially stabilizable in the Sobolev space $H^{s}(\mathbb{T})$ for any $s \geq 0$. Our results recover previous results in the literature for the KdV and Kawahara equations and extend, for a general higher-order operator of KdV-type, the Strichartz estimates as well as the propagation results, which are the main novelties of this work.
In this paper we study second-order master equations arising from mean field games with common noise over an arbitrary time duration. A classical solution typically requires the monotonicity condition (or small time duration) and sufficiently smooth data. While keeping the monotonicity condition, our goal is to relax the regularity of the data, which is an open problem in the literature. In particular, we do not require any differentiability in terms of the measures, which prevents us from obtaining classical solutions. We propose three weaker notions of solutions, named {\it good solutions}, {\it weak solutions}, and {\it viscosity solutions}, respectively, and establish the wellposedness of the master equation under all three notions. We emphasize that, due to the game nature, one cannot expect a comparison principle even for classical solutions. The key to the global (in time) wellposedness is a uniform a priori estimate for the Lipschitz continuity of the solution in the measures. The monotonicity condition is crucial for this uniform estimate and thus for the existence of the global solution, but is not needed for uniqueness. To facilitate our analysis, we construct a smooth mollifier for functions on Wasserstein space, which is new in the literature and is interesting in its own right. As an important application of our results, we prove the convergence of the Nash system, a high-dimensional system of PDEs arising from the corresponding $N$-player game, under mild regularity requirements. We also prove a propagation of chaos property for the associated optimal trajectories.
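For reference, the monotonicity condition in mean field games is commonly the Lasry-Lions condition on the coupling (stated schematically here; the paper's precise hypotheses may differ): a map $F$ is monotone in the measure variable if

$$\int_{\mathbb{R}^d} \big(F(x,\mu) - F(x,\mu')\big)\, \mathrm{d}(\mu - \mu')(x) \;\ge\; 0 \qquad \text{for all probability measures } \mu,\, \mu'.$$

As described above, this structural condition is what underlies the uniform Lipschitz estimate in the measures, and hence global existence, though it is not needed for uniqueness.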
This paper presents a model-based method for fusing data from multiple sensors with a hypothesis-test-based component for rejecting potentially faulty or otherwise malign data. Our framework is based on an extension of the classic particle filter algorithm for real-time state estimation of uncertain systems with nonlinear dynamics and partial, noisy observations. This extension, based on classical statistical theories, utilizes statistical tests against the system's observation model. We discuss the application of the two major statistical testing frameworks, Fisherian significance testing and Neyman-Pearsonian hypothesis testing, to the Monte Carlo and sensor fusion settings. The Monte Carlo Neyman-Pearson test we develop is useful when one has a reliable model of faulty data, while the Fisherian one is applicable when one may not have a model of faults, as may occur when dealing with third-party data, like GNSS data from transportation system users. These statistical tests can be combined with a particle filter to obtain a Monte Carlo state estimation scheme that is robust to faulty or outlier data. We present a synthetic freeway traffic state estimation problem in which the filters are able to reject simulated faulty GNSS measurements. The fault-model-free Fisher filter, while underperforming the Neyman-Pearson filter when the latter has an accurate fault model, outperforms it when the assumed fault model is incorrect.
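A minimal sketch of the idea, assuming a scalar toy model with Gaussian observation noise (the dynamics, noise levels, threshold, and function names below are illustrative assumptions, not the authors' implementation): a particle filter whose update step first computes a Monte Carlo p-value of the incoming measurement against the predictive distribution and discards the measurement when the p-value falls below a significance level.

import numpy as np

rng = np.random.default_rng(0)

def propagate(particles, dt=1.0, q=0.1):
    # Illustrative nonlinear dynamics with additive process noise.
    return particles + dt * np.sin(particles) + q * rng.standard_normal(particles.shape)

def measurement_p_value(particles, z, r=0.5):
    # Monte Carlo p-value of measurement z under the predictive distribution
    # implied by the particles and the (assumed Gaussian) observation noise r.
    predicted = particles + r * rng.standard_normal(particles.shape)
    tail = np.mean(predicted >= z)
    return 2.0 * min(tail, 1.0 - tail)  # two-sided empirical tail probability

def update(particles, weights, z, r=0.5, alpha=0.01):
    # Fisherian-style check: reject the measurement if it is too unlikely
    # under the observation model; otherwise do the usual weight update.
    if measurement_p_value(particles, z, r) < alpha:
        return particles, weights, False
    weights = weights * np.exp(-0.5 * ((z - particles) / r) ** 2)
    weights = weights / weights.sum()
    idx = rng.choice(particles.size, size=particles.size, p=weights)  # resample
    return particles[idx], np.full(particles.size, 1.0 / particles.size), True

# Usage: one predict/update cycle with a plausible and then a faulty measurement.
particles = rng.standard_normal(500)
weights = np.full(500, 1.0 / 500)
particles = propagate(particles)
particles, weights, ok = update(particles, weights, z=0.3)    # consistent: accepted
particles, weights, ok2 = update(particles, weights, z=25.0)  # outlier: rejected
print(ok, ok2)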
We present a paradigm for characterization of artifacts in limited data tomography problems. In particular, we use this paradigm to characterize artifacts that are generated in reconstructions from limited angle data with generalized Radon transforms and general filtered backprojection type operators. In order to find when visible singularities are imaged, we calculate the symbol of our reconstruction operator as a pseudodifferential operator.