
A coherent structure approach for parameter estimation in Lagrangian Data Assimilation

Posted by John Maclean
Publication date: 2017
Research field: Physics
Paper language: English





We introduce a data assimilation method to estimate model parameters from observations of passive tracers by directly assimilating Lagrangian Coherent Structures. Our approach differs from the usual Lagrangian Data Assimilation approach, in which parameters are estimated based on tracer trajectories. We employ the Approximate Bayesian Computation (ABC) framework to avoid computing the likelihood function of the coherent structure, which is usually unavailable. We solve the ABC problem with a Sequential Monte Carlo (SMC) method, and use Principal Component Analysis (PCA) to identify the coherent patterns from tracer trajectory data. Our new method shows markedly improved results compared with the bootstrap particle filter when the physical model exhibits chaotic advection.
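The abstract does not include code, so the following is only a minimal sketch of the idea it describes: summarize tracer trajectories by their leading principal components and use that summary inside an ABC accept/reject loop. The paper solves the ABC problem with an SMC sampler; for brevity the sketch uses plain rejection ABC instead, and the model interface (`simulate_tracers`, `prior_sampler`), the tolerance, and the distance are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pca_summary(trajectories, n_modes=2):
    """Summarize tracer trajectories by their leading principal components.

    trajectories: array of shape (n_tracers, n_times), e.g. one coordinate
    of each tracer recorded over time.
    """
    X = trajectories - trajectories.mean(axis=0, keepdims=True)
    # SVD of the centered data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_modes]

def abc_rejection(observed_traj, simulate_tracers, prior_sampler,
                  n_samples=1000, tol=0.5, n_modes=2):
    """Basic ABC rejection sampler with the PCA summary as the statistic.

    simulate_tracers(theta) -> synthetic trajectories for parameter theta.
    prior_sampler()         -> one draw from the parameter prior.
    Both callables are placeholders for the user's forward model.
    """
    s_obs = pca_summary(observed_traj, n_modes)
    accepted = []
    for _ in range(n_samples):
        theta = prior_sampler()
        s_sim = pca_summary(simulate_tracers(theta), n_modes)
        # Distance between coherent-pattern summaries; the sign ambiguity of
        # principal components is crudely removed by comparing magnitudes.
        dist = np.linalg.norm(np.abs(s_obs) - np.abs(s_sim))
        if dist < tol:
            accepted.append(theta)
    return np.array(accepted)
```

In an SMC version, the accepted draws would become a weighted particle population that is reused with a decreasing tolerance schedule rather than being regenerated from the prior at each stage.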




Read also

Lagrangian data assimilation of complex nonlinear turbulent flows is an important but computationally challenging topic. In this article, an efficient, data-driven, statistically accurate reduced-order modeling algorithm is developed that significantly improves the computational efficiency of Lagrangian data assimilation. The algorithm starts with a Fourier transform of the high-dimensional flow field, followed by an effective model reduction that retains only a small subset of the Fourier coefficients corresponding to the energetic modes. A linear stochastic model is then developed to approximate the nonlinear dynamics of each Fourier coefficient. Effective additive and multiplicative noise processes are incorporated to characterize the modes that exhibit Gaussian and non-Gaussian statistics, respectively. All the parameters in the reduced-order system, including the multiplicative noise coefficients, are determined systematically via closed analytic formulae. These linear stochastic models succeed in forecasting the uncertainty and facilitate an extremely rapid data assimilation scheme. The new Lagrangian data assimilation is then applied to observations of sea ice floe trajectories that are driven by atmospheric winds and turbulent ocean currents. It is shown that observing only about $30$ non-interacting floes in a $200$ km $\times$ $200$ km domain is sufficient to recover the key multi-scale features of the ocean currents. The additional observations of the floe angular displacements are found to be suitable supplements to the center-of-mass positions for improving the data assimilation skill. In addition, the observed large and small floes are more useful in recovering the large- and small-scale features of the ocean, respectively. The Fourier-domain data assimilation also succeeds in recovering the ocean features in areas where cloud cover obscures the observations.
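The abstract above describes replacing each Fourier mode's nonlinear dynamics with a linear stochastic surrogate. The sketch below shows the simplest version of that idea, a complex Ornstein-Uhlenbeck process for a single mode integrated by Euler-Maruyama; the damping, frequency, and noise values are illustrative stand-ins, not parameters from the paper, and the paper's multiplicative-noise variant is not reproduced here.

```python
import numpy as np

def simulate_ou_mode(u0, damping, omega, forcing, sigma, dt, n_steps, rng):
    """Euler-Maruyama simulation of a complex OU surrogate for one Fourier mode:
        du = ((-damping + 1j*omega) * u + forcing) * dt + sigma * dW
    All parameter values are illustrative, not taken from the paper.
    """
    u = np.empty(n_steps + 1, dtype=complex)
    u[0] = u0
    for k in range(n_steps):
        # Complex Wiener increment: real and imaginary parts each N(0, dt/2).
        noise = rng.normal(scale=np.sqrt(dt / 2), size=2)
        dW = noise[0] + 1j * noise[1]
        u[k + 1] = u[k] + ((-damping + 1j * omega) * u[k] + forcing) * dt + sigma * dW
    return u

rng = np.random.default_rng(0)
path = simulate_ou_mode(u0=0.1 + 0j, damping=0.5, omega=2.0,
                        forcing=0.0, sigma=0.3, dt=0.01, n_steps=1000, rng=rng)
```

Because the surrogate is linear and Gaussian, its forecast mean and covariance are available in closed form, which is what makes the rapid assimilation scheme described above possible.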
The mixed-logit model is a flexible tool in transportation choice analysis, which provides valuable insights into inter- and intra-individual behavioural heterogeneity. However, applications of mixed-logit models are limited by the high computational and data requirements for model estimation. When estimating on small samples, the Bayesian estimation approach becomes vulnerable to over- and under-fitting. This is problematic for investigating the behaviour of specific population sub-groups or market segments with low data availability. Similar challenges arise when transferring an existing model to a new location or time period, e.g., when estimating post-pandemic travel behaviour. We propose an Early Stopping Bayesian Data Assimilation (ESBDA) simulator for estimation of mixed-logit models which combines a Bayesian statistical approach with machine learning methodologies. The aim is to improve the transferability of mixed-logit models and to enable the estimation of robust choice models with low data availability. This approach can provide new insights into choice behaviour where the traditional estimation of mixed-logit models was not possible due to low data availability, and open up new opportunities to support investment and planning decisions. The ESBDA estimator is benchmarked against the Direct Application approach, a basic Bayesian simulator with random starting parameter values, and a Bayesian Data Assimilation (BDA) simulator without early stopping. The ESBDA approach is found to effectively overcome under- and over-fitting and non-convergence issues in simulation. Its resulting models clearly outperform those of the reference simulators in predictive accuracy. Furthermore, models estimated with ESBDA tend to be more robust, with significant parameters whose signs and values are consistent with behavioural theory, even when estimated on small samples.
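The abstract above does not show the estimation loop, so the sketch below illustrates only the generic early-stopping idea borrowed from machine learning: run an iterative Bayesian estimator and stop when fit on held-out data stops improving. The callables `update_posterior` and `validation_loglik` are hypothetical placeholders, not the authors' ESBDA simulator or any specific mixed-logit sampler.

```python
import numpy as np

def early_stopping_estimation(update_posterior, validation_loglik,
                              max_iters=5000, patience=50):
    """Generic early-stopping wrapper around an iterative Bayesian estimator.

    update_posterior(state)  -> new state (e.g. one MCMC/Gibbs sweep); placeholder.
    validation_loglik(state) -> log-likelihood on held-out data; placeholder.
    """
    state = None
    best_state, best_ll, stale = None, -np.inf, 0
    for _ in range(max_iters):
        state = update_posterior(state)
        ll = validation_loglik(state)
        if ll > best_ll:
            best_state, best_ll, stale = state, ll, 0
        else:
            stale += 1
            if stale >= patience:  # stop once validation fit stops improving
                break
    return best_state
```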
Eugene Kazantsev, 2008
The use of a data assimilation technique to identify optimal topography is discussed in the framework of time-dependent motion governed by a non-linear barotropic ocean model. Assimilation of artificially generated data allows one to measure the influence of various error sources and to classify the impact of noise that is present in observational data and model parameters. The choice of the assimilation window is discussed. Assimilating noisy data with longer windows provides higher accuracy of the identified topography. The topography identified once by data assimilation can be successfully used for other model runs that start from other initial conditions and are situated in other parts of the model's attractor.
Guangyao Wang, Yulin Pan, 2020
Through ensemble-based data assimilation (DA), we address one of the most notorious difficulties in phase-resolved ocean wave forecasting: the deviation of the numerical solution from the true surface elevation due to the chaotic nature of, and underrepresented physics in, the nonlinear wave models. In particular, we develop a coupled approach of the high-order spectral (HOS) method with the ensemble Kalman filter (EnKF), through which measurement data can be incorporated into the simulation to improve the forecast performance. A unique feature in this coupling is the mismatch between the predictable zone and the measurement region, which is accounted for through a special algorithm that modifies the analysis equation in the EnKF. We test the performance of the new EnKF-HOS method using both synthetic data and real radar measurements. For both cases (though differing in details), it is shown that the new method achieves much higher accuracy than the HOS-only method, and can retain the phase information of an irregular wave field for arbitrarily long forecast times with sequentially assimilated data.
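For context on the EnKF analysis step mentioned above, the sketch below shows the standard stochastic (perturbed-observation) ensemble update; it is the textbook form, not the paper's modified analysis equation that handles the mismatch between the predictable zone and the measurement region.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, R, rng):
    """Standard stochastic EnKF analysis step.

    ensemble: (n_state, n_members) forecast ensemble
    obs:      (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    R:        (n_obs, n_obs) observation-error covariance
    """
    n_state, n_members = ensemble.shape
    x_mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - x_mean                      # state anomalies
    HA = H @ A                                 # observed anomalies
    # Ensemble estimates of the covariances entering the Kalman gain.
    P_xy = A @ HA.T / (n_members - 1)
    P_yy = HA @ HA.T / (n_members - 1) + R
    K = P_xy @ np.linalg.inv(P_yy)
    # Perturbed observations keep the analysis ensemble spread consistent.
    obs_pert = rng.multivariate_normal(obs, R, size=n_members).T
    innovations = obs_pert - H @ ensemble
    return ensemble + K @ innovations
```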
In recent years, there has been growing interest in using Precipitable Water Vapor (PWV) derived from Global Positioning System (GPS) signal delays to predict rainfall. However, the occurrence of rainfall depends on a myriad of atmospheric parameters. This paper proposes a systematic approach to analyze the various parameters that affect precipitation in the atmosphere. Different ground-based weather features like Temperature, Relative Humidity, Dew Point, Solar Radiation, and PWV, along with Seasonal and Diurnal variables, are identified, and a detailed feature correlation study is presented. While all features play a significant role in rainfall classification, only a few of them, such as PWV, Solar Radiation, and the Seasonal and Diurnal features, stand out for rainfall prediction. Based on these findings, an optimum set of features is used in a data-driven machine learning algorithm for rainfall prediction. The experimental evaluation using a four-year (2012-2015) database shows a true detection rate of 80.4%, a false alarm rate of 20.3%, and an overall accuracy of 79.6%. Compared to the existing literature, our method significantly reduces the false alarm rates.
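The abstract above does not name the specific classifier, so the sketch below only illustrates the feature-based workflow it describes, using a random forest in scikit-learn on synthetic placeholder data; the feature columns, model choice, and reported metrics computation are assumptions for illustration, not the authors' pipeline or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Placeholder feature matrix: columns stand in for features such as PWV,
# solar radiation, and seasonal/diurnal indicators. Real GPS and weather-
# station data would replace this synthetic stand-in.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# True detection rate (recall) and false alarm rate from the confusion matrix.
tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()
print("true detection rate:", tp / (tp + fn))
print("false alarm rate:", fp / (fp + tn))
```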