
A Dual EnKF for Estimating Water Level, Bottom Roughness, and Bathymetry in a 1-D Hydrodynamic Model

Added by Milad Hooshyar
Publication date: 2016
Field: Physics
Language: English





Data assimilation has been applied to coastal hydrodynamic models to better estimate system states or parameters by incorporating observed data into the model. The Kalman Filter (KF) is one of the most studied data assimilation methods, but its application is limited to linear systems. For nonlinear systems such as hydrodynamic models, a variation of the KF called the Ensemble Kalman Filter (EnKF) is applied to update the system state in the context of Monte Carlo simulation. In this research, a dual EnKF approach is used to simultaneously estimate the state (water surface elevation) and the parameters (bottom roughness and bathymetry) of a shallow water model. The sensitivity of the filter to (1) the quantity and precision of the observations and (2) the initial estimate of the parameters is investigated in a 1-D shallow water problem located in the Gulf of Mexico. Results show that, starting from an initial estimate of bottom roughness and bathymetry within a reasonable range and using observations available at a limited number of gages, the dual EnKF is able to improve the bottom roughness and bathymetry fields. The performance of the filter is sensitive to the precision of the measured data, especially when Manning's n and bathymetry are estimated simultaneously.
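The EnKF analysis step the abstract builds on can be sketched as follows. This is a minimal illustration, not the paper's configuration: it applies one perturbed-observation EnKF update to a state vector augmented with a parameter (a common simplification of the dual scheme, which runs separate state and parameter filters), and the toy water-level/roughness numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_analysis(ensemble, obs, obs_std, H):
    """One EnKF analysis step with perturbed observations.

    ensemble : (n_ens, n) samples of the augmented (state + parameter) vector
    obs      : (m,) observed values
    obs_std  : observation error standard deviation
    H        : (m, n) linear observation operator
    """
    n_ens = ensemble.shape[0]
    A = ensemble - ensemble.mean(axis=0)             # state anomalies
    HX = ensemble @ H.T                              # ensemble in obs space
    HA = HX - HX.mean(axis=0)                        # obs-space anomalies
    cross = A.T @ HA / (n_ens - 1)                   # sample P H^T
    innov_cov = HA.T @ HA / (n_ens - 1) + obs_std**2 * np.eye(len(obs))
    K = cross @ np.linalg.inv(innov_cov)             # Kalman gain
    perturbed = obs + obs_std * rng.standard_normal((n_ens, len(obs)))
    return ensemble + (perturbed - HX) @ K.T

# Toy setup: one water-level state and one roughness parameter; only the
# water level is observed. The prior correlation between the two lets the
# observation also correct the unobserved parameter.
n_ens = 500
roughness = 0.03 + 0.01 * rng.standard_normal(n_ens)
level = 2.0 * roughness + 0.005 * rng.standard_normal(n_ens)
prior = np.column_stack([level, roughness])
H = np.array([[1.0, 0.0]])                           # observe level only
posterior = enkf_analysis(prior, np.array([0.08]), 0.002, H)
```

The key mechanism of joint state-parameter estimation is visible here: the gain row for the unobserved roughness is nonzero only because the prior ensemble carries a level-roughness correlation, so the water-level observation pulls the roughness estimate as well.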



Related research

This study presents a new formulation for the norms and scalar products used in tangent linear or adjoint models to determine forecast errors and sensitivity to observations and to calculate singular vectors. The new norm is derived from the concept of moist-air available enthalpy, which is one of the availability functions referred to as exergy in general thermodynamics. It is shown that the sum of the kinetic energy and the moist-air available enthalpy can be used to define a new moist-air squared norm which is quadratic in: 1) wind components; 2) temperature; 3) surface pressure; and 4) water vapor content. Preliminary numerical applications are performed to show that the new weighting factors for temperature and water vapor are significantly different from those used in observation impact studies, and are in better agreement with observed analysis increments. These numerical applications confirm that the weighting factors for water vapor and temperature exhibit a large increase with height (by several orders of magnitude) and a minimum in the middle troposphere, respectively.
A. Dell'Aquila (2005)
In this study we compare the representation of the southern-hemisphere midlatitude winter variability in the NCEP-NCAR and ERA40 reanalyses. We use the classical Hayashi spectral technique, recently applied to compare descriptions of atmospheric variability in the northern hemisphere over different spectral sub-domains, and test the agreement of the two reanalysis systems in representing atmospheric activity. In the southern hemisphere, even in the satellite period, the assimilated data are relatively scarce, predominantly over the oceans, and they provide a weaker constraint on the model dynamics. We find relevant discrepancies in the description of the variability at different spatial and temporal scales. ERA40 is generally characterised by a larger variance, especially in the high-frequency spectral region. In the pre-satellite period the discrepancies between the two reanalyses are large and randomly distributed, while after 1979 the discrepancies are systematic. Moreover, a sudden jump in the VTPR period (1973-1978) is observed, mostly in the ERA40 reanalysis. Our results suggest that we do not yet have a well-defined picture of the properties of winter mid-latitude variability in the southern hemisphere to use in evaluating the realism of climate models, and they call for an intercomparison study assessing the self-consistency of the IPCC models in representing the analysed properties.
Nicola Scafetta (2013)
Errors in applying regression models and wavelet filters used to analyze geophysical signals are discussed: (1) multidecadal natural oscillations (e.g. the quasi 60-year Atlantic Multidecadal Oscillation (AMO), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO)) need to be taken into account for properly quantifying anomalous accelerations in tide gauge records such as in New York City; (2) uncertainties and multicollinearity among climate forcing functions prevent a proper evaluation of the solar contribution to the 20th century global surface temperature warming using overloaded linear regression models during the 1900-2000 period alone; (3) when periodic wavelet filters, which require that a record is pre-processed with a reflection methodology, are improperly applied to decompose non-stationary solar and climatic time series, Gibbs boundary artifacts emerge yielding misleading physical interpretations. By correcting these errors and using optimized regression models that reduce multicollinearity artifacts, I found the following results: (1) the sea level in New York City is not accelerating in an alarming way, and may increase by about 350 mm from 2000 to 2100 instead of the previously projected values varying from 1130 mm to 1550 mm estimated using the methods proposed by Sallenger et al. (2012) and Boon (2012), respectively; (2) the solar activity increase during the 20th century contributed about 50% of the 0.8 K global warming observed during the 20th century instead of only 7-10% (IPCC, 2007; Benestad and Schmidt, 2009; Lean and Rind, 2009). These findings stress the importance of natural oscillations and of the sun to properly interpret climatic changes.
In the analysis of empirical signals, detecting correlations that capture genuine interactions between the elements of a complex system is a challenging task with applications across disciplines. Here we analyze a global data set of surface air temperature (SAT) with daily resolution. Hilbert analysis is used to obtain phase, instantaneous frequency, and amplitude information of SAT seasonal cycles in different geographical zones. The analysis of the phase dynamics reveals large regions with coherent seasonality. The analysis of the instantaneous frequencies uncovers clean wave patterns formed by alternating regions of negative and positive correlations. In contrast, the analysis of the amplitude dynamics uncovers wave patterns with additional large-scale structures. These structures are interpreted as arising because the amplitude dynamics are affected by processes acting on both long and short time scales, while the dynamics of the instantaneous frequency are mainly governed by fast processes. Therefore, Hilbert analysis makes it possible to disentangle climatic processes and to track planetary atmospheric waves. Our results are relevant for the analysis of complex oscillatory signals because they offer a general strategy for uncovering interactions that act at different time scales.
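The Hilbert workflow described above (analytic signal, then instantaneous phase, frequency, and amplitude) can be sketched on a synthetic seasonal cycle. The FFT construction of the analytic signal is the standard one (equivalent to scipy.signal.hilbert); the sine "seasonal cycle" is an illustrative stand-in for the SAT data, not the paper's data set.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: zero negative frequencies, double positives."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Synthetic "seasonal cycle": one oscillation per 365 samples (days).
dt = 1.0                                        # daily resolution
t = np.arange(10 * 365)
x = np.sin(2 * np.pi * t / 365.0)

z = analytic_signal(x)
amplitude = np.abs(z)                           # instantaneous amplitude
phase = np.unwrap(np.angle(z))                  # instantaneous phase
inst_freq = np.diff(phase) / (2 * np.pi * dt)   # cycles per day
```

For this clean sinusoid the instantaneous frequency sits at 1/365 cycles per day and the amplitude at 1; in the SAT application it is precisely the deviations of these quantities across geographical zones that carry the wave-pattern information.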
We propose a statistical approach to tornado modeling for predicting and simulating tornado occurrences and accumulated cost distributions over a time interval. This is achieved by modeling tornado intensity, measured on the Fujita scale, as a stochastic process. Since the Fujita scale divides tornado intensity into six states, it is possible to model the intensity with Markov and semi-Markov models. We demonstrate that the semi-Markov approach is able to reproduce the duration effect detected in tornado occurrence. The superiority of the semi-Markov model over the Markov chain model is also confirmed by a statistical hypothesis test. As an application we compute the expected value and the variance of the costs generated by tornadoes over a given time interval in a given area. The paper contributes to the literature by demonstrating that semi-Markov models are an effective tool for the physical analysis of tornadoes as well as for the estimation of the economic damage they cause.
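The Markov-chain baseline in the comparison above can be sketched as follows. The transition matrix and per-state costs below are made-up illustrative numbers, not the paper's estimates; a semi-Markov variant would additionally draw a random holding time from a state-dependent distribution at each step.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical transition matrix over Fujita states F0..F5 (rows sum to 1).
P = np.array([
    [0.60, 0.25, 0.10, 0.03, 0.015, 0.005],
    [0.40, 0.35, 0.15, 0.06, 0.030, 0.010],
    [0.30, 0.30, 0.25, 0.10, 0.040, 0.010],
    [0.25, 0.30, 0.25, 0.12, 0.060, 0.020],
    [0.20, 0.30, 0.25, 0.15, 0.070, 0.030],
    [0.20, 0.25, 0.25, 0.15, 0.100, 0.050],
])
# Hypothetical mean cost per tornado by intensity (millions of dollars).
COST = np.array([0.05, 0.5, 5.0, 25.0, 100.0, 500.0])

def simulate_costs(n_events, start=0):
    """Simulate a chain of tornado intensities; return the accumulated cost."""
    state, total = start, 0.0
    for _ in range(n_events):
        state = rng.choice(6, p=P[state])   # next intensity state
        total += COST[state]                # add that event's cost
    return total

# Monte Carlo estimate of the accumulated-cost distribution for 100 events.
totals = np.array([simulate_costs(100) for _ in range(200)])
```

Repeating the simulation, as in the last line, gives the accumulated-cost distribution whose mean and variance the paper computes analytically; the Markov assumption is exactly what the semi-Markov model relaxes by making sojourn times non-geometric.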
