
Spatial Global Sensitivity Analysis of High Resolution classified topographic data use in 2D urban flood modelling

Added by Olivier Delestre
Publication date: 2016
Language: English





This paper presents a spatial Global Sensitivity Analysis (GSA) approach for a 2D shallow-water-equations-based High Resolution (HR) flood model. The aim of a spatial GSA is to produce sensitivity maps based on Sobol index estimates. Such an approach makes it possible to rank the effects of uncertain HR topographic input parameters on flood model output. The influence of the following three parameters has been studied: the measurement error, the level of detail of the above-ground element representation, and the spatial discretization resolution. To introduce uncertainty, a probability density function and a discrete spatial approach were applied to generate 2,000 DEMs. Based on the modelling of a 2D urban river flood event, the produced sensitivity maps highlight the major influence of modeller choices compared to HR measurement errors when HR topographic data are used, as well as the spatial variability of the ranking.

Highlights:

• Spatial GSA allowed the production of Sobol index maps, highlighting the relative weight of each uncertain parameter on the variability of the calculated output parameter of interest.
• The Sobol index maps illustrate the major influence of modeller choices when using HR topographic data in 2D hydraulic models, relative to the influence of HR dataset accuracy.
• The added value for the modeller is a better understanding of the limits of the model.
• The requirements and limits of this approach relate to the subjectivity of choices and to computational cost.
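The Sobol decomposition behind such sensitivity maps can be illustrated on a scalar toy problem. Below is a minimal sketch of the Saltelli-style Monte Carlo estimator of first-order Sobol indices; the `model` function is a hypothetical stand-in for the flood-model output at a single map cell (it is not the paper's model), with a deliberately strong dependence on the first input:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Hypothetical stand-in for the flood-model output at one map cell:
    # strong dependence on the first input, weak on the second.
    return x[:, 0] + 0.2 * x[:, 1] ** 2

n, d = 20000, 2
A = rng.uniform(-1.0, 1.0, (n, d))    # first independent sample matrix
B = rng.uniform(-1.0, 1.0, (n, d))    # second independent sample matrix
yA, yB = model(A), model(B)
V = np.var(np.concatenate([yA, yB]))  # total output variance

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # replace column i by the second sample
    # Saltelli (2010) estimator of V_i = Var(E[Y | x_i])
    Vi = np.mean(yB * (model(ABi) - yA))
    S.append(Vi / V)

print(S)  # S[0] close to 1, S[1] close to 0
```

Producing a sensitivity map then amounts to repeating this estimate independently at every grid cell of the model output.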




Read More

High-resolution (infra-metric) topographic data, including photogrammetry-borne 3D classified data, are becoming commonly available over a large range of spatial extents, such as municipality or industrial-site scale. This category of dataset is promising for high-resolution (HR) Digital Surface Model (DSM) generation, allowing the inclusion of fine above-ground structures which may influence overland flow hydrodynamics in urban environments. Nonetheless, several categories of technical and numerical challenges arise from the use of this type of data with standard numerical codes based on the 2D Shallow Water Equations (SWE). FullSWOF (Full Shallow Water equations for Overland Flow) is a code based on the 2D SWE in conservative form. It relies on a well-balanced finite volume method over a regular grid, using a numerical method based on a hydrostatic reconstruction scheme. Compared to existing industrial codes used for urban flooding simulations, the numerical approach implemented in FullSWOF properly handles flow regime changes, preservation of water-depth positivity at wet/dry cell transitions, and steady-state preservation. FullSWOF has already been tested against a library of analytical solutions (SWASHES) and has been used to simulate runoff and dam breaks. The properties of FullSWOF mentioned above are of interest for urban overland flow. The objectives of this study are (i) to assess the feasibility and added value of using HR 3D classified topographic data to model river overland flow and (ii) to take advantage of the properties of the FullSWOF code for overland flow simulation in urban environments.
M. Abily, 2016
Technologies such as aerial photogrammetry allow the production of 3D topographic data covering complex environments such as urban areas. It is therefore possible to create High Resolution (HR) Digital Elevation Models (DEMs) incorporating thin above-ground elements that influence overland flow paths. Even though this category of big data has a high level of accuracy, there are still measurement errors and hypotheses underlying DEM elaboration. Moreover, operators seek to optimize the spatial discretization resolution in order to improve the computation time of flood models. Measurement errors, errors in DEM generation, and operator choices when including these data in a 2D hydraulic model may significantly influence the variability of flood modelling results. The purpose of this study is to investigate uncertainties related to (i) the intrinsic error of high-resolution topographic data and (ii) the modeller's choices when including topographic data in hydraulic codes. The aim is to perform a Global Sensitivity Analysis (GSA), which involves a Monte Carlo uncertainty propagation to quantify the impact of uncertainties, followed by a Sobol index computation to rank the influence of the identified parameters on result variability. A process coupling an environment for parametric computation (Prométhée) with a code relying on the 2D shallow water equations (FullSWOF 2D) has been developed (the P-FS tool). The study has been performed over the lower part of the Var river valley using the estimated hydrograph of the 1994 flood event. HR topographic data for the study area, which covers 17.5 km², were made available by the Nice municipality. Three uncertain parameters were studied: the measurement error (var. E), the level of detail of the above-ground element representation in the DEM (buildings, sidewalks, etc.) (var. S), and the spatial discretization resolution (grid cell size for a regular mesh) (var. R). Parameter var. E follows a probability density function, whereas parameters var. S and var. R are discrete operator choices. Combining these parameters, a database of 2,000 simulations has been produced using the P-FS tool implemented on a high-performance computing structure. In our study case, the output of interest is the maximal
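An experimental design mixing one continuous parameter (drawn from a PDF) with discrete operator choices can be sketched as follows. All level names, the error standard deviation, and the replicate count below are illustrative assumptions, not values from the study:

```python
import itertools
import random

random.seed(0)

# Hypothetical levels, loosely following the study's three parameters.
S_levels = ["buildings_only", "buildings_sidewalks", "all_classes"]  # var. S
R_levels = [1.0, 3.0, 5.0]              # var. R: grid cell size in metres

def draw_measurement_error(sigma=0.05):
    # var. E: altimetric error drawn from a normal PDF (sigma is assumed).
    return random.gauss(0.0, sigma)

runs = []
for s_lvl, r_lvl in itertools.product(S_levels, R_levels):
    for _ in range(10):                 # replicates per discrete combination
        runs.append({"var_S": s_lvl, "var_R": r_lvl,
                     "var_E": draw_measurement_error()})

print(len(runs))  # 3 * 3 * 10 = 90 simulation configurations
```

Each dictionary in `runs` would then parameterize one DEM generation and one hydraulic simulation, which is the role played by the Prométhée/FullSWOF coupling (P-FS tool) in the study.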
Accurately forecasting urban development and its environmental and climate impacts critically depends on realistic models of the spatial structure of the built environment and of its dependence on key factors such as population and economic development. Scenario simulation and sensitivity analysis, i.e., predicting how changes in underlying factors at a given location affect urbanization outcomes at other locations, is currently not achievable at a large scale with traditional urban growth models, which are either too simplistic or depend on detailed locally collected socioeconomic data that are not available in most places. Here we develop a framework to estimate, purely from globally available remote-sensing data and without parametric assumptions, the spatial sensitivity of the (static) rate of change of urban sprawl to key macroeconomic development indicators. We formulate this spatial regression problem as an image-to-image translation task using conditional generative adversarial networks (GANs), where the gradients necessary for comparative static analysis are provided by the backpropagation algorithm used to train the model. This framework naturally incorporates physical constraints, e.g., the inability to build over water bodies. To validate the spatial structure of model-generated built-environment distributions, we use spatial statistics commonly employed in urban form analysis. We apply our method to a novel dataset comprising layers on the built environment, nightlight measurements (a proxy for economic development and energy use), and population density for the world's 15,000 most populous cities.
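The comparative-static idea, i.e., a per-pixel sensitivity of a generated map to a scalar macro indicator, can be illustrated without a trained GAN. The sketch below uses a toy image-valued function in place of the generator and central finite differences in place of backpropagated gradients; every function and constant here is an assumption for illustration:

```python
import numpy as np

def toy_generator(pop, econ, size=8):
    # Hypothetical stand-in for the GAN generator: maps macro indicators
    # (population, economic activity) to a built-up probability map.
    y, x = np.mgrid[0:size, 0:size]
    d2 = (x - size / 2) ** 2 + (y - size / 2) ** 2
    core = np.exp(-d2 / (2.0 + 5.0 * econ))   # richer economy -> wider sprawl
    return np.clip(pop * core, 0.0, 1.0)

# Spatial sensitivity of the built map to economic development:
# a central finite difference standing in for autograd gradients.
eps = 1e-4
sens = (toy_generator(0.8, 0.5 + eps) - toy_generator(0.8, 0.5 - eps)) / (2 * eps)
print(sens.shape)  # one sensitivity value per pixel: an (8, 8) map
```

In the paper's setting, the same per-pixel derivative comes for free from the backpropagation machinery of the trained conditional GAN rather than from finite differences.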
In the political decision process and control of COVID-19 (and other epidemic diseases), mathematical models play an important role. It is crucial to understand and quantify the uncertainty in models and their predictions in order to make the right decisions and to communicate results and limitations in a trustworthy way. We propose to perform uncertainty quantification in SIR-type models using the efficient framework of generalized Polynomial Chaos. Through two case studies based on Danish data for the spread of COVID-19, we demonstrate the applicability of the technique. The test cases are related to peak-time estimation and superspreading, and illustrate how very few model evaluations can provide insightful statistics.
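The quantity of interest in the first test case, the epidemic peak time under an uncertain parameter, can be sketched with a plain Monte Carlo sweep over a minimal SIR integrator. This is a baseline contrast, not the paper's generalized Polynomial Chaos expansion (whose point is precisely to need far fewer model evaluations); the contact-rate prior and all constants are assumed toy values:

```python
import random
import statistics

random.seed(1)

def sir_peak_time(beta, gamma=0.1, i0=1e-3, dt=0.1, t_max=300.0):
    # Forward-Euler integration of the SIR model; returns the time at
    # which the infected fraction peaks.
    s, i, t = 1.0 - i0, i0, 0.0
    peak_i, peak_t = i, 0.0
    while t < t_max:
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i, t = s + dt * ds, i + dt * di, t + dt
        if i > peak_i:
            peak_i, peak_t = i, t
    return peak_t

# Uncertain contact rate: beta ~ Uniform(0.2, 0.4) is an assumed toy prior.
samples = [sir_peak_time(random.uniform(0.2, 0.4)) for _ in range(200)]
print(statistics.mean(samples), statistics.stdev(samples))
```

A gPC approach would replace the 200 direct model runs with a small number of evaluations at quadrature nodes and read the same mean and standard deviation off the expansion coefficients.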
In this paper, we construct a hierarchical model for spatial compositional data, which is used to reconstruct past land-cover compositions (in terms of coniferous forest, broadleaved forest, and unforested/open land) for five time periods during the past 6,000 years over Europe. The model consists of a Gaussian Markov Random Field (GMRF) with Dirichlet observations. A block-updated Markov chain Monte Carlo (MCMC) algorithm, including an adaptive Metropolis-adjusted Langevin step, is used to estimate model parameters. The sparse precision matrix of the GMRF provides computational advantages leading to a fast MCMC algorithm. Reconstructions are obtained by combining pollen-based estimates of vegetation cover at a limited number of locations with scenarios of past deforestation and output from a dynamic vegetation model. To evaluate uncertainties in the predictions, a novel way of constructing joint confidence regions for the entire composition at each prediction location is proposed. The hierarchical model's ability to reconstruct past land cover is evaluated through cross-validation for all time periods, and by comparing reconstructions for the recent past to a present-day European forest map. The evaluation results are promising, and the model is able to capture known structures in past land-cover compositions.
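The block-updated sampler at the core of such a reconstruction builds on the standard Metropolis accept/reject step. Below is a minimal random-walk Metropolis sketch on a one-dimensional toy target standing in for the GMRF posterior; it omits the Langevin (gradient-drift) proposal and the adaptive step-size tuning used in the paper:

```python
import math
import random

random.seed(2)

def log_target(x):
    # Toy stand-in for the GMRF posterior: a standard normal log-density
    # (up to an additive constant, which Metropolis does not need).
    return -0.5 * x * x

x, chain, step = 0.0, [], 1.0
for _ in range(20000):
    prop = x + random.gauss(0.0, step)          # symmetric random-walk proposal
    # Accept with probability min(1, target(prop) / target(x)).
    if math.log(random.random()) < log_target(prop) - log_target(x):
        x = prop
    chain.append(x)

burn = chain[5000:]                              # discard burn-in
mean = sum(burn) / len(burn)
var = sum((v - mean) ** 2 for v in burn) / len(burn)
print(mean, var)  # should approach 0 and 1 for the standard normal target
```

A Metropolis-adjusted Langevin step replaces the symmetric proposal with one drifted along the gradient of `log_target`, which is what makes block updates of a high-dimensional GMRF efficient.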
