
Bayesian Characterization of Uncertainties Surrounding Fluvial Flood Hazard Estimates

Added by Sanjib Sharma
Publication date: 2020
Language: English





Fluvial floods drive severe risks to riverine communities. There is strong evidence of increasing flood hazards in many regions around the world. The choice of methods and assumptions used in flood hazard estimates can impact the design of risk management strategies. In this study, we characterize the expected flood hazards conditioned on uncertain model structures, model parameters, and prior distributions of the parameters. We construct a Bayesian framework for river stage return level estimation using a nonstationary statistical model that relies exclusively on the Indian Ocean Dipole index. We show that ignoring these uncertainties can lead to biased estimates of expected flood hazards. We find that the considered parametric uncertainty is more influential than the choice of model structure or prior. Our results highlight the importance of incorporating uncertainty in river stage estimates and are of practical use for informing water infrastructure design in a changing climate.
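
As a rough illustration of the kind of nonstationary return-level model described in the abstract, the sketch below fits a GEV distribution to synthetic annual-maximum river stages, with the location parameter a linear function of a hypothetical Indian Ocean Dipole index, using a simple random-walk Metropolis sampler. The data, priors, proposal scales, and model form are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): Bayesian nonstationary GEV for annual-maximum
# river stage, with the location parameter depending linearly on a climate index (e.g. IOD).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# --- Synthetic data (assumption: 60 years of annual maxima and an IOD-like index) ---
n_years = 60
iod = rng.normal(0.0, 1.0, n_years)                     # standardized climate index
true_mu = 5.0 + 0.4 * iod                               # nonstationary location
stage = genextreme.rvs(c=-0.1, loc=true_mu, scale=0.8, random_state=rng)

def log_posterior(theta):
    """Log-posterior for theta = (beta0, beta1, log_sigma, xi) with weak normal priors."""
    beta0, beta1, log_sigma, xi = theta
    sigma = np.exp(log_sigma)
    mu = beta0 + beta1 * iod
    # scipy's shape parameter c equals -xi in the usual GEV convention
    loglik = genextreme.logpdf(stage, c=-xi, loc=mu, scale=sigma).sum()
    logprior = (-0.5 * (beta0 / 10) ** 2 - 0.5 * (beta1 / 10) ** 2
                - 0.5 * (log_sigma / 10) ** 2 - 0.5 * (xi / 0.3) ** 2)
    return loglik + logprior

# --- Random-walk Metropolis sampler ---
theta = np.array([5.0, 0.0, 0.0, 0.0])
logp = log_posterior(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(0, [0.05, 0.05, 0.05, 0.02])
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    if i >= 5000:                                        # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# --- 100-year return level conditional on a given IOD value, with credible interval ---
T, iod_scenario = 100.0, 1.0
beta0, beta1, log_sigma, xi = samples.T
mu = beta0 + beta1 * iod_scenario
sigma = np.exp(log_sigma)
yp = -np.log(1.0 - 1.0 / T)
z = mu + sigma / xi * (yp ** (-xi) - 1.0)                # GEV return level
print("100-yr stage, IOD=+1:", np.percentile(z, [5, 50, 95]))
```

The point of the sketch is that the return level is computed for every posterior draw, so the credible interval directly reflects parametric uncertainty rather than a single best-fit estimate.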

Related research

Syed Kabir (2020)
Most two-dimensional (2D) hydraulic/hydrodynamic models are still computationally too demanding for real-time applications. In this paper, an innovative modelling approach based on a deep convolutional neural network (CNN) is presented for rapid prediction of fluvial flood inundation. The CNN model is trained on outputs from a 2D hydraulic model (LISFLOOD-FP) to predict water depths. The pre-trained model is then applied to simulate the January 2005 and December 2015 floods in Carlisle, UK. The CNN predictions compare favourably with the outputs produced by LISFLOOD-FP. The performance of the CNN model is further confirmed by benchmarking against a support vector regression (SVR) method. The results show that the CNN model outperforms SVR by a large margin. The CNN model is highly accurate in capturing flooded cells, as indicated by several quantitative assessment metrics. The estimated error in reproducing maximum flood depth is 0 to 0.2 m for the 2005 event and 0 to 0.5 m for the 2015 event at over 99% of the cells in the computational domain. The proposed CNN method offers great potential for real-time flood modelling and forecasting given its simplicity, superior performance and computational efficiency.
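
A toy PyTorch sketch of the general idea of a CNN surrogate for a 2D hydraulic model: a small fully convolutional network maps gridded inputs (for example terrain and an inflow forcing field) to a water-depth grid and is trained against depths produced by the physical model. The architecture, channel counts, grid size, and input variables are illustrative assumptions, not the configuration used in the paper.

```python
# Toy sketch of a CNN surrogate for a 2D hydraulic model (illustrative, not the paper's setup).
import torch
import torch.nn as nn

class FloodCNN(nn.Module):
    """Maps gridded inputs (e.g. terrain elevation + inflow forcing) to a water-depth grid."""
    def __init__(self, in_channels=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),             # one output channel: water depth
        )

    def forward(self, x):
        return torch.relu(self.net(x))                   # depths are non-negative

# Synthetic stand-in for training pairs produced by a hydraulic model such as LISFLOOD-FP.
inputs = torch.randn(16, 2, 64, 64)                      # batch of (terrain, forcing) grids
targets = torch.rand(16, 1, 64, 64)                      # "simulated" water depths

model = FloodCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                                   # a few illustrative epochs
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: MSE = {loss.item():.4f}")
```
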
We study methods for reconstructing Bayesian uncertainties on dynamical mass estimates of galaxy clusters using convolutional neural networks (CNNs). We discuss the statistical background of approximate Bayesian neural networks and demonstrate how variational inference techniques can be used to perform computationally tractable posterior estimation for a variety of deep neural architectures. We explore how various model designs and statistical assumptions impact prediction accuracy and uncertainty reconstruction in the context of cluster mass estimation. We measure the quality of our model posterior recovery using a mock cluster observation catalog derived from the MultiDark simulation and UniverseMachine catalog. We show that approximate Bayesian CNNs produce highly accurate dynamical cluster mass posteriors. These model posteriors are log-normal in cluster mass and recover 68% and 90% confidence intervals to within 1% of their measured value. We note how this rigorous modeling of dynamical mass posteriors is necessary for using cluster abundance measurements to constrain cosmological parameters.
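
One computationally cheap way to approximate the Bayesian treatment described above is Monte Carlo dropout, in which dropout is kept active at prediction time and repeated stochastic forward passes approximate a distribution over outputs. The network below is a deliberately small, untrained stand-in used only to show the mechanics; it is not the architecture or inference scheme used for the cluster mass work.

```python
# Sketch of approximate Bayesian prediction with Monte Carlo dropout (illustrative only).
import torch
import torch.nn as nn

class DropoutRegressor(nn.Module):
    """Small CNN regressor with dropout layers that stay active at inference time."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.Dropout2d(0.2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.Dropout2d(0.2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Dropout(0.2), nn.Linear(32, 1))

    def forward(self, x):
        return self.head(self.features(x))

def mc_dropout_predict(model, x, n_samples=100):
    """Keep dropout on and summarize repeated stochastic forward passes."""
    model.train()                      # train() keeps dropout active; no gradients are taken
    with torch.no_grad():
        draws = torch.stack([model(x) for _ in range(n_samples)])
    return draws.mean(dim=0), draws.std(dim=0)           # predictive mean and spread

x = torch.randn(4, 1, 32, 32)                            # e.g. phase-space "images" of clusters
mean, std = mc_dropout_predict(DropoutRegressor(), x)
print("predicted log-mass:", mean.squeeze().tolist())
print("approx. 1-sigma uncertainty:", std.squeeze().tolist())
```
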
Flood-related risks to people and property are expected to increase in the future due to environmental and demographic changes. It is important to quantify and effectively communicate flood hazards and exposure to inform the design and implementation of flood risk management strategies. Here we develop an integrated modeling framework to assess projected changes in regional riverine flood inundation risks. The framework samples climate model outputs to force a hydrologic model and generate streamflow projections. Together with a statistical and hydraulic model, we use the projected streamflow to map the uncertainty of flood inundation projections for extreme flood events. We implement the framework for rivers across the state of Pennsylvania, United States. Our projections suggest that flood hazards and exposure across Pennsylvania are overall increasing with future climate change. Specific regions, including the main stem Susquehanna River, lower portion of the Allegheny basin and central portion of Delaware River basin, demonstrate higher flood inundation risks. In our analysis, the climate uncertainty dominates the overall uncertainty surrounding the flood inundation projection chain. The combined hydrologic and hydraulic uncertainties can account for as much as 37% of the total uncertainty. We discuss how this framework can provide regional and dynamic flood-risk assessments and help to inform the design of risk-management strategies.
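
The uncertainty attribution described above can be illustrated with a simple first-order variance decomposition over a hypothetical projection ensemble indexed by climate model, hydrologic parameter set, and hydraulic parameter set. The ensemble sizes and effect magnitudes below are synthetic assumptions, not the Pennsylvania results.

```python
# Sketch: attributing projection uncertainty to stages of a modeling chain (synthetic example).
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of 100-year flood depth projections [m], indexed by
# (climate model, hydrologic parameter set, hydraulic parameter set).
n_clim, n_hydro, n_hydra = 10, 8, 6
clim_effect  = rng.normal(0.0, 0.8, size=(n_clim, 1, 1))     # climate forcing spread
hydro_effect = rng.normal(0.0, 0.3, size=(1, n_hydro, 1))    # hydrologic parameter spread
hydra_effect = rng.normal(0.0, 0.2, size=(1, 1, n_hydra))    # hydraulic parameter spread
depth = 3.0 + clim_effect + hydro_effect + hydra_effect

total_var = depth.var()
# First-order variance contributions: variance of the mean taken over the other dimensions.
clim_var  = depth.mean(axis=(1, 2)).var()
hydro_var = depth.mean(axis=(0, 2)).var()
hydra_var = depth.mean(axis=(0, 1)).var()

for name, v in [("climate", clim_var), ("hydrologic", hydro_var), ("hydraulic", hydra_var)]:
    print(f"{name:>10s}: {100 * v / total_var:5.1f}% of total variance")
```
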
The vast majority of landslide susceptibility studies assume the slope instability process to be time-invariant, under the premise that the past and present are keys to the future. This assumption may generally be valid. However, the trigger, be it a rainfall or an earthquake event, clearly varies over time, and yet the temporal component of the trigger is rarely included in landslide susceptibility studies and is usually confined to hazard assessment. In this work, we investigate a population of landslides triggered by the 2017 Jiuzhaigou earthquake ($M_w = 6.5$), including the associated ground motion in the analyses, which are carried out at the Slope Unit (SU) level. We do this by implementing a Bayesian version of a Generalized Additive Model and assuming that slope instability across the SUs in the study area follows a Bernoulli probability distribution. This procedure would normally produce a susceptibility map that reflects the spatial pattern of the specific trigger and is therefore of limited use for land-use planning. However, we implement this first analytical step to reliably estimate the ground motion effect, and its distribution, on unstable SUs. We then assume the effect of the ground motion to be time-invariant, which enables statistical simulations for any ground motion scenario that occurred in the area from 1933 to 2017. As a result, we obtain the full spectrum of potential susceptibility patterns over the last century and compress this information into a susceptibility model/map representative of all possible ground motion patterns since 1933. These backward statistical simulations can also be run in the forward direction: by accounting for scenario-based ground motion, the model can estimate future unstable slopes.
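
A stripped-down version of the workflow sketched above: fit a Bayesian logistic (Bernoulli) model for slope-unit failure with a ground-motion covariate, then reuse the posterior of the (assumed time-invariant) ground-motion effect under a different scenario. The data, covariates, priors, and sampler settings are synthetic placeholders, not the Jiuzhaigou inventory or the full Generalized Additive Model.

```python
# Sketch: Bayesian Bernoulli (logistic) model for slope-unit instability with a ground-motion
# term, then simulation under a different ground-motion scenario (synthetic data only).
import numpy as np

rng = np.random.default_rng(2)

# --- Synthetic slope units: a static susceptibility covariate and observed ground motion ---
n_su = 500
slope = rng.normal(0.0, 1.0, n_su)           # e.g. standardized mean slope angle
pga_2017 = rng.gamma(2.0, 0.15, n_su)        # ground motion of the triggering event [g]
logit_true = -2.0 + 1.0 * slope + 4.0 * pga_2017
failed = rng.uniform(size=n_su) < 1.0 / (1.0 + np.exp(-logit_true))

def log_posterior(beta):
    """Bernoulli likelihood with a logit link and weak normal priors on the coefficients."""
    eta = beta[0] + beta[1] * slope + beta[2] * pga_2017
    loglik = np.sum(failed * eta - np.log1p(np.exp(eta)))
    return loglik - 0.5 * np.sum((beta / 10.0) ** 2)

# --- Random-walk Metropolis over (intercept, slope effect, ground-motion effect) ---
beta, logp, draws = np.zeros(3), log_posterior(np.zeros(3)), []
for i in range(15000):
    prop = beta + rng.normal(0, 0.08, 3)
    lp = log_posterior(prop)
    if np.log(rng.uniform()) < lp - logp:
        beta, logp = prop, lp
    if i >= 5000:                                        # discard burn-in
        draws.append(beta.copy())
draws = np.array(draws)

# --- Reuse the ground-motion effect under another (e.g. historical) scenario ---
pga_scenario = 0.5 * pga_2017                            # assumed weaker shaking pattern
eta = draws[:, [0]] + draws[:, [1]] * slope + draws[:, [2]] * pga_scenario
p_fail = 1.0 / (1.0 + np.exp(-eta))                      # posterior failure probabilities
print("mean failure probability under scenario:", p_fail.mean(axis=0)[:5])
```
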
In this paper we describe an algorithm for predicting the websites at risk in a long-range hacking activity, while jointly inferring the provenance and evolution of vulnerabilities on websites over continuous time. Specifically, we use hazard regression with a time-varying additive hazard function parameterized in a generalized linear form. The activation coefficients on each feature are continuous-time functions constrained with a total variation penalty inspired by hacking campaigns. We show that the optimal solution is a zeroth-order spline with a finite number of adaptively chosen knots and can be computed efficiently. Experiments on real data show that our method significantly outperforms classic methods while providing meaningful interpretability.
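
The key structural claim, that a total-variation penalty on a time-varying coefficient yields a piecewise-constant (zeroth-order spline) solution with adaptively chosen knots, can be seen in a much simpler setting. The sketch below fits a 1D coefficient path under a TV penalty with cvxpy; the squared-error objective, penalty weight, and synthetic signal are simplifying assumptions, not the paper's hazard-regression estimator.

```python
# Sketch: a total-variation penalty produces a piecewise-constant (zeroth-order spline) fit.
# Simplified 1D illustration, not the hazard-regression model of the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(3)

# Synthetic "time-varying activation coefficient": two abrupt shifts plus noise.
T = 200
true_beta = np.concatenate([np.full(80, 0.0), np.full(70, 1.5), np.full(50, 0.5)])
y = true_beta + rng.normal(0, 0.3, T)

beta = cp.Variable(T)
lam = 5.0                                            # penalty weight (assumption)
objective = cp.Minimize(cp.sum_squares(y - beta) + lam * cp.tv(beta))
cp.Problem(objective).solve()

est = beta.value
knots = np.flatnonzero(np.abs(np.diff(est)) > 1e-3)  # adaptively chosen change points
print("estimated change points:", knots)
```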