Fluvial floods pose severe risks to riverine communities, and there is strong evidence of increasing flood hazards in many regions around the world. The choice of methods and assumptions used in flood hazard estimation can affect the design of risk management strategies. In this study, we characterize expected flood hazards conditioned on uncertain model structures, model parameters, and prior parameter distributions. We construct a Bayesian framework for estimating river stage return levels using a nonstationary statistical model that relies exclusively on the Indian Ocean Dipole index. We show that ignoring these uncertainties can bias estimates of expected flood hazards, and we find that parametric uncertainty is more influential than the choice of model structure or priors. Our results highlight the importance of incorporating uncertainty into river stage estimates and are of practical use for informing water infrastructure design in a changing climate.
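The kind of nonstationary model described above can be sketched in a few lines: a generalized extreme value (GEV) distribution whose location parameter varies linearly with the covariate (here, the Indian Ocean Dipole index). The sketch below uses synthetic data and maximum likelihood rather than the paper's full Bayesian treatment, so the model form, coefficient values, and data are illustrative assumptions only, not the authors' actual setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic annual IOD index values and river stage maxima (illustrative only)
n = 60
iod = rng.normal(0.0, 1.0, n)
true_mu = 3.0 + 0.5 * iod  # location parameter shifts with the covariate
stage = genextreme.rvs(c=-0.1, loc=true_mu, scale=0.8, random_state=rng)

def neg_log_lik(theta):
    """Nonstationary GEV: mu(t) = b0 + b1 * IOD(t); constant scale and shape."""
    b0, b1, log_sigma, xi = theta
    mu = b0 + b1 * iod
    # scipy parameterizes the GEV shape as c = -xi
    ll = genextreme.logpdf(stage, c=-xi, loc=mu, scale=np.exp(log_sigma))
    return -np.sum(ll)

fit = minimize(neg_log_lik, x0=[3.0, 0.0, 0.0, 0.0], method="Nelder-Mead")
b0, b1, log_sigma, xi = fit.x

def return_level(iod_value, T=100):
    """Conditional T-year return level given a value of the IOD index."""
    mu = b0 + b1 * iod_value
    return genextreme.ppf(1 - 1 / T, c=-xi, loc=mu, scale=np.exp(log_sigma))

print(return_level(1.0))  # 100-year stage conditional on a positive-IOD year
```

In a Bayesian version, the four parameters would instead receive priors and be sampled (e.g., via MCMC), yielding a posterior distribution of return levels whose spread reflects the parametric uncertainty the abstract identifies as most influential.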