
From scenario-based seismic hazard to scenario-based landslide hazard: rewinding to the past via statistical simulations

Added by Luigi Lombardo
Publication date: 2020
Language: English





The vast majority of landslide susceptibility studies assume the slope instability process to be time-invariant, under the premise that the past and present are keys to the future. This assumption may generally be valid. However, the trigger, be it a rainfall or an earthquake event, clearly varies over time. And yet, the temporal component of the trigger is rarely included in landslide susceptibility studies and is usually confined to hazard assessment. In this work, we investigate a population of landslides triggered in response to the 2017 Jiuzhaigou earthquake ($M_w = 6.5$), including the associated ground motion in the analyses, which are carried out at the Slope Unit (SU) level. We do this by implementing a Bayesian version of a Generalized Additive Model and assuming that the slope instability across the SUs in the study area behaves according to a Bernoulli probability distribution. This procedure would generally produce a susceptibility map reflecting the spatial pattern of the specific trigger, and therefore of limited use for land-use planning. However, we implement this first analytical step to reliably estimate the effect of the ground motion, and its distribution, on unstable SUs. We then assume the effect of the ground motion to be time-invariant, enabling statistical simulations for any ground motion scenario that occurred in the area from 1933 to 2017. As a result, we obtain the full spectrum of potential susceptibility patterns over the last century and compress this information into a susceptibility model/map representative of all the possible ground motion patterns since 1933. These backward statistical simulations can also be run in the forward direction: by accounting for scenario-based ground motion, one can estimate future unstable slopes.
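The core idea of the abstract — a Bernoulli probability of instability per slope unit, with a time-invariant terrain term plus a ground-motion effect, replayed under different scenarios — can be sketched as follows. This is a minimal illustration, not the paper's fitted model: the number of slope units, the terrain term, the intercept, and the coefficient `beta_pga` are all assumed values for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical slope-unit (SU) data: a static, time-invariant terrain
# effect per SU and an assumed ground-motion coefficient. These numbers
# are illustrative, not estimates from the study.
n_su = 1000
terrain_effect = rng.normal(0.0, 1.0, n_su)  # time-invariant SU term
beta_pga = 1.2                               # assumed ground-motion effect

def susceptibility(pga):
    """Bernoulli probability of instability per slope unit for a given
    ground-motion scenario, via a logit link (as in a logistic GAM)."""
    eta = -2.0 + terrain_effect + beta_pga * pga
    return 1.0 / (1.0 + np.exp(-eta))

# Replay two ground-motion scenarios (weak vs strong shaking) and
# simulate landslide presence/absence across the slope units.
for scenario_pga in (0.1, 0.6):
    p = susceptibility(np.full(n_su, scenario_pga))
    unstable = rng.binomial(1, p)
    print(f"PGA={scenario_pga}: expected unstable fraction = {p.mean():.3f}")
```

Looping this over a catalogue of historical ground-motion fields (1933 to 2017 in the paper) yields the ensemble of susceptibility patterns that the authors compress into a single scenario-spanning map.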

Related research

In this paper we describe an algorithm for predicting the websites at risk in a long-range hacking activity, while jointly inferring the provenance and evolution of vulnerabilities on websites over continuous time. Specifically, we use hazard regression with a time-varying additive hazard function parameterized in a generalized linear form. The activation coefficients on each feature are continuous-time functions constrained with a total variation penalty inspired by hacking campaigns. We show that the optimal solution is a 0th-order spline with a finite number of adaptively chosen knots, and can be solved efficiently. Experiments on real data show that our method significantly outperforms classic methods while providing meaningful interpretability.
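The key structural result above — that a total-variation-penalized, time-varying coefficient is a 0th-order spline, i.e. piecewise constant between adaptively chosen knots — can be illustrated with a small sketch. The knot locations, coefficient levels, and softplus link below are assumptions for demonstration, not the paper's fitted solution.

```python
import numpy as np

# Hypothetical piecewise-constant (0th-order spline) activation
# coefficient for one feature: change-points and levels are illustrative.
knots = np.array([0.0, 3.0, 7.0])    # adaptively chosen knot locations
values = np.array([0.2, 1.5, 0.4])   # coefficient level on each segment

def beta(t):
    """Evaluate the 0th-order-spline coefficient at time t."""
    idx = np.searchsorted(knots, t, side="right") - 1
    return values[idx]

def hazard(t, x):
    """Time-varying additive hazard in generalized linear form for a
    single feature x, with a softplus link to keep the hazard positive."""
    eta = beta(np.asarray(t)) * x
    return np.log1p(np.exp(eta))

# The same feature value contributes different hazard on each segment.
print(hazard(1.0, 2.0), hazard(5.0, 2.0))
```

Between knots the coefficient is flat, which is what the total variation penalty induces and what makes the inferred activation periods directly interpretable as campaign intervals.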
Fluvial floods drive severe risk to riverine communities. There is strong evidence of increasing flood hazards in many regions around the world. The choice of methods and assumptions used in flood hazard estimates can impact the design of risk management strategies. In this study, we characterize the expected flood hazards conditioned on the uncertain model structures, model parameters, and prior distributions of the parameters. We construct a Bayesian framework for river stage return level estimation using a nonstationary statistical model that relies exclusively on the Indian Ocean Dipole index. We show that ignoring uncertainties can lead to biased estimation of expected flood hazards. We find that the considered model parametric uncertainty is more influential than model structure and model priors. Our results highlight the importance of incorporating uncertainty in river stage estimates, and are of practical use for informing water infrastructure designs in a changing climate.
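A common way to build such a nonstationary return-level model is to let a GEV location parameter depend on the climate covariate, here the Indian Ocean Dipole (IOD) index. The sketch below shows the mechanics under assumed parameter values; the paper's actual model structure and fitted parameters may differ.

```python
import numpy as np

# Hypothetical nonstationary GEV for annual-maximum river stage:
# the location parameter varies linearly with the IOD index.
# All parameter values are illustrative, not fitted estimates.
mu0, mu1 = 5.0, 0.8    # location: mu = mu0 + mu1 * IOD
sigma, xi = 1.2, 0.1   # scale and shape (assumed stationary)

def return_level(T, iod):
    """T-year return level of a GEV whose location depends on the
    IOD index (standard GEV quantile formula, xi != 0)."""
    mu = mu0 + mu1 * iod
    yp = -np.log(1.0 - 1.0 / T)          # Gumbel reduced variate
    return mu + (sigma / xi) * (yp ** (-xi) - 1.0)

# 100-year stage under neutral vs strongly positive IOD conditions.
print(return_level(100, 0.0), return_level(100, 1.5))
```

In the Bayesian setting described above, one would evaluate this quantile over posterior draws of (mu0, mu1, sigma, xi) and across candidate model structures, so that the reported return level integrates the parametric and structural uncertainty rather than conditioning on a single best fit.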
In the last decade, scenario-based serious games have become a main tool for learning new skills and capabilities. An important factor in the development of such systems is the overhead in time, cost, and human resources to manually create the content for these scenarios. We focus on how to create content for scenarios in medical, military, commerce, and gaming applications, where maintaining the integrity and coherence of the content is integral to the system's success. To do so, we present an automatic method for generating content about everyday activities by combining computer science techniques with the crowd. We use the crowd in three basic ways: to capture a database of scenarios of everyday activities, to generate a database of likely replacements for specific events within those scenarios, and to evaluate the resulting scenarios. We found that the generated scenarios were rated as reliable and consistent by the crowd when compared to the scenarios that were originally captured. We also compared the generated scenarios to those created by traditional planning techniques. We found that both methods were equally effective in generating reliable and consistent scenarios, yet the main advantage of our approach is that the content we generate is more varied and much easier to create. We have begun integrating this approach within a scenario-based training application for novice investigators within law enforcement departments to improve their questioning skills.
We describe algorithms for creating probabilistic scenarios for the situation when the underlying forecast methodology is modeled as being more (or less) accurate than it has been historically. Such scenarios can be used in studies that extend into the future and may need to consider the possibility that forecast technology will improve. Our approach can also be used to generate alternative realizations of renewable energy production that are consistent with historical forecast accuracy, in effect serving as a method for creating families of realistic alternatives, which are often critical in simulation-based analysis methodologies.
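One simple way to realize the idea above — generating scenarios consistent with historical forecast accuracy, while allowing that accuracy to be dialed up or down — is to resample historical forecast errors and rescale them. The data and the `accuracy_factor` knob below are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical forecasts and actuals for renewable output
# (MW); the error sample is what anchors scenarios to observed skill.
forecasts = np.array([50.0, 62.0, 40.0, 71.0, 55.0])
actuals = np.array([47.0, 66.0, 35.0, 70.0, 60.0])
errors = actuals - forecasts

def simulate_scenarios(accuracy_factor, n=3):
    """Sample alternative realizations by adding resampled historical
    errors, rescaled: factor < 1 models improved forecast accuracy,
    factor > 1 models degraded accuracy, factor = 1 matches history."""
    sampled = rng.choice(errors, size=(n, forecasts.size), replace=True)
    return forecasts + accuracy_factor * sampled

print(simulate_scenarios(0.5))  # scenarios with halved forecast error
```

Shrinking the factor toward zero collapses every scenario onto the forecast itself, which is the limiting case of a perfectly accurate future forecasting technology.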
Hurricanes have caused power outages and blackouts, affecting millions of customers and inducing severe social and economic impacts. The impacts of hurricane-caused blackouts may worsen due to increased heat extremes and possibly increased hurricanes under climate change. We apply hurricane and heatwave projections with power outage and recovery process analysis to investigate how the emerging hurricane-blackout-heatwave compound hazard may vary in a changing climate, using Harris County in Texas (which includes the major part of the city of Houston) as an example. We find that, under the high-emissions scenario RCP8.5, the expected percentage of customers experiencing at least one longer-than-5-day hurricane-induced power outage in a 20-year period would increase significantly, from 14% at the end of the 20th century to 44% at the end of the 21st century in Harris County. The expected percentage of customers who may experience at least one longer-than-5-day heatwave without power (to provide air conditioning) would increase alarmingly, from 0.8% to 15.5%. These increases in risk may be largely avoided if the climate is well controlled under the stringent mitigation scenario RCP2.6. We also reveal that a moderate enhancement of critical sectors of the distribution network can significantly improve the resilience of the entire power grid and mitigate the risk of the future compound hazard. Together these findings suggest that, in addition to climate mitigation, climate adaptation actions are urgently needed to improve the resilience of coastal power systems.
