Recently, resilience has been increasingly used as a concept for understanding natural disaster systems. Landslides are among the most frequent geohazards in the Three Gorges Reservoir Area (TGRA). However, local disaster resilience is difficult to measure, owing to the TGRA's special geographical setting and the particular character of landslide disasters. Current approaches to evaluating disaster resilience are usually limited either by their qualitative methods or by the differing properties of different disaster types, so practical evaluation methods for disaster resilience are needed. In this study, we developed an indicator system to evaluate landslide disaster resilience in the TGRA at the county level. It captures two properties, inherent geological stress and external social response, summarized as physical stress and social forces. The evaluated disaster resilience can then be simulated with a fuzzy cognitive map (FCM) to explore strategies for improvement.
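As a rough illustration of how an FCM simulation of this kind can work, the Python sketch below iterates concept activations through a sigmoid-squashed weight matrix until they converge; the three concepts and all edge weights are invented placeholders, not values estimated in the study.

```python
import numpy as np

# Minimal fuzzy cognitive map (FCM) sketch. Concepts and edge weights are
# illustrative assumptions only, not quantities from the TGRA indicator system.
concepts = ["physical_stress", "social_forces", "resilience"]

# W[i, j]: causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, 0.0, -0.6],   # physical stress suppresses resilience
    [0.0, 0.0,  0.7],   # social response promotes resilience
    [0.0, 0.0,  0.0],
])

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def simulate(state, W, max_steps=50, tol=1e-5):
    """Iterate A(t+1) = f(A(t) + A(t) @ W) until the activations settle."""
    for _ in range(max_steps):
        new_state = sigmoid(state + state @ W)
        if np.max(np.abs(new_state - state)) < tol:
            break
        state = new_state
    return state

# Scenario: high geological stress, moderate social response.
initial = np.array([0.9, 0.5, 0.5])
print(dict(zip(concepts, simulate(initial, W).round(3))))
```

What-if strategies are then explored by changing the initial activations or edge weights and re-running the convergence loop.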
Prolonged power outages debilitate the economy and threaten public health. Existing research is generally limited in scope to a single event, outage cause, or region. Here, we provide one of the most comprehensive analyses of U.S. power outages for 2002--2019. We categorized all outage data collected under U.S. federal mandates into four outage causes and computed industry-standard reliability metrics. Our spatiotemporal analysis reveals six of the most resilient U.S. states since 2010, improving power resilience against natural hazards in the South and Northeast regions, and a disproportionately large number of human attacks relative to its population in the Western Electricity Coordinating Council region. Our regression analysis identifies several statistically significant predictors and hypotheses for power resilience. Furthermore, we propose a novel framework for analyzing outage data using differential weighting and influential points to better understand power resilience. We share curated data and code as Supplementary Materials.
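For concreteness, the sketch below computes two of the usual industry-standard reliability metrics, SAIFI and SAIDI (IEEE 1366), from event-level outage records; the column names and numbers are assumptions for illustration, not the paper's curated data.

```python
import pandas as pd

# Toy event-level outage records; fields and values are illustrative only.
events = pd.DataFrame({
    "state": ["TX", "TX", "CA"],
    "customers_affected": [120_000, 30_000, 55_000],
    "duration_min": [480, 90, 240],
})
customers_served = pd.Series({"TX": 11_000_000, "CA": 13_500_000})

# SAIFI: total customer interruptions per customer served.
saifi = events.groupby("state")["customers_affected"].sum() / customers_served

# SAIDI: total customer-minutes of interruption per customer served.
customer_minutes = events["customers_affected"] * events["duration_min"]
saidi = customer_minutes.groupby(events["state"]).sum() / customers_served

print(pd.DataFrame({"SAIFI": saifi, "SAIDI_min": saidi}))
```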
Hurricanes have caused power outages and blackouts affecting millions of customers and inducing severe social and economic impacts. The impacts of hurricane-caused blackouts may worsen under climate change, owing to increased heat extremes and possibly more intense hurricanes. We apply hurricane and heatwave projections together with power outage and recovery process analysis to investigate how the emerging hurricane-blackout-heatwave compound hazard may vary in a changing climate, using Harris County, Texas (which includes most of the City of Houston) as an example. We find that, under the high-emissions scenario RCP8.5, the expected percentage of customers experiencing at least one longer-than-5-day hurricane-induced power outage in a 20-year period would increase significantly, from 14% at the end of the 20th century to 44% at the end of the 21st century in Harris County. The expected percentage of customers who may experience at least one longer-than-5-day heatwave without power (to provide air conditioning) would increase alarmingly, from 0.8% to 15.5%. These increases in risk could be largely avoided under the stringent mitigation scenario RCP2.6. We also reveal that a moderate enhancement of critical sectors of the distribution network can significantly improve the resilience of the entire power grid and mitigate the risk of the future compound hazard. Together these findings suggest that, in addition to climate mitigation, climate adaptation actions are urgently needed to improve the resilience of coastal power systems.
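The headline figures have a simple back-of-envelope reading: if a customer faces a roughly constant annual probability $p$ of a longer-than-5-day outage, the chance of at least one such outage over 20 years is $1 - (1 - p)^{20}$. The sketch below, which is not the paper's hazard model, recovers annual probabilities consistent with the quoted 14% and 44% risks.

```python
# Back-of-envelope check, assuming independent years with constant annual risk.
def risk_over_horizon(p_annual, years=20):
    """P(at least one >5-day outage) over the horizon."""
    return 1.0 - (1.0 - p_annual) ** years

# Annual probabilities (assumed here) that reproduce the quoted 20-year risks.
for label, p in [("late-20th-century climate", 0.0075),
                 ("late-21st-century, RCP8.5", 0.0286)]:
    print(f"{label}: annual p = {p:.2%}, 20-year risk = {risk_over_horizon(p):.0%}")
```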
The vast majority of landslide susceptibility studies assume the slope instability process to be time-invariant, under the premise that the past and present are keys to the future. This assumption may generally be valid; however, the trigger, be it a rainfall or an earthquake event, clearly varies over time. Yet the temporal component of the trigger is rarely included in landslide susceptibility studies and is confined to hazard assessment instead. In this work, we investigate a population of landslides triggered by the 2017 Jiuzhaigou earthquake ($M_w = 6.5$), including the associated ground motion in analyses carried out at the Slope Unit (SU) level. We do this by implementing a Bayesian version of a Generalized Additive Model, assuming that slope instability across the SUs in the study area follows a Bernoulli probability distribution. On its own, this procedure would produce a susceptibility map reflecting the spatial pattern of the specific trigger, and therefore of limited use for land-use planning. We instead use this first analytical step to reliably estimate the ground motion effect, and its distribution, on unstable SUs. We then assume the effect of the ground motion to be time-invariant, enabling statistical simulations for any ground motion scenario that occurred in the area from 1933 to 2017. As a result, we obtain the full spectrum of potential susceptibility patterns over the last century and compress this information into a susceptibility model/map representative of all possible ground motion patterns since 1933. These backward statistical simulations can equally be run in the opposite, forward direction: by supplying scenario-based ground motion, one can estimate future unstable slopes.
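A hedged sketch of the statistical core, on synthetic data: a plain Bayesian Bernoulli regression in PyMC stands in for the paper's Generalized Additive Model, estimating a ground-motion coefficient once and then, under the time-invariance assumption, reusing its posterior to score the same slope units under a different shaking scenario. All variable names and values below are invented for illustration.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_su = 500  # number of slope units (SUs); all data below are synthetic

pga_2017 = rng.lognormal(-1.0, 0.5, n_su)   # ground motion (PGA) per SU
steep = rng.normal(30, 8, n_su)             # a static morphometric covariate
p_true = 1 / (1 + np.exp(-(-4.0 + 2.5 * pga_2017 + 0.05 * steep)))
failed = rng.binomial(1, p_true)            # observed SU instability (0/1)

with pm.Model():
    b0 = pm.Normal("b0", 0, 5)
    b_pga = pm.Normal("b_pga", 0, 5)        # ground-motion effect of interest
    b_st = pm.Normal("b_st", 0, 5)
    p = pm.math.sigmoid(b0 + b_pga * pga_2017 + b_st * steep)
    pm.Bernoulli("obs", p=p, observed=failed)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# Time-invariance step: apply the posterior coefficients to another scenario,
# e.g. a weaker shaking pattern (a synthetic stand-in for a 1933-2017 event).
post = idata.posterior
pga_hist = rng.lognormal(-1.4, 0.5, n_su)
logit = (post["b0"].values[..., None]
         + post["b_pga"].values[..., None] * pga_hist
         + post["b_st"].values[..., None] * steep)
print("mean susceptibility under historical shaking:",
      float((1 / (1 + np.exp(-logit))).mean()))
```

Averaging such posterior predictions over many scenarios is what compresses the century of ground-motion patterns into a single susceptibility map.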
A mathematical model of COVID-19 pandemic spread is presented, integrating age-structured Susceptible-Exposed-Infected-Recovered-Deceased (SEIRD) dynamics with real mobile phone data accounting for population mobility. The dynamical model is calibrated via Approximate Bayesian Computation. Optimal lockdown and exit strategies are determined using nonlinear model predictive control, subject to public-health and socio-economic constraints. Through extensive computational validation of the methodology, we show that it is possible to compute robust exit strategies with realistic reduced-mobility values to inform public policy making, and we demonstrate the applicability of the methodology using datasets from England and France. Code implementing the described experiments is available at https://github.com/OptimalLockdown.
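For orientation, here is a minimal single-group SEIRD integration with SciPy; the paper's model is age-structured and driven by mobile-phone mobility data, and every parameter below is an illustrative assumption rather than a fitted value.

```python
import numpy as np
from scipy.integrate import solve_ivp

def seird(t, y, beta, sigma, gamma, mu):
    """Single-group SEIRD right-hand side (no age structure or mobility)."""
    S, E, I, R, D = y
    N = S + E + I + R
    return [-beta * S * I / N,              # dS/dt: new exposures
            beta * S * I / N - sigma * E,   # dE/dt: incubation
            sigma * E - (gamma + mu) * I,   # dI/dt: recovery or death
            gamma * I,                      # dR/dt
            mu * I]                         # dD/dt

N0 = 56_000_000                         # England-scale population (assumed)
y0 = [N0 - 100, 50, 50, 0, 0]
params = (0.3, 1 / 5.2, 1 / 10, 0.001)  # beta, sigma, gamma, mu (assumed)

sol = solve_ivp(seird, (0, 180), y0, args=params,
                t_eval=np.linspace(0, 180, 181))
print("peak infected:", f"{sol.y[2].max():,.0f}")
```

A lockdown enters such a model as a time-varying transmission rate $\beta(t)$, which is the kind of control input a nonlinear model predictive controller optimizes over.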
Recent years have witnessed the proliferation of Low-Power Wide-Area Networks (LPWANs) in the unlicensed band for various Internet-of-Things (IoT) applications. Due to their ultra-low transmission power and long transmission duration, LPWAN devices inevitably suffer from high-power Cross-Technology Interference (CTI), such as interference from Wi-Fi coexisting in the same spectrum. To alleviate this issue, this paper introduces the Partial Symbol Recovery (PSR) scheme for improving the CTI resilience of LPWAN. We verify our idea on LoRa, a widely adopted LPWAN technology, as a proof of concept. At the PHY layer, although CTI has much higher power, its duration is relatively short compared with LoRa symbols, leaving part of a LoRa symbol uncorrupted. Moreover, owing to their high redundancy, the LoRa chips within a symbol are highly correlated, which opens the possibility of detecting a LoRa symbol from only a subset of its chips. By examining the unique frequency patterns of LoRa symbols with time-frequency analysis, our design effectively detects the clean LoRa chips that are free of CTI, enabling PSR to rely only on clean chips to recover from communication failures. We evaluate our PSR design on real-world testbeds, including SX1280 LoRa chips and a USRP B210, under Wi-Fi interference in various scenarios. Extensive experiments demonstrate that our design offers reliable packet recovery, boosting the LoRa packet reception ratio from 45.2% to 82.2%, a 1.8x performance gain.
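To make the dechirp-and-mask idea concrete, the sketch below demodulates one synthetic LoRa symbol from the FFT peak of its dechirped samples after zeroing the chips flagged as CTI-corrupted; the spreading factor, interference burst, and clean-chip mask are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
SF = 7                 # spreading factor (assumed)
N = 2 ** SF            # chips (samples) per symbol, sampling at FS = BW

n = np.arange(N)

def lora_symbol(sym):
    """Discrete baseband up-chirp; the symbol value sets the starting
    frequency, which wraps naturally via aliasing when FS = BW."""
    return np.exp(2j * np.pi * (sym * n / N + n**2 / (2 * N)))

downchirp = np.conj(lora_symbol(0))

tx_sym = 42
rx = lora_symbol(tx_sym)
burst = N // 3                                    # CTI hits the first third
rx[:burst] += 5 * np.exp(2j * np.pi * rng.random(burst))

# Suppose time-frequency analysis has flagged the corrupted chips:
clean = np.ones(N, dtype=bool)
clean[:burst] = False

# Dechirp, zero the corrupted chips, and read the symbol off the FFT peak.
dechirped = rx * downchirp
dechirped[~clean] = 0
print("sent:", tx_sym, "decoded:", int(np.argmax(np.abs(np.fft.fft(dechirped)))))
```

Because the dechirped signal is a single tone across all chips, the FFT peak still lands on the correct bin even when a third of the samples are discarded, which is the redundancy PSR exploits.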