
Magnitude Of Earthquakes Controls The Size Distribution Of Their Triggered Events

Published by: Shyam Nandan
Publication date: 2019
Research field: Physics
Paper language: English





The driving concept behind one of the most successful statistical forecasting models, the ETAS model, has been that seismicity is driven by spontaneously occurring background earthquakes that cascade into multitudes of triggered earthquakes. In nearly all generalizations of the ETAS model, the magnitudes of the background and the triggered earthquakes are assumed to follow the Gutenberg-Richter law with the same exponent (β-value). Furthermore, the magnitudes of the triggered earthquakes are always assumed to be independent of the magnitude of the triggering earthquake. Using an EM algorithm applied to the Californian earthquake catalogue, we show that the distribution of earthquake magnitudes exhibits three distinct β-values: β_b for background events, and β_a - δ and β_a + δ, respectively, for triggered events below and above the magnitude of the triggering earthquake; the last two values express a correlation between the magnitudes of triggered events and that of the triggering earthquake, a feature so far absent in all proposed operational generalizations of the ETAS model. The ETAS model incorporating this kinked magnitude distribution provides by far the best description of seismic catalogs and could thus have the best forecasting potential. We speculate that the kinked magnitude distribution may result from the system tending to restore the symmetry of the regional displacement gradient tensor that has been broken by the initiating event. The general emerging concept could be that, while the background events occur primarily to accommodate the symmetric stress tensor at the boundaries of the system, the triggered earthquakes are quasi-Goldstone fluctuations of a self-organized critical deformation state.
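To make the kinked magnitude distribution concrete, the following minimal sketch (not the authors' code; the function name, cutoff, and parameter values are illustrative assumptions) samples triggered-event magnitudes from a piecewise exponential Gutenberg-Richter density whose rate switches from β_a - δ below the triggering magnitude to β_a + δ above it, with the density kept continuous at the kink.

```python
import numpy as np

def sample_kinked_magnitudes(m_trig, m0=2.0, beta_a=2.3, delta=0.5,
                             size=1000, seed=0):
    """Sample triggered-event magnitudes from a 'kinked' Gutenberg-Richter law:
    exponential rate (beta_a - delta) below the triggering magnitude m_trig and
    (beta_a + delta) above it, continuous at the kink. Assumes m_trig >= m0;
    all parameter values are illustrative, not those fitted in the paper."""
    rng = np.random.default_rng(seed)
    b_lo, b_hi = beta_a - delta, beta_a + delta
    x_k = m_trig - m0                                  # kink position above the cutoff m0
    mass_lo = (1.0 - np.exp(-b_lo * x_k)) / b_lo       # unnormalized mass below the kink
    mass_hi = np.exp(-b_lo * x_k) / b_hi               # unnormalized mass above the kink
    p_lo = mass_lo / (mass_lo + mass_hi)               # probability of a magnitude below the kink
    u = rng.random(size)
    below = u < p_lo
    m = np.empty(size)
    # Inverse-CDF sampling on each exponential branch
    u_lo = u[below] / p_lo
    m[below] = m0 - np.log(1.0 - u_lo * (1.0 - np.exp(-b_lo * x_k))) / b_lo
    u_hi = (u[~below] - p_lo) / (1.0 - p_lo)
    m[~below] = m_trig - np.log(1.0 - u_hi) / b_hi
    return m

# Example: magnitudes triggered by an M 5.0 event above a completeness cutoff of 2.0
mags = sample_kinked_magnitudes(m_trig=5.0)
```

In the same spirit, background magnitudes would be drawn from a single exponential with rate β_b above the completeness cutoff, i.e. the standard Gutenberg-Richter sampler.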


Read also

In a manner fundamentally related to the UV divergence problem in physics, the conventional wisdom in seismology is that the smallest earthquakes, which are numerous and often go undetected, dominate the triggering of major earthquakes, making the prediction of the latter difficult if not inherently impossible. By developing a rigorous validation procedure, we show that, in fact, large earthquakes (above magnitude 6.3 in California) are preferentially triggered by large events. Because of the magnitude correlations intrinsic to the validated model, we further rationalize the existence of earthquake doublets. These findings have far-reaching implications for short- and medium-term seismic risk assessment, as well as for the development of a deeper theory without a UV cut-off that is locally self-similar.
A. Saichev (2004)
We report an empirical determination of the probability density functions P(r) of the number r of earthquakes in finite space-time windows for the California catalog, over fixed spatial boxes of 5 x 5 km^2 and time intervals dt = 1, 10, 100 and 1000 days. We find a stable power-law tail P(r) ~ 1/r^{1+μ} with exponent μ ≈ 1.6 for all time intervals. These observations are explained by a simple stochastic branching process previously studied by many authors, the ETAS (epidemic-type aftershock sequence) model, which assumes that each earthquake can trigger other earthquakes (aftershocks). An aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. We develop the full theory in terms of generating functions for describing the space-time organization of earthquake sequences and develop several approximations to solve the equations. The calibration of the theory to the empirical observations shows that it is essential to augment the ETAS model by taking account of the pre-existing frozen heterogeneity of spontaneous earthquake sources. This seems natural in view of the complex multi-scale nature of fault networks, on which earthquakes nucleate. Our extended theory is able to account for the empirical observations satisfactorily. In particular, the adjustable parameters are determined by fitting the largest time window dt = 1000 days and are then used as frozen in the formulas for the other time scales, with very good agreement with the empirical data.
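For concreteness, here is a minimal sketch (the column names and the crude flat-Earth conversion are assumptions, not the authors' procedure) of how such an empirical P(r) can be tabulated by binning a catalog into 5 x 5 km^2 boxes and fixed time windows:

```python
import numpy as np

def empirical_count_distribution(lon, lat, t_days, box_km=5.0, dt_days=1000.0):
    """Distribution of the number r of events per (space box, time window) cell.
    lon/lat in degrees, t_days in days; the degree-to-km conversion is a rough
    sketch adequate only for a regional catalog such as California's."""
    lon, lat, t_days = map(np.asarray, (lon, lat, t_days))
    km_per_deg = 111.0
    ix = np.floor(lon * km_per_deg * np.cos(np.deg2rad(np.mean(lat))) / box_km).astype(int)
    iy = np.floor(lat * km_per_deg / box_km).astype(int)
    it = np.floor(t_days / dt_days).astype(int)
    # Count events falling into each occupied space-time cell
    _, counts = np.unique(np.stack([ix, iy, it]), axis=1, return_counts=True)
    r, n_cells = np.unique(counts, return_counts=True)
    return r, n_cells / n_cells.sum()   # empirical P(r) over occupied cells
```

The tail exponent μ ≈ 1.6 quoted above would then be estimated from a log-log fit of the large-r tail; normalizing over occupied cells only is a simplification relative to the paper's construction.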
The distribution of recurrence times, or return intervals between extreme events, is important for characterizing and understanding the behavior of physical systems and phenomena in many disciplines. It is well known that many physical processes in nature and society display long-range correlations. Hence, in the last few years, considerable research effort has been directed towards studying the distribution of return intervals for long-range correlated time series. Based on numerical simulations, it was shown that the return interval distributions are of stretched-exponential type. In this paper, we obtain an analytical expression for the distribution of return intervals in long-range correlated time series which holds when the average return intervals are large. We show that the distribution is actually a product of a power law and a stretched exponential form. We also discuss the regimes of validity and perform detailed studies of how the return interval distribution depends on the threshold used to define extreme events.
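As a numerical companion to the simulations mentioned above (the spectral exponent, threshold, and function names are illustrative assumptions, not the paper's setup), one can generate a long-range correlated Gaussian series by Fourier filtering and collect the return intervals between threshold exceedances:

```python
import numpy as np

def long_range_correlated_series(n, beta=0.7, seed=0):
    """Zero-mean, unit-variance Gaussian-like series with power-law spectrum
    S(f) ~ f^{-beta}, built by Fourier filtering with random phases (a simple
    standard recipe for long-range correlated surrogates)."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)            # amplitude ~ f^{-beta/2}
    phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
    x = np.fft.irfft(amp * np.exp(1j * phases), n=n)
    return (x - x.mean()) / x.std()

def return_intervals(x, q):
    """Return intervals between successive exceedances of threshold q."""
    idx = np.flatnonzero(x > q)
    return np.diff(idx)

x = long_range_correlated_series(2**20)
r = return_intervals(x, q=2.0)   # extreme events defined as x > 2 standard deviations
```

Histogramming r for several thresholds q then shows how the return interval distribution depends on the definition of an extreme event, which is the dependence the paper studies analytically.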
Declustering aims to divide earthquake catalogs into independent events (mainshocks) and dependent (clustered) events, and is an integral component of many seismicity studies, including seismic hazard assessment. We assess the effect of declustering on the frequency-magnitude distribution of mainshocks. In particular, we examine the dependence of the b-value of declustered catalogs on the choice of declustering approach and on algorithm-specific parameters. Using the catalog of earthquakes in California since 1980, we show that the b-value decreases by up to 30% after declustering, relative to the undeclustered catalog. The extent of the reduction depends strongly on the declustering method and parameters applied. We then reproduce a similar effect by declustering synthetic earthquake catalogs with known b-value, generated using an Epidemic-Type Aftershock Sequence (ETAS) model. Our analysis suggests that the observed decrease in b-value must, at least partially, arise from the application of the declustering algorithm to the catalog, rather than from differences in the nature of mainshocks versus fore- or aftershocks. We conclude that declustering should be considered a potential source of bias in seismicity and hazard studies.
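To make the b-value comparison concrete, here is a minimal sketch using Aki's (1965) maximum-likelihood estimator with the usual half-bin correction; the catalog variables in the commented usage are hypothetical placeholders, and the completeness magnitude is an assumed value.

```python
import numpy as np

def aki_b_value(mags, m_c, dm=0.1):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes binned at dm,
    using the standard half-bin correction of the completeness magnitude m_c."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]                         # keep only events above completeness
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Illustrative comparison of the full catalog with a declustered mainshock set
# (the variable names 'catalog' and 'mainshocks' are placeholders):
# b_full = aki_b_value(catalog["magnitude"], m_c=3.0)
# b_main = aki_b_value(mainshocks["magnitude"], m_c=3.0)
```

Comparing such estimates before and after declustering, and across declustering algorithms and parameter choices, is the kind of sensitivity check the study describes.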
We report a similarity of fluctuations in equilibrium critical phenomena and non-equilibrium systems, which is based on the concept of natural time. The worldwide seismicity, as well as that of the San Andreas fault system and of Japan, is analyzed. An order parameter is chosen and its fluctuations relative to the standard deviation of the distribution are studied. We find that the scaled distributions fall on the same curve, which interestingly exhibits, over four orders of magnitude, features similar to those in several equilibrium critical phenomena (e.g., the 2D Ising model) as well as in non-equilibrium systems (e.g., 3D turbulent flow).
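The rescaling step underlying such a data collapse can be sketched as follows (a generic illustration, not the natural-time analysis itself; the function name is an assumption): each sample of the chosen order parameter is standardized by its own mean and standard deviation so that curves from different systems can be overlaid.

```python
import numpy as np

def scaled_fluctuation_histogram(order_param, bins=50):
    """Histogram of an order-parameter sample rescaled by its own mean and
    standard deviation, so that distributions from different systems can be
    overlaid to test for a common scaling function."""
    x = np.asarray(order_param, dtype=float)
    z = (x - x.mean()) / x.std()
    density, edges = np.histogram(z, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density
```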