
Apparent Clustering and Apparent Background Earthquakes Biased by Undetected Seismicity

Posted by Max Werner
Publication date: 2005
Research field: Physics
Paper language: English
By D. Sornette





In models of triggered seismicity and in their inversion with empirical data, the detection threshold $m_d$ is commonly equated to the magnitude $m_0$ of the smallest triggering earthquake. This unjustified assumption neglects the possibility that shocks below the detection threshold trigger observable events. We introduce a formalism that distinguishes between the detection threshold $m_d$ and the minimum triggering earthquake $m_0 < m_d$. By considering the branching structure of one complete cascade of triggered events, we derive the apparent branching ratio $n_a$ (the apparent fraction of aftershocks in a given catalog) and the apparent background source $S_a$ that are observed when only the structure above the detection threshold $m_d$ is known, because smaller undetected events can trigger larger, observable ones. If earthquake triggering is controlled in large part by the smallest magnitudes, as several recent analyses have shown, previous estimates of the clustering parameters may significantly underestimate the true values: for instance, an observed fraction of 55% aftershocks renormalizes into a true fraction of 75% triggered events.
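The direction of this bias can be illustrated with a small Monte Carlo sketch rather than the paper's analytical derivation: a magnitude-dependent branching process is simulated down to $m_0$, and every detected event ($m \ge m_d$) whose ancestry contains no detected event is misread as "background". All parameter values below ($b$, $\alpha$, $n$, $m_0$, $m_d$) are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed values, not the paper's calibration):
b, alpha = 1.0, 0.8    # Gutenberg-Richter and productivity exponents (alpha < b)
m0, md = 0.0, 2.0      # smallest triggering magnitude, detection threshold
n = 0.75               # true branching ratio: mean direct aftershocks per event

beta = b * np.log(10.0)

def gr_magnitudes(size):
    """Gutenberg-Richter magnitudes: exponential with rate beta above m0."""
    return m0 + rng.exponential(1.0 / beta, size)

def mean_offspring(m):
    """Productivity ~ 10**(alpha*(m - m0)), normalized so that its average
    over GR magnitudes equals n (uses E[10**(alpha*(M-m0))] = b/(b-alpha))."""
    return n * (b - alpha) / b * 10.0 ** (alpha * (m - m0))

def cascade(cap=50_000):
    """One complete cascade from a single background event. Returns each
    event's magnitude and whether it has a detected (m >= md) ancestor."""
    mags = list(gr_magnitudes(1))
    has_det_anc = [False]                 # the background root has no ancestor
    i = 0
    while i < len(mags) and len(mags) < cap:
        kids = rng.poisson(mean_offspring(mags[i]))
        flag = has_det_anc[i] or mags[i] >= md
        for m in gr_magnitudes(kids):
            mags.append(m)
            has_det_anc.append(flag)
        i += 1
    return np.array(mags), np.array(has_det_anc)

aftershocks = background = 0
for _ in range(20_000):
    mags, det_anc = cascade()
    detected = mags >= md
    aftershocks += np.sum(detected & det_anc)   # recognizable as triggered
    background += np.sum(detected & ~det_anc)   # misread as background

n_a = aftershocks / (aftershocks + background)
print(f"apparent fraction of aftershocks above m_d: {n_a:.2f} (true n = {n})")
```

With these illustrative numbers the printed apparent fraction should fall well below the true $n$, in the direction of the 55% versus 75% example quoted in the abstract.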




Read also

In a way fundamentally related to the UV divergence problem in physics, conventional wisdom in seismology holds that the smallest earthquakes, which are numerous and often go undetected, dominate the triggering of major earthquakes, making the prediction of the latter difficult if not inherently impossible. By developing a rigorous validation procedure, we show that, in fact, large earthquakes (above magnitude 6.3 in California) are preferentially triggered by large events. Because of the magnitude correlations intrinsic to the validated model, we further rationalize the existence of earthquake doublets. These findings have far-reaching implications for short-term and medium-term seismic risk assessment, as well as for the development of a deeper theory without a UV cut-off that is locally self-similar.
Dirk Grupe (2006)
We remark on the utility of an observational relation between the absorption column density in excess of the Galactic value, $\Delta N_{\rm H} = N_{\rm H,fit} - N_{\rm H,gal}$, and the redshift $z$, determined from all 55 Swift-observed long bursts with spectroscopic redshifts as of December 2006. The absorption column densities $N_{\rm H,fit}$ are determined from power-law fits to the X-ray spectra with the absorption column density left as a free parameter. We find that higher excess column densities, $\Delta N_{\rm H} > 2\times 10^{21}$ cm$^{-2}$, are only present in bursts with redshifts $z < 2$, while low excess column densities, $\Delta N_{\rm H} < 1\times 10^{21}$ cm$^{-2}$, appear preferentially in high-redshift bursts. Our interpretation is that this relation between redshift and excess column density is an observational effect resulting from the shift of the source rest-frame energy range below 1 keV out of the XRT observable energy range for high-redshift bursts. We find a clear anti-correlation between $\Delta N_{\rm H}$ and $z$ that can be used to estimate the range of the maximum redshift of an afterglow. A critical application of our finding is that rapid X-ray observations can be used to optimize the instrumentation for ground-based optical/NIR follow-up observations. Ground-based spectroscopic redshift measurements of as many bursts as possible are crucial for GRB science.
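As a minimal illustration of how such a relation could be used in practice, the sketch below encodes the two thresholds quoted in this abstract as a rough redshift indicator; the function name and the example column densities are hypothetical, not taken from the paper.

```python
# Hypothetical helper: turn measured X-ray column densities (cm^-2) into
# the rough redshift hint implied by the thresholds quoted above.
def redshift_hint(nh_fit, nh_gal):
    """nh_fit: column density from the power-law X-ray fit;
    nh_gal: Galactic foreground column density."""
    dnh = nh_fit - nh_gal          # excess absorption, Delta N_H
    if dnh > 2e21:
        return "high excess absorption: observed only at z < 2"
    if dnh < 1e21:
        return "low excess absorption: preferentially high redshift"
    return "intermediate excess absorption: inconclusive"

# Example with made-up burst values:
print(redshift_hint(nh_fit=3.5e21, nh_gal=1.0e21))
```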
D. Sornette (2008)
This entry in the Encyclopedia of Complexity and Systems Science (Springer) presents a summary of some of the concepts and calculational tools that have been developed in attempts to apply statistical physics approaches to seismology. We summarize the leading theoretical physical models of the space-time organization of earthquakes. We present a general discussion and several examples of the new metrics proposed by statistical physicists, underlining their strengths and weaknesses. The entry concludes by briefly outlining future directions. The presentation is organized as follows:
I. Glossary
II. Definition and Importance of the Subject
III. Introduction
IV. Concepts and Calculational Tools
IV.1 Renormalization, Scaling and the Role of Small Earthquakes in Models of Triggered Seismicity
IV.2 Universality
IV.3 Intermittent Periodicity and Chaos
IV.4 Turbulence
IV.5 Self-Organized Criticality
V. Competing Mechanisms and Models
V.1 Roots of complexity in seismicity: dynamics or heterogeneity?
V.2 Critical earthquakes
V.3 Spinodal decomposition
V.4 Dynamics, stress interaction and thermal fluctuation effects
VI. Empirical Studies of Seismicity Inspired by Statistical Physics
VI.1 Early successes and subsequent challenges
VI.2 Entropy method for the distribution of time intervals between mainshocks
VI.3 Scaling of the PDF of Waiting Times
VI.4 Scaling of the PDF of Distances Between Subsequent Earthquakes
VI.5 The Network Approach
VII. Future Directions
Arnaud Mignan (2015)
The standard paradigm to describe seismicity induced by fluid injection is to apply nonlinear diffusion dynamics in a poroelastic medium. I show that the spatiotemporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the behaviour of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope.
Several recent works point out that the crowd of small unobservable earthquakes (with magnitudes below the detection threshold $m_d$) may play a significant and perhaps dominant role in triggering future seismicity. Using the ETAS branching model of triggered seismicity, we apply the formalism of generating probability functions to investigate how the statistical properties of observable earthquakes differ from the statistics of all events. The ETAS (epidemic-type aftershock sequence) model assumes that each earthquake can trigger other earthquakes ("aftershocks"); an aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. The triggering efficiency of earthquakes is assumed to vanish below a lower magnitude limit $m_0$, in order to ensure the convergence of the theory, and may reflect the physics of state-and-velocity frictional rupture. We show that, to a good approximation, the ETAS model is renormalized onto itself under what amounts to a decimation procedure $m_0 \to m_d$, with just a renormalization of the branching ratio from $n$ to an effective value $n(m_d)$. Our present analysis thus confirms, for the full statistical properties, the results obtained previously by one of us and Werner, based solely on the average seismic rates (the first-order moment of the statistics). However, our analysis also demonstrates that this renormalization is not exact: there are small corrections which can be systematically calculated, in terms of additional contributions that can be mapped onto a different branching model (a new relevant direction in the language of the renormalization group).
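The decimation claim can also be checked numerically. The sketch below is a hedged illustration with made-up parameter values, not the authors' generating-function calculation: it simulates full ETAS-style cascades down to $m_0$, re-attaches each detected event to its nearest detected ancestor, and compares the resulting offspring statistics with those of a direct cascade whose cutoff is $m_d$ and whose branching ratio is the measured effective $n(m_d)$. The means agree by construction; residual differences in the count probabilities correspond to the small corrections discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed values, not fitted ones):
b, alpha, n = 1.0, 0.8, 0.8   # GR exponent, productivity exponent, true n
m0, md = 0.0, 2.0             # triggering cutoff and detection threshold
beta = b * np.log(10.0)

def offspring_rate(m, mmin, nbar):
    """Mean direct offspring of a magnitude-m event when the triggering
    cutoff is mmin and the branching ratio is nbar (requires alpha < b)."""
    return nbar * (b - alpha) / b * 10.0 ** (alpha * (m - mmin))

def cascade(mmin, nbar, cap=50_000):
    """One full cascade above cutoff mmin; returns magnitudes and parents."""
    mags = [mmin + rng.exponential(1.0 / beta)]
    parent = [-1]
    i = 0
    while i < len(mags) and len(mags) < cap:
        kids = rng.poisson(offspring_rate(mags[i], mmin, nbar))
        for m in mmin + rng.exponential(1.0 / beta, kids):
            mags.append(m)
            parent.append(i)
        i += 1
    return np.array(mags), np.array(parent, dtype=int)

def decimated_offspring(trials=30_000):
    """Offspring counts of detected events after each detected event is
    re-attached to its nearest detected ancestor (decimation m0 -> md)."""
    out = []
    for _ in range(trials):
        mags, parent = cascade(m0, n)
        det = mags >= md
        anc = np.full(len(mags), -1)
        for j in range(1, len(mags)):       # parents precede children
            anc[j] = parent[j] if det[parent[j]] else anc[parent[j]]
        cnt = np.zeros(len(mags))
        for j in np.where(det & (anc >= 0))[0]:
            cnt[anc[j]] += 1
        out.extend(cnt[det])
    return np.array(out)

dec = decimated_offspring()
n_eff = dec.mean()                          # renormalized branching ratio

# Direct ETAS with cutoff md and branching ratio n_eff: one-generation
# offspring counts, for comparison with the decimated distribution.
m = md + rng.exponential(1.0 / beta, dec.size)
direct = rng.poisson(offspring_rate(m, md, n_eff))

print(f"effective branching ratio n(m_d) ~ {n_eff:.2f}  (true n = {n})")
for c in range(4):
    print(f"P({c} offspring): decimated {np.mean(dec == c):.3f}  "
          f"direct ETAS {np.mean(direct == c):.3f}")
```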