
Power Law Distributions of Seismic Rates

Posted by: Sornette
Publication date: 2004
Research field: Physics
Paper language: English
Authors: A. Saichev





We report an empirical determination of the probability density functions $P_{\text{data}}(r)$ of the number $r$ of earthquakes in finite space-time windows for the California catalog. We find a stable power law tail $P_{\text{data}}(r) \sim 1/r^{1+\mu}$ with exponent $\mu \approx 1.6$ for all space ($5 \times 5$ to $20 \times 20$ km$^2$) and time intervals (0.1 to 1000 days). These observations, as well as the non-universal dependence on space-time windows for all different space-time windows simultaneously, are explained by solving one of the most used reference models in seismology (ETAS), which assumes that each earthquake can trigger other earthquakes. The data imposes that active seismic regions are Cauchy-like fractals, whose exponent $\delta = 0.1 \pm 0.1$ is well-constrained by the seismic rate data.
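As a side note (not part of the paper), the tail exponent of such a count distribution is typically estimated by a maximum-likelihood fit above a cutoff. The Python sketch below assumes a hypothetical array of per-window earthquake counts and a hand-picked cutoff r_min; it illustrates the fitting step only, not the authors' procedure.

import numpy as np

def tail_exponent(counts, r_min=10):
    # Hill-type maximum-likelihood estimate of mu for a power-law tail
    # P(r) ~ 1/r^(1+mu) restricted to counts r >= r_min.
    r = np.asarray(counts, dtype=float)
    tail = r[r >= r_min]
    if tail.size == 0:
        raise ValueError("no counts above r_min")
    mu_hat = tail.size / np.sum(np.log(tail / r_min))
    return mu_hat, mu_hat / np.sqrt(tail.size)   # estimate, asymptotic std error

# Synthetic check with Pareto-tailed counts of exponent 1.6, standing in for the
# number of events per spatial box and time window:
rng = np.random.default_rng(0)
fake_counts = np.floor(10 * (1 + rng.pareto(1.6, size=20000)))
print(tail_exponent(fake_counts, r_min=10))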


Read also

87 - A. Saichev 2004
We report an empirical determination of the probability density functions $P(r)$ of the number $r$ of earthquakes in finite space-time windows for the California catalog, over fixed spatial boxes $5 \times 5$ km$^2$ and time intervals $dt = 1$, 10, 100 and 1000 days. We find a stable power law tail $P(r) \sim 1/r^{1+\mu}$ with exponent $\mu \approx 1.6$ for all time intervals. These observations are explained by a simple stochastic branching process previously studied by many authors, the ETAS (epidemic-type aftershock sequence) model, which assumes that each earthquake can trigger other earthquakes (``aftershocks''). An aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. We develop the full theory in terms of generating functions for describing the space-time organization of earthquake sequences and develop several approximations to solve the equations. The calibration of the theory to the empirical observations shows that it is essential to augment the ETAS model by taking account of the pre-existing frozen heterogeneity of spontaneous earthquake sources. This seems natural in view of the complex multi-scale nature of fault networks, on which earthquakes nucleate. Our extended theory is able to account for the empirical observation satisfactorily. In particular, the adjustable parameters are determined by fitting the largest time window $dt = 1000$ days and are then used as frozen in the formulas for other time scales, with very good agreement with the empirical data.
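For intuition only (a sketch under assumed parameters, not the paper's generating-function calculation), the cascade mechanism described above can be mimicked by a bare-bones branching simulation in which each event triggers a Poisson number of direct aftershocks with mean equal to the branching ratio n; near n = 1 the total cluster sizes become heavy-tailed.

import numpy as np

def cluster_size(n_branching, rng, max_events=100_000):
    # Total number of events in one cascade: the initiating shock plus all
    # generations of aftershocks, each event triggering Poisson(n) offspring.
    total, current = 1, 1
    while current > 0 and total < max_events:
        current = rng.poisson(n_branching * current)  # sum of 'current' Poisson(n) draws
        total += current
    return total

rng = np.random.default_rng(1)
sizes = [cluster_size(0.95, rng) for _ in range(5000)]  # subcritical, n = 0.95
print(np.mean(sizes), np.max(sizes))                    # mean ~ 1/(1-n); max is fat-tailed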
237 - A. Saichev 2005
Using the ETAS branching model of triggered seismicity, we apply the formalism of generating probability functions to calculate exactly the average difference between the magnitude of a mainshock and the magnitude of its largest aftershock over all generations. This average magnitude difference is found empirically to be independent of the mainshock magnitude and equal to 1.2, a universal behavior known as Bath's law. Our theory shows that Bath's law holds only sufficiently close to the critical regime of the ETAS branching process. Allowing for error bars $\pm 0.1$ for Bath's constant value around 1.2, our exact analytical treatment of Bath's law provides new constraints on the productivity exponent $\alpha$ and the branching ratio $n$: $0.9 \leq \alpha \leq 1$ and $0.8 \leq n \leq 1$. We propose a novel method for measuring $\alpha$ based on the predicted renormalization of the Gutenberg-Richter distribution of the magnitudes of the largest aftershock. We also introduce the ``second Bath's law'' for foreshocks: the probability that a main earthquake turns out to be the foreshock does not depend on its magnitude.
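As a purely illustrative companion (hypothetical parameter values, direct aftershocks only, not the exact over-all-generations treatment of the paper), the magnitude gap that Bath's law concerns can be estimated by Monte Carlo from Gutenberg-Richter magnitudes and an assumed productivity law:

import numpy as np

def average_gap(M_main, alpha=0.9, b=1.0, m0=3.0, k=0.02, trials=20_000, seed=2):
    # Average of (mainshock magnitude - largest direct aftershock magnitude),
    # with Poisson productivity ~ k*10^(alpha*(M - m0)) and GR-distributed
    # aftershock magnitudes above the cutoff m0 (exponential with rate b*ln 10).
    rng = np.random.default_rng(seed)
    gaps = []
    for _ in range(trials):
        n_aft = rng.poisson(k * 10 ** (alpha * (M_main - m0)))
        if n_aft == 0:
            continue                       # sequences with no aftershock above m0
        mags = m0 + rng.exponential(1 / (b * np.log(10)), size=n_aft)
        gaps.append(M_main - mags.max())
    return float(np.mean(gaps))

# Gap for a few mainshock magnitudes; the paper's exact result is that the
# all-generation average stays near 1.2 sufficiently close to the critical regime.
print([round(average_gap(M), 2) for M in (5.0, 6.0, 7.0)])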
A number of human activities exhibit a bursty pattern, namely periods of very high activity that are followed by rest periods. Records of these processes generate time series of events whose inter-event times follow a probability distribution that displays a fat tail. The grounds for such a phenomenon are not yet clearly understood. In the present work we use the freely available Wikipedia editing records to unravel some features of this phenomenon. We show that even though the probability to start editing is conditioned by the circadian 24 hour cycle, the conditional probability for the time interval between successive edits at a given time of the day is independent of the latter. We confirm our findings with the activity of posting on the social network Twitter. Our result suggests there is an intrinsic human scheduling pattern: after overcoming the encumbrance to start an activity, there is a robust distribution of new related actions, which does not depend on the time of day.
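A minimal version of the conditioning test described above could look like the following sketch (the timestamp array and its format are assumptions, not the authors' data pipeline): group inter-edit times by the hour of day at which the earlier edit occurred and compare the resulting conditional distributions.

import numpy as np

def median_gap_by_hour(timestamps):
    # timestamps: 1-D array of edit times in seconds (assumed already in local time).
    t = np.sort(np.asarray(timestamps, dtype=float))
    gaps = np.diff(t)                                  # inter-event times
    hours = ((t[:-1] % 86400) // 3600).astype(int)     # hour of day of the earlier edit
    return {int(h): float(np.median(gaps[hours == h])) for h in np.unique(hours)}

# Comparable medians (or full histograms) across hours would indicate that the
# inter-edit time distribution is independent of the time of day, as reported.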
A proof of the relativistic $H$-theorem including nonextensive effects is given. As happens in the nonrelativistic limit, the molecular chaos hypothesis advanced by Boltzmann does not remain valid, and the second law of thermodynamics combined with a duality transformation implies that the $q$-parameter lies in the interval [0,2]. It is also proved that the collisional equilibrium states (null entropy source term) are described by the relativistic $q$-power law extension of the exponential Jüttner distribution, which reduces, in the nonrelativistic domain, to the Tsallis power law function. As a simple illustration of the basic approach, we derive the relativistic nonextensive equilibrium distribution for a dilute charged gas under the action of an electromagnetic field $F^{\mu\nu}$. Such results reduce to the standard ones in the extensive limit, thereby showing that the nonextensive entropic framework can be harmonized with the space-time ideas contained in the special relativity theory.
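For orientation (conventions differ across the nonextensive literature, so the precise form quoted here is an assumption, not taken from this paper), the Tsallis $q$-exponential that generalizes the Boltzmann/Jüttner factor is commonly written

$e_q(x) = \left[1 + (1-q)\,x\right]^{1/(1-q)}$, with $e_q(x) \to e^{x}$ as $q \to 1$,

so that a collisional equilibrium of the type described above takes the form $f(p) \propto e_q\!\left(-E(p)/k_B T\right)$ with the relativistic energy $E(p) = \sqrt{p^2 c^2 + m^2 c^4}$, recovering the exponential Jüttner distribution in the extensive limit $q \to 1$.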
The article analyses two potential metamaterial designs, the metafoundation and the metabarrier, capable of attenuating seismic waves on buildings or structural components in a frequency band between 3.5 and 8 Hz. The metafoundation serves the dual purpose of reducing the seismic response and supporting the superstructure. Conversely, the metabarrier surrounds and shields the structure from incoming waves. The two solutions are based on a cell layout of local resonators whose dynamic properties are tuned using finite element simulations combined with Bloch-Floquet boundary conditions. To enlarge the attenuation band, a graded design in which the resonant frequency of each cell varies spatially is employed. If appropriately enlarged or reduced, the metamaterial designs could be used to attenuate lower-frequency seismic waves or groundborne vibrations, respectively. A sensitivity analysis over various design parameters, including size, number of resonators, soil type and source directivity, carried out by computing full 3D numerical simulations in the time domain for horizontal shear waves, is proposed. Overall, the metamaterial solutions discussed here can reduce the spectral amplification of the superstructure by approximately 15 to 70%, depending on several parameters including the metastructure size and the properties of the soil. Pitfalls and advantages of each configuration are discussed in detail. The role of damping, crucial to avoid multiple resonant coupling, and the analogies between graded metamaterials and tuned mass dampers are also investigated.
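As a back-of-the-envelope illustration only (the mass and cell count below are invented, not taken from the article), grading a band of local resonators across the 3.5-8 Hz target range amounts to spacing their natural frequencies f = sqrt(k/m)/(2*pi) cell by cell:

import numpy as np

n_cells = 10                                     # number of graded cells (assumed)
target_freqs = np.linspace(3.5, 8.0, n_cells)    # one resonant frequency per cell [Hz]
mass = 2000.0                                    # resonator mass per cell [kg] (assumed)
stiffness = mass * (2 * np.pi * target_freqs) ** 2   # spring stiffness k = m*(2*pi*f)^2 [N/m]

for f, k in zip(target_freqs, stiffness):
    print(f"cell tuned to {f:.2f} Hz -> k = {k:.2e} N/m")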