
Cross-Correlation Earthquake Precursors in the Hydrogeochemical and Geoacoustic Signals for the Kamchatka Peninsula

Published by Yuriy Polyakov
Publication date: 2012
Research field: Physics
Paper language: English





We propose a new type of earthquake precursor based on the analysis of correlation dynamics between geophysical signals of different nature. The precursor is found using a two-parameter cross-correlation function introduced within the framework of flicker-noise spectroscopy, a general statistical physics approach to the analysis of time series. We consider an example of cross-correlation analysis for water salinity time series, an integral characteristic of the chemical composition of groundwater, and geoacoustic emissions recorded at the G-1 borehole on the Kamchatka peninsula in the time frame from 2001 to 2003, which is characterized by a sequence of three groups of significant seismic events. We found that cross-correlation precursors took place 27, 31, and 35 days ahead of the strongest earthquakes for each group of seismic events, respectively. At the same time, precursory anomalies in the signals themselves were observed only in the geoacoustic emissions for one group of earthquakes.
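The two-parameter cross-correlator of flicker-noise spectroscopy can be illustrated with a minimal sketch: it correlates the increments of two signals over a span tau, with one signal shifted by theta relative to the other. The function below is an illustrative simplification (the exact FNS expression in the paper may use different averaging and normalization conventions), and all names are ours, not the paper's notation.

```python
import numpy as np

def increment_cross_correlation(x, y, tau, theta):
    """Two-parameter cross-correlation of signal increments.

    tau   -- increment span (samples)
    theta -- relative time shift between the two signals (samples)

    Illustrative sketch only: the actual FNS cross-correlator used in
    the paper may include additional averaging/normalization choices.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x) - tau - abs(theta)
    if n <= 1:
        raise ValueError("series too short for this (tau, theta)")
    s = max(theta, 0)          # shift applied to y
    r = max(-theta, 0)         # shift applied to x
    dx = x[r:r + n] - x[r + tau:r + tau + n]
    dy = y[s:s + n] - y[s + tau:s + tau + n]
    num = np.mean(dx * dy)
    den = np.sqrt(np.mean(dx ** 2) * np.mean(dy ** 2))
    return num / den if den > 0 else 0.0
```

Scanning this quantity over (tau, theta) for the salinity and geoacoustic series is the kind of analysis in which the precursory loss or gain of correlation would show up.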

قيم البحث

Read also

A phenomenological systems approach for identifying potential precursors in multiple signals of different types for the same local seismically active region is proposed based on the assumption that a large earthquake may be preceded by a system reconfiguration (preparation) at different time and space scales. A nonstationarity factor introduced within the framework of flicker-noise spectroscopy, a statistical physics approach to the analysis of time series, is used as the dimensionless criterion for detecting qualitative (precursory) changes within relatively short time intervals in arbitrary signals. Nonstationarity factors for chlorine-ion concentration variations in the underground water of two boreholes on the Kamchatka peninsula and geoacoustic emissions in a deep borehole within the same seismic zone are studied together in the time frame around a large earthquake on October 8, 2001. It is shown that nonstationarity factor spikes (potential precursors) take place in the interval from 70 to 50 days before the earthquake for the hydrogeochemical data and at 29 and 6 days in advance for the geoacoustic data.
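The idea behind a dimensionless nonstationarity factor can be sketched very simply: compare a statistic of the signal in consecutive windows and report its relative change, so that spikes flag qualitative changes regardless of the signal's units. The sketch below uses the window variance; the FNS factor in the paper is built from difference moments, but the relative-change construction is the same. All names here are illustrative.

```python
import numpy as np

def nonstationarity_factor(x, window, step=1):
    """Dimensionless nonstationarity factor over a sliding window.

    Illustrative sketch: compares the variance of the signal in two
    consecutive windows of equal length; spikes in the output flag
    qualitative (possibly precursory) changes in the signal.
    """
    x = np.asarray(x, dtype=float)
    factors = []
    for start in range(0, len(x) - 2 * window + 1, step):
        a = x[start:start + window]
        b = x[start + window:start + 2 * window]
        va, vb = np.var(a), np.var(b)
        mean_v = 0.5 * (va + vb)
        factors.append(abs(vb - va) / mean_v if mean_v > 0 else 0.0)
    return np.array(factors)
```

Because the factor is normalized by the mean level, the same threshold can be applied to signals as different as chlorine-ion concentration and geoacoustic emission intensity.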
In line with the intermediate-term monitoring of seismic activity aimed at prediction of the world's largest earthquakes, the seismic dynamics of the Earth's lithosphere is analysed as a single whole, which is the ultimate scale of the complex hierarchical non-linear system. The present study demonstrates that the lithosphere does behave, at least on the intermediate-term scale, as a non-linear dynamic system that reveals classical symptoms of instability at the approach of catastrophe, i.e., a mega-earthquake. These are: (i) transformation of the magnitude distribution, (ii) spatial redistribution of seismic activity, (iii) rise and acceleration of activity, (iv) change of dependencies across magnitudes of different types, and other patterns of collective behaviour. The observed global-scale seismic behaviour implies a state of criticality of the Earth's lithosphere in the last decade.
J. Faillettaz, M. Funk, 2009
A hanging glacier at the east face of Weisshorn broke off in 2005. We were able to monitor and measure surface motion and icequake activity for 21 days, up to three days prior to the break-off. Results are presented from the analysis of seismic waves generated by the glacier during the rupture maturation process. Three types of precursory signals of the imminent catastrophic rupture were identified: (i) an increasing seismic activity within the glacier, (ii) a change in the size-frequency distribution of icequake energy, and (iii) a log-periodic oscillating behavior superimposed on power-law acceleration of the inverse of the waiting time between two icequakes. The analysis of the seismic activity gave indications of the rupture process and led to the identification of two regimes: a stable one, where events are isolated and uncorrelated, which is characteristic of diffuse damage, and an unstable and dangerous one, in which events become synchronized and large icequakes are triggered.
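The third precursor, a log-periodic oscillation superimposed on a power-law acceleration, corresponds to a functional form of the type 1/dt ~ A (t_c - t)^(-m) [1 + C cos(omega log(t_c - t) + phi)], where t_c is the failure time. A minimal evaluation of that form is sketched below; the parameter names are illustrative, not the paper's notation, and fitting it to real waiting-time data requires care well beyond this sketch.

```python
import numpy as np

def log_periodic_rate(t, t_c, m, A, C, omega, phi):
    """Log-periodic power-law form for the inverse waiting time
    between icequakes as the failure time t_c is approached:

        1/dt ~ A * (t_c - t)**(-m) * (1 + C*cos(omega*log(t_c - t) + phi))

    Sketch of the functional form described in the abstract;
    parameter names are illustrative.
    """
    dt = t_c - np.asarray(t, dtype=float)
    if np.any(dt <= 0):
        raise ValueError("t must be strictly before the failure time t_c")
    return A * dt ** (-m) * (1.0 + C * np.cos(omega * np.log(dt) + phi))
```

With C = 0 the pure power-law acceleration is recovered; the oscillatory term modulates it periodically in log(t_c - t), which is the signature the authors identify.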
A likely source of earthquake clustering is static stress transfer between individual events. Previous attempts to quantify the role of static stress for earthquake triggering generally considered only the stress changes caused by large events, and often discarded data uncertainties. We conducted a robust two-fold empirical test of the static stress change hypothesis by accounting for all events of magnitude M>=2.5 and their location and focal mechanism uncertainties provided by catalogs for Southern California between 1981 and 2010, first after resolving the focal plane ambiguity and second after randomly choosing one of the two nodal planes. For both cases, we find compelling evidence supporting static triggering, with stronger evidence after resolving the focal plane ambiguity, above significantly small (about 10 Pa) but consistently observed stress thresholds. The evidence for the static triggering hypothesis is robust with respect to the choice of the friction coefficient, Skempton's coefficient and magnitude threshold. Weak correlations between the Coulomb Index (fraction of earthquakes that received positive Coulomb stress change) and the coefficient of friction indicate that the role of normal stress in triggering is rather limited. Last but not least, we determined that the characteristic time for the loss of the stress change memory of a single event is nearly independent of the amplitude of the Coulomb stress change and varies between ~95 and ~180 days, implying that forecasts based on static stress changes will have poor predictive skills beyond times that are larger than a few hundred days on average.
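The Coulomb Index used above can be sketched from the standard Coulomb failure criterion, dCFS = d_tau + mu' * d_sigma_n, counting the fraction of receiver events whose stress change exceeds a threshold. The sketch below assumes the stress changes are already resolved on each receiver plane; the paper's full procedure also propagates location and focal-mechanism uncertainties, which is omitted here, and the parameter names are ours.

```python
import numpy as np

def coulomb_index(delta_shear, delta_normal, mu_eff=0.4, threshold=10.0):
    """Fraction of receiver events with positive Coulomb stress change.

    delta_shear  -- shear stress change resolved on each receiver plane (Pa)
    delta_normal -- normal stress change (Pa, positive = unclamping)
    mu_eff       -- effective friction, friction x (1 - Skempton's coeff.)
    threshold    -- stress threshold in Pa (the paper reports ~10 Pa)

    Sketch of the standard criterion dCFS = d_tau + mu' * d_sigma_n.
    """
    dcfs = np.asarray(delta_shear, float) + mu_eff * np.asarray(delta_normal, float)
    return float(np.mean(dcfs >= threshold))
```

A Coulomb Index well above 0.5 over many source-receiver pairs is the kind of signal the authors read as evidence for static triggering.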
We present the condensation method that exploits the heterogeneity of the probability distribution functions (PDF) of event locations to improve the spatial information content of seismic catalogs. The method reduces the size of seismic catalogs while improving access to their spatial information content. The PDFs of events are first ranked by decreasing location errors and then successively condensed onto better located and lower variance event PDFs. The obtained condensed catalog attributes different weights to each event, providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves the spatial information content of the original catalog. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ~25%. The condensation method allows us to account for location error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We evidence different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the Epidemic Type Aftershock Model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be used to argue that earthquakes are unpredictable in general.
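The condensation idea, ranking events by location error and transferring their weight onto better-located neighbors, can be sketched on point epicenters with isotropic errors. The paper condenses full location PDFs; the toy version below only moves unit weights, and every name and the overlap criterion are our illustrative assumptions.

```python
import numpy as np

def condense_catalog(xy, sigma, overlap_factor=2.0):
    """Toy condensation of a seismic catalog.

    xy    -- (n, 2) array of event epicenters
    sigma -- (n,) location standard errors (same distance units as xy)

    Each event starts with weight 1. Events are visited from the
    largest to the smallest location error; an event's weight is
    transferred to the best-located event lying within
    overlap_factor * sigma of it, if any. Events left with weight 0
    can be dropped, shrinking the catalog while preserving total
    weight. Simplified sketch of the paper's PDF-based condensation.
    """
    xy = np.asarray(xy, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    weights = np.ones(len(sigma))
    for i in np.argsort(sigma)[::-1]:        # worst-located first
        d = np.linalg.norm(xy - xy[i], axis=1)
        # candidate receivers: better located and within the overlap radius
        cand = np.where((sigma < sigma[i]) & (d <= overlap_factor * sigma[i]))[0]
        if cand.size:
            j = cand[np.argmin(sigma[cand])]
            weights[j] += weights[i]
            weights[i] = 0.0
    return weights
```

Because total weight is conserved, spatial statistics computed on the weighted condensed catalog remain comparable to those of the original one, which is what enables the multifractal comparison described above.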