
Does the Earth show up an impending mega-earthquake?

Added by Leontina Romashkova
Publication date: 2007
Field: Physics
Language: English





In line with the intermediate-term monitoring of seismic activity aimed at predicting the world's largest earthquakes, the seismic dynamics of the Earth's lithosphere is analysed as a single whole, the ultimate scale of this complex hierarchical non-linear system. The present study demonstrates that the lithosphere does behave, at least on the intermediate-term scale, as a non-linear dynamic system that reveals classical symptoms of instability on the approach to catastrophe, i.e., a mega-earthquake. These are: (i) transformation of the magnitude distribution, (ii) spatial redistribution of seismic activity, (iii) rise and acceleration of activity, (iv) change of dependencies across magnitudes of different types, and other patterns of collective behaviour. The observed global-scale seismic behaviour implies a state of criticality of the Earth's lithosphere in the last decade.
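The first listed symptom, transformation of the magnitude distribution, is commonly quantified through the Gutenberg-Richter b-value. As a minimal illustration (not the authors' own method), the Aki maximum-likelihood estimator can track such a transformation in a catalog; the function name and the magnitudes below are made up for the example:

```python
import math

def b_value(magnitudes, m_c):
    """Aki (1965) maximum-likelihood Gutenberg-Richter b-value
    for events at or above the completeness magnitude m_c."""
    mags = [m for m in magnitudes if m >= m_c]
    mean_excess = sum(mags) / len(mags) - m_c
    return math.log10(math.e) / mean_excess

# Hypothetical catalogs: a drop in b means relatively more strong
# events, i.e. a transformed magnitude distribution.
early = [4.1, 4.3, 4.0, 4.6, 4.2, 4.8, 4.1, 4.4]
late = [4.5, 5.1, 4.9, 5.6, 4.7, 5.3, 5.8, 4.6]
print(b_value(early, 4.0) > b_value(late, 4.0))  # True
```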

Related research

We propose a new type of earthquake precursor based on the analysis of correlation dynamics between geophysical signals of different nature. The precursor is found using a two-parameter cross-correlation function introduced within the framework of flicker-noise spectroscopy, a general statistical physics approach to the analysis of time series. We consider an example of cross-correlation analysis for water salinity time series, an integral characteristic of the chemical composition of groundwater, and geoacoustic emissions recorded at the G-1 borehole on the Kamchatka peninsula in the time frame from 2001 to 2003, which is characterized by a sequence of three groups of significant seismic events. We found that cross-correlation precursors took place 27, 31, and 35 days ahead of the strongest earthquakes for each group of seismic events, respectively. At the same time, precursory anomalies in the signals themselves were observed only in the geoacoustic emissions for one group of earthquakes.
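The two-parameter cross-correlation function of flicker-noise spectroscopy used in this study is not reproduced here. As a much simpler sketch of the underlying idea, a plain lagged Pearson cross-correlation already reveals a lead time between two coupled signals; the function name, synthetic signals, and 3-sample lag below are illustrative assumptions:

```python
import math

def lagged_corr(x, y, lag):
    """Pearson correlation of x[t] with y[t + lag], for lag >= 0."""
    xs, ys = x[:len(x) - lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    vy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (vx * vy)

# y repeats x with a 3-sample delay, so correlation peaks at lag = 3.
x = [math.sin(0.3 * t) for t in range(200)]
y = [math.sin(0.3 * (t - 3)) for t in range(200)]
best = max(range(10), key=lambda lag: lagged_corr(x, y, lag))
print(best)  # 3
```

In the study itself, the anomaly appears in the correlation dynamics between the two signals rather than in either signal alone, which is what makes this kind of precursor attractive.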
A likely source of earthquake clustering is static stress transfer between individual events. Previous attempts to quantify the role of static stress in earthquake triggering generally considered only the stress changes caused by large events, and often discarded data uncertainties. We conducted a robust two-fold empirical test of the static stress change hypothesis by accounting for all events of magnitude M>=2.5, together with their location and focal mechanism uncertainties as provided by catalogs for Southern California between 1981 and 2010, first after resolving the focal plane ambiguity and second after randomly choosing one of the two nodal planes. In both cases we find compelling evidence supporting static triggering, with stronger evidence after resolving the focal plane ambiguity, above small (about 10 Pa) but consistently observed stress thresholds. The evidence for the static triggering hypothesis is robust with respect to the choice of the friction coefficient, Skempton's coefficient, and the magnitude threshold. Weak correlations between the Coulomb Index (the fraction of earthquakes that received a positive Coulomb stress change) and the coefficient of friction indicate that the role of normal stress in triggering is rather limited. Last but not least, we determined that the characteristic time for the loss of the stress-change memory of a single event is nearly independent of the amplitude of the Coulomb stress change and varies between ~95 and ~180 days, implying that forecasts based on static stress changes will have poor predictive skill beyond a few hundred days on average.
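The Coulomb failure stress change behind such tests has a standard textbook form, ΔCFS = Δτ + μ′Δσn, where μ′ = μ(1 − B) is an effective friction coefficient folding in Skempton's coefficient B and Δσn is tension-positive. A minimal sketch, with parameter values chosen for illustration rather than taken from the study:

```python
def coulomb_stress_change(d_shear, d_normal, mu=0.4, skempton=0.5):
    """Change in Coulomb failure stress on a receiver fault (Pa).
    d_shear: shear stress change in the slip direction,
    d_normal: normal stress change, tension positive.
    Effective friction mu' = mu * (1 - B) absorbs the pore-pressure
    response via Skempton's coefficient B (common simplification)."""
    mu_eff = mu * (1.0 - skempton)
    return d_shear + mu_eff * d_normal

# Illustrative numbers: 30 Pa shear increase, 50 Pa of clamping.
print(coulomb_stress_change(30.0, -50.0))  # 20.0 -> promotes failure
```

A positive ΔCFS moves the receiver fault toward failure; the thresholds quoted in the abstract (~10 Pa) are tiny compared with typical earthquake stress drops, which is why their consistent detectability is notable.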
We present the condensation method, which exploits the heterogeneity of the probability distribution functions (PDFs) of event locations to improve the spatial information content of seismic catalogs. The method reduces the size of a seismic catalog while improving access to its spatial information content. The PDFs of events are first ranked by decreasing location error and then successively condensed onto better-located, lower-variance event PDFs. The resulting condensed catalog attributes a different weight to each event, providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves the spatial information content of the original catalog. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures, while reducing the catalog length by ~25%. The condensation method allows us to account for location error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We identify different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the Epidemic Type Aftershock Model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small-magnitude events cannot be used to argue that earthquakes are unpredictable in general.
Geometrical properties of landscapes result from the geological processes that have acted through time. The quantitative analysis of natural relief is an objective aid to the visual interpretation of landscapes, as studies on coastlines, river networks, and global topography have shown. Still, an open question is whether a clear relationship can be established between the quantitative properties of landscapes and the dominant geomorphologic processes that originate them. In this contribution, we show that the geometry of topographic isolines is an appropriate observable to help disentangle such a relationship. A fractal analysis of terrestrial isolines yields a clear identification of trenches and abyssal plains, differentiates oceanic ridges from continental slopes and platforms, localizes coastlines and river systems, and isolates areas at high elevation (or latitude) subjected to the erosive action of ice. The study of the geometrical properties of the lunar landscape supports the existence of a correspondence between principal geomorphic processes and landforms. Our analysis can be easily applied to other planetary bodies.
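Fractal analysis of isolines rests on estimating a fractal dimension. A minimal box-counting sketch for a 2-D point set (the estimator actually used in the study may differ; the function name and test data are illustrative):

```python
import math

def box_count_dimension(points, sizes):
    """Box-counting fractal dimension of a 2-D point set with
    non-negative coordinates: least-squares slope of log N(s)
    versus log(1/s), where N(s) counts occupied boxes of size s."""
    logs = [(math.log(1.0 / s),
             math.log(len({(int(x / s), int(y / s)) for x, y in points})))
            for s in sizes]
    n = len(logs)
    mx = sum(a for a, _ in logs) / n
    my = sum(b for _, b in logs) / n
    num = sum((a - mx) * (b - my) for a, b in logs)
    den = sum((a - mx) ** 2 for a, _ in logs)
    return num / den

# Points on a straight segment should give a dimension close to 1;
# a space-filling cloud would approach 2.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(round(box_count_dimension(line, [0.1, 0.05, 0.025, 0.0125]), 2))
```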
The procedure by means of which the occurrence time of an impending mainshock can be identified, by analyzing in natural time the seismicity in the candidate area subsequent to the recording of a precursory Seismic Electric Signals (SES) activity, is reviewed. Here, we report the application of this procedure to an Mw5.4 mainshock that occurred in Greece on 17 November 2014 and was strongly felt in Athens. This mainshock (which is rare, being the strongest in that area for more than half a century) was preceded by an SES activity recorded on 27 July 2014, and the results of the natural time analysis reveal that the system approached the critical point (mainshock occurrence) early in the morning of 15 November 2014. Recently recorded SES activities are also presented. Furthermore, in a Note we discuss the case of the Mw5.3 earthquake that was also strongly felt in Athens, on 19 July 2019 (Parnitha fault).
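Natural time analysis, as introduced by Varotsos and co-workers, maps the k-th of N events to χk = k/N and monitors the variance κ1 = ⟨χ²⟩ − ⟨χ⟩² with the normalized event energies as weights; κ1 ≈ 0.070 is read as the signature of approaching criticality. A minimal sketch of that variance computation (the function name is an assumption, and this omits the full procedure reviewed in the paper):

```python
def kappa1(energies):
    """Natural-time variance kappa_1 = <chi^2> - <chi>^2, where
    chi_k = k/N and the weight p_k is the k-th event's energy
    divided by the total energy (Varotsos et al. convention)."""
    n = len(energies)
    total = sum(energies)
    p = [e / total for e in energies]
    chi = [(k + 1) / n for k in range(n)]
    mean = sum(pk * c for pk, c in zip(p, chi))
    mean_sq = sum(pk * c * c for pk, c in zip(p, chi))
    return mean_sq - mean * mean

# Equal energies reduce kappa_1 to the variance of uniformly spaced
# chi values, (N^2 - 1) / (12 N^2), close to 1/12 for large N.
print(round(kappa1([1.0] * 100), 4))  # 0.0833
```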
