1.
Rep Prog Phys ; 84(7)2021 May 25.
Article in English | MEDLINE | ID: mdl-33857928

ABSTRACT

Charles Richter's observation that 'only fools and charlatans predict earthquakes' reflects the fact that, despite more than 100 years of effort, seismologists remain unable to do so with reliable and accurate results. Meaningful prediction involves specifying the location, time, and size of an earthquake before it occurs, to greater precision than expected purely by chance from the known statistics of earthquakes in an area. In this context, 'forecasting' implies a prediction that specifies a probability for the time, location, and magnitude. Two general approaches have been used. In one, the rate of motion accumulating across faults and the amount of slip in past earthquakes are used to infer where and when future earthquakes will occur and the shaking that would be expected. Because the intervals between earthquakes are highly variable, these long-term forecasts are accurate to no better than a hundred years. They are thus valuable for earthquake hazard mitigation, given the long lives of structures, but have clear limitations. The second approach is to identify potentially observable changes in the Earth that precede earthquakes. Various precursors have been suggested, and may have been real in certain cases, but none has yet proved to be a general feature preceding all earthquakes or to stand out convincingly from the normal variability of the Earth's behavior. However, new types of data, models, and computational power may provide avenues for progress using machine learning that were not previously available. At present, it is unclear whether deterministic earthquake prediction is possible. The frustrations of this search have led to the observation (echoing Yogi Berra) that 'it is difficult to predict earthquakes, especially before they happen.' However, because success would be of enormous societal benefit, the search for methods of earthquake prediction and forecasting will likely continue. In this review, the focus is on anticipating the earthquake rupture before it occurs, rather than on characterizing it rapidly just after it occurs. The latter is the domain of earthquake early warning, which we do not treat in detail here, although we include a short discussion in the machine learning section at the end.

2.
Phys Rev Lett ; 124(6): 068501, 2020 Feb 14.
Article in English | MEDLINE | ID: mdl-32109126

ABSTRACT

We analyze a new model for growing networks, the constrained Leath invasion percolation model. Cluster dynamics are characterized by bursts in space and time. The model quantitatively reproduces the observed frequency-magnitude scaling of earthquakes in the limit that the occupation probability approaches the critical bond percolation probability in d=2. The model may have application to other systems characterized by burst dynamics.
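
The cluster-growth step at the heart of a Leath-type model is easy to sketch. The snippet below is plain Leath growth on a square lattice, not the authors' constrained invasion variant, and it uses site growth (threshold near 0.5927) for brevity where the paper works with bond percolation (p_c = 1/2 in d = 2); parameter values are illustrative.

```python
import random
from collections import Counter

def leath_cluster_size(p, rng, max_size=10_000):
    """Grow one cluster from the origin: every perimeter site is
    tested exactly once and occupied with probability p."""
    occupied = {(0, 0)}
    tested = {(0, 0)}
    frontier = [(0, 0)]
    while frontier and len(occupied) < max_size:
        x, y = frontier.pop()
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb not in tested:
                tested.add(nb)
                if rng.random() < p:
                    occupied.add(nb)
                    frontier.append(nb)
    return len(occupied)

# Near the threshold the cluster-size distribution becomes a power
# law -- the analog of Gutenberg-Richter frequency-magnitude scaling.
rng = random.Random(0)
sizes = Counter(leath_cluster_size(0.59, rng) for _ in range(1000))
```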

3.
Philos Trans A Math Phys Eng Sci ; 377(2136)2018 Nov 26.
Article in English | MEDLINE | ID: mdl-30478209

ABSTRACT

A standard approach to quantifying the seismic hazard is the relative intensity (RI) method. It is assumed that the rate of seismicity is constant in time, and the rate of occurrence of small earthquakes is extrapolated to large earthquakes using Gutenberg-Richter scaling. We introduce nowcasting to extend RI forecasting to time-dependent seismicity, for example, during an aftershock sequence. Nowcasting uses 'natural time'; in seismicity, natural time is the event count of small earthquakes. The event count for small earthquakes is extrapolated to larger earthquakes using Gutenberg-Richter scaling. We first review the concepts of natural time and nowcasting and then illustrate seismic nowcasting with three examples. The first is the aftershock sequence of the 2004 Parkfield earthquake on the San Andreas fault in California. Some earthquakes have higher rates of aftershock activity than other earthquakes of the same magnitude; our approach allows the determination of the rate in real time during the aftershock sequence. We also consider two examples of induced earthquakes. Large injections of waste water from petroleum extraction have generated high rates of induced seismicity in Oklahoma. The extraction of natural gas from the Groningen gas field in The Netherlands has also generated very damaging earthquakes. In order to reduce the seismic activity, rates of injection and withdrawal have been reduced in these two cases. We show how nowcasting can be used to assess the success of these efforts. This article is part of the theme issue 'Statistical physics of fracture and earthquakes'.
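
The bookkeeping behind a natural-time nowcast can be sketched compactly. The snippet below is a minimal reading of the method described above, with illustrative magnitude thresholds (m_small, m_large): the earthquake potential score is taken as the fraction of past completed small-event counts that do not exceed the count accumulated since the last large earthquake. Regional windowing and the other refinements in the paper are omitted.

```python
import numpy as np

def natural_time_counts(mags, m_small, m_large):
    """Counts of small (>= m_small) events between successive
    large (>= m_large) events, in catalog order."""
    completed, n = [], 0
    for m in mags:
        if m >= m_large:
            completed.append(n)
            n = 0
        elif m >= m_small:
            n += 1
    return completed, n   # past intervals, plus the current open count

def earthquake_potential_score(mags, m_small=3.0, m_large=6.0):
    """EPS = fraction of past natural-time intervals that do not
    exceed the small-event count since the last large earthquake."""
    completed, current = natural_time_counts(mags, m_small, m_large)
    return float(np.mean(np.asarray(completed) <= current))
```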

4.
Proc Natl Acad Sci U S A ; 108(40): 16533-8, 2011 Oct 04.
Article in English | MEDLINE | ID: mdl-21949355

ABSTRACT

The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
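
Gridded forecasts of this kind are typically compared through a likelihood score over the cells. The sketch below assumes, as is common in this testing literature, that the count in each cell is Poisson with the forecast expected number; the exact statistics used in the RELM evaluation may differ in detail.

```python
import numpy as np
from scipy.stats import poisson

def log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of observed earthquake counts
    given forecast expected numbers, one value per 0.1 deg cell."""
    rates = np.asarray(forecast_rates, dtype=float)
    obs = np.asarray(observed_counts)
    return poisson.logpmf(obs, rates).sum()

# Comparing two forecasts on the same grid: the one with the higher
# joint log-likelihood better matches the locations of the observed
# M >= 4.95 earthquakes.
```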


Subject(s)
Earthquakes/statistics & numerical data , Forecasting/methods , Models, Theoretical , California , Likelihood Functions
5.
Phys Rev E ; 103(1-1): 012310, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33601580

ABSTRACT

Invasion percolation is a model that was originally proposed to describe growing networks of fractures. Here we describe a loopless algorithm on random lattices, coupled with an avalanche-based model for bursts. The model reproduces the characteristic b-value seismicity and a spatial distribution of bursts consistent with earthquakes resulting from hydraulic fracturing ("fracking"). We test models for both site invasion percolation and bond invasion percolation. The two differ on the scale of the site and bond lengths l, but because the networks are characterized by their large-scale behavior (l ≪ L), we find only small differences between the scaling exponents. Although data may therefore not differentiate between them, our results suggest that the two models belong to different universality classes.

6.
Earth Space Sci ; 6(1): 191-197, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30854411

ABSTRACT

Seismic nowcasting uses counts of small earthquakes as proxy data to estimate the current dynamical state of an earthquake fault system. The result is an earthquake potential score that characterizes the current state of progress of a defined geographic region through its nominal earthquake "cycle." The count of small earthquakes since the last large earthquake is the natural time that has elapsed since the last large earthquake (Varotsos et al., 2006, https://doi.org/10.1103/PhysRevE.74.021123). In addition to natural time, earthquake sequences can also be analyzed using Shannon information entropy ("information"), an idea that was pioneered by Shannon (1948, https://doi.org/10.1002/j.1538-7305.1948.tb01338.x). As a first step to add seismic information entropy into the nowcasting method, we incorporate magnitude information into the natural time counts by using event self-information. We find in this first application of seismic information entropy that the earthquake potential score values are similar to the values using only natural time. However, other characteristics of earthquake sequences, including the interevent time intervals, or the departure of higher magnitude events from the magnitude-frequency scaling line, may contain additional information.
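
A minimal version of the magnitude weighting described above, under the assumption that event probabilities follow Gutenberg-Richter statistics binned at the usual 0.1-magnitude catalog resolution; the paper's exact weighting may differ.

```python
import numpy as np

def self_information_bits(mags, m_c=3.0, b=1.0, dm=0.1):
    """Self-information -log2(p_k) of each event, where p_k is the
    Gutenberg-Richter probability of its magnitude bin of width dm
    above the completeness magnitude m_c (b is the regional b-value)."""
    beta = b * np.log(10.0)                       # GR exponential rate
    k = np.floor((np.asarray(mags) - m_c) / dm)   # bin index per event
    p = np.exp(-beta * k * dm) - np.exp(-beta * (k + 1) * dm)
    return -np.log2(p)

# Information-weighted natural time replaces the raw count of small
# events with the sum of their self-information, so rare large events
# contribute more than one "count".
```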

7.
Phys Rev E Stat Nonlin Soft Matter Phys ; 78(4 Pt 1): 041115, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18999387

ABSTRACT

In this paper we extend the branching aftershock sequence model to study the role of missing data at short times and small amplitudes after a mainshock. We apply this model, which contains three parameters characterizing the missing data, to the magnitude and temporal statistics of four aftershock sequences in California. We find that the observed time-dependent deviations of the frequency-magnitude scaling from the Gutenberg-Richter power law dependency can be described quantitatively by the model. We also show that, for the same set of parameters, the model is able to explain quantitatively the observed magnitude-dependent deviations of the temporal decay of aftershocks from Omori's law. In addition, we show that the same sets of data can also reproduce quite well the various functional forms of the probability density functions of the return times between consecutive events with magnitudes above a prescribed threshold, as well as the violation of scaling at short and intermediate time scales.
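
The role of the three missing-data parameters can be illustrated with a toy detection filter; the parametrization below (initial completeness magnitude m_c0, relaxation time tau, threshold width) is hypothetical and chosen only to show the mechanism, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(3)

def detection_probability(m, t, m_c0=4.5, tau=0.5, width=0.2):
    """Toy three-parameter filter: just after the mainshock the
    completeness magnitude starts at m_c0 and relaxes with timescale
    tau (days); width softens the detection threshold."""
    m_c = m_c0 * np.exp(-t / tau)
    return 1.0 / (1.0 + np.exp(-(m - m_c) / width))

# Synthetic catalog: Gutenberg-Richter magnitudes (b = 1, m >= 2)
# at uniformly scattered times t (days after the mainshock).
n = 10_000
t = rng.uniform(0.0, 10.0, size=n)
m = 2.0 - np.log10(rng.random(n))
kept = rng.random(n) < detection_probability(m, t)
# At short times the small-magnitude bins are depleted, bending the
# frequency-magnitude curve away from the Gutenberg-Richter line.
```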

8.
J Res Natl Inst Stand Technol ; 99(4): 377-389, 1994.
Article in English | MEDLINE | ID: mdl-37405289

ABSTRACT

Floods and droughts constitute extreme values of great consequence to society. A wide variety of statistical techniques have been applied to the evaluation of the flood hazard. A primary difficulty is the relatively short time span over which historical data are available, and quantitative estimates for paleofloods are generally suspect. It was in the context of floods that Hurst introduced the concept of the rescaled range. This was subsequently extended by Mandelbrot and his colleagues to the concepts of fractional Gaussian noises and fractional Brownian walks. These studies introduced the controversial possibility that the extremes of floods and droughts could be fractal. An extensive study of flood-gauge records at 1,200 stations in the United States indicates a good correlation with fractal statistics. It is convenient to introduce the parameter F, the ratio of the 10-year flood to the 1-year flood; for fractal statistics, F is also the ratio of the 100-year flood to the 10-year flood and the ratio of the 1,000-year flood to the 100-year flood. It is found that the parameter F has strong regional variations associated with climate. The acceptance of power-law statistics rather than exponentially based statistics would lead to a far more conservative estimate of future flood hazards.
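
Estimating F from an annual-peak series is straightforward under the fractal assumption Q(T) ~ T^alpha, since then F = 10^alpha is the same for every decade of recurrence interval. A minimal sketch using Weibull plotting positions:

```python
import numpy as np

def flood_ratio(annual_peaks):
    """Estimate F = Q(10T)/Q(T) assuming power-law (fractal) flood
    statistics Q(T) ~ T**alpha. Recurrence intervals come from
    Weibull plotting positions on the ranked annual-peak series."""
    q = np.sort(np.asarray(annual_peaks, dtype=float))[::-1]  # largest first
    n = len(q)
    T = (n + 1.0) / np.arange(1, n + 1)           # recurrence interval (yr)
    alpha = np.polyfit(np.log10(T), np.log10(q), 1)[0]
    return 10.0 ** alpha                          # == Q10/Q1 == Q100/Q10

# Under fractal statistics this single decade ratio extrapolates to the
# 100-year and 1,000-year floods; exponential statistics would instead
# give a ratio that shrinks with each successive decade.
```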

9.
Article in English | MEDLINE | ID: mdl-25353434

ABSTRACT

Recent developments in hydraulic fracturing (fracking) have enabled the recovery of large quantities of natural gas and oil from old, low-permeability shales. These developments include a change from low-volume, high-viscosity fluid injection to high-volume, low-viscosity injection. The injected fluid introduces distributed damage that provides fracture permeability for the extraction of the gas and oil. In order to model this process, we utilize a loopless nontrapping invasion percolation previously introduced to model optimal polymers in a strongly disordered medium and for determining minimum energy spanning trees on a lattice. We performed numerical simulations on a two-dimensional square lattice and find significant differences from other percolation models. Additionally, we find that the growing fracture network satisfies both Horton-Strahler and Tokunaga network statistics. As with other invasion percolation models, our model displays burst dynamics, in which the cluster extends rapidly into a connected region. We introduce an alternative definition of bursts to be a consecutive series of opened bonds whose strengths are all below a specified value. Using this definition of bursts, we find good agreement with a power-law frequency-area distribution. These results are generally consistent with the observed distribution of microseismicity observed during a high-volume frack.
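
The burst definition given above, a consecutive series of opened elements whose strengths are all below a specified value, can be illustrated with a short invasion-percolation sketch. For brevity this grows a site (not loopless bond) cluster, so it is a structural illustration rather than the paper's model; the threshold and lattice size are arbitrary.

```python
import heapq
import random

def invasion_bursts(L=200, steps=20_000, threshold=0.5, seed=0):
    """Site invasion percolation: repeatedly invade the weakest
    perimeter site. A burst is a maximal run of consecutive invasions
    whose random strengths all lie below `threshold`; burst sizes are
    expected to be power-law distributed."""
    rng = random.Random(seed)
    strength = {}
    def s(site):                      # memoized random bond/site strength
        return strength.setdefault(site, rng.random())
    start = (L // 2, L // 2)
    invaded, heap = {start}, []
    def push_neighbors(x, y):
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb not in invaded and 0 <= nb[0] < L and 0 <= nb[1] < L:
                heapq.heappush(heap, (s(nb), nb))
    push_neighbors(*start)
    bursts, current = [], 0
    for _ in range(steps):
        if not heap:
            break
        val, site = heapq.heappop(heap)
        if site in invaded:           # stale duplicate heap entry
            continue
        invaded.add(site)
        push_neighbors(*site)
        if val < threshold:
            current += 1              # burst continues
        elif current:
            bursts.append(current)    # burst ends
            current = 0
    return bursts
```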

10.
Phys Rev E Stat Nonlin Soft Matter Phys ; 86(2 Pt 2): 026103, 2012 Aug.
Article in English | MEDLINE | ID: mdl-23005821

ABSTRACT

Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data sets as a test of their i.i.d. character. We illustrate the validity of our analysis utilizing the peak-to-peak statistics of a Gaussian white noise. We also consider the nearest-neighbor cluster statistics of point processes in time. If the time intervals are random, we show that the cluster-size statistics are identical to the peak-to-peak sequence statistics of time series. In order to study the influence of correlations in a time series, we determine the peak-to-peak sequence statistics for the Langevin equation of kinetic theory leading to Brownian motion. To test our methodology, we consider a variety of applications. Using a global catalog of earthquakes, we obtain the peak-to-peak statistics of earthquake magnitudes and the nearest-neighbor interoccurrence time statistics. In both cases, we find good agreement with the i.i.d. theory. We also consider the interval statistics of the Old Faithful geyser in Yellowstone National Park. In this case, we find a significant deviation from the i.i.d. theory, which we attribute to antipersistence. We consider the interval statistics using the AL index of geomagnetic substorms. We again find a significant deviation from i.i.d. behavior that we attribute to mild persistence. Finally, we examine the behavior of the Standard & Poor's 500 stock index's daily returns from 1928 to 2011 and show that, while it is close to being i.i.d., there is, again, significant persistence. We expect that there will be many other applications of our methodology both to interoccurrence statistics and to time series.
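
The i.i.d. baseline of three events per peak-to-peak sequence is easy to verify by Monte Carlo, since for an i.i.d. series each interior point is a local maximum with probability 1/3:

```python
import numpy as np

rng = np.random.default_rng(1)

def peak_to_peak_lengths(x):
    """Numbers of events between successive local maxima of a series."""
    x = np.asarray(x)
    # interior points strictly larger than both neighbors
    peaks = np.flatnonzero((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])) + 1
    return np.diff(peaks)

x = rng.standard_normal(1_000_000)   # Gaussian white noise (i.i.d.)
lengths = peak_to_peak_lengths(x)
print(lengths.mean())                # -> approximately 3 for any i.i.d. series
```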


Subject(s)
Population Dynamics , Algorithms , Animals , Canada , Cluster Analysis , Earthquakes , Investments , Kinetics , Models, Statistical , Motion , Normal Distribution , Probability , Reproducibility of Results , Wyoming
11.
Phys Rev E Stat Nonlin Soft Matter Phys ; 86(2 Pt 1): 021106, 2012 Aug.
Article in English | MEDLINE | ID: mdl-23005722

ABSTRACT

Many driven threshold systems display a spectrum of avalanche event sizes, often characterized by power-law scaling. An important problem is to compute probabilities of the largest events ("Black Swans"). We develop a data-driven approach to the problem by transforming to the event index frame, and relating this to Shannon information. For earthquakes, we find the 12-month probability for magnitude m>6 earthquakes in California increases from about 30% after the last event, to 40%-50% prior to the next one.
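
The flavor of the calculation can be sketched as an empirical hazard in natural time. Converting a 12-month window into an equivalent small-event count requires the regional seismicity rate, which is omitted here, and the paper's information-theoretic weighting is not reproduced.

```python
import numpy as np

def conditional_probability(intervals, n_elapsed, n_window):
    """Empirical probability that the next large earthquake arrives
    within `n_window` further small-event counts, given that
    `n_elapsed` counts have passed since the last one. `intervals`
    are completed natural-time intervals between past large events."""
    iv = np.asarray(intervals, dtype=float)
    survivors = iv > n_elapsed          # intervals that lasted this long
    if not survivors.any():
        return float("nan")             # no precedent in the catalog
    hits = (iv <= n_elapsed + n_window) & survivors
    return hits.sum() / survivors.sum()
```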

12.
Phys Rev E Stat Nonlin Soft Matter Phys ; 83(4 Pt 2): 046118, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21599251

ABSTRACT

Observations suggest that contemporary wildfire suppression practices in the United States have contributed to conditions that facilitate large, destructive fires. We introduce a forest-fire model with natural fire resistance that supports this theory. Fire resistance is defined with respect to the size and shape of clusters; the model yields power-law frequency-size distributions of model fires that are consistent with field observations in the United States, Canada, and Australia.
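
For reference, a minimal version of the underlying forest-fire automaton (essentially the classic Drossel-Schwabl model with synchronous growth); the paper's addition of a cluster-size- and shape-dependent fire resistance is omitted, and the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def forest_fire(L=128, steps=20_000, p_grow=0.05):
    """Each step, empty sites sprout trees with probability p_grow;
    lightning then strikes one random site, and if it holds a tree the
    whole connected cluster burns. Returns the model-fire sizes,
    expected to be power-law distributed."""
    grid = np.zeros((L, L), dtype=bool)   # True = tree
    fire_sizes = []
    for _ in range(steps):
        grid |= rng.random((L, L)) < p_grow
        x, y = rng.integers(0, L, size=2)
        if grid[x, y]:
            stack, burned = [(x, y)], 0
            grid[x, y] = False
            while stack:
                i, j = stack.pop()
                burned += 1
                for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                    if 0 <= ni < L and 0 <= nj < L and grid[ni, nj]:
                        grid[ni, nj] = False
                        stack.append((ni, nj))
            fire_sizes.append(burned)
    return fire_sizes
```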

13.
Phys Rev E Stat Nonlin Soft Matter Phys ; 82(6 Pt 2): 066111, 2010 Dec.
Article in English | MEDLINE | ID: mdl-21230709

ABSTRACT

A record-breaking temperature is the highest or lowest temperature at a station since the period of time considered began. The temperatures at a station constitute a time series. After the removal of daily and annual periodicities, the primary considerations are trends (i.e., global warming) and long-range correlations. We first carry out Monte Carlo simulations to determine the influence of trends and long-range correlations on record-breaking statistics. We take a time series that is a Gaussian white noise and give the classic record-breaking theory results for an independent and identically distributed process. We then carry out simulations to determine the influence of long-range correlations and linear temperature trends. For the range of fractional Gaussian noises that are observed to be applicable to temperature time series, the influence on the record-breaking statistics is less than 10%. We next superimpose a linear trend on a Gaussian white noise and extend the theory to include the effect of an additive trend. We determine the ratios of the number of maximum to the number of minimum record-breaking temperatures. We find the single governing parameter to be the ratio of the temperature change per year to the standard deviation of the underlying white noise. To test our approach, we consider a 30-year record of temperatures at the Mauna Loa Observatory for 1977-2006. We determine the temperature trends by direct measurements and use our simulations to infer trends from the number of record-breaking temperatures. The two approaches give values that are in good agreement. We find that the warming trend is primarily due to an increase in the (overnight) minimum temperatures, while the maximum (daytime) temperatures are approximately constant.
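
The central comparison, record counts with and without a linear trend, is a few lines of Monte Carlo. For an i.i.d. series the expected number of records in n observations is the harmonic number H_n = 1 + 1/2 + ... + 1/n regardless of the distribution; a warming trend skews the ratio of maximum to minimum records above one. The trend value below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def count_records(x):
    """Numbers of running maxima and minima in a series
    (the first value counts as both)."""
    run_max = np.maximum.accumulate(x)
    run_min = np.minimum.accumulate(x)
    maxima = 1 + np.count_nonzero(x[1:] > run_max[:-1])
    minima = 1 + np.count_nonzero(x[1:] < run_min[:-1])
    return maxima, minima

n, trials = 365 * 30, 200
trend_per_step = 0.0005   # warming trend in units of the noise sigma
hot, cold = zip(*(count_records(rng.standard_normal(n)
                                + trend_per_step * np.arange(n))
                  for _ in range(trials)))
print(np.mean(hot) / np.mean(cold))  # ratio > 1 signals a warming trend
```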

14.
Phys Rev Lett ; 97(23): 238501, 2006 Dec 08.
Article in English | MEDLINE | ID: mdl-17280253

ABSTRACT

Earthquake occurrence in nature is thought to result from correlated elastic stresses, leading to clustering in space and time. We show that the occurrence of major earthquakes in California correlates with time intervals when fluctuations in small earthquakes are suppressed relative to the long term average. We estimate a probability of less than 1% that this coincidence is due to random clustering.

15.
Phys Rev Lett ; 95(21): 218501, 2005 Nov 18.
Article in English | MEDLINE | ID: mdl-16384191

ABSTRACT

In this work the distribution of interoccurrence times between earthquakes in aftershock sequences is analyzed and a model based on a nonhomogeneous Poisson (NHP) process is proposed to quantify the observed scaling. In this model the generalized Omori's law for the decay of aftershocks is used as a time-dependent rate in the NHP process. The analytically derived distribution of interoccurrence times is applied to several major aftershock sequences in California to confirm the validity of the proposed hypothesis.
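
Simulating the proposed process is a standard thinning exercise: candidate events are drawn at the peak rate r_max = K/c^p and accepted with probability r(t)/r_max. The parameter values below are illustrative; the resulting interoccurrence times can be binned and compared against the analytic distribution derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def omori_nhp(K=100.0, c=1.0, p=1.2, T=1000.0):
    """Nonhomogeneous Poisson process with generalized Omori rate
    r(t) = K/(c + t)**p, simulated by thinning against r_max = K/c**p."""
    r_max = K / c**p
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / r_max)    # candidate at the peak rate
        if t >= T:
            break
        if rng.random() < (c / (c + t))**p:  # accept with r(t)/r_max
            times.append(t)
    return np.array(times)

# Interoccurrence times between consecutive simulated aftershocks:
dt = np.diff(omori_nhp())
```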

16.
Philos Trans A Math Phys Eng Sci ; 360(1800): 2433-74, 2002 Nov 15.
Article in English | MEDLINE | ID: mdl-12460475

ABSTRACT

We present a flexible multi-reservoir (primitive lower mantle, depleted upper mantle, upper continental crust, lower continental crust and atmosphere) forward-transport model of the Earth, incorporating the Sm-Nd, Rb-Sr, U-Th-Pb-He and K-Ar isotope-decay systematics. Mathematically, the model consists of a series of differential equations, describing the changing abundance of each nuclide in each reservoir, which are solved repeatedly over the history of the Earth. Fluxes between reservoirs are keyed to heat production and further constrained by estimates of present-day fluxes (e.g. subduction, plume flux) and current sizes of reservoirs. Elemental transport is tied to these fluxes through 'enrichment factors', which allow for fractionation between species. A principal goal of the model is to reproduce the Pb-isotope systematics of the depleted upper mantle, which has not been done in earlier models. At present, the depleted upper mantle has low (238)U/(204)Pb (mu) and (232)Th/(238)U (kappa) ratios, but Pb-isotope ratios reflect high time-integrated values of these ratios. These features are reproduced in the model and are a consequence of preferential subduction of U and of radiogenic Pb from the upper continental crust into the depleted upper mantle. At the same time, the model reproduces the observed Sr-, Nd-, Ar- and He-isotope ratios of the atmosphere, continental crust and mantle. We show that both steady-state and time-variant incompatible-element concentrations and ratios in the continental crust and upper mantle are possible. Indeed, in some cases, incompatible-element concentrations and ratios increase with time in the depleted mantle. Hence, assumptions of a progressively depleting or steady-state upper mantle are not justified. A ubiquitous feature of this model, as well as other evolutionary models, is early rapid depletion of the upper mantle in highly incompatible elements; hence, a near-chondritic Th/U ratio in the upper mantle throughout the Archean is unlikely. The model also suggests that the optimal value of the bulk silicate Earth's K/U ratio is close to 10000; lower values suggested recently seem unlikely.
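
Structurally, the model is a set of coupled linear ODEs, one per nuclide per reservoir. The sketch below collapses this to a two-reservoir Rb-Sr toy with constant, hypothetical exchange rates k12 and k21 standing in for the heat-production-keyed fluxes and enrichment factors; it shows only the form of the equations, not the paper's five-reservoir model.

```python
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA_RB87 = 1.42e-11  # /yr, decay constant of 87Rb -> 87Sr

def rb_sr_two_box(t, y, k12, k21):
    """Toy two-reservoir (1 = depleted mantle, 2 = continental crust)
    transport model for the Rb-Sr system only.
    y = [Rb87_1, Sr87_1, Rb87_2, Sr87_2]."""
    rb1, sr1, rb2, sr2 = y
    return [
        -LAMBDA_RB87 * rb1 - k12 * rb1 + k21 * rb2,  # decay + exchange
        +LAMBDA_RB87 * rb1 - k12 * sr1 + k21 * sr2,  # ingrowth + exchange
        -LAMBDA_RB87 * rb2 + k12 * rb1 - k21 * rb2,
        +LAMBDA_RB87 * rb2 + k12 * sr1 - k21 * sr2,
    ]

# Integrate over Earth history (4.5 Gyr) from an arbitrary initial state;
# the exchange rates are purely illustrative.
sol = solve_ivp(rb_sr_two_box, (0.0, 4.5e9), [1.0, 0.0, 0.0, 0.0],
                args=(1e-10, 5e-11), method="LSODA")
```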


Subject(s)
Earth, Planet , Chemical Evolution , Planetary Evolution , Geology/methods , Isotopes/chemistry , Models, Theoretical , Atmosphere/analysis , Atmosphere/chemistry , Computer Simulation , Geologic Sediments/analysis , Isotopes/analysis , Lead/analysis , Noble Gases/chemistry , Rheology/methods , Sensitivity and Specificity , Thermodynamics
17.
Proc Natl Acad Sci U S A ; 99 Suppl 1: 2530-7, 2002 Feb 19.
Article in English | MEDLINE | ID: mdl-11875206

ABSTRACT

We consider the frequency-size statistics of two natural hazards, forest fires and landslides. Both appear to satisfy power-law (fractal) distributions to a good approximation under a wide variety of conditions. Two simple cellular-automata models have been proposed as analogs for this observed behavior, the forest fire model for forest fires and the sand pile model for landslides. The behavior of these models can be understood in terms of a self-similar inverse cascade. For the forest fire model the cascade consists of the coalescence of clusters of trees; for the sand pile model the cascade consists of the coalescence of metastable regions.
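
Of the two automata, the sand pile model is the more compact to sketch; below is a minimal Bak-Tang-Wiesenfeld implementation whose avalanche-size distribution exhibits the power-law (fractal) behavior discussed above. Lattice size and drop count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)

def sandpile_avalanches(L=64, drops=20_000):
    """Bak-Tang-Wiesenfeld sandpile (the landslide analog): grains are
    added at random sites; any site holding 4 or more grains topples,
    shedding one grain to each neighbor (grains fall off the edges).
    Returns avalanche sizes (topplings per added grain)."""
    z = np.zeros((L, L), dtype=int)
    sizes = []
    for _ in range(drops):
        x, y = rng.integers(0, L, size=2)
        z[x, y] += 1
        size = 0
        unstable = [(x, y)] if z[x, y] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if z[i, j] < 4:           # may have toppled already
                continue
            z[i, j] -= 4
            size += 1
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < L and 0 <= nj < L:
                    z[ni, nj] += 1
                    if z[ni, nj] >= 4:
                        unstable.append((ni, nj))
        sizes.append(size)
    return sizes
```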
