Results 1 - 20 of 39
1.
Proc Natl Acad Sci U S A ; 118(47), 2021 11 23.
Article in English | MEDLINE | ID: mdl-34782455

ABSTRACT

Network theory, as emerging from complex systems science, can provide critical predictive power for mitigating the global warming crisis and other societal challenges. Here we discuss the main differences between this approach and classical numerical modeling and highlight several cases where the network approach substantially improved the prediction of high-impact phenomena: 1) El Niño events, 2) droughts in the central Amazon, 3) extreme rainfall in the eastern Central Andes, 4) the Indian summer monsoon, and 5) extreme stratospheric polar vortex states that influence the occurrence of wintertime cold spells in northern Eurasia. In this perspective, we argue that network-based approaches can gainfully complement numerical modeling.

2.
Proc Natl Acad Sci U S A ; 117(1): 177-183, 2020 01 07.
Article in English | MEDLINE | ID: mdl-31874928

ABSTRACT

The El Niño Southern Oscillation (ENSO) is one of the most prominent interannual climate phenomena. Early and reliable ENSO forecasting remains a crucial goal, due to its serious implications for the economy, society, and ecosystems. Despite the development of various dynamical and statistical prediction models in recent decades, the "spring predictability barrier" remains a great challenge for long-lead-time (over 6 mo) forecasting. To overcome this barrier, here we develop an analysis tool, System Sample Entropy (SysSampEn), to measure the complexity (disorder) of the system composed of temperature anomaly time series in the Niño 3.4 region. When applying this tool to several near-surface air temperature and sea surface temperature datasets, we find that in all datasets a strong positive correlation exists between the magnitude of El Niño and the previous calendar year's SysSampEn (complexity). We show that this correlation allows us to forecast the magnitude of an El Niño with a prediction horizon of 1 y and high accuracy (i.e., root-mean-square error = 0.23 °C for the average of the individual dataset forecasts). For the 2018 El Niño event, our method forecasted a weak El Niño with a magnitude of 1.11 ± 0.23 °C. The framework presented here not only facilitates long-term forecasting of the El Niño magnitude but can potentially also be used as a measure of the complexity of other natural or engineered complex systems.
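The exact definition of SysSampEn is given in the paper and is not reproduced here; as a rough illustration of the underlying idea, the sketch below computes a classical sample-entropy estimate for a single temperature-anomaly series. The function name, template length m, and tolerance factor are illustrative assumptions, and the actual SysSampEn generalizes this template-matching count across many grid-point series in and around the Niño 3.4 region.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    # Classical sample entropy: compares the number of matching
    # templates of length m with those of length m + 1; larger values
    # indicate a more disordered (complex) series.
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance between template i and all later templates
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Illustrative use on a synthetic daily anomaly series
rng = np.random.default_rng(0)
print(sample_entropy(rng.normal(size=365)))
```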

3.
Proc Natl Acad Sci U S A ; 114(15): E2998-E3003, 2017 04 11.
Article in English | MEDLINE | ID: mdl-28348227

ABSTRACT

The question of whether a seasonal climate trend (e.g., the increase of summer temperatures in Antarctica in recent decades) is of anthropogenic or natural origin is of great importance for mitigation and adaptation measures alike. The conventional significance analysis assumes that (i) the seasonal climate trends can be quantified by linear regression, (ii) the different seasonal records can be treated as independent records, and (iii) the persistence in each of these seasonal records can be characterized by short-term memory described by an autoregressive process of first order. Here we show that assumption (ii) is not valid, due to strong intraannual correlations by which different seasons are correlated. We also show that, even in the absence of correlations, for Gaussian white noise, the conventional analysis leads to a strong overestimation of the significance of the seasonal trends, because multiple testing has not been taken into account. In addition, when the data exhibit long-term memory (which is the case in most climate records), assumption (iii) leads to a further overestimation of the trend significance. Combining Monte Carlo simulations with the Holm-Bonferroni method, we demonstrate how to obtain reliable estimates of the significance of seasonal climate trends in long-term correlated records. As an illustration, we apply our method to representative temperature records from West Antarctica, which is one of the fastest-warming places on Earth and belongs to the crucial tipping elements in the Earth system.
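As a small illustration of the multiple-testing correction mentioned above, here is a minimal sketch of the Holm-Bonferroni step-down procedure. The p-values are hypothetical placeholders; in the paper they would come from Monte Carlo simulations of long-term correlated surrogate records rather than from made-up numbers.

```python
def holm_bonferroni(p_values, alpha=0.05):
    # Holm-Bonferroni step-down procedure: returns a list of booleans
    # indicating which hypotheses are rejected at family-wise error
    # rate alpha.
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    rejected = [False] * m
    for rank, idx in enumerate(order):
        # compare the k-th smallest p-value with alpha / (m - k + 1)
        if p_values[idx] <= alpha / (m - rank):
            rejected[idx] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return rejected

# Hypothetical p-values for four seasonal trend tests
seasonal_p = [0.003, 0.04, 0.012, 0.20]
print(holm_bonferroni(seasonal_p))
```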

4.
Proc Natl Acad Sci U S A ; 111(6): 2064-6, 2014 Feb 11.
Article in English | MEDLINE | ID: mdl-24516172

ABSTRACT

The most important driver of climate variability is the El Niño Southern Oscillation, which can trigger disasters in various parts of the globe. Despite its importance, conventional forecasting is still limited to 6 mo ahead. Recently, we developed an approach based on network analysis that allows projection of an El Niño event about 1 y ahead. Here we show that our method correctly predicted the absence of El Niño events in 2012 and 2013, and we now announce that our approach indicated (already in September 2013) the return of El Niño in late 2014 with a 3-in-4 likelihood. We also discuss the relevance of the next El Niño to the question of global warming and the present hiatus in the global mean surface temperature.

5.
Proc Natl Acad Sci U S A ; 110(29): 11742-5, 2013 07 16.
Article in English | MEDLINE | ID: mdl-23818627

ABSTRACT

Although anomalous episodic warming of the eastern equatorial Pacific, dubbed El Niño by Peruvian fishermen, has major (and occasionally devastating) impacts around the globe, robust forecasting is still limited to about 6 mo ahead. A significant extension of the prewarning time would be instrumental for avoiding some of the worst damages, such as harvest failures in developing countries. Here we introduce a unique avenue toward El Niño prediction based on network methods, inspecting emerging teleconnections. Our approach starts from the evidence that a large-scale cooperative mode, linking the El Niño basin (equatorial Pacific corridor) and the rest of the ocean, builds up in the calendar year before the warming event. On this basis, we develop an efficient 12-mo forecasting scheme, i.e., we roughly double the early-warning period. Our method is based on high-quality observational data available since 1950 and yields hit rates above 0.5, whereas false-alarm rates are below 0.1.
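The forecasting scheme relies on the strength of teleconnection links between grid points inside and outside the El Niño basin. The sketch below shows one way such a link strength is commonly defined in climate-network studies (the peak of the lagged cross-correlation, normalized by the mean and standard deviation over all lags); the function name, lag range, and normalization details are illustrative and not necessarily the authors' exact definition.

```python
import numpy as np

def link_strength(x, y, max_lag=200):
    # Peak of the lagged cross-correlation between two anomaly series,
    # normalized by the mean and standard deviation over all lags.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    corrs = []
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        corrs.append(np.mean(a * b))
    corrs = np.array(corrs)
    return (np.max(np.abs(corrs)) - corrs.mean()) / corrs.std()

# Illustrative use with two synthetic daily anomaly series
rng = np.random.default_rng(1)
x = rng.normal(size=3650)
y = np.roll(x, 30) + rng.normal(scale=2.0, size=3650)  # lagged, noisy copy
print(link_strength(x, y))
```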


Subjects
Algorithms, El Niño-Southern Oscillation, Forecasting/methods, Pacific Ocean, Sensitivity and Specificity, Temperature, Time Factors
6.
Phys Chem Chem Phys ; 13(7): 2663-6, 2011 Feb 21.
Article in English | MEDLINE | ID: mdl-21183980

ABSTRACT

Networks of inorganic particles (here SiO2) formed within organic liquids play an important role in science. Recently they have been considered as 'soggy sand' electrolytes for Li-based batteries, with a fascinating combination of mechanical and electrical properties. In this communication we model the formation and stability of the networks by Cluster-Cluster Aggregation followed by coarsening on a different time scale. The comparison of computer simulations based on our model with experimental results obtained for LiClO4-containing polyethylene glycol reveals (i) that the percolation threshold for interfacial conductivity is very small, (ii) that the networks, once formed, coarsen with a time constant roughly independent of volume fraction and size into a denser aggregate that then stays stable under operating conditions, and (iii) that the conducting solvent becomes trapped at high packing.

7.
Phys Rev E Stat Nonlin Soft Matter Phys ; 80(2 Pt 2): 026131, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19792224

ABSTRACT

We suggest a risk estimation method for financial records that is based on the statistics of return intervals between events above/below a certain threshold Q and is particularly suited for multifractal records. The method is based on the knowledge of the probability W_Q(t; Δt) that within the next Δt units of time at least one event above Q occurs, if the last event occurred t time units ago. We propose an analytical estimate of W_Q and show explicitly that the proposed method is superior to the conventional precursory pattern recognition technique widely used in signal analysis, which requires considerable fine tuning and is difficult to implement. We also show that the estimation of the Value at Risk, a standard tool in finance, can be improved considerably by this method.
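As a purely illustrative counterpart to the analytical estimate proposed in the paper (which is not reproduced here), the sketch below shows how W_Q(t; Δt) can be estimated empirically from the return intervals of a record, assuming the standard identification of W_Q with the conditional probability that an interval ends within the next Δt units given that it has already lasted t units.

```python
import numpy as np

def return_intervals(series, Q):
    # indices of events exceeding the threshold Q
    events = np.flatnonzero(series > Q)
    return np.diff(events)          # waiting times between consecutive events

def w_q(intervals, t, dt):
    # Empirical probability that, given the last event occurred t time
    # units ago, at least one event occurs within the next dt units:
    # W_Q(t; dt) ~ P(t < r <= t + dt) / P(r > t)
    intervals = np.asarray(intervals)
    longer_than_t = np.sum(intervals > t)
    if longer_than_t == 0:
        return np.nan
    within_dt = np.sum((intervals > t) & (intervals <= t + dt))
    return within_dt / longer_than_t

# Illustrative use on a synthetic heavy-tailed "return" series
rng = np.random.default_rng(2)
returns = rng.standard_t(df=3, size=100_000)
r = return_intervals(returns, Q=np.quantile(returns, 0.99))
print(w_q(r, t=50, dt=25))
```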

8.
Phys Rev E Stat Nonlin Soft Matter Phys ; 79(6 Pt 2): 066101, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19658558

ABSTRACT

Long-term memory is ubiquitous in nature and has important consequences for the occurrence of natural hazards, but its detection is often complicated by the short length of the considered records and by additive white noise in the data. Here we study synthetic Gaussian-distributed records x_i of length N that consist of a long-term correlated component (1-a)y_i, characterized by a correlation exponent γ with 0 < γ < 1, and a white-noise component a·η_i. We show that the autocorrelation function C_N(s) has the general form C_N(s) = [C_∞(s) - E_a]/(1 - E_a), where C_∞(0) = 1, C_∞(s > 0) = B_a s^(-γ), and E_a = {2B_a/[(2-γ)(1-γ)]} N^(-γ) + O(N^(-1)). The finite-size parameter E_a also occurs in related quantities, for example, in the variance Δ_N²(s) of the local mean in time windows of length s: Δ_N²(s) = [Δ_∞²(s) - E_a]/(1 - E_a). For purely long-term correlated data, B_0 ≅ (2-γ)(1-γ)/2, yielding E_0 ≅ N^(-γ), and thus C_N(s) = [(2-γ)(1-γ)/2 · s^(-γ) - N^(-γ)]/[1 - N^(-γ)] and Δ_N²(s) = [s^(-γ) - N^(-γ)]/[1 - N^(-γ)]. We show how to estimate E_a and C_∞(s) from a given data set and thus how to obtain accurately the exponent γ and the amount of white noise a.
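As a small numerical illustration of the purely long-term correlated case, the sketch below evaluates the finite-size-corrected autocorrelation C_N(s) given above and compares it with the asymptotic (2-γ)(1-γ)/2 · s^(-γ) decay as the record length N grows; the values of γ, s, and N are arbitrary examples.

```python
import numpy as np

def c_n(s, n, gamma):
    # Finite-size autocorrelation for purely long-term correlated data:
    # C_N(s) = [(2-γ)(1-γ)/2 · s^(-γ) - N^(-γ)] / [1 - N^(-γ)]
    prefactor = 0.5 * (2 - gamma) * (1 - gamma)
    return (prefactor * s ** (-gamma) - n ** (-gamma)) / (1 - n ** (-gamma))

gamma, s = 0.4, 10.0
asymptotic = 0.5 * (2 - gamma) * (1 - gamma) * s ** (-gamma)
for n in (10 ** 2, 10 ** 3, 10 ** 4, 10 ** 6):
    # C_N(s) approaches the asymptotic value as N grows
    print(n, c_n(s, n, gamma), asymptotic)
```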

9.
Phys Rev E Stat Nonlin Soft Matter Phys ; 78(4 Pt 1): 041115, 2008 Oct.
Article in English | MEDLINE | ID: mdl-18999387

ABSTRACT

In this paper we extend the branching aftershock sequence model to study the role of missing data at short times and small amplitudes after a mainshock. We apply this model, which contains three parameters characterizing the missing data, to the magnitude and temporal statistics of four aftershock sequences in California. We find that the observed time-dependent deviations of the frequency-magnitude scaling from the Gutenberg-Richter power-law dependence can be described quantitatively by the model. We also show that, for the same set of parameters, the model is able to explain quantitatively the observed magnitude-dependent deviations of the temporal decay of aftershocks from Omori's law. In addition, we show that the same sets of data can also reproduce quite well the various functional forms of the probability density functions of the return times between consecutive events with magnitudes above a prescribed threshold, as well as the violation of scaling at short and intermediate time scales.
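The paper's three-parameter description of the missing data is not reproduced here; purely as an illustration of the kind of simulation involved, the sketch below generates one branching generation of aftershocks with Gutenberg-Richter magnitudes and Omori-law waiting times and applies an ad hoc time-dependent completeness filter. All parameter values, the productivity constant, and the form of the filter are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def gr_magnitude(size, b=1.0, m_min=2.0):
    # Gutenberg-Richter: magnitudes exponentially distributed above m_min
    return m_min + rng.exponential(scale=1 / (b * np.log(10)), size=size)

def omori_times(size, c=0.01, p=1.1, t_max=100.0):
    # Waiting times drawn from a truncated Omori-law density ~ (t + c)^(-p)
    u = rng.uniform(size=size)
    span = (t_max + c) ** (1 - p) - c ** (1 - p)
    return (c ** (1 - p) + u * span) ** (1 / (1 - p)) - c

def detected(mags, times, m_c0=3.0, m_min=2.0, tau=1.0):
    # Toy incompleteness filter: shortly after the mainshock, small
    # events are missed; the effective completeness magnitude decays
    # from m_c0 toward m_min with time constant tau.
    decay = np.exp(-times / tau)
    m_c = m_c0 * decay + m_min * (1 - decay)
    return mags >= m_c

# One generation of direct aftershocks of a magnitude-7 mainshock
n = rng.poisson(lam=10 ** (1.0 * (7.0 - 2.0)) * 1e-3)   # toy productivity law
mags, times = gr_magnitude(n), omori_times(n)
keep = detected(mags, times)
print(f"{n} aftershocks generated, {keep.sum()} would appear in the catalog")
```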

10.
Phys Rev E Stat Nonlin Soft Matter Phys ; 78(3 Pt 2): 036114, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18851112

ABSTRACT

We study the statistics of the interoccurrence times between events above some threshold Q in two kinds of multifractal data sets (multiplicative random cascades and multifractal random walks) with vanishing linear correlations. We show that in both data sets the relevant quantities (probability density functions and the autocorrelation function of the interoccurrence times, as well as the conditional return period) are governed by power laws with exponents that depend explicitly on the considered threshold. By studying a large number of representative financial records (market indices, stock prices, exchange rates, and commodities), we show explicitly that the interoccurrence times between large daily returns follow the same behavior, in a nearly quantitative manner. We conclude that this kind of behavior is a general consequence of the nonlinear memory inherent in the multifractal data sets.
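A minimal sketch of the empirical side of this analysis: extract the interoccurrence times between daily returns above a threshold Q and look at their autocorrelation function. The data below are synthetic and uncorrelated, so the autocorrelation is essentially flat; for real multifractal market records the paper reports a power-law decay with a threshold-dependent exponent.

```python
import numpy as np

def interval_autocorrelation(intervals, max_lag=50):
    # autocorrelation function of the interoccurrence-time sequence
    x = intervals - intervals.mean()
    var = np.mean(x ** 2)
    return np.array([np.mean(x[:len(x) - s] * x[s:]) / var
                     for s in range(1, max_lag + 1)])

rng = np.random.default_rng(4)
returns = rng.standard_t(df=4, size=50_000)   # synthetic daily returns

Q = np.quantile(returns, 0.98)                # threshold for "large" returns
intervals = np.diff(np.flatnonzero(returns > Q))
acf = interval_autocorrelation(intervals)
print(acf[:5])   # for multifractal data these would decay as a power law
```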

11.
Lancet ; 367(9523): 1674-81, 2006 May 20.
Article in English | MEDLINE | ID: mdl-16714188

ABSTRACT

BACKGROUND: Decreased vagal activity after myocardial infarction results in reduced heart-rate variability and increased risk of death. To distinguish between vagal and sympathetic factors that affect heart-rate variability, we used a signal-processing algorithm to separately characterise deceleration and acceleration of heart rate. We postulated that diminished deceleration-related modulation of heart rate is an important prognostic marker. Our prospective hypotheses were that deceleration capacity is a better predictor of risk than left-ventricular ejection fraction (LVEF) and standard deviation of normal-to-normal intervals (SDNN). METHODS: We quantified heart rate deceleration capacity by assessing 24-h Holter recordings from a post-infarction cohort in Munich (n=1455). We blindly validated the prognostic power of deceleration capacity in post-infarction populations in London, UK (n=656), and Oulu, Finland (n=600). We tested our hypotheses by assessment of the area under the receiver-operating characteristic curve (AUC). FINDINGS: During a median follow-up of 24 months, 70 people died in the Munich cohort and 66 in the London cohort. The Oulu cohort was followed up for 38 months, during which 77 people died. In the London cohort, mean AUC of deceleration capacity was 0.80 (SD 0.03) compared with 0.67 (0.04) for LVEF and 0.69 (0.04) for SDNN. In the Oulu cohort, mean AUC of deceleration capacity was 0.74 (0.03) compared with 0.60 (0.04) for LVEF and 0.64 (0.03) for SDNN (p<0.0001 for all comparisons). Stratification by dichotomised deceleration capacity was especially powerful in patients with preserved LVEF (p<0.0001 in all cohorts). INTERPRETATION: Impaired heart rate deceleration capacity is a powerful predictor of mortality after myocardial infarction and is more accurate than LVEF and the conventional measures of heart-rate variability.
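Deceleration capacity is computed with a phase-rectified signal averaging (PRSA) algorithm; the sketch below is a simplified version under the assumption that DC is obtained as [X(0) + X(1) - X(-1) - X(-2)]/4 from segments anchored at beats where the RR interval lengthens, leaving out the artifact handling and anchor-exclusion criteria that would be applied to real 24-h Holter recordings.

```python
import numpy as np

def deceleration_capacity(rr):
    # Simplified PRSA sketch: anchors are beats where the RR interval
    # lengthens relative to the previous beat; 4-beat segments around
    # each anchor are averaged, and DC = [X(0) + X(1) - X(-1) - X(-2)]/4.
    rr = np.asarray(rr, dtype=float)
    anchors = [i for i in range(2, len(rr) - 1) if rr[i] > rr[i - 1]]
    segments = np.array([rr[i - 2:i + 2] for i in anchors])  # X(-2)..X(1)
    x = segments.mean(axis=0)
    return (x[2] + x[3] - x[1] - x[0]) / 4.0

# Illustrative use on a toy RR-interval series (in milliseconds)
rng = np.random.default_rng(5)
rr = 800 + rng.normal(scale=30, size=5000)
print(deceleration_capacity(rr))
```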


Subjects
Electrocardiography, Ambulatory/methods, Heart Rate, Myocardial Infarction/mortality, Aged, Cardiotonic Agents/therapeutic use, Cohort Studies, Deceleration, Female, Humans, Male, Middle Aged, Myocardial Infarction/drug therapy, Predictive Value of Tests, ROC Curve, Risk Factors, Stroke Volume
12.
Phys Rev E Stat Nonlin Soft Matter Phys ; 75(1 Pt 1): 011128, 2007 Jan.
Article in English | MEDLINE | ID: mdl-17358131

ABSTRACT

We consider long-term correlated data with several distribution densities (Gaussian, exponential, power law, and log normal) and various correlation exponents γ (0 < γ < 1).

13.
Phys Rev E Stat Nonlin Soft Matter Phys ; 75(4 Pt 2): 045104, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17500948

ABSTRACT

We introduce an immunization method in which the percentage of vaccinations required for immunity is close to the optimal value of a targeted immunization scheme that vaccinates the highest-degree nodes. Our strategy retains the advantage of being purely local, without requiring knowledge of the global network structure or identification of the highest-degree nodes. The method consists of selecting a random node, asking it for a neighbor that has more links than itself (or more than a given threshold), and immunizing that neighbor. We compare this method to other efficient strategies on three real social networks and on a scale-free network model and find it to be significantly more effective.
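A minimal sketch of the local strategy described above, using networkx purely for illustration; the graph model, vaccination fraction, and threshold handling are assumptions, not the authors' exact protocol.

```python
import random
import networkx as nx

def neighbor_immunization(graph, fraction, threshold=None, seed=0):
    # Repeatedly pick a random node, ask it for a neighbor with more
    # links than itself (or more than `threshold`, if given), and
    # immunize that neighbor.
    rng = random.Random(seed)
    immunized = set()
    target = int(fraction * graph.number_of_nodes())
    nodes = list(graph.nodes())
    while len(immunized) < target:
        node = rng.choice(nodes)
        cutoff = threshold if threshold is not None else graph.degree(node)
        candidates = [n for n in graph.neighbors(node) if graph.degree(n) > cutoff]
        if candidates:
            immunized.add(rng.choice(candidates))
    return immunized

# Illustrative use on a scale-free network model
g = nx.barabasi_albert_graph(10_000, m=3, seed=1)
vaccinated = neighbor_immunization(g, fraction=0.1)
print(len(vaccinated), "nodes immunized")
```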

14.
Sci Rep ; 7: 41096, 2017 01 24.
Article in English | MEDLINE | ID: mdl-28117453

ABSTRACT

In the context of global warming, the question of why Antarctic sea ice extent (SIE) has increased is one of the most fundamental unsolved mysteries. Although many mechanisms have been proposed, it is still unclear whether the increasing trend is of anthropogenic origin or caused only by internal natural variability. In this study, we employ a new method in which the underlying natural persistence in the Antarctic SIE can be correctly accounted for. We find that the Antarctic SIE is not simply short-term persistent, as assumed in the standard significance analysis, but is actually characterized by a combination of both short- and long-term persistence. By generating surrogate data with the same persistence properties, the SIE trends over Antarctica (as well as five sub-regions) are evaluated using Monte Carlo simulations. It is found that the SIE trends over most sub-regions of Antarctica are not statistically significant. Only the SIE over the Ross Sea has experienced a highly significant increasing trend (p = 0.008) that cannot be explained by natural variability. Influenced by the positive SIE trend over the Ross Sea, the SIE over Antarctica as a whole also increased over the past decades, but the trend is only at the edge of being significant (p = 0.034).
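The sketch below illustrates the general surrogate-based significance test described above, using Fourier phase randomization as one common way to generate surrogates that preserve the record's correlation (persistence) structure; the paper's own surrogates, which explicitly combine short- and long-term persistence, are not reproduced here, and all parameter values are arbitrary.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    # Fourier phase randomization: keeps the power spectrum (and hence
    # the linear correlation structure) but destroys any trend
    spectrum = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, size=len(spectrum))
    phases[0] = 0.0                      # keep the zero-frequency term real
    surrogate = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))
    return surrogate + x.mean()

def trend_p_value(x, n_surrogates=2000, seed=6):
    # p-value = fraction of surrogates whose linear trend is at least
    # as large (in magnitude) as the observed trend
    rng = np.random.default_rng(seed)
    t = np.arange(len(x))
    observed = np.polyfit(t, x, 1)[0]
    slopes = np.array([np.polyfit(t, phase_randomized_surrogate(x, rng), 1)[0]
                       for _ in range(n_surrogates)])
    return np.mean(np.abs(slopes) >= np.abs(observed))

# Toy example: a weak trend added to correlated noise
rng = np.random.default_rng(7)
noise = np.convolve(rng.normal(size=500), np.ones(12) / 12, mode="same")
record = noise + 0.001 * np.arange(500)
print("p-value of the trend:", trend_p_value(record))
```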

15.
Sci Rep ; 7: 43034, 2017 02 22.
Article in English | MEDLINE | ID: mdl-28225058

ABSTRACT

Understanding the physical principles that govern the complex structural organization of DNA, as well as its mechanical and thermodynamic properties, is essential for advances in both the life sciences and genetic engineering. Recently we discovered that this complex organization is explicitly reflected in the arrangement of nucleotides, as described by a universal power-law-tailed internucleotide interval distribution that is valid for the complete genomes of various prokaryotic and eukaryotic organisms. Here we suggest a superstatistical model that represents a long DNA molecule as a series of consecutive ~150 bp segments, with the local nucleotide composition alternating between segments and exhibiting long-range correlations. We show that the superstatistical model and the corresponding DNA generation algorithm explicitly reproduce the laws governing the empirical nucleotide arrangement properties of DNA sequences for various global GC contents and optimal living temperatures. Finally, we discuss the relevance of our model in terms of DNA mechanical properties. As an outlook, we focus on finding DNA sequences that encode a given protein while simultaneously reproducing the nucleotide arrangement laws observed in empirical genomes, which may be of interest for optimizing the genetic engineering of long DNA molecules.
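A rough sketch of the generation idea only, not the authors' algorithm: each ~150 bp segment gets its own GC content, and the per-segment GC contents are made long-range correlated by spectral synthesis. The segment length is taken from the abstract, while the fluctuation amplitude, correlation exponent, and base-composition split are illustrative assumptions.

```python
import numpy as np

def long_range_correlated(n, gamma, rng):
    # Spectral synthesis: Gaussian series with power-law spectrum
    # S(f) ~ f^(-(1 - gamma)), giving autocorrelations ~ s^(-gamma)
    freqs = np.fft.rfftfreq(n, d=1.0)
    freqs[0] = freqs[1]                                  # avoid division by zero
    amplitudes = freqs ** (-(1 - gamma) / 2)
    phases = rng.uniform(0, 2 * np.pi, size=len(freqs))
    series = np.fft.irfft(amplitudes * np.exp(1j * phases), n=n)
    return (series - series.mean()) / series.std()

def generate_dna(n_segments=1000, segment_len=150, mean_gc=0.5, gamma=0.6, seed=8):
    # Each segment draws its bases according to its own GC content; the
    # per-segment GC contents are long-range correlated.
    rng = np.random.default_rng(seed)
    gc = np.clip(mean_gc + 0.1 * long_range_correlated(n_segments, gamma, rng),
                 0.05, 0.95)
    segments = []
    for g in gc:
        probs = [g / 2, g / 2, (1 - g) / 2, (1 - g) / 2]  # G, C, A, T
        segments.append("".join(rng.choice(list("GCAT"), size=segment_len, p=probs)))
    return "".join(segments)

seq = generate_dna()
print(len(seq), "bp, GC content", (seq.count("G") + seq.count("C")) / len(seq))
```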


Subjects
Bacteria/genetics, Bacterial DNA/chemistry, Theoretical Models, Algorithms, Bacillus subtilis/genetics, Bacterial DNA/metabolism, Bacterial Genome, Temperature
16.
Sci Rep ; 7: 46917, 2017 12 22.
Article in English | MEDLINE | ID: mdl-29271401

ABSTRACT

This corrects the article DOI: 10.1038/srep43034.

17.
Sci Rep ; 7: 40207, 2017 01 20.
Article in English | MEDLINE | ID: mdl-28106047

ABSTRACT

Nanoporous silicon produced by electrochemical etching of highly B-doped p-type silicon wafers can be prepared with tubular pores embedded in a silicon matrix. Such materials have found many technological applications and provide a useful model system for studying phase transitions under confinement. This paper reports a joint experimental and simulation study of diffusion in such materials, covering displacements from molecular dimensions up to tens of micrometers with carefully selected probe molecules. In addition to mass transfer through the channels, diffusion (at much smaller rates) is also found to occur in directions perpendicular to the channels, thus providing clear evidence of connectivity. With increasing displacements, propagation in both axial and transverse directions is progressively retarded, suggesting a scale-dependent, hierarchical distribution of transport resistances ("constrictions" in the channels) and of shortcuts (connecting "bridges") between adjacent channels. The experimental evidence from these studies is confirmed by molecular dynamics (MD) simulation in the range of atomistic displacements and rationalized with a simple model of statistically distributed "constrictions" and "bridges" for displacements in the micrometer range via dynamic Monte Carlo (DMC) simulation. Both ranges are demonstrated to be mutually transferable by DMC simulations based on the pore space topology determined by electron tomography.

18.
Phys Rev E Stat Nonlin Soft Matter Phys ; 73(1 Pt 2): 016130, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16486239

ABSTRACT

Many natural records exhibit long-term correlations characterized by a power-law decay of the autocorrelation function, C(s) ~ s^(-γ), with time lag s and correlation exponent 0 < γ < 1.

19.
PLoS One ; 11(11): e0164658, 2016.
Article in English | MEDLINE | ID: mdl-27893737

ABSTRACT

A fundamental problem in linguistics is how literary texts can be quantified mathematically. It is well known that the frequency of a (rare) word in a text is roughly inversely proportional to its rank (Zipf's law). Here we address the complementary question of whether the rhythm of a text, characterized by the arrangement of its rare words, can also be quantified mathematically in a similarly basic way. To this end, we consider representative classic single-authored texts from England/Ireland, France, Germany, China, and Japan. In each text, we classify each word by its rank. We focus on the rare words with ranks above some threshold Q and study the lengths of the (return) intervals between them. We find that for all texts considered, the probability S_Q(r) that the length of an interval exceeds r follows, to a very good approximation, a Weibull function, S_Q(r) = exp(-b_β r^β), with β around 0.7. The return intervals themselves are arranged in a long-range correlated, self-similar fashion, where the autocorrelation function C_Q(s) of the intervals follows a power law, C_Q(s) ∼ s^(-γ), with an exponent γ between 0.14 and 0.48. We show that these features lead to a pronounced clustering of the rare words in the text.
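A minimal sketch of the measurement described above: rank words by frequency, mark the rare ones (rank above Q), and compute the survival function S_Q(r) of the intervals between them. The toy Zipf-distributed "text" is a stand-in; with a real novel one would tokenize it into the words list instead, and S_Q(r) should then be well fitted by the stretched exponential exp(-b_β r^β).

```python
from collections import Counter

import numpy as np

def rare_word_intervals(words, threshold_rank):
    # Rank words by frequency; "rare" words have rank above the
    # threshold Q. Return the intervals between consecutive rare words.
    ranks = {w: r for r, (w, _) in enumerate(Counter(words).most_common(), start=1)}
    positions = [i for i, w in enumerate(words) if ranks[w] > threshold_rank]
    return np.diff(positions)

def survival(intervals):
    # Empirical S_Q(r): probability that an interval is longer than r
    r = np.arange(1, intervals.max() + 1)
    return r, np.array([(intervals > x).mean() for x in r])

# Toy stand-in for a literary text: a Zipf-distributed "vocabulary"
rng = np.random.default_rng(9)
words = [f"w{k}" for k in rng.zipf(a=1.8, size=200_000)]
r, s = survival(rare_word_intervals(words, threshold_rank=500))
print(list(zip(r[:5], np.round(s[:5], 3))))
```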


Subjects
Linguistics/methods, Theoretical Models, Cluster Analysis, England, France, Germany, Humans, Ireland, Language, Mathematical Computing, Probability, Vocabulary
20.
Sci Rep ; 6: 22286, 2016 Feb 29.
Article in English | MEDLINE | ID: mdl-26924271

ABSTRACT

Structural, localization, and functional properties of unknown proteins are often predicted from their primary polypeptide chains using sequence alignment with already characterized proteins and subsequent molecular modeling. Here we suggest an approach to predict various structural and structure-associated properties of proteins directly from the mass distributions of their proteolytic cleavage fragments. For amino-acid-specific cleavages, the distributions of fragment masses are determined by the distributions of inter-amino-acid intervals in the protein, which in turn apparently reflect its structural and structure-related features. Large-scale computer simulations revealed that for transmembrane proteins, either α-helical or β-barrel secondary structure could be predicted with about 90% accuracy after thermolysin cleavage. Moreover, 3 out of 4 intrinsically disordered proteins could be correctly distinguished from proteins with a fixed three-dimensional structure belonging to all four SCOP structural classes by combining 3-4 different cleavages. Additionally, in some cases the protein's cellular localization (cytosolic or membrane-associated) and its host organism (Firmicutes or Proteobacteria) could be predicted with around 80% accuracy. In contrast to cytosolic proteins, for membrane-associated proteins exhibiting specific structural conformations, the monotopic or transmembrane localization and functional group (ATP-binding, transporters, sensors, and so on) could also be predicted with high accuracy and particular robustness against missed cleavages.
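Purely as an illustration of the first step of such an analysis, the sketch below performs a toy in silico digest and computes fragment masses, assuming thermolysin cleaves on the N-terminal side of bulky hydrophobic residues and using approximate average residue masses; missed cleavages, real digestion kinetics, and the downstream classification step are omitted.

```python
import numpy as np

# Approximate average residue masses in daltons (one water added per fragment)
RESIDUE_MASS = {
    "G": 57.05, "A": 71.08, "S": 87.08, "P": 97.12, "V": 99.13,
    "T": 101.10, "C": 103.14, "L": 113.16, "I": 113.16, "N": 114.10,
    "D": 115.09, "Q": 128.13, "K": 128.17, "E": 129.12, "M": 131.19,
    "H": 137.14, "F": 147.18, "R": 156.19, "Y": 163.18, "W": 186.21,
}
WATER = 18.02

def thermolysin_fragments(sequence):
    # Toy digest: cut before each bulky hydrophobic residue (I, L, V, A, M, F)
    fragments, start = [], 0
    for i in range(1, len(sequence)):
        if sequence[i] in "ILVAMF":
            fragments.append(sequence[start:i])
            start = i
    fragments.append(sequence[start:])
    return [f for f in fragments if f]

def fragment_masses(sequence):
    return [sum(RESIDUE_MASS[a] for a in f) + WATER
            for f in thermolysin_fragments(sequence)]

# Illustrative use on a random toy "protein"
rng = np.random.default_rng(10)
protein = "".join(rng.choice(list(RESIDUE_MASS), size=300))
masses = fragment_masses(protein)
print(len(masses), "fragments; mean mass %.1f Da" % np.mean(masses))
```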


Subjects
Molecular Models, Statistical Models, Peptide Fragments/chemistry, Protein Conformation, Proteins/chemistry, Proteins/metabolism, Bacterial Proteins, Intracellular Space/metabolism, Mass Spectrometry, Molecular Weight, Peptide Fragments/metabolism, Protein Interaction Domains and Motifs, Protein Secondary Structure, Protein Transport, Proteolysis, ROC Curve, Reproducibility of Results, Structure-Activity Relationship