1.
Proc Natl Acad Sci U S A ; 112(24): 7426-31, 2015 Jun 16.
Article in English | MEDLINE | ID: mdl-26015563

ABSTRACT

A Sleeping Beauty (SB) in science refers to a paper whose importance is not recognized for several years after publication. Its citation history exhibits a long hibernation period followed by a sudden spike of popularity. Previous studies suggest a relative scarcity of SBs. The reliability of this conclusion is, however, heavily dependent on identification methods based on arbitrary threshold parameters for sleeping time and number of citations, applied to small or monodisciplinary bibliographic datasets. Here we present a systematic, large-scale, and multidisciplinary analysis of the SB phenomenon in science. We introduce a parameter-free measure that quantifies the extent to which a specific paper can be considered an SB. We apply our method to 22 million scientific papers published in all disciplines of natural and social sciences over a time span longer than a century. Our results reveal that the SB phenomenon is not exceptional. There is a continuous spectrum of delayed recognition where both the hibernation period and the awakening intensity are taken into account. Although many cases of SBs can be identified by looking at monodisciplinary bibliographic data, the SB phenomenon becomes much more apparent with the analysis of multidisciplinary datasets, where we can observe many examples of papers achieving delayed yet exceptional importance in disciplines different from those where they were originally published. Our analysis emphasizes a complex feature of citation dynamics that so far has received little attention, and also provides empirical evidence against the use of short-term citation metrics in the quantification of scientific impact.
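The abstract does not reproduce the measure itself; as a hedged illustration, the sketch below scores a paper's delayed recognition by comparing its yearly citation curve against a straight reference line drawn from the publication year to the peak-citation year, penalizing years that fall below that line. The function name and the exact normalization are assumptions for illustration, not necessarily the authors' definition.

```python
# Hedged sketch: score a paper's "delayed recognition" from its yearly citation
# counts. Assumption: the score compares the actual citation curve with a straight
# reference line drawn from the citations in the publication year to those in the
# peak year, penalizing years that fall below the line.

def delayed_recognition_score(citations_per_year):
    """citations_per_year: yearly citation counts, starting at the publication year."""
    c = citations_per_year
    t_m = max(range(len(c)), key=lambda t: c[t])    # year of maximum citations
    if t_m == 0:
        return 0.0                                  # immediate peak: no hibernation
    c0, cm = c[0], c[t_m]
    score = 0.0
    for t in range(t_m + 1):
        reference = (cm - c0) / t_m * t + c0        # straight line from c0 to the peak
        score += (reference - c[t]) / max(1, c[t])  # years far below the line add to the score
    return score

# A long hibernation followed by a sudden spike yields a much larger score
# than a steadily growing citation history.
sleeping = [0, 1, 0, 1, 0, 0, 1, 2, 1, 0, 30, 80]
steady   = [5, 8, 12, 15, 18, 20, 22, 25, 27, 30, 32, 35]
print(delayed_recognition_score(sleeping), delayed_recognition_score(steady))
```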

2.
Glob Chang Biol ; 21(7): 2655-2660, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25580828

ABSTRACT

We refine the information available through the IPCC AR5 with regard to recent trends in global GHG emissions from agriculture, forestry and other land uses (AFOLU), including global emission updates to 2012. Using all three available AFOLU datasets employed for analysis in the IPCC AR5, rather than just one as done in the IPCC AR5 WGIII Summary for Policy Makers, our analyses point to a downward revision of global AFOLU shares of total anthropogenic emissions, while providing important additional information on subsectoral trends. Our findings confirm that the share of AFOLU emissions in the anthropogenic total declined over time. They indicate a decadal average of 28.7 ± 1.5% in the 1990s and 23.6 ± 2.1% in the 2000s, and an annual value of 21.2 ± 1.5% in 2010. The IPCC AR5 had indicated a 24% share in 2010. In contrast to previous decades, when emissions from land use (land use, land use change and forestry, including deforestation) were significantly larger than those from agriculture (crop and livestock production), in 2010 agriculture was the larger component, contributing 11.2 ± 0.4% of total GHG emissions, compared to 10.0 ± 1.2% from the land use sector. Deforestation was responsible for only 8% of total anthropogenic emissions in 2010, compared to 12% in the 1990s. Since 2010, the last year assessed by the IPCC AR5, new FAO estimates indicate that land use emissions have remained stable, at about 4.8 Gt CO2 eq yr⁻¹ in 2012. Emissions minus removals have also remained stable, at 3.2 Gt CO2 eq yr⁻¹ in 2012. By contrast, agriculture emissions have continued to grow, at roughly 1% annually, and remained larger than those of the land use sector, reaching 5.4 Gt CO2 eq yr⁻¹ in 2012. These results are useful to further inform the current climate policy debate on land use, suggesting that more efforts and resources should be directed to further exploring options for mitigation in agriculture, much in line with the large efforts devoted to REDD+ in the past decade.

3.
Phys Rev Lett ; 113(8): 088701, 2014 Aug 22.
Article in English | MEDLINE | ID: mdl-25192129

ABSTRACT

We investigate the impact of community structure on information diffusion with the linear threshold model. Our results demonstrate that modular structure may have counterintuitive effects on information diffusion when social reinforcement is present. We show that strong communities can facilitate global diffusion by enhancing local, intracommunity spreading. Using both analytic approaches and numerical simulations, we demonstrate the existence of an optimal network modularity, where global diffusion requires the minimal number of early adopters.
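As a hedged illustration of the kind of dynamics studied here, the sketch below runs a linear threshold cascade on a two-community random network and reports the final adoption fraction as the share of inter-community links varies. The parameters `mu` (fraction of inter-community links), `theta` (adoption threshold), and the seeding choice are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: linear threshold dynamics on a two-community random network.
# Assumptions for illustration: a fixed fractional threshold `theta`, a planted
# two-block network controlled by the fraction `mu` of inter-community links,
# and seeds placed inside a single community.
import random
import networkx as nx

def spread_size(n=1000, k=10, mu=0.1, theta=0.3, n_seeds=20, rng=random.Random(1)):
    sizes = [n // 2, n // 2]
    p_in = k * (1 - mu) / (n / 2)        # expected intra-community degree ~ k(1-mu)
    p_out = k * mu / (n / 2)             # expected inter-community degree ~ k*mu
    G = nx.stochastic_block_model(sizes, [[p_in, p_out], [p_out, p_in]], seed=1)
    active = set(rng.sample(range(n // 2), n_seeds))   # seed one community only
    changed = True
    while changed:
        changed = False
        for v in G:
            if v in active:
                continue
            neigh = list(G[v])
            if neigh and sum(u in active for u in neigh) / len(neigh) >= theta:
                active.add(v)            # adopt once the active-neighbor fraction reaches theta
                changed = True
    return len(active) / n

for mu in (0.02, 0.1, 0.3):
    print(mu, spread_size(mu=mu))
```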


Subjects
Community Networks; Information Dissemination; Models, Theoretical
4.
Sci Adv ; 10(15): eadh4439, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38608015

ABSTRACT

Social contagion is a ubiquitous and fundamental process that drives individual and social changes. Although social contagion arises as a result of cognitive processes and biases, the integration of cognitive mechanisms with the theory of social contagion remains an open challenge. In particular, studies on social phenomena usually assume contagion dynamics to be either simple or complex, rather than allowing it to emerge from cognitive mechanisms, despite empirical evidence indicating that a social system can exhibit a spectrum of contagion dynamics, from simple to complex, simultaneously. Here, we propose a model of interacting beliefs, from which both simple and complex contagion dynamics can organically arise. Our model also elucidates how a fundamental mechanism of complex contagion, resistance, can come about from cognitive mechanisms.

5.
PNAS Nexus ; 3(7): pgae258, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38994499

ABSTRACT

Social media, seen by some as the modern public square, is vulnerable to manipulation. By controlling inauthentic accounts impersonating humans, malicious actors can amplify disinformation within target communities. The consequences of such operations are difficult to evaluate due to the challenges posed by collecting data and carrying out ethical experiments that would influence online communities. Here we use a social media model that simulates information diffusion in an empirical network to quantify the impacts of adversarial manipulation tactics on the quality of content. We find that the presence of hub accounts, a hallmark of social media, exacerbates the vulnerabilities of online communities to manipulation. Among the explored tactics that bad actors can employ, infiltrating a community is the most likely to make low-quality content go viral. Such harm can be further compounded by inauthentic agents flooding the network with low-quality, yet appealing content, but is mitigated when bad actors focus on specific targets, such as influential or vulnerable individuals. These insights suggest countermeasures that platforms could employ to increase the resilience of social media users to manipulation.

6.
Phys Rev E ; 107(2-1): 024310, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36932495

ABSTRACT

We investigate the avalanche temporal statistics of the susceptible-infected-susceptible (SIS) model when the dynamics is critical and takes place on finite random networks. By considering numerical simulations on annealed topologies we show that the survival probability always exhibits three distinct dynamical regimes. Size-dependent crossover timescales separating them scale differently for homogeneous and for heterogeneous networks. The phenomenology can be qualitatively understood based on known features of the SIS dynamics on networks. A fully quantitative approach based on Langevin theory is shown to perfectly reproduce the results for homogeneous networks, while failing in the heterogeneous case. The analysis is extended to quenched random networks, which behave in agreement with the annealed case for strongly homogeneous and strongly heterogeneous networks.
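A minimal sketch of the kind of simulation described, under simplifying assumptions: a discrete-time SIS process on an annealed homogeneous network (contacts redrawn uniformly at random at every step), started from a single infected node, with the empirical survival probability read off from many runs. Parameter values are illustrative, and recovery is taken to occur with probability one per step.

```python
# Minimal sketch: discrete-time SIS avalanches on an annealed homogeneous network.
# The critical point of this simplified variant is roughly lambda_c ~ 1/k, so
# lam = 0.1 with k = 10 sits near criticality. All values are illustrative.
import random

def survival_times(N=10_000, k=10, lam=0.1, runs=200, t_max=1000, rng=random.Random(7)):
    """Return the lifetime of each avalanche (capped at t_max)."""
    lifetimes = []
    for _ in range(runs):
        infected = {rng.randrange(N)}
        t = 0
        while infected and t < t_max:
            new_infected = set()
            for _ in infected:
                # Annealed approximation: each infected node contacts k nodes
                # drawn uniformly at random at every time step.
                for _ in range(k):
                    if rng.random() < lam:
                        new_infected.add(rng.randrange(N))
            infected = new_infected      # recovery with probability 1 per step
            t += 1
        lifetimes.append(t)
    return lifetimes

lifetimes = survival_times()
for t in (10, 100, 1000):
    print(t, sum(lt >= t for lt in lifetimes) / len(lifetimes))   # survival probability P(t)
```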

7.
Nat Commun ; 13(1): 1308, 2022 03 14.
Article in English | MEDLINE | ID: mdl-35288567

ABSTRACT

Statistical laws of information avalanches in social media appear, at least according to existing empirical studies, not robust across systems. As a consequence, radically different processes may represent plausible driving mechanisms for information propagation. Here, we analyze almost one billion time-stamped events collected from several online platforms - including Telegram, Twitter and Weibo - over observation windows longer than ten years, and show that the propagation of information in social media is a universal and critical process. Universality arises from the observation of identical macroscopic patterns across platforms, irrespective of the details of the specific system at hand. Critical behavior is deduced from the power-law distributions, and corresponding hyperscaling relations, characterizing size and duration of avalanches of information. Statistical testing on our data indicates that a mixture of simple and complex contagion characterizes the propagation of information in social media. Data suggest that the complexity of the process is correlated with the semantic content of the information that is propagated.


Subjects
Social Media; Humans
8.
Nat Hum Behav ; 6(4): 495-505, 2022 04.
Article in English | MEDLINE | ID: mdl-35115677

ABSTRACT

Newsfeed algorithms frequently amplify misinformation and other low-quality content. How can social media platforms more effectively promote reliable information? Existing approaches are difficult to scale and vulnerable to manipulation. In this paper, we propose using the political diversity of a website's audience as a quality signal. Using news source reliability ratings from domain experts and web browsing data from a diverse sample of 6,890 US residents, we first show that websites with more extreme and less politically diverse audiences have lower journalistic standards. We then incorporate audience diversity into a standard collaborative filtering framework and show that our improved algorithm increases the trustworthiness of websites suggested to users, especially those who most frequently consume misinformation, while keeping recommendations relevant. These findings suggest that partisan audience diversity is a valuable signal of higher journalistic standards that should be incorporated into algorithmic ranking decisions.
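The sketch below illustrates, under stated assumptions, how an audience-diversity signal could be blended into a ranking: a domain's diversity is summarized by the entropy of its visitors' binned partisanship scores and combined with a baseline relevance score. The data layout, the `alpha` blending weight, and the example domains are hypothetical; this is not the paper's implementation of the collaborative filtering framework.

```python
# Illustrative sketch (not the paper's implementation): score a news domain by the
# political diversity of its audience and blend that signal into a ranking.
# `visits` maps a domain to the partisanship scores (-1 left .. +1 right) of its visitors.
import math
from collections import Counter

def audience_diversity(partisanship_scores, bins=7):
    """Shannon entropy of the binned partisanship distribution of a domain's audience."""
    binned = Counter(min(bins - 1, int((s + 1) / 2 * bins)) for s in partisanship_scores)
    total = sum(binned.values())
    return -sum(c / total * math.log(c / total) for c in binned.values())

def rerank(candidates, relevance, visits, alpha=0.5):
    """Blend a baseline relevance score with the diversity signal (alpha is illustrative)."""
    return sorted(candidates,
                  key=lambda d: (1 - alpha) * relevance[d] + alpha * audience_diversity(visits[d]),
                  reverse=True)

# Hypothetical example: siteA has a narrow audience, siteB a politically mixed one.
visits = {"siteA.example": [-0.9, -0.8, -0.85, -0.9], "siteB.example": [-0.8, 0.1, 0.6, -0.3]}
relevance = {"siteA.example": 0.9, "siteB.example": 0.7}
print(rerank(["siteA.example", "siteB.example"], relevance, visits))
```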


Subjects
Social Media; Communication; Humans; Reproducibility of Results
9.
Sci Rep ; 12(1): 5966, 2022 04 26.
Article in English | MEDLINE | ID: mdl-35474313

ABSTRACT

Widespread uptake of vaccines is necessary to achieve herd immunity. However, uptake rates have varied across U.S. states during the first six months of the COVID-19 vaccination program. Misbeliefs may play an important role in vaccine hesitancy, and there is a need to understand relationships between misinformation, beliefs, behaviors, and health outcomes. Here we investigate the extent to which COVID-19 vaccination rates and vaccine hesitancy are associated with levels of online misinformation about vaccines. We also look for evidence of directionality from online misinformation to vaccine hesitancy. We find a negative relationship between misinformation and vaccination uptake rates. Online misinformation is also correlated with vaccine hesitancy rates taken from survey data. Associations between vaccine outcomes and misinformation remain significant when accounting for political as well as demographic and socioeconomic factors. While vaccine hesitancy is strongly associated with Republican vote share, we observe that the effect of online misinformation on hesitancy is strongest across Democratic rather than Republican counties. Granger causality analysis shows evidence for a directional relationship from online misinformation to vaccine hesitancy. Our results support a need for interventions that address misbeliefs, allowing individuals to make better-informed health decisions.
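As an illustration of the directional test mentioned in the abstract, the sketch below applies statsmodels' Granger causality test to two synthetic series in which "hesitancy" lags "misinformation" by two steps. The data are fabricated solely to show the mechanics of the test; the paper's series, lags, and controls are not reproduced here.

```python
# Sketch of a Granger causality test: does the misinformation series help predict
# the hesitancy series beyond the hesitancy series' own past? Synthetic data only.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 120
misinfo = rng.normal(size=T)                                  # stand-in for misinformation volume
hesitancy = np.empty(T)
hesitancy[:2] = rng.normal(size=2)
hesitancy[2:] = 0.6 * misinfo[:-2] + rng.normal(size=T - 2)   # hesitancy lags misinfo by 2 steps

# Column order matters: the test asks whether the SECOND column Granger-causes the first.
data = np.column_stack([hesitancy, misinfo])
results = grangercausalitytests(data, maxlag=4)
for lag, res in results.items():
    print(lag, round(res[0]["ssr_ftest"][1], 4))              # p-value of the F-test at each lag
```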


Subjects
COVID-19; Vaccines; COVID-19/prevention & control; COVID-19 Vaccines; Communication; Humans; Patient Acceptance of Health Care; Vaccination; Vaccination Hesitancy
10.
Phys Rev E ; 103(2): L020302, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33736024

ABSTRACT

We investigate how the properties of inhomogeneous patterns of activity, appearing in many natural and social phenomena, depend on the temporal resolution used to define individual bursts of activity. To this end, we consider time series of microscopic events produced by a self-exciting Hawkes process, and leverage a percolation framework to study the formation of macroscopic bursts of activity as a function of the resolution parameter. We find that the very same process may result in different distributions of avalanche size and duration, which are understood in terms of the competition between the 1D percolation and the branching process universality class. Pure regimes for the individual classes are observed at specific values of the resolution parameter corresponding to the critical points of the percolation diagram. A regime of crossover characterized by a mixture of the two universal behaviors is observed in a wide region of the diagram. The hybrid scaling appears to be a likely outcome for an analysis of the time series based on a reasonably chosen, but not precisely adjusted, value of the resolution parameter.
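A hedged sketch of the pipeline implied by the abstract: generate events from an exponential-kernel Hawkes process with Ogata's thinning algorithm, then group events into bursts whenever consecutive events are separated by less than a resolution parameter `delta`, and observe how burst sizes change with `delta`. All parameter values are illustrative.

```python
# Hedged sketch: events from an exponential-kernel Hawkes process (Ogata thinning),
# grouped into bursts whenever consecutive events are closer than the resolution
# parameter `delta`. All parameter values are illustrative.
import math
import random

def simulate_hawkes(mu=0.2, alpha=0.5, beta=1.0, t_max=5000, rng=random.Random(3)):
    """Self-exciting Hawkes process with branching ratio `alpha` < 1."""
    def intensity(t, events):
        return mu + sum(alpha * beta * math.exp(-beta * (t - s)) for s in events)
    events, t = [], 0.0
    while True:
        lam_bar = intensity(t, events)          # the intensity only decays until the next event
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            return events
        if rng.random() * lam_bar <= intensity(t, events):
            events.append(t)                    # accept the candidate event

def burst_sizes(events, delta):
    """Group events into avalanches: a gap larger than `delta` starts a new burst."""
    if not events:
        return []
    sizes, current = [], 1
    for prev, nxt in zip(events, events[1:]):
        if nxt - prev <= delta:
            current += 1
        else:
            sizes.append(current)
            current = 1
    sizes.append(current)
    return sizes

events = simulate_hawkes()
for delta in (0.1, 1.0, 10.0):                  # the temporal resolution of the abstract
    sizes = burst_sizes(events, delta)
    print(delta, len(sizes), max(sizes))
```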

11.
Phys Rev Lett ; 105(15): 158701, 2010 Oct 08.
Article in English | MEDLINE | ID: mdl-21230945

ABSTRACT

Online popularity has an enormous impact on opinions, culture, policy, and profits. We provide a quantitative, large-scale, temporal analysis of the dynamics of online content popularity in two massive model systems: Wikipedia and an entire country's Web space. We find that the dynamics of popularity are characterized by bursts, displaying characteristic features of critical systems such as fat-tailed distributions of magnitude and interevent time. We propose a minimal model combining the classic preferential popularity increase mechanism with the occurrence of random popularity shifts due to exogenous factors. The model recovers the critical features observed in the empirical analysis of the systems analyzed here, highlighting the key factors needed in the description of popularity dynamics.
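A minimal sketch in the spirit of the model described: at each step a page gains popularity proportionally to its current popularity (preferential increase), while with small probability an exogenous event boosts a randomly chosen page. The parameter names `p_exo` and `boost` are assumptions for illustration, not the paper's notation.

```python
# Minimal sketch of the kind of model described: preferential popularity increase
# plus occasional random exogenous boosts. Parameter names are illustrative.
import random

def simulate_popularity(n_pages=200, steps=50_000, p_exo=0.01, boost=50, rng=random.Random(5)):
    views = [1] * n_pages                                       # every page starts with one view
    for _ in range(steps):
        if rng.random() < p_exo:
            views[rng.randrange(n_pages)] += boost              # random exogenous popularity shift
        else:
            i = rng.choices(range(n_pages), weights=views)[0]   # preferential increase
            views[i] += 1
    return sorted(views, reverse=True)

top = simulate_popularity()
print(top[:10])   # a fat-tailed outcome: a few pages accumulate most of the views
```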

12.
Nucleic Acids Res ; 35(15): 5223-31, 2007.
Article in English | MEDLINE | ID: mdl-17670794

ABSTRACT

We performed numerical simulations of DNA chains to understand how local geometry of juxtaposed segments in knotted DNA molecules can guide type II DNA topoisomerases to perform very efficient relaxation of DNA knots. We investigated how the various parameters defining the geometry of inter-segmental juxtapositions at sites of inter-segmental passage reactions mediated by type II DNA topoisomerases can affect the topological consequences of these reactions. We confirmed the hypothesis that by recognizing specific geometry of juxtaposed DNA segments in knotted DNA molecules, type II DNA topoisomerases can maintain the steady-state knotting level below the topological equilibrium. In addition, we revealed that a preference for a particular geometry of juxtaposed segments as sites of strand-passage reaction enables type II DNA topoisomerases to select the most efficient pathway of relaxation of complex DNA knots. The analysis of the best selection criteria for efficient relaxation of complex knots revealed that local structures in random configurations of a given knot type statistically behave as analogous local structures in ideal geometric configurations of the corresponding knot type.


Subjects
DNA Topoisomerases, Type II/metabolism; DNA/chemistry; Computer Simulation; Models, Molecular; Nucleic Acid Conformation
13.
R Soc Open Sci ; 6(10): 191412, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31824736

ABSTRACT

As social media replace traditional communication channels, we are often exposed to more information than we can process. The presence of too many participants, for example, can turn online public spaces into noisy, overcrowded fora where no meaningful conversation can be held. Here, we analyse a large dataset of public chat logs from Twitch, a popular video-streaming platform, in order to examine how information overload affects online group communication. We measure structural and textual features of conversations, such as user output, interaction, and information content per message, across a wide range of information loads. Our analysis reveals the existence of a transition from a conversational state to a cacophony: a state with lower per capita participation, more repetition, and less information per message. This study provides a quantitative basis for further studies of the social effects of information overload, and may guide the design of more resilient online conversation systems.
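The sketch below illustrates one of the textual measures mentioned, under assumptions: the average word-level Shannon entropy per message for channels at different message rates. The log format and example channels are invented placeholders for the Twitch data.

```python
# Sketch of a per-message information-content measure: word-level Shannon entropy,
# averaged per channel and compared across message rates. Toy data for illustration.
import math
from collections import Counter

def message_entropy(message):
    """Shannon entropy (bits) of the word distribution within one message."""
    words = message.lower().split()
    if not words:
        return 0.0
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def load_vs_entropy(channel_logs):
    """channel_logs: {channel: (messages_per_minute, [messages...])} -- illustrative format."""
    return sorted((rate, sum(map(message_entropy, msgs)) / len(msgs))
                  for rate, msgs in channel_logs.values())

logs = {"calm_channel": (2.0, ["has anyone tried the new build on linux?",
                               "yes, works fine after updating the drivers"]),
        "busy_channel": (300.0, ["LUL LUL LUL", "PogChamp", "LUL"])}
print(load_vs_entropy(logs))   # high-load channels tend toward repetitive, low-entropy messages
```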

14.
Sci Rep ; 8(1): 15951, 2018 10 29.
Article in English | MEDLINE | ID: mdl-30374134

ABSTRACT

Algorithms that favor popular items are used to help us select among many choices, from top-ranked search engine results to highly-cited scientific papers. The goal of these algorithms is to identify high-quality items such as reliable news, credible information sources, and important discoveries; in short, high-quality content should rank at the top. Prior work has shown that choosing what is popular may amplify random fluctuations and lead to sub-optimal rankings. Nonetheless, it is often assumed that recommending what is popular will help high-quality content "bubble up" in practice. Here we identify the conditions in which popularity may be a viable proxy for quality content by studying a simple model of a cultural market endowed with an intrinsic notion of quality. A parameter representing the cognitive cost of exploration controls the trade-off between quality and popularity. Below and above a critical exploration cost, popularity bias is more likely to hinder quality. But we find a narrow intermediate regime of user attention where an optimal balance exists: choosing what is popular can help promote high-quality items to the top. These findings clarify the effects of algorithmic popularity bias on quality outcomes, and may inform the design of more principled mechanisms for techno-social cultural markets.
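As a hedged illustration of such a cultural-market model, the sketch below lets an agent either explore a random item (less often as a cognitive-cost parameter grows) or pick an item proportionally to its popularity, adopting it with probability equal to its intrinsic quality; it then reports the popularity-weighted average quality. The exponential form of the exploration probability and all parameter names are assumptions, not the paper's specification.

```python
# Hedged toy model in the spirit of the abstract: items have intrinsic quality; an
# agent either explores a random item or picks one proportionally to popularity,
# and adopts it with probability equal to its quality. Parameters are illustrative.
import math
import random

def run_market(n_items=100, steps=20_000, cost=0.5, rng=random.Random(11)):
    quality = [rng.random() for _ in range(n_items)]
    popularity = [1] * n_items
    for _ in range(steps):
        if rng.random() < math.exp(-cost):                            # exploration
            i = rng.randrange(n_items)
        else:
            i = rng.choices(range(n_items), weights=popularity)[0]    # popularity bias
        if rng.random() < quality[i]:                                 # adoption depends on quality
            popularity[i] += 1
    # Summary statistic: average quality weighted by final popularity (higher is better).
    return sum(q * p for q, p in zip(quality, popularity)) / sum(popularity)

for cost in (0.1, 1.0, 5.0):
    print(cost, round(run_market(cost=cost), 3))
```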


Subjects
Algorithms; Quality Control; Social Media
15.
Nat Commun ; 9(1): 4787, 2018 11 20.
Article in English | MEDLINE | ID: mdl-30459415

ABSTRACT

The massive spread of digital misinformation has been identified as a major threat to democracies. Communication, cognitive, social, and computer scientists are studying the complex causes for the viral diffusion of misinformation, while online platforms are beginning to deploy countermeasures. Little systematic, data-based evidence has been published to guide these efforts. Here we analyze 14 million messages spreading 400 thousand articles on Twitter during ten months in 2016 and 2017. We find evidence that social bots played a disproportionate role in spreading articles from low-credibility sources. Bots amplify such content in the early spreading moments, before an article goes viral. They also target users with many followers through replies and mentions. Humans are vulnerable to this manipulation, resharing content posted by bots. Successful low-credibility sources are heavily supported by social bots. These results suggest that curbing social bots may be an effective strategy for mitigating the spread of online misinformation.


Subjects
Communication; Social Media/statistics & numerical data; Social Media/standards; Social Networking; Data Collection/methods; Data Collection/statistics & numerical data; Humans; Information Dissemination/methods
16.
PLoS One ; 13(4): e0196087, 2018.
Article in English | MEDLINE | ID: mdl-29702657

ABSTRACT

Massive amounts of fake news and conspiratorial content have spread over social media before and after the 2016 US Presidential Elections despite intense fact-checking efforts. How do the spread of misinformation and fact-checking compete? What are the structural and dynamic characteristics of the core of the misinformation diffusion network, and who are its main purveyors? How can the overall amount of misinformation be reduced? To explore these questions we built Hoaxy, an open platform that enables large-scale, systematic studies of how misinformation and fact-checking spread and compete on Twitter. Hoaxy captures public tweets that include links to articles from low-credibility and fact-checking sources. We perform k-core decomposition on a diffusion network obtained from two million retweets produced by several hundred thousand accounts over the six months before the election. As we move from the periphery to the core of the network, fact-checking nearly disappears, while social bots proliferate. The number of users in the main core reaches equilibrium around the time of the election, with limited churn and increasingly dense connections. We conclude by quantifying how effectively the network can be disrupted by penalizing the most central nodes. These findings provide a first look at the anatomy of a massive online misinformation diffusion network.
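The sketch below shows the mechanics of the k-core analysis on a toy retweet network using networkx: compute each account's core number and track how the share of fact-checking activity changes as one moves toward the innermost core. The edge list and labels are placeholders, not the study's data.

```python
# Sketch of a k-core decomposition of a retweet network: compute each account's
# core number and compare fact-checking prevalence across shells. Toy data only.
import networkx as nx

edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e"), ("b", "d")]
shares_factchecks = {"a": True, "b": False, "c": False, "d": False, "e": True}

G = nx.Graph(edges)
G.remove_edges_from(nx.selfloop_edges(G))     # core_number requires a graph without self-loops
core = nx.core_number(G)                      # k-core index of every account

max_k = max(core.values())
for k in range(1, max_k + 1):
    accounts = [v for v, c in core.items() if c >= k]
    frac = sum(shares_factchecks[v] for v in accounts) / len(accounts)
    print(f"k >= {k}: {len(accounts)} accounts, fact-checking share {frac:.2f}")
```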


Subjects
Communication; Social Media; Artificial Intelligence; Humans; Information Dissemination; Politics; United States
17.
Nat Biotechnol ; 21(6): 697-700, 2003 Jun.
Article in English | MEDLINE | ID: mdl-12740586

ABSTRACT

Determining protein function is one of the most challenging problems of the post-genomic era. The availability of entire genome sequences and of high-throughput capabilities to determine gene coexpression patterns has shifted the research focus from the study of single proteins or small complexes to that of the entire proteome. In this context, the search for reliable methods for assigning protein function is of primary importance. Various approaches are available for deducing the function of uncharacterized proteins using information derived from sequence similarity or clustering patterns of co-regulated genes, phylogenetic profiles, protein-protein interactions (refs. 5-8 and Samanta, M.P. and Liang, S., unpublished data), and protein complexes. Here we propose assigning proteins to functional classes on the basis of their network of physical interactions, determined by minimizing the number of protein interactions among different functional categories. Function assignment is proteome-wide and is determined by the global connectivity pattern of the protein network. The approach results in multiple functional assignments, a consequence of the existence of multiple equivalent solutions. We apply the method to analyze the protein-protein interaction network of the yeast Saccharomyces cerevisiae. The robustness of the approach is tested in a system containing a high percentage of unclassified proteins and also in cases of deletion and insertion of specific protein interactions.
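As a hedged sketch of the optimization idea (minimizing the number of interactions between different functional categories), the code below assigns classes to unclassified proteins in a toy interaction network with a simple simulated-annealing loop. The cooling schedule, temperature, and toy data are illustrative; the paper's actual algorithm may differ.

```python
# Hedged sketch: assign functional classes to unclassified proteins so that the
# number of interaction edges joining proteins of different classes is minimized,
# via a simple simulated-annealing loop. Toy interaction list and labels.
import math
import random

edges = [("p1", "p2"), ("p2", "p3"), ("p3", "p4"), ("p4", "p5"), ("p1", "p3"), ("p5", "p6")]
known = {"p1": "metabolism", "p5": "transport", "p6": "transport"}   # classified proteins
classes = sorted(set(known.values()))
unknown = sorted({v for e in edges for v in e} - set(known))

def conflicts(assign):
    """Number of edges whose endpoints carry different functional classes."""
    return sum(assign[u] != assign[v] for u, v in edges)

rng = random.Random(2)
assign = dict(known, **{v: rng.choice(classes) for v in unknown})
cost = conflicts(assign)
T = 2.0
for step in range(5000):
    v = rng.choice(unknown)
    old = assign[v]
    assign[v] = rng.choice(classes)           # propose a new class for one unclassified protein
    new_cost = conflicts(assign)
    if new_cost <= cost or rng.random() < math.exp(-(new_cost - cost) / T):
        cost = new_cost                       # accept the move
    else:
        assign[v] = old                       # reject the move
    T *= 0.999                                # slow geometric cooling
print(cost, {v: assign[v] for v in unknown})
```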


Subjects
Fungal Proteins/classification; Fungal Proteins/metabolism; Proteomics/methods; Saccharomyces cerevisiae/metabolism; Algorithms; Databases, Protein; Fungal Proteins/chemistry; Fungal Proteins/genetics; Macromolecular Substances; Protein Binding/genetics; Quality Control; Reproducibility of Results; Saccharomyces cerevisiae/chemistry; Sensitivity and Specificity
19.
Phys Rev E Stat Nonlin Soft Matter Phys ; 73(3 Pt 1): 031921, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16605572

ABSTRACT

The functionality of proteins is governed by their structure in the native state. Protein structures are made up of emergent building blocks of helices and almost planar sheets. A simple coarse-grained geometrical model of a flexible tube barely subject to compaction provides a unified framework for understanding the common character of globular proteins. We argue that a recent critique of the tube idea is not well founded.


Subjects
Models, Chemical; Models, Molecular; Nanotubes/chemistry; Nanotubes/ultrastructure; Proteins/chemistry; Proteins/ultrastructure; Computer Simulation; Hydrogen Bonding; Protein Conformation; Stereoisomerism
20.
Sci Rep ; 5: 9452, 2015 May 19.
Article in English | MEDLINE | ID: mdl-25989177

ABSTRACT

Online traces of human activity offer novel opportunities to study the dynamics of complex knowledge exchange networks, in particular how emergent patterns of collective attention determine what new information is generated and consumed. Can we measure the relationship between demand and supply for new information about a topic? We propose a normalization method to compare attention burst statistics across topics with heterogeneous distributions of attention. Through analysis of a massive dataset on traffic to Wikipedia, we find that the production of new knowledge is associated with significant shifts of collective attention, which we take as a proxy for its demand. This is consistent with a scenario in which the allocation of attention toward a topic stimulates the demand for information about it, and in turn the supply of further novel information. However, attention spikes only for a limited time span, during which new content has higher chances of receiving traffic than content created later or earlier on. Our attempt to quantify the demand and supply of information, and our finding about their temporal ordering, may lead to the development of fundamental laws of the attention economy, and to a better understanding of the social exchange of knowledge in information networks.
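A small sketch of the normalization idea, under assumptions: each topic's page-view series is rescaled by its own mean and standard deviation so that attention bursts are comparable across topics with very different baseline traffic, and new-content events are then checked against burst days. The two-standard-deviation threshold and the toy data are illustrative.

```python
# Sketch: z-score a topic's page-view series against its own baseline, flag burst
# days, and check whether new-content events fall on or just after bursts. Toy data.
import statistics

def normalized_series(views):
    """Z-score a daily page-view series against its own mean and standard deviation."""
    mu = statistics.mean(views)
    sigma = statistics.pstdev(views) or 1.0
    return [(v - mu) / sigma for v in views]

def burst_days(views, threshold=2.0):
    """Days on which normalized attention exceeds `threshold` standard deviations."""
    return {t for t, z in enumerate(normalized_series(views)) if z >= threshold}

views = [120, 130, 110, 125, 900, 840, 200, 150, 140, 130]   # an attention spike on days 4-5
edits = [5, 6]                                               # new content created on days 5-6
bursts = burst_days(views)
print(sorted(bursts), [t in bursts or (t - 1) in bursts for t in edits])
```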
