Results 1 - 4 of 4
1.
Nature; 618(7964): 342-348, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37225979

ABSTRACT

If popular online platforms systematically expose their users to partisan and unreliable news, they could potentially contribute to societal issues such as rising political polarization1,2. This concern is central to the 'echo chamber'3-5 and 'filter bubble'6,7 debates, which critique the roles that user choice and algorithmic curation play in guiding users to different online information sources8-10. These roles can be measured as exposure, defined as the URLs shown to users by online platforms, and engagement, defined as the URLs selected by users. However, owing to the challenges of obtaining ecologically valid exposure data (what real users were shown during their typical platform use), research in this vein typically relies on engagement data4,8,11-16 or estimates of hypothetical exposure17-23. Studies involving ecological exposure have therefore been rare, and largely limited to social media platforms7,24, leaving open questions about web search engines. To address these gaps, we conducted a two-wave study pairing surveys with ecologically valid measures of both exposure and engagement on Google Search during the 2018 and 2020 US elections. In both waves, we found more identity-congruent and unreliable news sources in participants' engagement choices, both within Google Search and overall, than they were exposed to in their Google Search results. These results indicate that exposure to and engagement with partisan or unreliable news on Google Search are driven not primarily by algorithmic curation but by users' own choices.


Subjects
Choice Behavior, Information Sources, Politics, Prejudice, Search Engine, Humans, Information Sources/statistics & numerical data, Information Sources/supply & distribution, Prejudice/psychology, Reproducibility of Results, Search Engine/methods, Search Engine/standards, Surveys and Questionnaires, United States, Algorithms
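The exposure/engagement distinction above reduces to comparing the share of unreliable sources in two URL lists per user. The following is a minimal illustrative sketch; the domain list and the log entries are invented, not the study's data.

```python
# Hypothetical sketch: share of unreliable sources in exposure (URLs shown)
# vs. engagement (URLs selected), per the definitions in the abstract.
from urllib.parse import urlparse

UNRELIABLE_DOMAINS = {"fakenews.example", "partisan.example"}  # assumed list

def unreliable_share(urls):
    """Fraction of URLs whose domain appears on the unreliable list."""
    if not urls:
        return 0.0
    hits = sum(1 for u in urls if urlparse(u).netloc in UNRELIABLE_DOMAINS)
    return hits / len(urls)

exposure = [  # URLs a platform showed the user (illustrative)
    "https://news.example/a",
    "https://fakenews.example/b",
    "https://news.example/c",
    "https://wire.example/d",
]
engagement = [  # URLs the user actually clicked (illustrative)
    "https://fakenews.example/b",
    "https://partisan.example/e",
]

print(unreliable_share(exposure))    # 0.25
print(unreliable_share(engagement))  # 1.0
```

A gap like the one printed here (engagement share exceeding exposure share) is the shape of evidence the study uses to attribute unreliable-news consumption to user choice rather than to ranking.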
2.
Perspect Psychol Sci; 17456916231186779, 2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38010888

ABSTRACT

It is critical to understand how algorithms structure the information people see and how those algorithms support or undermine society's core values. We offer a normative framework for the assessment of the information curation algorithms that determine much of what people see on the internet. The framework presents two levels of assessment: one for individual-level effects and another for systemic effects. With regard to individual-level effects, we discuss whether (a) the information is aligned with the user's interests, (b) the information is accurate, and (c) the information is so appealing that it is difficult for a person's self-regulatory resources to ignore ("agency hacking"). At the systemic level, we discuss whether (a) there are adverse civic-level effects on a system-level variable, such as political polarization; (b) there are negative distributional or discriminatory effects; and (c) there are anticompetitive effects, with the information providing an advantage to the platform. The objective of this framework is both to inform the direction of future scholarship and to offer tools for intervention to policymakers.

3.
Sci Adv; 9(35): eadd8080, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37647396

ABSTRACT

Do online platforms facilitate the consumption of potentially harmful content? Using paired behavioral and survey data provided by participants recruited from a representative sample in 2020 (n = 1181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment. These viewers often subscribe to these channels (prompting recommendations to their videos) and follow external links to them. In contrast, nonsubscribers rarely see or follow recommendations to videos from these channels. Our findings suggest that YouTube's algorithms were not sending people down "rabbit holes" during our observation window in 2020, possibly due to changes that the company made to its recommender system in 2019. However, the platform continues to play a key role in facilitating exposure to content from alternative and extremist channels among dedicated audiences.


Subjects
Social Media, Algorithms
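The finding that exposure is "heavily concentrated among a small group" can be made concrete with a simple concentration statistic: the share of all views accounted for by the top fraction of users. This is an illustrative sketch with invented view counts, not the study's measure or data.

```python
# Hypothetical sketch: concentration of channel views across users.

def top_share(view_counts, top_fraction=0.1):
    """Share of total views accounted for by the top `top_fraction` of users."""
    counts = sorted(view_counts, reverse=True)
    k = max(1, int(len(counts) * top_fraction))  # at least one user
    total = sum(counts)
    return sum(counts[:k]) / total if total else 0.0

# 10 users: one heavy viewer, a few light viewers, most with zero views
views = [120, 8, 4, 2, 1, 0, 0, 0, 0, 0]
print(top_share(views, 0.1))  # single heaviest viewer: 120/135
```

When a statistic like this is near 1, almost all exposure comes from a small subgroup, which is the pattern the abstract attributes to subscribers and external links rather than to algorithmic recommendations.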