Results 1 - 20 of 266
1.
Proc Natl Acad Sci U S A ; 119(48): e2216035119, 2022 11 29.
Article in English | MEDLINE | ID: mdl-36417442

ABSTRACT

Since their emergence a few years ago, artificial intelligence (AI)-synthesized media-so-called deep fakes-have dramatically increased in quality, sophistication, and ease of generation. Deep fakes have been weaponized for use in nonconsensual pornography, large-scale fraud, and disinformation campaigns. Of particular concern is how deep fakes will be weaponized against world leaders during election cycles or times of armed conflict. We describe an identity-based approach for protecting world leaders from deep-fake imposters. Trained on several hours of authentic video, this approach captures distinct facial, gestural, and vocal mannerisms that we show can distinguish a world leader from an impersonator or deep-fake imposter.


Subject(s)
Artificial Intelligence, Deception, Gestures
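As a concrete illustration of the identity-based approach described in entry 1, the following snippet trains a one-class classifier on behavioral feature vectors extracted from authentic footage and flags clips that deviate from them. It is a minimal sketch on simulated features, not the authors' pipeline; the feature dimensions, model choice (one-class SVM), and thresholds are all assumptions.

```python
# Hypothetical sketch: identity-based detection from behavioral features.
# Assumes each video clip has already been reduced to a fixed-length
# feature vector of facial/gestural/vocal mannerisms (not the paper's code).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
authentic = rng.normal(0.0, 1.0, size=(500, 32))   # stand-in for real-leader clips
imposter  = rng.normal(1.5, 1.0, size=(50, 32))    # stand-in for deep-fake clips

# Train only on authentic behavior; anything far from it is flagged.
model = make_pipeline(StandardScaler(), OneClassSVM(nu=0.05, gamma="scale"))
model.fit(authentic)

pred = model.predict(np.vstack([authentic[:50], imposter]))  # +1 = authentic-like, -1 = flagged
print("flagged among authentic:", np.sum(pred[:50] == -1))
print("flagged among imposter :", np.sum(pred[50:] == -1))
```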
2.
Article in English | MEDLINE | ID: mdl-38951386

ABSTRACT

OBJECTIVE: To understand whether cancer fatalism among adult social media users in the United States is linked to social media informational awareness and whether the relationship varies by education level. METHODS: Cross-sectional data from the 2022 Health Information National Trends Survey (n = 3,948) were analyzed using multivariable linear probability models. The study population was defined as social media users active within the past year. The outcome variable was cancer fatalism, and the predictor variables were social media informational awareness and education level. RESULTS: Participants with low social media informational awareness were 9 (95% CI = 3, 15), 6 (95% CI = 1, 11), and 21 (95% CI = 14, 27) percentage points more likely to agree that it seems like everything causes cancer, that you cannot lower your chances of getting cancer, and that there are too many cancer prevention recommendations to follow, respectively. Participants with a college degree or higher level of education who reported high social media informational awareness were the least likely to agree that everything causes cancer (60%; 95% CI = 54, 66), that you cannot lower your chances of getting cancer (14%; 95% CI = 10, 19), and that there are too many cancer prevention recommendations to follow (52%; 95% CI = 46, 59). CONCLUSION: Social media informational awareness was associated with lower levels of cancer fatalism among adult social media users. College graduates with high social media informational awareness were the least likely to report cancer fatalism.
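A minimal sketch of the multivariable linear probability model named in entry 2, fit with statsmodels on simulated data; the variable names and effect sizes are illustrative assumptions, not the HINTS 2022 variables or results.

```python
# Illustrative linear probability model (OLS on a binary outcome) with
# heteroskedasticity-robust errors; simulated data, not HINTS 2022.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "low_sm_awareness": rng.integers(0, 2, n),   # 1 = low informational awareness
    "college_degree": rng.integers(0, 2, n),     # 1 = college degree or higher
})
p = 0.55 + 0.09 * df.low_sm_awareness - 0.05 * df.college_degree
df["everything_causes_cancer"] = rng.binomial(1, p.clip(0, 1))

# Linear probability model: coefficients read directly as percentage-point changes.
fit = smf.ols("everything_causes_cancer ~ low_sm_awareness * college_degree",
              data=df).fit(cov_type="HC3")
print(fit.summary().tables[1])
```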

3.
Br J Psychiatry ; 224(2): 33-35, 2024 02.
Article in English | MEDLINE | ID: mdl-37881016

ABSTRACT

With the recent advances in artificial intelligence (AI), patients are increasingly exposed to misleading medical information. Generative AI models, including large language models such as ChatGPT, create and modify text, images, audio and video information based on training data. Commercial use of generative AI is expanding rapidly and the public will routinely receive messages created by generative AI. However, generative AI models may be unreliable, routinely make errors and widely spread misinformation. Misinformation created by generative AI about mental illness may include factual errors, nonsense, fabricated sources and dangerous advice. Psychiatrists need to recognise that patients may receive misinformation online, including about medicine and psychiatry.


Subject(s)
Mental Disorders, Psychiatry, Humans, Artificial Intelligence, Psychiatrists, Communication
4.
BMC Infect Dis ; 24(1): 848, 2024 Aug 21.
Article in English | MEDLINE | ID: mdl-39169315

ABSTRACT

BACKGROUND: Coronavirus disease 2019 (COVID-19) vaccines were rolled out in many countries; however, suboptimal COVID-19 vaccine uptake remains a major public health concern globally. This study aimed to assess the factors that affected uptake of, hesitancy about, and resistance to the COVID-19 vaccine among university undergraduate students in Malawi, a least developed country in Africa. METHODS: A descriptive cross-sectional study was conducted using an online semi-structured questionnaire. A total of 343 university undergraduate students in Blantyre participated in this study after ethical clearance was obtained. Data were exported from SurveyMonkey to Microsoft Excel version 21 for cleaning and were analysed using SPSS version 29. Descriptive statistics, including percentages, were calculated to define the sample characteristics. Pearson chi-square and Fisher's exact tests were performed to identify significant relationships between vaccine uptake and demographics. A 95% confidence interval was set, and a p-value of < 0.05 was considered statistically significant. RESULTS: Of the 343 participants, 43% were vaccinated. Among the vaccinated, the majority (47.3%, n = 69/146) received the Johnson & Johnson vaccine, followed by AstraZeneca (46.6%, n = 68/146). The most commonly reported reason for vaccine acceptance was 'to protect me against getting COVID-19' (49%), whereas vaccine hesitancy was attributed to lack of knowledge (34%) and concerns about vaccine safety (25%). CONCLUSIONS: This study found that adequate knowledge about the benefits and safety of the COVID-19 vaccine could potentially increase uptake. Lack of credible information, or misinformation, contributed to vaccine hesitancy. The findings provide insights for the design of strategies to increase future vaccine uptake and reduce determinants of vaccine hesitancy. To reduce vaccination hesitancy in any population, with or without higher education, we recommend that institutions entrusted with vaccine management optimise health messaging and reduce misinformation and disinformation.


Subject(s)
COVID-19 Vaccines, COVID-19, Students, Vaccination, Humans, Cross-Sectional Studies, COVID-19 Vaccines/administration & dosage, Male, Malawi, Students/psychology, Students/statistics & numerical data, Female, Universities, COVID-19/prevention & control, Young Adult, Surveys and Questionnaires, Adult, Vaccination/statistics & numerical data, Vaccination/psychology, Vaccination Hesitancy/statistics & numerical data, Vaccination Hesitancy/psychology, SARS-CoV-2, Adolescent, Health Knowledge, Attitudes, Practice, Educational Status
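The chi-square and Fisher's exact tests described in entry 4 can be run on a 2x2 table with scipy; the counts below are hypothetical, not the study's data.

```python
# Illustrative chi-square and Fisher's exact tests of vaccine uptake vs. a
# binary demographic factor; the 2x2 counts below are made up, not study data.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

#                vaccinated  not vaccinated
table = np.array([[80,  95],    # e.g., male
                  [66, 102]])   # e.g., female

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_chi2:.3f}")
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
# Fisher's exact is preferred when any expected cell count is small (< 5).
```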
5.
J Exp Child Psychol ; 244: 105952, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38718681

ABSTRACT

The strategic use of deliberate omissions, conveying true but selective information for deceptive purposes, is a prevalent and pernicious disinformation tactic. Crucially, its recognition requires engaging in a sophisticated, multi-part social cognitive reasoning process. In two preregistered studies, we investigated the development of children's ability to engage in this process and successfully recognize this form of deception, finding that children even as young as 5 years are capable of doing so, but only with sufficient scaffolding. This work highlights the key role that social cognition plays in the ability to recognize the manipulation techniques that underpin disinformation. It suggests that the interrelated development of pragmatic competence and epistemic vigilance can be harnessed in the design of tools and strategies to help bolster psychological resistance against disinformation in even our youngest citizens-children at the outset of formal education.


Subject(s)
Child Development, Deception, Humans, Male, Female, Child, Preschool, Child, Social Cognition, Recognition, Psychology
6.
Am J Ind Med ; 67(1): 55-72, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37963719

ABSTRACT

BACKGROUND: Despite some emerging lessons learned from the COVID-19 pandemic, evidence suggests the world remains largely underprepared for-and vulnerable to-similar threats in the future. METHODS: In 2022, researchers at the US National Institute for Occupational Safety and Health (NIOSH) led a team of volunteers to explore how future disruptions, such as pandemics, might impact work and the practice of occupational safety and health (OSH). This qualitative inquiry was framed as a strategic foresight project and included a series of activities designed to help better understand, prepare for, and influence the future. RESULTS: Findings from a thorough search for indicators of change were synthesized into nine critical uncertainties and four plausible future scenarios. Analysis of these outputs elucidated three key challenges that may impact OSH research, policy, and practice during future disruptions: (1) data access, (2) direct-to-worker communications, and (3) mis- and dis-information management. CONCLUSIONS: A robust strategic response is offered to address these challenges, and next steps are proposed to enhance OSH preparedness and institutionalize strategic foresight across the OSH community.


Subject(s)
COVID-19, Occupational Health, United States, Humans, Health Workforce, Pandemics/prevention & control, COVID-19/epidemiology, COVID-19/prevention & control, Workforce
7.
Proc Natl Acad Sci U S A ; 118(15)2021 04 13.
Article in English | MEDLINE | ID: mdl-33837143

ABSTRACT

A summary of the public opinion research on misinformation in the realm of science and health reveals inconsistencies in how the term has been defined and operationalized. A diverse set of methodologies has been employed to study the phenomenon, with virtually all such work identifying misinformation as a cause for concern. While studies in which the impact of misinformation on public opinion is completely eliminated are rare, choices around the packaging and delivery of corrective information have shown promise for lessening misinformation effects. Despite a growing number of studies on the topic, there remain many gaps in the literature and opportunities for future studies.


Subject(s)
Deception, Health Communication/trends, Public Opinion, Health Communication/ethics, Health Communication/standards, Humans, Information Literacy
8.
Proc Natl Acad Sci U S A ; 118(15)2021 04 13.
Article in English | MEDLINE | ID: mdl-33837146

ABSTRACT

Humans learn about the world by collectively acquiring information, filtering it, and sharing what we know. Misinformation undermines this process. The repercussions are extensive. Without reliable and accurate sources of information, we cannot hope to halt climate change, make reasoned democratic decisions, or control a global pandemic. Most analyses of misinformation focus on popular and social media, but the scientific enterprise faces a parallel set of problems-from hype and hyperbole to publication bias and citation misdirection, predatory publishing, and filter bubbles. In this perspective, we highlight these parallels and discuss future research directions and interventions.


Subject(s)
Biomedical Research/ethics, Health Communication/ethics, Periodicals as Topic/trends, Health Communication/trends, Humans, Mass Media/ethics, Mass Media/trends, Periodicals as Topic/ethics
9.
BMC Palliat Care ; 23(1): 99, 2024 Apr 13.
Article in English | MEDLINE | ID: mdl-38609945

ABSTRACT

It seems probable that some form of medically-assisted dying will become legal in England and Wales in the foreseeable future. Assisted dying Bills are at various stages of preparation in surrounding jurisdictions (Scotland, Republic of Ireland, Isle of Man, Jersey), and activists campaign unceasingly for a change in the law in England and Wales. Media coverage is generally uncritical and supportive, and individual autonomy is seen as the unassailable trump card: 'my life, my death'. However, devising a law which is 'fit for purpose' is not an easy matter. The challenge is to achieve an appropriate balance between compassion and patient autonomy on the one hand, and respect for human life generally and medical autonomy on the other. More people should benefit from a change in the law than be harmed; in relation to medically-assisted dying, this may not be possible. Protecting the vulnerable is a key issue, as is avoiding a negative impact on societal attitudes towards the disabled and frail elderly, particularly those with dementia. This paper compares three existing models of physician-assisted suicide: Switzerland, Oregon (USA), and Victoria (Australia). Vulnerability and autonomy are discussed, and concern is expressed about the biased nature of much of the advocacy for assisted dying, tantamount to disinformation. A 'hidden' danger of assisted dying is noted, namely increased suffering as more patients decline referral to palliative-hospice care because they fear they will be 'drugged to death'. Finally, suggestions are made for a possible 'least worst' way forward. One solution would be for physician-assisted suicide to be the responsibility of a stand-alone Department for Assisted Dying, overseen by lawyers or judges and operated by technicians; doctors would be required only to confirm a patient's medical eligibility. Palliative-hospice care should definitely not be involved, and healthcare professionals must have an inviolable right to opt out of involvement. There is also an urgent need to improve the provision of care for all terminally ill patients.


Subject(s)
Suicide, Assisted, Aged, Humans, England, Fear, Frail Elderly, Victoria
10.
J Med Internet Res ; 26: e48130, 2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38551638

ABSTRACT

BACKGROUND: Although researchers have extensively studied the rapid generation and spread of misinformation about the novel coronavirus during the pandemic, numerous other health-related topics that are contaminating the internet with misinformation have not received as much attention. OBJECTIVE: This study aims to gauge the reach of the most popular medical content on the World Wide Web, extending beyond the confines of the pandemic. We conducted evaluations of subject matter and credibility for the years 2021 and 2022, following the principles of evidence-based medicine, with assessments performed by experienced clinicians. METHODS: We used 274 keywords to conduct web page searches through the BuzzSumo Enterprise Application. These keywords were chosen based on medical topics derived from surveys administered to medical practitioners. The search parameters were confined to 2 distinct date ranges: (1) January 1, 2021, to December 31, 2021; and (2) January 1, 2022, to December 31, 2022. Our searches were specifically limited to web pages in the Polish language and filtered by the specified date ranges. The analysis encompassed 161 web pages retrieved in 2021 and 105 retrieved in 2022. Each web page was scrutinized by an experienced doctor to assess its credibility against evidence-based medicine standards. Furthermore, we gathered data on social media engagements associated with the web pages, considering platforms such as Facebook, Pinterest, Reddit, and Twitter. RESULTS: In 2022, the prevalence of unreliable information related to COVID-19 saw a noteworthy decline compared with 2021. Specifically, the percentage of noncredible web pages discussing COVID-19 and general vaccinations decreased from 57% (43/76) to 24% (6/25) and from 42% (10/25) to 30% (3/10), respectively. However, during the same period, there was a considerable uptick in the dissemination of untrustworthy content on social media pertaining to other medical topics. The percentage of noncredible web pages covering cholesterol, statins, and cardiology rose from 11% (3/28) to 26% (9/35) and from 18% (5/28) to 26% (6/23), respectively. CONCLUSIONS: Efforts undertaken during the COVID-19 pandemic to curb the dissemination of misinformation seem to have yielded positive results. Nevertheless, our analysis suggests that these interventions need to be consistently implemented across both established and emerging medical subjects. It appears that as interest in the pandemic waned, other topics gained prominence, essentially "filling the vacuum" and necessitating ongoing measures to address misinformation across a broader spectrum of health-related subjects.


Subject(s)
COVID-19, Social Media, Humans, COVID-19/epidemiology, COVID-19/prevention & control, Pandemics, Poland/epidemiology, Infodemiology, Communication, Language
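A small pandas sketch of the kind of year-over-year credibility tally reported in entry 10, assuming a simple per-page annotation table (topic, year, credible flag); the rows and column names are invented for illustration.

```python
# Sketch of a year-over-year credibility comparison, assuming a simple
# annotation table with one row per reviewed web page; rows are fabricated.
import pandas as pd

pages = pd.DataFrame({
    "year":     [2021, 2021, 2021, 2022, 2022, 2022],
    "topic":    ["covid-19", "covid-19", "statins", "covid-19", "statins", "statins"],
    "credible": [False, True, True, True, False, True],
})

summary = (pages.assign(noncredible=~pages.credible)
                .groupby(["topic", "year"])["noncredible"]
                .agg(share="mean", n="size")
                .reset_index())
summary["share_pct"] = (summary["share"] * 100).round(1)  # % noncredible per topic-year
print(summary[["topic", "year", "n", "share_pct"]])
```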
11.
Article in German | MEDLINE | ID: mdl-38332143

ABSTRACT

Misinformation and disinformation in social media have become a challenge for effective public health measures. Here, we examine factors that influence believing and sharing false information, both misinformation and disinformation, at the individual, social, and contextual levels, and discuss intervention possibilities. At the individual level, knowledge deficits, lack of skills, and emotional motivation have been associated with believing in false information. Lower health literacy, a conspiracy mindset, and certain beliefs increase susceptibility to false information. At the social level, the credibility of information sources and social norms influence the sharing of false information. At the contextual level, emotions and the repetition of messages affect belief in and sharing of false information. Interventions at the individual level involve measures to improve knowledge and skills. At the social level, addressing social processes and social norms can reduce the sharing of false information. At the contextual level, regulatory approaches involving social networks are considered an important point of intervention. Social inequalities play an important role in exposure to and processing of misinformation. It remains unclear to what degree the susceptibility to believe in and share misinformation is an individual characteristic and/or context dependent. Complex interventions that take multiple influencing factors into account are required.


Subject(s)
Health Communication, Social Media, Humans, Disinformation, Digital Health, Germany, Communication
12.
Polit Res Q ; 77(3): 1010-1025, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39130727

ABSTRACT

Disinformation has become a global issue, and while it is seen as a growing threat to democracy today, autocrats have long used it as part of their propaganda repertoire. Yet no study has tested the effect of disinformation on regime stability and breakdown beyond country-specific studies. Drawing on novel measures from the Digital Society Project (DSP) estimating the levels of disinformation disseminated by governments across 148 countries between 2000 and 2022, and on the Episodes of Regime Transformation (ERT) dataset, we provide the first global comparative study of disinformation and the survival of democratic and authoritarian regimes, respectively. The results show that in authoritarian regimes, disinformation helps rulers stay in power: regimes with higher levels of disinformation are less likely to experience democratization episodes. In democracies, on the other hand, disinformation increases the probability of autocratization onsets. As such, this study is the first to provide comparative evidence on the negative effects of disinformation on democracy as well as on the prospects of democratization.
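Entry 12's design could be approximated, under simplifying assumptions, by a pooled country-year logit of regime-change onset on a lagged disinformation index; the sketch below uses simulated data and invented variable names, and the paper's actual estimator may differ.

```python
# Rough illustration of a country-year onset model: a pooled logit of
# autocratization onset on a lagged disinformation index.
# Variable names and data are invented; this is not the paper's model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000  # country-years
df = pd.DataFrame({
    "disinfo_lag": rng.normal(0, 1, n),   # lagged DSP-style disinformation index
    "gdp_log": rng.normal(8, 1, n),       # illustrative control variable
})
logit_p = -3.0 + 0.5 * df.disinfo_lag - 0.1 * (df.gdp_log - 8)
df["autocratization_onset"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("autocratization_onset ~ disinfo_lag + gdp_log", data=df).fit(disp=0)
print(fit.summary().tables[1])
```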

13.
Int J ; 79(2): 297-311, 2024 Jun.
Article in English | MEDLINE | ID: mdl-39219556

ABSTRACT

This article details the Russian government's efforts to influence Canadians' perceptions of the war in Ukraine. Specifically, we examined Russian information campaigns tailored to Canadian audiences on X (formerly known as Twitter) and the supportive ecosystems of accounts that amplify those campaigns. By 2023, this ecosystem included at least 200,000 X accounts that have shared content with millions of Canadians. We identified ninety accounts with an outsized influence. The vast majority of the influential Canadian accounts were far right or far left in orientation. These networks were among Canada's most prolific and influential political communities online. We determined this by comparing these networks' potential influence to the online community engaging with Canada's 338 members of Parliament on X and a sample of twenty influential X accounts in Canada. The sophistication and proliferation of Canada-tailored narratives suggest a highly organized and well-funded effort to target Canadian support for Ukraine.

14.
Lupus ; 32(7): 887-892, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37171120

ABSTRACT

BACKGROUND: Lupus comprises a complex group of inflammatory disorders including cutaneous lupus erythematosus (CLE) and systemic lupus erythematosus (SLE). The issue of health misinformation is increasingly problematic, although the content of misinformation related to lupus available online has not been deeply explored. This study aimed to qualitatively assess the type of misinformation related to lupus available online. METHODS: A literature search on PubMed was conducted, using search terms "cutaneous lupus" OR "discoid lupus" OR "lupus" AND "misinformation" OR "conspiracy" OR "disinformation." Further searches were also performed on Google, YouTube, Twitter, Facebook, Instagram, and TikTok. RESULTS: Published literature describing lupus-related misinformation was minimal, with only three manuscripts identified. Conversely, a variety of points of misinformation were identified online and on social media. Key themes identified in online content included suggestion of incorrect causes such as infection or aspartame consumption, false risk assessments such as lupus never developing in males, false claims about conventional treatments, and promotion of alternative treatments or "cures" without evidence. CONCLUSION: Dermatologists, rheumatologists, and all clinicians treating patients with lupus play an essential role in dispelling the pervasive misinformation surrounding the disease and its treatments, encouraging patients to seek reliable sources of information, and advocating for evidence-based guidance.


Subject(s)
Lupus Erythematosus, Cutaneous, Lupus Erythematosus, Systemic, Humans, Male, Crying, Communication
15.
BMC Public Health ; 23(1): 1662, 2023 08 30.
Article in English | MEDLINE | ID: mdl-37644563

ABSTRACT

BACKGROUND: The proliferation of false and misleading health claims poses a major threat to public health. This ongoing "infodemic" has prompted numerous organizations to develop tools and approaches to manage the spread of falsehoods and communicate more effectively in an environment of mistrust and misleading information. However, these tools and approaches have not been systematically characterized, limiting their utility. This analysis provides a characterization of the current ecosystem of infodemic management strategies, allowing public health practitioners, communicators, researchers, and policy makers to gain an understanding of the tools at their disposal. METHODS: A multi-pronged search strategy was used to identify tools and approaches for combatting health-related misinformation and disinformation. The search strategy included a scoping review of academic literature; a review of gray literature from organizations involved in public health communications and misinformation/disinformation management; and a review of policies and infodemic management approaches from all U.S. state health departments and select local health departments. A team of annotators labelled the main feature(s) of each tool or approach using an iteratively developed list of tags. RESULTS: We identified over 350 infodemic management tools and approaches. We introduce the 4 i Framework for Advancing Communication and Trust (4 i FACT), a modified social-ecological model, to characterize different levels of infodemic intervention: informational, individual, interpersonal, and institutional. Information-level strategies included those designed to amplify factual information, fill information voids, debunk false information, track circulating information, and verify, detect, or rate the credibility of information. Individual-level strategies included those designed to enhance information literacy and prebunking/inoculation tools. Strategies at the interpersonal/community level included resources for public health communicators and community engagement approaches. Institutional and structural approaches included resources for journalists and fact checkers, tools for managing academic/scientific literature, resources for infodemic researchers/research, resources for infodemic managers, social media regulation, and policy/legislation. CONCLUSIONS: The 4 i FACT provides a useful way to characterize the current ecosystem of infodemic management strategies. Recognizing the complex and multifaceted nature of the ongoing infodemic, efforts should be taken to utilize and integrate strategies across all four levels of the modified social-ecological model.


Subject(s)
Health Communication, Trust, Humans, Ecosystem, Administrative Personnel, Health Facilities
16.
J Behav Med ; 46(1-2): 239-252, 2023 04.
Article in English | MEDLINE | ID: mdl-35896853

ABSTRACT

Although social media can be a source of guidance about HPV vaccination for parents, the information may not always be complete or accurate. We conducted a retrospective content analysis to identify the content and frequency of occurrence of disinformation and misinformation about the HPV vaccine posted on Twitter from December 15, 2019, through March 31, 2020, among 3876 unique, English-language #HPV tweets, excluding retweets. We found that 24% of tweets contained disinformation or misinformation, and the remaining 76% contained support/education. The most prevalent categories of disinformation/misinformation were (1) adverse health effects (59%), (2) mandatory vaccination (19%), and (3) inefficacy of the vaccine (14%). Among the adverse health effects tweets, non-specific harm/injury (51%) and death (23%) were most frequent. Disinformation/misinformation tweets had 5.44 (95% CI 5.33-5.56) times the incidence rate of retweets compared with supportive tweets. In conclusion, almost one-quarter of #HPV tweets contained disinformation or misinformation about the HPV vaccine, and these tweets received higher audience engagement, including likes and retweets. Implications for vaccine hesitancy are discussed.


Subject(s)
Papillomavirus Infections, Papillomavirus Vaccines, Social Media, Humans, Retrospective Studies, Communication
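The retweet incidence rate ratio reported in entry 16 (5.44, 95% CI 5.33-5.56) is the kind of quantity a Poisson regression of retweet counts on a misinformation indicator yields; the sketch below demonstrates the computation on simulated tweets, not the study's data or exact model.

```python
# Sketch of estimating a retweet incidence rate ratio (IRR) by tweet category,
# using a Poisson regression of retweet counts on a misinformation indicator.
# Data are simulated; this is not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 4000
df = pd.DataFrame({"misinfo": rng.integers(0, 2, n)})  # 1 = dis/misinformation tweet
df["retweets"] = rng.poisson(lam=np.where(df.misinfo == 1, 5.4, 1.0))

fit = smf.poisson("retweets ~ misinfo", data=df).fit(disp=0)
irr = np.exp(fit.params["misinfo"])                 # exponentiated coefficient = IRR
lo, hi = np.exp(fit.conf_int().loc["misinfo"])
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```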
17.
J Med Internet Res ; 25: e45069, 2023 08 08.
Article in English | MEDLINE | ID: mdl-37552535

ABSTRACT

BACKGROUND: Developing an understanding of the public discourse on COVID-19 vaccination on social media is important not only for addressing the ongoing COVID-19 pandemic but also for future pathogen outbreaks. There are various research efforts in this domain; however, a need still exists for a comprehensive topic-wise analysis of tweets in favor of and against COVID-19 vaccines. OBJECTIVE: This study characterizes the discussion points in favor of and against COVID-19 vaccines posted on Twitter during the first year of the pandemic. The primary aim was to contrast the views expressed by both camps, their respective activity patterns, and their correlation with vaccine-related events. A further aim was to gauge the genuineness of the concerns expressed in antivax tweets. METHODS: We examined a Twitter data set containing 75 million English tweets discussing COVID-19 vaccination from March 2020 to March 2021. We trained a stance detection algorithm using natural language processing techniques to classify tweets as antivax or provax and examined the main topics of discourse using topic modeling techniques. RESULTS: Provax tweets (37 million) far outnumbered antivax tweets (10 million) and focused mostly on vaccine development, whereas antivax tweets covered a wide range of topics, including opposition to vaccine mandates and concerns about safety. Although some antivax tweets included genuine concerns, there was a large amount of falsehood. Both stances discussed many of the same topics from opposite viewpoints. Memes and jokes were among the most retweeted messages. Most tweets from both stances (9,007,481/10,566,679, 85.24% of antivax and 24,463,708/37,044,507, 66.03% of provax tweets) came from dual-stance users who posted both provax and antivax tweets during the observation period. CONCLUSIONS: This study is a comprehensive account of COVID-19 vaccine discourse in the English language on Twitter from March 2020 to March 2021. The broad range of discussion points covered almost the entire conversation, and their temporal dynamics revealed a significant correlation with COVID-19 vaccine-related events. We did not find evidence of polarization or of antivax discourse prevailing on Twitter. However, targeted countering of falsehoods is important because only a small fraction of antivax discourse touched on a genuine issue. Future research should examine the role of memes and humor in driving web-based social media activity.


Subject(s)
COVID-19, Social Media, Vaccines, Humans, Communication, COVID-19/prevention & control, COVID-19/epidemiology, COVID-19 Vaccines, Pandemics
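A minimal sketch of the two modeling steps named in entry 17: a supervised stance classifier (provax vs. antivax) and LDA topic modeling, both with scikit-learn. The tiny labeled sample is invented, and the study's actual models, features, and labels will differ.

```python
# Minimal sketch: (1) supervised stance detection and (2) LDA topic modeling.
# The labeled examples are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import LatentDirichletAllocation

tweets = ["so grateful for the vaccine rollout", "vaccine trials look promising",
          "refuse the mandate, my body my choice", "these shots are not safe"]
labels = ["provax", "provax", "antivax", "antivax"]

# 1) Stance detection: TF-IDF features + logistic regression.
stance = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
stance.fit(tweets, labels)
print(stance.predict(["mandates are tyranny", "got my second dose today"]))

# 2) Topic modeling: LDA over bag-of-words counts, printing top terms per topic.
counts = CountVectorizer(stop_words="english")
X = counts.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = counts.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-3:][::-1]]
    print(f"topic {k}:", top)
```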
18.
J Med Internet Res ; 25: e45731, 2023 08 09.
Article in English | MEDLINE | ID: mdl-37556184

ABSTRACT

BACKGROUND: Misinformation poses a serious challenge to clinical and policy decision-making in the health field. The COVID-19 pandemic amplified interest in misinformation and related terms and witnessed a proliferation of definitions. OBJECTIVE: We aim to assess the definitions of misinformation and related terms used in the health-related literature. METHODS: We conducted a scoping review of systematic reviews by searching the Ovid MEDLINE, Embase, Cochrane, and Epistemonikos databases for articles published within the last 5 years, up to March 2023. Eligible studies were systematic reviews that stated misinformation or related terms as part of their objectives, conducted a systematic search of at least one database, and reported at least 1 definition for misinformation or related terms. We extracted definitions for the terms misinformation, disinformation, fake news, infodemic, and malinformation. Within each definition, we identified concepts and mapped them across misinformation-related terms. RESULTS: We included 41 eligible systematic reviews, of which 32 (78%) addressed the topic of public health emergencies (including the COVID-19 pandemic) and contained 75 definitions for misinformation and related terms. The definitions consisted of 20 for misinformation, 19 for disinformation, 10 for fake news, 24 for infodemic, and 2 for malinformation. "False/inaccurate/incorrect" was mentioned in 15 of 20 definitions of misinformation, 13 of 19 definitions of disinformation, 5 of 10 definitions of fake news, 6 of 24 definitions of infodemic, and 0 of 2 definitions of malinformation. For infodemic, 19 of 24 definitions addressed "information overload"; for malinformation, 2 of 2 definitions mentioned "accurate" and 1 definition mentioned "used in the wrong context." Of all the definitions, 56 (75%) were referenced from other sources. CONCLUSIONS: While the definitions of misinformation and related terms in the health field showed some inconsistencies and variability, they were largely consistent. Inconsistencies related to intentionality in misinformation definitions (7 definitions mention "unintentional," while 5 definitions mention "intentional"). They also related to the content of infodemic definitions (9 definitions mention "valid and invalid info," while 6 definitions mention "false/inaccurate/incorrect"). The inclusion of concepts such as "intentional" may be difficult to operationalize, as it is difficult to ascertain one's intentions. This scoping review has the strength of using a systematic method for retrieving articles, but it does not cover all definitions in the extant literature outside the field of health. This scoping review of the health literature identified several definitions for misinformation and related terms, which showed variability and included concepts that are difficult to operationalize. Health practitioners need to exert caution before labeling a piece of information as misinformation or any other related term and should do so only after ascertaining accurateness and sometimes intentionality. Additional efforts are needed to allow future consensus around clear and operational definitions.


Subject(s)
COVID-19, Humans, Pandemics, Systematic Reviews as Topic, Consensus, Communication
19.
J Med Internet Res ; 25: e49416, 2023 11 10.
Article in English | MEDLINE | ID: mdl-37948118

ABSTRACT

BACKGROUND: While there has been substantial analysis of social media content deemed to spread misinformation about electronic nicotine delivery systems use, the strategic use of misinformation accusations to undermine opposing views has received limited attention. OBJECTIVE: This study aims to fill this gap by analyzing how social media users discuss the topic of misinformation related to electronic nicotine delivery systems, notably vaping products. Additionally, this study identifies and analyzes the actors commonly blamed for spreading such misinformation and how these claims support both provaping and antivaping narratives. METHODS: Using Twitter's (subsequently rebranded as X) academic application programming interface, we collected tweets referencing #vape and #vaping and keywords associated with fake news and misinformation. This study uses systematic content analysis to analyze the tweets and identify common themes and the actors who discuss or possibly spread misinformation. RESULTS: This study found that provape users dominate the platform's discussions of misinformation about vaping, with provaping tweets being more frequent and receiving higher overall user engagement. The most common narrative in provape tweets centers on vaping being perceived as safe, whereas the most common topic in the antivape narrative is that vaping is indeed harmful. This study also points to a general distrust of authority figures: news outlets, public health authorities, and political actors are regularly accused of spreading misinformation by both sides, although the specific actors blamed differ depending on each side's position. The large volume of accusations from provaping advocates shapes what is considered misinformation and works to silence other narratives. Additionally, allegations against reliable and proven sources, such as public health authorities, work to discredit assessments of health impacts, which is detrimental to public health overall for both provaping and antivaping advocates. CONCLUSIONS: We conclude that the spread of misinformation and accusations of misinformation dissemination using terms such as "fact check," "misinformation," "fake news," and "disinformation" have become weaponized and co-opted by provaping actors to delegitimize criticisms of vaping and to increase confusion about the potential health risks. The study discusses the mixed impacts of vaping on public health for both smokers and nonsmokers. Additionally, we discuss the implications for effective health education and communication about vaping and how misinformation claims can affect evidence-based discourse on Twitter, as well as informed vaping decisions.


Subject(s)
Social Media, Vaping, Humans, Vaping/adverse effects, Communication, Public Health, Software
20.
Behav Sci Law ; 41(5): 231-245, 2023.
Article in English | MEDLINE | ID: mdl-36582021

ABSTRACT

Misinformation is widespread in political discourse, mental health literature, and hard science. This article describes recurrent publication of the same misinformation regarding parental alienation (PA), that is, variations of the statement: "PA theory assumes that the favored parent has caused PA in the child simply because the child refuses to have a relationship with the rejected parent, without identifying or proving alienating behaviors by the favored parent." Ninety-four examples of the same misinformation were identified and subjected to citation analysis using Gephi software, which displays the links between citing material and cited material. The recurrent misinformation reported here is not trivial; these statements are significant misrepresentations of PA theory. Plausible explanations for this trail of misinformation are the psychological mindset of the authors (i.e., confirmation bias) and the authors' writing skills (e.g., sloppy research practices such as persistent use of secondary sources for their information). The authors of this article recommend that publications containing significant misinformation should be corrected or retracted.
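The citation analysis in entry 20 was performed with Gephi; an analogous sketch with networkx (a substitution, not the authors' workflow) builds the directed citing-to-cited graph and ranks cited works by in-degree, with invented example edges.

```python
# Analogous sketch of a citation-trail analysis using networkx instead of Gephi:
# a directed graph of citing -> cited works, ranked by in-degree to surface the
# most-recycled sources of a repeated claim. Edges are invented examples.
import networkx as nx

edges = [  # (citing publication, cited publication) -- illustrative only
    ("Article A (2020)", "Secondary source X (2015)"),
    ("Article B (2021)", "Secondary source X (2015)"),
    ("Article C (2021)", "Article B (2021)"),
    ("Article D (2022)", "Secondary source X (2015)"),
]

G = nx.DiGraph()
G.add_edges_from(edges)

# The most frequently cited nodes are candidate origins of the repeated claim.
by_citations = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)
for work, n_citations in by_citations[:3]:
    print(f"{work}: cited {n_citations} times in the sample")

# The same structure can be handed to Gephi via a GEXF export:
nx.write_gexf(G, "citation_trail.gexf")
```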
