Results 1 - 4 of 4
1.
Psychophysiology; e14628, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38961523

ABSTRACT

This study tackles the Garden of Forking Paths as a challenge for the replicability and reproducibility of ERP studies. We applied a multiverse analysis to a sample ERP N400 dataset donated by an independent research team, analyzing it with 14 pipelines selected, through a systematic review, to showcase the full range of methodological variability found in the N400 literature. The selected pipelines were compared in depth with respect to statistical test outcomes, descriptive statistics, effect size, data quality, and statistical power. In this way, we provide a worked example of how analytic flexibility can impact results in high-dimensional research fields such as ERP when data are analyzed using standard null-hypothesis significance testing. Of the methodological decisions that were varied, high-pass filter cut-off, artifact removal method, baseline duration, reference, measurement latency and locations, and amplitude measure (peak vs. mean) were all shown to affect at least some of the study outcome measures; low-pass filtering was the only step that did not notably influence any of them. This study shows that even seemingly minor procedural deviations can influence the conclusions of an ERP study. We demonstrate the power of multiverse analysis both for identifying the most reliable effects in a given study and for providing insight into the consequences of methodological decisions.
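
For illustration only: a minimal sketch of the multiverse idea, in which every combination of analytic decisions is applied to the same data and the outcome of interest is recorded for each "universe". The decision levels (baseline start, measurement window, mean vs. peak amplitude) and the synthetic single-trial data below are assumptions made for this example; they are not the 14 pipelines or the dataset used in the study.

```python
# Sketch of a multiverse analysis: enumerate analytic decisions, run each
# combination on the same data, and record the resulting amplitude measure.
from itertools import product
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for single-trial ERP data: (trials, time points),
# epoch from -200 ms to 800 ms, with a negative deflection near 400 ms.
times = np.arange(-0.2, 0.8, 1 / 500)
data = rng.normal(0, 2, (40, times.size)) - 3 * np.exp(-((times - 0.4) ** 2) / 0.01)

decisions = {
    "baseline_start": [-0.2, -0.1],        # baseline duration (s), illustrative values
    "window": [(0.3, 0.5), (0.35, 0.45)],  # measurement latency window (s)
    "measure": ["mean", "peak"],           # amplitude measure
}

results = []
for baseline_start, (t0, t1), measure in product(*decisions.values()):
    # Baseline-correct each trial, then average trials to an ERP.
    bl = (times >= baseline_start) & (times < 0)
    erp = (data - data[:, bl].mean(axis=1, keepdims=True)).mean(axis=0)
    win = (times >= t0) & (times <= t1)
    amp = erp[win].mean() if measure == "mean" else erp[win].min()
    results.append({"baseline": baseline_start, "window": (t0, t1),
                    "measure": measure, "amplitude_uV": round(float(amp), 2)})

for r in results:  # one outcome per "universe" of analytic choices
    print(r)
```

A full multiverse over real EEG data would also vary filtering, artifact removal, referencing, and electrode selection, and would compare statistical outcomes across universes.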

2.
Neuroimage; 257: 119056, 2022 Aug 15.
Article in English | MEDLINE | ID: mdl-35283287

ABSTRACT

Good scientific practice (GSP) refers to both explicit and implicit rules, recommendations, and guidelines that help scientists produce work of the highest quality at any given time and efficiently share that work with the community for further scrutiny or utilization. For experimental research using magneto- and electroencephalography (MEEG), GSP includes specific standards and guidelines for technical competence, which are periodically updated and adapted to new findings. However, GSP also needs to be regularly revisited in a broader light. At the LiveMEEG 2020 conference, a reflection on GSP was fostered that included explicitly documented guidelines and technical advances, but also emphasized intangible GSP: a general awareness of personal, organizational, and societal realities and how they can influence MEEG research. This article provides an extensive report on most of the LiveMEEG contributions and new literature, with the additional aim of synthesizing ongoing cultural changes in GSP. It first covers GSP with respect to cognitive biases and logical fallacies, pre-registration as a tool to avoid those and other early pitfalls, and a number of resources that enable collaborative and reproducible research as a general approach to minimizing misconceptions. Second, it covers GSP with respect to data acquisition, analysis, reporting, and sharing, including new tools and frameworks to support collaborative work. Finally, GSP is considered in light of the ethical implications of MEEG research and the resulting responsibility of scientists to engage with societal challenges. Considering, among other things, the benefits of peer review and open access at all stages, the need to coordinate larger international projects, the complexity of MEEG subject matter, and today's prioritization of fairness, privacy, and the environment, we find that current GSP tends to favor collective and cooperative work, for both scientific and societal reasons.


Subjects
Electroencephalography, Humans
3.
Neuropsychol Rev; 32(3): 577-600, 2022 Sep.
Article in English | MEDLINE | ID: mdl-34374003

ABSTRACT

Given the complexity of the ERP recording and processing pipeline, the resulting variability of methodological options, and the potential for these decisions to influence study outcomes, it is important to understand how ERP studies are conducted in practice and to what extent researchers are transparent about their data collection and analysis procedures. This review gives an overview of methodology reporting in a sample of 132 ERP papers published between January 1980 and June 2018 in journals indexed in two large databases, Web of Science and PubMed. Because ERP methodology partly depends on the study design, we focused on a well-established component (the N400) in the most commonly assessed population (healthy neurotypical adults) and one of its most common modalities (visual images). The review provides insights into 73 properties of study design, data pre-processing, measurement, statistics, visualization of results, and references to supplemental information across studies within the same subfield. For each of the examined methodological decisions, we assessed the degree of consistency, the clarity of reporting, and deviations from best-practice guidelines. Overall, the results show that each study had a unique approach to ERP data recording, processing, and analysis, and that at least some details were missing from every paper. We highlight the most common reporting omissions and deviations from established recommendations, as well as the areas with the least consistency. Additionally, we provide guidance for a priori selection of the N400 measurement window and electrode locations based on the results of previous studies.
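
For illustration only: a minimal sketch of how reporting completeness and consistency across a sample of papers might be tabulated. The property names and coded values below are hypothetical and are not the review's actual coding scheme or data.

```python
# Sketch: tabulate, per methodological property, how many papers report it
# and how concentrated the reported choices are.
import pandas as pd

# One row per paper, one column per coded property; None marks an unreported detail.
coded = pd.DataFrame({
    "high_pass_Hz":   [0.1, None, 0.01, 0.1],
    "reference":      ["average", "linked mastoids", None, "average"],
    "n400_window_ms": ["300-500", "350-450", "300-500", None],
})

reported = coded.notna().mean().mul(100).round(1)  # % of papers reporting each property
# Share of the most common choice among papers that did report it.
modal_share = coded.apply(lambda col: col.value_counts(normalize=True).max()).round(2)

print(pd.DataFrame({"% reported": reported, "modal choice share": modal_share}))
```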


Subjects
Electroencephalography, Evoked Potentials, Adult, Female, Humans, Male, Reproducibility of Results, Research Design
4.
Neuroimage; 245: 118721, 2021 Dec 15.
Article in English | MEDLINE | ID: mdl-34826594

ABSTRACT

As the number of EEG papers increases, so too does the number of guidelines for how to report what has been done. However, current guidelines and checklists appear to have limited adoption, as systematic reviews have shown that the journal article format is highly prone to errors, ambiguities, and omissions of methodological detail. This is a problem for the transparency of the scientific record, as well as for reproducibility and metascience. Following lessons learned in the high-complexity fields of aviation and surgery, we conclude that new tools are needed to overcome the limitations of written methodology descriptions, and that these tools should be developed through community consultation to ensure they have the most utility for EEG stakeholders. As a first step in tool development, we present the ARTEM-IS Statement, describing the actions needed to create an Agreed Reporting Template for Electroencephalography Methodology - International Standard (ARTEM-IS), along with ARTEM-IS Design Guidelines for developing tools that use an evidence-based approach to error reduction. We first launched the statement at the LiveMEEG conference in 2020, along with a draft ARTEM-IS template for public consultation. Members of the EEG community are invited to join this collective effort to create evidence-based tools that make the process of reporting methodology intuitive to complete and foolproof by design.
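
For illustration only: a minimal sketch of the general idea behind a structured reporting template, where required fields are declared up front so that omissions can be flagged automatically rather than discovered later in a systematic review. The field names and values below are hypothetical and do not reflect the actual ARTEM-IS template content.

```python
# Sketch: a structured methodology report whose unreported fields can be
# detected programmatically. Field names are illustrative only.
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class EEGMethodsReport:
    amplifier: Optional[str] = None
    n_channels: Optional[int] = None
    sampling_rate_hz: Optional[float] = None
    online_reference: Optional[str] = None
    high_pass_hz: Optional[float] = None
    low_pass_hz: Optional[float] = None
    artifact_removal: Optional[str] = None

    def missing_fields(self):
        """Return the names of fields left unreported."""
        return [f.name for f in fields(self) if getattr(self, f.name) is None]

report = EEGMethodsReport(amplifier="BioSemi ActiveTwo", n_channels=64,
                          sampling_rate_hz=512, high_pass_hz=0.1)
print("Unreported:", report.missing_fields())
# -> ['online_reference', 'low_pass_hz', 'artifact_removal']
```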


Subjects
Electroencephalography, Guidelines as Topic, Research Report/standards, Humans, Periodicals as Topic, Reproducibility of Results