Results 1 - 3 of 3
1.
Behav Brain Sci; 47: e56, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38311446

ABSTRACT

Consensus meetings bring researchers together to discuss their theoretical viewpoints, prioritize the factors they agree are important to study, standardize their measures, and determine a smallest effect size of interest. We expect such meetings to prove a more efficient solution to the lack of coordination and integration of claims in science than integrative experiments.


Subject(s)
Consensus
2.
Cortex; 172: 14-37, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38154375

ABSTRACT

In the behavioral, cognitive, and social sciences, reaction time measures are an important source of information. However, analyses of reaction time data are affected by researchers' analytical choices and by the order in which those choices are applied. A systematic literature review, presented in this paper, revealed that the justification for these choices, and the order in which they are carried out, are rarely reported, making results difficult to reproduce and mixed findings difficult to interpret. To address this methodological shortcoming, we created a checklist for reporting reaction time pre-processing, making these decisions explicit, improving transparency, and thereby promoting best practices within the field. The importance of the pre-processing checklist is further supported by an expert consensus survey and a multiverse analysis. Consequently, we appeal for maximal transparency about all methods applied and offer a checklist to improve the replicability and reproducibility of studies that use reaction time measures.
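
The order-dependence described above is easy to demonstrate. The following is a minimal Python sketch, not drawn from the paper itself: the simulated data, the 200 ms anticipation floor, and the 2.5 SD trimming criterion are all illustrative assumptions. It shows how applying the same two common pre-processing steps in a different order retains different trials and yields different summary statistics:

```python
# Minimal sketch of why pre-processing order matters for reaction times.
# The cutoffs below are illustrative assumptions, not the checklist's
# prescriptions: we run the same two steps in two different orders.
import numpy as np

rng = np.random.default_rng(0)
# Simulated reaction times in ms: a log-normal bulk plus a few
# fast anticipatory responses.
rt = np.concatenate([rng.lognormal(mean=6.3, sigma=0.4, size=500),
                     rng.uniform(50, 150, size=10)])

def drop_fast(x, floor=200.0):
    """Remove anticipatory responses below an absolute floor (ms)."""
    return x[x >= floor]

def trim_sd(x, k=2.5):
    """Remove observations beyond k standard deviations from the mean."""
    m, s = x.mean(), x.std()
    return x[np.abs(x - m) <= k * s]

# Pipeline A: absolute floor first, then SD trimming.
a = trim_sd(drop_fast(rt))
# Pipeline B: SD trimming first, then absolute floor. The anticipations
# now inflate the mean and SD used for trimming, so different trials
# survive than in Pipeline A.
b = drop_fast(trim_sd(rt))

print(f"Pipeline A: n={a.size}, mean={a.mean():.1f} ms")
print(f"Pipeline B: n={b.size}, mean={b.mean():.1f} ms")
```

Running both pipelines on the same data yields different sample sizes and means. This is exactly the kind of divergence a multiverse analysis maps out systematically, and why the checklist asks authors to report not only which steps they applied but also the order in which they applied them.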


Subject(s)
Reaction Time, Reaction Time/physiology, Humans, Checklist, Research Design/standards, Reproducibility of Results
3.
Open Res Eur; 3: 179, 2023.
Article in English | MEDLINE | ID: mdl-39036539

ABSTRACT

Background: Many interventions, especially those linked to open science, have been proposed to improve the reproducibility of science. It is unclear to what extent these proposals are grounded in scientific evidence from empirical evaluations. Aims: The primary objective is to identify Open Science interventions whose influence on reproducibility and replicability has been formally investigated. A secondary objective is to list any reported facilitators or barriers and to identify gaps in the evidence. Methods: We will search broadly, using electronic bibliographic databases and general internet searches and by contacting experts in the fields of reproducibility, replicability, and open science. Any study investigating the influence of interventions on the reproducibility and replicability of research will be selected, including studies that additionally investigate drivers of and barriers to the implementation and effectiveness of those interventions. Studies will first be screened by title and abstract (if available) and then by full text, with each step performed by at least two independent reviewers. We will analyze the existing scientific evidence using scoping review and evidence gap mapping methodologies. Results: The results will be presented in interactive evidence maps, summarized in a narrative synthesis, and will serve as input for subsequent research. Review registration: This protocol has been pre-registered on OSF at https://doi.org/10.17605/OSF.IO/D65YS.
