Results 1 - 4 of 4
1.
PLoS Biol ; 20(7): e3001680, 2022 07.
Article in English | MEDLINE | ID: mdl-35797414

ABSTRACT

Early career researchers (ECRs) are important stakeholders leading efforts to catalyze systemic change in research culture and practice. Here, we summarize the outputs from a virtual unconventional conference (unconference), which brought together 54 invited experts from 20 countries with extensive experience in ECR initiatives designed to improve the culture and practice of science. Together, we drafted 2 sets of recommendations for (1) ECRs directly involved in initiatives or activities to change research culture and practice; and (2) stakeholders who wish to support ECRs in these efforts. Importantly, these points apply to ECRs working to promote change on a systemic level, not only those improving aspects of their own work. In both sets of recommendations, we underline the importance of incentivizing and providing time and resources for systems-level science improvement activities, including ECRs in organizational decision-making processes, and working to dismantle structural barriers to participation for marginalized groups. We further highlight obstacles that ECRs face when working to promote reform, as well as proposed solutions and examples of current best practices. The abstract and recommendations for stakeholders are available in Dutch, German, Greek (abstract only), Italian, Japanese, Polish, Portuguese, Spanish, and Serbian.


Subject(s)
Research Personnel, Research Report, Humans, Power, Psychological
2.
Psychol Sci ; 34(4): 512-522, 2023 04.
Article in English | MEDLINE | ID: mdl-36730433

ABSTRACT

In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated to be exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.


Subject(s)
Editorial Policies, Periodicals as Topic, Psychology, Humans, Reproducibility of Results, Research/standards, Information Dissemination
3.
R Soc Open Sci ; 8(10): 210155, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34659776

ABSTRACT

In recent years, open science practices have become increasingly popular in psychology and related sciences. These practices aim to increase rigour and transparency in science as a potential response to the challenges posed by the replication crisis. Many of these reforms, including the increasingly used preregistration, have been designed for purely experimental work that tests straightforward hypotheses with standard inferential statistical analyses, such as assessing whether an experimental manipulation has an effect on a variable of interest. But psychology is a diverse field of research. The somewhat narrow focus of the prevalent discussions of, and templates for, preregistration has led to debates on how appropriate these reforms are for areas of research with more diverse hypotheses and more intricate methods of analysis, such as cognitive modelling research within mathematical psychology. Our article attempts to bridge the gap between open science and mathematical psychology, focusing on the type of cognitive modelling that Crüwell et al. (Crüwell S, Stefan AM, Evans NJ. 2019 Robust standards in cognitive science. Comput. Brain Behav. 2, 255-265) labelled model application, where researchers apply a cognitive model as a measurement tool to test hypotheses about parameters of the cognitive model. Specifically, we (i) discuss several potential researcher degrees of freedom within model application, (ii) provide the first preregistration template for model application and (iii) provide an example of a preregistered model application using our preregistration template. More broadly, we hope that our discussions and concrete proposals constructively advance the mostly abstract current debate surrounding preregistration in cognitive modelling, and provide a guide for how preregistration templates may be developed in other diverse or intricate research contexts.
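The "model application" idea summarized above, using a cognitive model as a measurement instrument and testing a preregistered hypothesis on one of its parameters, can be made concrete with a small sketch. The example below is hypothetical and is not taken from the article or its template: it uses equal-variance signal detection theory as the cognitive model and runs a preregistered paired test on the sensitivity parameter d' rather than on raw accuracy.

```python
# Hypothetical illustration (not from the article) of "model application":
# a cognitive model -- here equal-variance signal detection theory -- is used
# as a measurement tool, and a preregistered hypothesis is tested on its
# sensitivity parameter d' rather than on raw accuracy.
import numpy as np
from scipy import stats

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Estimate SDT sensitivity d' with a log-linear correction for 0/1 rates."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return stats.norm.ppf(hit_rate) - stats.norm.ppf(fa_rate)

# Toy per-participant counts (hits, misses, false alarms, correct rejections)
# for two conditions; in a real study these would come from the experiment.
rng = np.random.default_rng(0)
condition_a = [d_prime(*row) for row in rng.integers(5, 45, size=(30, 4))]
condition_b = [d_prime(*row) for row in rng.integers(5, 45, size=(30, 4))]

# Preregistered analysis: paired t-test on the model parameter d'.
t_stat, p_value = stats.ttest_rel(condition_a, condition_b)
print(f"mean d' A = {np.mean(condition_a):.2f}, "
      f"mean d' B = {np.mean(condition_b):.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

In a preregistration of this kind, the model, the parameter of interest, the correction for extreme rates and the statistical test would all be specified before data collection, closing off the researcher degrees of freedom the article discusses.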

4.
R Soc Open Sci ; 7(2): 190806, 2020 Feb.
Article in English | MEDLINE | ID: mdl-32257301

ABSTRACT

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility. Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts. In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017. Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]). Some articles explicitly disclosed funding sources (or lack thereof; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]). Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Fewer than half of the articles were publicly available (101/250, 40% [34% to 47%]). Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.
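Each prevalence estimate above is a proportion with a 95% confidence interval (for example, materials availability at 16/151, 11% [7% to 16%]). The abstract does not state which interval method was used; the sketch below assumes a Wilson score interval, which gives figures close to those reported.

```python
# Sketch: 95% confidence interval for a reported prevalence such as 16/151.
# The interval method (Wilson score) is an assumption; the abstract does not
# specify how its intervals were computed.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1.0 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half_width = (z / denom) * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half_width, center + half_width

lo, hi = wilson_ci(16, 151)  # materials availability
print(f"16/151 = {16/151:.0%}, 95% CI [{lo:.0%} to {hi:.0%}]")  # roughly 11% [7% to 17%]
```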
