Crowdsource authoring as a tool for enhancing the quality of competency assessments in healthcare professions.
Lin, Che-Wei; Clinciu, Daniel L; Salcedo, Daniel; Huang, Chih-Wei; Kang, Enoch Yi No; Li, Yu-Chuan Jack.
Affiliation
  • Lin CW; Department of Education and Humanities in Medicine, School of Medicine, College of Medicine, Taipei Medical University, Taipei, Taiwan.
  • Clinciu DL; Graduate Institute of Biomedical Science, China Medical University, Taichung, Taiwan.
  • Salcedo D; Taipei Municipal Wanfang Hospital, Taipei, Taiwan.
  • Huang CW; International Center for Health Information Technology, College of Medical Science and Technology, Taipei Medical University, Taipei, Taiwan.
  • Kang EYN; Department of Education, Wan Fang Hospital, Taipei Medical University, Taipei, Taiwan.
  • Li YJ; Evidence-Based Medicine Center, Wan Fang Hospital, Taipei Medical University, Taipei, Taiwan.
PLoS One; 18(11): e0278571, 2023.
Article in En | MEDLINE | ID: mdl-37917751
ABSTRACT
The current Objective Structured Clinical Examination (OSCE) is complex and costly, which makes it difficult to deliver high-quality assessments. This pilot study used a focus group and a debugging stage to test the Crowdsource Authoring Assessment Tool (CAAT), a platform for creating and sharing assessment instruments that can be edited and customized to match specific users' needs and to produce higher-quality checklists. Fifty international experts in competency assessment were asked to 1) participate in and experience the CAAT system by editing their own checklist, 2) edit a urinary catheterization checklist using the CAAT, and 3) complete a 14-item Technology Acceptance Model (TAM) questionnaire evaluating its four domains. The study took place between October 2018 and May 2019. The median time to develop a new checklist with the CAAT was 65.76 minutes, compared with 167.90 minutes for the traditional method. The CAAT enabled faster checklist creation and editing regardless of participants' experience and native language. Participants also reported that the CAAT enhanced checklist development, and 96% were willing to recommend the tool to others. As this study shows, a crowdsource authoring tool reduced checklist development time to roughly 40% of that required by the traditional method. In addition, it lets collaborators work together on a simple platform that encourages contributions to checklist creation, editing, and rating.
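A quick check of the reported time savings, a sketch based only on the two median times quoted in the abstract:

\[
\frac{t_{\mathrm{CAAT}}}{t_{\mathrm{traditional}}} = \frac{65.76\ \text{min}}{167.90\ \text{min}} \approx 0.39
\]

so checklist authoring with the CAAT took roughly 39% of the time required by the traditional method, a reduction of about 61%.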
Full text: 1 Collection: 01-internacional Database: MEDLINE Main subject: Crowdsourcing Limits: Humans Language: En Journal: PLoS One Journal subject: CIENCIA / MEDICINA Year: 2023 Document type: Article Affiliation country: