Med Teach. 2024 Feb;46(2):239-244.
Article in English | MEDLINE | ID: mdl-37605843

ABSTRACT

PURPOSE: To assess interrater reliability and the examiner characteristics, especially specialty, associated with scoring of a neurology objective structured clinical examination (OSCE).

MATERIALS AND METHODS: During a neurology mock OSCE, five randomly chosen student volunteers were filmed while performing one of the five stations. The video recordings were scored by physicians from the Lyon and Clermont-Ferrand university teaching hospitals, who assessed student performance using both a checklist score and a global rating scale. Interrater reliability between examiners was assessed using the intraclass correlation coefficient. Multivariable linear regression models including the video recording as a random effect were used to identify factors associated with scoring.

RESULTS: Thirty examiners, including 15 (50%) neurologists, participated. The intraclass correlation coefficients of checklist scores and global ratings between examiners were 0.71 (95% CI [0.45-0.95]) and 0.54 (95% CI [0.28-0.91]), respectively. In multivariable analyses, no factor was associated with checklist scores, whereas male examiner gender was associated with lower global ratings (β coefficient = -0.37; 95% CI [-0.62 to -0.11]).

CONCLUSIONS: Using a video-based scoring method, our study showed that agreement among examiners in a neurology OSCE was good for checklist scoring but only moderate for the global rating scale. Examiner specialty did not affect scoring, whereas examiner gender was associated with the global rating scale.
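The analysis described above rests on two standard steps: an intraclass correlation coefficient across examiners rating the same filmed performances, and a mixed-effects regression with the video recording as a random effect. The sketch below illustrates how such an analysis could be run in Python with pingouin and statsmodels; the column names (video, examiner, checklist_score, global_rating, gender, specialty) are hypothetical placeholders, not the authors' actual dataset or code.

```python
import pandas as pd
import pingouin as pg
import statsmodels.formula.api as smf

# Long-format ratings: one row per (video, examiner) pair.
# Column names are assumptions; the study's dataset is not public.
ratings = pd.read_csv("osce_ratings.csv")

# Interrater reliability: ICC of examiners scoring the same video recordings.
icc = pg.intraclass_corr(
    data=ratings,
    targets="video",            # the filmed station performances
    raters="examiner",
    ratings="checklist_score",  # repeat with "global_rating" for the second scale
)
print(icc[["Type", "ICC", "CI95%"]])

# Factors associated with scoring: linear mixed model with a random
# intercept per video recording.
model = smf.mixedlm(
    "global_rating ~ gender + specialty",
    data=ratings,
    groups=ratings["video"],
).fit()
print(model.summary())
```

Treating the video recording as the grouping (random-effect) variable accounts for the fact that all examiners scored the same five performances, so ratings of the same video are not independent observations.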


Subject(s)
Medicine , Neurology , Students, Medical , Humans , Male , Reproducibility of Results , Educational Measurement/methods , Clinical Competence