J Surg Educ ; 76(3): 814-823, 2019.
Article in English | MEDLINE | ID: mdl-30472061

ABSTRACT

OBJECTIVE: Providing feedback to surgical trainees is a critical component of technical-skills assessment, yet it remains costly and time consuming. We hypothesize that statistical selection can identify a homogeneous group of nonexpert crowdworkers capable of accurately grading inanimate surgical video.

DESIGN: Applicants auditioned by grading 9 training videos using the Objective Structured Assessment of Technical Skills (OSATS) tool and an error-based checklist. The summed OSATS, summed errors, and OSATS summary score were tested for outliers using Cronbach's alpha and single-measure intraclass correlation. Accepted crowdworkers then submitted grades for videos in 3 different compositions: full video at 1× speed, full video at 2× speed, and critical-section segmented video. Graders were blinded to this study, and a similar statistical analysis was performed.

SETTING: The study was conducted at the University of Pittsburgh Medical Center (Pittsburgh, PA), a tertiary care academic teaching hospital.

PARTICIPANTS: Thirty-six premedical students participated as crowdworker applicants, and 2 surgery experts served as the gold standard for comparison.

RESULTS: The selected-hire intraclass correlation was 0.717 for Total Errors and 0.794 for Total OSATS for the first hire group, and 0.800 for Total OSATS and 0.654 for Total Errors for the second hire group. There was very good correlation between full videos at 1× and 2× speed, with an interitem statistic of 0.817 for errors and 0.86 for OSATS. Only moderate correlation was found with critical-section segments. In 1 year, 275 hours of inanimate video were graded, at a cost of $22.27/video or $1.03/minute.

CONCLUSIONS: Statistical selection can be used to identify a homogeneous cohort of crowdworkers for grading trainees' inanimate drills. Crowdworkers can distinguish OSATS metrics and errors in full videos at 2× speed but were less consistent with segmented videos. The program is a comparatively cost-effective way to provide feedback to surgical trainees.
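The grader-selection step screens crowdworkers with internal-consistency statistics. As a rough illustration only (not the authors' code), Cronbach's alpha for a videos-by-graders score matrix can be sketched as follows; the matrix shape and function name are assumptions for the example:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_videos, n_graders) score matrix.

    alpha = k/(k-1) * (1 - sum of per-grader variances / variance of totals),
    where k is the number of graders (treated as "items").
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # sample variance of each grader
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of per-video total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Example: two graders in perfect agreement yield alpha = 1.0
scores = np.array([[1.0, 1.0],
                   [2.0, 2.0],
                   [3.0, 3.0]])
print(cronbach_alpha(scores))
```

In a selection workflow like the one described, a candidate grader whose inclusion drops alpha below a chosen threshold would be flagged as an outlier and excluded from the hired cohort.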


Subjects
Anastomosis, Surgical/education , Clinical Competence , Crowdsourcing , Education, Medical, Graduate/methods , Educational Measurement/methods , Robotic Surgical Procedures/education , Surgical Oncology/education , Checklist , Curriculum , Formative Feedback , Humans , Internship and Residency , Pennsylvania , Simulation Training , Video Recording