Results 1 - 3 of 3
1.
Global Surg Educ ; 2(1): 30, 2023.
Article in English | MEDLINE | ID: mdl-38013865

ABSTRACT

Purpose: In response to the COVID-19 pandemic, many educational activities in general surgery residency have shifted to a virtual environment, including the American Board of Surgery (ABS) Certifying Exam, and virtual exams may become the new standard. In response, we developed an evaluation instrument, the ACES-Pro, to assess surgical trainee performance with a focus on examsmanship in virtual oral board examinations. The purpose of this study was two-fold: (1) to assess the utility and validity of the evaluation instrument, and (2) to characterize the unique components of strong examsmanship in the virtual setting, which poses distinct challenges compared with in-person examsmanship.

Methods: We developed a 15-question evaluation instrument, the ACES-Pro, to assess oral board performance in the virtual environment. Nine attending surgeons viewed four pre-recorded oral board exam scenarios and scored examinees using this instrument. Evaluations were compared to assess inter-rater reliability, and faculty were also surveyed about their experience using the instrument.

Results: Pilot evaluators found the ACES-Pro instrument easy to use and felt it appropriately captured key professionalism metrics of oral board exam performance. We found acceptable inter-rater reliability in the domains of verbal communication, non-verbal communication, and effective use of technology (Guttman's lambda-2 of 0.796, 0.916, and 0.739, respectively).

Conclusions: The ACES-Pro instrument is an assessment with validity evidence, as understood through Kane's framework, for evaluating multiple examsmanship domains in the virtual exam setting. Examinees must consider best practices for virtual examsmanship to perform well in this environment.

Supplementary Information: The online version contains supplementary material available at 10.1007/s44186-023-00107-7.
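The reliability figures above are Guttman's lambda-2 coefficients computed across raters. A minimal sketch of how such a coefficient can be calculated from a performances-by-raters score matrix; the matrix below is invented for illustration and is not the study's data:

```python
import numpy as np

def guttman_lambda2(scores):
    """Guttman's lambda-2 reliability for a performances-by-raters score matrix."""
    x = np.asarray(scores, dtype=float)
    n = x.shape[1]                               # number of raters, treated as items
    cov = np.cov(x, rowvar=False)                # rater covariance matrix
    total_var = cov.sum()                        # variance of the summed score
    sum_item_var = np.trace(cov)                 # sum of individual rater variances
    sum_sq_offdiag = (cov ** 2).sum() - (np.diag(cov) ** 2).sum()
    return (total_var - sum_item_var
            + np.sqrt(n / (n - 1) * sum_sq_offdiag)) / total_var

# Hypothetical 1-5 ratings: 4 recorded scenarios (rows) scored by 9 raters (columns).
ratings = np.array([
    [4, 5, 4, 4, 5, 4, 3, 4, 4],
    [2, 3, 2, 3, 2, 2, 3, 2, 3],
    [5, 5, 4, 5, 5, 4, 5, 5, 4],
    [3, 3, 3, 2, 3, 3, 2, 3, 3],
])
print(round(guttman_lambda2(ratings), 3))
```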

2.
Am J Surg ; 225(5): 841-846, 2023 05.
Article in English | MEDLINE | ID: mdl-36764899

ABSTRACT

BACKGROUND: As a community-based medical school that recruited faculty preceptors new to teaching, we sought to create objective assessments for fourth-year surgery experiences via administration of an oral exam. Students provided three authentic cases, which faculty used as a springboard to ascertain student proficiency in five entrustable professional activities: (1) oral presentation, (2) recognition of urgency/instability, (3) calling consults, (4) transitions of care, and (5) informed consent. We present proof-of-concept and an analysis of student case submissions.

METHODS: Twenty-seven student submissions (79 cases in total) were evaluated for case complexity, level-appropriateness, and an estimation of the ability to conduct a quality exam based on the information provided (subjective measures). Objective metrics included word count, instruction adherence, and inclusion of figures/captions. A resident-in-training rated cases using the same metrics. In-examination data were culled separately.

RESULTS: The average word count was 281.70 (SD 140.23; range 40-743). Figures were included in 26.1% of cases. Faculty raters scored 29.0% of cases as low-complexity, 37.7% as medium-complexity, and 33.3% as high-complexity. Raters felt 62.3% of cases provided enough information to conduct a quality exam. The majority of cases submitted (65.2%) were level-appropriate or higher. The resident rater scored cases more favorably than the surgeons (Cohen's kappa of -0.5), suggesting low inter-rater agreement between raters of differing experience levels.

CONCLUSION: Students' case submissions lessened faculty burden and provided assessors with adequate information to deliver a quality exam assessing proficiency in clinical skills essential for residency. Cases demonstrated sufficient complexity and level-appropriateness. The request to correlate case rating with exam performance is under review by our institution's assessment office. Near-peer tutoring by resident alumni is a program under development.
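The faculty-resident agreement above is reported as Cohen's kappa. A minimal sketch of that calculation; the complexity labels below are invented for illustration and will not reproduce the study's reported value:

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels to the same cases."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    labels = np.union1d(a, b)
    # Contingency table: rows = rater A's label, columns = rater B's label.
    table = np.array([[np.sum((a == la) & (b == lb)) for lb in labels]
                      for la in labels], dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n                          # observed agreement
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical complexity ratings (1 = low, 2 = medium, 3 = high) for ten shared cases.
faculty  = [1, 2, 3, 2, 1, 3, 2, 3, 1, 2]
resident = [2, 3, 3, 3, 2, 3, 3, 3, 2, 3]
print(round(cohens_kappa(faculty, resident), 2))
```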


Subjects
Internship and Residency, Medical Students, Humans, Clinical Competence, Medical Faculty, Oral Diagnosis
3.
Front Psychol ; 13: 992314, 2022.
Article in English | MEDLINE | ID: mdl-36591083

ABSTRACT

Introduction: Based on self-determination theory, we investigated whether examinees can be classified into profiles, defined by basic need strength and perceived need support, that differ in stress parameters and achievement in the context of a standardized oral exam.

Methods: Ninety-two students reported their basic need strength before the exam and the need support they perceived from the examiner once after the exam. Students indicated their emotions and stress perception at four measurement points, and we measured their salivary cortisol concurrently, analyzing stress-related changes over time.

Results: Latent class analyses revealed two higher-quality (low/high, high/high) and two lower-quality (low/low, high/low) need strength/need support classes. Physio-affective stress development was typical of exam situations. Higher-quality classes, in which need support met or exceeded need strength, displayed more beneficial stress and emotion response patterns than lower-quality classes. Gain-related emotions mediated achievement in the higher-quality classes.

Discussion: Need-supportive examiners can promote student well-being and achievement when they succeed in providing high need satisfaction.
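The profile solution above comes from latent class analysis. As a rough, hypothetical sketch of the idea with continuous indicators (a Gaussian mixture model is a common stand-in for latent profile analysis; the abstract's study would have used dedicated software, and the data below are simulated for illustration only):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated standardized indicators for 92 examinees:
# column 0 = basic need strength, column 1 = perceived need support (invented data).
rng = np.random.default_rng(42)
indicators = np.vstack([
    rng.normal(loc=[-1.0,  1.0], scale=0.4, size=(23, 2)),   # low strength / high support
    rng.normal(loc=[ 1.0,  1.0], scale=0.4, size=(23, 2)),   # high strength / high support
    rng.normal(loc=[-1.0, -1.0], scale=0.4, size=(23, 2)),   # low strength / low support
    rng.normal(loc=[ 1.0, -1.0], scale=0.4, size=(23, 2)),   # high strength / low support
])

# Fit a four-class mixture, mirroring the low/high x high/low solution in the abstract.
model = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
classes = model.fit_predict(indicators)
print(np.bincount(classes))              # class sizes
print(round(model.bic(indicators), 1))   # BIC is commonly used to choose the class count
```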
