Results 1 - 20 of 65
1.
Paediatr Child Health ; 28(8): 463-467, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38638538

ABSTRACT

Objectives: In 2017, Queen's University launched Competency-Based Medical Education (CBME) across 29 programs simultaneously. Two years post-implementation, we asked key stakeholders (faculty, residents, and program leaders) within the Pediatrics program for their perspectives on and experiences with CBME so far. Methods: Program leadership explicitly described the intended outcomes of implementing CBME. Focus groups and interviews were conducted with all stakeholders to describe the enacted implementation. The intended and enacted implementations were compared to provide insight into the adaptations needed for program improvement. Results: Overall, stakeholders saw value in the concept of CBME. Residents felt they received more specific feedback, and found monthly Competence Committee (CC) meetings and Academic Advisors helpful. Conversely, all stakeholders noted that the increased expectations had led to a feeling of assessment fatigue. Faculty found direct observation and the lack of access to a resident's previous performance information challenging. Residents wanted to see faculty initiate assessments and wanted improved transparency around progress and promotion decisions. Discussion: The results provided insight into how well the intended outcomes had been achieved, as well as areas for improvement. Proposed adaptations included increased direct observation and exploration of faculty access to residents' previous performance information. Education was provided on the performance expectations of residents and how progress and promotion decisions are made. As well, "flex blocks" were created to help residents customize their training experience to meet their learning needs. The results of this study can be used to inform and guide implementation and adaptations in other programs and institutions.

2.
Med Teach ; 44(7): 781-789, 2022 07.
Article in English | MEDLINE | ID: mdl-35199617

ABSTRACT

PURPOSE: This study evaluated the fidelity of competence committee (CC) implementation in Canadian postgraduate specialist training programs during the transition to competency-based medical education (CBME). METHODS: A national survey of CC chairs was distributed to all CBME training programs in November 2019. Survey questions were derived from guiding documents published by the Royal College of Physicians and Surgeons of Canada reflecting intended processes and design. RESULTS: Response rate was 39% (113/293) with representation from all eligible disciplines. Committee size ranged from 3 to 20 members, 42% of programs included external members, and 20% included a resident representative. Most programs (72%) reported that a primary review and synthesis of resident assessment data occurs prior to the meeting, with some data reviewed collectively during meetings. When determining entrustable professional activity (EPA) achievement, most programs followed the national specialty guidelines closely with some exceptions (53%). Documented concerns about professionalism, EPA narrative comments, and EPA entrustment scores were most highly weighted when determining resident progress decisions. CONCLUSIONS: Heterogeneity in CC implementation likely reflects local adaptations, but may also explain some of the variable challenges faced by programs during the transition to CBME. Our results offer educational leaders important fidelity data that can help inform the larger evaluation and transformation of CBME.


Subject(s)
Internship and Residency , Physicians , Canada , Clinical Competence , Competency-Based Education , Humans , Specialization
3.
Med Teach ; 44(8): 886-892, 2022 08.
Article in English | MEDLINE | ID: mdl-36083123

ABSTRACT

PURPOSE: Organizational readiness is critical for successful implementation of an innovation. We evaluated program readiness to implement Competence by Design (CBD), a model of Competency-Based Medical Education (CBME), among Canadian postgraduate training programs. METHODS: A survey of program directors was distributed 1 month prior to CBD implementation in 2019. Questions were informed by the R = MC2 framework of organizational readiness and addressed: program motivation, general capacity for change, and innovation-specific capacity. An overall readiness score was calculated. An ANOVA was conducted to compare overall readiness between disciplines. RESULTS: Survey response rate was 42% (n = 79). The mean overall readiness score was 74% (30-98%). There was no difference in scores between disciplines. The majority of respondents agreed that successful implementation of CBD was a priority (74%), and that their leadership (94%) and faculty and residents (87%) were supportive of change. Fewer perceived that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). Curriculum mapping, competence committees and programmatic assessment activities were completed by >90% of programs, while <50% had engaged off-service disciplines. CONCLUSION: Our study highlights important areas where programs excelled in their preparation for CBD, as well as common challenges that serve as targets for future intervention to improve program readiness for CBD implementation.
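The overall readiness score described above (a percentage summarizing survey responses) can be illustrated with a small sketch. The R = MC² framework and the 30-98% score range come from the abstract; the 5-point agreement scale, the item set, and the rescaling rule are hypothetical, since the abstract does not specify how the score was computed:

```python
def readiness_score(responses, scale_max=5):
    """Rescale the mean of 1..scale_max Likert responses to 0-100%.
    Hypothetical scoring rule; higher values mean greater readiness."""
    mean = sum(responses) / len(responses)
    return 100 * (mean - 1) / (scale_max - 1)

# Hypothetical program: one response per R = MC^2-informed item
# (motivation, general capacity, innovation-specific capacity).
one_program = [4, 5, 3, 4, 2, 4]
print(f"overall readiness: {readiness_score(one_program):.0f}%")
```

Any monotone rescaling would serve; the point is only that a single percentage lets readiness be compared across programs and disciplines, as the ANOVA in the study does.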


Subject(s)
Competency-Based Education , Education, Medical , Canada , Curriculum , Humans , Leadership
4.
Ann Emerg Med ; 77(6): 613-619, 2021 06.
Article in English | MEDLINE | ID: mdl-33160719

ABSTRACT

STUDY OBJECTIVE: Little is known about the cause or optimal treatment of hyperemesis in habitual cannabis users. Anecdotal evidence supports the use of haloperidol over traditional antiemetics for this newly recognized disorder. We compare haloperidol with ondansetron for cannabis hyperemesis syndrome. METHODS: We randomized cannabis users with active emesis to either haloperidol (with a nested randomization to either 0.05 or 0.1 mg/kg) or ondansetron 8 mg intravenously in a triple-blind fashion. The primary outcome was the reduction from baseline in abdominal pain and nausea (each measured on a 10-cm visual analog scale) at 2 hours after treatment. Although the trial allowed for crossover, the primary analysis used only the first treatment period because few subjects crossed over. RESULTS: We enrolled 33 subjects, of whom 30 (16 men, aged 29 years [SD 11 years] using 1.5 g/day [SD 0.9 g/day] since age 19 years [SD 2 years]) received at least 1 treatment (haloperidol 13, ondansetron 17). Haloperidol at either dose was superior to ondansetron (difference 2.3 cm [95% confidence interval 0.6 to 4.0 cm]; P=.01), with similar improvements in both pain and nausea, as well as less use of rescue antiemetics (31% versus 59%; difference -28% [95% confidence interval -61% to 13%]) and shorter time to emergency department (ED) departure (3.1 hours [SD 1.7] versus 5.6 hours [SD 4.5]; difference 2.5 hours [95% confidence interval 0.1 to 5.0 hours]; P=.03). There were 2 return visits for acute dystonia, both in the higher-dose haloperidol group. CONCLUSION: In this clinical trial, haloperidol was superior to ondansetron for the acute treatment of cannabis-associated hyperemesis. The efficacy of haloperidol over ondansetron provides insight into the pathophysiology of this now common diagnosis in many EDs.
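The between-group comparison reported above can be reproduced from the summary statistics in the abstract. A minimal sketch in Python, using a Welch standard error with a normal critical value rather than the exact t-based interval the authors presumably used (the means, SDs, and group sizes are taken from the abstract):

```python
import math

def diff_ci_from_summary(m1, s1, n1, m2, s2, n2, z=1.96):
    """Approximate 95% CI for a difference in means from summary
    statistics (Welch standard error, normal critical value)."""
    diff = m2 - m1
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    return diff, (diff - z * se, diff + z * se)

# ED length of stay: haloperidol 3.1 h (SD 1.7, n=13) versus
# ondansetron 5.6 h (SD 4.5, n=17), per the abstract.
diff, (lo, hi) = diff_ci_from_summary(3.1, 1.7, 13, 5.6, 4.5, 17)
print(f"difference {diff:.1f} h, 95% CI ({lo:.1f} to {hi:.1f} h)")
```

The normal-approximation interval (about 0.2 to 4.8 h) is slightly narrower than the published t-based 0.1 to 5.0 h, as expected at these small sample sizes.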


Subject(s)
Antiemetics/administration & dosage , Haloperidol/administration & dosage , Marijuana Abuse/complications , Ondansetron/administration & dosage , Vomiting/chemically induced , Vomiting/drug therapy , Administration, Intravenous , Adult , Female , Humans , Male , Pain Measurement , Syndrome
5.
Med Educ ; 55(9): 1047-1055, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34060651

ABSTRACT

PURPOSE: Competency-based medical education (CBME) has prompted widespread implementation of workplace-based assessment (WBA) tools using entrustment anchors. This study aimed to identify factors that influence faculty's rating choices immediately following assessment and explore their experiences using WBAs with entrustment anchors, specifically the Ottawa Surgical Competency Operating Room Evaluation scale. METHOD: Fifty semi-structured interviews were conducted with a convenience sample of Emergency Medicine (EM) physicians from a single Canadian hospital between July and August 2019. All interviews occurred within two hours of faculty completing a WBA of a trainee. Faculty were asked what they considered when rating the trainee's performance and whether they considered an alternate rating. Two team members independently analysed interview transcripts using conventional content analysis with line-by-line coding to identify themes. RESULTS: Interviews captured interactions between 70% (26/37) of full-time EM faculty and 86% (19/22) of EM trainees. Faculty most commonly identified the amount of guidance the trainee required as influencing their rating. Other variables, such as clinical context, trainee experience, past experiences with the trainee, and perceived competence and confidence, were also identified. While most faculty did not struggle to assign ratings, some had difficulty interpreting the language of entrustment anchors: they were unsure whether their assessment should be retrospective or prospective in nature, and if/how the assessment should change depending on whether they were 'in the room' or not. CONCLUSIONS: By going to the frontline during WBA encounters, this study captured authentic and honest reflections from physicians immediately engaged in assessment using entrustment anchors.
While many of the factors identified are consistent with previous retrospective work, we highlight how some faculty consider factors outside the prescribed approach and struggle with the language of entrustment anchors. These results further our understanding of 'in-the-moment' assessments using entrustment anchors and may facilitate effective faculty development regarding WBA in CBME.


Subject(s)
Internship and Residency , Workplace , Canada , Clinical Competence , Faculty, Medical , Humans
6.
Teach Learn Med ; 33(3): 258-269, 2021.
Article in English | MEDLINE | ID: mdl-33302734

ABSTRACT

Phenomenon: Visual expertise in medicine involves a complex interplay between expert visual behavior patterns and higher-level cognitive processes. Previous studies of visual expertise in medicine have centered on traditionally visually intensive disciplines such as radiology and pathology. However, there is limited study of visual expertise in electrocardiogram (ECG) interpretation, a common clinical task that is associated with high error rates. This qualitatively driven multi-methods study aimed to describe differences in cognitive approaches to ECG interpretation between medical students, emergency medicine (EM) residents, and EM attending physicians. Approach: Ten medical students, 10 EM residents, and 10 EM attending physicians were recruited from one tertiary academic center to participate in this study. Participants interpreted 10 ECGs with a screen-based eye-tracking device, then underwent a retrospective interview augmented by playback of their own gaze scan-paths recorded by the eye tracker. Interviews were transcribed verbatim and an emergent thematic analysis was performed across participant groups. Diagnostic speed, accuracy, and heat maps of fixation distribution were collected to supplement the qualitative findings. Findings: Qualitative analysis demonstrated differences among the cohorts in three major themes: dual-process reasoning, ability to prioritize, and clinical implications. These qualitative findings aligned with differences in visual behavior demonstrated by heat maps of fixation distribution across each ECG. More experienced participants completed ECG interpretation significantly faster and more accurately than less experienced participants. Insights: The cognitive processes related to ECG interpretation differed between novices and more experienced providers in EM.
Understanding the differences in cognitive approaches to ECG interpretation between these groups may help inform best practices in teaching this ubiquitous diagnostic skill.
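The fixation heat maps mentioned above are typically built by binning gaze samples into a spatial grid. A minimal sketch (the grid size, image dimensions, and sample coordinates are illustrative; real eye-tracking pipelines also smooth the grid and often normalize by total viewing time):

```python
def fixation_grid(fixations, width, height, nx=8, ny=4):
    """Bin (x, y, duration_ms) fixations into an ny-by-nx grid,
    accumulating total fixation duration per cell."""
    grid = [[0.0] * nx for _ in range(ny)]
    for x, y, dur in fixations:
        col = min(int(x / width * nx), nx - 1)
        row = min(int(y / height * ny), ny - 1)
        grid[row][col] += dur
    return grid

# Illustrative gaze samples over a 1280x512 ECG image.
samples = [(100, 60, 250), (110, 70, 180), (900, 400, 300)]
grid = fixation_grid(samples, width=1280, height=512)
```

A heat map is then just a color rendering of this grid; comparing grids between novices and experts highlights where each group concentrates its attention on the ECG.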


Subject(s)
Emergency Medicine , Students, Medical , Clinical Competence , Electrocardiography , Emergency Medicine/education , Eye-Tracking Technology , Humans
7.
Med Teach ; 43(7): 745-750, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34020580

ABSTRACT

The international movement to competency-based medical education (CBME) marks a major transition in medical education that requires a shift in educators' and learners' approach to clinical experiences, the way assessment data are collected and integrated, and in learners' mindsets. Learners entering a CBME curriculum must actively drive their learning experiences and education goals. For some, this expectation may be a significant change from their previous approach to learning in medicine. This paper highlights 12 tips to help learners succeed within a CBME model.


Subject(s)
Competency-Based Education , Education, Medical , Curriculum , Humans , Learning
8.
Med Teach ; 43(7): 801-809, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34033512

ABSTRACT

Medical education is situated within health care and educational organizations that frequently lag in their use of data to learn, develop, and improve performance. How might we leverage competency-based medical education (CBME) assessment data at the individual, program, and system levels, with the goal of redefining CBME from an initiative that supports the development of physicians to one that also fosters the development of the faculty, administrators, and programs within our organizations? In this paper we review the Deliberately Developmental Organization (DDO) framework proposed by Robert Kegan and Lisa Lahey, a theoretical framework that explains how organizations can foster the development of their people. We then describe the DDO's conceptual alignment with CBME and outline how CBME assessment data could be used to spur the transformation of health care and educational organizations into digitally integrated DDOs. A DDO-oriented use of CBME assessment data will require intentional investment into both the digitalization of assessment data and the development of the people within our organizations. By reframing CBME in this light, we hope that educational and health care leaders will see their investments in CBME as an opportunity to spur the evolution of a developmental culture.


Subject(s)
Education, Medical , Physicians , Competency-Based Education , Humans , Learning
9.
Med Teach ; 43(7): 758-764, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34061700

ABSTRACT

Programmatic assessment as a concept is still novel for many in clinical education, and there may be a disconnect between the academics who publish about programmatic assessment and the front-line clinical educators who must put theory into practice. In this paper, we clearly define programmatic assessment and present high-level guidelines about its implementation in competency-based medical education (CBME) programs. The guidelines are informed by literature and by lessons learned from established programmatic assessment approaches. We articulate five steps to consider when implementing programmatic assessment in CBME contexts: articulate the purpose of the program of assessment, determine what must be assessed, choose tools fit for purpose, consider the stakes of assessments, and define processes for interpreting assessment data. In the process, we seek to offer a helpful guide or template for front-line clinical educators. We dispel some myths about programmatic assessment to help training programs as they look to design-or redesign-programs of assessment. In particular, we highlight the notion that programmatic assessment is not 'one size fits all'; rather, it is a system of assessment that results when shared common principles are considered and applied by individual programs as they plan and design their own bespoke model of programmatic assessment for CBME in their unique context.


Subject(s)
Competency-Based Education , Education, Medical , Humans
10.
Med Teach ; 43(7): 794-800, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34121596

ABSTRACT

There is an urgent need to capture the outcomes of the ongoing global implementation of competency-based medical education (CBME). However, the measurement of downstream outcomes following educational innovations such as CBME is fraught with challenges stemming from the complexities of medical training, the breadth and variability of inputs, and the difficulties attributing outcomes to specific educational elements. In this article, we present a logic model for CBME to conceptualize an impact pathway relating to CBME and facilitate outcomes evaluation. We further identify six strategies to mitigate the challenges of outcomes measurement: (1) clearly identify the outcome of interest, (2) distinguish between outputs and outcomes, (3) carefully consider attribution versus contribution, (4) connect outcomes to the fidelity and integrity of implementation, (5) pay attention to unanticipated outcomes, and (6) embrace methodological pluralism. Embracing these challenges, we argue that careful and thoughtful evaluation strategies will move us forward in answering the all-important question: Are the desired outcomes of CBME being achieved?


Subject(s)
Competency-Based Education , Education, Medical , Humans
11.
Med Teach ; 43(7): 751-757, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34410891

ABSTRACT

The ongoing adoption of competency-based medical education (CBME) across health professions training draws focus to learner-centred educational design and the importance of fostering a growth mindset in learners, teachers, and educational programs. An emerging body of literature addresses the instructional practices and features of learning environments that foster the skills and strategies necessary for trainees to be partners in their own learning and progression to competence and to develop skills for lifelong learning. Aligned with this emerging area is an interest in Dweck's self-theories and the concept of the growth mindset. The growth mindset is an implicit belief held by an individual that intelligence and abilities are changeable, rather than fixed and immutable. In this paper, we present an overview of the growth mindset and how it aligns with the goals of CBME. We describe the challenges associated with shifting away from the fixed mindset of most traditional medical education assumptions and practices and discuss potential solutions and strategies at the individual, relational, and systems levels. Finally, we present future directions for research to better understand the growth mindset in the context of CBME.


Subject(s)
Competency-Based Education , Education, Medical , Health Occupations , Humans , Learning
12.
Med Teach ; 43(7): 810-816, 2021 07.
Article in English | MEDLINE | ID: mdl-34038645

ABSTRACT

Competency-based medical education has been advocated as the future of medical education for nearly a half-century. Inherent to this is the promise that advancement and transitions in training would be defined by readiness to practice rather than by time. Of the logistical problems facing competency-based, time-variable (CBTV) training, enacting time variability may be the largest hurdle to clear. Although it is true that an 'all or nothing' approach to CBTV training would require massive overhauls of both medical education and health care systems, the authors propose that training institutions should gradually evolve within their current environments to incrementally move toward the best version of CBTV training for learners, supervisors, and patients. In support of this evolution, the authors seek to demonstrate the feasibility of advancing toward the goal of realistic CBTV training by detailing examples of successful CBTV training and describing key features of initial steps toward CBTV training implementation.


Subject(s)
COVID-19 , Pandemics , Clinical Competence , Competency-Based Education , Humans , SARS-CoV-2
13.
Med Teach ; 43(7): 788-793, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34038673

ABSTRACT

As the global transformation of postgraduate medical training continues, there are persistent calls for program evaluation efforts to understand the impact and outcomes of competency-based medical education (CBME) implementation. The measurement of a complex educational intervention such as CBME is challenging because of the multifaceted nature of activities and outcomes. What is needed, therefore, is an organizational taxonomy to both conceptualize and categorize multiple outcomes. In this manuscript we propose a taxonomy that builds on preceding works to organize CBME outcomes across three domains: focus (educational, clinical), level (micro, meso, macro), and timeline (training, transition to practice, practice). We also provide examples of how to conceptualize outcomes of educational interventions across medical specialties using this taxonomy. By proposing a shared language for outcomes of CBME, we hope that this taxonomy will help organize ongoing evaluation work and catalyze those seeking to engage in the evaluation effort to help understand the impact and outcomes of CBME.


Subject(s)
Curriculum , Education, Medical , Competency-Based Education , Humans , Language , Program Evaluation
14.
Med Teach ; 42(7): 756-761, 2020 07.
Article in English | MEDLINE | ID: mdl-32450049

ABSTRACT

The COVID-19 pandemic has disrupted healthcare systems around the world, impacting how we deliver medical education. The normal day-to-day routines have been altered for a number of reasons, including changes to scheduled training rotations, physical distancing requirements, trainee redeployment, and heightened level of concern. Medical educators will likely need to adapt their programs to maximize learning, maintain effective care delivery, and ensure competent graduates. Along with a continued focus on learner/faculty wellness, medical educators will have to optimize existing training experiences, adapt those that are no longer viable, employ new technologies, and be flexible when assessing competencies. These practical tips offer guidance on how to adapt medical education programs within the constraints of the pandemic landscape, stressing the need for communication, innovation, collaboration, flexibility, and planning within the era of competency-based medical education.


Subject(s)
Coronavirus Infections/epidemiology , Health Occupations/education , Mental Health , Pneumonia, Viral/epidemiology , Adaptation, Psychological , Betacoronavirus , COVID-19 , Healthy Lifestyle , Humans , Organizational Culture , Organizational Innovation , Pandemics , SARS-CoV-2 , Social Support , Students, Health Occupations/psychology
15.
Ann Emerg Med ; 74(5): 647-659, 2019 11.
Article in English | MEDLINE | ID: mdl-31080034

ABSTRACT

STUDY OBJECTIVE: Simulation is commonly used to teach crisis resource management skills and assess them in emergency medicine residents. However, our understanding of the cognitive processes underlying crisis resource management skills is limited because these processes are difficult to assess and describe. The objective of this study is to uncover and characterize the cognitive processes underlying crisis resource management skills and to describe how these processes vary between residents according to performance in a simulation-based examination. METHODS: Twenty-two of 24 eligible emergency medicine trainees from 1 tertiary academic center completed 1 or 2 resuscitation-based examinations in the simulation laboratory. Resident performance was assessed by a blinded expert using an entrustment-based scoring tool. Participants wore eye-tracking glasses that generated first-person video that was used to augment subsequent interviews led by an emergency medicine faculty member. Interviews were audio recorded and then transcribed. An emergent thematic analysis was completed with a codebook that was developed by 4 research assistants, with subsequent analyses conducted by the lead research assistant with input from emergency medicine faculty. Themes from high- and low-performing residents were subsequently qualitatively compared. RESULTS: Higher-performing residents were better able to anticipate, selectively attend to relevant information, and manage cognitive demands, and took a concurrent (as opposed to linear) approach to managing the simulated patient. CONCLUSION: The results provide new insights into residents' cognitive processes while managing simulated patients in an examination environment and how these processes vary with performance. More work is needed to determine how best to apply these findings to improve crisis resource management education.


Subject(s)
Clinical Competence/standards , Emergency Medicine/education , Internship and Residency , Patient Simulation , Resuscitation , Cognition , Competency-Based Education , Educational Measurement , Evaluation Studies as Topic , Humans , Physical Examination , Resuscitation/education , Resuscitation/standards , Video Recording
16.
Med Teach ; 41(4): 385-390, 2019 04.
Article in English | MEDLINE | ID: mdl-30973801

ABSTRACT

Advances in technology make it possible to supplement in-person teaching activities with digital learning, use electronic records in patient care, and communicate through social media. This relatively new "digital learning environment" has changed how medical trainees learn, participate in patient care, are assessed, and provide feedback. Communication has changed with the use of digital health records, the evolution of interdisciplinary and interprofessional communication, and the emergence of social media. Learning has evolved with the proliferation of online tools such as apps, blogs, podcasts, and wikis, and the formation of virtual communities. Assessment of learners has progressed due to the increasing amounts of data being collected and analyzed. Digital technologies have also enhanced learning in resource-poor environments by making resources and expertise more accessible. While digital technology offers benefits to learners, the teachers, and health care systems, there are concerns regarding the ownership, privacy, safety, and management of patient and learner data. We highlight selected themes in the domains of digital communication, digital learning resources, and digital assessment and close by providing practical recommendations for the integration of digital technology into education, with the aim of maximizing its benefits while reducing risks.


Subject(s)
Communication , Education, Medical/organization & administration , Environment , Information Systems/organization & administration , Learning , Clinical Competence/standards , Computer Security/standards , Education, Medical/standards , Health Information Management/organization & administration , Humans , Internet , Social Environment , Social Media/organization & administration
18.
Ann Emerg Med ; 77(5): 555-556, 2021 05.
Article in English | MEDLINE | ID: mdl-33902835
19.
CJEM ; 26(3): 179-187, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38374281

ABSTRACT

OBJECTIVE: Approximately five years ago, the Royal College emergency medicine (EM) programs in Canada implemented a competency-based paradigm and introduced Entrustable Professional Activities (EPAs) for the assessment of trainees. Many competency-based medical education (CBME) curricula assess entrustment through observations of EPAs. While EPAs are frequently assessed in clinical settings, simulation is also used. This study aimed to characterize the use of simulation for EPA assessment. METHODS: An interview guide was developed jointly by all study authors, following best practices for survey development. National interviews were conducted with program directors (PDs) or assistant program directors across all Royal College EM programs in Canada. Interviews were conducted over Microsoft Teams, recorded, and transcribed using the Microsoft Teams transcription service. Sample transcripts were analyzed for theme development, and themes were then reviewed by co-authors to ensure they were representative of the participants' views. RESULTS: A 64.7% response rate was achieved. Simulation has been widely adopted by EM training programs, and all interviewees supported its use for EPA assessment for many reasons, although PDs also acknowledged limitations and tensions. Thematic analysis revealed six major themes: widespread support for the use of simulation for EPA assessment; concerns that EPA assessment could become a "tick-box" exercise; logistical barriers limiting the use of simulation for EPA assessment; varied perceptions of the authenticity of simulation for EPA assessment; the potential for simulation-based EPA assessment to compromise learner psychological safety; and suggestions for optimizing the use of simulation for EPA assessment. CONCLUSIONS: Our findings offer insight for other programs and specialties into how simulation for EPA assessment can best be utilized. Programs should consider these findings when adopting simulation for EPA assessment.




Subject(s)
Emergency Medicine , Internship and Residency , Humans , Curriculum , Competency-Based Education , Clinical Competence , Emergency Medicine/education
20.
Simul Healthc ; 19(1S): S32-S40, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-38240616

ABSTRACT

Although just-in-time training (JIT) is increasingly used in simulation-based health professions education, its impact on learning, performance, and patient outcomes remains uncertain. The aim of this study was to determine whether JIT simulation training leads to improved learning and performance outcomes. We included randomized or nonrandomized interventional studies assessing the impact of JIT simulation training (training conducted in temporal or spatial proximity to performance) on learning outcomes among health professionals (trainees or practitioners). Of 4077 citations screened, 28 studies were eligible for inclusion. JIT simulation training has been evaluated for a variety of medical, resuscitation, and surgical procedures. Most JIT simulation training occurred immediately before procedures and lasted between 5 and 30 minutes. Despite the very low certainty of evidence, this systematic review suggests JIT simulation training can improve learning and performance outcomes, in particular time to complete skills. Data remain limited on patient outcomes and collateral educational effects.


Subject(s)
Health Personnel , Simulation Training , Humans , Health Personnel/education , Learning , Computer Simulation , Delivery of Health Care