Results 1 - 20 of 124
1.
Ann Surg ; 279(1): 180-186, 2024 01 01.
Article in English | MEDLINE | ID: mdl-37436889

ABSTRACT

OBJECTIVE: To determine the relationship between, and predictive utility of, milestone ratings and subsequent American Board of Surgery (ABS) vascular surgery in-training examination (VSITE), vascular qualifying examination (VQE), and vascular certifying examination (VCE) performance in a national cohort of vascular surgery trainees. BACKGROUND: Specialty board certification is an important indicator of physician competence. However, predicting future board certification examination performance during training continues to be challenging. METHODS: This is a national longitudinal cohort study examining relational and predictive associations between Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings and performance on VSITE, VQE, and VCE for all vascular surgery trainees from 2015 to 2021. Predictive associations between milestone ratings and VSITE were assessed using cross-classified random-effects regression. Cross-classified random-effects logistic regression was used to identify predictive associations between milestone ratings and VQE and VCE. RESULTS: Milestone ratings were obtained for all residents and fellows (n=1,118) from 164 programs during the study period (from July 2015 to June 2021), including 145,959 total trainee assessments. Medical knowledge (MK) and patient care (PC) milestone ratings were strongly predictive of VSITE performance across all postgraduate years (PGYs) of training, with MK ratings demonstrating a slightly stronger predictive association overall (MK coefficient 17.26 to 35.76, β = 0.15 to 0.23). All core competency ratings were predictive of VSITE performance in PGYs 4 and 5. PGY 5 MK was highly predictive of VQE performance [OR 4.73, (95% CI, 3.87-5.78), P <0.001]. PC subcompetencies were also highly predictive of VQE performance in the final year of training [OR 4.14, (95% CI, 3.17-5.41), P <0.001]. All other competencies were also significantly predictive of first-attempt VQE pass with ORs of 1.53 and higher. PGY 4 ICS ratings [OR 4.0, (95% CI, 3.06-5.21), P <0.001] emerged as the strongest predictor of VCE first-attempt pass. Again, all subcompetency ratings remained significant predictors of first-attempt pass on the VCE with ORs of 1.48 and higher. CONCLUSIONS: ACGME Milestone ratings are highly predictive of future VSITE performance, and first-attempt pass achievement on VQE and VCE in a national cohort of surgical trainees.
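The odds ratios reported here come from logistic models of first-attempt pass on milestone ratings. As an illustration only (not the authors' code or data), the sketch below fits a plain logistic regression of a hypothetical `vqe_pass` outcome on a hypothetical `mk_rating` column and converts the coefficient to an OR with a 95% CI; the study itself used cross-classified random-effects logistic regression, which this pooled sketch does not reproduce.

```python
# Illustrative sketch only: logistic regression of first-attempt pass on a
# milestone rating, reported as an odds ratio with a 95% CI. The data frame
# and column names (mk_rating, vqe_pass) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "mk_rating": [3.0, 3.5, 4.0, 4.5, 2.5, 4.0, 3.5, 4.5, 5.0, 3.0],
    "vqe_pass":  [0,   1,   1,   1,   0,   1,   0,   1,   1,   1],
})

X = sm.add_constant(df[["mk_rating"]])
fit = sm.Logit(df["vqe_pass"], X).fit(disp=0)

or_point = np.exp(fit.params["mk_rating"])        # odds ratio per 1-level increase
or_ci = np.exp(fit.conf_int().loc["mk_rating"])   # 95% CI on the OR scale
print(f"OR = {or_point:.2f}, 95% CI = ({or_ci.iloc[0]:.2f}, {or_ci.iloc[1]:.2f})")
```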


Subject(s)
Internship and Residency, Humans, United States, Longitudinal Studies, Educational Measurement, Clinical Competence, Graduate Medical Education, Accreditation
2.
Med Educ ; 58(1): 93-104, 2024 01.
Article in English | MEDLINE | ID: mdl-37455291

ABSTRACT

BACKGROUND: The conceptualisation of medical competence is central to its use in competency-based medical education. Calls for 'fixed standards' with 'flexible pathways', recommended in recent reports, require competence to be well defined. Making competence explicit and measurable has, however, been difficult, in part due to a tension between the need for standardisation and the acknowledgment that medical professionals must also be valued as unique individuals. To address these conflicting demands, a multilayered conceptualisation of competence is proposed, with implications for the definition of standards and approaches to assessment. THE MODEL: Three layers are elaborated. The first is a core layer of canonical knowledge and skill, 'that, which every professional should possess', independent of the context of practice. The second layer is context-dependent knowledge, skill, and attitude, visible through practice in health care. The third layer of personalised competence includes personal skills, interests, habits and convictions, integrated with one's personality. This layer, discussed with reference to Vygotsky's concept of Perezhivanie, cognitive load theory, self-determination theory and Maslow's 'self-actualisation', may be regarded as the art of medicine. We propose that fully matured professional competence requires all three layers, but that the assessment of each layer is different. IMPLICATIONS: The assessment of canonical knowledge and skills (Layer 1) can be approached with classical psychometric conditions, that is, similar tests, circumstances and criteria for all. Context-dependent medical competence (Layer 2) must be assessed differently, because conditions of assessment across candidates cannot be standardised. Here, multiple sources of information must be merged and intersubjective expert agreement should ground decisions about progression and level of clinical autonomy of trainees. Competence as the art of medicine (Layer 3) cannot be standardised and should not be assessed with the purpose of permission to practice. The pursuit of personal excellence at this level, however, can be recognised and rewarded.


Subject(s)
Medicine, Professional Competence, Humans, Attitude, Delivery of Health Care, Psychometrics, Clinical Competence
3.
Ann Surg ; 277(4): e971-e977, 2023 04 01.
Article in English | MEDLINE | ID: mdl-35129524

ABSTRACT

OBJECTIVE: This study aims to investigate at-risk scores of semiannual Accreditation Council for Graduate Medical Education (ACGME) Milestone ratings for vascular surgical trainees' final achievement of competency targets. SUMMARY BACKGROUND DATA: ACGME Milestones assessments have been collected since 2015 for Vascular Surgery. It is unclear whether milestone ratings throughout training predict achievement of recommended performance targets upon graduation. METHODS: National ACGME Milestones data were utilized for analyses. All trainees completing 2-year vascular surgery fellowships in June 2018 and 5-year integrated vascular surgery residencies in June 2019 were included. A generalized estimating equations model was used to obtain at-risk scores for each of the 31 subcompetencies by semiannual review periods, to estimate the probability of trainees achieving the recommended graduation target based on their previous ratings. RESULTS: A total of 122 vascular surgery fellows (VSFs) (95.3%) and 52 integrated vascular surgery residents (IVSRs) (100%) were included. VSFs and IVSRs did not achieve level 4.0 competency targets at a rate of 1.6% to 25.4% across subcompetencies, which was not significantly different between the 2 groups for any of the subcompetencies (P = 0.161-0.999). Trainees were found to be at greater risk of not achieving competency targets when lower milestone ratings were assigned, and at later time-points in training. At a milestone rating of 2.5, with 1 year remaining before graduation, the at-risk score for not achieving the target level 4.0 milestone ranged from 2.9% to 77.9% for VSFs and 33.3% to 75.0% for IVSRs. CONCLUSION: The ACGME Milestones provide early diagnostic and predictive information for vascular surgery trainees' achievement of competence at completion of training.
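As a rough illustration of how such at-risk scores can be produced (a sketch under assumptions, not the study's actual model or data): a generalized estimating equations logistic model clustered by program can return the predicted probability of not reaching the level 4.0 target given an earlier milestone rating. The columns `rating`, `missed_target`, and `program_id` below are hypothetical.

```python
# Illustrative sketch only: a GEE logistic model estimating the probability of
# NOT reaching the level 4.0 graduation target from an earlier milestone
# rating, clustered by program. Data and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rating":        [2.0, 2.5, 3.0, 3.5, 4.0, 2.5, 3.0, 3.5, 4.0, 4.5],
    "missed_target": [1,   1,   0,   0,   0,   1,   1,   0,   0,   0],
    "program_id":    [1,   1,   1,   2,   2,   2,   3,   3,   3,   3],
})

model = smf.gee("missed_target ~ rating", groups="program_id",
                data=df, family=sm.families.Binomial())
fit = model.fit()

# Predicted at-risk score for a trainee rated 2.5 one year before graduation.
at_risk = fit.predict(pd.DataFrame({"rating": [2.5]}))[0]
print(f"Estimated probability of missing the target: {at_risk:.2f}")
```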


Subject(s)
Internship and Residency, Humans, Educational Measurement, Clinical Competence, Graduate Medical Education, Accreditation, Vascular Surgical Procedures
4.
J Vasc Surg ; 76(5): 1388-1397, 2022 11.
Article in English | MEDLINE | ID: mdl-35798280

ABSTRACT

BACKGROUND: The quality and effectiveness of vascular surgery education should be evaluated based on patient care outcomes. To investigate predictive associations between trainee performance and subsequent patient outcomes, a critical first step is to determine the conceptual alignment of educational competencies with clinical outcomes in practice. We sought to generate expert consensus on the conceptual alignment of the Accreditation Council for Graduate Medical Education (ACGME) Vascular Surgery subcompetencies with patient care outcomes across different Vascular Quality Initiative (VQI) registries. METHODS: A national panel of vascular surgeons with expertise in both clinical care and education was recruited to participate in a modified Delphi expert consensus building process to map ACGME Vascular Surgery subcompetencies (educational markers of resident performance) to VQI clinical modules (patient outcomes). A master list of items for rating was created, including the 31 ACGME Vascular Surgery subcompetencies and 8 VQI clinical registries (endovascular abdominal aortic aneurysm repair, open abdominal aortic aneurysm, thoracic endovascular aortic repair, carotid endarterectomy, carotid artery stent, infrainguinal, suprainguinal, and peripheral vascular intervention). These items were entered into an iterative Delphi process. Positive consensus was reached when 75% or more of the participants ranked an item as mandatory. Intraclass correlations (ICCs) were used to evaluate consistency between experts for each Delphi round. RESULTS: A total of 13 experts who contributed to the development of the Vascular Surgery Milestones participated; 12 experts (92%) participated in both rounds of the Delphi process. Two rounds of Delphi were conducted, as suggested by excellent expert agreement (round 1, ICC = 0.79 [95% confidence interval, 0.74-0.84]; round 2, ICC = 0.97 [95% confidence interval, 0.96-0.98]). Using the predetermined consensus cutoff threshold, the Delphi process reduced the number of subcompetencies mapped to patient care outcomes from 31 to a range of 9 to 15 across the 8 VQI clinical registries. Practice-based learning and improvement, and professionalism subcompetencies were identified as less relevant to patient outcome variables captured by the VQI registries after the final round, and the only systems-based practice subcompetency identified as relevant was radiation safety, in two of the endovascular registries. CONCLUSIONS: A national panel of vascular surgeon experts reported a high degree of agreement on the relevance of ACGME subcompetencies to patient care outcomes as captured in the VQI clinical registries. Systems-based practice, practice-based learning and improvement, and professionalism competencies were identified as less relevant to patient outcomes after specific surgical procedures.
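The consensus rule described above (an item is retained when at least 75% of experts rank it as mandatory) is simple to operationalize. The sketch below is illustrative only, with hypothetical rater data, and does not reproduce the study's Delphi rounds or its ICC analysis.

```python
# Illustrative sketch only: tallying one modified Delphi round. An item reaches
# positive consensus when >= 75% of experts rate it "mandatory". Rater data
# are hypothetical.
import pandas as pd

ratings = pd.DataFrame({
    "item":      ["MK1", "MK1", "MK1", "PC3", "PC3", "PC3", "SBP2", "SBP2", "SBP2"],
    "expert":    [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "mandatory": [1, 1, 1, 1, 1, 0, 0, 0, 1],   # 1 = rated mandatory
})

CONSENSUS_THRESHOLD = 0.75
pct_mandatory = ratings.groupby("item")["mandatory"].mean()
consensus = pct_mandatory >= CONSENSUS_THRESHOLD

print(pct_mandatory.round(2))
print("Items reaching positive consensus:", list(consensus[consensus].index))
```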


Subject(s)
Abdominal Aortic Aneurysm, Humans, Abdominal Aortic Aneurysm/surgery, Consensus, Clinical Competence, Graduate Medical Education, Vascular Surgical Procedures/education, Accreditation
5.
Med Teach ; 44(8): 886-892, 2022 08.
Article in English | MEDLINE | ID: mdl-36083123

ABSTRACT

PURPOSE: Organizational readiness is critical for successful implementation of an innovation. We evaluated program readiness to implement Competence by Design (CBD), a model of Competency-Based Medical Education (CBME), among Canadian postgraduate training programs. METHODS: A survey of program directors was distributed 1 month prior to CBD implementation in 2019. Questions were informed by the R = MC² framework of organizational readiness and addressed: program motivation, general capacity for change, and innovation-specific capacity. An overall readiness score was calculated. An ANOVA was conducted to compare overall readiness between disciplines. RESULTS: The survey response rate was 42% (n = 79). The mean overall readiness score was 74% (range, 30-98%). There was no difference in scores between disciplines. The majority of respondents agreed that successful implementation of CBD was a priority (74%), and that their leadership (94%) and faculty and residents (87%) were supportive of change. Fewer perceived that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). Curriculum mapping, competence committees and programmatic assessment activities were completed by >90% of programs, while <50% had engaged off-service disciplines. CONCLUSION: Our study highlights important areas where programs excelled in their preparation for CBD, as well as common challenges that serve as targets for future intervention to improve program readiness for CBD implementation.


Subject(s)
Competency-Based Education, Medical Education, Canada, Curriculum, Humans, Leadership
6.
Am J Obstet Gynecol ; 224(3): 308.e1-308.e25, 2021 03.
Article in English | MEDLINE | ID: mdl-33098812

ABSTRACT

BACKGROUND: Since the launch of the Outcome Project in 2001, the graduate medical education community has been working to implement the 6 general competencies. In 2014, all Obstetrics and Gynecology residency programs implemented specialty-specific milestones to advance competency-based assessment. Each clinical competency committee of the Obstetrics and Gynecology program assesses all residents twice a year on the milestones. These data are reported to the Accreditation Council for Graduate Medical Education as part of a continuous quality improvement effort in graduate medical education. OBJECTIVE: This study aimed to evaluate the correlation between the Accreditation Council for Graduate Medical Education Obstetrics and Gynecology Milestones and residency program graduates' performance on the American Board of Obstetrics and Gynecology qualifying (written) examination. STUDY DESIGN: We conducted a validity study of all graduating (postgraduate year 4) Obstetrics and Gynecology residents in 2017 within Accreditation Council for Graduate Medical Education-accredited United States training programs (1260 residents from 242 programs). This cohort of residents began receiving milestone assessments during their postgraduate year 2 in 2014, the first year in which milestones were implemented for all Accreditation Council for Graduate Medical Education-accredited Obstetrics and Gynecology programs. This cohort completed their sixth and final milestone assessment at graduation in June 2017, for a total of 6 periods of milestone assessments. Data regarding each resident's milestone ratings in each of the 28 Accreditation Council for Graduate Medical Education subcompetencies for Obstetrics and Gynecology were assessed for their association with candidates' American Board of Obstetrics and Gynecology qualifying examination scores using a generalized estimating equation regression model. RESULTS: Data were available and analyzed from 1184 residents from 240 programs, representing 94% of the total academic year 2017 graduates of Obstetrics and Gynecology residency training programs. There was a substantial association between most milestone ratings at the 6 assessment points and candidates' performance on the American Board of Obstetrics and Gynecology qualifying examination. The strongest associations with the American Board of Obstetrics and Gynecology qualifying examination were within all 7 of the subcompetencies of Medical Knowledge (range of slope correlation coefficients at final milestone ratings, 3.84-5.17; slope coefficients can be interpreted as the gain in qualifying examination points per unit increase in milestone level). At the final milestone assessment, more modest associations with the American Board of Obstetrics and Gynecology qualifying examination scores were also seen with 9 of the 11 Patient Care and Procedural Skills subcompetencies, both Practice-Based Learning and Improvement subcompetencies, both Systems-Based Practice subcompetencies, and 2 of the 3 Professionalism subcompetencies. Only 1 of the 3 Interpersonal and Communication Skills subcompetencies was associated with American Board of Obstetrics and Gynecology qualifying examination scores. CONCLUSION: The associations between the qualifying examination scores and milestone ratings for the 2017 graduating cohort of Obstetrics and Gynecology residents followed a logical pattern, with the strongest associations seen in Medical Knowledge, and lower to no associations in subcompetencies not as effectively assessed on multiple-choice examinations. Although some positive associations were noted for non-Medical Knowledge milestones, these associations could be caused by correlational rating errors, with further study needed to better understand these patterns.


Asunto(s)
Acreditación , Educación de Postgrado en Medicina/normas , Ginecología/educación , Obstetricia/educación , Consejos de Especialidades , Estudios de Cohortes , Correlación de Datos , Evaluación Educacional , Estados Unidos
7.
Ann Vasc Surg ; 76: 463-471, 2021 Oct.
Article in English | MEDLINE | ID: mdl-33905852

ABSTRACT

BACKGROUND: Surgeons provide patient care in complex health care systems and must be able to participate in improving both personal performance and the performance of the system. The Accreditation Council for Graduate Medical Education (ACGME) Vascular Surgery Milestones are utilized to assess vascular surgery fellows' (VSF) achievement of graduation targets in the competencies of Systems Based Practice (SBP) and Practice Based Learning and Improvement (PBLI). We investigate the predictive value of semiannual milestone ratings for final achievement within these competencies at the time of graduation. METHODS: National ACGME milestones data were utilized for analysis. All trainees entering the 2-year vascular surgery fellowship programs in July 2016 were included in the analysis (n = 122). Predictive probability values (PPVs) were obtained for each of the SBP and PBLI sub-competencies by biannual review periods, to estimate the probability of VSFs not reaching the recommended graduation target based on their previous milestone ratings. RESULTS: The rate of nonachievement of the graduation target level 4.0 on the SBP and PBLI sub-competencies at the time of graduation for VSFs was 13.1-25.4%. At the first time point of assessment, 6 months into the fellowship program, the PPV of the SBP and PBLI milestones for nonachievement of level 4.0 upon graduation ranged from 16.3-60.2%. Six months prior to graduation, the PPVs across the 6 sub-competencies ranged from 14.6-82.9%. CONCLUSIONS: A significant percentage of VSFs do not achieve the ACGME Vascular Surgery Milestone targets for graduation in the competencies of SBP and PBLI, suggesting a need to improve curricula and assessment strategies in these domains across vascular surgery fellowship programs. Reported milestone levels across all time points are predictive of ultimate achievement upon graduation and should be utilized to provide targeted feedback and individualized learning plans to ensure graduates are prepared to engage in personal and health care system improvement once in unsupervised practice.


Subject(s)
Clinical Competence, Graduate Medical Education, Educational Measurement, Educational Status, Internship and Residency, Learning, Surgeons/education, Systems Analysis, Competency-Based Education, Humans, Systems Theory
8.
Med Teach ; 43(7): 801-809, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34033512

ABSTRACT

Medical education is situated within health care and educational organizations that frequently lag in their use of data to learn, develop, and improve performance. How might we leverage competency-based medical education (CBME) assessment data at the individual, program, and system levels, with the goal of redefining CBME from an initiative that supports the development of physicians to one that also fosters the development of the faculty, administrators, and programs within our organizations? In this paper we review the Deliberately Developmental Organization (DDO) framework proposed by Robert Kegan and Lisa Lahey, a theoretical framework that explains how organizations can foster the development of their people. We then describe the DDO's conceptual alignment with CBME and outline how CBME assessment data could be used to spur the transformation of health care and educational organizations into digitally integrated DDOs. A DDO-oriented use of CBME assessment data will require intentional investment into both the digitalization of assessment data and the development of the people within our organizations. By reframing CBME in this light, we hope that educational and health care leaders will see their investments in CBME as an opportunity to spur the evolution of a developmental culture.


Subject(s)
Medical Education, Physicians, Competency-Based Education, Humans, Learning
9.
Med Teach ; 43(7): 780-787, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34020576

ABSTRACT

Health care revolves around trust. Patients are often in a position that gives them no other choice than to trust the people taking care of them. Educational programs thus have the responsibility to develop physicians who can be trusted to deliver safe and effective care, ultimately making a final decision to entrust trainees to graduate to unsupervised practice. Such entrustment decisions deserve to be scrutinized for their validity. This end-of-training entrustment decision is arguably the most important one, although earlier entrustment decisions, for smaller units of professional practice, should also be scrutinized for their validity. Validity of entrustment decisions implies a defensible argument that can be analyzed in components that together support the decision. According to Kane, building a validity argument is a process designed to support inferences of scoring, generalization across observations, extrapolation to new instances, and implications of the decision. A lack of validity can be caused by inadequate evidence in terms of, according to Messick, content, response process, internal structure (coherence) and relationship to other variables, and in misinterpreted consequences. These two leading frameworks (Kane and Messick) in educational and psychological testing can be well applied to summative entrustment decision-making. The authors elaborate the types of questions that need to be answered to arrive at defensible, well-argued summative decisions regarding performance to provide a grounding for high-quality safe patient care.


Subject(s)
Internship and Residency, Physicians, Clinical Competence, Competency-Based Education, Decision Making, Humans, Trust
10.
Med Teach ; 43(7): 737-744, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33989100

ABSTRACT

With the rapid uptake of entrustable professional activities and entrustment decision-making as an approach in undergraduate and graduate education in medicine and other health professions, there is a risk of confusion in the use of new terminologies. The authors seek to clarify the use of many words related to the concept of entrustment, based on existing literature, with the aim to establish logical consistency in their use. The list of proposed definitions includes independence, autonomy, supervision, unsupervised practice, oversight, general and task-specific trustworthiness, trust, entrust(ment), entrustable professional activity, entrustment decision, entrustability, entrustment-supervision scale, retrospective and prospective entrustment-supervision scales, and entrustment-based discussion. The authors conclude that a shared understanding of the language around entrustment is critical to strengthen bridges among stages of training and practice, such as undergraduate medical education, graduate medical education, and continuing professional development. Shared language and understanding provide the foundation for consistency in interpretation and implementation across the educational continuum.


Subject(s)
Undergraduate Medical Education, Internship and Residency, Clinical Competence, Competency-Based Education, Graduate Medical Education, Prospective Studies, Retrospective Studies
11.
Med Teach ; 42(12): 1369-1373, 2020 12.
Article in English | MEDLINE | ID: mdl-32847447

ABSTRACT

In response to the numerous challenges resident trainees currently face in their ability to competently acquire the requisite skills, knowledge and attitudes upon graduation, medical educators have looked to a competency-based medical education (CBME) approach as a possible solution. As CBME has already been implemented in many jurisdictions around the world, certain challenges in implementation have been experienced. One important challenge identified relates to how regulatory bodies can either assist or unintentionally hinder implementation. By examining the varied experiences from Canada, the USA and the Netherlands in implementing CBME, this paper identifies how regulatory bodies can support and advance worldwide efforts of furthering its implementation. If regulatory bodies restructure accreditation and regulatory criteria to align with CBME principles, work together in a coordinated fashion to ensure alignment of vital regulatory measures throughout the training and practice continuum of a physician, and allow for (if not incentivize) individuals and programs to be innovative in adapting CBME to meet their local environments, it is likely that the worldwide implementation of CBME will occur successfully.


Subject(s)
Medical Education, Physicians, Canada, Competency-Based Education, Humans, Netherlands
12.
Int J Qual Health Care ; 30(1): 16-22, 2018 Feb 01.
Article in English | MEDLINE | ID: mdl-29194491

ABSTRACT

IMPORTANCE: Emergency resuscitation of critically ill patients can challenge team communication and situational awareness. Tools facilitating team performance may enhance patient safety. OBJECTIVES: To determine resuscitation team members' perceptions of the Situational Awareness Display's utility. DESIGN: We conducted focus groups with healthcare providers during Situational Awareness Display development. After simulations assessing the display, we conducted debriefs with participants. SETTING: Dual-site tertiary care level 1 trauma centre in Ottawa, Canada. PARTICIPANTS: We recruited physicians, nurses and respiratory therapists by email. INTERVENTION: Situational Awareness Display, a visual cognitive aid that provides key clinical information to enhance resuscitation team communication and situational awareness. MAIN OUTCOMES AND MEASURES: Themes emerging from focus groups and simulation debriefs. Three reviewers independently coded and analysed transcripts using qualitative content analysis. RESULTS: We recruited a total of 33 participants in two focus groups (n = 20) and six simulation debriefs with three 4-5 member teams (n = 13). The majority of participants (10/13) strongly endorsed the Situational Awareness Display's utility in simulation (very or extremely useful). Focus groups and debrief themes included improved perception of patient data, comprehension of context and ability to project to future decisions. Participants described potentially positive and negative impacts on patient safety and positive impacts on provider performance and team communication. Participants expressed a need for easy data entry incorporated into clinical workflow and training on how to use the display. CONCLUSION: Emergency resuscitation team participants felt the Situational Awareness Display has potential to improve provider performance, team communication and situational awareness, ultimately enhancing quality of care.


Subject(s)
Awareness, Hospital Emergency Service/organization & administration, Resuscitation, Communication, Female, Focus Groups, Health Personnel, Humans, Male, Ontario, Patient Care Team, Patient Safety, Qualitative Research, Trauma Centers/organization & administration
13.
Med Educ ; 51(1): 61-71, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27981660

ABSTRACT

CONTEXT: Complete reporting of research is essential to enable consumers to accurately appraise, interpret and apply findings. Quality appraisal checklists are giving way to tools that judge the risk for bias. OBJECTIVES: We sought to determine the prevalence of these complementary aspects of research reports (completeness of reporting and perceived risk for bias) of randomised studies in health professions education. METHODS: We searched bibliographic databases for randomised studies of health professions education. We appraised two cohorts representing different time periods (2008-2010 and 2014, respectively) and worked in duplicate to apply the CONSORT guidelines and Cochrane Risk of Bias tool. We explored differences between time periods using independent-samples t-tests or the chi-squared test, as appropriate. RESULTS: We systematically identified 180 randomised studies (2008-2010, n = 150; 2014, n = 30). Frequencies of reporting of CONSORT elements within full-text reports were highly variable and most elements were reported in fewer than 50% of studies. We found a statistically significant difference in the CONSORT reporting index (maximum score: 500) between the 2008-2010 (mean ± standard deviation [SD]: 242.7 ± 55.6) and 2014 (mean ± SD: 311.6 ± 53.2) cohorts (p < 0.001). High or unclear risk for bias was most common for allocation concealment (157, 87%) and blinding of participants (147, 82%), personnel (152, 84%) and outcome assessors (112, 62%). Most risk for bias elements were judged to be unclear (range: 51-84%). Risk for bias elements significantly improved over time for blinding of participants (p = 0.007), incomplete data (p < 0.001) and the presence of other sources of bias (p < 0.001). CONCLUSIONS: Reports of randomised studies in health professions education frequently omit elements recommended by the CONSORT statement. Most reports were assessed as having a high or unclear risk for bias. Greater attention to how studies are reported at study outset and in manuscript preparation could improve levels of complete transparent reporting.
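The between-cohort comparisons described here (an independent-samples t-test for the CONSORT reporting index and chi-squared tests for risk-of-bias categories) could be run along the following lines. This is a sketch only; the arrays and counts are hypothetical stand-ins, not the study's data.

```python
# Illustrative sketch only: comparing a reporting-quality index between two
# publication cohorts with an independent-samples t-test, and comparing a
# dichotomized risk-of-bias judgment with a chi-squared test.
import numpy as np
from scipy import stats

# CONSORT reporting index (maximum 500) for two hypothetical cohorts.
index_2008_2010 = np.array([210, 250, 235, 260, 240, 228, 255, 247])
index_2014 = np.array([290, 310, 305, 330, 298, 315])

t_stat, p_val = stats.ttest_ind(index_2008_2010, index_2014)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}")

# Hypothetical counts of studies judged low vs high/unclear risk for blinding, by cohort.
contingency = np.array([[20, 130],   # 2008-2010: low, high/unclear
                        [10, 20]])   # 2014:      low, high/unclear
chi2, p_chi, dof, expected = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p_chi:.4f}")
```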


Subject(s)
Bias, Health Occupations/education, Randomized Controlled Trials as Topic, Bibliographic Databases/statistics & numerical data, Humans, Outcome Assessment (Health Care), Surveys and Questionnaires/standards
14.
Med Educ ; 51(7): 755-767, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28418162

ABSTRACT

CONTEXT: Although health professions education scholarship units (HPESUs) share a commitment to the production and dissemination of rigorous educational practices and research, they are situated in many different contexts and have a wide range of structures and functions. OBJECTIVES: In this study, the authors explore the institutional logics common across HPESUs, and how these logics influence the organisation and activities of HPESUs. METHODS: The authors analysed interviews with HPESU leaders in Canada (n = 12), Australia (n = 21), New Zealand (n = 3) and the USA (n = 11). Using an iterative process, they engaged in inductive and deductive analyses to identify institutional logics across all participating HPESUs. They explored the contextual factors that influence how these institutional logics impact each HPESU's structure and function. RESULTS: Participants identified three institutional logics influencing the organisational structure and functions of an HPESU: (i) the logic of financial accountability; (ii) the logic of a cohesive education continuum, and (iii) the logic of academic research, service and teaching. Although most HPESUs embodied all three logics, the power of the logics varied among units. The relative power of each logic influenced leaders' decisions about how members of the unit allocate their time, and what kinds of scholarly contribution and product are valued by the HPESU. CONCLUSIONS: Identifying the configuration of these three logics within and across HPESUs provides insights into the reasons why individual units are structured and function in particular ways. Having a common language in which to discuss these logics can enhance transparency, facilitate evaluation, and help leaders select appropriate indicators of HPESU success.


Subject(s)
Fellowships and Scholarships/economics, Financial Management, Health Occupations, Leadership, Australia, Canada, Financial Management/economics, Health Occupations/economics, Humans, Logic, New Zealand
15.
Ann Intern Med ; 165(5): 356-62, 2016 09 06.
Article in English | MEDLINE | ID: mdl-27159244

ABSTRACT

BACKGROUND: High-quality assessment of resident performance is needed to guide individual residents' development and ensure their preparedness to provide patient care. To facilitate this aim, reporting milestones are now required across all internal medicine (IM) residency programs. OBJECTIVE: To describe initial milestone ratings for the population of IM residents by IM residency programs. DESIGN: Cross-sectional study. SETTING: IM residency programs. PARTICIPANTS: All IM residents whose residency program directors submitted milestone data at the end of the 2013-2014 academic year. MEASUREMENTS: Ratings addressed 6 competencies and 22 subcompetencies. A rating of "not assessable" indicated insufficient information to evaluate the given subcompetency. Descriptive statistics were calculated to describe ratings across competencies and training years. RESULTS: Data were available for all 21 774 U.S. IM residents from all 383 programs. Overall, 2889 residents (1621 in postgraduate year 1 [PGY-1], 902 in PGY-2, and 366 in PGY-3) had at least 1 subcompetency rated as not assessable. Summaries of average ratings by competency and training year showed higher ratings for PGY-3 residents in all competencies. Overall ratings for each of the 6 individual competencies showed that fewer than 1% of third-year residents were rated as "unsatisfactory" or "conditional on improvement." However, when subcompetency milestone ratings were used, 861 residents (12.8%) who successfully completed training had at least 1 competency with all corresponding subcompetencies graded below the threshold of "readiness for unsupervised practice." LIMITATION: Data were derived from a point in time in the first reporting period in which milestones were used. CONCLUSION: The initial milestone-based evaluations of IM residents nationally suggest that documenting developmental progression of competency is possible over training years. Subcompetencies may identify areas in which residents might benefit from additional feedback and experience. Future work is needed to explore how milestones are used to support residents' development and enhance residency curricula. PRIMARY FUNDING SOURCE: None.
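The subcompetency screen described in the results (flagging residents for whom every subcompetency under some competency falls below "readiness for unsupervised practice") can be expressed as a small data check. Everything below — the rating frame, column names, and the 3.5 cutoff — is a hypothetical assumption for illustration, not the ACGME data or threshold definition.

```python
# Illustrative sketch only: flag residents who have at least one competency in
# which ALL corresponding subcompetency ratings fall below a readiness threshold.
import pandas as pd

THRESHOLD = 3.5  # assumed "ready for unsupervised practice" cutoff

ratings = pd.DataFrame({
    "resident":      ["A", "A", "A", "A", "B", "B", "B", "B"],
    "competency":    ["MK", "MK", "PC", "PC", "MK", "MK", "PC", "PC"],
    "subcompetency": ["MK1", "MK2", "PC1", "PC2", "MK1", "MK2", "PC1", "PC2"],
    "rating":        [4.0, 4.5, 3.0, 3.0, 4.0, 4.5, 4.0, 3.0],
})

# Highest subcompetency rating within each resident/competency pair.
max_by_comp = ratings.groupby(["resident", "competency"])["rating"].max()

# A resident is flagged if any competency's maximum is still below threshold.
flagged = (max_by_comp < THRESHOLD).groupby(level="resident").any()
print(flagged[flagged].index.tolist())  # -> ['A'] in this toy example
```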


Asunto(s)
Competencia Clínica , Educación Basada en Competencias , Evaluación Educacional , Medicina Interna/educación , Internado y Residencia/normas , Estudios Transversales , Humanos , Estados Unidos
16.
BMC Med Educ ; 17(1): 210, 2017 Nov 14.
Article in English | MEDLINE | ID: mdl-29137674

ABSTRACT

BACKGROUND: Parents can assess residents' non-technical skills (NTS) in pediatric emergency departments (EDs). There are no assessment tools, with validity evidence, for parental use in pediatric EDs. The purpose of this study was to develop the Parents' Assessment of Residents Enacting Non-Technical Skills (PARENTS) educational assessment tool and collect three sources of validity evidence (i.e., content, response process, internal structure) for it. METHODS: We established content evidence for the PARENTS through interviews with physician-educators and residents, focus groups with parents, a literature review, and a modified nominal group technique with experts. We collected response process evidence through cognitive interviews with parents. To examine the internal structure evidence, we administered the PARENTS and performed exploratory factor analysis. RESULTS: Initially, a 20-item PARENTS was developed. Cognitive interviews led to the removal of one closed-ended item, the addition of resident photographs, and wording/formatting changes. Thirty-seven residents and 434 parents participated in the administration of the resulting 19-item PARENTS. Following factor analysis, a one-factor model prevailed. CONCLUSIONS: The study presents initial validity evidence for the PARENTS. It also highlights strategies for potentially: (a) involving parents in the assessment of residents, (b) improving the assessment of NTS in pediatric EDs, and
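A one-factor exploratory structure like the one reported can be examined with a small factor-analysis sketch. Everything below (the simulated item responses, the use of scikit-learn's FactorAnalysis, the item count) is an assumption for illustration and not the instrument's real data or the authors' analysis pipeline.

```python
# Illustrative sketch only: fitting a one-factor model to item responses and
# inspecting the loadings, as one step of an exploratory factor analysis.
# The simulated data stand in for real questionnaire responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulate 200 respondents answering 19 items driven by a single latent trait.
n_respondents, n_items = 200, 19
latent = rng.normal(size=(n_respondents, 1))
loadings_true = rng.uniform(0.5, 0.9, size=(1, n_items))
responses = latent @ loadings_true + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=1)
fa.fit(responses)

# Estimated loadings of each item on the single factor.
print(np.round(fa.components_.ravel(), 2))
```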


Asunto(s)
Competencia Clínica/normas , Servicio de Urgencia en Hospital/normas , Padres , Pediatría , Médicos/normas , Acceso a la Información , Adolescente , Niño , Preescolar , Técnicas de Apoyo para la Decisión , Evaluación Educacional , Grupos Focales , Humanos , Lactante , Recién Nacido , Internado y Residencia , Pediatría/educación , Pediatría/normas , Relaciones Profesional-Familia , Desarrollo de Programa
17.
Adv Health Sci Educ Theory Pract ; 21(3): 719-29, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26303112

ABSTRACT

Psychometrics has recently undergone extensive criticism within the medical education literature. The use of quantitative measurement using psychometric instruments such as response scales is thought to emphasize a narrow range of relevant learner skills and competencies. Recent reviews and commentaries suggest that a paradigm shift might be presently underway. We argue for caution, in that the psychometrics approach and the quantitative account of competencies that it reflects is based on a rich discussion regarding measurement and scaling that led to the establishment of this paradigm. Rather than reflecting a homogeneous discipline focused on core competencies devoid of consideration of context, the psychometric community has a history of discourse and debate within the field, with an acknowledgement that the techniques and instruments developed within psychometrics are heuristics that must be used pragmatically.


Subject(s)
Educational Measurement/history, Psychometrics/history, Medical Education/history, Medical Education/standards, Educational Measurement/methods, 20th Century History, 21st Century History, Humans, Psychometrics/methods
18.
J Ultrasound Med ; 35(7): 1457-63, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27246661

ABSTRACT

OBJECTIVES: Increased use of point-of-care ultrasound (US) requires the development of assessment tools that measure the competency of learners. In this study, we developed and tested a tool to assess the quality of point-of-care cardiac US studies performed by novices. METHODS: In phase 1, the Rapid Assessment of Competency in Echocardiography (RACE) scale was developed on the basis of structured interviews with subject matter experts; the tool was then piloted on a small series of US studies in phase 2. In phase 3, the tool was applied to a sample of 154 point-of-care US studies performed by 12 learners; each study was independently rated by 2 experts, with quantitative analysis subsequently performed. RESULTS: Evidence of the content validity of the RACE scale was supported by a consensus exercise, wherein experts agreed on the assessment dimensions and specific items that made up the RACE scale. The tool showed good inter-rater reliability. An analysis of inter-item correlations provided support for the internal structure of the scale, and the tool was able to discriminate between learners early in their point-of-care US learning and those who were more advanced in their training. CONCLUSIONS: The RACE scale provides a straightforward means to assess learner performance with minimal requirements for evaluator training. Our results support the conclusion that the tool is an effective means of making valid judgments regarding competency in point-of-care cardiac US.


Subject(s)
Clinical Competence/statistics & numerical data, Echocardiography/methods, Educational Measurement/methods, Educational Measurement/standards, Point-of-Care Systems, Ultrasonics/education, Educational Measurement/statistics & numerical data, Humans, Reproducibility of Results
19.
Med Teach ; 38(11): 1118-1124, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27111641

ABSTRACT

BACKGROUND: Residents must strive for excellence in their nontechnical skills (NTS). However, NTS have not traditionally been well-assessed in pediatric emergency departments (EDs). One underutilized assessment strategy is to have parents assess the residents caring for their children. Prior to involving parents in resident assessment, it is essential to identify which NTS parents in pediatric EDs can assess. AIM: To explore which resident NTS parents in pediatric EDs can assess. METHODS: An exploratory qualitative study design was used. It included interviews with faculty members involved in the supervision and assessment of residents in a pediatric ED and residents who had experience working in a pediatric ED, as well as focus groups with parents who had visited a pediatric ED at least twice in the past year. RESULTS: Participants in this study suggested that parents, if provided with the opportunity, can assess residents' communication skills, comfort in a pediatric setting, adaptability, and collaboration. CONCLUSIONS: This study demystifies how parents can become involved in the assessment of residents' NTS. The findings will inform the development of assessment strategies and could be used to develop assessment instruments that enable parents to become actively involved in the assessment of residents in pediatric EDs.


Subject(s)
Communication, Educational Measurement/methods, Emergency Medicine/education, Internship and Residency/methods, Parents, Pediatrics/education, Cooperative Behavior, Hospital Emergency Service/organization & administration, Female, Focus Groups, Humans, Leadership, Male, Physician's Role, Professional-Family Relations, Qualitative Research
20.
JAMA ; 316(21): 2253-2262, 2016 Dec 06.
Article in English | MEDLINE | ID: mdl-27923089

ABSTRACT

Importance: US internal medicine residency programs are now required to rate residents using milestones. Evidence of validity of milestone ratings is needed. Objective: To compare ratings of internal medicine residents using the pre-2015 resident annual evaluation summary (RAES), a nondevelopmental rating scale, with developmental milestone ratings. Design, Setting, and Participants: Cross-sectional study of US internal medicine residency programs in the 2013-2014 academic year, including 21 284 internal medicine residents (7048 postgraduate-year 1 [PGY-1], 7233 PGY-2, and 7003 PGY-3). Exposures: Program director ratings on the RAES and milestone ratings. Main Outcomes and Measures: Correlations of RAES and milestone ratings by training year; correlations of medical knowledge ratings with American Board of Internal Medicine (ABIM) certification examination scores; rating of unprofessional behavior using the 2 systems. Results: Corresponding RAES ratings and milestone ratings showed progressively higher correlations across training years, ranging among competencies from 0.31 (95% CI, 0.29 to 0.33) to 0.35 (95% CI, 0.33 to 0.37) for PGY-1 residents to 0.43 (95% CI, 0.41 to 0.45) to 0.52 (95% CI, 0.50 to 0.54) for PGY-3 residents (all P values <.05). Linear regression showed ratings differed more between PGY-1 and PGY-3 years using milestone ratings than the RAES (all P values <.001). Of the 6260 residents who attempted the certification examination, the 618 who failed had lower ratings using both systems for medical knowledge than did those who passed (RAES difference, -0.9; 95% CI, -1.0 to -0.8; P < .001; milestone medical knowledge 1 difference, -0.3; 95% CI, -0.3 to -0.3; P < .001; and medical knowledge 2 difference, -0.2; 95% CI, -0.3 to -0.2; P < .001). Of the 26 PGY-3 residents with milestone ratings indicating deficiencies on either of the 2 medical knowledge subcompetencies, 12 failed the certification examination. Correlation of RAES ratings for professionalism with residents' lowest professionalism milestone ratings was 0.44 (95% CI, 0.43 to 0.45; P < .001). Conclusions and Relevance: Among US internal medicine residents in the 2013-2014 academic year, milestone-based ratings correlated with RAES ratings but with a greater difference across training years. Both rating systems for medical knowledge correlated with ABIM certification examination scores. Milestone ratings may better detect problems with professionalism. These preliminary findings may inform establishment of the validity of milestone-based assessment.
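The year-by-year correlations reported above are given with 95% confidence intervals. A standard way to obtain such an interval for a Pearson correlation is the Fisher z transformation, sketched below with hypothetical paired ratings (not the study's data).

```python
# Illustrative sketch only: Pearson correlation between two rating systems with
# a 95% CI via the Fisher z transformation. The paired ratings are hypothetical.
import numpy as np
from scipy import stats

raes = np.array([5.1, 6.0, 6.5, 7.2, 7.8, 8.0, 8.4, 8.9, 6.8, 7.5])
milestone = np.array([2.5, 3.0, 3.5, 3.5, 4.0, 4.0, 4.5, 4.5, 3.0, 4.0])

r, p = stats.pearsonr(raes, milestone)

n = len(raes)
z = np.arctanh(r)                 # Fisher z transform
se = 1.0 / np.sqrt(n - 3)
lo, hi = np.tanh([z - 1.96 * se, z + 1.96 * se])
print(f"r = {r:.2f} (95% CI {lo:.2f} to {hi:.2f}), p = {p:.3f}")
```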


Subject(s)
Certification/standards, Clinical Competence/statistics & numerical data, Internal Medicine/education, Internship and Residency/statistics & numerical data, Adult, Educational Measurement, Female, Humans, Male, Professional Misconduct, Specialty Boards, United States