Rating the Quality of Entrustable Professional Activities: Content Validation and Associations with the Clinical Context.
J Gen Intern Med
; 31(5): 518-23, 2016 May.
Article
English
| MEDLINE
| ID: mdl-26902239
ABSTRACT
BACKGROUND:
Entrustable professional activities (EPAs) have been developed to assess resident physicians with respect to Accreditation Council for Graduate Medical Education (ACGME) competencies and milestones. Although the feasibility of using EPAs has been reported, we are unaware of previous validation studies on EPAs or of potential associations between EPA quality scores and characteristics of educational programs.
OBJECTIVES:
Our aim was to validate an instrument for assessing the quality of EPAs used to assess internal medicine residents, and to examine associations between EPA quality scores and features of rotations.
DESIGN:
This was a prospective content validation study to design an instrument measuring the quality of EPAs written for assessing internal medicine residents.
PARTICIPANTS:
Residency leadership at Mayo Clinic, Rochester participated in this study, including the program director, associate program directors, and individual rotation directors.
INTERVENTIONS:
The authors reviewed the salient literature. Items were developed to reflect the domains of EPAs useful for assessment, and the instrument underwent further testing and refinement. Each participating rotation director created EPAs that they felt would be meaningful for assessing learner performance in their area. These 229 EPAs were then rated with the QUEPA instrument to determine the quality of each EPA.
MAIN MEASURES:
Performance characteristics of the QUEPA are reported. Quality ratings of EPAs were compared by primary ACGME competency, inpatient versus outpatient setting, and specialty type.
KEY RESULTS:
QUEPA tool scores demonstrated excellent reliability (ICC range 0.72 to 0.94). Inpatient-focused EPAs were rated higher than outpatient-focused EPAs (3.88 vs. 3.66; p = 0.03). Medical knowledge EPAs scored significantly lower than EPAs assessing other competencies (3.34 vs. 4.00; p < 0.0001).
CONCLUSIONS:
The QUEPA tool is supported by good validity evidence and may help in rating the quality of EPAs developed by individual programs. Programs should take care when writing EPAs for the outpatient setting or for assessing medical knowledge, as these tended to be rated lower.
Full text:
1
Collection:
01-internacional
Database:
MEDLINE
Main subject:
Clinical Competence
/
Education, Medical, Graduate
/
Educational Measurement
Study type:
Observational studies
/
Risk factors studies
Limit:
Humans
Country/Region as subject:
North America
Language:
English
Journal:
J Gen Intern Med
Journal subject:
Internal Medicine
Year:
2016
Document type:
Article
Affiliation country:
United States