5.
Med Teach ; 32(6): 503-8, 2010.
Article in English | MEDLINE | ID: mdl-20515382

ABSTRACT

BACKGROUND: Though progress tests have been used for several decades in various medical education settings, few studies have offered analytic frameworks that could be used by practitioners to model growth of knowledge as a function of curricular and other variables of interest. AIM: To explore the use of one form of progress testing in clinical education by modeling growth of knowledge in various disciplines as well as by assessing the impact of recent training (core rotation order) on performance using hierarchical linear modeling (HLM) and analysis of variance (ANOVA) frameworks. METHODS: This study included performances across four test administrations occurring between July 2006 and July 2007 for 130 students from a US medical school who graduated in 2008. Measures-nested-in-examinees HLM growth curve analyses were run to estimate clinical science knowledge growth over time, and repeated measures ANOVAs were run to assess the effect of recent training on performance. RESULTS: Core rotation order was related to growth rates for total and pediatrics scores only. Additionally, scores were higher in a given discipline if training had occurred immediately prior to the test administration. CONCLUSIONS: This study provides a useful progress testing framework for assessing medical students' growth of knowledge across their clinical science education and the related impact of training.
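
The analytic framework described in this abstract (measures-nested-in-examinees growth curves plus repeated-measures ANOVA) can be illustrated with a short sketch. The snippet below is a hypothetical reconstruction using statsmodels on synthetic data; the column names (student, months, rotation_order, score) and the simulated growth pattern are assumptions for demonstration, not the study's actual data or code.

```python
# Minimal sketch (not the authors' code) of an HLM growth-curve model and a
# repeated-measures ANOVA on synthetic progress-test data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_students, n_admins = 130, 4

# Simulate four test administrations per student with roughly linear growth.
df = pd.DataFrame({
    "student": np.repeat(np.arange(n_students), n_admins),
    "months": np.tile(np.arange(0, 12, 3), n_students),        # 0, 3, 6, 9 months
    "rotation_order": np.repeat(rng.integers(1, 4, n_students), n_admins),
})
df["score"] = (
    60 + 1.5 * df["months"]                                     # average growth
    + np.repeat(rng.normal(0, 3, n_students), n_admins)        # student intercepts
    + rng.normal(0, 5, len(df))                                 # residual noise
)

# Measures-nested-in-examinees growth curve: random intercept and slope per
# student, with rotation order allowed to moderate the growth rate.
hlm = smf.mixedlm("score ~ months * C(rotation_order)", df,
                  groups=df["student"], re_formula="~months").fit()
print(hlm.summary())

# Repeated-measures ANOVA across the four administrations (within-subject factor).
rm = AnovaRM(df, depvar="score", subject="student", within=["months"]).fit()
print(rm)
```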


Subject(s)
Clinical Medicine/education , Educational Measurement/methods , Schools, Medical , Clinical Clerkship , Pilot Projects , United States
6.
Acad Med ; 84(10 Suppl): S116-9, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19907371

ABSTRACT

BACKGROUND: This study gathered evidence of external validity for the Foundations of Medicine (FOM) examination by assessing the relationship between its subscores and local grades for a sample of Portuguese medical students. METHOD: Correlations were computed between six FOM subscores and nine Minho University grades for a sample of 90 medical students, and a canonical correlation analysis was run between the FOM and Minho measures. RESULTS: Moderate correlations were noted between FOM subscores and Minho grades, ranging from -0.02 to 0.53. One canonical correlation was statistically significant. The FOM variate accounted for 44% of the variance in FOM subscores and 22% of the variance in Minho end-of-year grades; the Minho canonical variate accounted for 34% of the variance in Minho grades and 17% of the variance in FOM subscores. CONCLUSIONS: The FOM examination appears to supplement local assessments by targeting constructs not currently measured and may therefore contribute to a more comprehensive assessment of basic and clinical science knowledge.
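
As a rough illustration of the canonical correlation analysis reported here, the sketch below fits a one-component CCA on synthetic score matrices with scikit-learn. The simulated data, the matrix dimensions (six subscores, nine grades, 90 students), and the loading-based "variance extracted" calculation are assumptions for demonstration only, not the study's actual procedure.

```python
# Hypothetical illustration (not the study's code) of canonical correlation
# between one set of exam subscores and one set of local grades.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_students = 90

# Synthetic stand-ins: six FOM subscores and nine end-of-year grades per
# student, sharing one common factor so the two sets are moderately related.
ability = rng.normal(size=(n_students, 1))
fom = 0.6 * ability + rng.normal(size=(n_students, 6))
grades = 0.5 * ability + rng.normal(size=(n_students, 9))

# Extract the first pair of canonical variates and their correlation.
cca = CCA(n_components=1).fit(fom, grades)
fom_c, grades_c = cca.transform(fom, grades)
canonical_r = np.corrcoef(fom_c[:, 0], grades_c[:, 0])[0, 1]

def variance_extracted(X, variate):
    # Mean squared structure coefficient: share of variance in the original
    # variable set accounted for by its own canonical variate.
    loadings = np.array([np.corrcoef(X[:, j], variate[:, 0])[0, 1]
                         for j in range(X.shape[1])])
    return np.mean(loadings ** 2)

print(f"first canonical correlation: {canonical_r:.2f}")
print(f"variance extracted (FOM set): {variance_extracted(fom, fom_c):.2f}")
print(f"variance extracted (grades set): {variance_extracted(grades, grades_c):.2f}")
```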


Subject(s)
Education, Medical , Educational Measurement , Portugal , Reproducibility of Results , Universities
8.
J Contin Educ Health Prof ; 23(3): 182-90, 2003.
Article in English | MEDLINE | ID: mdl-14528790

ABSTRACT

The introduction of a clinical skills examination (CSE) to Step 2 of the U.S. Medical Licensing Examination (USMLE) has focused attention on the design and delivery of large-scale standardized tests of clinical skills and raised the question of the appropriateness of evaluation of these competencies across the span of a physician's career. This initiative coincides with growing pressure to periodically assess the continued competence of physicians in practice. The USMLE CSE is designed to certify that candidates have the basic clinical skills required for the safe and effective practice of medicine in the supervised environment of postgraduate training. These include history taking, physical examination, effective communication with patients and other members of the health care team, and clear and accurate documentation of diagnostic impressions and plans for further assessment. The USMLE CSE does not assess procedural skills. As physicians progress through training and enter practice, both the knowledge base and the requisite technical skills become more diverse. A variety of indirect and direct measures are available for evaluating physicians, but, at present, no single method permits high-stakes inferences about clinical skills. Systematic and standardized assessments contribute to comprehensive evaluations, but they retain an element of assessing capacity rather than authentic performance in practice. Much work is needed to identify the optimal combination of methods to be employed in support of programs to ensure maintenance of competence of practicing physicians.


Subject(s)
Clinical Competence/standards , Education, Medical, Undergraduate/standards , Educational Measurement , Physicians/standards , Humans