1 - 14 of 14
1.
MedEdPublish (2016); 13: 11, 2023.
Article En | MEDLINE | ID: mdl-38028656

Accreditation processes for the health care professions are designed to ensure that individuals and programs in these fields meet established standards of quality and effectiveness. The accelerating pace of globalization in the health care professions has increased the need for a shared understanding of the vocabulary of evaluation, assessment, and accreditation. The psychometric principles of valid and reliable assessment are commonly accepted, but the terminology is confusing. We believe that all stakeholders - evaluators, faculty, and students, but also the community - will benefit from a shared language and a common set of definitions. We recognize that not all readers will agree with the definitions we propose, but we hope that this guide will help to ensure clarity, consistency, transparency, and fairness, and that, by stimulating debate, it will promote greater collaboration across national and international boundaries.

4.
J Bone Joint Surg Am; 99(13): e72, 2017 Jul 05.
Article En | MEDLINE | ID: mdl-28678133

Most U.S. medical schools follow the 4-year model, consisting of 2 preclinical years, core clinical experience, and a fourth year intended to permit students to increase clinical competency, to explore specialty areas, and to transition to residency. Although the design and delivery of Years 1 through 3 have evolved to meet new challenges and expectations, the structure of Year 4 remains largely unchanged. For most students considering a career in orthopaedics, Year 4 is a series of elective rotations in which educational objectives become secondary to interviewing for residency programs. Most accreditation bodies recognize the attainment of competency over the duration of medical school as the goal of educating physicians, and thus there is growing interest in reexamining the traditional medical school curriculum with the goal of integrating the final phases of undergraduate education and the first phases of postgraduate education. A literature search was undertaken to identify publications on the duration of medical education. Pilot approaches to competency-based integration of undergraduate medical school and postgraduate training in orthopaedic surgery were reviewed. Few data suggest that 4 years of medical education is superior to shorter-duration programs. Three approaches to competency-based integration of undergraduate medical school and postgraduate training are presented. Their goal is to use student and faculty time more effectively. Each approach offers the opportunity to lower the cost and to decrease the time required for Board Certification in Orthopaedic Surgery. Two approaches shorten the overall duration of medical school and graduate training by using various proportions of the fourth year to begin residency, and one approach expands the duration of orthopaedic training by starting in the fourth year of medical school and incorporating training equivalent to a fellowship program into the residency experience. The effectiveness of such programs will form the basis for revisions to the current orthopaedic training paradigm, resulting in a more effective, efficient, and integrated orthopaedic training curriculum.


Career Choice; Education, Medical, Graduate/organization & administration; Education, Medical, Undergraduate/organization & administration; Orthopedics/education; Clinical Competence; Curriculum; Humans; Time Factors; United States
8.
Med Teach; 32(6): 503-8, 2010.
Article En | MEDLINE | ID: mdl-20515382

BACKGROUND: Though progress tests have been used for several decades in various medical education settings, only a few studies have offered analytic frameworks that practitioners could use to model growth of knowledge as a function of curricular and other variables of interest. AIM: To explore the use of one form of progress testing in clinical education by modeling growth of knowledge in various disciplines and by assessing the impact of recent training (core rotation order) on performance, using hierarchical linear modeling (HLM) and analysis of variance (ANOVA) frameworks. METHODS: This study included performances across four test administrations, occurring between July 2006 and July 2007, for 130 students from a US medical school who graduated in 2008. Measures-nested-in-examinees HLM growth curve analyses were run to estimate growth of clinical science knowledge over time, and repeated-measures ANOVAs were run to assess the effect of recent training on performance. RESULTS: Core rotation order was related to growth rates for total and pediatrics scores only. Additionally, scores were higher in a given discipline if training in that discipline had occurred immediately prior to the test administration. CONCLUSIONS: This study provides a useful progress-testing framework for assessing medical students' growth of knowledge across their clinical science education and the related impact of training.
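For readers who want to see the shape of such an analysis, below is a minimal sketch of a measures-nested-in-examinees growth model in Python with statsmodels. The 130 students and four administrations match the abstract, but the column names, simulated scores, and model specification are illustrative assumptions, not the study's own code or data.

```python
# Hypothetical sketch of an HLM growth curve analysis (measures nested in
# examinees). Everything below except the sample sizes is an assumption.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students, n_admins = 130, 4
student = np.repeat(np.arange(n_students), n_admins)
time = np.tile(np.arange(n_admins), n_students)        # administration 0..3
base = rng.normal(60, 5, n_students)[student]          # examinee-level intercepts
score = base + 3.0 * time + rng.normal(0, 4, n_students * n_admins)
df = pd.DataFrame({"student": student, "time": time, "score": score})

# Random intercept and random slope per examinee; the fixed effect of
# `time` estimates average knowledge growth per administration.
model = smf.mixedlm("score ~ time", df, groups=df["student"], re_formula="~time")
print(model.fit().summary())
```

Adding a rotation-order covariate and its interaction with `time` to the fixed part of the formula would mirror the article's question of whether growth rates depend on recent training.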


Clinical Medicine/education; Educational Measurement/methods; Schools, Medical; Clinical Clerkship; Pilot Projects; United States
9.
Acad Med; 84(10 Suppl): S116-9, 2009 Oct.
Article En | MEDLINE | ID: mdl-19907371

BACKGROUND: This study gathered evidence of external validity for the Foundations of Medicine (FOM) examination by assessing the relationship between its subscores and local grades for a sample of Portuguese medical students. METHOD: Correlations were computed between six FOM subscores and nine Minho University grades for a sample of 90 medical students, and a canonical correlation analysis was run between the FOM and Minho measures. RESULTS: Correlations between FOM subscores and Minho grades were low to moderate, ranging from -0.02 to 0.53. One canonical correlation was statistically significant. The FOM variate accounted for 44% of the variance in FOM subscores and 22% of the variance in Minho end-of-year grades. The Minho canonical variate accounted for 34% of the variance in Minho grades and 17% of the variance in FOM subscores. CONCLUSIONS: The FOM examination seems to supplement local assessments by targeting constructs not currently measured. Therefore, it may contribute to a more comprehensive assessment of basic and clinical sciences knowledge.
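As a sketch of the method, a canonical correlation analysis between two score matrices can be run with scikit-learn. The simulated FOM and Minho matrices below are placeholders with the dimensions reported in the abstract (90 examinees, six subscores, nine grades); the shared-factor data-generating model is an assumption.

```python
# Hypothetical canonical correlation sketch; data are simulated around a
# single shared ability factor, not taken from the study.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n = 90
latent = rng.normal(size=(n, 1))                          # shared ability factor
fom = latent @ rng.normal(size=(1, 6)) + rng.normal(size=(n, 6))
minho = latent @ rng.normal(size=(1, 9)) + rng.normal(size=(n, 9))

cca = CCA(n_components=1).fit(fom, minho)
u, v = cca.transform(fom, minho)
print(f"first canonical correlation: {np.corrcoef(u[:, 0], v[:, 0])[0, 1]:.2f}")

# Variance extracted by the FOM variate from its own set (cf. the 44%
# figure above): the mean squared loading of the subscores on the variate.
loadings = np.corrcoef(np.c_[u[:, 0], fom], rowvar=False)[0, 1:]
print(f"variance in FOM subscores explained: {np.mean(loadings ** 2):.0%}")
```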


Education, Medical; Educational Measurement; Portugal; Reproducibility of Results; Universities
11.
Teach Learn Med; 17(1): 14-20, 2005.
Article En | MEDLINE | ID: mdl-15691809

BACKGROUND: The Ministry of Health of the Republic of Panama is currently developing a national examination system that will be used to license graduates to practice medicine in that country, as well as to undertake postgraduate medical training. As part of these efforts, a preliminary project was undertaken between the National Board of Medical Examiners (NBME) and the Faculty of Medicine of the University of Panama to develop a Residency Selection Process Examination (RSPE). PURPOSE: The purpose of this study was to assess the reliability and validity of RSPE scores for a sample of candidates seeking a residency slot in Panama. METHODS: The RSPE, composed of 200 basic and clinical sciences multiple-choice items, was administered to 261 residency applicants at the University of Panama. RESULTS: The reliability estimate was comparable to those reported for other high-stakes examinations (Cronbach's alpha = 0.89). Also, a Rasch plot of examinee proficiencies against item difficulties showed that the RSPE was well targeted to the proficiency levels of the candidates. Finally, a moderate correlation was noted between local grade point averages and RSPE scores for University of Panama students (r = 0.38). CONCLUSIONS: Findings suggest that it is possible to translate and adapt test materials for use in other contexts.
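Cronbach's alpha, the reliability estimate quoted above, has a standard closed form; the sketch below computes it on a simulated 0/1 item response matrix. The 261 applicants and 200 items come from the abstract, while the Rasch-style response model is an assumption used only to generate plausible data.

```python
# Cronbach's alpha on a simulated item response matrix (rows = examinees,
# columns = items). The formula is standard; the data are not the study's.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                        # number of items
    item_vars = scores.var(axis=0, ddof=1)     # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1) # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(2)
ability = rng.normal(size=(261, 1))            # 261 applicants
difficulty = rng.normal(size=(1, 200))         # 200 items
p = 1 / (1 + np.exp(-(ability - difficulty)))  # Rasch-style success probability
responses = (rng.random((261, 200)) < p).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```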


Internship and Residency; School Admission Criteria; Schools, Medical/organization & administration; Licensure; Panama
12.
J Contin Educ Health Prof; 23(3): 182-90, 2003.
Article En | MEDLINE | ID: mdl-14528790

The introduction of a clinical skills examination (CSE) to Step 2 of the U.S. Medical Licensing Examination (USMLE) has focused attention on the design and delivery of large-scale standardized tests of clinical skills and raised the question of whether these competencies should be evaluated across the span of a physician's career. This initiative coincides with growing pressure to periodically assess the continued competence of physicians in practice. The USMLE CSE is designed to certify that candidates have the basic clinical skills required for the safe and effective practice of medicine in the supervised environment of postgraduate training: history taking, physical examination, effective communication with patients and other members of the health care team, and clear and accurate documentation of diagnostic impressions and plans for further assessment. The USMLE CSE does not assess procedural skills. As physicians progress through training and enter practice, both the knowledge base and the requisite technical skills become more diverse. A variety of indirect and direct measures are available for evaluating physicians, but, at present, no single method permits high-stakes inferences about clinical skills. Systematic and standardized assessments contribute to comprehensive evaluations, but they still largely assess capacity rather than authentic performance in practice. Much work is needed to identify the optimal combination of methods to be employed in support of programs to ensure the maintenance of competence of practicing physicians.


Clinical Competence/standards; Education, Medical, Undergraduate/standards; Educational Measurement; Physicians/standards; Humans
13.
Ann Med Interne (Paris); 154(3): 148-56, 2003 May.
Article Fr | MEDLINE | ID: mdl-12910041

Medical training is undergoing extensive revision in France. A nationwide comprehensive clinical competency examination will be administered for the first time in 2004, relying exclusively on essay questions. Unfortunately, these questions have psychometric shortcomings, particularly their typically low reliability, and high score reliability is mandatory in a high-stakes context. Multiple-choice questions (MCQs) designed by the National Board of Medical Examiners are well suited to assessing clinical competency with high score reliability. The purpose of this study was to test the hypothesis that French medical students could take an American-designed, French-adapted comprehensive clinical knowledge examination in this MCQ format. Two hundred eighty-five French students, from four medical schools across France, took an examination composed of 200 MCQs under standardized conditions, and their scores were compared with those of American students. The examination was found to assess French students' clinical knowledge with a high level of reliability. French students' scores were slightly lower than those of American students, mostly owing to a lack of familiarity with this particular item format and a lower level of motivation. Another study, with a larger group, is being designed to address some of the shortcomings of this initial study. If these preliminary results are replicated, the MCQ format might be a more defensible and sensible alternative to the proposed essay questions.
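The French-American score comparison reported here is, in effect, a standardized mean difference. A small sketch, assuming normally distributed scores and an arbitrary comparison-group size, since the abstract gives no raw numbers:

```python
# Hypothetical sketch of a standardized mean difference (Cohen's d)
# between two examinee groups taking the same MCQ form. All values are
# simulated; only the French sample size (285) is taken from the abstract.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

rng = np.random.default_rng(3)
french = rng.normal(58, 10, 285)     # assumed mean/SD, slightly lower
american = rng.normal(62, 10, 1000)  # assumed comparison group
print(f"d = {cohens_d(french, american):.2f}")  # negative: French below US mean
```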


Clinical Competence/standards; Education, Medical/standards; Educational Measurement; Licensure, Medical/standards; Adult; Female; France; Humans; Male; Pilot Projects; Reproducibility of Results; United States
14.
Acad Med; 78(5): 509-17, 2003 May.
Article En | MEDLINE | ID: mdl-12742789

PURPOSE: The French government, as part of medical education reforms, has affirmed that an examination program for national residency selection will be implemented by 2004. The purpose of this study was to develop a French multiple-choice (MC) examination using the National Board of Medical Examiners' (NBME) expertise and materials. METHOD: The Evaluation Standardisée du Second Cycle (ESSC), a four-hour clinical sciences examination, was administered in January 2002 to 285 medical students at four university test sites in France. The ESSC had 200 translated and adapted MC items selected from the Comprehensive Clinical Sciences Examination (CCSE), an NBME subject test. RESULTS: Less than 10% of the ESSC items were rejected as inappropriate to French practice. Also, the distributions of ESSC item characteristics were similar to those reported with the CCSE. The ESSC also appeared to be very well targeted to examinees' proficiencies and yielded a reliability coefficient of 0.91. However, because of a higher word count, the ESSC did show evidence of speededness. Regarding overall performance, the mean proficiency estimate for French examinees was about 0.4 SD below that of a CCSE population. CONCLUSIONS: This study provides strong evidence for the usefulness of the model adopted in this first collaborative effort between the NBME and a consortium of French medical schools. Overall, the performance of French students was comparable to that of CCSE students, which was encouraging given the differences in motivation and the speeded nature of the French test. A second phase with the participation of larger numbers of French medical schools and students is being planned.
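Item characteristics of the kind compared between the ESSC and the CCSE are classically summarized by item difficulty (proportion correct) and point-biserial discrimination. Below is a sketch on simulated responses; only the examinee count (285) and item count (200) come from the abstract, and the response model is an assumption.

```python
# Hypothetical classical item analysis on simulated 0/1 responses.
import numpy as np

rng = np.random.default_rng(4)
ability = rng.normal(size=(285, 1))            # 285 examinees
difficulty = rng.normal(size=(1, 200))         # 200 MC items
p = 1 / (1 + np.exp(-(ability - difficulty)))  # assumed response model
X = (rng.random((285, 200)) < p).astype(float)

total = X.sum(axis=1)
p_values = X.mean(axis=0)                      # item difficulty (p-value)
# Point-biserial discrimination: item score vs. rest-score correlation.
r_pbis = np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                   for j in range(X.shape[1])])
print(f"mean p = {p_values.mean():.2f}, mean r_pbis = {r_pbis.mean():.2f}")
```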


Clinical Medicine/education; Educational Measurement; Schools, Medical; Students, Medical; Female; France; Humans; Male
...