1.
Ann Emerg Med ; 43(6): 756-69, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15159710

ABSTRACT

In response to public pressure for greater accountability from the medical profession, a transformation is occurring in the approach to medical education and the assessment of physician competency. Over the past 5 years, the Accreditation Council for Graduate Medical Education (ACGME) has implemented the Outcomes and General Competencies projects to better ensure that physicians are appropriately trained in the knowledge and skills of their specialties. Concurrently, the American Board of Medical Specialties, including the American Board of Emergency Medicine (ABEM), has embraced the competency concept. The core competencies have been integral to ABEM's development of Emergency Medicine Continuous Certification and of the Model of Clinical Practice of Emergency Medicine (Model). ABEM has used the Model as a significant part of its blueprint for the written and oral certification examinations in emergency medicine and fully supports the effort to define the ACGME core competencies more completely and to integrate them into the training of emergency medicine specialists. To incorporate these competencies into the specialty, the Residency Review Committee-Emergency Medicine formed an Emergency Medicine Competency Taskforce (Taskforce) to determine how the general competencies fit within the Model. This article represents a consensus of the Taskforce, with input from multiple organizations in emergency medicine. It provides a framework for organizations such as the Council of Emergency Medicine Residency Directors (CORD) and the Society for Academic Emergency Medicine to develop an emergency medicine curriculum, and for the Residency Review Committee-Emergency Medicine to revise program requirements. In this report, we describe the approach taken by the Taskforce to integrate the ACGME core competencies into the Model.
Ultimately, as competency-based assessment is implemented in emergency medicine training, program directors, governing bodies such as the ACGME, and individual patients can be assured that physicians are competent in emergency medicine.


Subject(s)
Accreditation , Clinical Competence/standards , Education, Medical, Graduate/standards , Emergency Medicine/education , Internship and Residency/standards , Curriculum , Humans , Models, Educational , Patient Care , Problem-Based Learning , United States
4.
Acad Emerg Med ; 14(5): 463-73, 2007 May.
Article in English | MEDLINE | ID: mdl-17395960

ABSTRACT

OBJECTIVES: To report the results of a project designed to develop and implement a prototype methodology for identifying candidate patient care quality measures for potential use in assessing the outcomes and effectiveness of graduate medical education in emergency medicine. METHODS: A workgroup composed of experts in emergency medicine residency education and patient care quality measurement was convened. Workgroup members performed a modified Delphi process that included iterative review of potential measures; individual expert rating of the measures on four dimensions, including whether a measure reflects quality of care and educational effectiveness; development of consensus on the measures to be retained; external stakeholder rating of measures followed by a final workgroup review; and a post hoc stratification of measures. The workgroup also completed a structured exercise examining the linkage of patient care process and outcome measures to educational effectiveness. RESULTS: The workgroup selected 62 measures for inclusion in its final set: 43 measures for 21 clinical conditions, eight medication measures, seven procedure measures, and four department efficiency measures. Twenty-six measures met the more stringent criteria applied post hoc to further stratify and prioritize measures for development. Nineteen of these received high ratings from 75% of the workgroup and external stakeholder raters on importance for care in the ED, reflection of quality of care, and reflection of educational effectiveness; the majority of raters considered these indicators feasible to measure. The workgroup used a simple framework to explore the relationships among residency program educational activities, competencies from the six Accreditation Council for Graduate Medical Education general competency domains, patient care quality measures, and external factors that could intervene to affect care quality.
CONCLUSIONS: Numerous patient care quality measures have potential for use in assessing the educational effectiveness and performance of graduate medical education programs in emergency medicine. The measures identified in this report can be used as a starter set for further development, implementation, and study. Implementation of the measures, especially for high-stakes use, will require resolution of significant measurement issues.


Subject(s)
Education, Medical, Graduate/standards , Educational Measurement , Emergency Medicine/education , Emergency Treatment/standards , Outcome and Process Assessment, Health Care , Quality of Health Care , Delphi Technique , Humans
5.
Med Educ ; 40(6): 576-83, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16700774

ABSTRACT

BACKGROUND: It is unclear which learners would most benefit from the more individualised, student-structured, interactive approaches characteristic of problem-based and computer-assisted learning. The validity of learning style measures is uncertain, and there is no unifying learning style construct identified to predict such learners. OBJECTIVE: This study was conducted to validate learning style constructs and to identify the learners most likely to benefit from problem-based and computer-assisted curricula. METHODS: Using a cross-sectional design, 3 established learning style inventories were administered to 97 post-Year 2 medical students. Cognitive personality was measured by the Group Embedded Figures Test, information processing by the Learning Styles Inventory, and instructional preference by the Learning Preference Inventory. The 11 subscales from the 3 inventories were factor-analysed to identify common learning constructs and to verify construct validity. Concurrent validity was determined by intercorrelations of the 11 subscales. RESULTS: A total of 94 pre-clinical medical students completed all 3 inventories. Five meaningful learning style constructs were derived from the 11 subscales: student- versus teacher-structured learning; concrete versus abstract learning; passive versus active learning; individual versus group learning, and field-dependence versus field-independence. The concurrent validity of 10 of 11 subscales was supported by correlation analysis. Medical students most likely to thrive in a problem-based or computer-assisted learning environment would be expected to score highly on abstract, active and individual learning constructs and would be more field-independent. CONCLUSIONS: Learning style measures were validated in a medical student population and learning constructs were established for identifying learners who would most likely benefit from a problem-based or computer-assisted curriculum.


Subject(s)
Education, Medical, Graduate/methods , Learning , Students, Medical/psychology , Teaching/methods , Cognition , Computer-Assisted Instruction , Cross-Sectional Studies , Curriculum , Female , Humans , Male , Personality