Results 1 - 2 of 2
1.
Anat Sci Educ ; 10(2): 109-119, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27458988

ABSTRACT

With integrated curricula and multidisciplinary assessments becoming more prevalent in medical education, there is a continued need for educational research to explore the advantages, consequences, and challenges of integration practices. This retrospective analysis investigated the number of items needed to reliably assess anatomical knowledge in the context of gross anatomy and histology. A generalizability analysis was conducted on gross anatomy and histology written and practical examination items that were administered in a discipline-based format at Indiana University School of Medicine and in an integrated fashion at the University of Alabama School of Medicine and Rush University Medical College. Examination items were analyzed using a partially nested design s×(i:o) in which items were nested within occasions (i:o) and crossed with students (s). A reliability standard of 0.80 was used to determine the minimum number of items needed across examinations (occasions) to make reliable and informed decisions about students' competence in anatomical knowledge. Decision study plots are presented to demonstrate how the number of items per examination influences the reliability of each administered assessment. Using the example of a curriculum that assesses gross anatomy knowledge over five summative written and practical examinations, the results of the decision study estimated that 30 and 25 items would be needed on each written and practical examination to reach a reliability of 0.80, respectively. This study is particularly relevant to educators who may question whether the amount of anatomy content assessed in multidisciplinary evaluations is sufficient for making judgments about the anatomical aptitude of students. Anat Sci Educ 10: 109-119. © 2016 American Association of Anatomists.
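As a rough illustration of the decision-study logic described in this abstract, the sketch below computes a generalizability coefficient for a partially nested s×(i:o) design and searches for the smallest number of items per examination that reaches the 0.80 standard. The variance components, the g_coefficient helper, and the item counts are illustrative assumptions, not values estimated in the article.

```python
# Hypothetical decision (D) study for the partially nested s x (i:o) design
# described above. The variance components below are illustrative placeholders,
# NOT the values reported in the article.

def g_coefficient(var_s, var_so, var_res, n_occasions, n_items):
    """Generalizability coefficient for relative decisions in an s x (i:o) design.

    var_s   : universe-score (student) variance component
    var_so  : student x occasion interaction variance component
    var_res : residual (student x item:occasion, error) variance component
    """
    rel_error = var_so / n_occasions + var_res / (n_occasions * n_items)
    return var_s / (var_s + rel_error)

# Illustrative variance components (assumed for this sketch)
var_s, var_so, var_res = 0.020, 0.010, 0.150
n_occasions = 5  # five summative examinations, as in the example above

# Smallest number of items per examination that reaches the 0.80 standard
for n_items in range(5, 101, 5):
    if g_coefficient(var_s, var_so, var_res, n_occasions, n_items) >= 0.80:
        print(f"{n_items} items per examination reach a G coefficient of 0.80")
        break
```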


Subject(s)
Anatomy/education, Education, Medical, Undergraduate/methods, Educational Measurement/methods, Generalization, Psychological, Histology/education, Psychological Theory, Students, Medical/psychology, Surveys and Questionnaires, Curriculum, Educational Status, Humans, Psychometrics, Reproducibility of Results, Retrospective Studies, Schools, Medical, Time Factors, United States
2.
Adv Health Sci Educ Theory Pract ; 21(3): 571-85, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26597452

ABSTRACT

In UK medical schools, five-option single-best answer (SBA) questions are the most widely accepted format of summative knowledge assessment. However, writing SBA questions with four effective incorrect options is difficult and time-consuming, and consequently many SBAs contain a high frequency of implausible distractors. Previous research has suggested that fewer than five options could therefore be used for assessment without a deterioration in quality. Despite an existing body of empirical research in this area, however, evidence from undergraduate medical education is sparse. This study investigated the frequency of non-functioning distractors in a sample of 480 summative SBA questions at Cardiff University. Distractor functionality was analysed, and various question models were then tested to investigate the impact of reducing the number of distractors per question on examination difficulty, reliability, discrimination, and pass rates. A survey questionnaire was additionally administered to 108 students (33 % response rate) to gain insight into their perceptions of these models. The simulation of various exam models revealed that, for the four- and three-option SBA models, pass rates, reliability, and mean item discrimination remained relatively constant. The average percentage mark, however, consistently increased by 1-3 % under the four- and three-option models, respectively. The questionnaire survey revealed that the student body had mixed views on the proposed format change. This study is one of the first to comprehensively investigate distractor performance in SBA examinations in undergraduate medical education. It provides evidence to suggest that using three-option SBA questions would maximise efficiency whilst maintaining, or possibly improving, psychometric quality, by allowing a greater number of questions per exam paper.
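As a rough illustration of the distractor-functionality analysis this abstract describes, the sketch below flags non-functioning distractors in a five-option SBA item using the common "chosen by fewer than 5 % of examinees" criterion. The response data, the non_functioning_distractors helper, and the threshold are illustrative assumptions; the article's actual dataset and criteria may differ.

```python
# Hypothetical distractor-functionality check for five-option SBA items.
# The response data and the 5 % threshold are illustrative, not taken
# from the article's dataset.

from collections import Counter

def non_functioning_distractors(responses, key, options="ABCDE", threshold=0.05):
    """Return the distractors chosen by fewer than `threshold` of examinees.

    responses : list of single-letter answers given by examinees for one item
    key       : the correct option for that item
    """
    counts = Counter(responses)
    n = len(responses)
    return [opt for opt in options
            if opt != key and counts.get(opt, 0) / n < threshold]

# Illustrative item: option "E" is almost never chosen, so it is flagged
responses = ["A"] * 70 + ["B"] * 15 + ["C"] * 8 + ["D"] * 6 + ["E"] * 1
print(non_functioning_distractors(responses, key="A"))  # -> ['E']
```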


Subject(s)
Educational Measurement/methods, Education, Medical/methods, Education, Medical/standards, Humans, Reproducibility of Results, Schools, Medical, Surveys and Questionnaires