Results 1 - 3 of 3
1.
J Med Educ Curric Dev ; 10: 23821205231179534, 2023.
Article in English | MEDLINE | ID: mdl-37435475

ABSTRACT

OBJECTIVES: In-training examinations (ITEs) are a popular teaching tool for certification programs. This study examines the relationship between examinees' performance on the National Commission for Certification of Anesthesiologist Assistants (NCCAA) ITE and the high-stakes NCCAA Certification Examination. METHODS: We used a mixed-methods approach. Before estimating the models for the predictive validity study, a series of interviews with program directors was conducted to discuss the role of the ITE in students' education. Multiple linear regression analysis was then used to assess the strength of the relationship between ITE and Certification Examination scores, while accounting for the percentage of the anesthesiologist assistant program that examinees had completed between their ITE and Certification Examination attempts. Logistic regression analysis was used to estimate the probability of passing the Certification Examination as a function of ITE score. RESULTS: Interviews with program directors confirmed that the ITE provided a valuable testing experience for students and highlighted the areas where students need to focus. Both the ITE score and the percentage of the program completed between exams were statistically significant predictors of Certification Examination scores. The logistic regression model indicated that higher ITE scores were associated with a higher probability of passing the Certification Examination. CONCLUSION: This research demonstrated that ITE scores have high predictive validity for Certification Examination outcomes. Together with the proportion of the program completed between exams, these variables explain a significant amount of the variability in Certification Examination scores. The ITE feedback helped students assess their preparedness and better focus their studies for the profession's high-stakes certification examination.
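
The regression approach described in this abstract can be sketched in a few lines. The sketch below is illustrative only: the data file and column names (ite_score, pct_program_between, cert_score, cert_pass) are hypothetical stand-ins, not the NCCAA's actual data, variables, or code.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical dataset: one row per examinee with the ITE score, the percentage
    # of the anesthesiologist assistant program completed between the ITE and the
    # Certification Examination, the Certification Examination score, and a 0/1
    # pass indicator.
    df = pd.read_csv("examinees.csv")

    # Multiple linear regression: Certification Examination score as a function of
    # ITE score and the percentage of the program completed between the two exams.
    linear_model = smf.ols("cert_score ~ ite_score + pct_program_between", data=df).fit()
    print(linear_model.summary())

    # Logistic regression: probability of passing the Certification Examination
    # as a function of ITE score.
    logit_model = smf.logit("cert_pass ~ ite_score", data=df).fit()
    print(logit_model.summary())

    # Predicted pass probability for a hypothetical ITE score of 450.
    print(logit_model.predict(pd.DataFrame({"ite_score": [450]})))

In this kind of analysis, the fitted logistic curve is what lets a program read off, for any given ITE score, an estimated probability of passing the certification examination.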

2.
Acad Med ; 92(6): 809-819, 2017 06.
Article in English | MEDLINE | ID: mdl-28557947

ABSTRACT

PURPOSE: To investigate evidence for validity of faculty members' pediatric milestone (PM) ratings of interns (first-year residents) and subinterns (fourth-year medical students) on nine subcompetencies related to readiness to serve as a pediatric intern in the inpatient setting. METHOD: The Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network (APPD LEARN) and the National Board of Medical Examiners collaborated to investigate the utility of PM assessments of trainees' performance. Data from 32 subinterns and 179 interns at 17 programs were collected from July 2012 through April 2013. Observers used several tools to assess learners. At each site, a faculty member used these data to make judgments about the learner's current developmental milestone in each subcompetency. Linear mixed models were fitted to the milestone judgments to examine their relationship with learners' rank and subcompetency. RESULTS: On a 5-point developmental scale, mean milestone levels ranged from 3.20 (for the subcompetency Work effectively as a member of a team) to 3.72 (Humanism) for interns and from 2.89 (Organize and prioritize care) to 3.61 (Professionalization) for subinterns. For all trainees, mean milestone ratings were significantly higher for the Professionalism competency (3.59-3.72) than for Patient Care (2.89-3.24) and Personal and Professional Development (3.33-3.51). Mean intern ratings were significantly higher than mean subintern ratings for all nine subcompetencies except Professionalization, Humanism, and Trustworthiness. CONCLUSIONS: The PMs had a coherent internal structure and could distinguish between differing levels of trainees, which supports their validity for documenting the developmental progression of pediatric trainees.
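
The linear mixed models mentioned in this abstract can be sketched roughly as below. This is a minimal illustration, not the study's actual model: the abstract does not state the exact specification (for example, whether interactions or site-level random effects were included), and the column names (milestone_level, rank, subcompetency, learner_id) are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per milestone judgment, with the
    # learner's rank (intern vs. subintern), the subcompetency rated, and a
    # learner identifier for the grouping structure.
    df = pd.read_csv("milestone_ratings.csv")

    # Mixed model: milestone level as a function of rank and subcompetency
    # (fixed effects), with a random intercept per learner to account for
    # the nine repeated ratings of each trainee.
    model = smf.mixedlm(
        "milestone_level ~ rank + subcompetency",
        data=df,
        groups=df["learner_id"],
    ).fit()
    print(model.summary())

The random intercept is the key design choice here: each learner contributes nine correlated ratings, so treating the ratings as independent observations would overstate the precision of the rank and subcompetency effects.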


Subjects
Clinical Competence/standards , Education, Medical, Graduate/standards , Educational Measurement/methods , Internship and Residency/standards , Pediatrics/education , Students, Medical , Adult , Female , Humans , Male , Program Evaluation , Reproducibility of Results , United States , Young Adult
3.
Eval Health Prof ; 33(3): 386-403, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20801978

ABSTRACT

Years of research with high-stakes written tests indicate that although repeat examinees typically experience score gains between their first and subsequent attempts, their pass rates remain considerably lower than pass rates for first-time examinees. This outcome is consistent with expectations. Comparable studies of the performance of repeat examinees on oral examinations are lacking. The current research evaluated pass rates for more than 50,000 examinees on written and oral exams administered by six medical specialty boards over several recent years. Pass rates for first-time examinees were similar for written and oral exams, averaging about 84% across all boards. Pass rates for repeat examinees on written exams were, as expected, lower, ranging from 22% to 51%, with an average of 36%. However, pass rates for repeat examinees on oral exams were markedly higher than for written exams, ranging from 53% to 77%, with an average of 65%. Four explanations for the elevated repeat pass rates on oral exams are proposed: an increase in examinee proficiency, construct-irrelevant variance, measurement error (score unreliability), and memorization of test content. Simulated data are used to demonstrate that roughly one third of the score increase can be explained by measurement error alone. The authors suggest that a substantial portion of the score increase can likely also be attributed to construct-irrelevant variance. Results are discussed in terms of their implications for making pass-fail decisions when retesting is allowed. The article concludes by identifying areas for future research.
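
The measurement-error argument can be illustrated with a small simulation. The sketch below is not the article's simulation: the reliability, cut score, and score distribution are illustrative assumptions chosen only to show that, with no change in true ability, some examinees who fail once will pass on retest purely because of score unreliability.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative assumptions (not the article's parameters): true ability is
    # standard normal, observed scores add normal measurement error consistent
    # with a reliability of about 0.90, and the pass cut is placed so that
    # roughly the weakest sixth of first-time examinees fail.
    n = 100_000
    reliability = 0.90
    error_sd = np.sqrt(1 / reliability - 1)  # error variance relative to true-score variance of 1
    cut = -1.0

    true_ability = rng.normal(0, 1, n)
    first_attempt = true_ability + rng.normal(0, error_sd, n)

    # Repeat examinees are those who failed the first attempt; the second attempt
    # draws fresh measurement error with no change in true ability.
    repeaters = first_attempt < cut
    second_attempt = true_ability[repeaters] + rng.normal(0, error_sd, repeaters.sum())

    print(f"First-time pass rate: {np.mean(first_attempt >= cut):.1%}")
    print(f"Repeat pass rate with no true gain: {np.mean(second_attempt >= cut):.1%}")

The repeat pass rate produced here is nonzero even though no examinee improves, which is the mechanism the authors quantify; the specific share of the observed gain attributable to measurement error (about one third) comes from the article's own simulation, not this sketch.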


Subjects
Clinical Competence/statistics & numerical data , Educational Measurement/statistics & numerical data , Licensure, Medical/statistics & numerical data , Specialty Boards/statistics & numerical data , Students, Medical/statistics & numerical data , Writing , Clinical Competence/standards , Educational Status , Humans , Psychometrics , Regression Analysis , Specialty Boards/standards , Task Performance and Analysis , United States