Results 1 - 8 of 8
1.
J Dent Educ; 82(6): 565-574, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29858252

ABSTRACT

Progress testing is an innovative formative assessment practice that has been found successful in many educational programs. In progress testing, one exam is given to students at regular intervals as they progress through a curriculum, allowing them to benchmark their growth in knowledge over time. The aim of this study was to assess the first two years of results of a progress testing system implemented in a Canadian dental school, the first dental school in North America to introduce progress testing. Each test form contains 200 multiple-choice questions (MCQs) to assess the cognitive knowledge base that a competent dentist should have by the end of the program. All dental students are required to complete the test in three hours. In the first three administrations, three test forms with 86 common items were administered to all DMD students. A total of 383 MCQs spanning nine domains of cognitive knowledge in dentistry were distributed among these three test forms. In each of the two subsequent semesters, each student received a test form different from the previous one. In the fourth administration, 299 new questions were introduced to create two test forms sharing 101 questions. Each administration occurred at the beginning of a semester. All students received individualized reports comparing their performance with their class median in each of the domains. Aggregated results from each administration were provided to the faculty. Based on analysis of students' responses to the common items in the first two administrations, progression in all domains was observed. Comparing equated results across the four administrations also showed progress. This experience suggests that introducing a progress testing assessment system for competency-based dental education has many merits. Challenges and lessons learned with this assessment are discussed.


Subjects
Clinical Competence; Competency-Based Education; Education, Dental; Schools, Dental; Canada; Humans; Models, Educational; Surveys and Questionnaires
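The individualized reports described above compare each student's per-domain score with the class median. A minimal sketch of that comparison step, with hypothetical domain names and made-up scores (none of these values come from the study):

```python
# Sketch of a per-domain progress report: each student's score vs. the
# class median in that domain. Domains and scores are illustrative only.
from statistics import median

def domain_report(scores_by_student, student_id):
    """Return {domain: (student_score, class_median)} for one student."""
    report = {}
    for domain in scores_by_student[student_id]:
        class_scores = [s[domain] for s in scores_by_student.values()]
        report[domain] = (scores_by_student[student_id][domain],
                          median(class_scores))
    return report

scores = {
    "s1": {"oral surgery": 0.62, "periodontics": 0.71},
    "s2": {"oral surgery": 0.55, "periodontics": 0.80},
    "s3": {"oral surgery": 0.70, "periodontics": 0.66},
}
print(domain_report(scores, "s1"))
# {'oral surgery': (0.62, 0.62), 'periodontics': (0.71, 0.71)}
```

Run over successive administrations of the common items, the same comparison yields the per-domain progression the abstract reports.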
2.
Med Educ; 52(10): 1003-1004, 2018 Oct.
Article in English | MEDLINE | ID: mdl-29700841
3.
Med Teach; 40(3): 267-274, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29172940

ABSTRACT

CONTEXT: Creating a new testing program requires the development of a test blueprint that determines how the items on each test form are distributed across possible content areas and practice domains. To achieve validity, the categories of a blueprint are typically based on the judgments of content experts. How experts' judgments are elicited and combined is important to the quality of the resulting test blueprints. METHODS: Content experts in dentistry participated in a day-long, faculty-wide workshop to discuss, refine, and confirm the categories and their relative weights. After reaching agreement on the categories and their definitions, experts judged the relative importance of category pairs, registering their judgments anonymously using iClicker, an audience response system. Judgments were combined in two ways: a simple calculation that could be performed during the workshop, and a multidimensional scaling of the judgments performed later. RESULTS: Content experts were able to produce a set of relative weights using this approach. The multidimensional scaling yielded a three-dimensional model with the potential to provide deeper insights into the basis of the experts' judgments. CONCLUSION: The approach developed and demonstrated in this study can be applied across academic disciplines to elicit and combine content experts' judgments for the development of test blueprints.


Subjects
Education, Dental; Education, Medical, Undergraduate; Educational Measurement; Clinical Competence/standards; Educational Measurement/methods; Humans; Interviews as Topic; Qualitative Research
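One plausible form of the "simple calculation" mentioned above: tally, for each category, the votes it received across all pairwise importance judgments, then normalize the tallies into relative weights. This is our assumption about the workshop arithmetic, not the paper's documented method, and the categories and vote counts are invented:

```python
# Hedged sketch: turn pairwise importance votes into blueprint weights by
# normalizing each category's total vote count. Data are illustrative.
def blueprint_weights(pair_votes):
    """pair_votes: {(a, b): (votes_for_a, votes_for_b)} per category pair."""
    wins = {}
    for (a, b), (va, vb) in pair_votes.items():
        wins[a] = wins.get(a, 0) + va
        wins[b] = wins.get(b, 0) + vb
    total = sum(wins.values())
    return {c: w / total for c, w in wins.items()}

votes = {("diagnosis", "treatment"): (12, 8),
         ("diagnosis", "prevention"): (14, 6),
         ("treatment", "prevention"): (11, 9)}
weights = blueprint_weights(votes)
print({c: round(w, 3) for c, w in weights.items()})
# {'diagnosis': 0.433, 'treatment': 0.317, 'prevention': 0.25}
```

The multidimensional scaling step would instead treat the vote proportions as similarity/dominance data and embed the categories in a low-dimensional space; that analysis is not reproduced here.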
4.
BMC Public Health; 14: 331, 2014 Apr 08.
Article in English | MEDLINE | ID: mdl-24712314

ABSTRACT

BACKGROUND: Few researchers have the data required to adequately understand how the school environment shapes youth health behaviour development over time. METHODS/DESIGN: COMPASS is a prospective cohort study designed to annually collect hierarchical longitudinal data from a sample of 90 secondary schools and the 50,000+ grade 9 to 12 students attending those schools. COMPASS uses a rigorous quasi-experimental design to evaluate how changes in school programs, policies, and/or built environment (BE) characteristics are related to changes in multiple youth health behaviours and outcomes over time. These data will allow for the quasi-experimental evaluation of natural experiments that will occur within schools over the course of COMPASS, providing a means for generating "practice-based evidence" in school-based prevention programming. DISCUSSION: COMPASS is the first study with the infrastructure to robustly evaluate the impact that changes in multiple school-level programs, policies, and BE characteristics within or surrounding a school might have on multiple youth health behaviours or outcomes over time. COMPASS will provide valuable new insight for the planning, tailoring, and targeting of school-based prevention initiatives where they are most likely to have impact.


Subjects
Environment Design; Health Behavior; Policies; School Health Services; Adolescent; Canada; Cohort Studies; Humans; Schools; Students/psychology
5.
J Allied Health; 38(3): 158-162, 2009.
Article in English | MEDLINE | ID: mdl-19753427

ABSTRACT

Determining admission criteria that will predict successful student outcomes is a challenging undertaking for newly established health professional programs. This study examined data from the students who entered a medical radiation sciences program in September 2002. By analyzing the correlation between undergraduate GPA, grades in undergraduate science courses, performance in program coursework, and post-graduation certification examination results, the authors determined admission criteria that were linked to successful student outcomes for radiological technology and radiation therapy students.


Subjects
Nuclear Medicine/education; School Admission Criteria; Students, Health Occupations; Technology, Radiologic/education; Education, Professional/organization & administration; Education, Professional/standards; Educational Measurement; Humans; Ontario
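The correlational analysis described in this abstract reduces, at its core, to computing Pearson correlations between admission variables and outcome measures. A self-contained sketch with fabricated GPA and exam values (the real data are not published in this listing):

```python
# Illustrative Pearson correlation between an admission variable (GPA)
# and an outcome (certification-exam score). Values are made up.
def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

gpa  = [3.1, 3.5, 3.8, 2.9, 3.6]
exam = [72, 80, 88, 70, 83]
print(round(pearson_r(gpa, exam), 3))  # 0.991
```

In practice one would report such coefficients with significance tests and sample sizes for each admission criterion and outcome pair.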
6.
Acad Med; 78(10 Suppl): S62-S64, 2003 Oct.
Article in English | MEDLINE | ID: mdl-14557098

ABSTRACT

PROBLEM STATEMENT AND BACKGROUND: Examinees can make three types of errors on the short-menu questions in the Clinical Reasoning Skills component of the Medical Council of Canada's Qualifying Examination Part I: (1) failing to select any correct responses, (2) selecting too many responses, or (3) selecting a response that is inappropriate or harmful to the patient. This study compared the information provided by equal and differential weighting of these errors. METHOD: The item response theory nominal model was applied to fit examinees' response patterns on the 1998 test. RESULTS: Differential error weighting resulted in improved model fit and increased test information for examinees in the lower half of the achievement continuum. CONCLUSION: Differential error weighting appears promising. The pass score is near the lower end of the achievement continuum; therefore, this approach may improve the accuracy of pass-fail decisions.


Subjects
Clinical Competence/statistics & numerical data; Educational Measurement/statistics & numerical data; Statistics as Topic/methods; Canada; Humans; Students, Medical
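Under the nominal model used in this study, the probability of an examinee at ability theta choosing response category k is exp(a_k*theta + c_k) normalized over all categories. A minimal sketch of those category probabilities; the slope and intercept values below are illustrative, not the MCC's estimates, and the category ordering is an assumption:

```python
# Bock's nominal response model: P_k(theta) proportional to
# exp(a_k * theta + c_k). Parameter values here are invented.
import math

def nominal_probs(theta, slopes, intercepts):
    """Category response probabilities for one item at ability theta."""
    z = [a * theta + c for a, c in zip(slopes, intercepts)]
    m = max(z)                        # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Hypothetical categories: correct response, no correct response selected,
# too many responses selected, harmful response selected.
probs = nominal_probs(theta=-1.0,
                      slopes=[1.2, 0.0, -0.4, -0.8],
                      intercepts=[0.5, 0.0, -0.2, -1.0])
print([round(p, 3) for p in probs])
```

Differential error weighting amounts to letting the three error categories carry distinct slope/intercept parameters rather than being collapsed into a single "incorrect" category, which is what allows the model to extract more information at low theta.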
7.
Acad Med; 78(10 Suppl): S65-S67, 2003 Oct.
Article in English | MEDLINE | ID: mdl-14557099

ABSTRACT

PURPOSE: This study investigates (a) whether items within the Multiple-Choice Questions component of the Medical Council of Canada's Qualifying Examination Part I exhibit local dependencies and (b) potential sources of such dependencies. METHOD: The dimensionality of each of six discipline-based subtests was assessed with exploratory nonlinear factor analyses. A standardized Fisher's z statistic was used to test residual item correlations for local item dependence. The characteristics of item pairs flagged as possibly locally dependent were reviewed. RESULTS: Some items in the Pediatrics and Preventive Medicine/Community Health subtests are locally dependent; these tend to be the more difficult items on the subtests. DISCUSSION: While these results are encouraging, the possible causes and potential impacts of any local dependencies should be investigated further.


Subjects
Educational Measurement/statistics & numerical data; Licensure, Medical; Canada; Clinical Competence; Humans; Psychometrics; Students, Medical
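The flagging step described above can be sketched as follows: a residual inter-item correlation r (what remains after the common factor is removed) is put through Fisher's z transformation and compared with a normal critical value. The threshold and sample sizes below are illustrative assumptions, not the study's actual values:

```python
# Hedged sketch of flagging possible local item dependence: Fisher's z
# transform of a residual correlation, standardized by its sampling SE.
import math

def fisher_z_flag(r, n, z_crit=1.96):
    """Flag an item pair as possibly locally dependent if |z/SE| > z_crit."""
    z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher's z transform
    se = 1.0 / math.sqrt(n - 3)             # standard error of z
    return abs(z / se) > z_crit

print(fisher_z_flag(r=0.05, n=2000))  # True: tiny r is detectable at large n
print(fisher_z_flag(r=0.05, n=500))   # False: same r, smaller sample
```

The contrast between the two calls illustrates why large licensing-exam samples can flag residual correlations too small to matter practically, which is one reason flagged pairs were reviewed substantively rather than accepted automatically.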
8.
J Pers Assess; 78(1): 130-144, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11936205

ABSTRACT

All clinical psychology doctoral programs accredited by the American Psychological Association provide training in psychological assessment. However, what the programs teach and how they teach it vary widely, as do beliefs about what should be taught. In this study, program descriptive materials and course syllabi from 84 programs were analyzed. Findings highlight commonalities in basic course content and in supervised practice in administering, scoring, and interpreting assessment instruments, as well as differences in coverage of psychometric and other assessment-related topics and in the extent to which lectures, labs, and practica are integrated.


Subjects
Education, Graduate; Personality Assessment; Psychology, Clinical/education; Teaching; Curriculum; Humans; Psychometrics