1.
Radiology ; 268(1): 219-27, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23793591

ABSTRACT

The American Board of Radiology (ABR) has provided certification for diagnostic radiologists and other specialists and subspecialists for more than 75 years. The Board certification process is a tangible expression of the social contract between the profession and the public: the profession enjoys the privilege of self-regulation and, in return, assures the public that medical professionals put patients' interests first, guarantees the competence of practitioners, and guards the public health. A primary tool used by the ABR in fulfilling this responsibility is the secure proctored examination. This article sets forth seven standards based on authoritative sources in the field of psychometrics (the science of mental measurements) and explains in each case how the ABR implements that standard. Readers are encouraged to understand that, whatever differing opinions may be held, these standards, developed over decades by experts using the scientific method, should be the central feature of any discussion or critique of examinations given for the privilege of professional practice and the safeguarding of the public well-being.


Subject(s)
Certification/standards , Educational Measurement , Radiology/education , Radiology/standards , Specialty Boards , Clinical Competence/standards , Humans , Professional Practice , Specialization , United States
2.
J Am Coll Radiol ; 19(5): 663-668, 2022 05.
Article in English | MEDLINE | ID: mdl-35341700

ABSTRACT

With the onset of the global coronavirus disease 2019 pandemic in early 2020, it became apparent that routine administration of the ABR Qualifying and Certifying Exams would be disrupted. The initial intent to postpone the examinations later gave way to a recognition that the existing delivery methodologies had to be replaced. Herein, the authors describe the conceptualization, development, administration, and future implications of the new remote examination delivery platforms.


Subject(s)
COVID-19 , Internship and Residency , Radiation Oncology , Certification , Educational Measurement , Forecasting , Humans , Radiation Oncology/education , Specialty Boards , United States
3.
AJR Am J Roentgenol ; 195(1): 10-2, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20566793

ABSTRACT

The purpose of this article is to inform radiology residents, program directors, and other interested parties about the processes involved in developing, administering, and scoring the American Board of Radiology (ABR) diagnostic radiology qualifying (written) examinations. Once certified, residents will have a lifelong professional relationship with the ABR. Because these are high-stakes examinations, it is important for the ABR to be transparent about the processes it uses to ensure that they are fair, valid, and reliable, so that residents and their program directors have accurate information about them.


Subject(s)
Certification , Education, Medical, Graduate/standards , Educational Measurement/standards , Radiology/education , Radiology/standards , Clinical Competence , Curriculum , Humans , Internship and Residency , Specialty Boards , United States , Writing
4.
AJR Am J Roentgenol ; 195(4): 820-4, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20858803

ABSTRACT

OBJECTIVE: This pilot study of a computer-based examination for primary certification by the American Board of Radiology was designed to compare individual candidates' performance on the oral examination with their performance on a computer-based examination. MATERIALS AND METHODS: The pilot computer-based pediatric radiology examination was designed by experienced oral board examiners and the pediatric subspecialty trustees. Images were chosen from the examination repository of the American Board of Radiology. The 20-minute examination was designed to include 8-10 cases with 26-31 scorable units covering all aspects of pediatric radiology. RESULTS: Of the 1,317 candidates taking the oral board examination, 1,048 (79.6%) participated in the voluntary pilot examination. The scores of the two examinations were subjected to statistical analysis. The sensitivity and specificity of the pilot examination were 94.5% and 45.7%, respectively, and the overall accuracy was 92.8%. Seventy-five candidates (7.2%) received different verdicts on the pilot examination and the pediatric radiology category of the oral examination: 56 (5.3%) failed the pilot examination but passed the oral pediatric radiology category, and 19 (1.8%) passed the pilot examination but failed the oral pediatric radiology test. Pilot examination scores were higher for candidates who passed the oral pediatric radiology category (median score, 80; interquartile range, 74.1-85.2) than for candidates who failed (median score, 65.4; interquartile range, 58.6-71.0) (p < 0.0001). CONCLUSION: With the score in the pediatric radiology category of the oral examination as the reference standard, the pilot examination was useful for differentiating passing candidates from failing candidates.
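The reported percentages imply a 2 x 2 agreement table against the oral-examination reference standard. Below is a minimal sketch that reconstructs that table and reproduces the published figures; the concordant cell counts (957 passes, 16 failures) are inferred from the stated percentages, not reported in the abstract.

```python
# Reconstructing the 2x2 agreement table implied by the reported figures.
# FN = 56 (failed pilot, passed oral) and FP = 19 (passed pilot, failed oral)
# are stated in the abstract; TP = 957 and TN = 16 are inferred values chosen
# so that the published sensitivity (94.5%), specificity (45.7%), and
# accuracy (92.8%) are reproduced. Reference standard: oral category result.
tp, fn = 957, 56   # candidates who passed the oral pediatric category
fp, tn = 19, 16    # candidates who failed the oral pediatric category

n = tp + fn + fp + tn                  # 1,048 pilot participants
sensitivity = tp / (tp + fn)           # pilot pass rate among oral passers
specificity = tn / (tn + fp)           # pilot fail rate among oral failers
accuracy = (tp + tn) / n               # concordant verdicts overall

print(f"sensitivity = {sensitivity:.1%}")  # 94.5%
print(f"specificity = {specificity:.1%}")  # 45.7%
print(f"accuracy    = {accuracy:.1%}")     # 92.8%
```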


Subject(s)
Certification/methods , Computers , Pediatrics , Radiology , Pilot Projects , United States
5.
Radiology ; 250(3): 658-64, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19164117

ABSTRACT

PURPOSE: To prospectively compare high-, mid-, and low-resolution off-the-shelf displays currently employed by commercial testing centers in terms of the visibility of lesion features needed to render a diagnostic decision when possible diagnoses are provided in a multiple-choice format during a maintenance of certification (MOC) examination. MATERIALS AND METHODS: The Psychometrics Division of the American Board of Radiology (ABR) approved the studies (human subjects and HIPAA compliant). One study compared 1280 x 1024 displays with 1024 x 768 displays; the second compared 1600 x 1200 with 1280 x 1024 displays. Images from nine subspecialties were used. In each study, observers viewed images twice, once on each display. Diagnoses were provided, and observers rated the visibility of diagnostic features. RESULTS: Of the 7,977 data pairs analyzed in study 1, the 1024 and 1280 displays received the same ratings for 5,726 pairs (72%); the 1024 display received a higher rating for 679 pairs (9%) and the 1280 display for 1,572 pairs (19%) (P < .0001). When rating differences existed, all subspecialties except nuclear medicine had significantly more high-visibility ratings with the 1280 display. Of the 1,090 data pairs analyzed in study 2, the 1280 and 1600 displays received the same ratings for 689 pairs (63%); the 1280 display received a higher rating for 162 pairs (15%) and the 1600 display for 239 pairs (22%) (P = .0001). When rating differences existed, only cardiopulmonary and musculoskeletal images had significantly more high-visibility ratings with the 1600 display. CONCLUSION: For the ABR MOC examinations, 1280 x 1024 displays should be used rather than 1024 x 768 displays, and 1600 x 1200 displays may be necessary for some images. Because good-quality images must be used on the examinations, native digital images rather than digitized film should be used.
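The abstract reports P values for the paired rating comparisons but does not name the statistical test; the counts are consistent with a sign test on the non-tied pairs. A minimal sketch under that assumption:

```python
# Sign test on the non-tied display-rating pairs (an assumption: the abstract
# does not name the test, but a sign test reproduces the reported P values).
from scipy.stats import binomtest

# Study 1: 1024x768 vs 1280x1024; 5,726 of 7,977 pairs were ties.
study1 = binomtest(k=1572, n=679 + 1572, p=0.5)
print(f"study 1: P = {study1.pvalue:.1e}")   # P < .0001, favoring 1280x1024

# Study 2: 1280x1024 vs 1600x1200; 689 of 1,090 pairs were ties.
study2 = binomtest(k=239, n=162 + 239, p=0.5)
print(f"study 2: P = {study2.pvalue:.1e}")   # ~.0001, favoring 1600x1200
```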


Subject(s)
Certification , Computer Terminals , Computer-Assisted Instruction/instrumentation , Educational Measurement/methods , Image Interpretation, Computer-Assisted/instrumentation , Specialty Boards , Computer-Assisted Instruction/methods , Equipment Design , Equipment Failure Analysis , Image Interpretation, Computer-Assisted/methods , United States
6.
J Am Coll Radiol ; 16(4 Pt A): 513-517, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30584037

ABSTRACT

PURPOSE: Since 2010, the ABR has administered triennial clinical practice analysis surveys to inform examination development volunteers and staff about the actual state of radiation oncology practice. METHODS AND MATERIALS: As reported here, the 2016 survey was designed to provide objective data regarding actual patient volumes for specific disease sites and subjective insight into the importance and relevance of site-specific therapy to individual practices. RESULTS: The survey instrument was circulated to 4,075 radiation oncologists listed in the membership database of the American Society for Radiation Oncology, and responses were received from 690 (16.9%); a total of 287 respondents (41.5%) self-identified as being in academic practice. Even in the academic setting, a majority (216 of 287, or 75.3%) indicated that they spend most of their time in clinical practice. CONCLUSIONS: Data from the survey are informative regarding changes in the practice of radiation oncology over the past 6 years.


Subject(s)
Radiation Oncology/trends , Humans , Societies, Medical , Specialty Boards , Surveys and Questionnaires , United States
10.
Pract Radiat Oncol ; 3(1): 74-78, 2013.
Article in English | MEDLINE | ID: mdl-24674266

ABSTRACT

PURPOSE: Oral examinations are used in certifying examinations by many medical specialty boards. They represent daily clinical practice situations more realistically than do written or computer-based tests. However, there are repeated concerns in the literature regarding objectivity, fairness, extraneous factors from interpersonal interactions, item bias, reliability, and validity. In this study, the reliability of the oral component of the radiation oncology certifying examination administered in May 2010 was analyzed. METHODS AND MATERIALS: One hundred fifty-two candidates rotated through 8 examination stations. Each station consisted of a hotel room equipped with a computer and software that exhibited images appropriate to the content area. Each candidate had a 25-30 minute face-to-face encounter with an oral examiner who was a content expert in one of the following areas: gastrointestinal, gynecology, genitourinary, lymphoma/leukemia/transplant/myeloma, head/neck/skin, breast, central nervous system/pediatrics, or lung/sarcoma. This design is typically referred to as a repeated-measures or subjects-by-treatments design, although the oral examination was a routine event without any experimental manipulation. RESULTS: The reliability coefficient was obtained by applying Feldt and Charter's simple computational alternative to analysis-of-variance formulas, which yielded a KR-20 (Cronbach's coefficient alpha) of 0.81. CONCLUSIONS: An experimental design for developing a blueprint to improve the consistency of evaluation is suggested.
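Feldt and Charter's shortcut avoids a full analysis of variance but yields the same coefficient as the standard variance-based formula for Cronbach's alpha. A minimal sketch of that standard formula; the 152 x 8 score matrix below is synthetic, since the 2010 data are not reproduced in the abstract:

```python
# Cronbach's coefficient alpha for a candidates-by-stations score matrix:
# alpha = k/(k-1) * (1 - sum of per-station variances / variance of totals).
# The matrix below is synthetic (152 candidates x 8 stations), tuned to give
# a coefficient near the reported 0.81; it is not the 2010 examination data.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: array of shape (n_candidates, n_stations)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-station variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of candidate totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(70, 10, size=(152, 1))            # latent candidate ability
scores = ability + rng.normal(0, 14, size=(152, 8))    # 8 noisy station scores
print(f"alpha = {cronbach_alpha(scores):.2f}")         # ~0.8 for this matrix
```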

11.
Int J Radiat Oncol Biol Phys ; 87(2): 237-45, 2013 Oct 01.
Article in English | MEDLINE | ID: mdl-23958146

ABSTRACT

The American Board of Radiology (ABR) has provided certification for diagnostic radiologists and other specialists and subspecialists for more than 75 years. The Board certification process is a tangible expression of the social contract between the profession and the public: the profession enjoys the privilege of self-regulation and, in return, assures the public that medical professionals put patients' interests first, guarantees the competence of practitioners, and guards the public health. A primary tool used by the ABR in fulfilling this responsibility is the secure proctored examination. This article sets forth seven standards based on authoritative sources in the field of psychometrics (the science of mental measurements) and explains in each case how the ABR implements that standard. Readers are encouraged to understand that, whatever differing opinions may be held, these standards, developed over decades by experts using the scientific method, should be the central feature of any discussion or critique of examinations given for the privilege of professional practice and the safeguarding of the public well-being.


Subject(s)
Certification/standards , Clinical Competence/standards , Governing Board/standards , Radiology/standards , Communication , Educational Measurement/standards , Patient Safety/standards , Professional Autonomy , Psychometrics , Quality Improvement , Reproducibility of Results , Social Responsibility
12.
J Am Coll Radiol ; 9(2): 121-8, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22305698

ABSTRACT

The ABR performs a practice analysis every 3 years, according to its strategic plan, in an effort to strengthen the content validity of its qualifying and certifying examinations as well as its maintenance of certification examinations. A nationwide survey of diagnostic radiologists was conducted in July 2010 to determine the critically important and frequently performed activities in 12 clinical categories. The survey instrument was distributed electronically to 17,721 members of the ACR, with a unique identification code for each individual. A 5-point scale was established for both the frequency and importance variables. The frequency scale ranged from 1 to 5: 1 = not applicable, 2 = occasionally, 3 = monthly, 4 = weekly, and 5 = daily. The importance scale also ranged from 1 to 5: 1 = not applicable, 2 = not important, 3 = somewhat important, 4 = important, and 5 = essential. A total of 2,909 diagnostic radiologists (19.32%) participated. Of these, 2,233 (76.76%) indicated that they spent ≥50% of their time in clinical practice. Because its list of activities is brief, the gastrointestinal category was chosen for presentation in this article. The list of activities, weighted according to importance and frequency, could become the foundation for developing a more detailed blueprint for the gastrointestinal portion of the certifying examinations in diagnostic radiology. Findings on demographic information are also presented.
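The abstract does not publish the weighting formula. A common practice-analysis convention, assumed here purely for illustration, multiplies mean importance by mean frequency to rank activities; a minimal sketch with hypothetical activity names and ratings:

```python
# Ranking activities by a combined weight (an assumption: the product of mean
# importance and mean frequency on the two 5-point scales; the ABR's actual
# formula is not given in the abstract). Names and ratings are illustrative.
from statistics import mean

responses = {
    "Interpret fluoroscopic esophagram": {"importance": [5, 4, 5, 4], "frequency": [4, 3, 4, 4]},
    "Interpret CT for appendicitis":     {"importance": [5, 5, 5, 4], "frequency": [5, 5, 4, 5]},
    "Perform defecography":              {"importance": [3, 2, 3, 3], "frequency": [2, 2, 1, 2]},
}

weights = {
    activity: mean(r["importance"]) * mean(r["frequency"])
    for activity, r in responses.items()
}
for activity, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{w:5.1f}  {activity}")
```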


Subject(s)
Certification , Practice Patterns, Physicians'/statistics & numerical data , Professional Practice/statistics & numerical data , Radiology/education , Radiology/standards , Specialty Boards , Workload/statistics & numerical data , Data Collection , Educational Measurement , United States
14.
J Am Coll Radiol ; 8(3): 199-202, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21371671

ABSTRACT

This report was prepared by individuals closely involved in the radiation oncology initial qualifying examinations. Its primary purpose is to disseminate information concerning the ABR's test preparation, test administration, scoring, and reporting processes. The authors hope that this information will be helpful to radiology residents, program directors, and other interested parties.


Subject(s)
Education, Medical, Graduate/standards , Educational Measurement/standards , Radiology/education , Radiology/standards , Certification , Clinical Competence , Curriculum , Humans , Internship and Residency , Specialty Boards , United States , Writing