ABSTRACT
The field of Radiology is continually changing, requiring corresponding evolution in both medical student and resident training to adequately prepare the next generation of radiologists. With advancements in adult education theory and a deeper understanding of perception in imaging interpretation, expert educators are reshaping the training landscape by introducing innovative teaching methods to align with increased workload demands and emerging technologies. These include the use of peer and interdisciplinary teaching, gamification, case repositories, flipped-classroom models, social media, and drawing and comics. This publication aims to investigate these novel approaches and offer persuasive evidence supporting their incorporation into the updated Radiology curriculum.
Subject(s)
Curriculum , Radiologists , Radiology , Humans , Radiologists/education , Radiology/education

ABSTRACT
OBJECTIVE: The purpose was to evaluate reader variability between experienced and in-training radiologists in classifying COVID-19 pneumonia severity on chest radiographs (CXRs), and to create a multireader database suitable for AI development. METHODS: In this study, CXRs from polymerase chain reaction-positive COVID-19 patients were reviewed. Six experienced cardiothoracic radiologists and two residents classified each CXR by severity. One radiologist performed the classification twice to assess intraobserver variability. Severity was classified on a 4-class scale: normal (0), mild (1), moderate (2), and severe (3). A median severity score (Rad Med) across the six radiologists was determined for each CXR to develop a multireader database (XCOMS). Kendall tau correlation and percentage of disagreement were calculated to assess variability. RESULTS: A total of 397 patients (1208 CXRs) were included (mean age, 60 ± 1 years [SD]; 189 men). Interobserver variability between the radiologists ranged from 0.67 to 0.78. Compared with the Rad Med score, the radiologists showed good correlation (0.79-0.88). Residents showed slightly lower interobserver agreement of 0.66 with each other and 0.69-0.71 with experienced radiologists. Intraobserver agreement was high, with a correlation coefficient of 0.77. In 220 (18%), 707 (59%), 259 (21%), and 22 (2%) CXRs there was a 0-, 1-, 2-, or 3-class difference, respectively. In 594 (50%) CXRs the median scores of the residents and the radiologists were the same; in 578 (48%) and 36 (3%) CXRs there was a 1- and 2-class difference, respectively. CONCLUSION: Experienced and in-training radiologists demonstrate good inter- and intraobserver agreement in COVID-19 pneumonia severity classification. A higher percentage of disagreement was observed in moderate cases, which may affect the training of AI algorithms. ADVANCES IN KNOWLEDGE: Most AI algorithms are trained on data labeled by a single expert.
This study shows that for COVID-19 CXR severity classification there is significant variability and disagreement between radiologists and between residents.
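The multireader workflow described above (a per-image median "Rad Med" reference score, Kendall tau agreement, and percentage of disagreement) can be sketched in a few lines. This is a minimal illustration, not the authors' code: the `scores` array is hypothetical toy data on the abstract's 4-class scale, and the metrics are computed with NumPy and SciPy.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical severity scores: rows = CXRs, columns = 6 readers,
# on the abstract's scale 0 = normal, 1 = mild, 2 = moderate, 3 = severe.
scores = np.array([
    [0, 0, 1, 0, 0, 0],
    [1, 2, 1, 1, 2, 1],
    [2, 2, 3, 2, 2, 2],
    [3, 3, 3, 2, 3, 3],
])

# Median ("Rad Med") severity score per CXR across the six readers.
rad_med = np.median(scores, axis=1)

# Kendall tau correlation between one reader's scores and the median reference.
tau, p_value = kendalltau(scores[:, 0], rad_med)

# Percentage of CXRs on which that reader disagrees with the median score.
disagreement_pct = float(np.mean(scores[:, 0] != rad_med) * 100)
```

With this toy data, reader 0 matches the median reference exactly, so tau is 1.0 and the disagreement percentage is 0; real multireader data would show the partial agreement reported in the abstract.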
Subject(s)
COVID-19 , Algorithms , Artificial Intelligence , COVID-19/diagnostic imaging , Humans , Male , Middle Aged , Radiography, Thoracic , Radiologists , Retrospective Studies

Subject(s)
Education, Medical, Undergraduate , Students, Medical , Curriculum , Humans , Narration , Teaching

Subject(s)
Biomedical Research/education , Mentoring , Radiology/education , Students, Medical , Humans

ABSTRACT
The adage "a picture is worth a thousand words" holds true in medicine, especially so in radiology. Although the images radiologists interpret are highly detailed, there often is no substitute for a concise diagrammatic illustration. Medical illustrations can help to clarify anatomy, pathology, and procedures, relaying complex information in a simple and easily understandable format. Medical illustrations have become ubiquitous in medical education and are sought after for publications. Unfortunately, existing best-fit illustrations are not always available to complement discussion points. Thus, academicians are well served by the ability to produce their own illustrations. Although creating medical illustrations may seem unachievable to amateur artists, this is not necessarily the case. Digital illustration does not require the typical artistic skills needed for drawing with pen and paper or painting on a canvas. Radiologists of all skill levels, including those who do not view themselves as artistically inclined, can create their own high-quality original diagrams. Whether drawn freehand with a stylus or traced with a mouse, simple and complex digital works are within reach. However, the utility of illustration programs for radiologists is not inherently obvious, and discussion of useful features in the radiology literature is lacking. Digital illustration programs are accessible to most radiologists, and the process can be simplified to an easily approachable level, with illustration complexity left to the artist's discretion. Online supplemental material is available for this article. ©RSNA, 2018.