Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: Review of a single clinical site.
Woznitza, N; Piper, K; Burke, S; Ellis, S; Bothamley, G.
  • Woznitza N; Radiology Department, Homerton University Hospital, United Kingdom; School of Allied Health Professions, Canterbury Christ Church University, United Kingdom. Electronic address: nicholas.woznitza@nhs.net.
  • Piper K; School of Allied Health Professions, Canterbury Christ Church University, United Kingdom. Electronic address: keith.piper@canterbury.ac.uk.
  • Burke S; Radiology Department, Homerton University Hospital, United Kingdom. Electronic address: stephen.burke@homerton.nhs.uk.
  • Ellis S; Radiology Department, Barts Health, United Kingdom. Electronic address: stephen.ellis@bartshealth.nhs.uk.
  • Bothamley G; Department of Respiratory Medicine, Homerton University Hospital, United Kingdom. Electronic address: graham.bothamley@homerton.nhs.uk.
Radiography (Lond). 2018 Aug;24(3):234-239.
Article in English | MEDLINE | ID: mdl-29976336
INTRODUCTION: To compare the clinical chest radiograph (CXR) reports provided by consultant radiologists and reporting radiographers with the reports of expert thoracic radiologists.

METHODS: Adult CXRs (n = 193) from a single site were included; 83% were randomly selected from CXRs performed over one year, and 17% were selected from the discrepancy meeting. Chest radiographs were independently interpreted by two expert thoracic radiologists (CTR1/2). Clinical history and previous and follow-up imaging were available, but not the original clinical report. Two arbiters independently compared the expert and clinical reports. Kappa (κ), chi-square (χ²) and McNemar tests were performed to determine inter-observer agreement.

RESULTS: CTR1 interpreted 187 (97%) and CTR2 186 (96%) of the CXRs, with 180 CXRs interpreted by both experts. Radiologists and radiographers provided 93 and 87 of the original clinical reports, respectively. Agreement between the expert thoracic radiologists and the radiographer clinical reports was 70 cases (CTR1; κ = 0.59) and 70 cases (CTR2; κ = 0.62), comparable to agreement between the expert thoracic radiologists and the radiologist clinical reports (CTR1: 76, κ = 0.60; CTR2: 75, κ = 0.62). The expert thoracic radiologists agreed with each other in 131 cases (κ = 0.48). There was no difference in agreement with either expert thoracic radiologist whether the clinical report was provided by a radiographer or a radiologist (CTR1: χ² = 0.056, p = 0.813; CTR2: χ² = 0.014, p = 0.906), nor when stratified by inter-expert agreement (radiographers: McNemar p = 0.629; radiologists: p = 0.701).

CONCLUSION: Even when weighted with chest radiographs reviewed at discrepancy meetings, the content of CXR reports from trained radiographers was indistinguishable from the content of reports issued by radiologists and expert thoracic radiologists.
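For context, the sketch below illustrates in Python the two main agreement statistics the abstract reports: Cohen's kappa (chance-corrected agreement between two raters) and an exact McNemar test on discordant pairs. It is a minimal illustration only, not the authors' analysis code, and the case labels and discordant-pair counts in it are hypothetical.

from collections import Counter
from math import comb

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same cases."""
    n = len(rater_a)
    # Observed proportion of cases on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[lab] / n) * (freq_b[lab] / n)
              for lab in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar p-value from the discordant-pair counts b and c
    (cases where exactly one of the two reports matched the expert)."""
    n, k = b + c, min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Hypothetical per-case classifications: 1 = abnormal, 0 = normal.
expert   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
clinical = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(f"kappa = {cohens_kappa(expert, clinical):.2f}")  # kappa = 0.60
print(f"McNemar p = {mcnemar_exact_p(5, 9):.3f}")       # invented counts

With these made-up labels the kappa works out to 0.60, in the same range as the values the abstract reports (κ ≈ 0.48-0.62); the McNemar inputs are likewise invented, since the study's actual discordant-pair counts are not given in the abstract.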

Full text: 1 Database: MEDLINE Main subject: Radiography, Thoracic / Clinical Competence / Consultants Study type: Observational_studies Limits: Adolescent / Adult / Aged / Humans / Middle aged Country as subject: Europe Language: En Year: 2018 Document type: Article
