Interobserver Reliability of the Coronary Artery Disease Reporting and Data System in Clinical Practice.
Hu, Jiun-Yiing; Bergquist, Peter J; Hossain, Rydhwana; Ropp, Alan M; Kligerman, Seth; Amin, Sagar B; Sechrist, Jacob W; Patel, Priya; Jeudy, Jean; White, Charles S.
Affiliation
  • Hu JY; University of Maryland School of Medicine.
  • Bergquist PJ; Department of Radiology, MedStar Georgetown University Hospital, Washington, DC.
  • Hossain R; University of Maryland School of Medicine.
  • Ropp AM; Department of Diagnostic Radiology & Nuclear Medicine, University of Maryland School of Medicine, Baltimore, MD.
  • Kligerman S; Department of Diagnostic Radiology and Nuclear Medicine, University of Virginia School of Medicine, Charlottesville, VA.
  • Amin SB; Department of Radiology, University of California San Diego School of Medicine, San Diego, CA.
  • Sechrist JW; Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA.
  • Patel P; Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, PA.
  • Jeudy J; Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, PA.
  • White CS; University of Maryland School of Medicine.
J Thorac Imaging; 36(2): 95-101, 2021 Mar 01.
Article in English | MEDLINE | ID: mdl-32205820
ABSTRACT

PURPOSE:

This study aimed to evaluate interobserver reproducibility among cardiothoracic radiologists applying the Coronary Artery Disease Reporting and Data System (CAD-RADS) to describe atherosclerotic burden on coronary computed tomography angiography.

METHODS:

Forty clinical computed tomography angiography cases were retrospectively and independently evaluated by 3 attending and 2 fellow cardiothoracic radiologists using the CAD-RADS lexicon. Radiologists were blinded to patient history and underwent initial training on a practice set of 10 subjects. Interobserver reproducibility was assessed using an intraclass correlation coefficient (ICC) based on single-observer scores, absolute agreement, and a 2-way random-effects model. Nondiagnostic studies were excluded. The ICC was also computed for CAD-RADS scores grouped by management recommendation: absent (0), nonobstructive (1 to 2), and potentially obstructive (3 to 5) CAD.
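
For reference, the reliability statistic described above corresponds to ICC(2,1) in the Shrout-Fleiss taxonomy: a 2-way random-effects model with absolute agreement on single-observer scores. The following Python/NumPy sketch is a minimal, hypothetical illustration, not the authors' analysis code; the example score matrix and the grouping thresholds are assumptions for demonstration.

    import numpy as np

    def icc_2_1(ratings):
        """ICC(2,1): 2-way random-effects model, absolute agreement,
        single-rater scores. `ratings` is an (n_cases, k_raters) array."""
        n, k = ratings.shape
        grand = ratings.mean()
        ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()  # between cases
        ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()  # between raters
        ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
        ms_rows = ss_rows / (n - 1)
        ms_cols = ss_cols / (k - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (
            ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
        )

    # Hypothetical CAD-RADS scores: 6 cases rated by 5 radiologists (0-5).
    scores = np.array([
        [0, 0, 0, 0, 0],
        [1, 1, 2, 1, 1],
        [2, 2, 2, 3, 2],
        [3, 3, 3, 3, 4],
        [4, 4, 5, 4, 4],
        [5, 5, 5, 5, 5],
    ])

    # Collapse to the management groups used in the study:
    # 0 = absent, 1-2 = nonobstructive, 3-5 = potentially obstructive.
    groups = np.digitize(scores, bins=[1, 3])

    print(f"ICC(2,1), raw scores:        {icc_2_1(scores):.3f}")
    print(f"ICC(2,1), management groups: {icc_2_1(groups):.3f}")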

RESULTS:

Interobserver reproducibility was moderate to good (ICC 0.748, 95% confidence interval [CI] 0.639-0.842, P<0.0001), with higher agreement among cardiothoracic radiology fellows (ICC 0.853, 95% CI 0.730-0.922, P<0.0001) than among attending radiologists (ICC 0.711, 95% CI 0.568-0.824, P<0.0001). Interobserver reproducibility for clinical management categories was marginally lower (ICC 0.692, 95% CI 0.570-0.802, P<0.0001). The average percent agreement between pairs of radiologists was 84.74%. Percent agreement was significantly lower in the presence (M=62.22%, SD=15.17%) than in the absence (M=80.91%, SD=17.97%) of CAD-RADS modifiers, t(37.95)=3.566, P=0.001.
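
The fractional degrees of freedom in the reported t(37.95)=3.566 are consistent with Welch's unequal-variance t test comparing per-case percent agreement between cases with and without CAD-RADS modifiers. A minimal SciPy sketch follows; the group sizes and simulated agreement values are illustrative assumptions, not the study data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical per-case percent agreement, split by whether the case
    # carried a CAD-RADS modifier (e.g., S, V, or G). Illustrative only.
    with_modifier = rng.normal(62.22, 15.17, size=20).clip(0, 100)
    without_modifier = rng.normal(80.91, 17.97, size=20).clip(0, 100)

    # equal_var=False selects Welch's t test, which allows unequal group
    # variances and yields fractional degrees of freedom.
    t_stat, p_value = stats.ttest_ind(without_modifier, with_modifier,
                                      equal_var=False)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")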

CONCLUSIONS:

Interobserver reliability and agreement using the CAD-RADS lexicon are moderate to good in clinical practice. However, further investigation is needed to characterize the causes of interobserver disagreement that may lead to differences in management recommendations.
Subjects

Full text: 1 Database: MEDLINE Main subject: Coronary Artery Disease Language: English Publication year: 2021 Document type: Article
