1.
AJNR Am J Neuroradiol ; 33(6): 1032-6, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22300933

ABSTRACT

BACKGROUND AND PURPOSE: Prior studies have found a 2%-8% clinically significant error rate in radiology practice. We compared discrepancy rates of studies interpreted by subspecialty-trained neuroradiologists working with and without trainees.

MATERIALS AND METHODS: Subspecialty-trained neuroradiologists reviewed 2162 studies during 41 months. Discrepancies between the original and "second opinion" reports were scored: 1, no change; 2, clinically insignificant detection discrepancy; 3, clinically insignificant interpretation discrepancy; 4, clinically significant detection discrepancy; and 5, clinically significant interpretation discrepancy. Faculty alone versus faculty and trainee discrepancy rates were calculated.

RESULTS: In 87.6% (1894/2162), there were no discrepancies with the original report. The neuroradiology division had a 1.8% (39/2162; 95% CI, 1.3%-2.5%) rate of clinically significant discrepancies. In cases reviewed solely by faculty neuroradiologists (350/2162, 16.2% of total), the rate of discrepancy was 1.7% (6/350). With fellows (1232/2162, 57.0% of total) and residents (580/2162, 26.8% of total), the rates of discrepancy were 1.6% (20/1232) and 2.2% (13/580), respectively. The odds of a discrepant result were 26% greater (OR = 1.26; 95% CI, 0.38-4.20) when reading with a resident and 8% less (OR = 0.92; 95% CI, 0.35-2.44) when reading with a fellow than when reading alone.

CONCLUSIONS: There was a 1.8% rate of clinically significant detection or interpretation discrepancy among academic neuroradiologists. The difference in the discrepancy rates between faculty only (1.7%), fellows and faculty (1.6%), and residents and faculty (2.2%) was not statistically significant but showed a trend indicating that reading with a resident increased the odds of a discrepant result.
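The odds ratios in the results can be checked against the raw counts. A minimal sketch, assuming a crude 2x2 comparison against the faculty-alone reference group (the published ORs, 1.26 and 0.92, are slightly different from these crude values, so they may reflect an adjusted model or rounding):

```python
def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table:
    exposed group (a discrepant / b not discrepant) vs.
    reference group (c discrepant / d not discrepant)."""
    return (a * d) / (b * c)

# Counts from the abstract:
# faculty alone (reference): 6 discrepant of 350 -> 344 not discrepant
# with resident: 13 of 580; with fellow: 20 of 1232
or_resident = odds_ratio(13, 580 - 13, 6, 350 - 6)
or_fellow = odds_ratio(20, 1232 - 20, 6, 350 - 6)
print(round(or_resident, 2), round(or_fellow, 2))  # crude ORs: 1.31 and 0.95
```

The wide confidence intervals reported (0.38-4.20 and 0.35-2.44) both span 1.0, which is why the abstract calls the difference a trend rather than a significant effect.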


Subject(s)
Brain Neoplasms/diagnosis , Faculty/statistics & numerical data , Neuroradiography/statistics & numerical data , Professional Competence/statistics & numerical data , Quality Control , Humans , Maryland , Neuroradiography/standards , Observer Variation , Reproducibility of Results , Sensitivity and Specificity , Training Support
2.
AJNR Am J Neuroradiol ; 33(1): 37-42, 2012 Jan.
Article in English | MEDLINE | ID: mdl-22033725

ABSTRACT

Prior studies have found a 3%-6% clinically significant error rate in radiology practice. We set out to assess discrepancy rates between subspecialty-trained university-based neuroradiologists. Over 17 months, university neuroradiologists randomly reviewed 1000 studies and reports of previously read examinations of patients in whom follow-up studies were read. The discrepancies between the original and "second opinion" reports were scored according to a 5-point scale: 1, no change; 2, clinically insignificant detection discrepancy; 3, clinically insignificant interpretation discrepancy; 4, clinically significant detection discrepancy; and 5, clinically significant interpretation discrepancy. Of the 1000 studies, 876 (87.6%) showed agreement with the original report. The neuroradiology division had a 2.0% (20/1000; 95% CI, 1.1%-2.9%) rate of clinically significant discrepancies involving 8 CTs and 12 MR images. Discrepancies were classified as vascular (n = 7), neoplastic (n = 9), congenital (n = 2), and artifacts (n = 2). Individual neuroradiologists' scores ranged from 0% to 7.7% ± 2.3% (n = 18). Both CT and MR imaging studies had a discrepancy rate of 2.0%. Our quality assessment study could serve as initial data before intervention as part of a practice quality improvement (PQI) project.
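The reported 95% CI for the 2.0% (20/1000) discrepancy rate can be reproduced with a standard normal-approximation (Wald) interval for a binomial proportion; a minimal sketch, assuming that is the method the authors used (the abstract does not state it):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = wald_ci(20, 1000)
print(f"{lo:.1%} - {hi:.1%}")  # 1.1% - 2.9%, matching the reported interval
```

With only 20 events, an exact or Wilson interval would be somewhat wider; the Wald interval happens to reproduce the published 1.1%-2.9% exactly at one decimal place.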


Subject(s)
Magnetic Resonance Imaging/statistics & numerical data , Neuroradiography/statistics & numerical data , Neuroradiography/standards , Observer Variation , Physicians/statistics & numerical data , Professional Competence/statistics & numerical data , Tomography, X-Ray Computed/statistics & numerical data , Maryland , Reproducibility of Results , Sensitivity and Specificity