Evaluating visual analytics for health informatics applications: a systematic review from the American Medical Informatics Association Visual Analytics Working Group Task Force on Evaluation.
Wu, Danny T Y; Chen, Annie T; Manning, John D; Levy-Fix, Gal; Backonja, Uba; Borland, David; Caban, Jesus J; Dowding, Dawn W; Hochheiser, Harry; Kagan, Vadim; Kandaswamy, Swaminathan; Kumar, Manish; Nunez, Alexis; Pan, Eric; Gotz, David.
Affiliations
  • Wu DTY; Department of Biomedical Informatics, University of Cincinnati, Cincinnati, Ohio, USA.
  • Chen AT; Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington, USA.
  • Manning JD; Department of Emergency Medicine, Atrium Health's Carolinas Medical Center, Charlotte, North Carolina, USA.
  • Levy-Fix G; Department of Biomedical Informatics, Columbia University, New York, New York, USA.
  • Backonja U; Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine, Seattle, Washington, USA; Nursing & Healthcare Leadership, University of Washington Tacoma, Tacoma, Washington, USA.
  • Borland D; Renaissance Computing Institute, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
  • Caban JJ; National Intrepid Center of Excellence, Walter Reed National Military Medical Center, Bethesda, Maryland, USA.
  • Dowding DW; Division of Nursing, Midwifery and Social Work, School of Health Sciences, University of Manchester, Manchester, United Kingdom.
  • Hochheiser H; Department of Biomedical Informatics and Intelligent Systems Program, University of Pittsburgh, Pittsburgh, Pennsylvania, USA.
  • Kagan V; SentiMetrix, Inc, Bethesda, Maryland, USA.
  • Kandaswamy S; Department of Mechanical and Industrial Engineering, University of Massachusetts at Amherst, Amherst, Massachusetts, USA.
  • Kumar M; MEASURE Evaluation, Carolina Population Center, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
  • Nunez A; Carolina Health Informatics Program, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
  • Pan E; Carolina Health Informatics Program, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.
J Am Med Inform Assoc; 26(4): 314-323, 2019 Apr 01.
Article in English | MEDLINE | ID: mdl-30840080
OBJECTIVE: This article reports the results of a systematic literature review on the evaluation of data visualizations and visual analytics technologies within the health informatics domain. The review aims to (1) characterize the variety of evaluation methods used within the health informatics community and (2) identify best practices.

METHODS: A systematic literature review was conducted following PRISMA guidelines. PubMed searches were conducted in February 2017 using search terms representing three key concepts of interest: health care settings, visualization, and evaluation. References of retrieved publications were also screened for eligibility. Data were extracted from included studies and analyzed using a PICOS framework: Participants, Interventions, Comparators, Outcomes, and Study Design.

RESULTS: After screening, 76 publications met the review criteria. Publications varied across all PICOS dimensions. The most common audience was healthcare providers (n = 43), and the most common data-gathering methods were direct observation (n = 30) and surveys (n = 27). About half of the publications focused on static, concentrated views of data with visuals (n = 36). Evaluations were heterogeneous in both setting and the measurements used.

DISCUSSION: A variety of approaches have been used to evaluate data visualizations and visual analytics technologies. Usability measures were used most often in early (prototype) implementations, whereas clinical outcomes were most common in evaluations of operationally deployed systems. These findings suggest opportunities both to expand evaluation practices and to innovate with respect to evaluation methods for data visualizations and visual analytics technologies across health settings.

CONCLUSION: Evaluation approaches are varied. New studies should adopt commonly reported metrics, context-appropriate study designs, and phased evaluation strategies.
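The Methods describe a three-concept PubMed search (health care settings AND visualization AND evaluation) followed by PICOS-based data extraction. The sketch below illustrates that workflow against the public NCBI E-utilities esearch endpoint; the Boolean concept blocks and the PICOSRecord helper are hypothetical assumptions for illustration, not the review's actual search strings or extraction instrument.

```python
"""Illustrative sketch only: the concept blocks below are hypothetical
placeholders, not the review's actual PubMed search strings."""
from dataclasses import dataclass
import requests

# Hypothetical Boolean blocks for the three concepts named in the Methods.
SETTINGS = '("health care"[tiab] OR hospital[tiab] OR clinic[tiab])'
VISUALIZATION = '("visual analytics"[tiab] OR "data visualization"[tiab] OR dashboard[tiab])'
EVALUATION = '(evaluation[tiab] OR usability[tiab] OR "user study"[tiab])'

query = f"{SETTINGS} AND {VISUALIZATION} AND {EVALUATION}"

# Public NCBI E-utilities search endpoint; returns matching PMIDs as JSON.
resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": query, "retmode": "json", "retmax": 200},
    timeout=30,
)
resp.raise_for_status()
pmids = resp.json()["esearchresult"]["idlist"]
print(f"{len(pmids)} candidate records to screen")


@dataclass
class PICOSRecord:
    """One extracted study, following the PICOS dimensions named in the abstract."""
    pmid: str
    participants: str   # e.g., healthcare providers
    interventions: str  # the visualization / visual analytics technology
    comparators: str
    outcomes: str       # e.g., usability measures, clinical outcomes
    study_design: str
```

Each PMID returned by the search would be screened against the review's eligibility criteria and, if included, summarized as one PICOSRecord for analysis.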

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Medical Informatics Applications / Evaluation Studies as Topic / Data Visualization Study type: Diagnostic_studies / Evaluation_studies / Guideline / Prognostic_studies / Qualitative_research / Risk_factors_studies / Systematic_reviews Language: En Publication year: 2019 Document type: Article