1.
J Am Med Inform Assoc; 31(4): 968-974, 2024 Apr 03.
Article in English | MEDLINE | ID: mdl-38383050

ABSTRACT

OBJECTIVE: To develop and evaluate a data-driven process for generating suggestions to improve alert criteria using explainable artificial intelligence (XAI) approaches.

METHODS: We extracted data on alerts generated from January 1, 2019 to December 31, 2020, at Vanderbilt University Medical Center. We developed machine learning models to predict user responses to alerts and applied XAI techniques to generate both global and local explanations. We evaluated the generated suggestions by comparing them with the alerts' historical change logs and through stakeholder interviews. Suggestions that matched (or partially matched) changes already made to an alert, or that were considered clinically correct, were classified as helpful.

RESULTS: The final dataset included 2,991,823 firings with 2689 features. Among the 5 machine learning models, the LightGBM model achieved the highest area under the ROC curve: 0.919 [0.918, 0.920]. We identified 96 helpful suggestions, and a total of 278,807 firings (9.3%) could have been eliminated. Some of the suggestions also revealed workflow and education issues.

CONCLUSION: We developed a data-driven process that uses XAI techniques to generate suggestions for improving alert criteria. Our approach can identify clinical decision support (CDS) improvements that might be overlooked or delayed in manual reviews. It also reveals a secondary use for XAI: improving quality by discovering scenarios where CDS alerts are not accepted because of workflow, education, or staffing issues.
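The methods describe a model-then-explain pipeline: train a classifier to predict user responses to alert firings, then derive global and local explanations from it. Below is a minimal sketch of that kind of pipeline using LightGBM and the SHAP library; the file name, column names, and hyperparameters are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch of the model-then-explain pipeline described in the
# abstract; the data source, column names, and hyperparameters are hypothetical.
import pandas as pd
import lightgbm as lgb
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical extract: one row per alert firing, one column per feature,
# plus a label for the user's response (e.g., accepted vs dismissed).
df = pd.read_parquet("alert_firings.parquet")
X = df.drop(columns=["accepted"])
y = df["accepted"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = lgb.LGBMClassifier(n_estimators=500, learning_rate=0.05)
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Global explanation: which features most influence predicted responses
# across all firings (candidates for alert criteria changes).
explainer = shap.TreeExplainer(model)
explanation = explainer(X_test)
shap.plots.beeswarm(explanation)

# Local explanation: why the model predicted a particular response
# for a single firing.
shap.plots.waterfall(explanation[0])
```

In this framing, global explanations point at features whose influence suggests an alert criterion to tighten or drop, while local explanations help a reviewer understand individual firings that users rejected.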


Subject(s)
Artificial Intelligence; Decision Support Systems, Clinical; Humans; Machine Learning; Academic Medical Centers; Educational Status
2.
Appl Clin Inform; 13(5): 1024-1032, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36288748

ABSTRACT

OBJECTIVES: To improve clinical decision support (CDS) by allowing users to provide real-time feedback when they interact with CDS tools and by creating processes for responding to and acting on this feedback.

METHODS: Two organizations implemented similar real-time feedback tools and processes in their electronic health records and gathered data over a 30-month period. At both sites, users could provide feedback through Likert-scale links embedded in all end-user-facing alerts, with responses stored outside the electronic health record, and through free-text comments entered when overriding an alert. Both feedback channels were monitored daily by clinical informatics teams.

RESULTS: The two sites received 2,639 Likert feedback comments and 623,270 override comments over the 30-month period. Through four case studies, we describe how end-user feedback was used to respond rapidly to build errors and to identify inaccurate knowledge management, user-interface issues, and unique workflows.

CONCLUSION: Feedback on CDS tools can be solicited in multiple ways, and it contains valuable, actionable suggestions for improving CDS alerts. End users also appreciate knowing that their feedback is being received, and they may offer further suggestions to improve the electronic health record. Incorporating end-user feedback into CDS monitoring, evaluation, and remediation is a way to improve CDS.
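As a rough illustration of the mechanics described here, the sketch below shows one way Likert feedback captured from alert links might be stored outside the EHR and triaged daily. The schema, field names, and triage rule are assumptions for illustration, not either site's implementation.

```python
# Hypothetical sketch: store Likert feedback outside the EHR and triage it
# daily; the schema and triage rule are assumptions, not either site's design.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AlertFeedback:
    alert_id: str        # which CDS alert fired
    rating: int          # Likert score, e.g., 1 (not useful) to 5 (useful)
    comment: str         # free-text comment; empty if none given
    timestamp: datetime  # when the feedback was submitted

def daily_triage(feedback: list[AlertFeedback]) -> dict[str, list[AlertFeedback]]:
    """Collect the last 24 hours of low-rated or commented feedback,
    grouped by alert, so the most problematic alerts surface first."""
    cutoff = datetime.now() - timedelta(days=1)
    queue: dict[str, list[AlertFeedback]] = defaultdict(list)
    for fb in feedback:
        if fb.timestamp >= cutoff and (fb.rating <= 2 or fb.comment):
            queue[fb.alert_id].append(fb)
    # Alerts generating the most actionable feedback come first.
    return dict(sorted(queue.items(), key=lambda kv: len(kv[1]), reverse=True))
```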


Subject(s)
Decision Support Systems, Clinical; Feedback; Electronic Health Records; Workflow
3.
J Am Med Inform Assoc; 29(6): 1050-1059, 2022 May 11.
Article in English | MEDLINE | ID: mdl-35244165

ABSTRACT

OBJECTIVE: We describe the Clickbusters initiative implemented at Vanderbilt University Medical Center (VUMC), which was designed to improve safety and quality and reduce burnout through the optimization of clinical decision support (CDS) alerts.

MATERIALS AND METHODS: We developed a 10-step Clickbusting process and implemented a program that included a curriculum, a CDS alert inventory, an oversight process, and gamification. We carried out two 3-month rounds of the Clickbusters program at VUMC and completed descriptive analyses of the changes made to alerts during the process and of alert firing rates before and after the program.

RESULTS: Prior to Clickbusters, VUMC had 419 CDS alerts in production, with 488,425 firings (42,982 interruptive) each week. After two rounds, the Clickbusters program had produced detailed, comprehensive reviews of 84 CDS alerts and reduced the number of weekly alert firings by more than 70,000 (15.43%). Beyond the direct improvements in CDS, the initiative also increased user engagement and involvement in CDS.

CONCLUSIONS: At VUMC, the Clickbusters program succeeded in optimizing CDS alerts by reducing alert firings and the resulting clicks. The program also involved more users in evaluating and improving CDS and helped build a culture of continuous evaluation and improvement of clinical content in the electronic health record.
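As a quick arithmetic check of the reported reduction: the abstract gives the pre-program weekly firing volume and the relative reduction, from which the absolute weekly reduction follows. The sketch below reproduces that calculation using only the figures quoted in the abstract.

```python
# Arithmetic check using only figures quoted in the abstract.
weekly_firings_before = 488_425   # weekly CDS alert firings before Clickbusters
relative_reduction = 0.1543       # reported 15.43% reduction

weekly_reduction = weekly_firings_before * relative_reduction
print(f"Weekly firings eliminated: ~{weekly_reduction:,.0f}")
# ~75,364 per week, consistent with the reported "more than 70,000"
```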


Subject(s)
Decision Support Systems, Clinical; Medical Order Entry Systems; Electronic Health Records; Humans