1.
J Am Med Inform Assoc; 26(12): 1427-1436, 2019 Dec 1.
Article in English | MEDLINE | ID: mdl-31578568

ABSTRACT

OBJECTIVE: Emergency departments (EDs) continue to pursue optimal patient flow without sacrificing quality of care. The speed with which a healthcare provider receives pertinent information, such as results from clinical orders, can impact flow. We seek to determine if clinical ordering behavior can be predicted at triage during an ED visit.

MATERIALS AND METHODS: Using data available during triage, we trained multilabel machine learning classifiers to predict clinical orders placed during an ED visit. We benchmarked 4 classifiers with 2 multilabel learning frameworks that predict orders independently (binary relevance) or simultaneously (random k-labelsets). We evaluated algorithm performance, calculated variable importance, and conducted a simple simulation study to examine the effects of algorithm implementation on length of stay and cost.

RESULTS: Aggregate performance across orders was highest when predicting orders independently with a multilayer perceptron (median F1 score = 0.56), but prediction frameworks that simultaneously predict orders for a visit enhanced predictive performance for correlated orders. Visit acuity was the most important predictor for most orders. Simulation results indicated that direct implementation of the model would increase ordering costs (from $21 to $45 per visit) but reduce length of stay (from 158 minutes to 151 minutes) over all visits.

DISCUSSION: Simulated implementations of the predictive algorithm decreased length of stay but increased ordering costs. Optimal implementation of these predictions to reduce patient length of stay without incurring additional costs requires more exploration.

CONCLUSIONS: It is possible to predict common clinical orders placed during an ED visit with data available at triage.


Subject(s)
Diagnostic Tests, Routine/statistics & numerical data; Emergency Service, Hospital/organization & administration; Machine Learning; Benchmarking; Decision Support Systems, Clinical; Humans; Length of Stay; Practice Patterns, Physicians'
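
The methods in the abstract above contrast two multilabel framings: binary relevance (one classifier per clinical order) and random k-labelsets (joint prediction of order subsets). Below is a minimal, illustrative sketch of the binary-relevance framing with a multilayer perceptron in scikit-learn; the feature matrix, order labels, and data sizes are synthetic placeholders and are not the study's variables or results.

```python
# Illustrative sketch only: binary relevance treats each clinical order as an
# independent binary classification problem. Feature and label data here are
# synthetic; the study's triage variables and order set are not reproduced.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n_visits, n_features, n_orders = 2000, 12, 5

X = rng.normal(size=(n_visits, n_features))                # triage data (acuity, vitals, ...)
Y = (rng.random((n_visits, n_orders)) < 0.3).astype(int)   # one column per clinical order

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# Binary relevance: fit one MLP per order, independently of the other orders.
clf = MultiOutputClassifier(MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))
clf.fit(X_tr, Y_tr)

# Per-order F1 on held-out visits; the paper reports a median F1 of 0.56 across orders.
per_order_f1 = f1_score(Y_te, clf.predict(X_te), average=None)
print("median F1 across orders:", np.median(per_order_f1))
```

The random k-labelsets framework, which the paper found helpful for correlated orders, is not part of scikit-learn itself; third-party packages such as scikit-multilearn provide implementations that could be swapped in for the MultiOutputClassifier wrapper above.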
2.
J Grad Med Educ; 11(1): 85-91, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30805103

ABSTRACT

BACKGROUND: In 2017, the Maine Medical Center Graduate Medical Education Committee received an unprecedented number of requests (n = 18) to start new graduate medical education (GME) programs or expand existing programs. There was no process by which multiple programs could be prioritized to compete for scarce GME resources.

OBJECTIVE: We developed a framework to strategically assess and prioritize GME program expansion requests to yield the greatest benefits for patients, learners, and the institution as well as to meet regional and societal priorities.

METHODS: A systems engineering methodology called tradespace exploration was applied to a 6-step process to identify relevant categories and metrics. Programs' final scores were peer evaluated, and prioritization recommendations were made. Correlation analysis was used to evaluate the relevance of each category to final scores. Stakeholder feedback was solicited for process refinement.

RESULTS: Five categories relevant to GME expansion were identified: institutional priorities, health care system priorities, regional and societal needs, program quality, and financial considerations. All categories, except program quality, correlated well with final scores (R² range 0.413-0.662). Three of 18 requested programs were recommended for funding. A stakeholder survey revealed that almost half of respondents (48%, 14 of 29) agreed that the process was unbiased and inclusive. Focus group feedback noted that the process had been rigorous and deliberate, although communication could have been improved.

CONCLUSIONS: Applying a systems engineering approach to develop institution-specific metrics for assessing GME expansion requests provided a reproducible framework, allowing consideration of institutional, health care system, and regional societal needs, as well as program quality and funding considerations.


Subject(s)
Academic Medical Centers/organization & administration; Education, Medical, Graduate/methods; Internship and Residency/organization & administration; Strategic Planning; Training Support; Education, Medical, Graduate/organization & administration; Focus Groups; Humans; Internship and Residency/economics; Maine
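
The prioritization process in the abstract above reduces each expansion request to category scores and a weighted final score, then checks how strongly each category tracks the final ranking. A minimal sketch of that scoring-and-correlation step follows; the category weights, request scores, and all numbers are invented for illustration and are not drawn from the study.

```python
# Illustrative sketch only: category weights and request scores are invented.
import numpy as np

categories = ["institutional", "health_system", "regional_societal",
              "program_quality", "financial"]
weights = np.array([0.25, 0.25, 0.20, 0.15, 0.15])      # assumed weights, not the paper's

rng = np.random.default_rng(1)
scores = rng.uniform(1, 5, size=(18, len(categories)))  # 18 expansion requests, scored 1-5

final = scores @ weights                                 # weighted final score per request

# R^2 of each category against the final score, mirroring the correlation analysis.
for j, name in enumerate(categories):
    r = np.corrcoef(scores[:, j], final)[0, 1]
    print(f"{name}: R^2 = {r ** 2:.3f}")

# Rank requests by final score; top-ranked requests would be recommended for funding.
ranking = np.argsort(final)[::-1]
print("top 3 requests (0-indexed):", ranking[:3])
```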
3.
Acad Emerg Med; 20(11): 1156-63, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24238319

ABSTRACT

OBJECTIVES: The objective was to test the generalizability, across a range of hospital sizes and demographics, of a previously developed method for predicting and aggregating, in real time, the probabilities that emergency department (ED) patients will be admitted to a hospital inpatient unit.

METHODS: Logistic regression models were developed that estimate inpatient admission probabilities of each patient upon entering an ED. The models were based on retrospective development (n = 4,000 to 5,000 ED visits) and validation (n = 1,000 to 2,000 ED visits) data sets from four heterogeneous hospitals. Model performance was evaluated using retrospective test data sets (n = 1,000 to 2,000 ED visits). For one hospital, the developed model also was applied prospectively to a test data set (n = 910 ED visits) coded by triage nurses in real time, to compare results to those from the retrospective single investigator-coded test data set.

RESULTS: The prediction models for each hospital performed reasonably well and typically involved just a few simple-to-collect variables, which differed for each hospital. Areas under receiver operating characteristic curves (AUC) ranged from 0.80 to 0.89, R² correlation coefficients between predicted and actual daily admissions ranged from 0.58 to 0.90, and Hosmer-Lemeshow goodness-of-fit statistics of model accuracy had p > 0.01 with one exception. Data coded prospectively by triage nurses produced comparable results.

CONCLUSIONS: The accuracy of regression models to predict ED patient admission likelihood was shown to be generalizable across hospitals of different sizes, populations, and administrative structures. Each hospital used a unique combination of predictive factors that may reflect these differences. This approach performed equally well when hospital staff coded patient data in real time versus the research team retrospectively.


Subject(s)
Emergency Service, Hospital/statistics & numerical data; Hospitalization/statistics & numerical data; Adult; Aged; Aged, 80 and over; Female; Humans; Male; Middle Aged; Patient Admission/statistics & numerical data; Predictive Value of Tests; Retrospective Studies; Triage; United States
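
A minimal sketch of the kind of per-patient admission model tested in the study above, assuming synthetic triage features: a logistic regression produces an admission probability for each ED visit, and the fit is checked with the same style of metrics reported in the abstract (AUC and a Hosmer-Lemeshow goodness-of-fit statistic). The features, coefficients, and the hosmer_lemeshow helper are illustrative assumptions, not the study's specification.

```python
# Illustrative sketch only: features are synthetic stand-ins for hospital-specific
# triage variables; coefficients and data sizes are not the study's models.
import numpy as np
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 5000
X = rng.normal(size=(n, 3))                     # e.g., encoded age, acuity, arrival mode
p_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] + 1.2 * X[:, 1] - 0.5)))
y = (rng.random(n) < p_true).astype(int)        # 1 = admitted to an inpatient unit

model = LogisticRegression().fit(X[:4000], y[:4000])   # development set
p_hat = model.predict_proba(X[4000:])[:, 1]            # held-out test set
print("AUC:", roc_auc_score(y[4000:], p_hat))          # the study reports 0.80-0.89 across sites

def hosmer_lemeshow(y_obs, p_pred, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic over deciles of predicted risk."""
    bins = np.array_split(np.argsort(p_pred), groups)
    stat = 0.0
    for b in bins:
        obs, exp = y_obs[b].sum(), p_pred[b].sum()
        stat += (obs - exp) ** 2 / (exp * (1 - p_pred[b].mean()) + 1e-9)
    return stat, chi2.sf(stat, groups - 2)

print("Hosmer-Lemeshow chi2, p-value:", hosmer_lemeshow(y[4000:], p_hat))
```

Summing the per-patient probabilities over a day gives the aggregated daily-admission estimate whose correlation with actual admissions the study reports.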
4.
Acad Emerg Med; 19(9): E1045-54, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22978731

ABSTRACT

OBJECTIVES: The objectives were to evaluate three models that use information gathered during triage to predict, in real time, the number of emergency department (ED) patients who subsequently will be admitted to a hospital inpatient unit (IU) and to introduce a new methodology for implementing these predictions in the hospital setting.

METHODS: Three simple methods were compared for predicting hospital admission at ED triage: expert opinion, naïve Bayes conditional probability, and a generalized linear regression model with a logit link function (logit-linear). Two months of data were gathered from the Boston VA Healthcare System's 13-bed ED, which receives approximately 1,100 patients per month. Triage nurses were asked to estimate the likelihood that each of 767 triaged patients from that 2-month period would be admitted after their ED treatment, by placing them into one of six categories ranging from low to high likelihood. Logit-linear regression and naïve Bayes models also were developed using retrospective data and used to estimate admission probabilities for each patient who entered the ED within a 2-month time frame, during triage hours (1,160 patients). Predictors considered included patient age, primary complaint, provider, designation (ED or fast track), arrival mode, and urgency level (emergency severity index assigned at triage).

RESULTS: Of the three methods considered, logit-linear regression performed the best in predicting total bed need, with a receiver operating characteristic (ROC) area under the curve (AUC) of 0.887, an R² of 0.58, an average estimation error of 0.19 beds per day, and on average roughly 3.5 hours before peak demand occurred. Significant predictors were patient age, primary complaint, bed type designation, and arrival mode (p < 0.0001 for all factors). The naïve Bayes model had similar positive predictive value, with an AUC of 0.841 and an R² of 0.58, but with an average difference in total bed need of approximately 2.08 per day. Triage nurse expert opinion also had some predictive capability, with an R² of 0.52 and an average difference in total bed need of 1.87 per day.

CONCLUSIONS: Simple probability models can reasonably predict ED-to-IU patient volumes based on basic data gathered at triage. This predictive information could be used for improved real-time bed management, patient flow, and discharge processes. Both statistical models were reasonably accurate, using only a minimal number of readily available independent variables.


Subject(s)
Emergency Service, Hospital/organization & administration; Inpatients/statistics & numerical data; Patient Admission/statistics & numerical data; Patient Transfer/organization & administration; Triage; Adolescent; Adult; Aged; Aged, 80 and over; Bayes Theorem; Boston; Child; Emergency Medicine/organization & administration; Female; Hospitalization/statistics & numerical data; Humans; Length of Stay; Linear Models; Male; Middle Aged; Predictive Value of Tests; Total Quality Management; Waiting Lists; Young Adult
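
A minimal sketch of the comparison and aggregation step described in the abstract above, assuming synthetic triage data: a logit-linear model and a naïve Bayes model each estimate a per-patient admission probability, and summing those probabilities gives the expected number of inpatient beds needed, which is the real-time total-bed-need figure the study evaluates. All variable names, data, and outputs are placeholders, not the study's models or results.

```python
# Illustrative sketch only: synthetic data stands in for the study's triage predictors
# (age, primary complaint, designation, arrival mode, urgency level).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 1160
X = rng.normal(size=(n, 4))
p_true = 1 / (1 + np.exp(-(X[:, 0] + 0.7 * X[:, 1] - 0.3)))
y = (rng.random(n) < p_true).astype(int)             # 1 = admitted to an inpatient unit

logit = LogisticRegression().fit(X[:800], y[:800])   # logit-linear model
nb = GaussianNB().fit(X[:800], y[:800])              # naive Bayes model

for name, m in [("logit-linear", logit), ("naive Bayes", nb)]:
    p = m.predict_proba(X[800:])[:, 1]
    # Expected bed need = sum of individual admission probabilities over the period.
    print(f"{name}: AUC={roc_auc_score(y[800:], p):.3f}, "
          f"expected beds={p.sum():.1f}, actual admits={y[800:].sum()}")
```

Because the aggregation only needs each patient's probability, either model can feed the same downstream bed-management calculation; the study's comparison is over which model's probabilities are better calibrated.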