ABSTRACT
OBJECTIVE: This article addresses the persistent challenge of Delayed Hospital Discharge (DHD) and aims to provide a comprehensive overview and an actionable, sustainable plan based on a synthesis of systematic review articles spanning the past 24 years. Our research comprehensively examines DHD, identifying its primary causes and emphasizing the significance of effective communication and management in healthcare settings. METHODS: We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) method to synthesize findings from 23 review papers published over the last two decades, encompassing over 700 studies. In addition, we employed a practical and comprehensive framework to tackle DHD. Rooted in Linderman's model, our approach focused on continuous process improvement (CPI), which highlights senior management commitment, technical/administrative support, and social/transitional care. Our proposed CPI method comprises several stages: planning, implementation, data analysis, and adaptation, all contributing to continuous improvement in healthcare delivery. This method provided valuable insights and recommendations for addressing DHD challenges. FINDINGS: Our analysis of DHD revealed crucial insights across multiple dimensions. First, examining causes and interventions uncovered issues such as limited discharge destinations, signaling unsustainable solutions and inefficient care coordination. The second theme explored the patient and caregiver experience, emphasizing challenges linked to staff uncertainty and negative physical environments, with particular attention to the underexplored area of caregiver experience. The third theme covered organizational and individual factors, including cognitive impairment and socioeconomic influences. The findings emphasized the importance of incorporating patients' data, recognizing both its complexity and the extent to which it is currently avoided.
Finally, the role of transitional and social care and of financial strategies was scrutinized, emphasizing the need for multicomponent, context-specific interventions to address DHD effectively. CONCLUSION: This study addresses gaps in the literature, challenges prevailing solutions, and offers practical pathways for reducing DHD, contributing significantly to healthcare quality and patient outcomes. The synthesis introduces a vital CPI stage, extending Linderman's work and providing a pragmatic framework for eradicating delayed discharge. Future work will include consultations with practitioners to broaden perspectives and further enrich the study. PATIENT OR PUBLIC CONTRIBUTION: Our scoping review synthesizes and analyzes existing systematic review articles, with an emphasis on offering practical, actionable solutions. While our approach does not directly engage patients, it strategically focuses on extracting insights from the literature to create a CPI framework. This design is intended to yield tangible benefits for patients, service users, caregivers, and the public. Our actionable recommendations aim to improve hospital discharge processes for better healthcare outcomes and experiences. This detailed analysis goes beyond theoretical considerations and provides a practical guide for improving healthcare practices and policies.
Subject(s)
Delivery of Health Care, Patient Discharge, Humans, Caregivers, Hospitals, Patients
ABSTRACT
The climate crisis significantly impacts the health and well-being of older adults, both directly and indirectly. This issue is of growing concern in Canada because of the country's rapidly accelerating warming trend and expanding older population. This article serves a threefold purpose: (i) outlining the impacts of the climate crisis on older adults, (ii) providing a descriptive review of existing policies with a specific focus on the Canadian context, and (iii) proposing actionable recommendations. Our review shows how current strategies, including early warning systems, enhanced infrastructure, sustainable urban planning, healthcare access, social support systems, and community engagement, are applied to enhance resilience and reduce health consequences among older adults. Within the Canadian context, we then emphasize the importance of establishing robust risk metrics and evaluation methods to prepare for and manage the impacts of the climate crisis efficiently. We underscore the value of vulnerability mapping, which uses geographic information to identify regions where older adults are most at risk, allowing for targeted interventions and resource allocation. We recommend employing a root cause analysis approach to tailor risk response strategies, along with a focus on promoting awareness, readiness, and physician training, and on fostering collaboration and benchmarking. These suggestions aim to enhance disaster risk management for the well-being and resilience of older adults in the face of the climate crisis.
Subject(s)
Disaster Planning, Disasters, Humans, Aged, Canada, Benchmarking, City Planning
ABSTRACT
The COVID-19 pandemic, as a massive disruption, has significantly increased the need for medical services, putting an unprecedented strain on health systems. This study presents a robust location-allocation model under uncertainty to increase the resiliency of health systems by applying alternative resources, such as backup and field hospitals and student nurses. A multi-objective optimization model is developed to minimize the system's costs and maximize the satisfaction rate among medical staff and COVID-19 patients. A robust approach is provided to handle data uncertainty, and a new mathematical model is derived to linearize a nonlinear constraint. ICU beds, ward beds, ventilators, and nurses are considered the four main capacity limitations of hospitals for admitting different types of COVID-19 patients. A sensitivity analysis is performed on a real-world case study to investigate the applicability of the proposed model. The results demonstrate the contribution of student nurses and backup and field hospitals in treating COVID-19 patients, and show that managing fluctuations in both the number of patients and available nurses yields more flexible decisions with lower risks for the system. The results showed that a reduction in the number of available nurses incurs higher costs for the system and lower satisfaction among patients and nurses. Moreover, the backup and field hospitals and the medical staff elevated the system's resiliency. By allocating backup hospitals to COVID-19 patients, only 37% of severe patients were lost, and this rate fell to less than 5% after establishing field hospitals. Moreover, medical students and field hospitals curbed costs and increased the satisfaction rate of nurses by 75%. Finally, the system was protected from failure by increasing the conservatism level: with a 2% growth in the price of robustness, the system saved 13%.
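The allocation logic at the heart of such a model can be illustrated with a drastically simplified, purely hypothetical sketch. The actual study solves a robust multi-objective mixed-integer program; the hospital tiers, bed counts, and costs below are invented for illustration only.

```python
# Illustrative simplification of the location-allocation idea: assign
# patient cohorts to hospital tiers (main, backup, field) greedily at
# minimum cost subject to bed capacity. All names and numbers are
# hypothetical; the paper's model is a robust multi-objective MIP.

def allocate(patients, hospitals):
    """Greedy min-cost assignment; returns (assignments, lost count)."""
    lost = 0
    assignments = []
    for severity, cost_key in patients:
        # Prefer the cheapest hospital tier that still has a free bed.
        candidates = [h for h in hospitals if h["beds"] > 0]
        if not candidates:
            lost += 1          # patient cannot be admitted anywhere
            continue
        best = min(candidates, key=lambda h: h["cost"][cost_key])
        best["beds"] -= 1
        assignments.append((severity, best["name"]))
    return assignments, lost

hospitals = [
    {"name": "main",   "beds": 2, "cost": {"icu": 5, "ward": 2}},
    {"name": "backup", "beds": 2, "cost": {"icu": 7, "ward": 3}},
    {"name": "field",  "beds": 1, "cost": {"icu": 9, "ward": 4}},
]
patients = [("severe", "icu"), ("mild", "ward"), ("severe", "icu"),
            ("mild", "ward"), ("severe", "icu"), ("mild", "ward")]

assigned, lost = allocate(patients, hospitals)
print(len(assigned), lost)   # 5 patients placed, 1 lost
```

The sketch shows why adding backup and field capacity reduces the lost-patient rate: removing the field hospital from the list would lose two patients instead of one.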
ABSTRACT
In learning causal networks, typically cross-sectional data are used and the sequence among the network nodes is learned through conditional independence. Sequence, however, is inherently a longitudinal concept. We propose to learn the sequence of events from longitudinal data and use it to orient arc directions in a network learned from cross-sectional data. The network is learned from cross-sectional data using various established algorithms, with one modification: arc directions that do not agree with the longitudinal sequence are prohibited. We established longitudinal sequence through two methods: the Probabilistic Contrast method and the Goodman and Kruskal error reduction method. In simulated data, the error reduction method was used to learn the sequence in the data. The procedure reduced the number of arc direction errors, and larger improvements were observed with an increasing number of events in the network. In real data, different algorithms were used to learn the network from cross-sectional data while prohibiting arc directions not supported by longitudinal information. The agreement among learned networks increased significantly. It is possible to combine sequence information learned from longitudinal data with algorithms designed for learning network models from cross-sectional data. Such models may have additional causal interpretation, as they more explicitly take into account the observed sequence of events.
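The Goodman and Kruskal error reduction measure mentioned above (often called lambda) can be sketched in a few lines: it asks how much knowing one variable reduces the error in guessing the other, and the asymmetry between the two directions is what suggests an arc orientation. The data below are toy values, not from the study.

```python
# Minimal sketch of the Goodman-Kruskal lambda (proportional reduction
# in error). An asymmetry -- x predicts y better than y predicts x --
# supports orienting the arc x -> y. Toy data for illustration.
from collections import Counter

def gk_lambda(pairs):
    """Lambda for predicting y from x over a list of (x, y) pairs."""
    y_counts = Counter(y for _, y in pairs)
    baseline_errors = len(pairs) - max(y_counts.values())
    by_x = {}
    for x, y in pairs:
        by_x.setdefault(x, Counter())[y] += 1
    errors_given_x = sum(sum(ys.values()) - max(ys.values())
                         for ys in by_x.values())
    if baseline_errors == 0:
        return 0.0
    return (baseline_errors - errors_given_x) / baseline_errors

# x determines y perfectly, but y does not determine x.
data = [(0, 0), (1, 1), (2, 0), (0, 0), (1, 1), (2, 0)]
forward = gk_lambda(data)                        # predict y from x
backward = gk_lambda([(y, x) for x, y in data])  # predict x from y
print(forward, backward)                         # 1.0 vs 0.5
```

Here the forward direction eliminates all prediction error while the backward direction eliminates only half, so the arc would be oriented forward.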
Subject(s)
Causality, Epidemiologic Studies, Models, Statistical, Root Cause Analysis/methods, Algorithms, Bias, Computer Simulation, Cross-Sectional Studies, Humans, Longitudinal Studies
ABSTRACT
Independently performing activities of daily living (ADLs) is vital for maintaining one's quality of life. Losing this ability can significantly impact an individual's overall health status, including their mental health and social well-being. Aging is an important factor contributing to the loss of ADL abilities, and our study focuses on investigating the trajectories of functional decline and recovery in older adults. Employing trajectory analytics methodologies, this research delves into the intricate dynamics of ADL pathways, unveiling their complexity, diversity, and inherent characteristics. The study leverages a substantial dataset encompassing ADL assessments of nursing home residents with diverse disability profiles in the United States. The investigation begins by transforming these assessments into sequences of disability combinations, followed by applying various statistical measures, indicators, and visual analytics. Valuable insights are gained into the typical disability states, transitions, and patterns over time. The results also indicate that while predicting the progression of ADL disabilities presents manageable challenges, the duration of these states proves more complicated. Our findings hold significant potential for improving healthcare decision-making by enabling clinicians to anticipate possible patterns, develop targeted and effective interventions that support older patients in preserving their independence, and enhance overall care quality.
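The first analytic step described above, turning repeated ADL assessments into sequences of disability states and tallying the transitions among them, can be sketched as follows. The state labels and resident sequences are invented for illustration.

```python
# Sketch of transforming ADL assessments into disability-state
# sequences and counting state transitions. State codes (e.g. "B" =
# bathing deficit, "BD" = bathing + dressing) are hypothetical.
from collections import Counter

def transitions(sequences):
    """Count (from_state, to_state) pairs across all sequences."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    return counts

# Each list is one resident's state at successive assessments.
residents = [
    ["independent", "B", "BD", "BD"],
    ["independent", "B", "B", "BD"],
    ["B", "BD", "B"],          # includes a partial recovery
]
counts = transitions(residents)
print(counts[("B", "BD")])     # the most common decline transition
```

From such counts one can estimate transition probabilities, typical state durations (self-transitions), and recovery rates (transitions toward fewer deficits), which are the quantities the study reports.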
Subject(s)
Activities of Daily Living, Disabled Persons, Humans, United States, Aged, Quality of Life, Aging, Mental Health
ABSTRACT
INTRODUCTION: The closest emergency department (ED) may not always be the optimal hospital for certain stable high-acuity patients if more distant EDs can provide specialized care or are less overcrowded. Machine learning (ML) predictions may support paramedic decision-making to transport a subgroup of emergent patients to a more suitable, albeit more distant, ED if hospital admission is unlikely. We examined whether characteristics known to paramedics in the prehospital setting were predictive of hospital admission in emergent-acuity patients. MATERIALS AND METHODS: We conducted a population-level cohort study using four ML algorithms to analyze ED visits in the National Ambulatory Care Reporting System from January 1, 2018 to December 31, 2019 in Ontario, Canada. We included all adult patients (≥18 years) transported to the ED by paramedics with an emergent Canadian Triage Acuity Scale score. We included eight characteristic classes, recorded at ED triage, as model predictors. All ML algorithms were trained and assessed using 10-fold cross-validation to predict hospital admission from the ED. Predictive model performance was determined using the area under the curve (AUC) with 95% confidence intervals, and probabilistic accuracy using the Brier Scaled score. Variable importance scores were computed to determine the top 10 predictors of hospital admission. RESULTS: All machine learning algorithms demonstrated acceptable accuracy in predicting hospital admission (AUC 0.77-0.78, Brier Scaled 0.22-0.24). The characteristics most predictive of admission were age between 65 and 105 years, referral source from a residential care facility, presenting with a respiratory complaint, and receiving home care. DISCUSSION: Hospital admission was accurately predicted based on patient characteristics known to paramedics before arrival at the hospital.
Our results support consideration of policy modification to permit certain emergent-acuity patients to be transported to a more distant ED. Additionally, this study demonstrates the utility of ML in paramedic and prehospital research.
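The two performance measures used above have simple definitions that a short sketch can make concrete: AUC is the probability that a randomly chosen admitted patient receives a higher predicted score than a randomly chosen discharged one, and the (unscaled) Brier score is the mean squared error of the predicted probabilities. The labels and scores below are toy values, not the study's.

```python
# Hedged sketch of AUC and the (unscaled) Brier score on toy data.
# The study reports a *scaled* Brier score; this shows the raw form.

def auc(labels, scores):
    """Rank-based AUC: P(score of a positive > score of a negative)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(labels, scores):
    """Mean squared error of predicted probabilities."""
    return sum((y - s) ** 2 for y, s in zip(labels, scores)) / len(labels)

y = [1, 0, 1, 0, 1, 0]                  # 1 = admitted, 0 = discharged
p = [0.9, 0.2, 0.35, 0.4, 0.6, 0.3]    # toy predicted probabilities
print(auc(y, p), round(brier(y, p), 3))
```

A model that ranked every admitted patient above every discharged one would reach AUC 1.0; the single mis-ranked pair here (0.35 vs 0.4) pulls the AUC to 8/9.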
Subject(s)
Paramedics, Research Design, Adult, Humans, Aged, Aged, 80 and over, Cohort Studies, Hospitals, Emergency Service, Hospital, Machine Learning, Ontario
ABSTRACT
OBJECTIVE: To assess the impacts of multiple chronic conditions (MCC) and frailty on 30-day post-discharge readmission and mortality among older patients with delayed discharge. DATA SOURCE/EXTRACTION: We used a retrospective cohort of older patients in the Discharge Abstract Database (DAD) between 2004 and 2017 in Ontario, Canada. We extracted data on patients aged ≥ 65 who experienced delayed discharge during hospitalization (N = 353,106). STUDY DESIGN: We measured MCC and frailty using the Elixhauser Comorbidity Index (ECI) and the Hospital Frailty Risk Score (HFRS), respectively. We used multinomial logistic regression to model the main and interactive effects of MCC and frailty on the adverse outcomes. PRINCIPAL FINDINGS: After adjusting for sex, discharge destination, urban/rural residency, wait time for alternative care, and socioeconomic status, the coexistence of MCC and high frailty increased the relative risk of 30-day mortality and readmission when compared to the reference group, i.e., non-MCC patients with low-to-moderate frailty. CONCLUSIONS: Multimorbidity and frailty each provide unique information about adverse outcomes among older patients with delayed discharge but are most informative when examined in unison. IMPLICATIONS FOR HEALTH POLICY: To minimize the risk of adverse outcomes among older delayed discharge patients, discharge planning must be tailored to their concurrent multimorbidity and frailty status.
Subject(s)
Frailty, Aftercare, Aged, Frail Elderly, Frailty/epidemiology, Health Policy, Humans, Multimorbidity, Ontario/epidemiology, Patient Discharge, Retrospective Studies
ABSTRACT
BACKGROUND: Patient complexity among older delayed-discharge patients complicates discharge planning, resulting in a higher rate of adverse outcomes, such as readmission and mortality. Early prediction of multimorbidity, a common indicator of patient complexity, can support proactive discharge planning by prioritizing complex patients and reducing healthcare inefficiencies. OBJECTIVE: We set out to accomplish two objectives: 1) to examine the predictability of three common multimorbidity indices, the Charlson-Deyo Comorbidity Index (CDCI), the Elixhauser Comorbidity Index (ECI), and the Functional Comorbidity Index (FCI), using machine learning (ML), and 2) to assess the prognostic power of these indices in predicting 30-day readmission and mortality. MATERIALS AND METHODS: We used data comprising 163,983 observations of patients aged 65 and older who experienced discharge delay in Ontario, Canada, during 2004-2017. First, we utilized various classification ML algorithms, including classification and regression trees, random forests, bagging trees, extreme gradient boosting, and logistic regression, to predict multimorbidity status based on the CDCI, ECI, and FCI. Second, we used adjusted multinomial logistic regression to assess the association between the multimorbidity indices and the patient-important outcomes of 30-day mortality and readmission. RESULTS: For all ML algorithms and regardless of the predictive performance criteria, better predictions were established for the CDCI compared with the ECI and FCI. Remarkably, the most predictable multimorbidity index (i.e., CDCI with Area Under the Receiver Operating Characteristic Curve = 0.80, 95% CI = 0.79 - 0.81) also offered the strongest prognostic value for adverse events (RRRmortality = 3.44, 95% CI = 3.21 - 3.68 and RRRreadmission = 1.36, 95% CI = 1.31 - 1.40).
CONCLUSIONS: Our findings highlight the feasibility and utility of predicting multimorbidity status using ML algorithms, resulting in the early detection of patients at risk of mortality and readmission. This can support proactive triage and decision-making about staffing and resource allocation, with the goal of optimizing patient outcomes and facilitating an upstream and informed discharge process through prioritizing complex patients for discharge and providing patient-centered care.
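The comorbidity indices compared above are, at their core, weighted sums over flagged conditions. A minimal sketch of a Charlson-style score follows; only a subset of conditions is shown, the weights follow the commonly cited Charlson scheme, and the study itself applied the full Charlson-Deyo mapping from administrative diagnosis codes.

```python
# Minimal sketch of a Charlson-style comorbidity score: sum fixed
# weights over the conditions present for one patient. Subset of
# conditions only; weights follow the widely cited Charlson scheme.

CHARLSON_WEIGHTS = {
    "myocardial_infarction": 1,
    "congestive_heart_failure": 1,
    "dementia": 1,
    "diabetes": 1,
    "hemiplegia": 2,
    "renal_disease": 2,
    "any_malignancy": 2,
    "metastatic_solid_tumor": 6,
}

def charlson_score(conditions):
    """Sum the weights of the conditions present for one patient."""
    return sum(CHARLSON_WEIGHTS[c] for c in conditions)

patient = ["diabetes", "renal_disease", "congestive_heart_failure"]
print(charlson_score(patient))  # 1 + 2 + 1 = 4
```

In the study's setting, such scores (or the underlying condition flags) become the targets and features that the ML classifiers learn to predict.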
Subject(s)
Multimorbidity, Patient Discharge, Aged, Humans, Machine Learning, Ontario, Patient Readmission
ABSTRACT
Sepsis is a major public and global health concern. Every hour of delay in detecting sepsis significantly increases the risk of death, highlighting the importance of accurately predicting sepsis in a timely manner. A growing body of literature has examined developing new machine learning (ML) approaches, or improving existing ones, for timely and accurate prediction of sepsis. This study contributes to this literature by providing clear insights regarding the role of the recency and adequacy of historical information in predicting sepsis using ML. To this end, we implemented a deep learning model using a bidirectional long short-term memory (BiLSTM) algorithm and compared it with six other ML algorithms across numerous combinations of prediction horizons (to capture information recency) and observation windows (to capture information adequacy), using different measures of predictive performance. Our results indicated that the BiLSTM algorithm outperforms all other ML algorithms and provides strong separability of predicted sepsis risk between septic and non-septic patients. Moreover, decreasing the prediction horizon (in favor of information recency) always boosts predictive performance; however, the impact of expanding the observation window (in favor of information adequacy) depends on the prediction horizon and the purpose of prediction. More specifically, when the prediction is responsive to the positive label (i.e., sepsis), increasing historical data improves predictive performance when the prediction horizon is short to moderate.
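The windowing scheme described above, an observation window that ends one prediction horizon before the prediction time, can be made concrete with a short sketch. The hourly series and parameter values are toy numbers, not the study's data.

```python
# Sketch of the observation-window / prediction-horizon scheme: the
# feature window covers `obs` hours ending `horizon` hours before the
# prediction time t. Toy hourly values for illustration.

def make_example(series, t, obs, horizon):
    """Return the obs-hour window ending at t - horizon, or None."""
    start = t - horizon - obs
    if start < 0:
        return None                     # not enough history available
    return series[start : t - horizon]

hourly_hr = [80, 82, 85, 90, 95, 101, 110, 118]  # heart rate per hour
window = make_example(hourly_hr, t=7, obs=3, horizon=2)
print(window)   # the 3 hours ending 2 hours before t = 7
```

Shrinking `horizon` moves the window closer to the prediction time (more recent information); growing `obs` lengthens the window (more adequate history), which is exactly the trade-off the study varies.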
Subject(s)
Machine Learning, Sepsis/diagnosis, Algorithms, Humans, Prognosis, Risk
ABSTRACT
BACKGROUND: Emergency departments (ED) are a portal of entry into the hospital and are uniquely positioned to influence the health care trajectories of older adults seeking medical attention. Older adults present to the ED with distinct needs and complex medical histories, which can make disposition planning more challenging. Machine learning (ML) approaches have been previously used to inform decision-making surrounding ED disposition in the general population. However, little is known about the performance and utility of ML methods in predicting hospital admission among older ED patients. We applied a series of ML algorithms to predict ED admission in older adults and discuss their clinical and policy implications. MATERIALS AND METHODS: We analyzed the Canadian data from the interRAI multinational ED study, the largest prospective cohort study of older ED patients to date. The data included 2274 ED patients 75 years of age and older from eight ED sites across Canada between November 2009 and April 2012. Data were extracted from the interRAI ED Contact Assessment, with predictors including a series of geriatric syndromes, functional assessments, and baseline care needs. We applied a total of five ML algorithms. Models were trained, assessed, and analyzed using 10-fold cross-validation. The performance of predictive models was measured using the area under the receiver operating characteristic curve (AUC). We also report the accuracy, sensitivity, and specificity of each model to supplement performance interpretation. RESULTS: Gradient boosted trees was the most accurate model to predict older ED patients who would require hospitalization (AUC = 0.80). The five most informative features include home intravenous therapy, time of ED presentation, a requirement for formal support services, independence in walking, and the presence of an unstable medical condition. 
CONCLUSION: To the best of our knowledge, this is the first study to predict hospital admission in older ED patients using a series of geriatric syndromes and functional assessments. We were able to predict hospital admission in older ED patients with good accuracy using the items available in the interRAI ED Contact Assessment. This information can be used to inform decision-making about ED disposition and may expedite admission processes and proactive discharge planning.
Subject(s)
Emergency Service, Hospital/statistics & numerical data, Hospitalization/statistics & numerical data, Machine Learning, Risk Assessment/methods, Aged, Aged, 80 and over, Canada, Female, Geriatric Assessment, Humans, Male, Prospective Studies, ROC Curve
ABSTRACT
PURPOSE OF THE STUDY: This study provides benchmarks for the likelihood of, number of days until, and sequence of functional decline and recovery. DESIGN AND METHODS: We analyzed activities of daily living (ADLs) of 296,051 residents in Veterans Affairs nursing homes between January 1, 2000 and October 9, 2012. ADLs were extracted from standard minimum data set assessments. Because of significant overlap between short- and long-stay residents, we did not distinguish between these populations. Twenty-five combinations of ADL deficits described the experience of 84.3% of all residents. A network model described transitions among these 25 combinations. The network was used to calculate the shortest, longest, and maximum-likelihood paths using backward induction. Longitudinal data were used to derive a Bayesian network that preserved the sequence of occurrence of 9 ADL deficits. RESULTS: The majority of residents (57%) followed 4 pathways in loss of function. The most likely sequence, in order of occurrence, was bathing, grooming, walking, dressing, toileting, bowel continence, urinary continence, transferring, and feeding. The other three paths occurred with reversals in the order of dressing/toileting and bowel/urinary continence. ADL impairments persisted without any change for an average of 164 days (SD = 62). Residents recovered partially or completely from a single impairment in 57% of cases, over an average of 119 days (SD = 41). Recovery rates declined as residents developed more than 4 impairments. IMPLICATIONS: Recovery of deficits among those studied followed a relatively predictable path, and although more than half recovered from a single functional deficit, recovery took more than 100 days on average, suggesting that recovery often unfolds over many months.
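The backward-induction computation of a maximum-likelihood path through a network of states can be sketched with a small dynamic program. The states and transition probabilities below are invented for illustration; the study's network had 25 deficit combinations.

```python
# Sketch of backward induction on a DAG of disability states: find
# the most likely path from a start state to an absorbing state.
# States and transition probabilities are hypothetical.

def most_likely_path(trans, start, goal):
    """Viterbi-style dynamic programming; assumes trans is a DAG."""
    best = {goal: (1.0, [goal])}

    def solve(state):
        if state in best:
            return best[state]
        options = []
        for nxt, p in trans.get(state, {}).items():
            prob, path = solve(nxt)
            options.append((p * prob, [state] + path))
        best[state] = max(options) if options else (0.0, [state])
        return best[state]

    return solve(start)

trans = {
    "bathing": {"grooming": 0.6, "walking": 0.4},
    "grooming": {"walking": 0.9},
    "walking": {"dressing": 1.0},
    "dressing": {},
}
prob, path = most_likely_path(trans, "bathing", "dressing")
print(path, round(prob, 2))
```

Replacing `max` with `min`, or weighting edges by expected days instead of probabilities, yields the shortest and longest paths the study also reports.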
Subject(s)
Activities of Daily Living, Cognition Disorders/physiopathology, Cognition/physiology, Geriatric Assessment/methods, Nursing Homes, Recovery of Function, Walking/physiology, Aged, Aged, 80 and over, Female, Humans, Male, Retrospective Studies, Risk Factors, United States
ABSTRACT
BACKGROUND: Improvement teams make causal inferences, but the methods they use are based on statistical associations. This article shows how data and statistical models can be used to help improvement teams make causal inferences and find the root causes of problems. METHODS: This article uses attribution data, competing risk survival analysis, and Bayesian network probabilities to analyze excessive emergency department (ED) stays within one hospital. We use data recorded by ED clinicians that attributed excessive ED stays to 23 causes across the 70,049 ED visits between March 2011 and April 2014. We use competing risk survival analysis to identify the contribution of each cause to the delay. We use Bayesian network models to analyze interactions among different causes of excessive stays and find the root causes of this problem. RESULTS: This article shows the utility of causal analysis in helping improvement teams focus on the root causes of problems. For the example analyzed in the article, most causes of patients' excessive ED stays were related to hospital operations outside the ED. Therefore, improvement projects inside the ED, such as expanding the ED, increasing ED staffing, or improving ED operations, are less likely to have a positive impact on reducing excessive ED stays. On the contrary, interventions that improve hospital occupancy (better discharge, expansion of beds, etc.) or improve laboratory response times are more likely to result in positive outcomes.
Subject(s)
Emergency Service, Hospital/organization & administration, Root Cause Analysis, Time-to-Treatment, Bayes Theorem, Humans, Length of Stay, Quality Improvement, Survival Analysis
ABSTRACT
We examine the role of a common cognitive heuristic in unsupervised learning of Bayesian probability networks from data. Human beings perceive a larger association in causal than in diagnostic relationships. This psychological principle can be used to orient the arcs within Bayesian networks by prohibiting the direction that is less predictive. The heuristic increased predictive accuracy by an average of 0.51%, a small amount. It also increased total agreement between different network learning algorithms (Max Spanning Tree, Taboo, EQ, SopLeq, and Taboo Order) by 25%. Prior to use of the heuristic, the multiple-raters kappa between the algorithms was 0.60 (95% confidence interval, CI, 0.53 to 0.67), indicating moderate agreement among the networks learned through different algorithms. After the use of the heuristic, the multiple-raters kappa was 0.85 (95% CI 0.78 to 0.92). There was a statistically significant increase in agreement between the five algorithms (p < 0.05). These data suggest that the heuristic increased agreement between networks learned through different algorithms, without loss of predictive accuracy. Additional research is needed to see whether these findings persist in other data sets and to explain why a heuristic used by humans could improve the construct validity of mathematical algorithms.
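The agreement statistic used above can be illustrated with the two-rater case. For simplicity this sketch computes Cohen's kappa between two algorithms' arc orientations; the study used a multiple-raters kappa over five algorithms, and the arc labels below are toy values.

```python
# Sketch of chance-corrected agreement (Cohen's kappa, two raters)
# between the arc directions assigned by two learning algorithms.
# The study used a multi-rater kappa; labels here are toy data.
from collections import Counter

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Direction ("->" or "<-") assigned to six arcs by two algorithms.
alg1 = ["->", "->", "<-", "->", "<-", "->"]
alg2 = ["->", "->", "<-", "<-", "<-", "->"]
print(round(cohens_kappa(alg1, alg2), 2))
```

The algorithms agree on 5 of 6 arcs (83%), but half of that agreement is expected by chance given the direction frequencies, so kappa lands at 0.67 rather than 0.83.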
Subject(s)
Activities of Daily Living, Algorithms, Bayes Theorem, Causality, Nursing Homes, Artificial Intelligence, Disability Evaluation, Humans
ABSTRACT
BACKGROUND: The efficacy of diabetic medications among patients with multiple comorbidities is not tested in randomized clinical studies. It is important to monitor the performance of these medications after marketing approval. OBJECTIVE: To investigate the risk of all-cause mortality associated with prescription of hypoglycemic agents. METHODS: We retrospectively examined data from 17,773 type 2 diabetic patients seen from March 2, 1998, to December 13, 2010, in 3 Veterans Administration medical centers. Severity was measured using patients' inpatient and outpatient comorbidities during the last year of visits. Severity-adjusted logistic regression was used to measure the odds ratio for mortality within the study period. RESULTS: Patients' severity of illness correctly classified mortality for 89.8% of the patients (P < 0.0001). Being younger, married, and white decreased severity-adjusted risk of mortality. Exposure to the following medications increased severity-adjusted risk of mortality: glyburide (odds ratio [OR] = 1.804, 95% CI 1.518 to 2.145), glipizide (OR = 1.566, 95% CI 1.333 to 1.839), rosiglitazone (OR = 1.805, 95% CI 1.378 to 2.365), chlorpropamide (OR = 3.026, 95% CI 1.096 to 8.351), and insulin (OR = 2.382, 95% CI 2.112 to 2.686). None of the other medications (metformin, acarbose, glimepiride, pioglitazone, repaglinide, troglitazone, or dipeptidyl peptidase-4 inhibitors) were associated with excess mortality beyond what could be expected from the patients' severity of illness or demographic characteristics. The reported excess mortality could not be explained away by use of other concurrent, nondiabetic classes of medications. CONCLUSION: Our findings suggest chlorpropamide, glipizide, glyburide, insulin, and rosiglitazone increased severity-adjusted mortality in veterans with type 2 diabetes.
A decision aid that could optimize selection of hypoglycemic medications based on patients' comorbidities might increase patients' survival.
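The odds ratios reported above were severity-adjusted via logistic regression; the unadjusted building block is the 2x2-table odds ratio with a Wald confidence interval, which a brief sketch can show. The counts below are invented for illustration.

```python
# Sketch of an unadjusted odds ratio with a 95% Wald CI from a 2x2
# exposure/mortality table. Counts are hypothetical; the study's ORs
# came from severity-adjusted logistic regression.
import math

def odds_ratio_ci(a, b, c, d):
    """a,b = deaths/survivors exposed; c,d = deaths/survivors unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 160, 25, 200)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Because the lower confidence limit stays above 1, this toy exposure would be flagged as associated with excess mortality, the same reading applied to the intervals reported in the abstract.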
Subject(s)
Diabetes Mellitus, Type 2/drug therapy, Diabetes Mellitus, Type 2/mortality, Hypoglycemic Agents/therapeutic use, Veterans, Cohort Studies, Female, Humans, Male, Retrospective Studies, Severity of Illness Index, Treatment Outcome
ABSTRACT
This paper proposes a method for examining the causal relationship between investment in information technology (IT) and the organization's productivity. In this method, a strong relationship among (1) investment in IT, (2) use of IT, and (3) the organization's productivity is first verified using correlations. Second, the assumption that IT investment preceded improved productivity is tested using partial correlation. Finally, the assumption of what may have happened in the absence of IT investment, the so-called counterfactual, is tested by forecasting productivity at different levels of investment. The paper applies the proposed method to investment in the Veterans Health Information Systems and Technology Architecture (VISTA) system. Results show that the causal analysis can be done, even with limited data. Furthermore, because the procedure relies on the organization's overall productivity, it may be more objective than analyses in which the analyst picks and chooses which costs and benefits to include.
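The second step described above, a first-order partial correlation that controls for a third variable, can be sketched from pairwise Pearson correlations. The investment, productivity, and year values below are illustrative, not the VISTA data.

```python
# Sketch of a first-order partial correlation: the correlation of IT
# investment with productivity after controlling for a third variable
# (here, a time trend). All data values are illustrative.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1-r_xz^2)(1-r_yz^2))."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

invest = [1.0, 1.2, 1.5, 1.9, 2.1, 2.6]   # IT spend (arbitrary units)
output = [10, 11, 13, 15, 16, 19]          # productivity measure
years  = [0, 1, 2, 3, 4, 5]                # control for time trend
print(round(partial_corr(invest, output, years), 2))
```

If the investment-productivity correlation survived controlling for the trend, as in this toy series, the temporal-precedence assumption would be supported; a partial correlation near zero would suggest both series merely drift with time.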
ABSTRACT
OBJECTIVE: Scientists have concluded that genetic profiles cannot predict a large percentage of the variation in response to citalopram, a common antidepressant. Using the same data, we examined whether a different conclusion could be reached when the results are personalized to fit specific patients. METHODS: We used data available through the Sequenced Treatment Alternatives to Relieve Depression database. We created three boosted Classification and Regression Trees to identify 16 subgroups of patients among whom the anticipated probability of a positive or negative response to citalopram was significantly different from 0.5 (P ≤ 0.1). RESULTS: In a 10-fold cross-validation, this ensemble of trees made no prediction in 33% of cases. In the remaining 67% of cases, it accurately classified response to citalopram 78% of the time. CONCLUSION: For the majority of patients, genetic markers can be used to guide the selection of citalopram. The rules identified in this study can help personalize the prescription of antidepressants.