ABSTRACT
Rationale: Definitive guidelines for anticoagulation management during veno-venous extracorporeal membrane oxygenation (VV ECMO) are lacking, while bleeding complications continue to pose major challenges. Objectives: To describe anticoagulation modalities and bleeding events in adults receiving VV ECMO. Methods: This was an international prospective observational study in 41 centers, conducted from December 2018 to February 2021. Anticoagulation was recorded daily in terms of type, dosage, and monitoring strategy. Bleeding events were reported according to site, severity, and impact on mortality. Measurements and Main Results: The study cohort included 652 patients, and 8,471 days on ECMO were analyzed. Unfractionated heparin was the initial anticoagulant in 77% of patients and the most frequently used anticoagulant during the ECMO course (6,221 d; 73%). Activated partial thromboplastin time (aPTT) was the most common test for monitoring coagulation (86% of days): the median value was 52 seconds (interquartile range, 39 to 61 s) but dropped by 5.3 seconds after the first bleeding event (95% confidence interval, -7.4 to -3.2; P < 0.01). Bleeding occurred on 1,202 days (16.5%). Overall, 342 patients (52.5%) experienced at least one bleeding event (one episode every 215 h on ECMO), of which 10 (1.6%) were fatal. In a multiple penalized Cox proportional hazards model, a higher aPTT was a potentially modifiable risk factor for the first episode of bleeding (hazard ratio per 20-s increase, 1.07). Conclusions: Anticoagulation during VV ECMO was a dynamic process, frequently stopped in cases of bleeding and restarted according to the clinical picture. Future studies might explore lower aPTT targets to reduce the risk of bleeding.
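As a reading aid, the penalized Cox proportional hazards analysis described above can be sketched with the lifelines library. Everything below is illustrative: the data are synthetic, and the column names and penalty strength are assumptions, not the study's actual specification.

```python
# Minimal sketch of a penalized Cox model for time to first bleeding event.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 652  # cohort size taken from the abstract; the rows below are fabricated
df = pd.DataFrame({
    "aptt_seconds": rng.normal(52, 10, n),         # aPTT on ECMO (hypothetical)
    "age_years": rng.normal(55, 12, n),
    "days_to_first_bleed": rng.exponential(9, n),  # time to event or censoring
    "bleed_observed": rng.integers(0, 2, n),       # 1 = bleeding event observed
})
# Rescale aPTT so the hazard ratio is reported per 20-second increase,
# matching the scale used in the abstract (HR 1.07 per 20 s).
df["aptt_per_20s"] = df["aptt_seconds"] / 20.0

cph = CoxPHFitter(penalizer=0.1)  # ridge penalty makes this a penalized Cox fit
cph.fit(df[["aptt_per_20s", "age_years", "days_to_first_bleed", "bleed_observed"]],
        duration_col="days_to_first_bleed", event_col="bleed_observed")
cph.print_summary()  # the exp(coef) column gives the hazard ratios
```

The penalizer value is arbitrary here; in practice it would be tuned, and the real model would include the study's full covariate set.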
Subject(s)
Extracorporeal Membrane Oxygenation , Heparin , Adult , Humans , Heparin/adverse effects , Extracorporeal Membrane Oxygenation/adverse effects , Blood Coagulation , Hemorrhage/chemically induced , Hemorrhage/therapy , Anticoagulants/adverse effects , Retrospective Studies
ABSTRACT
BACKGROUND: Adenosine deaminase (ADA) is a useful biomarker for the diagnosis of tuberculous pleurisy (TBP). However, pleural effusions with high ADA can also be caused by other diseases, particularly hematologic malignant pleural effusion (hMPE). This study aimed to investigate the features that could differentiate TBP and hMPE in patients with pleural effusion ADA ≥ 40 IU/L. METHODS: This was a retrospective observational study of patients with pleural effusion ADA ≥ 40 IU/L, conducted at a Korean tertiary referral hospital with an intermediate tuberculosis burden between January 2010 and December 2017. Multivariable logistic regression analyses were performed to investigate the features associated with TBP and hMPE, respectively. RESULTS: Among 1134 patients with ADA ≥ 40 IU/L, 375 (33.1%) and 85 (7.5%) were diagnosed with TBP and hMPE, respectively. TBP and hMPE accounted for 59% (257/433) and 6% (27/433) in patients with ADA between 70 and 150 IU/L, respectively. However, in patients with ADA ≥ 150 IU/L, they accounted for 7% (9/123) and 19% (23/123), respectively. When ADA between 40 and 70 IU/L was the reference category, ADA between 70 and 150 IU/L was independently associated with TBP (adjusted odds ratio [aOR], 3.11; 95% confidence interval [CI], 1.95-4.95; P < 0.001). ADA ≥ 150 IU/L was negatively associated with TBP (aOR, 0.35; 95% CI, 0.14-0.90; P = 0.029) and positively associated with hMPE (aOR, 13.21; 95% CI, 5.67-30.79; P < 0.001). In addition, TBP was independently associated with lymphocytes ≥ 35% and a lactate dehydrogenase (LD)/ADA ratio < 18 in pleural effusion. hMPE was independently associated with pleural polymorphonuclear neutrophils < 50%, thrombocytopenia, and higher serum LD. A combination of lymphocytes ≥ 35%, LD/ADA < 18, and ADA < 150 IU/L demonstrated a sensitivity of 0.824 and specificity of 0.937 for predicting TBP. CONCLUSION: In patients with very high levels of pleural effusion ADA, hMPE should be considered. Several features in pleural effusion and serum may help to more effectively differentiate TBP from hMPE.
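To illustrate the combined rule reported above (lymphocytes ≥ 35%, LD/ADA < 18, ADA < 150 IU/L), the following sketch computes its sensitivity and specificity on fabricated data; the column names and values are assumptions for demonstration only.

```python
# Sketch: sensitivity/specificity of the combined TBP prediction rule.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "lympho_pct": rng.uniform(5, 90, n),   # pleural fluid lymphocytes (%)
    "ld": rng.uniform(100, 2000, n),       # pleural lactate dehydrogenase
    "ada": rng.uniform(40, 300, n),        # pleural adenosine deaminase (IU/L)
    "tbp": rng.integers(0, 2, n),          # 1 = confirmed tuberculous pleurisy
})
rule = (df["lympho_pct"] >= 35) & (df["ld"] / df["ada"] < 18) & (df["ada"] < 150)

tp = (rule & (df["tbp"] == 1)).sum()
fn = (~rule & (df["tbp"] == 1)).sum()
tn = (~rule & (df["tbp"] == 0)).sum()
fp = (rule & (df["tbp"] == 0)).sum()
print(f"sensitivity = {tp / (tp + fn):.3f}")  # the study reports 0.824
print(f"specificity = {tn / (tn + fp):.3f}")  # the study reports 0.937
```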
Subject(s)
Hematologic Neoplasms , Pleural Effusion, Malignant , Pleural Effusion , Tuberculosis, Pleural , Humans , Adenosine Deaminase/analysis , Tuberculosis, Pleural/diagnosis , Tuberculosis, Pleural/epidemiology , Tuberculosis, Pleural/complications , Pleural Effusion/diagnosis , Pleural Effusion/epidemiology , Pleural Effusion, Malignant/diagnosis , Hematologic Neoplasms/complications
ABSTRACT
BACKGROUND: Whether to withhold or continue angiotensin-converting enzyme inhibitors or angiotensin 2 receptor blockers peri-operatively in non-cardiac surgery remains controversial, as continuing them may result in intra-operative hypotension and postoperative organ damage. METHODS: We included patients prescribed angiotensin-converting enzyme inhibitors or angiotensin 2 receptor blockers who underwent surgical procedures of > 1 h duration under general or spinal anaesthesia from January 2012 to June 2022 in a single centre. We categorised patients by whether these drugs were withheld for 24 h before surgery. We evaluated the association of withholding these drugs before non-cardiac surgery with an increase in creatinine concentration of ≥ 26.4 µmol.l-1 in the first 48 postoperative hours (acute kidney injury). We also analysed changes in creatinine concentrations and estimated glomerular filtration rates. RESULTS: Angiotensin-converting enzyme inhibitors or angiotensin 2 receptor blockers were withheld in 24,285 of 32,933 (74%) patients and continued in 8648 (26%) patients. We used propensity scores for drug discontinuation to match 8631 patient pairs who did or did not continue these drugs: acute kidney injury was recorded for 1791 (21%) patients who continued these drugs vs. 1587 (18%) who did not (OR (95%CI) 1.16 (1.08-1.25), p < 0.001). Intra-operative hypotension was recorded for 3892 (45%) patients who continued drugs vs. 3373 (39%) patients who did not (OR (95%CI) 1.28 (1.21-1.36), p < 0.001). Continuing drugs was independently associated with a mean increase in creatinine of 2.2 µmol.l-1 (p < 0.001) and a mean decrease in estimated glomerular filtration rate of 1.4 ml.min-1.1.73 m-2 (p < 0.001). CONCLUSIONS: Continuing angiotensin-converting enzyme inhibitors or angiotensin 2 receptor blockers within 24 h before non-cardiac surgery was associated with intra-operative hypotension and postoperative acute kidney injury.
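A minimal sketch of 1:1 propensity-score matching of the kind described above, using scikit-learn on synthetic data. The covariates, and matching with replacement via a single nearest neighbour, are simplifying assumptions; the study's actual matching algorithm is not specified here.

```python
# Sketch: propensity-score matching for continued vs. withheld ACEi/ARB,
# then a crude odds ratio for postoperative acute kidney injury (AKI).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "egfr": rng.normal(75, 20, n),
    "continued": rng.integers(0, 2, n),  # 1 = drug continued before surgery
    "aki": rng.integers(0, 2, n),        # 1 = postoperative AKI
})
# Step 1: model the probability of continuing the drug (the propensity score).
ps = LogisticRegression().fit(df[["age", "egfr"]], df["continued"])
df["ps"] = ps.predict_proba(df[["age", "egfr"]])[:, 1]

# Step 2: match each continued patient to the nearest withheld patient by score.
treated = df[df["continued"] == 1]
control = df[df["continued"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_control = control.iloc[idx.ravel()]  # with replacement, for brevity

# Step 3: compare AKI rates in the matched sample.
p1, p0 = treated["aki"].mean(), matched_control["aki"].mean()
print(f"AKI: continued {p1:.1%} vs withheld {p0:.1%}, "
      f"OR ~ {(p1 / (1 - p1)) / (p0 / (1 - p0)):.2f}")
```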
Subject(s)
Acute Kidney Injury , Angiotensin-Converting Enzyme Inhibitors , Postoperative Complications , Humans , Acute Kidney Injury/chemically induced , Female , Male , Aged , Angiotensin-Converting Enzyme Inhibitors/adverse effects , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Middle Aged , Creatinine/blood , Angiotensin Receptor Antagonists/adverse effects , Angiotensin Receptor Antagonists/therapeutic use , Retrospective Studies , Glomerular Filtration Rate/drug effects , Surgical Procedures, Operative , Withholding Treatment , Adult , Hypotension/chemically induced
ABSTRACT
OBJECTIVES: In Asian populations, the correlation between sepsis outcomes and body mass is unclear. This multicenter, prospective, observational study conducted between September 2019 and December 2020 evaluated the effects of obesity on sepsis outcomes in a national cohort. SETTING: Nineteen tertiary referral hospitals or university-affiliated hospitals in South Korea. PATIENTS: Adult patients with sepsis (n = 6,424) were classified into obese (n = 1,335) and nonobese (n = 5,089) groups. MEASUREMENTS AND RESULTS: Obese and nonobese patients were propensity score-matched in a ratio of 1:1. In-hospital mortality was the primary outcome. After propensity score matching, the obese group had lower hospital mortality than the nonobese group (25.3% vs 36.7%; p < 0.001). The obese group also had a higher home discharge rate (70.3% vs 65.2%; p < 0.001) and a lower median Clinical Frailty Scale (CFS) score at discharge (4 vs 5; p = 0.007), and the proportion of frail patients at discharge (CFS ≥ 5) was lower in the obese group (48.7% vs 54.7%; p = 0.011). Patients were then divided into four groups according to the World Health Organization body mass index (BMI) classification, and additional analyses were performed. Relative to normal BMI, the adjusted odds ratios for hospital mortality in underweight, overweight, and obese patients were 1.25 (p = 0.004), 0.58 (p < 0.001), and 0.70 (p = 0.047), respectively; the corresponding adjusted odds ratios for frailty at discharge were 1.53 (p < 0.001), 0.80 (p = 0.095), and 0.60 (p = 0.022). CONCLUSIONS: Obesity is associated with higher hospital survival and better functional outcomes at discharge in Asian patients with sepsis.
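The BMI-category analysis above can be sketched as follows: assign WHO classes, then fit a logistic model with normal BMI as the reference to obtain adjusted odds ratios. Standard WHO cutoffs are used here as an assumption; a Korean cohort may well have used Asian-Pacific cutoffs, and the covariate set is illustrative.

```python
# Sketch: WHO BMI classes and adjusted odds ratios for hospital mortality.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def who_bmi_category(bmi: float) -> str:
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

rng = np.random.default_rng(3)
n = 6424  # cohort size from the abstract; the data themselves are fabricated
df = pd.DataFrame({
    "bmi": rng.normal(24, 4, n),
    "age": rng.normal(68, 13, n),
    "died": rng.integers(0, 2, n),  # 1 = in-hospital death
})
df["bmi_cat"] = df["bmi"].apply(who_bmi_category)

fit = smf.logit(
    "died ~ C(bmi_cat, Treatment(reference='normal')) + age", data=df
).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios vs. normal BMI
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```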
Subject(s)
Frailty , Sepsis , Adult , Humans , Prospective Studies , Obesity Paradox , Obesity/complications , Obesity/epidemiology , Body Mass Index , Retrospective Studies
ABSTRACT
BACKGROUND: Data regarding the clinical effects of bacteremia on severe community-acquired pneumonia (CAP) are limited. Thus, we investigated the clinical characteristics and outcomes of severe CAP patients with bacteremia compared with those of subjects without bacteremia. In addition, we evaluated clinical factors associated with bacteremia at the time of sepsis awareness. METHODS: We enrolled sepsis patients diagnosed with CAP at emergency departments (EDs) from an ongoing nationwide multicenter observational registry, the Korean Sepsis Alliance, between September 2019 and December 2020. To evaluate clinical factors associated with bacteremia, we divided eligible patients into bacteremia and non-bacteremia groups, and logistic regression analysis was performed using the clinical characteristics at the time of sepsis awareness. RESULTS: During the study period, sepsis was caused by CAP in 1,510 (47.9%) patients, and bacteremia was identified in 212 (14.0%) of them. Septic shock occurred more frequently in the bacteremia group than in the non-bacteremia group (27.4% vs. 14.8%; p < 0.001). In multivariable analysis, hematologic malignancies and septic shock were associated with an increased risk of bacteremia, whereas chronic lung disease was associated with a decreased risk. Hospital mortality was significantly higher in the bacteremia group than in the non-bacteremia group (40.6% vs. 27.3%, p < 0.001). Among gram-negative pathogens, the most prevalent in blood cultures was Klebsiella pneumoniae, followed by Escherichia coli. CONCLUSION: The incidence of bacteremia in severe CAP was low at 14.0%, but bacteremia was associated with increased hospital mortality. In severe CAP, hematologic malignancies and septic shock were associated with an increased risk of bacteremia.
Subject(s)
Bacteremia , Community-Acquired Infections , Hematologic Neoplasms , Pneumonia , Sepsis , Shock, Septic , Humans , Bacteremia/epidemiology , Community-Acquired Infections/epidemiology , Escherichia coli , Hematologic Neoplasms/complications , Pneumonia/epidemiology , Pneumonia/complications , Retrospective Studies , Risk Factors , Sepsis/complications , Multicenter Studies as Topic , Observational Studies as Topic
ABSTRACT
BACKGROUND: Although chemotherapy-induced febrile neutropenia (FN) is the most common and life-threatening oncologic emergency, the characteristics and outcomes associated with return visits to the emergency department (ED) in these patients are uncertain. Hence, we aimed to investigate the predictive factors and clinical outcomes of chemotherapy-induced FN patients returning to the ED. METHODS: This single-center, retrospective observational study spanning 14 years included chemotherapy-induced FN patients who visited the ED and were discharged. The primary outcome was a return visit to the ED within five days. We conducted logistic regression analyses to evaluate the factors influencing ED return visits. RESULTS: This study included 1318 FN patients, 154 (12.1%) of whom revisited the ED within five days. Of these, 53.3% were admitted on the return visit; persistent fever (56.5%) was the most common reason for returning, with no intensive care unit admissions and only one death, in a patient discharged in a moribund state. Multivariable analysis revealed that a shock index > 0.9 (odds ratio [OR]: 1.45, 95% confidence interval [CI], 1.01-2.10), thrombocytopenia (< 100 × 10³/µL) (OR: 1.64, 95% CI, 1.11-2.42), and a lactic acid level > 2 mmol/L (OR: 1.51, 95% CI, 0.99-2.25) were associated with an increased risk of a return visit to the ED, whereas transfer into the ED from another hospital (OR: 0.08; 95% CI, 0.005-0.38) was associated with a decreased risk. CONCLUSION: High shock index, lactic acid level, thrombocytopenia, and ED arrival type can predict return visits to the ED in chemotherapy-induced FN patients.
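For illustration, the sketch below derives the predictors named above (shock index > 0.9, platelets < 100 × 10³/µL, lactate > 2 mmol/L, transfer status) and fits a multivariable logistic model for a 5-day return visit; shock index is heart rate divided by systolic blood pressure. The data and column names are fabricated assumptions.

```python
# Sketch: predictors of a 5-day ED return visit in febrile neutropenia.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1318  # study size from the abstract; the rows below are synthetic
df = pd.DataFrame({
    "heart_rate": rng.normal(95, 15, n),   # beats/min
    "sbp": rng.normal(110, 18, n),         # systolic blood pressure, mmHg
    "platelets": rng.normal(180, 80, n),   # x 10^3/uL
    "lactate": rng.exponential(1.5, n),    # mmol/L
    "transferred_in": rng.integers(0, 2, n),
    "return_5d": rng.integers(0, 2, n),    # 1 = returned within 5 days
})
df["shock_index_high"] = ((df["heart_rate"] / df["sbp"]) > 0.9).astype(int)
df["thrombocytopenia"] = (df["platelets"] < 100).astype(int)
df["lactate_high"] = (df["lactate"] > 2).astype(int)

fit = smf.logit(
    "return_5d ~ shock_index_high + thrombocytopenia + lactate_high"
    " + transferred_in", data=df
).fit(disp=False)
print(np.exp(fit.params))  # odds ratios, analogous to the abstract's ORs
```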
Subject(s)
Antineoplastic Agents , Chemotherapy-Induced Febrile Neutropenia , Febrile Neutropenia , Humans , Chemotherapy-Induced Febrile Neutropenia/epidemiology , Hospitalization , Emergency Service, Hospital , Patient Discharge , Retrospective Studies , Antineoplastic Agents/adverse effects , Febrile Neutropenia/chemically induced , Febrile Neutropenia/epidemiology , Patient Readmission
ABSTRACT
BACKGROUND: Although the Life-Sustaining Treatment (LST) Decision Act was enforced in 2018 in Korea, data on whether it is well established in actual clinical settings are limited. Hospital-acquired pneumonia (HAP) is a common nosocomial infection with high mortality. However, there are limited data on end-of-life (EOL) decisions in patients with HAP. Therefore, we aimed to examine clinical characteristics and outcomes according to the EOL decision for patients with HAP. METHODS: This multicenter study retrospectively enrolled patients with HAP at 16 referral hospitals from January to December 2019. EOL decisions included do-not-resuscitate (DNR), withholding of LST, and withdrawal of LST. Descriptive and Kaplan-Meier curve analyses for survival were performed. RESULTS: Of 1,131 patients with HAP, 283 deceased patients with EOL decisions (105 cases of DNR, 108 cases of withholding of LST, and 70 cases of withdrawal of LST) were analyzed. The median age was 74 (IQR 63-81) years. In the withdrawal group, the prevalence of solid malignant tumors was higher (32.4% vs. 46.3% vs. 54.3%, P = 0.011) and the ICU admission rate lower (42.9% vs. 35.2% vs. 24.3%, P = 0.042). The prevalence of multidrug-resistant pathogens, impaired consciousness, and cough was also significantly lower in the withdrawal group. Kaplan-Meier curve analysis revealed that 30-day and 60-day survival rates were higher in the withdrawal group than in the DNR and withholding groups (log-rank P = 0.021 and 0.018). Survival in the withdrawal group decreased markedly after 40 days, suggesting that withdrawal decisions were made around this time. Among patients aged below 80 years, the rates of the three EOL decisions did not differ (P = 0.430); however, among patients aged over 80 years, the rate of withdrawal was significantly lower than that of DNR and withholding (P = 0.001). CONCLUSIONS: After the LST Decision Act was enforced in Korea, DNR orders were still common in EOL decisions. Baseline characteristics and outcomes were similar between the DNR and withholding groups; however, differences were observed in the withdrawal group. Withdrawal decisions seemed to be made at a late stage of dying. Therefore, advance care planning for patients with HAP is needed.
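A minimal sketch of the Kaplan-Meier and log-rank comparison described above, using lifelines on synthetic data; the group labels match the abstract, but the survival times are fabricated.

```python
# Sketch: Kaplan-Meier curves and log-rank test across EOL decision groups.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(5)
n = 283  # deceased patients with EOL decisions, per the abstract
df = pd.DataFrame({
    "group": rng.choice(["DNR", "withholding", "withdrawal"], n),
    "days": rng.exponential(30, n).clip(max=60),  # follow-up capped at 60 days
})
df["died"] = (df["days"] < 60).astype(int)  # 0 = administratively censored

kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["days"], sub["died"], label=name)
    surv_60d = float(kmf.survival_function_.iloc[-1, 0])
    print(f"{name}: 60-day survival ~ {surv_60d:.2f}")

res = multivariate_logrank_test(df["days"], df["group"], df["died"])
print(f"log-rank p = {res.p_value:.3f}")
```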
Subject(s)
Neoplasms , Pneumonia , Humans , Aged, 80 and over , Aged , Retrospective Studies , Decision Making , Resuscitation Orders , Withholding Treatment , Hospitals , Pneumonia/therapy , Republic of Korea/epidemiology , Death
ABSTRACT
BACKGROUND: This study aimed to evaluate whether the effect of tachycardia varies according to the degree of tissue perfusion in septic shock. METHODS: Patients with septic shock admitted to the intensive care units were categorized into tachycardia (heart rate > 100 beats/min) and non-tachycardia (≤ 100 beats/min) groups. The association of tachycardia with hospital mortality was evaluated in subgroups with low and high lactate levels, which were identified through a subpopulation treatment effect pattern plot analysis. RESULTS: In the overall population, hospital mortality did not differ between the two groups (44.6% vs. 41.8%, P = 0.441); however, tachycardia was associated with reduced hospital mortality in patients with a lactate level ≥ 5.3 mmol/L (48.7% vs. 60.3%, P = 0.030; adjusted odds ratio [OR], 0.59, 95% confidence interval [CI], 0.35-0.99, P = 0.045), but not in patients with a lactate level < 5.3 mmol/L (36.5% vs. 29.7%, P = 0.156; adjusted OR, 1.39, 95% CI, 0.82-2.35, P = 0.227). CONCLUSION: In septic shock patients, the effect of tachycardia on hospital mortality differed by serum lactate level. Tachycardia was associated with better survival in patients with markedly elevated lactate levels.
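The lactate-dependent effect reported above can be probed with an interaction term and subgroup fits, sketched below on synthetic data. The 5.3 mmol/L threshold is taken from the abstract; everything else is an assumption.

```python
# Sketch: does the tachycardia-mortality association differ by lactate level?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 1200
df = pd.DataFrame({
    "tachycardia": rng.integers(0, 2, n),  # 1 = heart rate > 100 beats/min
    "lactate": rng.exponential(4, n),      # mmol/L
    "died": rng.integers(0, 2, n),         # 1 = in-hospital death
})
df["high_lactate"] = (df["lactate"] >= 5.3).astype(int)

# Interaction model: tests whether the tachycardia effect varies by subgroup.
fit = smf.logit("died ~ tachycardia * high_lactate", data=df).fit(disp=False)
print(np.exp(fit.params))

# Separate subgroup odds ratios, as reported in the abstract.
for flag, sub in df.groupby("high_lactate"):
    f = smf.logit("died ~ tachycardia", data=sub).fit(disp=False)
    print(f"high_lactate={flag}: OR = {np.exp(f.params['tachycardia']):.2f}")
```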
Subject(s)
Shock, Septic , Humans , Shock, Septic/complications , Lactic Acid , Intensive Care Units , Tachycardia/complications , Cohort Studies , Retrospective Studies , Prognosis
ABSTRACT
This corrects the article on p. e294 in vol. 37, PMID: 36281485.
ABSTRACT
BACKGROUND: There are insufficient data on the benefits of empiric antibiotic combinations for hospital-acquired pneumonia (HAP). We aimed to investigate whether empiric anti-pseudomonal combination therapy with fluoroquinolones decreases mortality in patients with HAP. METHODS: This multicenter, retrospective cohort study included adult patients admitted to 16 tertiary and general hospitals in Korea between January 1 and December 31, 2019. Patients with risk factors for combination therapy were divided into anti-pseudomonal non-carbapenem β-lactam monotherapy and fluoroquinolone combination therapy groups. The primary outcome was 30-day mortality. Propensity score matching (PSM) was used to reduce selection bias. RESULTS: In total, 631 patients with HAP were enrolled. Monotherapy was prescribed in 54.7% (n = 345) of the patients, and combination therapy in 45.3% (n = 286). There was no significant difference in 30-day mortality between the two groups (16.8% vs. 18.2%, P = 0.729), even after PSM (17.5% vs. 18.2%, P = 0.913). After PSM, the adjusted hazard ratio for 30-day mortality with combination therapy was 1.646 (95% confidence interval, 0.782-3.461; P = 0.189) in the Cox proportional hazards model. Moreover, there was no significant difference in the appropriateness of initial empiric antibiotics between the two groups (55.0% vs. 56.8%, P = 0.898). The proportion of multidrug-resistant (MDR) pathogens was high in both groups. CONCLUSION: Empiric anti-pseudomonal fluoroquinolone combination therapy showed no survival benefit compared with β-lactam monotherapy in patients with HAP. Caution is needed regarding the routine addition of fluoroquinolones in the empiric treatment of HAP patients at high risk of MDR infection.
Subject(s)
Community-Acquired Infections , Pneumonia , Adult , Humans , beta-Lactams/therapeutic use , Fluoroquinolones/therapeutic use , Retrospective Studies , Propensity Score , Drug Therapy, Combination , Anti-Bacterial Agents/therapeutic use , Pneumonia/etiology , Hospitals , Community-Acquired Infections/drug therapy
ABSTRACT
Although early recognition of sepsis is essential for timely treatment and can improve sepsis outcomes, no marker has demonstrated sufficient discriminatory power to diagnose sepsis. This study aimed to compare gene expression profiles between patients with sepsis and healthy volunteers to determine the accuracy of these profiles in diagnosing sepsis and to predict sepsis outcomes by combining bioinformatics data with molecular experiments and clinical information. We identified 422 differentially expressed genes (DEGs) between the sepsis and control groups, of which 93 immune-related DEGs were considered for further study because immune-related pathways were the most highly enriched. Key genes upregulated during sepsis, including S100A8, S100A9, and CR1, are responsible for cell cycle regulation and immune responses. Key downregulated genes, including CD79A, HLA-DQB2, PLD4, and CCR7, are responsible for immune responses. Furthermore, the key upregulated genes showed excellent to fair accuracy in diagnosing sepsis (area under the curve 0.747-0.931) and in predicting in-hospital mortality (0.863-0.966) of patients with sepsis. In contrast, the key downregulated genes showed excellent accuracy in predicting mortality of patients with sepsis (0.918-0.961) but failed to effectively diagnose sepsis. In conclusion, bioinformatics analysis identified key genes that may serve as biomarkers for diagnosing sepsis and predicting outcomes among patients with sepsis.
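The per-gene diagnostic accuracy above is an ROC analysis; the sketch below shows the computation with scikit-learn. Gene symbols come from the abstract, but the expression values are simulated, so the printed AUCs will not match the reported ones.

```python
# Sketch: AUC of individual gene expression levels for diagnosing sepsis.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n_sepsis, n_control = 80, 40
labels = np.r_[np.ones(n_sepsis), np.zeros(n_control)]  # 1 = sepsis sample

for gene, shift in [("S100A8", 1.5), ("S100A9", 1.2), ("CR1", 0.9)]:
    # Upregulated genes: simulate higher expression in sepsis samples.
    expr = np.r_[rng.normal(shift, 1, n_sepsis), rng.normal(0, 1, n_control)]
    print(f"{gene}: AUC = {roc_auc_score(labels, expr):.3f}")
```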
Subject(s)
Sepsis , Transcriptome , Humans , Protein Interaction Maps/genetics , Gene Regulatory Networks , Gene Expression Profiling , Sepsis/diagnosis , Sepsis/genetics , Computational Biology
ABSTRACT
BACKGROUND: Recent guidelines recommend conducting the spontaneous breathing trial (SBT) with modest inspiratory pressure augmentation rather than a T-piece or continuous positive airway pressure. However, this recommendation was based on a few studies focused on extubation outcomes rather than on the weaning process, despite the variety of weaning situations in clinical practice. This study was designed to investigate the effects of SBT with pressure support ventilation (PSV) or a T-piece on weaning outcomes. METHODS: All consecutive patients admitted to two medical intensive care units (ICUs) who required mechanical ventilation (MV) for more than 24 h from November 1, 2017 to September 30, 2020 were prospectively registered. The T-piece trial was used for SBT until March 2019; from July 2019, after a 3-month transition period for the revised SBT protocol, a pressure support of 8 cmH2O with zero positive end-expiratory pressure was used. The primary outcome was successful weaning, defined according to WIND (Weaning according to a New Definition), and was compared between the T-piece and PSV groups. The association between the SBT method and weaning outcome was evaluated with logistic regression analysis. RESULTS: In this study, 787 eligible patients were divided into the T-piece (n = 473) and PSV (n = 314) groups after excluding patients treated during the 3-month transition period. Successful weaning did not differ between the two groups (85.0% vs. 86.3%; p = 0.607). However, the PSV group had a higher proportion of short weaning (70.1% vs. 59.0%; p = 0.002) and a lower proportion of difficult weaning (13.1% vs. 24.1%; p < 0.001) than the T-piece group. The proportion of prolonged weaning was similar between the two groups (16.9% vs. 16.9%; p = 0.990). Similar results were found after excluding patients who underwent tracheostomy before the SBTs. Reintubation rates at 48 h, 72 h, and 7 days following planned extubation did not differ between the PSV and T-piece groups. Moreover, no significant differences in ICU and hospital mortality or length of stay were observed. CONCLUSIONS: In critically ill medical patients, SBT using PSV was not associated with a higher rate of successful weaning compared with SBT using a T-piece. However, PSV could shorten the weaning process without increasing the risk of reintubation.
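As a reading aid for the outcome categories above, here is a simplified classifier for the WIND groups (short: success within about 1 day of the first separation attempt; difficult: within 1 week; prolonged: longer than 1 week or never weaned). The published WIND definition has additional conditions, so treat this as an approximation.

```python
# Sketch: simplified WIND weaning-category classification.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WeaningCourse:
    # Days from the first separation attempt to successful weaning;
    # None means the patient was never successfully weaned.
    days_to_success: Optional[float]

def wind_category(course: WeaningCourse) -> str:
    d = course.days_to_success
    if d is None or d > 7:
        return "prolonged weaning"
    if d <= 1:
        return "short weaning"
    return "difficult weaning"

print(wind_category(WeaningCourse(0.5)))   # short weaning
print(wind_category(WeaningCourse(3.0)))   # difficult weaning
print(wind_category(WeaningCourse(None)))  # prolonged weaning
```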
Subject(s)
Critical Illness/therapy , Intensive Care Units , Positive-Pressure Respiration/methods , Ventilator Weaning/methods , Aged , Airway Extubation , Critical Illness/epidemiology , Female , Follow-Up Studies , Hospital Mortality/trends , Humans , Intubation, Intratracheal , Male , Middle Aged , Prospective Studies
ABSTRACT
Rationale: Acute respiratory failure (ARF) is associated with high mortality in immunocompromised patients, particularly when invasive mechanical ventilation is needed. Noninvasive oxygenation/ventilation strategies have therefore been developed to avoid intubation, with uncertain impact on mortality, especially when intubation is delayed. Objectives: We sought to report trends in survival over time in immunocompromised patients receiving invasive mechanical ventilation. The impact of delayed intubation after failure of noninvasive strategies was also assessed. Methods: Systematic review and meta-analysis using individual patient data from studies of immunocompromised adult patients with ARF requiring invasive mechanical ventilation. Studies published in English were identified through PubMed, Web of Science, and Cochrane Central (2008-2018). Individual patient data were requested from the corresponding authors of all identified studies. We used mixed-effect models to estimate the effect of delayed intubation on hospital mortality and described mortality rates over time. Measurements and Main Results: A total of 11,087 patients were included (24 studies: three controlled trials and 21 cohorts), of whom 7,736 (74%) were intubated within 24 hours of ICU admission (early intubation). The crude mortality rate was 53.2%. Adjusted survival improved over time (from 1995 to 2017; odds ratio [OR] for hospital mortality per year, 0.96 [0.95-0.97]). For each day elapsed between ICU admission and intubation, mortality was higher (OR, 1.38 [1.26-1.52]; P < 0.001). Early intubation was significantly associated with lower mortality (OR, 0.83 [0.72-0.96]), regardless of the initial oxygenation strategy. These results persisted after propensity score analysis (matched OR associated with delayed intubation, 1.56 [1.44-1.70]). Conclusions: In immunocompromised intubated patients, survival has improved over time. Time between ICU admission and intubation is a strong predictor of mortality, suggesting a detrimental effect of late initial oxygenation failure.
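A minimal sketch of a mixed-effects logistic model for hospital mortality with a per-study random intercept, in the spirit of the individual-patient-data analysis above. statsmodels' variational-Bayes mixed GLM is used here as one available option; the data, covariates, and study structure are all fabricated assumptions.

```python
# Sketch: mortality ~ intubation delay + calendar year, random intercept/study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 3000
df = pd.DataFrame({
    "study": rng.integers(0, 24, n).astype(str),  # 24 source studies
    "delay_days": rng.exponential(1.0, n),        # ICU admission -> intubation
    "year": rng.integers(1995, 2018, n),
    "died": rng.integers(0, 2, n),                # 1 = hospital death
})
df["year_c"] = df["year"] - 2006  # center the year for numerical stability

model = sm.BinomialBayesMixedGLM.from_formula(
    "died ~ delay_days + year_c",
    {"study": "0 + C(study)"},  # random intercept for each source study
    df,
)
result = model.fit_vb()  # variational Bayes estimation
print(result.summary())  # exp(coef) ~ OR per day of delay and per year
```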
Subject(s)
Hospital Mortality/trends , Immunocompromised Host , Noninvasive Ventilation/mortality , Respiration, Artificial/mortality , Respiratory Insufficiency/mortality , Respiratory Insufficiency/therapy , Adult , Aged , Aged, 80 and over , Data Analysis , Female , Forecasting , Humans , Male , Middle Aged , Noninvasive Ventilation/methods , Odds Ratio , Propensity Score , Respiration, Artificial/methods
ABSTRACT
BACKGROUND: Social and hospital environmental factors that may be associated with hospital-acquired pneumonia (HAP) have not been evaluated. Comprehensive risk assessment for the incidence of HAP including sociodemographic, clinical, and hospital environmental factors was conducted using national health insurance claims data. METHODS: This is a population-based retrospective cohort study of adult patients who were hospitalized for more than 3 days from the Health Insurance Review and Assessment Service-National Inpatient Sample data between January 1, 2016 and December 31, 2018 in South Korea. Multivariable logistic regression analyses were conducted to identify the factors associated with the incidence of HAP. RESULTS: Among the 512,278 hospitalizations, we identified 25,369 (5.0%) HAP cases. In multivariable analysis, well-known risk factors associated with HAP such as older age (over 70 vs. 20-29; adjusted odds ratio [aOR], 3.66; 95% confidence interval [CI] 3.36-3.99), male sex (aOR, 1.35; 95% CI 1.32-1.39), pre-existing lung diseases (asthma [aOR, 1.73; 95% CI 1.66-1.80]; chronic obstructive pulmonary disease [aOR, 1.62; 95% CI 1.53-1.71]; chronic lower airway disease [aOR, 1.79; 95% CI 1.73-1.85]), tube feeding (aOR, 3.32; 95% CI 3.16-3.50), suctioning (aOR, 2.34; 95% CI 2.23-2.47), positioning (aOR, 1.63; 95% CI 1.55-1.72), use of mechanical ventilation (aOR, 2.31; 95% CI 2.15-2.47), and intensive care unit admission (aOR, 1.29; 95% CI 1.22-1.36) were associated with the incidence of HAP. In addition, poverty (aOR, 1.08; 95% CI 1.04-1.13), general hospitals (aOR, 1.54; 95% CI 1.39-1.70), higher bed-to-nurse ratio (Grade ≥ 5; aOR, 1.45; 95% CI 1.32-1.59), higher number of beds per hospital room (6 beds; aOR, 3.08; 95% CI 2.77-3.42), and ward with caregiver (aOR, 1.19; 95% CI 1.12-1.26) were related to the incidence of HAP. CONCLUSIONS: The incidence of HAP was associated with various sociodemographic, clinical, and hospital environmental factors. Thus, taking a comprehensive approach to prevent and treat HAP is important.
Subject(s)
Healthcare-Associated Pneumonia/epidemiology , Adult , Aged , Aged, 80 and over , Comorbidity , Demography , Environment , Female , Humans , Incidence , Male , Middle Aged , Republic of Korea/epidemiology , Retrospective Studies , Risk Assessment , Risk Factors , Social Factors , Young Adult
ABSTRACT
BACKGROUND: The quick sequential organ failure assessment (qSOFA) score is suggested for screening patients at high risk of clinical deterioration in general wards and could thus be regarded as a general early warning score. However, whether introducing qSOFA offers benefits in hospitals already using the Modified Early Warning Score (MEWS) has not been examined in unselected admissions. We sought to compare qSOFA with MEWS for predicting clinical deterioration in general ward patients regardless of suspected infection. METHODS: The predictive performance of qSOFA and MEWS for in-hospital cardiac arrest (IHCA) or unexpected intensive care unit (ICU) transfer was compared using area under the receiver operating characteristic curve (AUC) analysis of vital-sign databases collected from consecutive hospitalized adult patients over 12 months in five participating hospitals in Korea. RESULTS: Of 173,057 hospitalized patients included for analysis, 668 (0.39%) experienced the composite outcome. Discrimination for the composite outcome was higher for MEWS (AUC, 0.777; 95% confidence interval [CI], 0.770-0.781) than for qSOFA (AUC, 0.684; 95% CI, 0.676-0.686; P < 0.001). In addition, MEWS was better than qSOFA for predicting IHCA (AUC, 0.792; 95% CI, 0.781-0.795 vs. AUC, 0.640; 95% CI, 0.625-0.645; P < 0.001) and unexpected ICU transfer (AUC, 0.767; 95% CI, 0.760-0.773 vs. AUC, 0.716; 95% CI, 0.707-0.718; P < 0.001). Using MEWS at a cutoff of ≥ 5 would correctly reclassify 3.7% of patients from qSOFA score ≥ 2. Most patients met MEWS ≥ 5 criteria 13 hours before the composite outcome, compared with 11 hours for qSOFA score ≥ 2. CONCLUSION: MEWS is more accurate than the qSOFA score for predicting IHCA or unexpected ICU transfer in patients outside the ICU. Our study suggests that qSOFA should not replace MEWS for identifying patients in general wards at risk of poor outcome.
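For illustration, the sketch below computes qSOFA from vital signs (respiratory rate ≥ 22/min, systolic BP ≤ 100 mmHg, altered mentation) and compares its discrimination with MEWS by AUC. MEWS is taken as an already-computed column because its full scoring table is lengthy; the data are synthetic.

```python
# Sketch: qSOFA vs. MEWS discrimination for clinical deterioration.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 5000
df = pd.DataFrame({
    "resp_rate": rng.normal(18, 5, n),
    "sbp": rng.normal(120, 20, n),
    "gcs": rng.choice([13, 14, 15], n, p=[0.05, 0.10, 0.85]),
    "mews": rng.integers(0, 10, n),         # assumed precomputed MEWS
    "deteriorated": rng.integers(0, 2, n),  # IHCA or unexpected ICU transfer
})
df["qsofa"] = (
    (df["resp_rate"] >= 22).astype(int)
    + (df["sbp"] <= 100).astype(int)
    + (df["gcs"] < 15).astype(int)          # altered mentation proxy
)
print(f"qSOFA AUC = {roc_auc_score(df['deteriorated'], df['qsofa']):.3f}")
print(f"MEWS  AUC = {roc_auc_score(df['deteriorated'], df['mews']):.3f}")
```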
Subject(s)
Clinical Deterioration , Early Warning Score , Sepsis , Adult , Humans , Organ Dysfunction Scores , Patients' Rooms , Retrospective Studies , Sepsis/diagnosis
ABSTRACT
BACKGROUND: The demand for lung transplants continues to increase in Korea, and donor shortages and waitlist mortality are critical issues. This study aimed to evaluate the factors that affect waitlist outcomes from the time of registration for lung transplantation in Korea. METHODS: Data were obtained from the Korean Network for Organ Sharing for lung-only registrations between September 7, 2009, and December 31, 2020. Post-registration outcomes were evaluated according to lung disease category, blood group, and age. RESULTS: Among the 1,671 registered patients, 49.1% had idiopathic pulmonary fibrosis (group C), 37.0% had acute respiratory distress syndrome and other interstitial lung diseases (group D), 7.2% had chronic obstructive pulmonary disease (group A), and 6.6% had primary pulmonary hypertension (group B). Approximately half of the patients (46.1%) were transplanted within 1 year of registration, while 31.8% died without receiving a lung transplant within 1 year of registration. Data from 1,611 patients were used to analyze 1-year post-registration outcomes, classified as transplanted (46.1%, n = 743), still awaiting (21.1%, n = 340), removed (0.9%, n = 15), and death on the waitlist (31.8%, n = 513). The transplantation rate did not differ significantly by year of registration; however, waitlist mortality rates (P = 0.008) and still-awaiting rates (P = 0.009) did. The chance of transplantation after listing varied with disease category, blood type, age, and urgency status. Waitlist mortality within 1 year was significantly associated with non-group A disease (hazard ratio [HR], 2.76, P < 0.001), age ≥ 65 years (HR, 1.48, P < 0.001), and status 0 at registration (HR, 2.10, P < 0.001). CONCLUSION: Waitlist mortality is still higher in Korea than in other countries. Future revisions to the lung allocation system should take into consideration the high waitlist mortality and donor shortages.
Subject(s)
Blood Group Antigens , Lung Transplantation , Humans , Aged , Data Analysis , Waiting Lists , Tissue Donors , Retrospective Studies
ABSTRACT
Background and Objectives: Point-of-care ultrasound (POCUS) is a useful tool that helps clinicians properly treat patients in the emergency department (ED). This study aimed to evaluate the impact of specific interventions on the use of POCUS in the ED. Materials and Methods: This retrospective study used an interrupted time series analysis to assess how interventions changed the use of POCUS in the emergency department of a tertiary medical institute in South Korea from October 2016 to February 2021. We examined two main interventions: expansion of benefit coverage under the National Health Insurance (NHI) for emergency ultrasound (EUS), and annual ultrasound educational workshops. The primary variable was the EUS rate, defined as the number of EUS scans per 1000 eligible patients per month. We compared the level and slope of EUS rates before and after the interventions. Results: A total of 5188 scanned records were included. Before the interventions, the EUS rate had increased gradually. After each intervention except the first workshop, the EUS rate increased significantly and immediately (p < 0.05). The change in the EUS rate attributable to the expansion of the NHI was estimated to be the largest (p < 0.001). However, the slope change decreased significantly after the third workshop, held during the coronavirus disease 2019 pandemic (p = 0.004). The EUS rate increased significantly in the presence of physicians participating in intensive POCUS training (p < 0.001). Conclusion: Expansion of insurance coverage for EUS and ultrasound education led to significant and immediate increases in the use of POCUS, suggesting that POCUS use can be increased by improving education and insurance benefits.
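The interrupted time series analysis above corresponds to a segmented regression with level-change and slope-change terms at each intervention. The single-intervention sketch below uses statsmodels on simulated monthly data; the intervention month and effect sizes are assumptions.

```python
# Sketch: segmented regression for an interrupted time series of EUS rates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
months = np.arange(53)  # Oct 2016 - Feb 2021 spans roughly 53 months
t0 = 27                 # assumed month of the NHI coverage expansion

df = pd.DataFrame({"t": months})
df["post"] = (df["t"] >= t0).astype(int)     # level-change indicator
df["t_after"] = np.maximum(0, df["t"] - t0)  # slope-change term
# Simulated outcome: baseline trend + jump + steeper post-intervention slope.
df["eus_rate"] = (5 + 0.1 * df["t"] + 3.0 * df["post"]
                  + 0.2 * df["t_after"] + rng.normal(0, 1, len(df)))

fit = smf.ols("eus_rate ~ t + post + t_after", data=df).fit()
print(fit.params.round(3))  # post = immediate level change; t_after = slope change
```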
Subject(s)
COVID-19 , Point-of-Care Systems , Emergency Service, Hospital , Humans , Insurance Benefits , Interrupted Time Series Analysis , Retrospective Studies , SARS-CoV-2 , Ultrasonography
ABSTRACT
BACKGROUND: Limited data are available on practical predictors of successful de-cannulation among patients who undergo tracheostomy. We evaluated factors associated with failed de-cannulation to develop a prediction model that could easily be used at the time of weaning from mechanical ventilation (MV). METHODS: In a retrospective cohort of 346 tracheostomised patients managed by a standardized de-cannulation program, multivariable logistic regression analysis identified variables independently associated with failed de-cannulation. Based on this analysis, a new predictive scoring system for successful de-cannulation, referred to as the DECAN score, was developed and then internally validated. RESULTS: The model included age > 67 years, body mass index < 22 kg/m2, underlying malignancy, non-respiratory causes of MV, presence of neurologic disease, vasopressor requirement, presence of post-tracheostomy pneumonia, and presence of delirium. The DECAN score was associated with good calibration (goodness-of-fit, 0.6477) and discrimination (area under the receiver operating characteristic curve 0.890, 95% CI 0.853-0.921). The optimal cut-off point of the DECAN score for predicting successful de-cannulation was ≤ 5 points, with a specificity of 84.6% (95% CI 77.7-90.0) and a sensitivity of 80.2% (95% CI 73.9-85.5). CONCLUSIONS: The DECAN score for tracheostomised patients who are successfully weaned from prolonged MV can be computed at the time of weaning to assess the probability of de-cannulation based on readily available variables.
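To show the mechanics of a DECAN-like score, the sketch below sums binary items and picks a cutoff by Youden's J on the ROC curve. Equal one-point weights are an assumption; the published DECAN score may weight items differently, and the data are synthetic.

```python
# Sketch: a DECAN-style additive score with a Youden-index cutoff.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve

rng = np.random.default_rng(11)
n = 346  # cohort size from the abstract; the item values below are fabricated
items = ["age_gt_67", "bmi_lt_22", "malignancy", "nonresp_cause_of_mv",
         "neuro_disease", "vasopressor", "post_trach_pneumonia", "delirium"]
df = pd.DataFrame({c: rng.integers(0, 2, n) for c in items})
df["score"] = df[items].sum(axis=1)          # 1 point per item (assumed)
df["decann_failed"] = rng.integers(0, 2, n)  # 1 = de-cannulation failed

fpr, tpr, thresholds = roc_curve(df["decann_failed"], df["score"])
best = thresholds[np.argmax(tpr - fpr)]  # Youden's J = sensitivity + specificity - 1
print(f"score >= {best:g} flags a high risk of failed de-cannulation")
```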
Subject(s)
Chest Tubes , Decision Support Techniques , Device Removal , Respiration, Artificial , Tracheostomy/instrumentation , Ventilator Weaning , Adult , Aged , Aged, 80 and over , Device Removal/adverse effects , Equipment Design , Female , Humans , Intensive Care Units , Male , Middle Aged , Predictive Value of Tests , Respiration, Artificial/adverse effects , Retrospective Studies , Risk Assessment , Risk Factors , Tracheostomy/adverse effects , Treatment Outcome , Ventilator Weaning/adverse effects
ABSTRACT
BACKGROUND: Rapid response systems (RRSs) improve patient safety, but the role of dedicated doctors within these systems remains controversial. We aimed to evaluate patient survival rates and differences in the types of interventions performed depending on the presence of dedicated doctors in the RRS. METHODS: Patients managed by the RRSs of 9 centers in South Korea from January 1, 2016, through December 31, 2017, were included retrospectively. We used propensity score-matched analysis to balance patients according to the presence of dedicated doctors in the RRS. The primary outcome was in-hospital survival. The secondary outcomes were the incidence of the interventions performed. A sensitivity analysis was performed in the subgroup of patients diagnosed with sepsis or septic shock. RESULTS: After propensity score matching, 2981 patients were included per group according to the presence of dedicated doctors in the RRS. The presence of dedicated doctors was not associated with patients' overall likelihood of survival (hazard ratio for death 1.05, 95% confidence interval [CI] 0.93-1.20). Interventions such as arterial line insertion (odds ratio [OR] 25.33, 95% CI 15.12-42.44) and kidney replacement therapy (OR 10.77, 95% CI 6.10-19.01) were more commonly performed for patients detected using an RRS with dedicated doctors. The presence of dedicated doctors in the RRS was associated with better survival of patients with sepsis or septic shock (hazard ratio for death 0.62, 95% CI 0.39-0.98) and lower intensive care unit admission rates (OR 0.53, 95% CI 0.37-0.75). CONCLUSIONS: The presence of dedicated doctors within the RRS was not associated with better survival in the overall population but was associated with better survival and lower intensive care unit admission rates for patients with sepsis or septic shock.
Subject(s)
Health Workforce/trends , Hospital Mortality/trends , Hospital Rapid Response Team/trends , Physicians/trends , Propensity Score , Aged , Aged, 80 and over , Female , Humans , Intensive Care Units/trends , Male , Middle Aged , Physicians/supply & distribution , Republic of Korea/epidemiology , Retrospective Studies , Survival Rate/trends , Treatment Outcome
ABSTRACT
BACKGROUND: Rapid response system (RRS) is being increasingly adopted to improve patient safety in hospitals worldwide. However, predictors of survival outcome after RRS activation because of unexpected clinical deterioration are not well defined. We investigated whether hospital length of stay (LOS) before RRS activation can predict the clinical outcomes. METHODS: Using a nationwide multicenter RRS database, we identified patients for whom RRS was activated during hospitalization at 9 tertiary referral hospitals in South Korea between January 1, 2016, and December 31, 2017. All information on patient characteristics, RRS activation, and clinical outcomes were retrospectively collected by reviewing patient medical records at each center. Patients were categorized into two groups according to their hospital LOS before RRS activation: early deterioration (LOS < 5 days) and late deterioration (LOS ≥ 5 days). The primary outcome was 28-day mortality and multivariable logistic regression was used to compare the two groups. In addition, propensity score-matched analysis was used to minimize the effects of confounding factors. RESULTS: Among 11,612 patients, 5779 and 5883 patients belonged to the early and late deterioration groups, respectively. Patients in the late deterioration group were more likely to have malignant disease and to be more severely ill at the time of RRS activation. After adjusting for confounding factors, the late deterioration group had higher 28-day mortality (aOR 1.60, 95% CI 1.44-1.77). Other clinical outcomes (in-hospital mortality and hospital LOS after RRS activation) were worse in the late deterioration group as well, and similar results were found in the propensity score-matched analysis (aOR for 28-day mortality 1.66, 95% CI 1.45-1.91). CONCLUSIONS: Patients who stayed longer in the hospital before RRS activation had worse clinical outcomes. During the RRS team review of patients, hospital LOS before RRS activation should be considered as a predictor of future outcome.