ABSTRACT
BACKGROUND: Glutamine is thought to have beneficial effects on the metabolic and stress response to severe injury. Clinical trials involving patients with burns and other critically ill patients have shown conflicting results regarding the benefits and risks of glutamine supplementation. METHODS: In a double-blind, randomized, placebo-controlled trial, we assigned patients with deep second- or third-degree burns (affecting ≥10% to ≥20% of total body-surface area, depending on age) within 72 hours after hospital admission to receive 0.5 g per kilogram of body weight per day of enterally delivered glutamine or placebo. Trial agents were given every 4 hours through a feeding tube or three or four times a day by mouth until 7 days after the last skin grafting procedure, discharge from the acute care unit, or 3 months after admission, whichever came first. The primary outcome was the time to discharge alive from the hospital, with data censored at 90 days. We calculated subdistribution hazard ratios for discharge alive, which took into account death as a competing risk. RESULTS: A total of 1209 patients with severe burns (mean burn size, 33% of total body-surface area) underwent randomization, and 1200 were included in the analysis (596 patients in the glutamine group and 604 in the placebo group). The median time to discharge alive from the hospital was 40 days (interquartile range, 24 to 87) in the glutamine group and 38 days (interquartile range, 22 to 75) in the placebo group (subdistribution hazard ratio for discharge alive, 0.91; 95% confidence interval [CI], 0.80 to 1.04; P = 0.17). Mortality at 6 months was 17.2% in the glutamine group and 16.2% in the placebo group (hazard ratio for death, 1.06; 95% CI, 0.80 to 1.41). No substantial between-group differences in serious adverse events were observed. CONCLUSIONS: In patients with severe burns, supplemental glutamine did not reduce the time to discharge alive from the hospital. (Funded by the U.S. Department of Defense and the Canadian Institutes of Health Research; RE-ENERGIZE ClinicalTrials.gov number, NCT00985205.).
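As an aside for readers less familiar with competing-risk analyses: the trial's primary estimand treats death as a competing risk for discharge alive, so the cumulative incidence of discharge cannot be read off an ordinary Kaplan-Meier curve. The sketch below is not trial code; the toy data and names are illustrative. It computes the nonparametric Aalen-Johansen cumulative incidence of discharge alive with death as a competing event; the published subdistribution hazard ratios came from a Fine-Gray-type regression, which this does not implement.

```python
import numpy as np

def cumulative_incidence(days, event, cause):
    """Aalen-Johansen estimate of the cumulative incidence of `cause`
    when other event types compete (e.g. 1 = discharge alive, 2 = death,
    0 = censored at 90 days). Returns (time, CIF) pairs."""
    days, event = np.asarray(days, float), np.asarray(event, int)
    surv = 1.0   # all-cause Kaplan-Meier survival just before t
    cif = 0.0
    out = []
    for t in np.unique(days[event > 0]):
        at_risk = np.sum(days >= t)
        d_any = np.sum((days == t) & (event > 0))
        d_cause = np.sum((days == t) & (event == cause))
        cif += surv * d_cause / at_risk   # add S(t-) * cause-specific hazard
        surv *= 1.0 - d_any / at_risk     # update all-cause survival
        out.append((t, cif))
    return out

# Toy data only: follow-up time in days and event type per patient.
days = [12, 30, 40, 55, 90, 18, 25, 61, 90, 47]
event = [1, 1, 2, 1, 0, 1, 2, 1, 0, 1]   # 1 = discharged alive, 2 = died, 0 = censored
print(cumulative_incidence(days, event, cause=1))
```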
Subject(s)
Burns , Enteral Nutrition , Glutamine , Burns/drug therapy , Burns/pathology , Canada , Critical Illness/therapy , Double-Blind Method , Enteral Nutrition/adverse effects , Enteral Nutrition/methods , Glutamine/administration & dosage , Glutamine/adverse effects , Glutamine/therapeutic use , Humans
ABSTRACT
BACKGROUND: Studies that have evaluated the use of intravenous vitamin C in adults with sepsis who were receiving vasopressor therapy in the intensive care unit (ICU) have shown mixed results with respect to the risk of death and organ dysfunction. METHODS: In this randomized, placebo-controlled trial, we assigned adults who had been in the ICU for no longer than 24 hours, who had proven or suspected infection as the main diagnosis, and who were receiving a vasopressor to receive an infusion of either vitamin C (at a dose of 50 mg per kilogram of body weight) or matched placebo administered every 6 hours for up to 96 hours. The primary outcome was a composite of death or persistent organ dysfunction (defined by the use of vasopressors, invasive mechanical ventilation, or new renal-replacement therapy) on day 28. RESULTS: A total of 872 patients underwent randomization (435 to the vitamin C group and 437 to the control group). The primary outcome occurred in 191 of 429 patients (44.5%) in the vitamin C group and in 167 of 434 patients (38.5%) in the control group (risk ratio, 1.21; 95% confidence interval [CI], 1.04 to 1.40; P = 0.01). At 28 days, death had occurred in 152 of 429 patients (35.4%) in the vitamin C group and in 137 of 434 patients (31.6%) in the placebo group (risk ratio, 1.17; 95% CI, 0.98 to 1.40) and persistent organ dysfunction in 39 of 429 patients (9.1%) and 30 of 434 patients (6.9%), respectively (risk ratio, 1.30; 95% CI, 0.83 to 2.05). Findings were similar in the two groups regarding organ-dysfunction scores, biomarkers, 6-month survival, health-related quality of life, stage 3 acute kidney injury, and hypoglycemic episodes. In the vitamin C group, one patient had a severe hypoglycemic episode and another had a serious anaphylaxis event. CONCLUSIONS: In adults with sepsis receiving vasopressor therapy in the ICU, those who received intravenous vitamin C had a higher risk of death or persistent organ dysfunction at 28 days than those who received placebo. (Funded by the Lotte and John Hecht Memorial Foundation; LOVIT ClinicalTrials.gov number, NCT03680274.).
Subject(s)
Ascorbic Acid , Sepsis , Adult , Ascorbic Acid/adverse effects , Humans , Hypoglycemic Agents/therapeutic use , Intensive Care Units , Multiple Organ Failure , Quality of Life , Sepsis/drug therapy , Vasoconstrictor Agents/adverse effects , Vitamins/adverse effects
ABSTRACT
Rationale: It is increasingly recognized that adults with preserved ratio impaired spirometry (PRISm) are prone to increased morbidity. However, the underlying pathophysiological mechanisms are unknown. Objectives: Evaluate the mechanisms of increased dyspnea and reduced exercise capacity in PRISm. Methods: We completed a cross-sectional analysis of the CanCOLD (Canadian Cohort Obstructive Lung Disease) population-based study. We compared physiological responses in 59 participants meeting PRISm spirometric criteria (post-bronchodilator FEV1 < 80% predicted and FEV1/FVC ⩾ 0.7), 264 control participants, and 170 ever-smokers with chronic obstructive pulmonary disease (COPD), at rest and during cardiopulmonary exercise testing. Measurements and Main Results: Individuals with PRISm had lower total lung, vital, and inspiratory capacities than healthy controls (all P < 0.05) and minimal small airway, pulmonary gas exchange, and radiographic parenchymal lung abnormalities. Compared with healthy controls, individuals with PRISm had a higher dyspnea/V̇o2 ratio at peak exercise (4.0 ± 2.2 vs. 2.9 ± 1.9 Borg units/L/min; P < 0.001) and a lower V̇o2peak (74 ± 22% predicted vs. 96 ± 25% predicted; P < 0.001). At standardized submaximal work rates, individuals with PRISm had greater Vt/inspiratory capacity (Vt%IC; P < 0.001), reflecting inspiratory mechanical constraint. In contrast to participants with PRISm, those with COPD had characteristic small airways dysfunction, dynamic hyperinflation, and pulmonary gas exchange abnormalities. Despite these physiological differences among the three groups, the relationship between increasing dyspnea and Vt%IC during cardiopulmonary exercise testing was similar. Resting IC significantly correlated with V̇o2peak (r = 0.65; P < 0.001) in the entire sample, even after adjusting for airflow limitation, gas trapping, and diffusing capacity. Conclusions: In individuals with PRISm, lower exercise capacity and higher exertional dyspnea than healthy controls were mainly explained by lower resting lung volumes and earlier onset of dynamic inspiratory mechanical constraints at relatively low work rates. Clinical trial registered with www.clinicaltrials.gov (NCT00920348).
Subject(s)
Dyspnea , Exercise Tolerance , Pulmonary Disease, Chronic Obstructive , Spirometry , Humans , Male , Dyspnea/physiopathology , Dyspnea/etiology , Female , Cross-Sectional Studies , Middle Aged , Aged , Exercise Tolerance/physiology , Pulmonary Disease, Chronic Obstructive/physiopathology , Exercise Test/methods , Canada , Forced Expiratory Volume/physiology
ABSTRACT
BACKGROUND: On the basis of low-quality evidence, international critical care nutrition guidelines recommend a wide range of protein doses. The effect of delivering high-dose protein during critical illness is unknown. We aimed to test the hypothesis that a higher dose of protein provided to critically ill patients would improve their clinical outcomes. METHODS: This international, investigator-initiated, pragmatic, registry-based, single-blinded, randomised trial was undertaken in 85 intensive care units (ICUs) across 16 countries. We enrolled nutritionally high-risk adults (≥18 years) undergoing mechanical ventilation to compare prescribing high-dose protein (≥2·2 g/kg per day) with usual dose protein (≤1·2 g/kg per day) started within 96 h of ICU admission and continued for up to 28 days or death or transition to oral feeding. Participants were randomly allocated (1:1) to high-dose protein or usual dose protein, stratified by site. As site personnel were involved in both prescribing and delivering protein dose, it was not possible to blind clinicians, but patients were not made aware of the treatment assignment. The primary efficacy outcome was time-to-discharge-alive from hospital up to 60 days after ICU admission and the secondary outcome was 60-day mortality. Patients were analysed in the group to which they were randomly assigned regardless of study compliance, although patients who dropped out of the study before receiving the study intervention were excluded. This study is registered with ClinicalTrials.gov, NCT03160547. FINDINGS: Between Jan 17, 2018, and Dec 3, 2021, 1329 patients were randomised and 1301 (97·9%) were included in the analysis (645 in the high-dose protein group and 656 in the usual dose protein group). By 60 days after randomisation, the cumulative incidence of alive hospital discharge was 46·1% (95% CI 42·0%-50·1%) in the high-dose protein group compared with 50·2% (46·0%-54·3%) in the usual dose protein group (hazard ratio 0·91, 95% CI 0·77-1·07; p=0·27). The 60-day mortality rate was 34·6% (222 of 642) in the high-dose protein group compared with 32·1% (208 of 648) in the usual dose protein group (relative risk 1·08, 95% CI 0·92-1·26). There appeared to be a subgroup effect with higher protein provision being particularly harmful in patients with acute kidney injury and higher organ failure scores at baseline. INTERPRETATION: Delivery of higher doses of protein to mechanically ventilated critically ill patients did not improve the time-to-discharge-alive from hospital and might have worsened outcomes for patients with acute kidney injury and high organ failure scores. FUNDING: None.
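For readers who want to check the arithmetic, the unadjusted 60-day mortality relative risk can be reproduced from the reported counts (222 of 642 vs 208 of 648). The sketch below is illustrative only and uses a plain log-scale Wald interval, which happens to match the published 1·08 (0·92-1·26); the trial's own estimate may have been computed with adjustment or a different variance formula.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Crude risk ratio of group A vs group B with a log-scale Wald 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo, hi = (rr * math.exp(s * z * se_log) for s in (-1, 1))
    return rr, lo, hi

# 60-day mortality: high-dose protein 222/642 vs usual-dose protein 208/648.
rr, lo, hi = risk_ratio(222, 642, 208, 648)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # ~1.08 (0.92-1.26)
```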
Subject(s)
Critical Care , Critical Illness , Adult , Humans , Critical Illness/therapy , Intensive Care Units , Hospitalization , Respiration, Artificial , Registries
ABSTRACT
OBJECTIVES: The association between protein intake and the need for mechanical ventilation (MV) is controversial. We aimed to investigate the associations between protein intake and outcomes in ventilated critically ill patients. DESIGN: Analysis of a subset of a large international point prevalence survey of nutritional practice in ICUs. SETTING: A total of 785 international ICUs. PATIENTS: A total of 12,930 patients had been in the ICU for at least 96 hours and required MV by the fourth day after ICU admission at the latest. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We modeled associations between the adjusted hazard rate (aHR) of death in patients requiring MV and successful weaning (competing risks), and three categories of protein intake (low: < 0.8 g/kg/d, standard: 0.8-1.2 g/kg/d, high: > 1.2 g/kg/d). We compared five different hypothetical protein diets (an exclusively low protein intake, a standard protein intake given early (days 1-4) or late (days 5-11) after ICU admission, and an early or late high protein intake). There was no evidence that the level of protein intake was associated with time to weaning. However, compared with an exclusively low protein intake, a standard protein intake was associated with a lower hazard of death during MV: minimum aHR 0.60 (95% CI, 0.45-0.80). With an early high intake, there was a trend to a higher risk of death in patients requiring MV: maximum aHR 1.35 (95% CI, 0.99-1.85) compared with a standard diet. CONCLUSIONS: The duration of MV does not appear to depend on protein intake, whereas mortality in patients requiring MV may be lower with a standard protein intake. Adverse effects of a high protein intake cannot be excluded.
Subject(s)
Respiration, Artificial , Ventilator Weaning , Humans , Critical Illness/therapy , Intensive Care Units , Hospitalization
ABSTRACT
OBJECTIVES: Across guidelines, protein dosing for critically ill patients with obesity varies considerably. The objective of this analysis was to evaluate whether this population would benefit from higher doses of protein. DESIGN: A post hoc subgroup analysis of the effect of higher protein dosing in critically ill patients with high nutritional risk (EFFORT Protein): an international, multicenter, pragmatic, registry-based randomized trial. SETTING: Eighty-five adult ICUs across 16 countries. PATIENTS: Patients with obesity defined as a body mass index (BMI) greater than or equal to 30 kg/m2 (n = 425). INTERVENTIONS: In the primary study, patients were randomized into a high-dose (≥ 2.2 g/kg/d) or usual-dose protein group (≤ 1.2 g/kg/d). MEASUREMENTS AND MAIN RESULTS: Protein intake was monitored for up to 28 days, and outcomes (time to discharge alive [TTDA], 60-d mortality, days of mechanical ventilation [MV], hospital, and ICU length of stay [LOS]) were recorded until 60 days post-randomization. Of the 1301 patients in the primary study, 425 had a BMI greater than or equal to 30 kg/m2. After adjusting for sites and covariates, we observed a nonsignificant slower rate of TTDA with higher protein that ruled out a clinically important benefit (hazard ratio, 0.78; 95% CI, 0.58-1.05; p = 0.10). We found no evidence of difference in TTDA between protein groups when subgroups with different classes of obesity or patients with and without various nutritional and frailty risk variables were examined, even after the removal of patients with baseline acute kidney injury. Overall, 60-day mortality rates were 31.5% and 28.2% in the high protein and usual protein groups, respectively (risk difference, 3.3%; 95% CI, -5.4 to 12.1; p = 0.46). Duration of MV and LOS in hospital and ICU were not significantly different between groups. CONCLUSIONS: In critically ill patients with obesity, higher protein doses did not improve clinical outcomes, including those with higher nutritional and frailty risk.
Subject(s)
Critical Illness , Frailty , Adult , Humans , Critical Illness/therapy , Obesity , Intensive Care Units , Proportional Hazards Models , Length of Stay
ABSTRACT
INTRODUCTION: Vitamin D insufficiency is common in patients who receive hemodialysis, yet there is no clear guidance regarding surveillance or treatment. We hypothesized that increasing 25(OH)D3 levels is associated with lower phosphate, parathyroid hormone (PTH), and alkaline phosphatase (ALP). METHODS: Baseline 25(OH)D3 level was measured in all patients receiving in-center hemodialysis in June 2017. Laboratory parameters were measured every 6 (phosphate, calcium) or 12 weeks (25(OH)D3, PTH, ALP) until February 2021. In September 2018, a treatment algorithm of 50,000 IU weekly until sufficient followed by 50,000 IU monthly was suggested. Generalized linear mixed regression models including linear spline effects, a log link function, and random effects were used to examine the impact of increasing 25(OH)D3 levels on calcium, phosphate, ALP, and PTH. RESULTS: Of 697 participants, 15% and 57% had vitamin D deficiency (25(OH)D3 <25 nmol/L) and insufficiency (between 25 and 74 nmol/L). Incorporating up to 7,272 observations, increasing 25(OH)D3 was associated with significantly decreasing PTH for 25(OH)D3 levels between 25 and 75 nmol/L regardless of vitamin D treatment. In an interaction model, the negative slope between 25(OH)D3 and PTH remained significant beyond 75 nmol/L in the absence of calcitriol. Increasing 25(OH)D3 was associated with significantly decreasing phosphate for 25(OH)D3 levels between 25 and 75 nmol/L regardless of vitamin D treatment and below 25 nmol/L in values of untreated patients. Calcium increased across the spectrum of 25(OH)D3 regardless of vitamin D treatment. Overall, 0.2% of 25(OH)D3 levels exceeded 250 nmol/L and 2.1% of calcium levels exceeded the normal range. CONCLUSIONS: Vitamin D treatment in a real-world setting was safe and associated with lower PTH levels. Whether improved biochemical markers translate to a reduction in clinical endpoints warrants further study.
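To illustrate the linear-spline idea behind these models (not the study's actual code), hinge terms let the slope of the 25(OH)D3 effect change at the 25 and 75 nmol/L knots. The sketch below fits such a spline to made-up data by least squares on log-transformed PTH; the real analysis used a generalized linear mixed model with a log link and random effects for repeated measures, which this simplification omits.

```python
import numpy as np

def linear_spline_basis(x, knots=(25.0, 75.0)):
    """Design matrix for a piecewise-linear (linear spline) effect of x:
    intercept, x, and hinge terms max(x - knot, 0) so the slope can change
    at each knot (here the 25 and 75 nmol/L 25(OH)D3 thresholds)."""
    x = np.asarray(x, float)
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

# Toy data: 25(OH)D3 (nmol/L) and PTH with a slope that flattens above 75 nmol/L.
rng = np.random.default_rng(0)
vitd = rng.uniform(10, 150, 300)
pth = np.exp(5.5 - 0.02 * np.minimum(vitd, 75) + rng.normal(0, 0.3, 300))

X = linear_spline_basis(vitd)
beta, *_ = np.linalg.lstsq(X, np.log(pth), rcond=None)
slope_25_75 = beta[1] + beta[2]          # slope of log(PTH) between the two knots
print(beta, slope_25_75)
```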
ABSTRACT
BACKGROUND: The EFFORT Protein trial assessed the effect of high vs usual dosing of protein in adult ICU patients with organ failure. This study provides a probabilistic interpretation and evaluates heterogeneity in treatment effects (HTE). METHODS: We analysed 60-day all-cause mortality and time to discharge alive from hospital using Bayesian models with weakly informative priors. HTE on mortality was assessed according to disease severity (Sequential Organ Failure Assessment [SOFA] score), acute kidney injury, and serum creatinine values at baseline. RESULTS: The absolute difference in mortality was 2.5 percentage points (95% credible interval -6.9 to 12.4), with a 72% posterior probability of harm associated with high protein treatment. For time to discharge alive from hospital, the hazard ratio was 0.91 (95% credible interval 0.80 to 1.04) with a 92% probability of harm for the high-dose protein group compared with the usual-dose protein group. There were 97% and 95% probabilities of positive interactions between the high protein intervention and serum creatinine and SOFA score at randomisation, respectively; that is, high protein doses were associated with potentially higher mortality in patients with higher baseline serum creatinine or SOFA scores. CONCLUSIONS: We found moderate to high probabilities of harm with high protein doses compared with usual protein in ICU patients for the primary and secondary outcomes. We found suggestions of heterogeneity in treatment effects, with worse outcomes in participants randomised to high protein doses who had renal dysfunction or acute kidney injury and greater illness severity at baseline.
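As a rough illustration of how such posterior probabilities arise (not the authors' model): if the posterior for the absolute mortality difference is approximated as normal, with mean 2.5 percentage points and a standard deviation back-calculated from the 95% credible interval, the probability that the difference exceeds zero is about 69-70%, close to the reported 72%. The numbers below come from the abstract; the normal approximation is illustrative only, since the published model was fitted to patient-level data.

```python
from scipy.stats import norm

# Posterior summary for the absolute mortality difference (percentage points),
# high-dose minus usual-dose protein: mean 2.5, 95% CrI -6.9 to 12.4.
mean, lo, hi = 2.5, -6.9, 12.4
sd = (hi - lo) / (2 * 1.96)                    # SD implied by a normal posterior

p_harm = 1 - norm.cdf(0, loc=mean, scale=sd)   # P(difference > 0)
print(f"approx. posterior probability of harm: {p_harm:.0%}")  # ~69-70% vs reported 72%
```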
ABSTRACT
OBJECTIVES: Evidence supporting glutamine supplementation in severe adult burn patients has created a state of uncertainty due to the variability in the treatment effect reported across small and large randomized controlled trials (RCTs). We aimed to systematically review the effect of glutamine supplementation on mortality in severe adult burn patients. DATA SOURCES: MEDLINE, Embase, CINAHL, and Cochrane Central were searched from inception to February 10, 2023. STUDY SELECTION: RCTs evaluating the effect of enteral or IV glutamine supplementation alone in severe adult burn patients were included. DATA EXTRACTION: Two reviewers independently extracted data on study characteristics, burn injury characteristics, description of the intervention between groups, adverse events, and clinical outcomes. DATA SYNTHESIS: Random effects meta-analyses were performed to estimate the pooled risk ratio (RR). Trial sequential analyses (TSA) for mortality and infectious complications were performed. Ten RCTs (1,577 patients) were included. We observed no significant effect of glutamine supplementation on overall mortality (RR, 0.65; 95% CI, 0.33-1.28; p = 0.21), infectious complications (RR, 0.83; 95% CI, 0.63-1.09; p = 0.18), or other secondary outcomes. In subgroup analyses, we observed no significant effects based on administration route or burn severity. We did observe a significant subgroup effect between single-center and multicenter RCTs, in which glutamine significantly reduced mortality and infectious complications in single-center RCTs but not in multicenter RCTs. However, TSA suggested that the pooled results of single-center RCTs were likely type 1 errors and that further trials would be futile. CONCLUSIONS: Glutamine supplementation, regardless of administration route, does not appear to improve clinical outcomes in severely burned adult patients.
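For context on the pooling step (illustrative only; not the review's code or data): a DerSimonian-Laird random-effects meta-analysis combines per-trial log risk ratios weighted by the inverse of their within-trial variance plus an estimated between-trial variance tau^2. The sketch below implements that estimator on hypothetical inputs; it does not reproduce the trial sequential analysis or subgroup tests.

```python
import math

def dersimonian_laird(log_rr, var):
    """Random-effects pooled risk ratio using the DerSimonian-Laird tau^2 estimator."""
    k = len(log_rr)
    w = [1 / v for v in var]                                   # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, log_rr))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                         # between-trial variance
    w_re = [1 / (v + tau2) for v in var]                       # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se), tau2)

# Hypothetical per-trial risk ratios and log-scale variances (not the actual trial data).
log_rr = [math.log(x) for x in (0.45, 0.70, 1.10, 0.95)]
var = [0.20, 0.10, 0.05, 0.08]
print(dersimonian_laird(log_rr, var))   # pooled RR, 95% CI, tau^2
```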
Subject(s)
Dietary Supplements , Glutamine , Humans , Adult , Glutamine/therapeutic use , Length of Stay , Multicenter Studies as Topic
ABSTRACT
BACKGROUND: Vitamin K activates matrix Gla protein (MGP), a key inhibitor of vascular calcification. There is a high prevalence of sub-clinical vitamin K deficiency in patients with end-stage kidney disease. METHODS: A parallel randomized placebo-controlled pilot trial was designed to determine whether 10 mg of phylloquinone thrice weekly versus placebo modifies coronary artery calcification progression over 12 months in patients requiring hemodialysis with a coronary artery calcium score (CAC) ≥30 Agatston Units (ClinicalTrials.gov identifier NCT01528800). The primary outcome was feasibility (recruitment rate, compliance with study medication, study completion and adherence overall to study protocol). CAC score was used to assess calcification at baseline and 12 months. Secondary objectives were to explore the impact of phylloquinone on vitamin K-related biomarkers (phylloquinone, dephospho-uncarboxylated MGP and the Gla-osteocalcin to Glu-osteocalcin ratio) and events of clinical interest. RESULTS: A total of 86 patients with a CAC score ≥30 Agatston Units were randomized to either 10 mg of phylloquinone or a matching placebo three times per week. In all, 69 participants (80%) completed the trial. Recruitment rate (4.4 participants/month) and medication compliance (96%) met pre-defined feasibility criteria of ≥4.17 and ≥90%, respectively. Patients randomized to phylloquinone for 12 months had significantly reduced levels of dephospho-uncarboxylated MGP (86% reduction) and increased levels of phylloquinone and Gla-osteocalcin to Glu-osteocalcin ratio compared with placebo. There was no difference in the absolute or relative progression of coronary artery calcification between groups. CONCLUSION: We demonstrated that phylloquinone treatment improves vitamin K status and that a fully powered randomized trial may be feasible.
Subject(s)
Coronary Artery Disease , Vascular Calcification , Humans , Vitamin K/therapeutic use , Vitamin K 1/therapeutic use , Osteocalcin/therapeutic use , Pilot Projects , Coronary Artery Disease/drug therapy , Vascular Calcification/drug therapy , Calcium-Binding Proteins , Extracellular Matrix Proteins , Renal Dialysis , Vitamin K 2/pharmacology
ABSTRACT
BACKGROUND: Based on low-quality evidence, current nutrition guidelines recommend the delivery of high-dose protein in critically ill patients. The EFFORT Protein trial showed that higher protein dose is not associated with improved outcomes, whereas the effects in critically ill patients who developed acute kidney injury (AKI) need further evaluation. The overall aim is to evaluate the effects of high-dose protein in critically ill patients who developed different stages of AKI. METHODS: In this post hoc analysis of the EFFORT Protein trial, we investigated the effect of high versus usual protein dose (≥ 2.2 vs. ≤ 1.2 g/kg body weight/day) on time-to-discharge alive from the hospital (TTDA) and 60-day mortality, overall and in different subgroups, in critically ill patients with AKI as defined by the Kidney Disease Improving Global Outcomes (KDIGO) criteria within 7 days of ICU admission. The associations of protein dose with incidence and duration of kidney replacement therapy (KRT) were also investigated. RESULTS: Of the 1329 randomized patients, 312 developed AKI and were included in this analysis (163 in the high and 149 in the usual protein dose group). High protein was associated with a slower TTDA (hazard ratio 0.5, 95% CI 0.4-0.8) and higher 60-day mortality (relative risk 1.4; 95% CI 1.1-1.8). Effect modification was not statistically significant for any subgroup, and no subgroups suggested a beneficial effect of higher protein, although the harmful effect of a higher protein target appeared to disappear in patients who received KRT. Protein dose was not significantly associated with the incidence of AKI and KRT or duration of KRT. CONCLUSIONS: In critically ill patients with AKI, high protein may be associated with worse outcomes in all AKI stages. Recommendation of higher protein dosing in AKI patients should be carefully re-evaluated to avoid potential harmful effects, especially in patients who were not treated with KRT. TRIAL REGISTRATION: This study was registered at ClinicalTrials.gov (NCT03160547) on May 17, 2017.
Subject(s)
Acute Kidney Injury , Critical Illness , Humans , Acute Kidney Injury/therapy , Critical Illness/therapy , Critical Illness/epidemiology , Hospitalization , Intensive Care Units , Length of Stay , Renal Replacement Therapy
ABSTRACT
AIMS AND OBJECTIVES: The aim of the study was to investigate the effect of supporting family members to partner with health professionals on nutrition intakes and decision-making and to evaluate intervention and study feasibility. BACKGROUND: Family partnerships can improve outcomes for critically ill patients and family members. Interventions that support families to engage with health professionals require evaluation. DESIGN: A multi-centre, parallel-group, superiority Phase II randomised controlled trial. METHODS: In nine intensive care units (ICUs) across three countries, critically ill patients ≥60 years, or those 55-59 years with advanced chronic diseases and expected ICU length of stay >72 h, and their family member were enrolled between 9 May 2017 and 31 March 2020. Participants were randomised (1:1:1) to either a decision support or nutrition optimisation family-centred intervention, or usual care. Primary outcomes included protein and energy intake during ICU and hospital stay (nutrition intervention) and family satisfaction (decision support). Study feasibility was assessed as a composite of consent rate, intervention adherence, contamination and physician awareness of intervention assignment. RESULTS: We randomised 135 patients/family members (consent rate 51.7%). The average rate of randomisation was 0.5 (0.13-1.53) per month. Unavailability (staff/family) was the major contributor to families not being approached for consent. Declined consent was attributed to families feeling overwhelmed (58/126, 46%). Pandemic visitor restrictions contributed to early study cessation. Intervention adherence for the decision support intervention was 76.9%-100.0% and for the nutrition intervention was 44.8%-100.0%. Nutritional adequacy, decisional conflict, satisfaction with decision-making and overall family satisfaction with ICU were similar for all groups. CONCLUSIONS: Active partnerships between family members and health professionals are important but can be challenging to achieve in critical care contexts. We were unable to demonstrate the efficacy of either intervention. Feasibility outcomes suggest further refinement of interventions and study protocol may be warranted. RELEVANCE TO CLINICAL PRACTICE: Interventions to promote family partnerships in critical illness are needed but require a greater understanding of the extent to which families want and are able to engage and the activities in which they have most impact. REPORTING METHOD: This study has been reported following the Consolidated Standards of Reporting Trials (CONSORT) and the Template for Intervention Description and Replication (TIDieR) guidelines. PATIENT OR PUBLIC CONTRIBUTION: Patients and caregivers were engaged in and contributed to the development and subsequent iterations of the two family-centred interventions used in this study. CLINICAL TRIAL REGISTRATION NUMBER: ClinicalTrials.gov ID: NCT02920086. Registered on 30 September 2016. First patient enrolled on 9 May 2017. https://clinicaltrials.gov/ct2/results?cond=&term=NCT02920086&cntry=&state=&city=&dist=.
Subject(s)
Critical Illness , Nutritional Status , Humans , Length of Stay , Intensive Care Units , Critical Care
ABSTRACT
Importance: The efficacy of vitamin C for hospitalized patients with COVID-19 is uncertain. Objective: To determine whether vitamin C improves outcomes for patients with COVID-19. Design, Setting, and Participants: Two prospectively harmonized randomized clinical trials enrolled critically ill patients receiving organ support in intensive care units (90 sites) and patients who were not critically ill (40 sites) between July 23, 2020, and July 15, 2022, on 4 continents. Interventions: Patients were randomized to receive vitamin C administered intravenously or control (placebo or no vitamin C) every 6 hours for 96 hours (maximum of 16 doses). Main Outcomes and Measures: The primary outcome was a composite of organ support-free days defined as days alive and free of respiratory and cardiovascular organ support in the intensive care unit up to day 21 and survival to hospital discharge. Values ranged from -1 organ support-free days for patients experiencing in-hospital death to 22 organ support-free days for those who survived without needing organ support. The primary analysis used a bayesian cumulative logistic model. An odds ratio (OR) greater than 1 represented efficacy (improved survival, more organ support-free days, or both), an OR less than 1 represented harm, and an OR less than 1.2 represented futility. Results: Enrollment was terminated after statistical triggers for harm and futility were met. The trials had primary outcome data for 1568 critically ill patients (1037 in the vitamin C group and 531 in the control group; median age, 60 years [IQR, 50-70 years]; 35.9% were female) and 1022 patients who were not critically ill (456 in the vitamin C group and 566 in the control group; median age, 62 years [IQR, 51-72 years]; 39.6% were female). Among critically ill patients, the median number of organ support-free days was 7 (IQR, -1 to 17 days) for the vitamin C group vs 10 (IQR, -1 to 17 days) for the control group (adjusted proportional OR, 0.88 [95% credible interval {CrI}, 0.73 to 1.06]) and the posterior probabilities were 8.6% (efficacy), 91.4% (harm), and 99.9% (futility). Among patients who were not critically ill, the median number of organ support-free days was 22 (IQR, 18 to 22 days) for the vitamin C group vs 22 (IQR, 21 to 22 days) for the control group (adjusted proportional OR, 0.80 [95% CrI, 0.60 to 1.01]) and the posterior probabilities were 2.9% (efficacy), 97.1% (harm), and greater than 99.9% (futility). Among critically ill patients, survival to hospital discharge was 61.9% (642/1037) for the vitamin C group vs 64.6% (343/531) for the control group (adjusted OR, 0.92 [95% CrI, 0.73 to 1.17]) and the posterior probability was 24.0% for efficacy. Among patients who were not critically ill, survival to hospital discharge was 85.1% (388/456) for the vitamin C group vs 86.6% (490/566) for the control group (adjusted OR, 0.86 [95% CrI, 0.61 to 1.17]) and the posterior probability was 17.8% for efficacy. Conclusions and Relevance: In hospitalized patients with COVID-19, vitamin C had low probability of improving the primary composite outcome of organ support-free days and hospital survival. Trial Registration: ClinicalTrials.gov Identifiers: NCT04401150 (LOVIT-COVID) and NCT02735707 (REMAP-CAP).
Subject(s)
COVID-19 , Sepsis , Humans , Female , Middle Aged , Male , Ascorbic Acid/therapeutic use , Critical Illness/therapy , Critical Illness/mortality , Hospital Mortality , Bayes Theorem , Randomized Controlled Trials as Topic , Vitamins/therapeutic use , Sepsis/drug therapy
ABSTRACT
BACKGROUND: Efforts to manage obesity through weight loss are often unsuccessful as most adults are not able to sustain the major changes in behaviour that are required to maintain weight loss long term. We sought to determine whether small changes in physical activity and diet prevent weight gain in adults with overweight and obesity. METHODS: We randomized 320 sedentary adults with overweight or obesity to monitoring alone (MA, n = 160) or a small change approach (SCA, n = 160). In Phase I (2 yr), MA participants were asked to maintain their normal lifestyle and SCA participants were counselled to make small changes in diet and physical activity, namely a suggested increase in daily step count of 2000 steps with a decrease in energy intake of 100 kcal per day, with group and individual support. Phase II (1 yr) was a passive follow-up period. The difference in change in body weight between groups at 24 and 36 months from baseline was the primary outcome. Additional outcomes included waist circumference and cardiorespiratory fitness. RESULTS: Overall, 268 participants (83.8%) completed the 2-year intervention, and 239 (74.7%) returned at the end of the follow-up period at 3 years. The difference in body weight change between the SCA and MA groups was significant at 3, 6, 12 and 15 months from baseline, but was no longer significant at 24 months (mean change 0.9 [standard error (SE) 0.5] kg v. -0.4 [SE 0.5] kg; difference -0.6, 95% confidence interval [CI] -1.9 to 0.8) or at 36 months (-1.2 [SE 0.8] v. -0.7 [SE 0.8] kg; difference -0.5, 95% CI -2.2 to 1.2). Changes in waist circumference and cardiorespiratory fitness were not significantly different between groups at 24 or 36 months (both p > 0.1). INTERPRETATION: The SCA did not prevent weight gain compared with monitoring alone at 2 or 3 years in adults with overweight or obesity. On average, we observed prevention of weight gain in both arms of the trial. TRIAL REGISTRATION: ClinicalTrials.gov, no. NCT02027077.
Subject(s)
Obesity , Overweight , Adult , Exercise , Humans , Obesity/prevention & control , Overweight/prevention & control , Weight Gain , Weight Loss
ABSTRACT
BACKGROUND: Proteins are an essential part of medical nutrition therapy in critically ill patients. Guidelines almost universally recommend a high protein intake without robust evidence supporting its use. METHODS: Using a large international database, we modelled associations between the hazard rate of in-hospital death and live hospital discharge (competing risks) and three categories of protein intake (low: < 0.8 g/kg per day, standard: 0.8-1.2 g/kg per day, high: > 1.2 g/kg per day) during the first 11 days after ICU admission (acute phase). Time-varying cause-specific hazard ratios (HR) were calculated from piece-wise exponential additive mixed models. We used the estimated model to compare five different hypothetical protein diets (an exclusively low protein diet, a standard protein diet administered early (day 1 to 4) or late (day 5 to 11) after ICU admission, and an early or late high protein diet). RESULTS: Of 21,100 critically ill patients in the database, 16,489 fulfilled inclusion criteria for the analysis. By day 60, 11,360 (68.9%) patients had been discharged from hospital, 4,192 patients (25.4%) had died in hospital, and 937 patients (5.7%) were still hospitalized. Median daily low protein intake was 0.49 g/kg [IQR 0.27-0.66], standard intake 0.99 g/kg [IQR 0.89-1.09], and high intake 1.41 g/kg [IQR 1.29-1.60]. In comparison with an exclusively low protein diet, a late standard protein diet was associated with a lower hazard of in-hospital death: minimum HR 0.75 (95% CI 0.64, 0.87), and a higher hazard of live hospital discharge: maximum HR 1.98 (95% CI 1.72, 2.28). Results on hospital discharge, however, were qualitatively changed by a sensitivity analysis. There was no evidence that an early standard or a high protein intake during the acute phase was associated with a further improvement of outcome. CONCLUSIONS: Provision of a standard protein intake during the late acute phase may improve outcome compared to an exclusively low protein diet. In unselected critically ill patients, clinical outcome may not be improved by a high protein intake during the acute phase. Study registration ID number ISRCTN17829198.
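A piece-wise exponential hazard model of this kind can be fitted, in its simplest form, as a Poisson regression on person-interval data with the log of the time at risk as an offset: interval indicators give the baseline hazard and covariate terms give cause-specific hazard ratios. The sketch below uses simulated data and hypothetical variable names, and it omits the additive (spline) time-varying effects and mixed-model components of the published analysis, so it is a conceptual illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical person-interval data: each row is one patient-interval with its
# exposure time, a death indicator, and the protein category in that interval.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "interval": rng.integers(1, 12, n),                   # ICU day 1-11
    "exposure": rng.uniform(0.5, 1.0, n),                 # days at risk in the interval
    "protein": rng.choice(["low", "standard", "high"], n),
    "death": rng.binomial(1, 0.08, n),
})

# Piecewise exponential model: Poisson GLM for the event indicator with a
# log(exposure) offset; exponentiated protein coefficients are hazard ratios vs low.
fit = smf.glm("death ~ C(interval) + C(protein, Treatment('low'))",
              data=df, family=sm.families.Poisson(),
              offset=np.log(df["exposure"])).fit()
print(np.exp(fit.params.filter(like="protein")))
```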
Subject(s)
Critical Illness , Nutrition Therapy , Databases, Factual , Hospital Mortality , Humans , Intensive Care Units
ABSTRACT
INTRODUCTION: Real-world evidence on the timing and efficacy of enteral nutrition (EN) practices in intensive care unit (ICU) patients with circulatory shock is limited. We hypothesized early EN (EEN), as compared to delayed EN (DEN), is associated with improved clinical outcomes in mechanically ventilated (MV) patients with circulatory shock. METHODS: We analyzed a dataset from an international, multicenter, pragmatic randomized clinical trial (RCT) evaluating protein dose in ICU patients. Data were collected from ICU admission, and EEN was defined as initiating < 48 h from ICU admission and DEN > 48 h. We identified MV patients in circulatory shock to evaluate the association between the timing of EN initiation and clinical outcomes. The regression analysis model controlled for age, mNUTRIC score, APACHE II score, sepsis, and Site. RESULTS: We included 626 patients, from 52 ICUs in 14 countries. Median age was 60 years [18-93], 55% had septic shock, 99% received norepinephrine alone, 91% received EN alone, and 50.3% were randomized to a usual protein dose. Forty-two percent of EEN patients had persistent organ dysfunction syndrome plus death at day 28, compared to 53% in the DEN group (p = 0.04). EEN was associated with more ICU-free days (9.3 ± 9.2 vs. 5.7 ± 7.9, p = 0.0002), more days alive and free of vasopressors (7.1 ± 3.1 vs. 6.3 ± 3.2, p = 0.007), and shorter duration of MV among survivors (9.8 ± 10.9 vs. 13.8 ± 14.5, p = 0.0002). This trend was no longer observed in the adjusted analysis. There were no differences in ICU/60-day mortality or feeding intolerance rates between groups. CONCLUSION: In MV patients with circulatory shock, EEN, as compared to DEN, was associated with improved clinical outcomes, but no longer when adjusting for illness severity. RCTs comparing the efficacy of EEN to DEN in MV patients with circulatory shock are warranted.
Subject(s)
Enteral Nutrition , Sepsis , Humans , Infant, Newborn , Intensive Care Units , Middle Aged , Prospective Studies , Respiration, Artificial
ABSTRACT
Urinary inulin clearance is considered the gold standard of glomerular filtration rate (GFR) measurement but plasma clearance of less expensive and more accessible tracers is more commonly performed. Many plasma sampling protocols exist but little is known about their accuracy. Here, the study objectives were to compare plasma iohexol and 99mTc-DTPA GFR with varying sampling strategies to the GFR measured by urinary inulin and to identify protocols with the greatest accuracy according to clinical characteristics. GFR was measured simultaneously using urinary inulin, plasma iohexol, and plasma 99mTc DTPA clearance. Blood was sampled from 2 to 10 hours after injection. For each method, bias, precision, and accuracy (P30 and mean absolute error) were calculated for the entire cohort and for eGFR-EPI creatinine subgroups (<30, 30-59, and ≥60 ml/min/1.73m2) and the edema stage using urinary inulin clearance as the gold standard. The mean inulin GFR of the 77 participants was 33 ml/min/1.73m2. Delay of both the initial and the final samples in plasma iohexol protocols yielded the highest accuracy in the setting of low GFR (<30 ml/min/1.73m2). Early initial and final samples yielded the highest accuracy in the setting of high GFRs (≥60 ml/min/1.73m2). No sampling strategy was accurate in edematous patients. Thus, our study demonstrates that customization of GFR protocols according to the anticipated level of GFR are required to optimize protocol accuracy.
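For readers unfamiliar with these agreement metrics, the sketch below (hypothetical numbers, not study data) shows how bias, precision, P30 (the percentage of candidate measurements falling within ±30% of the reference), and mean absolute error are computed against the urinary inulin reference.

```python
import numpy as np

def agreement_metrics(measured, reference):
    """Bias, precision (SD of the error), P30, and mean absolute error of a
    candidate GFR protocol against the reference (urinary inulin clearance)."""
    measured, reference = np.asarray(measured, float), np.asarray(reference, float)
    err = measured - reference
    bias = err.mean()
    precision = err.std(ddof=1)
    p30 = np.mean(np.abs(err) <= 0.30 * reference) * 100   # % within +/-30% of reference
    mae = np.abs(err).mean()
    return bias, precision, p30, mae

# Hypothetical paired measurements (ml/min/1.73m2), not study data.
iohexol = [28, 35, 41, 22, 55, 30, 47, 26]
inulin  = [30, 33, 45, 20, 60, 28, 50, 24]
print(agreement_metrics(iohexol, inulin))
```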
Subject(s)
Iohexol , Technetium Tc 99m Pentetate , Glomerular Filtration Rate , Humans , Inulin , Kidney Function Tests
ABSTRACT
OBJECTIVES: To determine the incidence of enteral feed intolerance, identify factors associated with enteral feed intolerance, and assess the relationship between enteral feed intolerance and key nutritional and clinical outcomes in critically ill patients. DESIGN: Analysis of the International Nutrition Survey database collected prospectively from 2007 to 2014. SETTING: Seven-hundred eighty-five ICUs from around the world. PATIENTS: Mechanically ventilated adults with ICU stay greater than or equal to 72 hours who received enteral nutrition during the first 12 ICU days. INTERVENTIONS: None. MEASUREMENT AND MAIN RESULTS: We defined enteral feed intolerance as interrupted feeding due to one of the following reasons: high gastric residual volumes, increased abdominal girth, distension, subjective discomfort, emesis, or diarrhea. The current analysis included 15,918 patients. Of these, 4,036 (24%) had at least one episode of enteral feed intolerance. The enteral feed intolerance rate increased from 1% on day 1 to 6% on days 4 and 5 and declined daily thereafter. After controlling for site and patient covariates, patients with burn (odds ratio 1.46; 95% CI, 1.07-1.99), gastrointestinal (odds ratio 1.45; 95% CI, 1.27-1.66), and sepsis (odds ratio 1.34; 95% CI, 1.17-1.54) admission diagnoses were more likely to develop enteral feed intolerance than patients with a respiratory-related admission diagnosis. Enteral feed intolerance patients received about 10% less enteral nutrition intake than patients without enteral feed intolerance, after controlling for important covariates including severity of illness. Enteral feed intolerance patients had fewer ventilator-free days and longer ICU length of stay and time to discharge alive (all p < 0.0001). The daily mortality hazard rate increased by a factor of 1.5 (1.4-1.6; p < 0.0001) once enteral feed intolerance occurred. CONCLUSIONS: Enteral feed intolerance occurs frequently during enteral nutrition delivery in the critically ill. Burn and gastrointestinal patients had the highest risk of developing enteral feed intolerance. Enteral feed intolerance is associated with lower enteral nutrition delivery and worse clinical outcomes. Identification, prevention, and optimal management of enteral feed intolerance may improve nutrition delivery and clinical outcomes in important "at risk" populations.
Subject(s)
Critical Illness/therapy , Enteral Nutrition/adverse effects , Respiration, Artificial/adverse effects , Adolescent , Adult , Aged , Aged, 80 and over , Databases as Topic , Female , Humans , Incidence , Intensive Care Units/statistics & numerical data , Male , Middle Aged , Nutritional Status , Retrospective Studies , Risk Factors , Treatment Outcome , Young Adult
ABSTRACT
OBJECTIVE: Carotid endarterectomy is recommended for the prevention of ischaemic stroke due to carotid stenosis. However, the risk of stroke after carotid endarterectomy has been estimated at 2% - 5%. Monitoring intra-operative cerebral oxygenation with near infrared spectroscopy (NIRS) has been assessed as a strategy to reduce intra- and post-operative complications. The aim was to summarise the diagnostic accuracy of NIRS to detect intra-operative ischaemic events, the values associated with ischaemic events, and the relative contribution of external carotid contamination to the NIRS signal in adults undergoing carotid endarterectomy. DATA SOURCES: EMBASE, MEDLINE, Cochrane Central Register of Controlled Trials, and reference lists through May 2019 were searched. REVIEW METHODS: Non-randomised and randomised studies assessing NIRS as an intra-operative monitoring tool in carotid endarterectomy were included. Studies using NIRS as the reference were excluded. Risk of bias was assessed using the Newcastle Ottawa Scale, RoB-2, and QUADAS-2. RESULTS: Seventy-six studies were included (n = 8 480), under local (n = 1 864) or general (n = 6 582) anaesthesia. Seven studies were eligible for meta-analysis (n = 524). As a tool for identifying intra-operative ischaemia, specificity increased with more stringent NIRS thresholds, while there was unpredictable variation in sensitivity across studies. A Δ20% threshold under local anaesthesia resulted in pooled estimates for sensitivity and specificity of 70.5% (95% confidence interval, CI, 54.1 - 82.9) and 92.4% (95% CI 85.5 - 96.1) compared with awake neurological monitoring. These studies had low or unclear risk of bias. NIRS signal consistently dropped across clamping and recovered to pre-clamp values upon de-clamp in most studies, and larger decreases were observed in patients with ischaemic events. The contribution of extracranial signal to change in signal across clamp varied from 3% to 50%. CONCLUSION: NIRS has low sensitivity and high specificity to identify intra-operative ischaemia compared with awake monitoring. Extracranial signal contribution was highly variable. Ultimately, data from high quality studies are needed to determine the utility of NIRS.
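As a small illustration of the underlying 2x2 calculations (hypothetical counts, not values extracted from the included studies): sensitivity and specificity of a NIRS threshold against awake neurological monitoring, each with a Wilson score interval. The published pooled estimates came from meta-analysis across studies, which this sketch does not reproduce.

```python
from statsmodels.stats.proportion import proportion_confint

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity of a NIRS threshold against the reference
    (awake neurological monitoring), each with a 95% Wilson score interval."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    sens_ci = proportion_confint(tp, tp + fn, method="wilson")
    spec_ci = proportion_confint(tn, tn + fp, method="wilson")
    return (sens, sens_ci), (spec, spec_ci)

# Hypothetical 2x2 counts for a 20% NIRS-drop threshold in a single study.
print(sens_spec(tp=12, fn=5, tn=85, fp=7))
```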
Subject(s)
Carotid Stenosis/diagnostic imaging , Carotid Stenosis/surgery , Endarterectomy, Carotid , Monitoring, Intraoperative , Spectroscopy, Near-Infrared , Humans , Sensitivity and Specificity
ABSTRACT
STUDY OBJECTIVE: Little is known about the cause or optimal treatment of hyperemesis in habitual cannabis users. Anecdotal evidence supports the use of haloperidol over traditional antiemetics for this newly recognized disorder. We compare haloperidol with ondansetron for cannabis hyperemesis syndrome. METHODS: We randomized cannabis users with active emesis to either haloperidol (with a nested randomization to either 0.05 or 0.1 mg/kg) or ondansetron 8 mg intravenously in a triple-blind fashion. The primary outcome was the reduction from baseline in abdominal pain and nausea (each measured on a 10-cm visual analog scale) at 2 hours after treatment. Although the trial allowed for crossover, the primary analysis used only the first treatment period because few subjects crossed over. RESULTS: We enrolled 33 subjects, of whom 30 (16 men, aged 29 years [SD 11 years] using 1.5 g/day [SD 0.9 g/day] since age 19 years [SD 2 years]) received at least 1 treatment (haloperidol 13, ondansetron 17). Haloperidol at either dose was superior to ondansetron (difference 2.3 cm [95% confidence interval 0.6 to 4.0 cm]; P=.01), with similar improvements in both pain and nausea, as well as less use of rescue antiemetics (31% versus 59%; difference -28% [95% confidence interval -61% to 13%]) and shorter time to emergency department (ED) departure (3.1 hours [SD 1.7] versus 5.6 hours [SD 4.5]; difference 2.5 hours [95% confidence interval 0.1 to 5.0 hours]; P=.03). There were 2 return visits for acute dystonia, both in the higher-dose haloperidol group. CONCLUSION: In this clinical trial, haloperidol was superior to ondansetron for the acute treatment of cannabis-associated hyperemesis. The efficacy of haloperidol over ondansetron provides insight into the pathophysiology of this now common diagnosis in many EDs.