Results 1 - 20 of 61
1.
Clin Infect Dis ; 77(2): 186-193, 2023 07 26.
Article in English | MEDLINE | ID: mdl-36996150

ABSTRACT

BACKGROUND: The vast majority of coronavirus disease 2019 (COVID-19) occurs in outpatients, where treatment is limited to antivirals for high-risk subgroups. Acebilustat, a leukotriene B4 inhibitor, has potential to reduce inflammation and symptom duration. METHODS: In a single-center trial spanning the Delta and Omicron variants, outpatients were randomized to 100 mg/d of oral acebilustat or placebo for 28 days. Patients reported daily symptoms via electronic query through day 28 with phone follow-up on day 120 and collected nasal swab samples on days 1-10. The primary outcome was sustained symptom resolution to day 28. Secondary 28-day outcomes included time to first symptom resolution, area under the curve (AUC) for longitudinal daily symptom scores, duration of viral shedding through day 10, and symptoms on day 120. RESULTS: Sixty participants were randomized to each study arm. At enrollment, the median symptom duration was 4 days (interquartile range, 3-5 days), and the median number of symptoms was 9 (interquartile range, 7-11). Most patients (90%) were vaccinated, with 73% having neutralizing antibodies. A minority of participants (44%; 35% in the acebilustat arm and 53% in the placebo arm) had sustained symptom resolution at day 28 (hazard ratio, 0.6 [95% confidence interval, .34-1.04]; P = .07, favoring placebo). There was no difference in the mean AUC for symptom scores over 28 days (difference in mean AUC, 9.4 [95% confidence interval, -42.1 to 60.9]; P = .72). Acebilustat did not affect viral shedding or symptoms at day 120. CONCLUSIONS: Sustained symptoms through day 28 were common in this low-risk population. Despite this, leukotriene B4 antagonism with acebilustat did not shorten symptom duration in outpatients with COVID-19. Clinical Trials Registration: NCT04662060.
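The arm-level percentages above can be turned back into counts (60 per arm, so roughly 21 acebilustat vs. 32 placebo responders) for a crude two-proportion comparison. The sketch below uses a Wald interval on the risk difference; it is an illustration only and is not the trial's analysis, which compared arms with a time-to-event hazard ratio.

```python
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Difference in two proportions with a Wald 95% CI (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Counts back-calculated from the reported 35% and 53% of 60 patients per arm
diff, lo, hi = risk_difference_ci(21, 60, 32, 60)
```

A Wald interval ignores censoring and event timing, so it can disagree with the trial's survival analysis; it is shown only to make the reported percentages concrete.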


Subject(s)
COVID-19 , Humans , SARS-CoV-2 , Leukotriene B4 , Outpatients , Double-Blind Method , Treatment Outcome
2.
Crit Care Med ; 51(12): e269-e274, 2023 12 01.
Article in English | MEDLINE | ID: mdl-37695136

ABSTRACT

OBJECTIVES: Interleukin-18 (IL-18) plasma level and latent class analysis (LCA) have separately been shown to predict prognosis and treatment response in acute respiratory distress syndrome (ARDS). IL-18 is a measure of inflammasome activation, a pathway potentially distinct from inflammation captured by biomarkers defining previously published LCA classes. We hypothesized that elevated IL-18 would identify distinct "high-risk" patients not captured by prior LCA classifications. DESIGN: Statins for Acutely Injured Lungs from Sepsis (SAILS) and the Hydroxymethylglutaryl-CoA Reductase Inhibition with Simvastatin in Acute Lung Injury to Reduce Pulmonary Dysfunction trial (HARP-2) are two large randomized, controlled trials in ARDS in which both LCA assignments and IL-18 levels were shown to predict mortality. We first evaluated the overlap between high IL-18 levels (≥ 800 pg/mL) and prior LCA class assignments using McNemar's test and then tested the correlation between IL-18 and LCA biomarkers using the Pearson correlation on log2-transformed values. Our primary analysis was the association of IL-18 level with 60-day mortality in the hypoinflammatory LCA class, which was assessed using the Fisher exact test and Cox proportional hazards modeling adjusting for age, Acute Physiology and Chronic Health Evaluation score, and gender. Secondary analyses included the association of IL-18 and LCA with mortality within each IL-18/LCA subgroup. SETTING: Secondary analysis of two multicenter, randomized controlled clinical trials of ARDS patients. SUBJECTS: Six hundred eighty-three patients in SAILS and 511 patients in HARP-2. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We found that 33% of patients in SAILS and HARP-2 were discordant by IL-18 level and LCA class. We further found that IL-18 level was only modestly correlated (0.17-0.47) with cytokines used in the LCA assignment.
A substantial subset of individuals classified as hypoinflammatory by LCA (14% of SAILS and 43% of HARP-2) were classified as high risk by elevated IL-18. These individuals were at high risk for mortality in both SAILS (42% 60-d mortality, odds ratio [OR] 3.3; 95% CI, 1.8-6.1; p < 0.001) and HARP-2 (27% 60-d mortality, OR 2.1; 95% CI, 1.2-3.8; p = 0.009). CONCLUSIONS: Plasma IL-18 level provides important additional prognostic information to LCA subphenotypes defined largely by traditional inflammatory biomarkers in two large ARDS cohorts.
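The discordance between the IL-18 and LCA classifications can be tested with McNemar's test on the paired 2x2 agreement table. The cell counts below are illustrative assumptions, chosen only to total 683 patients with roughly 33% discordance (echoing SAILS); they are not the study's data.

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact two-sided McNemar p-value from the two discordant cell counts.

    Here b = high IL-18 but LCA-hypoinflammatory, and c = low IL-18 but
    LCA-hyperinflammatory; under the null the discordant patients split 50/50.
    """
    n = b + c
    if n == 0:
        return 1.0
    p = 2.0 * sum(comb(n, i) for i in range(min(b, c) + 1)) * 0.5 ** n
    return min(p, 1.0)

# Illustrative agreement table (assumed counts, not study data):
# concordant cells a/d, discordant cells b/c; total 683 as in SAILS
a, b, c, d = 150, 120, 105, 308
discordant_fraction = (b + c) / (a + b + c + d)   # ~0.33
p_value = mcnemar_exact(b, c)
```

The exact binomial form is used here instead of the chi-square approximation so the sketch stays self-contained; the paper does not report which variant was applied.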


Subject(s)
Interleukin-18 , Respiratory Distress Syndrome , Humans , Latent Class Analysis , Retrospective Studies , Cytokines , Randomized Controlled Trials as Topic , Respiratory Distress Syndrome/therapy , Biomarkers , Interleukin-8
3.
Crit Care ; 27(1): 126, 2023 03 28.
Article in English | MEDLINE | ID: mdl-36978134

ABSTRACT

BACKGROUND: Two acute respiratory distress syndrome (ARDS) trials showed no benefit for statin therapy, though secondary analyses suggest inflammatory subphenotypes may have a differential response to simvastatin. Statin medications decrease cholesterol levels, and low cholesterol has been associated with increased mortality in critical illness. We hypothesized that patients with ARDS and sepsis with low cholesterol could be harmed by statins. METHODS: Secondary analysis of patients with ARDS and sepsis from two multicenter trials. We measured total cholesterol from frozen plasma samples obtained at enrollment in the Statins for Acutely Injured Lungs from Sepsis (SAILS) and Simvastatin in the Acute Respiratory Distress Syndrome (HARP-2) trials, which randomized subjects with ARDS to rosuvastatin versus placebo and simvastatin versus placebo, respectively, for up to 28 days. We compared the lowest cholesterol quartile (< 69 mg/dL in SAILS, < 44 mg/dL in HARP-2) versus all other quartiles for association with 60-day mortality and medication effect. Fisher's exact test, logistic regression, and Cox proportional hazards models were used to assess mortality. RESULTS: There were 678 subjects with cholesterol measured in SAILS and 509 subjects in HARP-2, of whom 384 had sepsis. Median cholesterol at enrollment was 97 mg/dL in both SAILS and HARP-2. Low cholesterol was associated with higher APACHE III score and shock prevalence in SAILS, and higher Sequential Organ Failure Assessment score and vasopressor use in HARP-2. Importantly, the effect of statins differed in these trials. In SAILS, patients with low cholesterol who received rosuvastatin were more likely to die (odds ratio (OR) 2.23, 95% confidence interval (95% CI) 1.06-4.77, p = 0.02; interaction p = 0.02).
In contrast, in HARP-2, low cholesterol patients had lower mortality if randomized to simvastatin, though this did not reach statistical significance in the smaller cohort (OR 0.44, 95% CI 0.17-1.07, p = 0.06; interaction p = 0.22). CONCLUSIONS: Cholesterol levels are low in two cohorts with sepsis-related ARDS, and those in the lowest cholesterol quartile are sicker. Despite the very low levels of cholesterol, simvastatin therapy seems safe and may reduce mortality in this group, though rosuvastatin was associated with harm.
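The reported treatment-by-cholesterol interaction can be approximated as a ratio of stratum-specific odds ratios. The stratified counts below are hypothetical; the trials' analyses fit logistic regression models with an interaction term, which this simple calculation only mimics.

```python
def odds_ratio(table):
    """Odds ratio for a 2x2 table: rows = statin/placebo, cols = died/survived."""
    (a, b), (c, d) = table
    return (a * d) / (b * c)

# Hypothetical stratified mortality counts -- illustrative only, not trial data
low_chol = [[20, 25], [10, 35]]     # lowest cholesterol quartile
other_q  = [[30, 140], [32, 138]]   # all other quartiles

or_low, or_other = odds_ratio(low_chol), odds_ratio(other_q)
interaction_ratio = or_low / or_other  # >1: statin harm concentrated in low quartile
```

The ratio of odds ratios is the unadjusted analogue of the interaction coefficient in a logistic model; a formal analysis would also adjust for severity covariates.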


Subject(s)
Hydroxymethylglutaryl-CoA Reductase Inhibitors , Respiratory Distress Syndrome , Sepsis , Humans , Hydroxymethylglutaryl-CoA Reductase Inhibitors/pharmacology , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Rosuvastatin Calcium/pharmacology , Rosuvastatin Calcium/therapeutic use , Simvastatin/pharmacology , Simvastatin/therapeutic use , Respiratory Distress Syndrome/therapy , Sepsis/complications
4.
JAMA ; 329(14): 1170-1182, 2023 04 11.
Article in English | MEDLINE | ID: mdl-37039791

ABSTRACT

Importance: Preclinical models suggest dysregulation of the renin-angiotensin system (RAS) caused by SARS-CoV-2 infection may increase the relative activity of angiotensin II compared with angiotensin (1-7) and may be an important contributor to COVID-19 pathophysiology. Objective: To evaluate the efficacy and safety of RAS modulation using 2 investigational RAS agents, TXA-127 (synthetic angiotensin [1-7]) and TRV-027 (an angiotensin II type 1 receptor-biased ligand), that are hypothesized to potentiate the action of angiotensin (1-7) and mitigate the action of angiotensin II. Design, Setting, and Participants: Two randomized clinical trials including adults hospitalized with acute COVID-19 and new-onset hypoxemia were conducted at 35 sites in the US between July 22, 2021, and April 20, 2022; last follow-up visit: July 26, 2022. Interventions: A 0.5-mg/kg intravenous infusion of TXA-127 once daily for 5 days or placebo. A 12-mg/h continuous intravenous infusion of TRV-027 for 5 days or placebo. Main Outcomes and Measures: The primary outcome was oxygen-free days, an ordinal outcome that classifies a patient's status at day 28 based on mortality and duration of supplemental oxygen use; an adjusted odds ratio (OR) greater than 1.0 indicated superiority of the RAS agent vs placebo. A key secondary outcome was 28-day all-cause mortality. Safety outcomes included allergic reaction, new kidney replacement therapy, and hypotension. Results: Both trials met prespecified early stopping criteria for a low probability of efficacy. Of 343 patients in the TXA-127 trial (226 [65.9%] aged 31-64 years, 200 [58.3%] men, 225 [65.6%] White, and 274 [79.9%] not Hispanic), 170 received TXA-127 and 173 received placebo. Of 290 patients in the TRV-027 trial (199 [68.6%] aged 31-64 years, 168 [57.9%] men, 195 [67.2%] White, and 225 [77.6%] not Hispanic), 145 received TRV-027 and 145 received placebo.
Compared with placebo, both TXA-127 (unadjusted mean difference, -2.3 [95% CrI, -4.8 to 0.2]; adjusted OR, 0.88 [95% CrI, 0.59 to 1.30]) and TRV-027 (unadjusted mean difference, -2.4 [95% CrI, -5.1 to 0.3]; adjusted OR, 0.74 [95% CrI, 0.48 to 1.13]) resulted in no difference in oxygen-free days. In the TXA-127 trial, 28-day all-cause mortality occurred in 22 of 163 patients (13.5%) in the TXA-127 group vs 22 of 166 patients (13.3%) in the placebo group (adjusted OR, 0.83 [95% CrI, 0.41 to 1.66]). In the TRV-027 trial, 28-day all-cause mortality occurred in 29 of 141 patients (20.6%) in the TRV-027 group vs 18 of 140 patients (12.9%) in the placebo group (adjusted OR, 1.52 [95% CrI, 0.75 to 3.08]). The frequency of the safety outcomes was similar with either TXA-127 or TRV-027 vs placebo. Conclusions and Relevance: In adults with severe COVID-19, RAS modulation (TXA-127 or TRV-027) did not improve oxygen-free days vs placebo. These results do not support the hypotheses that pharmacological interventions that selectively block the angiotensin II type 1 receptor or increase angiotensin (1-7) improve outcomes for patients with severe COVID-19. Trial Registration: ClinicalTrials.gov Identifier: NCT04924660.
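Oxygen-free days is an ordinal outcome that folds mortality into the scale. One common construction, assumed here for illustration (the trial's exact scoring rules live in its protocol), counts days alive and free of supplemental oxygen through day 28 and assigns deaths a rank worse than any duration of oxygen use:

```python
def oxygen_free_days(alive_at_day_28: bool, days_on_oxygen: int,
                     death_value: int = -1) -> int:
    """Days alive and free of supplemental oxygen through day 28.

    Assigning deaths a value below zero (-1 here is an assumed convention,
    not taken from the trial protocol) ranks mortality worse than 28 days
    of continuous oxygen use in the ordinal analysis.
    """
    return 28 - days_on_oxygen if alive_at_day_28 else death_value
```

Because the outcome is ordinal, the trials analyzed it with a proportional odds model yielding the adjusted ORs quoted above, rather than comparing means directly.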


Subject(s)
COVID-19 , Receptor, Angiotensin, Type 1 , Renin-Angiotensin System , Vasodilator Agents , Adult , Female , Humans , Male , Middle Aged , Angiotensin II/metabolism , Angiotensins/administration & dosage , Angiotensins/therapeutic use , COVID-19/complications , COVID-19/mortality , COVID-19/physiopathology , COVID-19/therapy , Hypoxia/drug therapy , Hypoxia/etiology , Hypoxia/mortality , Infusions, Intravenous , Ligands , Oligopeptides/administration & dosage , Oligopeptides/therapeutic use , Randomized Controlled Trials as Topic , Receptor, Angiotensin, Type 1/administration & dosage , Receptor, Angiotensin, Type 1/therapeutic use , Renin-Angiotensin System/drug effects , SARS-CoV-2 , Vasodilator Agents/administration & dosage , Vasodilator Agents/therapeutic use
5.
N Engl J Med ; 381(26): 2529-2540, 2019 12 26.
Article in English | MEDLINE | ID: mdl-31826336

ABSTRACT

BACKGROUND: Vitamin D deficiency is a common, potentially reversible contributor to morbidity and mortality among critically ill patients. The potential benefits of vitamin D supplementation in acute critical illness require further study. METHODS: We conducted a randomized, double-blind, placebo-controlled, phase 3 trial of early vitamin D3 supplementation in critically ill, vitamin D-deficient patients who were at high risk for death. Randomization occurred within 12 hours after the decision to admit the patient to an intensive care unit. Eligible patients received a single enteral dose of 540,000 IU of vitamin D3 or matched placebo. The primary end point was 90-day all-cause, all-location mortality. RESULTS: A total of 1360 patients were found to be vitamin D-deficient during point-of-care screening and underwent randomization. Of these patients, 1078 had baseline vitamin D deficiency (25-hydroxyvitamin D level, <20 ng per milliliter [50 nmol per liter]) confirmed by subsequent testing and were included in the primary analysis population. The mean day 3 level of 25-hydroxyvitamin D was 46.9±23.2 ng per milliliter (117±58 nmol per liter) in the vitamin D group and 11.4±5.6 ng per milliliter (28±14 nmol per liter) in the placebo group (difference, 35.5 ng per milliliter; 95% confidence interval [CI], 31.5 to 39.6). The 90-day mortality was 23.5% in the vitamin D group (125 of 531 patients) and 20.6% in the placebo group (109 of 528 patients) (difference, 2.9 percentage points; 95% CI, -2.1 to 7.9; P = 0.26). There were no clinically important differences between the groups with respect to secondary clinical, physiological, or safety end points. The severity of vitamin D deficiency at baseline did not affect the association between the treatment assignment and mortality. 
CONCLUSIONS: Early administration of high-dose enteral vitamin D3 did not provide an advantage over placebo with respect to 90-day mortality or other, nonfatal outcomes among critically ill, vitamin D-deficient patients. (Funded by the National Heart, Lung, and Blood Institute; VIOLET ClinicalTrials.gov number, NCT03096314.).
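The reported separation in day 3 25-hydroxyvitamin D levels can be reconstructed from the group means and SDs with a Wald interval on the difference in means. The per-group sample size below (135) is an assumption made for illustration; the abstract does not state how many patients had day 3 levels measured, so the computed interval need not match the published one.

```python
from math import sqrt

def mean_diff_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Difference in group means with a Wald (normal-approximation) 95% CI."""
    diff = m1 - m2
    se = sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return diff, diff - z * se, diff + z * se

# Means/SDs from the abstract (ng/mL); n = 135 per group is an assumed,
# illustrative count, not a number reported in the paper
diff, lo, hi = mean_diff_ci(46.9, 23.2, 135, 11.4, 5.6, 135)
```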


Subject(s)
Cholecalciferol/administration & dosage , Critical Illness/therapy , Vitamin D Deficiency/drug therapy , Vitamins/administration & dosage , Adult , Cholecalciferol/adverse effects , Critical Illness/mortality , Double-Blind Method , Female , Humans , Kaplan-Meier Estimate , Length of Stay , Male , Middle Aged , Organ Dysfunction Scores , Treatment Failure , Vitamin D/analogs & derivatives , Vitamin D/blood , Vitamins/adverse effects
6.
Crit Care ; 25(1): 404, 2021 11 23.
Article in English | MEDLINE | ID: mdl-34814925

ABSTRACT

Identifying new effective treatments for the acute respiratory distress syndrome (ARDS), including COVID-19 ARDS, remains a challenge. The field of ARDS investigation is moving increasingly toward innovative approaches such as the personalization of therapy to biological and clinical sub-phenotypes. Additionally, there is growing recognition of the importance of the global context to identify effective ARDS treatments. This review highlights emerging opportunities and continued challenges for personalizing therapy for ARDS, from identifying treatable traits to innovative clinical trial design and recognition of patient-level factors as the field of critical care investigation moves forward into the twenty-first century.


Subject(s)
Precision Medicine , Respiratory Distress Syndrome/therapy , COVID-19/complications , Clinical Trials as Topic , Humans , Respiratory Distress Syndrome/virology
7.
Dysphagia ; 36(5): 831-841, 2021 10.
Article in English | MEDLINE | ID: mdl-33156398

ABSTRACT

The mechanisms responsible for aspiration are relatively unknown in patients recovering from acute respiratory failure (ARF) who required mechanical ventilation. Though many conditions may contribute to swallowing dysfunction, alterations in laryngeal structure and swallowing function likely play a role in the development of aspiration. At four university-based tertiary medical centers, we conducted a prospective cohort study of ARF patients who required intensive care and mechanical ventilation for at least 48 h. Within 72 h after extubation, a Fiberoptic Flexible Endoscopic Evaluation of Swallowing (FEES) examination was performed. Univariate and multivariable analyses examined the relationship between laryngeal structure and swallowing function abnormalities. Aspiration was the primary outcome, defined as a Penetration-Aspiration Scale (PAS) score of 6 or greater. Two other salient signs of dysphagia, spillage and residue, were secondary outcomes. A total of 213 patients were included in the final analysis. Aspiration was detected in 70 patients (33%) on at least one bolus. The most commonly aspirated consistency was thin liquids (27%). In univariate analyses, several abnormalities in laryngeal anatomy and structural movement were significantly associated with aspiration, spillage, and residue. In a multivariable analysis, the only variables that remained significantly associated with aspiration were pharyngeal weakness (odds ratio = 2.57, 95% CI = 1.16-5.84, p = 0.019) and upper airway edema (odds ratio = 3.24, 95% CI = 1.44-7.66, p = 0.004). These results demonstrated that dysphagia in ARF survivors is multifactorial and characterized by both anatomic and physiologic abnormalities. These findings may have important implications for the development of novel interventions to treat dysphagia in ARF survivors. Clinical Trials Registration: ClinicalTrials.gov Identifier: NCT02363686, Aspiration in Acute Respiratory Failure Survivors.
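The Penetration-Aspiration Scale used here is an 8-point ordinal scale; this study defines aspiration as PAS >= 6, and record 8 below further splits that into nonsilent (PAS 6-7) and silent (PAS 8) aspiration. A small classifier makes the thresholds explicit:

```python
def classify_pas(score: int) -> str:
    """Categorize a Penetration-Aspiration Scale score the way these
    cohort studies do (aspiration = PAS >= 6; PAS 8 = silent aspiration)."""
    if not 1 <= score <= 8:
        raise ValueError("PAS scores range from 1 (normal) to 8")
    if score <= 5:
        return "no aspiration"           # normal swallow or penetration only
    if score <= 7:
        return "nonsilent aspiration"    # aspiration with a protective response
    return "silent aspiration"           # aspiration with no response (PAS 8)
```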


Subject(s)
Deglutition Disorders , Respiratory Insufficiency , Deglutition , Deglutition Disorders/etiology , Humans , Prospective Studies , Respiratory Aspiration/etiology , Respiratory Insufficiency/etiology , Survivors
8.
Crit Care Med ; 48(11): 1604-1611, 2020 11.
Article in English | MEDLINE | ID: mdl-32804785

ABSTRACT

OBJECTIVES: To determine whether a modifiable risk factor, endotracheal tube size, is associated with the diagnosis of postextubation aspiration in survivors of acute respiratory failure. DESIGN: Prospective cohort study. SETTING: ICUs at four academic tertiary care medical centers. PATIENTS: Two hundred ten patients who were at least 18 years old, admitted to an ICU, and mechanically ventilated with an endotracheal tube for longer than 48 hours were enrolled. INTERVENTIONS: Within 72 hours of extubation, all patients received a flexible endoscopic evaluation of swallowing examination that entailed administration of ice, thin liquid, thick liquid, puree, and cracker boluses. Patient demographics, treatment variables, and hospital outcomes were abstracted from the patient's medical records. Endotracheal tube size was independently selected by the patient's treating physicians. MEASUREMENTS AND MAIN RESULTS: For each flexible endoscopic evaluation of swallowing examination, laryngeal pathology was evaluated, and for each bolus, a Penetration Aspiration Scale score was assigned. Aspiration (Penetration Aspiration Scale score ≥ 6) was further categorized into nonsilent aspiration (Penetration Aspiration Scale score = 6 or 7) and silent aspiration (Penetration Aspiration Scale score = 8). One third of patients (n = 68) aspirated (Penetration Aspiration Scale score ≥ 6) on at least one bolus, 13.6% (n = 29) exhibited silent aspiration, and 23.8% (n = 50) exhibited nonsilent aspiration. In a multivariable analysis, endotracheal tube size (≤ 7.5 vs ≥ 8.0) was significantly associated with patients exhibiting any aspiration (Penetration Aspiration Scale score ≥ 6) (p = 0.016; odds ratio = 2.17; 95% CI 1.14-4.13) and with risk of developing laryngeal granulation tissue (p = 0.02). CONCLUSIONS: Larger endotracheal tube size was associated with increased risk of aspiration and laryngeal granulation tissue. 
Using smaller endotracheal tubes may reduce the risk of postextubation aspiration.
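The association between larger tube size and aspiration can be illustrated as an odds ratio from a 2x2 table with a Woolf (log-OR) confidence interval. The cell counts below are hypothetical, chosen only to be consistent with the cohort totals (210 patients, 68 aspirating); the published OR of 2.17 came from a multivariable model, not a raw table.

```python
from math import exp, log, sqrt

def or_with_woolf_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-OR) 95% CI from 2x2 cell counts."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts consistent with the cohort totals:
# rows = tube >= 8.0 / tube <= 7.5, cols = aspiration yes/no
or_, lo, hi = or_with_woolf_ci(48, 72, 20, 70)
```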


Subject(s)
Deglutition , Intubation, Intratracheal/instrumentation , Respiratory Aspiration/etiology , Respiratory Insufficiency/therapy , Aged , Deglutition/physiology , Female , Humans , Intensive Care Units , Intubation, Intratracheal/adverse effects , Intubation, Intratracheal/methods , Male , Middle Aged , Prospective Studies , Respiration, Artificial/adverse effects , Respiration, Artificial/instrumentation , Respiration, Artificial/methods , Risk Factors , Survivors/statistics & numerical data
9.
Dysphagia ; 34(4): 521-528, 2019 08.
Article in English | MEDLINE | ID: mdl-30694412

ABSTRACT

Dysphagia is common in hospitalized patients post-extubation and associated with poor outcomes. Laryngeal sensation is critical for airway protection and safe swallowing. However, current understanding of the relationship between laryngeal sensation and aspiration in post-extubation populations is limited. Acute respiratory failure patients requiring intensive care unit admission and mechanical ventilation received a Flexible Endoscopic Evaluation of Swallowing (FEES) within 72 h of extubation. Univariate and multivariable analyses were performed to examine the relationship between laryngeal sensation, length of intubation, and aspiration. Secondary outcomes included pharyngolaryngeal secretions, pneumonia, and diet recommendations. One-hundred and three patients met inclusion criteria. Fifty-one patients demonstrated an absent laryngeal adductor reflex (LAR). Altered laryngeal sensation correlated with the presence of secretions (p = 0.004). There was a significant interaction between the LAR, aspiration, and duration of mechanical ventilation. Altered laryngeal sensation was significantly associated with aspiration on FEES only in patients with a shorter length of intubation (p = 0.008). Patients with altered laryngeal sensation were prescribed significantly more restricted liquid (p = 0.03) and solid (p = 0.001) diets. No relationship was found between laryngeal sensation and pneumonia. There is a high prevalence of laryngeal sensory deficits in mechanically ventilated patients post-extubation. Altered laryngeal sensation was associated with secretions, aspiration, and modified diet recommendations especially in those patients with a shorter length of mechanical ventilation. These results demonstrate that laryngeal sensory abnormalities impact the development of post-extubation dysphagia.


Subject(s)
Intubation, Intratracheal/adverse effects , Laryngeal Diseases/etiology , Pneumonia, Aspiration/etiology , Sensation Disorders/etiology , Acute Disease , Female , Humans , Laryngeal Diseases/physiopathology , Larynx/physiopathology , Male , Middle Aged , Prospective Studies , Respiratory Insufficiency , Sensation Disorders/physiopathology , Time Factors
10.
Crit Care Med ; 45(5): 798-805, 2017 May.
Article in English | MEDLINE | ID: mdl-28240689

ABSTRACT

OBJECTIVES: Effective pharmacologic treatments directly targeting lung injury in patients with the acute respiratory distress syndrome are lacking. Early treatment with inhaled corticosteroids and beta agonists may reduce progression to acute respiratory distress syndrome by reducing lung inflammation and enhancing alveolar fluid clearance. DESIGN: Double-blind, randomized clinical trial (ClinicalTrials.gov: NCT01783821). The primary outcome was longitudinal change in oxygen saturation divided by the FIO2 (S/F) through day 5. We also analyzed categorical change in S/F by greater than 20%. Other outcomes included need for mechanical ventilation and development of acute respiratory distress syndrome. SETTING: Five academic centers in the United States. PATIENTS: Adult patients admitted through the emergency department at risk for acute respiratory distress syndrome. INTERVENTIONS: Aerosolized budesonide/formoterol versus placebo bid for up to 5 days. MEASUREMENTS AND MAIN RESULTS: Sixty-one patients were enrolled from September 3, 2013, to June 9, 2015. Median time from presentation to first study drug was less than 9 hours. More patients in the control group had shock at enrollment (14 vs 3 patients). The longitudinal increase in S/F was greater in the treatment group (p = 0.02) and independent of shock (p = 0.04). Categorical change in S/F improved (p = 0.01) but not after adjustment for shock (p = 0.15). More patients in the placebo group developed acute respiratory distress syndrome (7 vs 0) and required mechanical ventilation (53% vs 21%). CONCLUSIONS: Early treatment with inhaled budesonide/formoterol in patients at risk for acute respiratory distress syndrome is feasible and improved oxygenation as assessed by S/F. These results support further study to test the efficacy of inhaled corticosteroids and beta agonists for prevention of acute respiratory distress syndrome.
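The S/F outcome above is simply oxygen saturation divided by the fraction of inspired oxygen, and the categorical endpoint flags a greater-than-20% improvement. A minimal sketch:

```python
def sf_ratio(spo2_pct: float, fio2: float) -> float:
    """Oxygen saturation (%) divided by the fraction of inspired oxygen."""
    return spo2_pct / fio2

def improved_over_20pct(baseline_sf: float, followup_sf: float) -> bool:
    """Categorical S/F endpoint: did S/F rise by more than 20%?"""
    return (followup_sf - baseline_sf) / baseline_sf > 0.20

# Example: SpO2 of 95% on 40% oxygen gives S/F = 237.5
```

The trial's primary analysis modeled the longitudinal change in S/F through day 5 rather than a single before/after comparison; these helpers only show how the measure itself is computed.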


Subject(s)
Adrenal Cortex Hormones/administration & dosage , Adrenergic beta-Agonists/administration & dosage , Budesonide, Formoterol Fumarate Drug Combination/administration & dosage , Hypoxia/drug therapy , Respiratory Distress Syndrome/prevention & control , Academic Medical Centers , Administration, Inhalation , Aged , Aged, 80 and over , Biomarkers , Double-Blind Method , Drug Therapy, Combination , Female , Humans , Hypoxia/complications , Male , Middle Aged , Oxygen/blood , Patient Acuity , Respiration, Artificial , Respiratory Distress Syndrome/etiology , Risk Factors , United States
11.
Environ Sci Technol ; 51(3): 1168-1175, 2017 02 07.
Article in English | MEDLINE | ID: mdl-28074652

ABSTRACT

In southeast New Hampshire, where reformulated gasoline was used from the 1990s to 2007, methyl tert-butyl ether (MtBE) concentrations ≥0.2 µg/L were found in water from 26.7% of 195 domestic wells sampled in 2005. Ten years later in 2015, and eight years after MtBE was banned, 10.3% continue to have MtBE. Most wells (140 of 195) had no MtBE detections (concentrations <0.2 µg/L) in 2005 and 2015. Of the remaining wells, MtBE concentrations increased in 4 wells, decreased in 47 wells, and did not change in 4 wells. On average, MtBE concentrations decreased 65% among 47 wells whereas MtBE concentrations increased 17% among 4 wells between 2005 and 2015. The percent change in detection frequency from 2005 to 2015 (the decontamination rate) was lowest (45.5%) in high-population-density areas and in wells completed in the Berwick Formation geologic units. The decontamination rate was the highest (78.6%) where population densities were low and wells were completed in bedrock composed of granite, metamorphic, and mafic rocks. Wells in the Berwick Formation are characteristically deeper and have lower yields than wells in other rock types and have shallower overburden cover, which may allow for more rapid transport of MtBE from land-surface releases. Low-yielding, deep bedrock wells may require large contributing areas to achieve adequate well yield, and thus have a greater chance of intercepting MtBE, in addition to diluting contaminants at a slower rate and thus requiring more time to decontaminate.
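The detection frequencies and "decontamination rate" above reduce to simple arithmetic on well counts. The counts below (52 of 195 wells in 2005, 20 of 195 in 2015) are back-calculated from the reported percentages:

```python
def detection_frequency(n_detect: int, n_wells: int) -> float:
    """Percent of sampled wells with MtBE at or above 0.2 ug/L."""
    return 100.0 * n_detect / n_wells

def decontamination_rate(freq_early: float, freq_late: float) -> float:
    """Percent decline in detection frequency between two surveys."""
    return 100.0 * (freq_early - freq_late) / freq_early

f2005 = detection_frequency(52, 195)   # ~26.7%
f2015 = detection_frequency(20, 195)   # ~10.3%
overall_rate = decontamination_rate(f2005, f2015)  # ~61.5% decline overall
```

The 45.5% and 78.6% subgroup rates quoted in the abstract are the same calculation applied within population-density and geology strata.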


Subject(s)
Gasoline , Methyl Ethers , Geology , New Hampshire , tert-Butyl Alcohol
12.
Expert Rev Proteomics ; 13(5): 457-69, 2016 05.
Article in English | MEDLINE | ID: mdl-27031735

ABSTRACT

The acute respiratory distress syndrome (ARDS) is a common cause of acute respiratory failure, and is associated with substantial mortality and morbidity. Dozens of clinical trials targeting ARDS have failed, with no drug specifically targeting lung injury in widespread clinical use. Thus, the need for drug development in ARDS is great. Targeted proteomic studies in ARDS have identified many key pathways in the disease, including inflammation, epithelial injury, endothelial injury or activation, and disordered coagulation and repair. Recent studies reveal the potential for proteomic changes to identify novel subphenotypes of ARDS patients who may be most likely to respond to therapy and could thus be targeted for enrollment in clinical trials. Nontargeted studies of proteomics in ARDS are just beginning and have the potential to identify novel drug targets and key pathways in the disease. Proteomics will play an important role in phenotyping of patients and developing novel therapies for ARDS in the future.


Subject(s)
Proteomics , Respiratory Distress Syndrome/metabolism , Drug Discovery , Humans , Inflammation , Molecular Targeted Therapy , Respiratory Distress Syndrome/pathology , Respiratory Distress Syndrome/therapy
13.
Am J Respir Crit Care Med ; 191(3): 302-8, 2015 Feb 01.
Article in English | MEDLINE | ID: mdl-25517213

ABSTRACT

RATIONALE: In 2005, the lung allocation score (LAS) was implemented to prioritize organ allocation to minimize waiting-list mortality and maximize 1-year survival. It resulted in transplantation of older and sicker patients without changing 1-year survival. Its effect on resource use is unknown. OBJECTIVES: To determine changes in resource use over time in lung transplant admissions. METHODS: Solid organ transplant recipients were identified within the Nationwide Inpatient Sample (NIS) data from 2000 to 2011. Joinpoint regression methodology was performed to identify a time point of change in mean total hospital charges among lung transplant and other solid-organ transplant recipients. Two temporal lung transplant recipient cohorts identified by joinpoint regression were compared for baseline characteristics and resource use, including total charges for index hospitalization, charges per day, length of stay, discharge disposition, tracheostomy, and need for extracorporeal membrane oxygenation. MEASUREMENTS AND MAIN RESULTS: A significant point of increased total hospital charges occurred for lung transplant recipients in 2005, corresponding to LAS implementation, which was not seen in other solid-organ transplant recipients. Total transplant hospital charges increased by 40% in the post-LAS cohort ($569,942 [$53,229] vs. $407,489 [$28,360]) along with an increased median length of stay, daily charges, and discharge disposition other than to home. Post-LAS recipients also had higher post-transplant use of extracorporeal membrane oxygenation (odds ratio, 2.35; 95% confidence interval, 1.56-3.55) and higher incidence of tracheostomy (odds ratio, 1.52; 95% confidence interval, 1.22-1.89). CONCLUSIONS: LAS implementation is associated with a significant increase in resource use during index hospitalization for lung transplant.
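Joinpoint regression locates the year at which a trend changes. A toy version on synthetic charge data: fit two separate least-squares lines for each candidate breakpoint and keep the split with the lowest total squared error. Production joinpoint software adds continuity constraints and permutation tests, which this sketch omits.

```python
def ols(xs, ys):
    """Simple least-squares line fit; returns intercept, slope, and SSE."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return intercept, slope, sse

def best_joinpoint(years, values, min_seg=3):
    """Return the year starting the second segment that minimizes total SSE."""
    best_sse, best_year = float("inf"), None
    for i in range(min_seg, len(years) - min_seg + 1):
        sse = ols(years[:i], values[:i])[2] + ols(years[i:], values[i:])[2]
        if sse < best_sse:
            best_sse, best_year = sse, years[i]
    return best_year

# Synthetic mean-charge series (in $1000s): flat through 2004, then a jump
# and steady rise from 2005, mimicking the LAS-era pattern described above
years = list(range(2000, 2012))
charges = [400] * 5 + [570, 600, 630, 660, 690, 720, 750]
```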


Subject(s)
Health Resources/statistics & numerical data , Length of Stay/economics , Lung Diseases/economics , Lung Transplantation/economics , Patient Selection , Extracorporeal Membrane Oxygenation/economics , Female , Humans , Lung Diseases/surgery , Lung Transplantation/mortality , Male , Middle Aged , Patient Admission/economics , Patient Discharge/economics , Tissue and Organ Procurement/economics , United States , Waiting Lists
15.
Sci Total Environ ; 919: 170838, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38340869

ABSTRACT

Large variations in redox-related water parameters, like pH and dissolved oxygen (DO), have been documented in New Hampshire (United States) drinking-water wells over the course of a few hours under pumping conditions. These findings suggest that comparable sub-daily variability in dissolved concentrations of redox-reactive and toxic arsenic (As) also may occur, representing a potentially critical public-health data gap and a fundamental challenge for long-term As-trends monitoring. To test this hypothesis, discrete groundwater As samples were collected approximately hourly during one day in May and again in August 2019 from three New Hampshire drinking-water wells (2 public-supply, 1 private) under active pumping conditions. Collected samples were assessed by laboratory analysis (total As [AsTot], As(III), As(V)) and by field analysis (AsTot) using a novel integrated biosensor system. Laboratory analysis revealed sub-daily variability (range) in AsTot concentrations equivalent to 16%-36% of that observed in the antecedent 3-year bimonthly trend monitoring. Thus, the results indicated that, along with previously demonstrated seasonality effects, the timing and duration of pumping are important considerations when assessing trends in drinking-water As exposures and concomitant risks. Results also illustrated the utility of the field sensor for monitoring and management of AsTot exposures in near-real-time.
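Expressing sub-daily variability as a percentage of the longer-term trend-monitoring range is a simple ratio of ranges. The concentration series below are invented for illustration; the study's 16%-36% figures came from its own hourly and bimonthly data.

```python
def pct_of_trend_range(subdaily_range: float, trend_range: float) -> float:
    """Sub-daily As variability as a percent of the long-term monitoring range."""
    return 100.0 * subdaily_range / trend_range

# Invented concentrations in ug/L -- illustrative only, not the study's data
hourly_as = [4.1, 4.6, 5.0, 4.4, 5.3, 4.8]        # one pumping day
bimonthly_as = [3.0, 4.2, 5.5, 6.9, 5.1, 4.0]     # multi-year trend record
pct = pct_of_trend_range(max(hourly_as) - min(hourly_as),
                         max(bimonthly_as) - min(bimonthly_as))
```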


Subject(s)
Arsenic , Drinking Water , Groundwater , Water Pollutants, Chemical , United States , Water Wells , Water Supply , New Hampshire , Arsenic/analysis , Environmental Monitoring/methods , Water Pollutants, Chemical/analysis , Drinking Water/analysis
16.
J Am Coll Emerg Physicians Open ; 5(3): e13192, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38887225

ABSTRACT

Objectives: Patients hospitalized for COVID-19 frequently develop hypoxemia and acute respiratory distress syndrome (ARDS) after admission. In non-COVID-19 ARDS studies, admission to hospital wards with subsequent transfer to the intensive care unit (ICU) is associated with worse outcomes. We hypothesized that initial admission to the ward may affect outcomes in patients with COVID-19 ARDS. Methods: This was a retrospective study of consecutive adults admitted for COVID-19 ARDS between March 2020 and March 2021 at Stanford Health Care. Mortality scores at hospital admission (Coronavirus Clinical Characterization Consortium Mortality Score [4C score]) and ICU admission (Simplified Acute Physiology Score III [SAPS-III]) were calculated, as well as the ROX index for patients on high-flow nasal oxygen. Patients were classified by emergency department (ED) disposition (ward-first vs. ICU-direct), and 28- and 60-day mortality and highest level of respiratory support within 1 day of ICU admission were compared. A second cohort (April 2021‒July 2022, n = 129) was phenotyped to validate the mortality outcome. Results: A total of 157 patients were included, 48% of whom were first admitted to the ward (n = 75). Ward-first patients had more comorbidities, including lung disease. Ward-first patients had lower 4C and similar SAPS-III scores, yet increased mortality at 28 days (32% vs. 17%, hazard ratio [HR] 2.0, 95% confidence interval [95% CI] 1.0‒3.7, p = 0.039) and 60 days (39% vs. 23%, HR 1.83, 95% CI 1.04‒3.22, p = 0.037) compared to ICU-direct patients. More ward-first patients escalated to mechanical ventilation on day 1 of ICU admission (36% vs. 14%, p = 0.002) despite similar ROX indices. Ward-first patients who were upgraded to the ICU within 48 h of ED presentation had the highest mortality. Mortality findings were replicated in a sensitivity analysis.
Conclusion: Despite similar baseline risk scores, ward-first patients with COVID-19 ARDS had increased mortality and escalation to mechanical ventilation compared to ICU-direct patients. Ward-first patients requiring ICU upgrade within 48 h were at highest risk, highlighting a need for improved identification of this group at ED admission.
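The ROX index used above for patients on high-flow nasal oxygen is commonly defined as the SpO2/FiO2 ratio divided by the respiratory rate. A minimal sketch of that standard formula (the example values are hypothetical, not from this study):

```python
def rox_index(spo2_pct, fio2_fraction, resp_rate):
    """ROX index, commonly defined as (SpO2/FiO2) / respiratory rate.

    spo2_pct: oxygen saturation in percent (e.g., 92)
    fio2_fraction: fraction of inspired oxygen (e.g., 0.6)
    resp_rate: respiratory rate in breaths/min
    """
    return (spo2_pct / fio2_fraction) / resp_rate

# Hypothetical patient: SpO2 92% on FiO2 0.6 at 28 breaths/min.
print(round(rox_index(92, 0.6, 28), 1))  # → 5.5
```

Lower values reflect worse oxygenation relative to respiratory effort, which is why similar ROX indices between the ward-first and ICU-direct groups make the difference in ventilation escalation notable.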

17.
Crit Care Med ; 41(8): 1929-37, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23782966

ABSTRACT

OBJECTIVE: Mortality associated with acute lung injury remains high. Early identification of acute lung injury prior to onset of respiratory failure may provide a therapeutic window to target in future clinical trials. The recently validated Lung Injury Prediction Score identifies patients at risk for acute lung injury but may be limited for routine clinical use. We sought to empirically derive clinical criteria for a pragmatic definition of early acute lung injury to identify patients with lung injury prior to the need for positive pressure ventilation. DESIGN: Prospective observational cohort study. SETTING: Stanford University Hospital. PATIENTS: We prospectively evaluated 256 patients admitted to Stanford University Hospital with bilateral opacities on chest radiograph without isolated left atrial hypertension. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Of the 256 patients enrolled, 62 patients (25%) progressed to acute lung injury requiring positive pressure ventilation. Clinical variables (through first 72 hr or up to 6 hr prior to acute lung injury) associated with progression to acute lung injury were analyzed by backward regression. Oxygen requirement, maximal respiratory rate, and baseline immune suppression were independent predictors of progression to acute lung injury. A simple three-component early acute lung injury score (1 point for oxygen requirement > 2-6 L/min or 2 points for > 6 L/min; 1 point each for a respiratory rate ≥ 30 breaths/min and immune suppression) accurately identified patients who progressed to acute lung injury requiring positive pressure ventilation (area under the receiver-operating characteristic curve, 0.86) and performed similarly to the Lung Injury Prediction Score. An early acute lung injury score greater than or equal to 2 identified patients who progressed to acute lung injury with 89% sensitivity and 75% specificity.
Median time of progression from early acute lung injury criteria to acute lung injury requiring positive pressure ventilation was 20 hours. CONCLUSIONS: This pragmatic definition of early acute lung injury accurately identified patients who progressed to acute lung injury prior to requiring positive pressure ventilation. Pending further validation, these criteria could be useful for future clinical trials targeting early treatment of acute lung injury.
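The three-component score described above can be sketched directly from the point rules in the abstract (variable names are illustrative):

```python
def eali_score(o2_lpm, resp_rate, immunosuppressed):
    """Early acute lung injury (EALI) score per the abstract's rules:
    oxygen requirement > 2-6 L/min = 1 point, > 6 L/min = 2 points;
    respiratory rate >= 30 breaths/min = 1 point;
    baseline immune suppression = 1 point.
    """
    score = 0
    if o2_lpm > 6:
        score += 2
    elif o2_lpm > 2:
        score += 1
    if resp_rate >= 30:
        score += 1
    if immunosuppressed:
        score += 1
    return score

# Hypothetical patient on 4 L/min oxygen with a respiratory rate of 32:
# reaches the >= 2 threshold (89% sensitivity, 75% specificity in the study).
print(eali_score(o2_lpm=4, resp_rate=32, immunosuppressed=False))  # → 2
```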


Subject(s)
Acute Lung Injury/diagnosis , Acute Lung Injury/therapy , Disease Progression , Early Diagnosis , Positive-Pressure Respiration , APACHE , Acute Lung Injury/mortality , California , Female , Hospital Mortality , Humans , Immunocompromised Host , Intensive Care Units , Length of Stay/statistics & numerical data , Male , Middle Aged , Multivariate Analysis , Oxygen Inhalation Therapy , Patient Admission/statistics & numerical data , Predictive Value of Tests , Prospective Studies , ROC Curve , Radiography, Thoracic , Respiratory Rate , Sensitivity and Specificity , Time Factors
18.
J Intensive Care Med ; 28(4): 241-6, 2013.
Article in English | MEDLINE | ID: mdl-22733725

ABSTRACT

BACKGROUND: Acute lung injury (ALI) has been primarily defined in patients who require positive pressure ventilation. As a result, the clinical characteristics of patients with early ALI (EALI) prior to the need for mechanical ventilation have not been well characterized. Early identification of patients with ALI and the impending need for positive pressure ventilation could define a study population for trials of novel therapies. METHODS: We analyzed clinical data from 93 patients at 12, 24, and 48 hours prior to the standard diagnosis of ALI. The time of ALI diagnosis was defined when patients were mechanically ventilated and met the 1994 American-European Consensus Conference diagnostic criteria for ALI. RESULTS: The majority of patients with ALI presented to the hospital more than 24 hours prior to developing ALI. Specifically, 73% presented more than 12 hours prior to diagnosis, and 57% presented more than 24 hours prior to diagnosis. Of patients hospitalized for at least 12 hours prior to ALI diagnosis, 94% had either bilateral infiltrates on chest radiograph, tachypnea, or an oxygen requirement greater than 2 L/min; 79% and 48% had 2 and 3 of these abnormalities, respectively. CONCLUSION: The majority of hospitalized patients who are destined to develop ALI demonstrate tachypnea, increased oxygen requirements, and/or bilateral infiltrates on chest radiograph more than 12 hours prior to meeting criteria for diagnosis. Some patients with EALI may be identified prior to meeting diagnostic criteria during a potential therapeutic window.
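The three warning signs named in the conclusion can be counted with a small sketch. The abstract gives no numeric tachypnea cutoff, so the ≥ 30 breaths/min threshold below is an assumption borrowed from related early acute lung injury work, not a value from this study:

```python
def count_warning_signs(bilateral_infiltrates, resp_rate, o2_lpm):
    """Count the three early-ALI warning signs from the abstract:
    bilateral infiltrates on chest radiograph, tachypnea (cutoff assumed
    at >= 30 breaths/min), and oxygen requirement > 2 L/min.
    """
    signs = [
        bilateral_infiltrates,  # bilateral infiltrates on chest radiograph
        resp_rate >= 30,        # tachypnea (assumed cutoff)
        o2_lpm > 2,             # oxygen requirement greater than 2 L/min
    ]
    return sum(signs)

# Hypothetical patient: infiltrates present, RR 24, 4 L/min oxygen.
print(count_warning_signs(True, 24, 4))  # → 2
```

In the cohort, 94% of patients hospitalized at least 12 hours before diagnosis had a count of at least 1, and 79% and 48% had counts of 2 and 3, respectively.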


Subject(s)
Acute Lung Injury/diagnosis , Critical Care/methods , Early Diagnosis , Respiratory Distress Syndrome/diagnosis , Acute Lung Injury/complications , Acute Lung Injury/diagnostic imaging , Adult , Aged , Chronic Disease/epidemiology , Cohort Studies , Comorbidity , Female , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , Radiography , Respiratory Distress Syndrome/diagnostic imaging , Respiratory Distress Syndrome/etiology , Risk Factors , San Francisco/epidemiology , Time Factors , Treatment Outcome
19.
Semin Respir Crit Care Med ; 34(4): 448-58, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23934714

ABSTRACT

Acute lung injury (ALI) and the acute respiratory distress syndrome (ARDS) are serious complications of acute illness and injury, associated with an inpatient mortality of up to 40%. Despite considerable basic science and clinical research, therapeutic options for established ALI are limited. Survivors of ARDS are often faced with poor health-related quality of life, depressive-anxiety disorders, cognitive deficits, and financial strain. An attractive approach toward managing ALI lies in its prevention and early treatment. In addition to improving recognition of at-risk patients, it is necessary to identify novel treatments targeting the pathways that may prevent or ameliorate lung injury. The rationale and the animal and clinical evidence for aspirin, systemic and inhaled steroids, β-agonists, renin-angiotensin axis blockers, statins, peroxisome proliferator-activated receptor ligands, curcumin, and inhaled heparin are included in this narrative review. Randomized, controlled trials are currently being designed and implemented to address their efficacy in populations at risk for ALI.


Subject(s)
Acute Lung Injury/drug therapy , Molecular Targeted Therapy , Respiratory Distress Syndrome/drug therapy , Acute Lung Injury/physiopathology , Acute Lung Injury/prevention & control , Animals , Drug Design , Humans , Quality of Life , Randomized Controlled Trials as Topic , Respiratory Distress Syndrome/physiopathology , Respiratory Distress Syndrome/prevention & control , Risk Factors , Time Factors
20.
Ann Am Thorac Soc ; 20(10): 1465-1474, 2023 10.
Article in English | MEDLINE | ID: mdl-37478340

ABSTRACT

Rationale: Right ventricular (RV) dysfunction is common among patients hospitalized with coronavirus disease (COVID-19); however, its epidemiology may depend on the echocardiographic parameters used to define it. Objectives: To evaluate the prevalence of abnormalities in three common echocardiographic parameters of RV function among patients with COVID-19 admitted to the intensive care unit (ICU), as well as the effect of RV dilatation on differential parameter abnormality and the association of RV dysfunction with 60-day mortality. Methods: We conducted a retrospective cohort study of ICU patients with COVID-19 between March 4, 2020, and March 4, 2021, who received a transthoracic echocardiogram from 48 hours before to 7 days after ICU admission. RV dysfunction and dilatation, respectively, were defined by guideline thresholds for tricuspid annular plane systolic excursion (TAPSE), RV fractional area change, RV free wall longitudinal strain (RVFWS), and RV basal dimension or RV end-diastolic area. Association of RV dysfunction with 60-day mortality was assessed through logistic regression adjusting for age, prior history of congestive heart failure, invasive ventilation at the time of transthoracic echocardiogram, and Acute Physiology and Chronic Health Evaluation II score. Results: A total of 116 patients were included, of whom 69% had RV dysfunction by one or more parameters, and 36.3% of these had RV dilatation. The three most common patterns of RV dysfunction were the presence of all three abnormalities, the combination of abnormal RVFWS and TAPSE, and isolated TAPSE abnormality. Patients with RV dilatation had a worse RV fractional area change (24% vs. 36%; P = 0.001), worse RVFWS (16.3% vs. 19.1%; P = 0.005), and higher RV systolic pressure (45 mm Hg vs. 31 mm Hg; P = 0.001) but similar TAPSE (13 mm vs. 13 mm; P = 0.30) compared with those with normal RV size.
After multivariable adjustment, 60-day mortality was significantly associated with RV dysfunction (odds ratio, 2.91; 95% confidence interval, 1.01-9.44), as was the presence of at least two parameter abnormalities. Conclusions: ICU patients with COVID-19 showed significant heterogeneity in RV functional abnormalities, with distinct abnormality patterns associated with RV dilatation. RV dysfunction by any parameter was associated with increased mortality. Therefore, a multiparameter evaluation may be critical in recognizing RV dysfunction in COVID-19.
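A multiparameter evaluation like the one the conclusion recommends can be sketched as follows. The abstract does not state its exact guideline cutoffs, so the thresholds here (TAPSE < 17 mm, fractional area change < 35%, free wall strain magnitude < 20%) are commonly cited guideline values assumed for illustration:

```python
def rv_dysfunction_parameters(tapse_mm, fac_pct, rvfws_pct):
    """Return the list of abnormal RV-function parameters.

    Cutoffs are assumed, commonly cited guideline thresholds:
    TAPSE < 17 mm, RV fractional area change < 35%, and RV free wall
    longitudinal strain magnitude < 20% (strain given as a magnitude,
    matching how the abstract reports RVFWS).
    """
    abnormal = {
        "TAPSE": tapse_mm < 17,
        "FAC": fac_pct < 35,
        "RVFWS": abs(rvfws_pct) < 20,
    }
    return [name for name, flag in abnormal.items() if flag]

# Median values reported for the RV-dilatation group in the abstract:
# TAPSE 13 mm, fractional area change 24%, RVFWS 16.3%.
print(rv_dysfunction_parameters(tapse_mm=13, fac_pct=24, rvfws_pct=16.3))
```

Under these assumed cutoffs, the dilated-RV group's median values are abnormal by all three parameters, consistent with the finding that single-parameter assessment can miss dysfunction that a multiparameter evaluation catches.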


Subject(s)
COVID-19 , Ventricular Dysfunction, Right , Humans , Retrospective Studies , Ventricular Dysfunction, Right/diagnostic imaging , Ventricular Dysfunction, Right/epidemiology , COVID-19/complications , Echocardiography/methods , Intensive Care Units , Ventricular Function, Right