Results 1 - 20 of 34
1.
Clin Trials ; 21(1): 124-135, 2024 02.
Article in English | MEDLINE | ID: mdl-37615179

ABSTRACT

BACKGROUND: Comparative effectiveness research is meant to determine which commonly employed medical interventions are most beneficial, least harmful, and/or most costly in a real-world setting. While the objectives for comparative effectiveness research are clear, the field has failed to develop either a uniform definition of comparative effectiveness research or an appropriate set of recommendations to provide standards for the design of critical care comparative effectiveness research trials, spurring controversy in recent years. The insertion of non-representative control and/or comparator arm subjects into critical care comparative effectiveness research trials can threaten trial subjects' safety. Nonetheless, the broader scientific community does not always appreciate the importance of defining and maintaining critical care practices during a trial, especially when vulnerable, critically ill populations are studied. Consequently, critical care comparative effectiveness research trials sometimes lack properly constructed control or active comparator arms altogether and/or suffer from the inclusion of "unusual critical care" that may adversely affect groups enrolled in one or more arms. This oversight has led to critical care comparative effectiveness research trial designs that impair informed consent, confound interpretation of trial results, and increase the risk of harm for trial participants. METHODS/EXAMPLES: We propose a novel approach to performing critical care comparative effectiveness research trials that mandates the documentation of critical care practices prior to trial initiation. We also classify the most common types of critical care comparative effectiveness research trials, as well as the most frequent errors in trial design. We present examples of these design flaws drawn from past and recently published trials as well as examples of trials that avoided those errors. Finally, we summarize strategies employed successfully in well-designed trials, in hopes of suggesting a comprehensive standard for the field. CONCLUSION: Flawed critical care comparative effectiveness research trial designs can lead to unsound trial conclusions, compromise informed consent, and increase risks to research subjects, undermining the major goal of comparative effectiveness research: to inform current practice. Well-constructed control and comparator arms comprise indispensable elements of critical care comparative effectiveness research trials, key to improving the trials' safety and to generating trial results likely to improve patient outcomes in clinical practice.


Subject(s)
Arm , Comparative Effectiveness Research , Humans , Informed Consent , Research Subjects , Critical Care
2.
Cancer Metastasis Rev ; 39(2): 337-340, 2020 06.
Article in English | MEDLINE | ID: mdl-32385712

ABSTRACT

Severe coronavirus disease (COVID-19) is characterized by pulmonary hyper-inflammation and potentially life-threatening "cytokine storms". Controlling the local and systemic inflammatory response in COVID-19 may be as important as anti-viral therapies. Endogenous lipid autacoid mediators, referred to as eicosanoids, play a critical role in the induction of inflammation and pro-inflammatory cytokine production. SARS-CoV-2 may trigger a cell death ("debris")-induced "eicosanoid storm", including prostaglandins and leukotrienes, which in turn initiates a robust inflammatory response. A paradigm shift is emerging in our understanding of the resolution of inflammation as an active biochemical process with the discovery of novel endogenous specialized pro-resolving lipid autacoid mediators (SPMs), such as resolvins. Resolvins and other SPMs stimulate macrophage-mediated clearance of debris and counter pro-inflammatory cytokine production, a process called inflammation resolution. SPMs and their lipid precursors exhibit anti-viral activity at nanogram doses in the setting of influenza without being immunosuppressive. SPMs also promote anti-viral B cell antibodies and lymphocyte activity, highlighting their potential use in the treatment of COVID-19. Soluble epoxide hydrolase (sEH) inhibitors stabilize arachidonic acid-derived epoxyeicosatrienoic acids (EETs), which also stimulate inflammation resolution by promoting the production of pro-resolution mediators, activating anti-inflammatory processes, and preventing the cytokine storm. Both resolvins and EETs also attenuate pathological thrombosis and promote clot removal, which is emerging as a key pathology of COVID-19 infection. Thus, both SPMs and sEH inhibitors may promote the resolution of inflammation in COVID-19, thereby reducing acute respiratory distress syndrome (ARDS) and other life-threatening complications associated with robust viral-induced inflammation. While most COVID-19 clinical trials focus on "anti-viral" and "anti-inflammatory" strategies, stimulating inflammation resolution is a novel host-centric therapeutic avenue. Importantly, SPMs and sEH inhibitors are currently in clinical trials for other inflammatory diseases and could be rapidly translated for the management of COVID-19 via debris clearance and inflammatory cytokine suppression. Here, we discuss using pro-resolution mediators as a potential complement to current anti-viral strategies for COVID-19.


Subject(s)
Anti-Inflammatory Agents, Non-Steroidal/therapeutic use , Antiviral Agents/therapeutic use , Betacoronavirus/immunology , Coronavirus Infections/drug therapy , Cytokine Release Syndrome/drug therapy , Pneumonia, Viral/drug therapy , Respiratory Distress Syndrome/therapy , Anti-Inflammatory Agents, Non-Steroidal/pharmacology , Betacoronavirus/isolation & purification , COVID-19 , Clinical Trials as Topic , Coronavirus Infections/complications , Coronavirus Infections/immunology , Coronavirus Infections/virology , Cytokine Release Syndrome/immunology , Cytokines/immunology , Cytokines/metabolism , Eicosanoids/immunology , Eicosanoids/metabolism , Epoxide Hydrolases/antagonists & inhibitors , Epoxide Hydrolases/metabolism , Humans , Macrophages/immunology , Macrophages/metabolism , Pandemics , Pneumonia, Viral/complications , Pneumonia, Viral/immunology , Pneumonia, Viral/virology , Pulmonary Alveoli/immunology , Pulmonary Alveoli/metabolism , Pulmonary Alveoli/virology , Respiratory Distress Syndrome/immunology , SARS-CoV-2
3.
Transfusion ; 60(4): 698-712, 2020 04.
Article in English | MEDLINE | ID: mdl-32086946

ABSTRACT

BACKGROUND: In experimental canine septic shock, depressed circulating granulocyte counts were associated with a poor outcome and increasing counts with prophylactic granulocyte colony-stimulating factor (G-CSF) improved outcome. Therapeutic G-CSF, in contrast, did not improve circulating counts or outcome, and therefore investigation was undertaken to determine whether transfusing granulocytes therapeutically would improve outcome. STUDY DESIGN AND METHODS: Twenty-eight purpose-bred beagles underwent an intrabronchial Staphylococcus aureus challenge and 4 hours later were randomly assigned to granulocyte (40-100 × 10⁹ cells) or plasma transfusion. RESULTS: Granulocyte transfusion significantly expanded the low circulating counts for hours compared to septic controls but was not associated with significant mortality benefit (1/14, 7% vs. 2/14, 14%, respectively; p = 0.29). Septic animals with higher granulocyte count at 4 hours (median [interquartile range] of 3.81 [3.39-5.05] vs. 1.77 [1.25-2.50]) had significantly increased survival independent of whether they were transfused with granulocytes. In a subgroup analysis, animals with higher circulating granulocyte counts receiving donor granulocytes had worsened lung injury compared to septic controls. Conversely, donor granulocytes decreased lung injury in septic animals with lower counts. CONCLUSION: During bacterial pneumonia, circulating counts predict the outcome of transfusing granulocytes. With low but normal counts, transfusing granulocytes does not improve survival and injures the lung, whereas for animals with very low counts, but not absolute neutropenia, granulocyte transfusion improves lung function.


Subject(s)
Granulocytes/transplantation , Pneumonia, Bacterial/therapy , Animals , Disease Models, Animal , Dogs , Granulocyte Colony-Stimulating Factor/therapeutic use , Granulocytes/cytology , Leukocyte Count , Leukocyte Transfusion , Lung Injury/prevention & control , Pneumonia, Bacterial/mortality , Staphylococcus aureus/pathogenicity , Tissue Donors , Treatment Outcome
4.
IEEE Pervasive Comput ; 19(3): 68-78, 2020.
Article in English | MEDLINE | ID: mdl-32754005

ABSTRACT

Future healthcare systems will rely heavily on clinical decision support systems (CDSS) to improve the decision-making processes of clinicians. To explore the design of future CDSS, we developed a research-focused CDSS for the management of patients in the intensive care unit that leverages Internet of Things (IoT) devices capable of collecting streaming physiologic data from ventilators and other medical devices. We then created machine learning (ML) models that could analyze the collected physiologic data to determine if the ventilator was delivering potentially harmful therapy and if a deadly respiratory condition, acute respiratory distress syndrome (ARDS), was present. We also present work to aggregate these models into a mobile application that can provide responsive, real-time alerts of changes in ventilation to providers. As illustrated in the recent COVID-19 pandemic, being able to accurately predict ARDS in newly infected patients can assist in prioritizing care. We show that CDSS may be used to analyze physiologic data for clinical event recognition and automated diagnosis, and we also highlight future research avenues for hospital CDSS.
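Read as a system, the abstract describes a streaming pipeline: IoT devices push per-breath physiologic features, a trained model scores each breath, and sustained high scores trigger a mobile alert. The Python sketch below only illustrates that shape; the feature names, model weights, and thresholds are hypothetical placeholders and are not taken from the article's models.

```python
# Illustrative sketch of a streaming CDSS alert loop. The feature names, weights,
# and thresholds below are hypothetical and NOT taken from the article.
from dataclasses import dataclass
from math import exp
from typing import Iterable

@dataclass
class BreathFeatures:
    """Per-breath physiologic features streamed from a ventilator."""
    tidal_volume_ml_kg: float   # tidal volume per kg predicted body weight
    plateau_pressure: float     # cm H2O
    peep: float                 # cm H2O
    spo2_fio2_ratio: float      # pulse-ox saturation / fraction of inspired O2

# Hypothetical logistic-regression weights standing in for a trained ML model.
WEIGHTS = {"tidal_volume_ml_kg": 0.35, "plateau_pressure": 0.12,
           "peep": -0.05, "spo2_fio2_ratio": -0.015}
BIAS = -2.0
ALERT_THRESHOLD = 0.8      # risk score above which an alert is considered
CONSECUTIVE_BREATHS = 5    # debounce: require sustained elevation before alerting

def ards_risk(f: BreathFeatures) -> float:
    """Return a 0-1 risk score for the current breath."""
    z = BIAS + sum(WEIGHTS[name] * getattr(f, name) for name in WEIGHTS)
    return 1.0 / (1.0 + exp(-z))

def monitor(stream: Iterable[BreathFeatures]) -> None:
    """Consume a live feature stream and raise an alert on sustained high risk."""
    high = 0
    for breath in stream:
        high = high + 1 if ards_risk(breath) >= ALERT_THRESHOLD else 0
        if high >= CONSECUTIVE_BREATHS:
            print("ALERT: sustained high ARDS / harmful-ventilation risk")
            high = 0
```

In a real deployment the hard-coded weights would be replaced by the trained ventilator-waveform and ARDS classifiers described in the article, and alerts would be routed to the mobile application rather than printed.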

5.
Transfusion ; 59(12): 3628-3638, 2019 12.
Article in English | MEDLINE | ID: mdl-31639229

ABSTRACT

BACKGROUND: During sepsis, higher plasma cell-free hemoglobin (CFH) levels portend worse outcomes. In sepsis models, plasma proteins that bind CFH improve survival. In our canine antibiotic-treated Staphylococcus aureus pneumonia model, with and without red blood cell (RBC) exchange transfusion, commercial human haptoglobin (Hp) concentrates bound and compartmentalized CFH intravascularly, increased CFH clearance, and lowered iron levels, improving shock, lung injury, and survival. We now investigate in our model how very high CFH levels and treatment time affect Hp's beneficial effects. MATERIALS AND METHODS: Two separate canine pneumonia sepsis Hp studies were undertaken: one with exchange transfusion of RBCs after prolonged storage to raise CFH to very high levels and another with rapidly lethal sepsis alone to shorten time to treat. All animals received continuous standard intensive care unit supportive care for 96 hours. RESULTS: Older RBCs markedly elevated plasma CFH levels and, when combined with Hp therapy, created supraphysiologic CFH-Hp complexes that did not increase CFH or iron clearance or improve lung injury and survival. In a rapidly lethal bacterial challenge model without RBC transfusion, Hp binding did not increase clearance of complexes or iron or show benefits seen previously in the less lethal model. DISCUSSION: High-level CFH-Hp complexes may impair clearance mechanisms and eliminate Hp's beneficial effect during sepsis. Rapidly lethal sepsis narrows the therapeutic window for CFH and iron clearance, also decreasing Hp's beneficial effects. In designing clinical trials, dosing and kinetics may be critical factors if Hp infusion is used to treat sepsis.


Subject(s)
Haptoglobins/therapeutic use , Hemoglobins/metabolism , Pneumonia, Staphylococcal/drug therapy , Pneumonia, Staphylococcal/metabolism , Shock, Septic/drug therapy , Shock, Septic/metabolism , Animals , Disease Models, Animal , Dogs , Erythrocyte Transfusion , Pneumonia, Staphylococcal/therapy , Sepsis/drug therapy , Sepsis/metabolism , Sepsis/therapy , Shock, Septic/therapy
6.
Transfusion ; 59(1): 347-358, 2019 01.
Article in English | MEDLINE | ID: mdl-30383305

ABSTRACT

BACKGROUND: Storage temperature is a critical factor for maintaining red blood cell (RBC) viability, especially during prolonged cold storage. The target range of 1 to 6°C was established decades ago and may no longer be optimal for current blood-banking practices. STUDY DESIGN AND METHODS: Human and canine RBCs were collected under standard conditions and stored in precision-controlled refrigerators at 2°C, 4°C, or 6°C. RESULTS: During 42-day storage, human and canine RBCs showed progressive increases in supernatant non-transferrin-bound iron, cell-free hemoglobin, base deficit, and lactate levels that were overall greater at 6°C and 4°C than at 2°C. Animals transfused with 7-day-old RBCs had similar plasma cell-free hemoglobin and non-transferrin-bound iron levels at 1 to 72 hours for all three temperature conditions by chromium-51 recovery analysis. However, animals transfused with 35-day-old RBCs stored at higher temperatures developed plasma elevations in non-transferrin-bound iron and cell-free hemoglobin at 24 and 72 hours. Despite apparently impaired 35-day storage at 4°C and 6°C compared to 2°C, posttransfusion chromium-51 recovery at 24 hours was superior at higher temperatures. This finding was confounded by a preparation artifact related to an interaction between temperature and storage duration that leads to removal of fragile cells with repeated washing of the radiolabeled RBC test sample and renders the test sample unrepresentative of the stored unit. CONCLUSIONS: RBCs stored at the lower bound of the temperature range are less metabolically active and produce less anaerobic acidosis and hemolysis, leading to a more suitable transfusion product. The higher refrigeration temperatures are not optimal during extended RBC storage and may confound chromium viability studies.


Subject(s)
Blood Preservation/methods , Chromium/metabolism , Erythrocytes/cytology , Animals , Cells, Cultured , Dogs , Hemolysis/physiology , Humans , Temperature
7.
Nitric Oxide ; 91: 1-14, 2019 10 01.
Article in English | MEDLINE | ID: mdl-31299340

ABSTRACT

Dysfunction in the nitric oxide (NO) signaling pathway can lead to the development of pulmonary hypertension (PH) in mammals. Discovery of an alternative pathway to NO generation involving reduction from nitrate to nitrite and to NO has motivated the evaluation of nitrite as an alternative to inhaled NO for PH. In contrast, inhaled nitrate has not been evaluated to date, and potential benefits include a prolonged half-life and decreased risk of methemoglobinemia. In a canine model of acute hypoxia-induced PH we evaluated the effects of inhaled nitrate to reduce pulmonary arterial pressure (PAP). In a randomized controlled trial, inhaled nitrate was compared to inhaled nitrite and inhaled saline. Exhaled NO, PAP and systemic blood pressures were continuously monitored. Inhaled nitrite significantly decreased PAP and increased exhaled NO. In contrast, inhaled nitrate and inhaled saline did not decrease PAP or increase exhaled NO. Unexpectedly, we found that inhaled nitrite resulted in prolonged (>5 h) exhaled NO release, increase in nitrate venous/arterial levels and a late surge in venous nitrite levels. These findings do not support a therapeutic role for inhaled nitrate in PH but may have therapeutic implications for inhaled nitrite in various disease states.


Subject(s)
Hypertension, Pulmonary/drug therapy , Nitrates/therapeutic use , Sodium Nitrite/therapeutic use , Administration, Inhalation , Animals , Dogs , Hypertension, Pulmonary/etiology , Hypoxia/complications , Hypoxia/physiopathology , Nitrates/administration & dosage , Nitrates/blood , Nitric Oxide/metabolism , Rats , Sodium Nitrite/administration & dosage , Sodium Nitrite/blood
8.
Transfusion ; 57(10): 2338-2347, 2017 10.
Article in English | MEDLINE | ID: mdl-28656646

ABSTRACT

BACKGROUND: No studies have been performed comparing intravenous (IV) iron with transfused red blood cells (RBCs) for treating anemia during infection. In a previous report, transfused older RBCs increased free iron release and mortality in infected animals when compared to fresher cells. We hypothesized that treating anemia during infection with transfused fresh RBCs, with minimal free iron release, would prove superior to IV iron therapy. STUDY DESIGN AND METHODS: Purpose-bred beagles (n = 42) with experimental Staphylococcus aureus pneumonia rendered anemic were randomized to be transfused RBCs stored for 7 days or one of two IV iron preparations (7 mg/kg): iron sucrose, a widely used preparation, or ferumoxytol, a newer formulation that blunts circulating iron levels. RESULTS: Both iron preparations increased the alveolar-arterial oxygen gradient at 24 to 48 hours (p = 0.02-0.001), worsened shock at 16 hours (p = 0.02-0.003, respectively), and reduced survival (transfusion 56%; iron sucrose 8%, p = 0.01; ferumoxytol 9%, p = 0.04). Compared to fresh RBC transfusion, plasma iron measured by non-transferrin-bound iron levels increased with iron sucrose at 7, 10, 13, 16, 24, and 48 hours (p = 0.04 to p < 0.0001) and ferumoxytol at 7, 24, and 48 hours (p = 0.04 to p = 0.004). No significant differences in cardiac filling pressures or performance, hemoglobin (Hb), or cell-free Hb were observed. CONCLUSIONS: During canine experimental bacterial pneumonia, treatment of mild anemia with IV iron significantly increased free iron levels, shock, lung injury, and mortality compared to transfusion of fresh RBCs. This was true for iron preparations that do or do not blunt circulating free iron level elevations. These findings suggest that treatment of anemia with IV iron during infection should be undertaken with caution.


Subject(s)
Anemia/therapy , Erythrocyte Transfusion , Iron/administration & dosage , Pneumonia, Bacterial/complications , Anemia/complications , Anemia/etiology , Anemia/mortality , Animals , Disease Models, Animal , Dogs , Erythrocyte Transfusion/standards , Ferric Compounds/administration & dosage , Ferric Compounds/therapeutic use , Ferric Oxide, Saccharated , Ferrosoferric Oxide/administration & dosage , Ferrosoferric Oxide/therapeutic use , Glucaric Acid/administration & dosage , Glucaric Acid/therapeutic use , Iron/adverse effects , Iron/therapeutic use , Lung Injury , Mortality , Pneumonia, Staphylococcal/therapy
9.
Blood ; 123(9): 1403-11, 2014 Feb 27.
Article in English | MEDLINE | ID: mdl-24366359

ABSTRACT

In a randomized controlled blinded trial, 2-year-old purpose-bred beagles (n = 24), with Staphylococcus aureus pneumonia, were exchange-transfused with either 7- or 42-day-old washed or unwashed canine universal donor blood (80 mL/kg in 4 divided doses). Washing red blood cells (RBCs) before transfusion had a significantly different effect on canine survival, multiple organ injury, plasma iron, and cell-free hemoglobin (CFH) levels depending on the age of stored blood (all, P < .05 for interactions). Washing older units of blood improved survival rates, shock score, lung injury, cardiac performance and liver function, and reduced levels of non-transferrin-bound iron and plasma labile iron. In contrast, washing fresh blood worsened all these same clinical parameters and increased CFH levels. Our data indicate that transfusion of fresh blood, which results in less hemolysis, CFH, and iron release, is less toxic than transfusion of older blood in critically ill infected subjects. However, washing older blood prevented elevations in plasma circulating iron, improved survival, and reduced multiple organ injury in animals with an established pulmonary infection. Our data suggest that fresh blood should not be washed routinely because, in a setting of established infection, washed RBCs are prone to release CFH and result in worsened clinical outcomes.


Subject(s)
Blood Specimen Collection/methods , Erythrocyte Transfusion/adverse effects , Erythrocyte Transfusion/methods , Erythrocytes/cytology , Iron/blood , Plasma/chemistry , Pneumonia, Staphylococcal/therapy , Acute Lung Injury/etiology , Acute Lung Injury/mortality , Animals , Blood Preservation , Disease Models, Animal , Dogs , Down-Regulation , Iron/isolation & purification , Pneumonia, Staphylococcal/mortality , Treatment Outcome
12.
Transfusion ; 55(11): 2564-75, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26469998

ABSTRACT

BACKGROUND: Massive exchange transfusion of 42-day-old red blood cells (RBCs) in a canine model of Staphylococcus aureus pneumonia resulted in in vivo hemolysis with increases in cell-free hemoglobin (CFH), transferrin-bound iron (TBI), non-transferrin-bound iron (NTBI), and mortality. We have previously shown that washing 42-day-old RBCs before transfusion significantly decreased NTBI levels and mortality, but washing 7-day-old RBCs increased mortality and CFH levels. We now report the results of altering volume, washing, and age of RBCs. STUDY DESIGN AND METHODS: Two-year-old purpose-bred infected beagles were transfused with increasing volumes (5-10, 20-40, or 60-80 mL/kg) of either 42- or 7-day-old RBCs (n = 36) or 80 mL/kg of either unwashed or washed RBCs with increasing storage age (14, 21, 28, or 35 days; n = 40). RESULTS: All transfused volumes (5-80 mL/kg) of 42-day-old RBCs resulted in similar (i.e., not significantly different) increases in TBI during transfusion as well as in CFH, lung injury, and mortality rates after transfusion. Transfusion of 80 mL/kg RBCs stored for 14, 21, 28, and 35 days resulted in CFH and NTBI levels intermediate between those found after 7 and 42 days of storage. However, washing RBCs of intermediate ages (14-35 days) did not alter NTBI and CFH levels or mortality rates. CONCLUSIONS: Preclinical data suggest that any volume of 42-day-old blood potentially increases risks during established infection. In contrast, even massive volumes of 7-day-old blood result in minimal CFH and NTBI levels and risks. Unlike at the extremes of storage age, washing blood stored for intermediate ages does not alter risks of transfusion or NTBI and CFH clearance.


Subject(s)
Erythrocytes/physiology , Exchange Transfusion, Whole Blood/methods , Pneumonia, Staphylococcal/therapy , Animals , Blood Preservation/adverse effects , Disease Models, Animal , Dogs , Erythrocyte Transfusion/adverse effects , Erythrocytes/cytology , Exchange Transfusion, Whole Blood/adverse effects , Time Factors
13.
Transfusion ; 55(11): 2552-63, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26175134

ABSTRACT

BACKGROUND: In canine models, transfused older stored red blood cells (RBCs) hemolyze in vivo, resulting in significantly increased intravascular cell-free hemoglobin (CFH) and non-transferrin-bound iron (NTBI). During canine bacterial pneumonia with septic shock, but not in controls, older stored RBCs were associated with significantly increased lung injury and mortality. It is unknown whether, in shock without infection, transfusion of older RBCs will result in similar adverse effects. STUDY DESIGN AND METHODS: Two-year-old purpose-bred beagles (n = 12) were transfused similar quantities of either older (42-day) or fresher (7-day) stored universal donor canine RBCs 2.5 hours after undergoing controlled hemorrhage (55 mL/kg). RESULTS: With older transfused RBCs, CFH (p < 0.0001) and NTBI (p = 0.004) levels increased, but lung injury (p = 0.01) and C-reactive protein levels (p = 0.002) declined and there was a trend toward lower mortality (18% vs. 50%). All three deaths after transfusion of fresher RBCs resulted from hepatic fractures. Lowered exogenous norepinephrine requirements (p < 0.05) and cardiac outputs (p < 0.05) after older transfused RBCs were associated with increased CFH levels that have known vasoconstrictive nitric oxide scavenging capability. CONCLUSIONS: In hemorrhagic shock, older RBCs altered resuscitation physiology but did not worsen clinical outcomes. Elevated CFH may lower norepinephrine requirements and cardiac outputs, ameliorating reperfusion injuries. With hemorrhagic shock, NTBI levels persist, in contrast to the increased clearance, lung injury, and mortality in the previously reported infection model. These preclinical data suggest that whereas iron derived from older RBCs promotes bacterial growth, worsening septic shock mortality during infection, release of CFH and NTBI during hemorrhagic shock is not necessarily harmful.


Subject(s)
Erythrocyte Transfusion/methods , Erythrocytes/physiology , Reperfusion Injury/therapy , Shock, Hemorrhagic/therapy , Animals , Blood Preservation , Dogs , Humans , Random Allocation
14.
Am J Physiol Endocrinol Metab ; 307(11): E994-E1008, 2014 Dec 01.
Article in English | MEDLINE | ID: mdl-25294215

ABSTRACT

The clinical significance and even existence of critical illness-related corticosteroid insufficiency are controversial. Here, hypothalamic-pituitary-adrenal (HPA) function was characterized in severe canine Staphylococcus aureus pneumonia. Animals received antibiotics and titrated life-supportive measures. Treatment with dexamethasone, a glucocorticoid, but not desoxycorticosterone, a mineralocorticoid, improves outcome in this model. Total and free cortisol, adrenocorticotropic hormone (ACTH), and aldosterone levels, as well as responses to exogenous ACTH, were measured serially. At 10 h after the onset of infection, the acute HPA axis stress response, as measured by cortisol levels, exceeded that seen with high-dose ACTH stimulation but was not predictive of outcome. In contrast to cortisol, aldosterone was largely autonomous from HPA axis control, elevated longer, and more closely associated with survival in early septic shock. Importantly, dexamethasone suppressed cortisol and ACTH levels and restored ACTH responsiveness in survivors. In striking contrast, in nonsurvivors, sepsis-induced hypercortisolemia, high ACTH levels, and ACTH hyporesponsiveness were not influenced by dexamethasone. During septic shock, only serial measurements and provocative testing over a well-defined timeline were able to demonstrate a strong relationship between HPA axis function and prognosis. HPA axis unresponsiveness and high aldosterone levels identify a septic shock subpopulation with poor outcomes that may have the greatest potential to benefit from new therapies.


Subject(s)
Dog Diseases/physiopathology , Hypothalamo-Hypophyseal System/physiopathology , Pituitary-Adrenal System/physiopathology , Staphylococcal Infections/physiopathology , Staphylococcal Infections/veterinary , Adrenocorticotropic Hormone/metabolism , Animals , Dexamethasone , Dogs , Hydrocortisone/metabolism , Mineralocorticoids/metabolism , Pneumonia, Staphylococcal/physiopathology , Pneumonia, Staphylococcal/veterinary , Sepsis/physiopathology , Sepsis/veterinary , Survival Analysis
15.
Transfusion ; 54(7): 1712-24, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24588210

ABSTRACT

BACKGROUND: In experimental pneumonia we found that transfused older blood increased mortality and lung injury that were associated with increased in vivo hemolysis and elevated plasma cell-free hemoglobin (CFH), non-transferrin-bound iron (NTBI), and plasma labile iron (PLI) levels. In this study, we additionally analyze identically treated animals that received lower or higher bacterial doses. STUDY DESIGN AND METHODS: Two-year-old purpose-bred beagles (n = 48) challenged intrabronchially with Staphylococcus aureus (0 [n = 8], 1.0 × 10⁹ [n = 8], 1.25 × 10⁹ [n = 24], and ≥1.5 × 10⁹ [n = 8] colony-forming units/kg) were exchange-transfused with either 7- or 42-day-old canine universal donor blood (80 mL/kg in four divided doses). RESULTS: The greater increases in CFH with older blood over days after exchange proved relatively independent of bacterial dose. The lesser increases in CFH observed with fresher blood were bacterial dose dependent, potentially related to bacterial hemolysins. Without bacterial challenge, levels of CFH, NTBI, and PLI were significantly higher with older versus fresher blood transfusion but there was no significant measurable injury. With higher-dose bacterial challenge, the elevated NTBI and PLI levels declined more rapidly and to a greater extent after transfusion with older versus fresher blood, and older blood was associated with significantly worse shock, lung injury, and mortality. CONCLUSION: The augmented in vivo hemolysis of transfused older red blood cells (RBCs) appears to result in excess plasma CFH and iron release, which requires the presence of established infection to worsen outcome. These data suggest that transfused older RBCs increase the risks from infection in septic subjects.


Subject(s)
Acute Lung Injury/etiology , Blood Preservation/adverse effects , Exchange Transfusion, Whole Blood/adverse effects , Pneumonia, Staphylococcal/complications , Staphylococcus aureus , Acute Lung Injury/blood , Animals , Disease Models, Animal , Dogs , Iron/blood , Pneumonia, Staphylococcal/blood , Risk Factors , Severity of Illness Index , Time Factors , Treatment Outcome
16.
Am J Respir Crit Care Med ; 187(7): 761-7, 2013 Apr 01.
Article in English | MEDLINE | ID: mdl-23370917

ABSTRACT

RATIONALE: A revised definition of clinical criteria for acute respiratory distress syndrome (ARDS), the Berlin definition, was recently established to classify patients according to their severity. OBJECTIVE: To evaluate the accuracy of these clinical criteria using diffuse alveolar damage (DAD) at autopsy as the reference standard. METHODS: All patients who died and had a clinical autopsy in our intensive care unit over a 20-year period (1991-2010) were included. Patients with clinical criteria for ARDS were identified from the medical charts and were classified as mild, moderate, or severe according to the Berlin definition using PaO2/FiO2 oxygenation criteria. Microscopic analysis of each pulmonary lobe was performed by two pathologists. MEASUREMENTS AND MAIN RESULTS: Among 712 autopsies analyzed, 356 patients had clinical criteria for ARDS at time of death, classified as mild (n = 49, 14%), moderate (n = 141, 40%), and severe (n = 166, 46%). The Berlin definition had a sensitivity of 89% and a specificity of 63% for identifying DAD at autopsy. DAD was found in 159 of 356 (45%) patients with clinical criteria for ARDS (in 12%, 40%, and 58% of patients with mild, moderate, and severe ARDS, respectively). DAD was more frequent in patients who met clinical criteria for ARDS for more than 72 hours and was found in 69% of those with severe ARDS for 72 hours or longer. CONCLUSIONS: Histopathological findings correlated with severity and duration of ARDS. Using clinical criteria, the revised Berlin definition for ARDS allowed the identification of severe ARDS of more than 72 hours as a homogeneous group of patients characterized by a high proportion of DAD.
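For readers who want to trace how the reported accuracy figures fit together, the snippet below back-calculates an approximate 2 × 2 table from the abstract's numbers (712 autopsies, 356 meeting clinical criteria, DAD in 159 of them, 89% sensitivity, 63% specificity). The derived cell counts are rounded estimates for illustration, not data reported in the paper.

```python
# Approximate reconstruction of the 2x2 table implied by the abstract's figures.
# Only the totals (712 autopsies, 356 with clinical ARDS criteria, 159 of those
# with DAD) come from the abstract; the remaining cells are rounded estimates.
clinical_pos = 356          # met Berlin clinical criteria at death
true_pos = 159              # clinical criteria AND diffuse alveolar damage (DAD)
total = 712                 # all autopsies analyzed

false_pos = clinical_pos - true_pos          # criteria met, no DAD   -> 197
dad_total = round(true_pos / 0.89)           # implied DAD cases      -> ~179
false_neg = dad_total - true_pos             # DAD missed by criteria -> ~20
true_neg = (total - dad_total) - false_pos   # no criteria, no DAD    -> ~336

sensitivity = true_pos / (true_pos + false_neg)   # ~0.89
specificity = true_neg / (true_neg + false_pos)   # ~0.63
print(f"sensitivity ~ {sensitivity:.2f}, specificity ~ {specificity:.2f}")
```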


Subject(s)
Pulmonary Alveoli/pathology , Respiratory Distress Syndrome/pathology , Aged , Aged, 80 and over , Autopsy , Female , Humans , Male , Middle Aged , Reference Standards , Respiratory Distress Syndrome/classification , Sensitivity and Specificity , Severity of Illness Index
17.
Article in English | MEDLINE | ID: mdl-38745445

ABSTRACT

BACKGROUND: Bleeding is a known complication during bronchoscopy, with increased incidence in patients undergoing a more invasive procedure. Phenylephrine is a potent vasoconstrictor that can control airway bleeding when applied topically and has been used as an alternative to epinephrine. The clinical effects of endobronchial phenylephrine on systemic vasoconstriction have not been clearly evaluated. Here, we compared the effects of endobronchial phenylephrine versus cold saline on systemic blood pressure. METHODS: In all, 160 patients who underwent bronchoscopy and received either endobronchial phenylephrine or cold saline from July 1, 2017, to June 30, 2022, were included in this retrospective observational study. Absolute and percent changes in intra-procedural blood pressure were measured and compared between the two groups. RESULTS: There were no statistically significant differences in blood pressure changes between groups. The median absolute change between the median and the maximum intra-procedural systolic blood pressure (SBP) in the cold saline group was 29 mm Hg (IQR 19 to 41) compared with 31.8 mm Hg (IQR 18 to 45.5) in the phenylephrine group. The corresponding median percent changes in SBP were 33.6% (IQR 18.8 to 39.4) and 28% (IQR 16.8 to 43.5) for the cold saline and phenylephrine groups, respectively. Similarly, there were no statistically significant differences in diastolic and mean arterial blood pressure changes between the two groups. CONCLUSIONS: We found no significant differences in median intra-procedural systemic blood pressure changes comparing patients who received endobronchial cold saline to those receiving phenylephrine. Overall, this argues for the vascular and systemic safety of phenylephrine for airway bleeding as a reasonable alternative to epinephrine.


Subject(s)
Bronchoscopy , Phenylephrine , Vasoconstrictor Agents , Humans , Phenylephrine/administration & dosage , Phenylephrine/adverse effects , Retrospective Studies , Bronchoscopy/adverse effects , Bronchoscopy/methods , Male , Female , Middle Aged , Aged , Vasoconstrictor Agents/administration & dosage , Vasoconstrictor Agents/adverse effects , Hypertension/drug therapy , Blood Pressure/drug effects
18.
Am J Crit Care ; 33(3): 171-179, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38688854

ABSTRACT

BACKGROUND: Early mobility interventions in intensive care units (ICUs) are safe and improve outcomes in subsets of critically ill adults. However, implementation varies, and the optimal mobility dose remains unclear. OBJECTIVE: To test for associations between daily dose of out-of-bed mobility and patient outcomes in different ICUs. METHODS: In this retrospective cohort study of electronic records from 7 adult ICUs in an academic quaternary hospital, multivariable linear regression was used to examine the effects of out-of-bed events per mobility-eligible day on mechanical ventilation duration and length of ICU and hospital stays. RESULTS: In total, 8609 adults hospitalized in ICUs from 2015 through 2018 were included. Patients were mobilized out of bed on 46.5% of ICU days and were eligible for mobility interventions on a median (IQR) of 2.0 (1-3) of 2.7 (2-9) ICU days. Median (IQR) out-of-bed events per mobility-eligible day were 0.5 (0-1.2) among all patients. For every unit increase in out-of-bed events per mobility-eligible day before extubation, mechanical ventilation duration decreased by 10% (adjusted coefficient [95% CI], -0.10 [-0.18 to -0.01]). Daily mobility increased ICU stays by 4% (adjusted coefficient [95% CI], 0.04 [0.03-0.06]) and decreased hospital stays by 5% (adjusted coefficient [95% CI], -0.05 [-0.07 to -0.03]). Effect sizes differed among ICUs. CONCLUSIONS: More daily out-of-bed mobility for ICU patients was associated with shorter mechanical ventilation duration and hospital stays, suggesting a dose-response relationship between daily mobility and patient outcomes. However, relationships differed across ICU subpopulations.
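The percent changes quoted above are consistent with coefficients from a regression on log-transformed outcomes, where a coefficient β maps to roughly a 100·(e^β − 1)% change per additional out-of-bed event per eligible day. The sketch below shows that arithmetic for the reported coefficients; the log-linear form is implied by the reported percentages, not stated explicitly in the abstract.

```python
# Converting a regression coefficient on a log-transformed outcome into a
# percent change per unit increase in daily out-of-bed mobility events.
# Assumes a log-linear model, which the reported 10%/4%/5% figures imply.
from math import exp

def pct_change(beta: float) -> float:
    """Percent change in the untransformed outcome per one-unit predictor increase."""
    return 100 * (exp(beta) - 1)

for outcome, beta in [("ventilation duration", -0.10),
                      ("ICU length of stay", 0.04),
                      ("hospital length of stay", -0.05)]:
    print(f"{outcome}: {pct_change(beta):+.1f}% per extra out-of-bed event/day")
```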


Subject(s)
Critical Illness , Early Ambulation , Intensive Care Units , Length of Stay , Respiration, Artificial , Humans , Retrospective Studies , Male , Female , Early Ambulation/statistics & numerical data , Early Ambulation/methods , Middle Aged , Respiration, Artificial/statistics & numerical data , Length of Stay/statistics & numerical data , Aged , Adult
19.
Respir Care ; 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38653556

ABSTRACT

BACKGROUND: The ratio of oxygen saturation index (ROX index; SpO2/FIO2 divided by breathing frequency) has been shown to predict risk of intubation after high-flow nasal cannula (HFNC) support among adults with acute hypoxemic respiratory failure primarily due to pneumonia. However, its predictive value for other subtypes of respiratory failure is unknown. This study investigated whether the ROX index predicts liberation from HFNC or noninvasive ventilation (NIV), intubation with mechanical ventilation, or death in adults admitted for respiratory failure due to an exacerbation of COPD. METHODS: We performed a retrospective study of 260 adults hospitalized with a COPD exacerbation and treated with HFNC and/or NIV (continuous or bi-level). ROX index scores were collected at treatment initiation and at predefined time intervals throughout HFNC and/or NIV treatment or until the subject was intubated or died. A ROX index threshold of ≥ 4.88 was applied to determine whether this previously validated cutoff would perform similarly in this cohort. Accuracy of the ROX index was determined by calculating the area under the receiver operating characteristic curve. RESULTS: A total of 47 subjects (18%) required invasive mechanical ventilation or died while on HFNC/NIV. The ROX index at treatment initiation, 1 h, and 6 h demonstrated the best prediction accuracy for avoidance of invasive mechanical ventilation or death (area under the receiver operating characteristic curve 0.73 [95% CI 0.66-0.80], 0.72 [95% CI 0.65-0.79], and 0.72 [95% CI 0.63-0.82], respectively). The optimal cutoff value balancing sensitivity and specificity was a ROX index score > 6.88 (sensitivity 62%, specificity 57%). CONCLUSIONS: The ROX index applied to adults with COPD exacerbations treated with HFNC and/or NIV required higher scores to achieve similar prediction of low risk of treatment failure when compared to subjects with hypoxemic respiratory failure/pneumonia. ROX scores < 4.88 did not accurately predict intubation or death.
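The ROX index itself is simple arithmetic, (SpO2/FIO2) divided by the breathing frequency, judged against a cutoff. The sketch below computes it and applies both thresholds discussed (4.88 from the hypoxemic-failure/pneumonia literature and the >6.88 optimum reported here for COPD); the example values and variable names are illustrative only.

```python
# ROX index = (SpO2 / FiO2) / respiratory rate, evaluated against a cutoff.
# Thresholds: 4.88 (validated in hypoxemic failure/pneumonia) and 6.88
# (the optimal cutoff reported here for COPD exacerbations).
def rox_index(spo2_pct: float, fio2: float, resp_rate: float) -> float:
    """spo2_pct in percent (e.g. 92), fio2 as a fraction (e.g. 0.40), rate in breaths/min."""
    return (spo2_pct / fio2) / resp_rate

score = rox_index(spo2_pct=92, fio2=0.40, resp_rate=28)   # ~8.2
for cutoff in (4.88, 6.88):
    status = "low risk of HFNC/NIV failure" if score >= cutoff else "higher risk"
    print(f"ROX {score:.2f} vs cutoff {cutoff}: {status}")
```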
