ABSTRACT
Treatment with mineralocorticoid receptor (MR) antagonists beginning at the outset of disease, or early thereafter, prevents pulmonary vascular remodeling in preclinical models of pulmonary arterial hypertension (PAH). However, the efficacy of MR blockade in established disease, a more clinically relevant condition, remains unknown. Therefore, we investigated the effectiveness of two MR antagonists, eplerenone (EPL) and spironolactone (SPL), after the development of severe right ventricular (RV) dysfunction in the rat SU5416-hypoxia (SuHx) PAH model. Cardiac magnetic resonance imaging (MRI) in SuHx rats at the end of week 5, before study treatment, confirmed features of established disease, including reduced RV ejection fraction, RV hypertrophy, and pronounced septal flattening with impaired left ventricular filling and reduced cardiac index. Five weeks of treatment with either EPL or SPL improved left ventricular filling and prevented the further decline in cardiac index compared with placebo. EPL reduced interventricular septal displacement, whereas the effect of SPL was similar but did not reach significance. Although MR antagonists did not significantly reduce pulmonary artery pressure or vessel remodeling in SuHx rats with established disease, animals with higher drug levels had lower pulmonary pressures. Consistent with effects on cardiac function, EPL treatment tended to suppress MR and proinflammatory gene induction in the RV. In conclusion, MR antagonist treatment led to modest but consistent beneficial effects on ventricular interdependence after the onset of significant RV dysfunction in the SuHx PAH model. These results suggest that measures of RV structure and/or function may be useful endpoints in clinical trials of MR antagonists in patients with PAH.
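As a point of reference for the MRI endpoints above (RV ejection fraction, cardiac index), the short sketch below shows the standard arithmetic by which these indices are derived from ventricular volumes; all numeric values are hypothetical illustrations, not data from this study.

def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic and end-systolic volumes."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

def cardiac_index(stroke_volume_ml, heart_rate_bpm, bsa_m2):
    """Cardiac index (mL/min/m^2): stroke volume x heart rate / body surface area."""
    return stroke_volume_ml * heart_rate_bpm / bsa_m2

# Hypothetical rat RV volumes (mL), heart rate (beats/min), and body surface area (m^2)
edv, esv, hr, bsa = 0.60, 0.42, 330.0, 0.045
sv = edv - esv
print(f"RV ejection fraction: {ejection_fraction(edv, esv):.0f} %")
print(f"Cardiac index: {cardiac_index(sv, hr, bsa) / 1000:.2f} L/min/m^2")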
Subjects
Pulmonary Hypertension, Pulmonary Arterial Hypertension, Right Ventricular Dysfunction, Animals, Animal Disease Models, Familial Primary Pulmonary Hypertension, Humans, Pulmonary Hypertension/drug therapy, Hypoxia/drug therapy, Indoles, Mineralocorticoid Receptor Antagonists/pharmacology, Mineralocorticoid Receptor Antagonists/therapeutic use, Pyrroles, Rats, Right Ventricular Dysfunction/drug therapy
ABSTRACT
BACKGROUND: In experimental canine septic shock, depressed circulating granulocyte counts were associated with a poor outcome, and raising counts with prophylactic granulocyte colony-stimulating factor (G-CSF) improved outcome. Therapeutic G-CSF, in contrast, did not improve circulating counts or outcome, so we investigated whether transfusing granulocytes therapeutically would improve outcome. STUDY DESIGN AND METHODS: Twenty-eight purpose-bred beagles underwent an intrabronchial Staphylococcus aureus challenge and 4 hours later were randomly assigned to granulocyte (40-100 × 10⁹ cells) or plasma transfusion. RESULTS: Granulocyte transfusion significantly expanded the low circulating counts for hours compared to septic controls but was not associated with a significant mortality benefit (1/14, 7% vs. 2/14, 14%, respectively; p = 0.29). Septic animals with higher granulocyte counts at 4 hours (median [interquartile range] of 3.81 [3.39-5.05] vs. 1.77 [1.25-2.50]) had significantly increased survival independent of whether they were transfused with granulocytes. In a subgroup analysis, animals with higher circulating granulocyte counts receiving donor granulocytes had worsened lung injury compared to septic controls. Conversely, donor granulocytes decreased lung injury in septic animals with lower counts. CONCLUSION: During bacterial pneumonia, circulating counts predict the outcome of transfusing granulocytes. With low but normal counts, transfusing granulocytes does not improve survival and injures the lung, whereas for animals with very low counts, but not absolute neutropenia, granulocyte transfusion improves lung function.
Subjects
Granulocytes/transplantation, Bacterial Pneumonia/therapy, Animals, Animal Disease Models, Dogs, Granulocyte Colony-Stimulating Factor/therapeutic use, Granulocytes/cytology, Leukocyte Count, Leukocyte Transfusion, Lung Injury/prevention & control, Bacterial Pneumonia/mortality, Staphylococcus aureus/pathogenicity, Tissue Donors, Treatment Outcome
ABSTRACT
BACKGROUND: Storage temperature is a critical factor for maintaining red blood cell (RBC) viability, especially during prolonged cold storage. The target range of 1 to 6°C was established decades ago and may no longer be optimal for current blood-banking practices. STUDY DESIGN AND METHODS: Human and canine RBCs were collected under standard conditions and stored in precision-controlled refrigerators at 2°C, 4°C, or 6°C. RESULTS: During 42-day storage, human and canine RBCs showed progressive increases in supernatant non-transferrin-bound iron, cell-free hemoglobin, base deficit, and lactate levels that were overall greater at 6°C and 4°C than at 2°C. In chromium-51 recovery studies, animals transfused with 7-day-old RBCs had similar plasma cell-free hemoglobin and non-transferrin-bound iron levels at 1 to 72 hours for all three temperature conditions. However, animals transfused with 35-day-old RBCs stored at higher temperatures developed plasma elevations in non-transferrin-bound iron and cell-free hemoglobin at 24 and 72 hours. Despite apparent impaired 35-day storage at 4°C and 6°C compared to 2°C, posttransfusion chromium-51 recovery at 24 hours was superior at higher temperatures. This finding was confounded by a preparation artifact related to an interaction between temperature and storage duration: repeated washing of the radiolabeled RBC test sample removes fragile cells and renders the test sample unrepresentative of the stored unit. CONCLUSIONS: RBCs stored at the lower bound of the temperature range are less metabolically active and produce less anaerobic acidosis and hemolysis, leading to a more suitable transfusion product. The higher refrigeration temperatures are not optimal during extended RBC storage and may confound chromium viability studies.
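For readers unfamiliar with the chromium-51 endpoint above, 24-hour posttransfusion recovery is a simple ratio of circulating label to injected label; the sketch below illustrates the calculation with hypothetical counts and an assumed blood volume, not values from this study.

def cr51_recovery_percent(sample_cpm_per_ml, blood_volume_ml, injected_cpm):
    """Posttransfusion recovery (%) = circulating chromium-51 label / injected label x 100."""
    return 100.0 * (sample_cpm_per_ml * blood_volume_ml) / injected_cpm

# Hypothetical counts-per-minute values and an assumed canine blood volume
print(f"24-hour recovery: {cr51_recovery_percent(150.0, 900.0, 1.8e5):.0f} %")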
Subjects
Blood Preservation/methods, Chromium/metabolism, Erythrocytes/cytology, Animals, Cultured Cells, Dogs, Hemolysis/physiology, Humans, Temperature
ABSTRACT
BACKGROUND: During sepsis, higher plasma cell-free hemoglobin (CFH) levels portend worse outcomes. In sepsis models, plasma proteins that bind CFH improve survival. In our canine antibiotic-treated Staphylococcus aureus pneumonia model, with and without red blood cell (RBC) exchange transfusion, commercial human haptoglobin (Hp) concentrates bound and compartmentalized CFH intravascularly, increased CFH clearance, and lowered iron levels, improving shock, lung injury, and survival. We now investigate in our model how very high CFH levels and treatment time affect Hp's beneficial effects. MATERIALS AND METHODS: Two separate canine pneumonia sepsis Hp studies were undertaken: one with exchange transfusion of RBCs after prolonged storage to raise CFH to very high levels and another with rapidly lethal sepsis alone to shorten time to treat. All animals received continuous standard intensive care unit supportive care for 96 hours. RESULTS: Older RBCs markedly elevated plasma CFH levels and, when combined with Hp therapy, created supraphysiologic CFH-Hp complexes that did not increase CFH or iron clearance or improve lung injury and survival. In a rapidly lethal bacterial challenge model without RBC transfusion, Hp binding did not increase clearance of complexes or iron or show benefits seen previously in the less lethal model. DISCUSSION: High-level CFH-Hp complexes may impair clearance mechanisms and eliminate Hp's beneficial effect during sepsis. Rapidly lethal sepsis narrows the therapeutic window for CFH and iron clearance, also decreasing Hp's beneficial effects. In designing clinical trials, dosing and kinetics may be critical factors if Hp infusion is used to treat sepsis.
Subjects
Haptoglobins/therapeutic use, Hemoglobins/metabolism, Staphylococcal Pneumonia/drug therapy, Staphylococcal Pneumonia/metabolism, Septic Shock/drug therapy, Septic Shock/metabolism, Animals, Animal Disease Models, Dogs, Erythrocyte Transfusion, Staphylococcal Pneumonia/therapy, Sepsis/drug therapy, Sepsis/metabolism, Sepsis/therapy, Septic Shock/therapy
ABSTRACT
Dysfunction in the nitric oxide (NO) signaling pathway can lead to the development of pulmonary hypertension (PH) in mammals. Discovery of an alternative pathway of NO generation, involving the stepwise reduction of nitrate to nitrite and then to NO, has motivated the evaluation of nitrite as an alternative to inhaled NO for PH. In contrast, inhaled nitrate has not been evaluated to date, and its potential benefits include a prolonged half-life and a decreased risk of methemoglobinemia. In a canine model of acute hypoxia-induced PH, we evaluated whether inhaled nitrate reduces pulmonary arterial pressure (PAP). In a randomized controlled trial, inhaled nitrate was compared to inhaled nitrite and inhaled saline. Exhaled NO, PAP, and systemic blood pressures were continuously monitored. Inhaled nitrite significantly decreased PAP and increased exhaled NO. In contrast, inhaled nitrate and inhaled saline did not decrease PAP or increase exhaled NO. Unexpectedly, we found that inhaled nitrite resulted in prolonged (>5 h) exhaled NO release, an increase in venous/arterial nitrate levels, and a late surge in venous nitrite levels. These findings do not support a therapeutic role for inhaled nitrate in PH but may have therapeutic implications for inhaled nitrite in various disease states.
Subjects
Pulmonary Hypertension/drug therapy, Nitrates/therapeutic use, Sodium Nitrite/therapeutic use, Inhalation Administration, Animals, Dogs, Pulmonary Hypertension/etiology, Hypoxia/complications, Hypoxia/physiopathology, Nitrates/administration & dosage, Nitrates/blood, Nitric Oxide/metabolism, Rats, Sodium Nitrite/administration & dosage, Sodium Nitrite/blood
ABSTRACT
BACKGROUND: No studies have been performed comparing intravenous (IV) iron with transfused red blood cells (RBCs) for treating anemia during infection. In a previous report, transfused older RBCs increased free iron release and mortality in infected animals when compared to fresher cells. We hypothesized that treating anemia during infection with transfused fresh RBCs, which release minimal free iron, would prove superior to IV iron therapy. STUDY DESIGN AND METHODS: Purpose-bred beagles (n = 42) with experimental Staphylococcus aureus pneumonia that had been rendered anemic were randomized to transfusion with RBCs stored for 7 days or to one of two IV iron preparations (7 mg/kg): iron sucrose, a widely used preparation, or ferumoxytol, a newer formulation that blunts circulating iron levels. RESULTS: Both iron preparations increased the alveolar-arterial oxygen gradient at 24 to 48 hours (p = 0.02-0.001), worsened shock at 16 hours (p = 0.02-0.003, respectively), and reduced survival (transfusion 56%; iron sucrose 8%, p = 0.01; ferumoxytol 9%, p = 0.04). Compared to fresh RBC transfusion, plasma iron measured by non-transferrin-bound iron levels increased with iron sucrose at 7, 10, 13, 16, 24, and 48 hours (p = 0.04 to p < 0.0001) and with ferumoxytol at 7, 24, and 48 hours (p = 0.04 to p = 0.004). No significant differences in cardiac filling pressures or performance, hemoglobin (Hb), or cell-free Hb were observed. CONCLUSIONS: During canine experimental bacterial pneumonia, treatment of mild anemia with IV iron significantly increased free iron levels, shock, lung injury, and mortality compared to transfusion of fresh RBCs. This was true for iron preparations that do or do not blunt circulating free iron level elevations. These findings suggest that treatment of anemia with IV iron during infection should be undertaken with caution.
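The alveolar-arterial oxygen gradient used as a lung-injury endpoint here and in several of the studies below is a derived quantity; a minimal sketch of the standard alveolar gas equation is shown with hypothetical blood-gas values, not data from these studies.

def alveolar_po2(fio2, pb_mmhg=760.0, ph2o_mmhg=47.0, paco2_mmhg=40.0, rq=0.8):
    """Alveolar PO2 (mmHg): PAO2 = FiO2 * (Pb - PH2O) - PaCO2 / RQ."""
    return fio2 * (pb_mmhg - ph2o_mmhg) - paco2_mmhg / rq

def aa_gradient(fio2, arterial_po2_mmhg, **kwargs):
    """Alveolar-arterial gradient (mmHg) = alveolar PO2 - measured arterial PO2."""
    return alveolar_po2(fio2, **kwargs) - arterial_po2_mmhg

# Hypothetical example: FiO2 0.40, arterial PO2 90 mmHg, PaCO2 35 mmHg
print(f"A-a gradient: {aa_gradient(0.40, 90.0, paco2_mmhg=35.0):.0f} mmHg")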
Subjects
Anemia/therapy, Erythrocyte Transfusion, Iron/administration & dosage, Bacterial Pneumonia/complications, Anemia/complications, Anemia/etiology, Anemia/mortality, Animals, Animal Disease Models, Dogs, Erythrocyte Transfusion/standards, Ferric Compounds/administration & dosage, Ferric Compounds/therapeutic use, Saccharated Ferric Oxide, Ferrosoferric Oxide/administration & dosage, Ferrosoferric Oxide/therapeutic use, Glucaric Acid/administration & dosage, Glucaric Acid/therapeutic use, Iron/adverse effects, Iron/therapeutic use, Lung Injury, Mortality, Staphylococcal Pneumonia/therapy
ABSTRACT
In a randomized controlled blinded trial, 2-year-old purpose-bred beagles (n = 24) with Staphylococcus aureus pneumonia were exchange-transfused with either 7- or 42-day-old washed or unwashed canine universal donor blood (80 mL/kg in 4 divided doses). Washing red blood cells (RBCs) before transfusion had a significantly different effect on canine survival, multiple organ injury, plasma iron, and cell-free hemoglobin (CFH) levels depending on the age of stored blood (all, P < .05 for interactions). Washing older units of blood improved survival rates, shock score, lung injury, cardiac performance, and liver function, and reduced levels of non-transferrin-bound iron and plasma labile iron. In contrast, washing fresh blood worsened all of these same clinical parameters and increased CFH levels. Our data indicate that transfusion of fresh blood, which results in less hemolysis, CFH, and iron release, is less toxic than transfusion of older blood in critically ill infected subjects. However, washing older blood prevented elevations in circulating plasma iron and improved survival and multiple organ injury in animals with an established pulmonary infection. Our data suggest that fresh blood should not be washed routinely because, in a setting of established infection, washed fresh RBCs are prone to release CFH, resulting in worsened clinical outcomes.
Subjects
Blood Specimen Collection/methods, Erythrocyte Transfusion/adverse effects, Erythrocyte Transfusion/methods, Erythrocytes/cytology, Iron/blood, Plasma/chemistry, Staphylococcal Pneumonia/therapy, Acute Lung Injury/etiology, Acute Lung Injury/mortality, Animals, Blood Preservation, Animal Disease Models, Dogs, Down-Regulation, Iron/isolation & purification, Staphylococcal Pneumonia/mortality, Treatment Outcome
ABSTRACT
Two-year-old purpose-bred beagles (n = 24) infected with Staphylococcus aureus pneumonia were randomized in a blinded fashion for exchange transfusion with either 7- or 42-day-old canine universal donor blood (80 mL/kg in 4 divided doses). Older blood increased mortality (P = .0005), the alveolar-arterial oxygen gradient (24-48 hours after infection; P ≤ .01), systemic and pulmonary pressures during transfusion (4-16 hours), and pulmonary pressures for ~10 hours afterward (all P ≤ .02). Further, older blood caused more severe lung damage, evidenced by increased necrosis, hemorrhage, and thrombosis (P = .03) noted at the infection site postmortem. Plasma cell-free hemoglobin and nitric oxide (NO) consumption capability were elevated, and haptoglobin levels were decreased, with older blood during and for 32 hours after transfusion (all P ≤ .03). Low haptoglobin (r = 0.61; P = .003) and high NO consumption levels at 24 hours (r = −0.76; P < .0001) were associated with poor survival. Plasma non-transferrin-bound and labile iron were significantly elevated only during transfusion (both P = .03) and were not associated with survival (P = NS). These data from canines indicate that older blood, after transfusion, has a propensity to hemolyze in vivo, releases vasoconstrictive cell-free hemoglobin over days, worsens pulmonary hypertension, gas exchange, and ischemic vascular damage in the infected lung, and thereby increases the risk of death from transfusion.
Subjects
Blood Preservation/adverse effects, Whole Blood Exchange Transfusion/mortality, Staphylococcal Pneumonia/mortality, Staphylococcal Pneumonia/therapy, Animals, Animal Disease Models, Dogs, Whole Blood Exchange Transfusion/adverse effects, Whole Blood Exchange Transfusion/methods, Heart Rate/physiology, Pulmonary Hypertension/etiology, Staphylococcal Pneumonia/pathology, Staphylococcal Pneumonia/physiopathology, Pulmonary Gas Exchange/physiology, Random Allocation, Single-Blind Method, Staphylococcus aureus/physiology, Survival Analysis, Time Factors
ABSTRACT
BACKGROUND: In canine models, transfused older stored red blood cells (RBCs) hemolyze in vivo, resulting in significantly increased intravascular cell-free hemoglobin (CFH) and non-transferrin-bound iron (NTBI). During canine bacterial pneumonia with septic shock, but not in controls, older stored RBCs were associated with significantly increased lung injury and mortality. It is unknown whether, in shock without infection, transfusion of older RBCs results in similar adverse effects. STUDY DESIGN AND METHODS: Two-year-old purpose-bred beagles (n = 12) were transfused similar quantities of either older (42-day) or fresher (7-day) stored universal donor canine RBCs 2.5 hours after undergoing controlled hemorrhage (55 mL/kg). RESULTS: With older transfused RBCs, CFH (p < 0.0001) and NTBI (p = 0.004) levels increased, but lung injury (p = 0.01) and C-reactive protein levels (p = 0.002) declined, and there was a trend toward lower mortality (18% vs. 50%). All three deaths after transfusion of fresher RBCs resulted from hepatic fractures. Lower exogenous norepinephrine requirements (p < 0.05) and cardiac outputs (p < 0.05) after transfusion of older RBCs were associated with increased levels of CFH, which has known vasoconstrictive, nitric oxide-scavenging capability. CONCLUSIONS: In hemorrhagic shock, older RBCs altered resuscitation physiology but did not worsen clinical outcomes. Elevated CFH may lower norepinephrine requirements and cardiac outputs, ameliorating reperfusion injury. With hemorrhagic shock, NTBI levels persist, in contrast to the increased clearance, lung injury, and mortality seen in the previously reported infection model. These preclinical data suggest that whereas iron derived from older RBCs promotes bacterial growth, worsening septic shock mortality during infection, release of CFH and NTBI during hemorrhagic shock is not necessarily harmful.
Subjects
Erythrocyte Transfusion/methods, Erythrocytes/physiology, Reperfusion Injury/therapy, Hemorrhagic Shock/therapy, Animals, Blood Preservation, Dogs, Humans, Random Allocation
ABSTRACT
BACKGROUND: Massive exchange transfusion of 42-day-old red blood cells (RBCs) in a canine model of Staphylococcus aureus pneumonia resulted in in vivo hemolysis with increases in cell-free hemoglobin (CFH), transferrin-bound iron (TBI), non-transferrin-bound iron (NTBI), and mortality. We have previously shown that washing 42-day-old RBCs before transfusion significantly decreased NTBI levels and mortality, but washing 7-day-old RBCs increased mortality and CFH levels. We now report the results of altering volume, washing, and age of RBCs. STUDY DESIGN AND METHODS: Two-year-old purpose-bred infected beagles were transfused with increasing volumes (5-10, 20-40, or 60-80 mL/kg) of either 42- or 7-day-old RBCs (n = 36) or with 80 mL/kg of either unwashed or washed RBCs of increasing storage age (14, 21, 28, or 35 days; n = 40). RESULTS: All volumes transfused (5-80 mL/kg) of 42-day-old RBCs resulted in similar (i.e., not significantly different) increases in TBI during transfusion, as well as in CFH, lung injury, and mortality rates after transfusion. Transfusion of 80 mL/kg of RBCs stored for 14, 21, 28, and 35 days resulted in CFH and NTBI increases intermediate between the levels found with 7 and 42 days of storage. However, washing RBCs of intermediate ages (14-35 days) did not alter NTBI and CFH levels or mortality rates. CONCLUSIONS: Preclinical data suggest that any volume of 42-day-old blood potentially increases risks during established infection. In contrast, even massive volumes of 7-day-old blood result in minimal CFH and NTBI levels and risks. In contrast to the extremes of storage, washing blood stored for intermediate ages does not alter the risks of transfusion or NTBI and CFH clearance.
Subjects
Erythrocytes/physiology, Whole Blood Exchange Transfusion/methods, Staphylococcal Pneumonia/therapy, Animals, Blood Preservation/adverse effects, Animal Disease Models, Dogs, Erythrocyte Transfusion/adverse effects, Erythrocytes/cytology, Whole Blood Exchange Transfusion/adverse effects, Time Factors
ABSTRACT
The clinical significance and even the existence of critical illness-related corticosteroid insufficiency are controversial. Here, hypothalamic-pituitary-adrenal (HPA) function was characterized in severe canine Staphylococcus aureus pneumonia. Animals received antibiotics and titrated life-supportive measures. Treatment with dexamethasone, a glucocorticoid, but not desoxycorticosterone, a mineralocorticoid, improves outcome in this model. Total and free cortisol, adrenocorticotropic hormone (ACTH), and aldosterone levels, as well as responses to exogenous ACTH, were measured serially. At 10 h after the onset of infection, the acute HPA axis stress response, as measured by cortisol levels, exceeded that seen with high-dose ACTH stimulation but was not predictive of outcome. In contrast to cortisol, aldosterone was largely autonomous from HPA axis control, remained elevated longer, and was more closely associated with survival in early septic shock. Importantly, dexamethasone suppressed cortisol and ACTH levels and restored ACTH responsiveness in survivors. In striking contrast, in nonsurvivors, sepsis-induced hypercortisolemia and high ACTH levels, as well as ACTH hyporesponsiveness, were not influenced by dexamethasone. During septic shock, only serial measurements and provocative testing over a well-defined timeline were able to demonstrate a strong relationship between HPA axis function and prognosis. HPA axis unresponsiveness and high aldosterone levels identify a septic shock subpopulation with poor outcomes that may have the greatest potential to benefit from new therapies.
Subjects
Dog Diseases/physiopathology, Hypothalamo-Hypophyseal System/physiopathology, Pituitary-Adrenal System/physiopathology, Staphylococcal Infections/physiopathology, Staphylococcal Infections/veterinary, Adrenocorticotropic Hormone/metabolism, Animals, Dexamethasone, Dogs, Hydrocortisone/metabolism, Mineralocorticoids/metabolism, Staphylococcal Pneumonia/physiopathology, Staphylococcal Pneumonia/veterinary, Sepsis/physiopathology, Sepsis/veterinary, Survival Analysis
ABSTRACT
BACKGROUND: In experimental pneumonia, we found that transfused older blood increased mortality and lung injury, which were associated with increased in vivo hemolysis and elevated plasma cell-free hemoglobin (CFH), non-transferrin-bound iron (NTBI), and plasma labile iron (PLI) levels. In this study, we additionally analyze identically treated animals that received lower or higher bacterial doses. STUDY DESIGN AND METHODS: Two-year-old purpose-bred beagles (n = 48) challenged intrabronchially with Staphylococcus aureus (0 [n = 8], 1.0 × 10⁹ [n = 8], 1.25 × 10⁹ [n = 24], and ≥1.5 × 10⁹ [n = 8] colony-forming units/kg) were exchange transfused with either 7- or 42-day-old canine universal donor blood (80 mL/kg in four divided doses). RESULTS: The greater increases in CFH with older blood over the days after exchange proved relatively independent of bacterial dose. The lesser increases in CFH observed with fresher blood were bacterial dose dependent, potentially related to bacterial hemolysins. Without bacterial challenge, levels of CFH, NTBI, and PLI were significantly higher with older versus fresher blood transfusion, but there was no significant measurable injury. With higher-dose bacterial challenge, the elevated NTBI and PLI levels declined more rapidly and to a greater extent after transfusion with older versus fresher blood, and older blood was associated with significantly worse shock, lung injury, and mortality. CONCLUSION: The augmented in vivo hemolysis of transfused older red blood cells (RBCs) appears to result in excess plasma CFH and iron release, which requires the presence of established infection to worsen outcome. These data suggest that transfused older RBCs increase the risks from infection in septic subjects.
Subjects
Acute Lung Injury/etiology, Blood Preservation/adverse effects, Whole Blood Exchange Transfusion/adverse effects, Staphylococcal Pneumonia/complications, Staphylococcus aureus, Acute Lung Injury/blood, Animals, Animal Disease Models, Dogs, Iron/blood, Staphylococcal Pneumonia/blood, Risk Factors, Severity of Illness Index, Time Factors, Treatment Outcome
ABSTRACT
One of the most ominous predictions related to recent climatic warming is that low-lying coastal environments will be inundated by higher sea levels. The threat is especially acute in polar regions because reductions in extent and duration of sea ice cover increase the risk of storm surge occurrence. The Mackenzie Delta of northwest Canada is an ecologically significant ecosystem adapted to freshwater flooding during spring breakup. Marine storm surges during the open-water season, which move saltwater into the delta, can have major impacts on terrestrial and aquatic systems. We examined growth rings of alder shrubs (Alnus viridis subsp. fruticosa) and diatoms preserved in dated lake sediment cores to show that a recent marine storm surge in 1999 caused widespread ecological changes across a broad extent of the outer Mackenzie Delta. For example, diatom assemblages record a striking shift from freshwater to brackish species following the inundation event. What is of particular significance is that the magnitude of this recent ecological impact is unmatched over the > 1,000-year history of this lake ecosystem. We infer that no biological recovery has occurred in this lake, while large areas of terrestrial vegetation remain dramatically altered over a decade later, suggesting that these systems may be on a new ecological trajectory. As climate continues to warm and sea ice declines, similar changes will likely be repeated in other coastal areas of the circumpolar Arctic. Given the magnitude of ecological changes recorded in this study, such impacts may prove to be long lasting or possibly irreversible.
Subjects
Ecosystem, Arctic Regions, Canada, Climate, Cold Climate, Diatoms, Ecology/methods, Geography, Paleontology/methods, Salts/chemistry, Seawater, Time Factors, Weather
ABSTRACT
CONTEXT: The majority of research on the predictive value of clinical measurements and assessments for future athletic injury does not differentiate between contact and non-contact injuries. OBJECTIVE: We assessed the association between clinical measures and questionnaire data collected prior to sport participation and the incidence of non-contact lower extremity (LE) injuries among Division III collegiate athletes. DESIGN: Prospective cohort study. SETTING: University setting, NCAA Division III. PARTICIPANTS: 488 Division III freshman athletes were recruited to participate in the study during their preseason physical examinations. MAIN OUTCOME MEASURE: Prospective incidence of non-contact lower extremity injury. METHODS: Athletes completed questionnaires to collect demographics and musculoskeletal pain history. Clinical tests, performed by trained examiners, included hip provocative tests, visual appraisal of a single-leg squat to identify dynamic knee valgus, and hip range of motion (ROM). Injury surveillance was performed for each athlete's collegiate career. The athletic training department documented each athlete-reported, new-onset injury, including the injury location, type, and outcome (days lost, surgery performed). Univariable generalized estimating equation (GEE) models were used to analyze the relationship between each clinical measure and the first occurrence of non-contact LE injury. An exchangeable correlation structure was used to account for repeated measurements within athletes (right and left limbs). RESULTS: Of the 488 athletes, 369 (75%) were included in the final analysis, and 69 non-contact LE injuries were reported. Responding "Yes" to "Have you ever had pain or an injury to your low back" was associated with an increased risk of non-contact LE injury (odds ratio = 1.59; 95% CI 1.03-2.45; p = .04). No other clinical measures were associated with increased injury risk. CONCLUSION: A history of prior low back pain or injury was associated with an increased risk of sustaining a non-contact LE injury while participating in NCAA Division III athletics.
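As a companion to the GEE analysis described above, here is a minimal sketch of a univariable logistic GEE with an exchangeable correlation structure, assuming a long-format table with one row per limb per athlete; the column names (athlete_id, injury, prior_low_back_pain) and the 0/1 coding are illustrative assumptions, not the study's actual variable names.

# Sketch of a univariable logistic GEE with an exchangeable correlation structure,
# mirroring the analysis described above (limbs clustered within athletes).
# Data layout and column names are assumptions for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_univariable_gee(df: pd.DataFrame, predictor: str):
    """Fit injury ~ predictor with athlete-level clustering; return OR, 95% CI, p."""
    model = sm.GEE.from_formula(
        f"injury ~ {predictor}",                  # binary (0/1) non-contact LE injury outcome
        groups="athlete_id",                      # right and left limbs clustered within athlete
        data=df,
        family=sm.families.Binomial(),            # logistic link
        cov_struct=sm.cov_struct.Exchangeable(),  # exchangeable working correlation
    )
    result = model.fit()
    odds_ratio = np.exp(result.params[predictor])
    ci_low, ci_high = np.exp(result.conf_int().loc[predictor])
    return odds_ratio, (ci_low, ci_high), result.pvalues[predictor]

# Example call on a hypothetical data frame with one row per limb:
# df = pd.read_csv("athlete_limbs.csv")
# odds_ratio, ci, p = fit_univariable_gee(df, "prior_low_back_pain")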
ABSTRACT
BACKGROUND: High levels of catecholamines are cardiotoxic and associated with stress-induced cardiomyopathies. Using a septic shock model that reproduces the reversible cardiomyopathy seen over 10 days in association with human septic shock, we investigated the effects of catecholamines on microcirculatory perfusion and cardiac dysfunction. METHODS AND RESULTS: Purpose-bred beagles received intrabronchial Staphylococcus aureus (n=30) or saline (n=6). The septic animals were then randomized to epinephrine (1 µg/kg per minute, n=15) or saline (n=15) infusions from 4 to 44 hours. Serial cardiac magnetic resonance imaging, catecholamine levels, and troponins were collected over 92 hours. Serial adenosine-stress perfusion cardiac magnetic resonance imaging was performed on septic animals randomized to receive saline (n=8 of 15) or epinephrine (n=8 of 15). High-dose sedation was given to suppress endogenous catecholamine release. Despite catecholamine levels remaining largely within the normal range throughout, by 48 hours septic animals receiving saline, compared with nonseptic animals, still developed significant worsening of left ventricular ejection fraction, circumferential strain, and ventricular-aortic coupling. In septic animals that received epinephrine versus saline infusions, plasma epinephrine levels increased 800-fold, but epinephrine produced no significant further worsening of left ventricular ejection fraction, circumferential strain, or ventricular-aortic coupling. Septic animals receiving saline had a significant increase in microcirculatory reserve without troponin elevations. Septic animals receiving epinephrine had decreased edema, blunted microcirculatory perfusion, and elevated troponin levels that persisted for hours after the epinephrine infusion was stopped. CONCLUSIONS: Cardiac dysfunction during sepsis is not primarily due to elevated endogenous or exogenous catecholamines, nor to decreased microvascular perfusion-induced ischemia. However, epinephrine itself has potentially harmful, long-lasting ischemic effects during sepsis, including impaired cardiac microvascular perfusion that persists after the infusion is stopped.
Subjects
Cardiomyopathies, Animal Disease Models, Epinephrine, Microcirculation, Septic Shock, Animals, Dogs, Septic Shock/physiopathology, Septic Shock/complications, Septic Shock/blood, Epinephrine/blood, Microcirculation/drug effects, Cardiomyopathies/physiopathology, Cardiomyopathies/blood, Cardiomyopathies/etiology, Stroke Volume/drug effects, Coronary Circulation/drug effects, Myocardial Ischemia/physiopathology, Myocardial Ischemia/blood, Myocardial Ischemia/complications, Left Ventricular Function/drug effects, Catecholamines/blood, Troponin/blood, Staphylococcal Infections/microbiology, Staphylococcal Infections/complications, Staphylococcal Infections/physiopathology, Time Factors, Myocardial Perfusion Imaging/methods, Magnetic Resonance Imaging
ABSTRACT
BACKGROUND: Septic shock is associated with increases in end-diastolic volume (EDV) and decreases in ejection fraction that reverse within 10 days. Nonsurvivors do not develop EDV increases. The mechanism is unknown. METHODS AND RESULTS: Purpose-bred beagles (n=33) were randomized to receive intrabronchial Staphylococcus aureus or saline. Over 96 hours, cardiac magnetic resonance imaging and echocardiograms were performed. Tissue was obtained at 66 hours. From 0 to 96 hours after bacterial challenge, septic animals versus controls had significantly increased left ventricular wall edema (6%) and wall thinning with loss of mass (15%). On histology, the major finding was nonocclusive microvascular injury with edema in myocytes, the interstitium, and endothelial cells. Edema was associated with significant worsening of biventricular ejection fractions, ventricular-arterial coupling, and circumferential strain. Early during sepsis (0-24 hours), the EDV decreased, significantly more so in nonsurvivors (ie, greater diastolic dysfunction). From 24 to 48 hours, septic animals' biventricular chamber sizes increased; in survivors, EDVs became significantly greater than baseline and than in nonsurvivors, whose EDVs were not different from baseline. Differences in preload, afterload, or heart rate did not explain these differential changes. CONCLUSIONS: The cardiac dysfunction of sepsis is associated with wall edema. In nonsurvivors, at 0 to 24 hours, sepsis induces a more severe diastolic dysfunction, further decreasing chamber size. The loss of left ventricular mass with wall thinning in septic survivors may, in part, explain the EDV increases from 24 to 48 hours, because a potentially reparative process removes damaged wall tissue. Septic cardiomyopathy is most consistent with a nonocclusive microvascular injury resulting in edema that causes reversible systolic and diastolic dysfunction, with more severe diastolic dysfunction being associated with a decreased EDV and death.
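Two of the imaging-derived indices above, ventricular-arterial coupling and circumferential strain, reduce to simple ratios; the sketch below uses the common volume-based coupling approximation (Ea/Ees ≈ ESV/SV) and the Lagrangian strain definition, with hypothetical values rather than data from this study.

def va_coupling_ratio(edv_ml, esv_ml):
    """Ea/Ees approximated as ESV/SV, since Ea ~ ESP/SV and Ees ~ ESP/ESV (ESP cancels)."""
    stroke_volume = edv_ml - esv_ml
    return esv_ml / stroke_volume

def circumferential_strain(circ_end_systole_mm, circ_end_diastole_mm):
    """Lagrangian strain (%): (L_es - L_ed) / L_ed; negative values indicate shortening."""
    return 100.0 * (circ_end_systole_mm - circ_end_diastole_mm) / circ_end_diastole_mm

# Hypothetical canine left ventricular values
print(f"Ventricular-arterial coupling (Ea/Ees): {va_coupling_ratio(60.0, 30.0):.2f}")
print(f"Circumferential strain: {circumferential_strain(95.0, 120.0):.1f} %")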
Subjects
Animal Disease Models, Septic Shock, Stroke Volume, Animals, Dogs, Septic Shock/physiopathology, Septic Shock/complications, Magnetic Resonance Imaging, Cardiac Edema/physiopathology, Cardiac Edema/pathology, Cardiac Edema/diagnostic imaging, Left Ventricular Function, Time Factors, Humans, Staphylococcal Infections/complications, Staphylococcal Infections/physiopathology, Echocardiography, Left Ventricular Dysfunction/physiopathology, Left Ventricular Dysfunction/diagnostic imaging, Left Ventricular Dysfunction/etiology, Male
ABSTRACT
Background: Septic shock, in humans and in our well-established animal model, is associated with increases in biventricular end-diastolic volume (EDV) and decreases in ejection fraction (EF). These abnormalities occur over 2 days and reverse within 10 days. Septic non-survivors do not develop an increase in EDV. The mechanism for this cardiac dysfunction and for the EDV differences is unknown. Methods: Purpose-bred beagles randomized to receive intrabronchial Staphylococcus aureus (n=27) or saline (n=6) were provided standard ICU care, including sedation, mechanical ventilation, and fluid resuscitation to a pulmonary arterial occlusion pressure of over 10 mmHg. No catecholamines were administered. Over 96 h, cardiac magnetic resonance imaging, echocardiograms, and invasive hemodynamics were serially performed, and laboratory data were collected. Tissue was obtained at 66 h from six septic animals. Results: From 0-96 h after bacterial challenge, septic animals vs. controls had significantly increased left ventricular wall edema (6%) and wall thinning with loss of mass (15%), which was more pronounced at 48 h in non-survivors than in survivors. On histology, edema was located predominantly in myocytes, the interstitium, and endothelial cells. Edema was associated with significantly worse biventricular function (lower EFs), ventricular-arterial coupling, and circumferential strain. In septic animals, from 0-24 h, the EDV decreased from baseline and, despite cardiac filling pressures being similar, decreased significantly more in non-survivors. From 24-48 h, all septic animals had increases in biventricular chamber sizes. Survivors' biventricular EDVs became significantly greater than baseline, whereas non-survivors' biventricular EDVs were not different from baseline. Differences in preload, afterload, or heart rate did not explain these differential serial changes in chamber size. Conclusion: Systolic and diastolic cardiac dysfunction during sepsis is associated with ventricular wall edema. Rather than differences in preload, afterload, or heart rate, structural alterations to the ventricular wall best account for the volume changes associated with outcome during sepsis. In non-survivors, from 0-24 h, sepsis induces a more severe diastolic dysfunction, further decreasing chamber size. The loss of left ventricular mass with wall thinning in septic survivors may, in part, explain the EDV increases from 24-48 h. However, these changes continued and even accelerated into the recovery phase, consistent with a reparative process rather than ongoing injury.
ABSTRACT
B. anthracis edema toxin (ET) and lethal toxin (LT) are each composed of protective antigen (PA), necessary for toxin uptake by host cells, and their respective toxic moieties, edema factor (EF) and lethal factor (LF). Although both toxins likely contribute to shock during infection, their mechanisms are unclear. To test whether ET and LT produce arterial relaxation, their effects on phenylephrine (PE)-stimulated contraction were measured in a Sprague-Dawley rat aortic ring model. Rings were prepared and connected to pressure transducers. Their viability was confirmed, and peak contraction with 60 mM KCl was determined. Compared with PA pretreatment (control, 60 min), ET pretreatment at concentrations similar to those noted in vivo decreased the mean (±SE) maximum contractile force (MCF; percent of peak contraction) generated by rings during stimulation with increasing PE concentrations (96.2 ± 7.0 vs. 57.3 ± 9.1) and increased the estimated PE concentration producing half the MCF (EC50; 10⁻⁷ M, 1.1 ± 0.3 vs. 3.7 ± 0.8, P ≤ 0.002). ET inhibition with PA-directed monoclonal antibodies, selective EF inhibition with adefovir, or removal of the ring endothelium inhibited the effects of ET on MCF and EC50 (P ≤ 0.02). Consistent with its adenyl cyclase activity, ET increased tissue cAMP in endothelium-intact but not endothelium-denuded rings (P < 0.0001 and 0.25, respectively). LT pretreatment, even at high concentrations, did not significantly decrease MCF or increase EC50 (all P > 0.05). In rings precontracted with PE, compared with PA posttreatment (90 min), ET posttreatment produced progressive reductions in contractile force and increases in relaxation in endothelium-intact rings (P < 0.0001) but not endothelium-denuded rings (P = 0.51). Thus, ET may contribute to shock by producing arterial relaxation.
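An EC50 such as the one reported above is typically estimated by fitting a sigmoidal (Hill) curve to the concentration-response data; the sketch below shows that fitting step with hypothetical phenylephrine data points, not the values from this study.

# Hill-equation fit to estimate an EC50 from a phenylephrine concentration-response curve.
# Data values are hypothetical, for illustration of the fitting step only.
import numpy as np
from scipy.optimize import curve_fit

def hill(log_conc, bottom, top, log_ec50, slope):
    """Four-parameter logistic (Hill) model on log10 concentration."""
    return bottom + (top - bottom) / (1.0 + 10.0 ** ((log_ec50 - log_conc) * slope))

# Hypothetical PE concentrations (M) and contraction (% of peak KCl contraction)
conc = np.array([1e-9, 1e-8, 3e-8, 1e-7, 3e-7, 1e-6, 1e-5])
response = np.array([2.0, 10.0, 25.0, 50.0, 75.0, 92.0, 97.0])

popt, _ = curve_fit(hill, np.log10(conc), response, p0=[0.0, 100.0, -7.0, 1.0])
bottom, top, log_ec50, slope = popt
print(f"Estimated EC50: {10 ** log_ec50:.2e} M (Hill slope {slope:.2f})")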
Subjects
Bacterial Antigens/pharmacology, Aorta/drug effects, Bacterial Toxins/pharmacology, Cyclic AMP/metabolism, Phenylephrine/pharmacology, Second Messenger Systems/drug effects, Vasoconstriction/drug effects, Vasoconstrictor Agents/pharmacology, Vasodilation/drug effects, Vasodilator Agents/pharmacology, Adenine/analogs & derivatives, Adenine/pharmacology, Animals, Aorta/metabolism, Drug Dose-Response Relationship, Vascular Endothelium/drug effects, Vascular Endothelium/metabolism, Organophosphonates/pharmacology, Rats, Sprague-Dawley Rats, Up-Regulation
ABSTRACT
BACKGROUND: Cell-free hemoglobin (Hb) in the vasculature leads to vasoconstriction and injury. Proposed mechanisms have been based on nitric oxide (NO) scavenging by oxyhemoglobin (oxyHb) or on processes mediated by oxidative reactions of methemoglobin (metHb). To clarify this, we tested the vascular effects and fate of oxyHb or metHb infusions. STUDY DESIGN AND METHODS: Twenty beagles were challenged with similar 1-hour infusions (200 µmol/L) of metHb (n = 5), oxyHb (n = 5), albumin (n = 5), or saline (n = 5). Measurements were taken over 3 hours. RESULTS: Infusions of the two pure Hb species resulted in increases in mean arterial blood pressure (MAP), systemic vascular resistance index, and the NO consumption capacity of plasma (all p < 0.05), with the effects of oxyHb being greater than those of metHb (MAP increase, 0 to 3 hr: 27 ± 6% vs. 7 ± 2%, respectively; all p < 0.05). The significant vasoconstrictive response to metHb (vs. albumin and saline controls) was related to in vivo autoreduction of metHb to oxyHb, and the vasoactive Hb species that significantly correlated with MAP was always oxyHb, whether from direct infusion or after in vivo reduction from metHb. Clearance of total Hb from plasma was faster after metHb than after oxyHb infusion (p < 0.0001). CONCLUSION: These findings indicate that greater NO consumption capacity makes oxyHb more vasoactive than metHb. Additionally, metHb is reduced to oxyHb after infusion and is cleared faster, or is less stable, than oxyHb. Although we found no direct evidence that metHb itself is involved in acute vascular effects, in aggregate these studies suggest that metHb is not inert and that its mechanism of vasoconstriction is its delayed conversion to oxyHb by plasma reducing agents.
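The systemic vascular resistance index reported above is derived from mean pressures and cardiac index by a standard hemodynamic formula; the sketch below illustrates it with hypothetical values, not measurements from this study.

def svri(map_mmhg, cvp_mmhg, cardiac_index_l_min_m2):
    """Systemic vascular resistance index (dyn*s*cm^-5*m^2) = 80 * (MAP - CVP) / CI."""
    return 80.0 * (map_mmhg - cvp_mmhg) / cardiac_index_l_min_m2

# Hypothetical hemodynamics before and after a cell-free hemoglobin infusion
baseline = svri(map_mmhg=95.0, cvp_mmhg=5.0, cardiac_index_l_min_m2=3.5)
after_oxyhb = svri(map_mmhg=120.0, cvp_mmhg=5.0, cardiac_index_l_min_m2=3.0)
print(f"SVRI baseline: {baseline:.0f}; after oxyHb: {after_oxyhb:.0f} dyn*s*cm^-5*m^2")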
Subjects
Methemoglobin/pharmacology, Oxyhemoglobins/pharmacology, Vasoconstriction/drug effects, Albumins/pharmacology, Animals, Blood Pressure/drug effects, Dogs, Methemoglobin/metabolism, Nitric Oxide/metabolism, Oxyhemoglobins/metabolism, Random Allocation
ABSTRACT
BACKGROUND: Anthrax-associated shock is closely linked to lethal toxin (LT) release and is highly lethal despite conventional hemodynamic support. We investigated whether protective antigen-directed monoclonal antibody (PA-mAb) treatment further augments titrated hemodynamic support. METHODS AND RESULTS: Forty sedated, mechanically ventilated, instrumented canines challenged with anthrax LT were assigned to no treatment (controls), hemodynamic support alone (protocol-titrated fluids and norepinephrine), PA-mAb alone (administered at start of LT infusion [0 hours] or 9 or 12 hours later), or both, and observed for 96 hours. Although all 8 controls died, 2 of 8 animals receiving hemodynamic support alone survived (median survival times 65 vs 85 hours, respectively; P = .03). PA-mAb alone at 0 hour improved survival (5 of 5 animals survived), but efficacy decreased progressively with delayed treatment (9 hours, 2 of 3 survived; 12 hours, 0 of 4 survived) (P = .004 comparing survival across treatment times). However, combined treatment increased survival irrespective of PA-mAb administration time (0 hours, 4 of 5 animals; 9 hours, 3 of 3 animals; and 12 hours, 4 of 5 animals survived) (P = .95 comparing treatment times). Compared to hemodynamic support alone, when combined over PA-mAb treatment times (0, 9, and 12 hours), combination therapy produced higher survival (P = .008), central venous pressures, and left ventricular ejection fractions, and lower heart rates, norepinephrine requirements and fluid retention (P ≤ .03). CONCLUSIONS: PA-mAb may augment conventional hemodynamic support during anthrax LT-associated shock.