ABSTRACT
INTRODUCTION: Hemorrhage is the leading cause of preventable death, with the majority of deaths occurring in the prehospital setting. Current hemorrhage resuscitation guidelines cannot predict the critical point of intervention to activate massive transfusion (MT) and prevent cardiovascular decompensation. We hypothesized that cerebral regional tissue oxygenation (CrSO2) would indicate MT need in nonhuman primate models of hemorrhagic shock. METHODS: Nineteen anesthetized male rhesus macaques underwent hemorrhage via a volume-targeted (VT) or pressure-targeted (PT) method. VT animals were monitored for 30 min following 30% blood volume hemorrhage. PT animals were hemorrhaged to a mean arterial pressure (MAP) of 20 mmHg and maintained for at least 60 min until decompensation. MAP, heart rate (HR), end-tidal carbon dioxide (EtCO2), and CrSO2 were analyzed via one- or two-way repeated-measures analysis of variance, Pearson correlation, and receiver operating characteristic (ROC) curves. P < 0.05 was considered significant. RESULTS: Following initial hemorrhage (S0), there were no significant differences between groups. After cessation of hemorrhage in the VT group, MAP and EtCO2 returned to baseline while CrSO2 plateaued. The PT group maintained the model-defined low MAP with suppressed EtCO2 and a significantly lower CrSO2 than the VT group by S25. Linear regression of CrSO2 versus shed blood volume demonstrated R2 = 0.7539. A CrSO2 of 47% detected >40% blood loss with an area under the curve of 0.9834, at 92.3% (66.7%-99.6%) sensitivity and 95.5% (84.9%-99.2%) specificity. CONCLUSIONS: Regardless of hemorrhage modality and compensatory response, CrSO2 correlated strongly with shed blood volume. Analysis demonstrated that CrSO2 values below 49% indicate Advanced Trauma Life Support class IV shock (blood loss >40%). CrSO2 at the point of care may help indicate MT need earlier and more accurately than traditional markers.
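A minimal sketch (hypothetical observations, not the study data) of how a CrSO2 cut-point such as the reported 47% could be screened for sensitivity, specificity, and ROC area under the curve; the scikit-learn call and the example values are illustrative assumptions only.

```python
# Minimal sketch (not the authors' code): screening CrSO2 readings against a
# candidate transfusion-decision threshold and estimating ROC performance.
# All values below are hypothetical illustrations, not study data.
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical paired observations: cerebral oxygenation (%) and whether
# shed blood volume exceeded 40% (ATLS class IV shock).
crso2 = np.array([68, 65, 62, 58, 55, 52, 48, 46, 44, 41, 39, 37])
class_iv = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

# Lower CrSO2 means greater blood loss, so use the negated value as the score.
auc = roc_auc_score(class_iv, -crso2)

threshold = 47.0  # candidate cut-point, as reported in the abstract
predicted = crso2 < threshold
sensitivity = (predicted & (class_iv == 1)).sum() / (class_iv == 1).sum()
specificity = (~predicted & (class_iv == 0)).sum() / (class_iv == 0).sum()
print(f"AUC = {auc:.3f}, sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```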
Subjects
Carbon Dioxide, Hemorrhagic Shock, Animals, Male, Macaca mulatta, Blood Pressure/physiology, Hemorrhagic Shock/therapy, Hemorrhage/etiology, Hemorrhage/therapy
ABSTRACT
Hypovolemia is a physiological state of reduced blood volume that can exist as either (1) absolute hypovolemia because of a lower circulating blood (plasma) volume for a given vascular space (dehydration, hemorrhage) or (2) relative hypovolemia resulting from an expanded vascular space (vasodilation) for a given circulating blood volume (e.g., heat stress, hypoxia, sepsis). This paper examines the physiology of hypovolemia and its association with health and performance problems common to occupational, military and sports medicine. We discuss the maturation of individual-specific compensatory reserve or decompensation measures for future wearable sensor systems to effectively manage these hypovolemia problems. The paper then presents areas of future work to allow such technologies to translate from lab settings to use as decision aids for managing hypovolemia. We envision a future that incorporates elements of the compensatory reserve measure with advances in sensing technology and multiple modalities of cardiovascular sensing, additional contextual measures, and advanced noise reduction algorithms into a fully wearable system, creating a robust and physiologically sound approach to manage physical work, fatigue, safety and health issues associated with hypovolemia for workers, warfighters and athletes in austere conditions.
Subjects
Military Personnel, Sports Medicine, Wearable Electronic Devices, Algorithms, Humans, Hypovolemia/diagnosis, Machine Learning
ABSTRACT
The application of artificial intelligence (AI) has provided new capabilities to develop advanced medical monitoring sensors for detection of clinical conditions of low circulating blood volume such as hemorrhage. The purpose of this study was to compare for the first time the discriminative ability of two machine learning (ML) algorithms based on real-time feature analysis of arterial waveforms obtained from a non-invasive continuous blood pressure monitoring system (Finometer®) to predict the onset of decompensated shock: the compensatory reserve index (CRI) and the compensatory reserve metric (CRM). One hundred ninety-one healthy volunteers underwent progressive simulated hemorrhage using lower body negative pressure (LBNP). The least squares means and standard deviations for each measure were assessed by LBNP level and stratified by tolerance status (high vs. low tolerance to central hypovolemia). Generalized linear mixed models were used to perform repeated-measures logistic regression analysis by regressing the onset of decompensated shock on CRI and CRM. Sensitivity and specificity were assessed by calculating the receiver operating characteristic (ROC) area under the curve (AUC) for CRI and CRM. Values for CRI and CRM were not distinguishable across levels of LBNP independent of tolerance classification, and the CRM ROC AUC (0.9268) was statistically similar (p = 0.134) to the CRI ROC AUC (0.9164). Both the CRI and CRM ML algorithms displayed discriminative ability to predict decompensated shock, including in individual subjects with varying levels of tolerance to central hypovolemia. Arterial waveform feature analysis provides a highly sensitive and specific monitoring approach for the detection of ongoing hemorrhage, particularly for those patients at greatest risk for early onset of decompensated shock and requirement for implementation of life-saving interventions.
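As a rough illustration of the repeated-measures logistic regression described above, the sketch below uses generalized estimating equations (GEE) as a stand-in for the generalized linear mixed models named in the abstract; the subjects, CRM values, and decompensation labels are hypothetical.

```python
# Illustrative sketch only: regressing decompensation onset on a compensatory
# reserve value with subject-clustered repeated measures. GEE is used here as
# a stand-in for the generalized linear mixed models named in the abstract;
# the data are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subject per LBNP stage.
df = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "crm":     [0.90, 0.35, 0.20, 0.80, 0.40, 0.10, 0.95, 0.60, 0.35],
    "decomp":  [0, 0, 1, 0, 1, 1, 0, 0, 1],
})

model = smf.gee(
    "decomp ~ crm",                       # probability of decompensation vs. CRM
    groups="subject",                     # repeated measures clustered by subject
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```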
Subjects
Artificial Intelligence, Hypovolemia, Algorithms, Blood Pressure/physiology, Blood Volume/physiology, Heart Rate/physiology, Hemodynamics, Hemorrhage/diagnosis, Humans, Hypovolemia/diagnosis, Machine Learning
ABSTRACT
Traumatic brain injury (TBI) and hemorrhage remain challenging to treat in austere conditions. Developing a therapeutic to mitigate the associated pathophysiology is critical to meet this treatment gap, especially because these injuries and their associated high mortality are potentially preventable. Here, Thera-101 (T-101) was evaluated as a low-volume resuscitative fluid in a rat model of TBI and hemorrhage. The therapeutic, T-101, is uniquely situated as a TBI and hemorrhage intervention. It contains a cocktail of proteins and microvesicles from the secretome of adipose-derived mesenchymal stromal cells that can act on repair and regenerative mechanisms associated with polytrauma. T-101 efficacy was determined at 4, 24, 48, and 72 h post-injury by evaluating blood chemistry, inflammatory chemo/cytokines, histology, and diffusion tensor imaging. Blood chemistry indicated that T-101 reduced markers of liver damage to sham levels, whereas the levels remained elevated with the control (saline) resuscitative fluid. Histology supports the potential protective effects of T-101 on the kidneys. Diffusion tensor imaging showed that the injury caused the most damage to the corpus callosum and the fimbria. Immunohistochemistry suggests that T-101 may mitigate astrocyte activation at 72 h. Together, these data suggest that T-101 may serve as a potential field-deployable low-volume resuscitation therapeutic.
Subjects
Traumatic Brain Injuries, Multiple Trauma, Animals, Rats, Diffusion Tensor Imaging, Animal Disease Models, Multiple Trauma/therapy, Traumatic Brain Injuries/drug therapy, Hemorrhage/complications, Cytokines/therapeutic use
ABSTRACT
BACKGROUND: The physiological response to hemorrhage includes vasoconstriction in an effort to shunt blood to the heart and brain. Hemorrhaging patients can be classified as "good" compensators who demonstrate high tolerance (HT) or "poor" compensators who manifest low tolerance (LT) to central hypovolemia. Compensatory vasoconstriction is manifested by lower tissue oxygen saturation (StO2), which has promoted this measure as a possible early marker of shock. The compensatory reserve measurement (CRM) has also shown promise as an early indicator of decompensation. METHODS: Fifty-one healthy volunteers (37% LT) were subjected to progressive lower body negative pressure (LBNP) as a model of controlled hemorrhage designed to induce the onset of decompensation. During LBNP, CRM was determined by arterial waveform feature analysis. StO2, muscle pH, and muscle H+ concentration were calculated from spectra acquired using near-infrared spectroscopy (NIRS) on the forearm. RESULTS: These values were statistically indistinguishable between HT and LT participants at baseline (p ≥ 0.25). HT participants exhibited lower (p = 0.01) StO2 at decompensation compared to LT participants. CONCLUSIONS: Lower StO2 measured in patients during low-flow states associated with significant hemorrhage does not necessarily translate to a more compromised physiological state, but may reflect a greater resistance to the onset of shock. Only the CRM was able to distinguish between HT and LT participants early in the course of hemorrhage, supported by a significantly greater ROC AUC (0.90) compared with StO2 (0.68). These results support the notion that measures of StO2 could be misleading for triage and resuscitation decision support.
Subjects
Blood Volume/physiology, Oxygen Consumption/physiology, Adult, Area Under Curve, Blood Pressure, Female, Healthy Volunteers, Hemodynamics, Hemoglobins/analysis, Humans, Hydrogen-Ion Concentration, Lower Body Negative Pressure/methods, Male, Skeletal Muscle/physiology, ROC Curve, Young Adult
ABSTRACT
Vital signs have historically served as the primary method to triage patients and resources for trauma and emergency care, but they have failed to provide clinically meaningful predictive information about patient clinical status. In this review, a framework is presented that focuses on potential wearable sensor technologies that can harness the necessary electronic physiological signal integration with a current state-of-the-art predictive machine-learning algorithm to provide early clinical assessment of hypovolemia status and impact patient outcome. The ability to study the physiology of hemorrhage using a human model of progressive central hypovolemia led to the development of a novel machine-learning algorithm known as the compensatory reserve measurement (CRM). The CRM has demonstrated greater sensitivity, specificity, and diagnostic accuracy for detecting hemorrhage and the onset of decompensated shock than all standard vital signs and hemodynamic variables. The development of the CRM revealed that continuous measurements of changes in arterial waveform features represent the most integrated signal of physiological compensation for conditions of reduced systemic oxygen delivery. In this review, sensor technologies that include photoplethysmography, tonometry, ultrasound-based blood pressure, and cardiogenic vibration are analyzed in detail and identified as potential candidates for harnessing the arterial waveform analog features required for real-time calculation of the CRM. The integration of wearable sensors with the CRM algorithm provides a potentially powerful medical monitoring advancement to save civilian and military lives in emergency medical settings.
Subjects
Hemorrhage/diagnosis, Hypovolemia, Physiologic Monitoring/instrumentation, Wearable Electronic Devices, Wounds and Injuries/diagnosis, Algorithms, Hemodynamics, Humans, Hypovolemia/diagnosis
ABSTRACT
The US Food and Drug Administration (FDA) held a workshop on red blood cell (RBC) product regulatory science on October 6 and 7, 2016, at the Natcher Conference Center on the National Institutes of Health (NIH) Campus in Bethesda, Maryland. The workshop was supported by the National Heart, Lung, and Blood Institute, NIH; the Department of Defense; the Office of the Assistant Secretary for Health, Department of Health and Human Services; and the Center for Biologics Evaluation and Research, FDA. The workshop reviewed the status and scientific basis of the current regulatory framework and the available scientific tools to expand it to evaluate innovative and future RBC transfusion products. A full record of the proceedings is available on the FDA website (http://www.fda.gov/BiologicsBloodVaccines/NewsEvents/WorkshopsMeetingsConferences/ucm507890.htm). The contents of the summary are the authors' opinions and do not represent agency policy.
Subjects
Erythrocytes, United States Food and Drug Administration, Adult, Animals, Biological Products, Blood Preservation/standards, Blood Safety/standards, Child, Erythrocyte Transfusion, Humans, Animal Models, Randomized Controlled Trials as Topic, Transfusion Reaction, United States, United States Food and Drug Administration/standards
ABSTRACT
ABSTRACT: In recent years, it has become apparent that fibrinolytic dysfunction and endotheliopathy develop in up to 40% of patients during the first hours following thermal injury and are associated with poor outcomes and increased resuscitation requirements. The fibrinolytic system is activated rapidly after burn injury, with activation generally greater with increased severity of injury. Very high plasma concentrations of plasmin-antiplasmin complex (a marker of activation) have been associated with mortality. Patients display hyperfibrinolytic, physiologic/normal, or hypofibrinolytic/fibrinolytic shutdown phenotypes, as assessed by viscoelastic assay. Phenotypes change in over 50% of patients during the acute burn resuscitation period, with some patterns (maladaptive) associated with increased mortality risk and others (adaptive, trending toward the physiologic phenotype) associated with survival. Endotheliopathy, as reflected in elevated plasma concentrations of syndecan-1, has also been associated with increased mortality. Here we review the incidence and effects of these responses after burn injury and explore mechanisms and potential interactions with the early inflammatory response. Available data from burn and non-burn trauma suggest that the fibrinolytic, endothelial, and inflammatory systems interact extensively and that dysregulation in one may exacerbate dysregulation in the others. This raises the possibility that successful treatment of one may favorably impact the others.
ABSTRACT
BACKGROUND: Hemorrhage control in prolonged field care (PFC) presents unique challenges that drive the need for enhanced point-of-injury treatment capabilities to maintain patient stability beyond the Golden Hour. To address this, two hemostatic agents, Combat Gauze (CG) and XSTAT, were evaluated in a porcine model of uncontrolled junctional hemorrhage for speed of deployment and hemostatic efficacy over 72 hours. METHODS: The left subclavian artery and subscapular vein were isolated in anesthetized male Yorkshire swine (70-85 kg) and injured via 50% transection, followed by 30 seconds of hemorrhage. Combat Gauze (n = 6) or XSTAT (n = 6) was administered until bleeding stopped and remained in place for observation over 72 hours. Physiologic monitoring, hemostatic efficacy, and hematological parameters were measured throughout the protocol. Gross necropsy and histology were performed following humane euthanasia. RESULTS: Both CG and XSTAT maintained hemostasis throughout the full duration of the protocol. There were no significant differences between groups in hemorrhage volume (CG: 1021.0 ± 183.7 mL vs. XSTAT: 968.2 ± 243.3 mL), total blood loss (CG: 20.8 ± 2.7% vs. XSTAT: 20.1 ± 5.1%), or number of devices used (CG: 3.8 ± 1.2 vs. XSTAT: 5.3 ± 1.4). XSTAT absorbed significantly more blood than CG (CG: 199.5 ± 50.3 mL vs. XSTAT: 327.6 ± 71.4 mL) and was significantly faster to administer (CG: 3.4 ± 1.6 minutes vs. XSTAT: 1.4 ± 0.5 minutes). There were no significant changes in activated clotting time, prothrombin time, or international normalized ratio between groups or compared with baseline throughout the 72-hour protocol. Histopathology revealed no evidence of microthromboemboli or disseminated coagulopathy across evaluated tissues in either group. CONCLUSION: Combat Gauze and XSTAT demonstrated equivalent hemostatic ability through 72 hours, with no overt evidence of coagulopathy from prolonged indwelling. In addition, XSTAT offered significantly faster administration and the ability to absorb more blood. Taken together, these findings indicate that XSTAT offers logistical and efficiency advantages over CG for immediate control of junctional noncompressible hemorrhage, particularly in a tactical environment. In addition, extension of the indicated timeline to 72 hours supports translation to PFC.
Subjects
Hemostatics, Swine, Male, Humans, Animals, Hemostatics/therapeutic use, Animal Disease Models, Hemorrhage/therapy, Exsanguination/therapy, Hemostasis, Hemostatic Techniques
ABSTRACT
Objectives: Prehospital transfusion can be life-saving when transport is delayed, but conventional plasma, red cells, and whole blood are often unavailable out of hospital. Shelf-stable products are needed as a temporary bridge to in-hospital transfusion. Bioplasma FDP (freeze-dried plasma) and Hemopure (hemoglobin-based oxygen carrier; HBOC) are products with potential for prehospital use. In vivo use of these products together has not been reported. This study assessed the safety of intravenous administration of HBOC+FDP, relative to normal saline (NS), in rhesus macaques (RM). Methods: After 30% blood volume removal and 30 minutes in shock, animals were resuscitated over 60 minutes with either NS or two units (RM size adjusted) each of HBOC+FDP. Sequential blood samples were collected. After neurological assessment, animals were killed at 24 hours and tissues collected for histopathology. Results: Due to a shortage of RM during the COVID-19 pandemic, the study was stopped after nine animals (HBOC+FDP, seven; NS, two). All animals displayed physiologic and tissue changes consistent with hemorrhagic shock and recovered normally. There was no pattern of cardiovascular, blood gas, metabolic, coagulation, histologic, or neurological changes suggestive of risk associated with HBOC+FDP. Conclusion: There was no evidence of harm associated with the combined use of Hemopure and Bioplasma FDP. No differences were noted between groups in safety-related cardiovascular, pulmonary, renal, or other organ or metabolic parameters. Hemostasis- and thrombosis-related parameters were consistent with expected responses to hemorrhagic shock and did not differ between groups. All animals survived normally with intact neurological function. Level of evidence: Not applicable.
ABSTRACT
INTRODUCTION: Knowing when suicidal ideation (SI) or suicide attempt (SA) is most likely to occur in a deployed environment would aid in focusing prevention efforts. This study aims to determine when evacuation for SA or SI is most likely to occur based on the absolute and relative number of months into a deployment. MATERIALS AND METHODS: This is a case-control study of active-duty military personnel evacuated from the U.S. Central Command area of responsibility for SI or an SA between April 1, 2020, and March 30, 2021. The arrival month and expected departure month were identified for all included evacuees. The month of evacuation and the proportion of completed deployment were compared. Secondary outcomes of mental health diagnosis or need for a waiver were also examined. RESULTS: A total of 138 personnel evacuated for SI or attempted suicide during the 12-month study period were included in the analysis. Evacuations during month 3 of deployment were significantly more frequent (P < .0001) than those during other months. The 30% and 50% completion points of deployment also had significantly higher frequencies of evacuations for SI/SA (P < .0001). A secondary analysis revealed that 25.4% of the individuals had a documented preexisting behavioral health condition before deployment (P < .0001). CONCLUSION: Specific points along a deployment timeline were significant predictors of evacuation for SI and SA.
Subjects
Military Personnel, Attempted Suicide, Humans, Attempted Suicide/psychology, Suicidal Ideation, Case-Control Studies, Incidence, Military Personnel/psychology, Risk Factors
ABSTRACT
BACKGROUND: The shock index (SI) equals the ratio of heart rate (HR) to systolic blood pressure (SBP), with clinical evidence that it is more sensitive for trauma patient status assessment and prediction of outcome than either HR or SBP alone. We used lower body negative pressure (LBNP) as a human model of central hypovolemia and the compensatory reserve measurement (CRM), validated for accurate tracking of reduced central blood volume, to test the hypotheses that SI: (1) presents a late signal of central blood volume status; (2) displays poor sensitivity and specificity for predicting the onset of hemodynamic decompensation; and (3) cannot identify individuals at greatest risk for the onset of circulatory shock. METHODS: We measured HR, SBP, and CRM in 172 human subjects (19-55 years) during progressive LBNP designed to determine tolerance to central hypovolemia as a model of hemorrhage. Subjects were subsequently divided into those with high tolerance (HT) (n = 118) and low tolerance (LT) (n = 54) based on completion of 60 mm Hg LBNP. The time course relationship between SI and CRM was determined, and receiver operating characteristic (ROC) area under the curve (AUC) was calculated for the sensitivity and specificity of CRM and SI to predict hemodynamic decompensation using clinically defined thresholds of 40% for CRM and 0.9 for SI. RESULTS: The time and level of LBNP required to reach an SI of 0.9 (~60 mm Hg LBNP) were significantly greater (p < 0.001) than for CRM, which reached 40% at ~40 mm Hg LBNP. Shock index did not differ between HT and LT subjects at the 45 mm Hg LBNP level. ROC AUC for CRM was 0.95 (95% CI = 0.94-0.97) compared with 0.91 (0.89-0.94) for SI (p = 0.0002). CONCLUSION: Despite high sensitivity and specificity, SI delays the time to detect reductions in central blood volume and fails to distinguish individuals with varying tolerances to central hypovolemia. LEVEL OF EVIDENCE: Diagnostic Test or Criteria; Level III.
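To make the arithmetic concrete, here is a minimal sketch (hypothetical vital-sign values, not study data) of computing SI = HR/SBP and checking a reading against the clinically cited thresholds of 0.9 for SI and 40% for CRM.

```python
# Minimal sketch (hypothetical values): computing the shock index and checking
# it against the clinically cited threshold of 0.9, alongside a compensatory
# reserve reading checked against its 40% threshold.
def shock_index(heart_rate_bpm: float, systolic_bp_mmhg: float) -> float:
    """SI = HR / SBP."""
    return heart_rate_bpm / systolic_bp_mmhg

# Example: HR 105 bpm, SBP 110 mm Hg -> SI ~ 0.95 (above the 0.9 threshold),
# even though neither vital sign alone looks alarming.
si = shock_index(105, 110)
crm_percent = 35.0  # hypothetical compensatory reserve reading

print(f"SI = {si:.2f} -> {'flag' if si >= 0.9 else 'ok'}")
print(f"CRM = {crm_percent:.0f}% -> {'flag' if crm_percent < 40 else 'ok'}")
```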
Subjects
Hemodynamics, Hypovolemia, Humans, Hypovolemia/diagnosis, Hemodynamics/physiology, Blood Volume/physiology, Blood Pressure/physiology, Heart Rate/physiology, Lower Body Negative Pressure
ABSTRACT
Background: Immersive virtual reality (iVR)-based digital therapeutics are gaining clinical attention in the field of pain management. Based on known analogies between pain and dyspnoea, we investigated the effects of visual respiratory feedback on persistent dyspnoea in patients recovering from coronavirus disease 2019 (COVID-19) pneumonia. Methods: We performed a controlled, randomised, single-blind, crossover proof-of-concept study (feasibility and initial clinical efficacy) to evaluate an iVR-based intervention to alleviate dyspnoea in patients recovering from COVID-19 pneumonia. Included patients reported persistent dyspnoea (≥5 on a 10-point scale) and preserved cognitive function (Montreal Cognitive Assessment score >24). Assignment was random and concealed. Patients received synchronous (intervention) or asynchronous (control) feedback of their breathing, embodied via a gender-matched virtual body. The virtual body flashed in a waxing and waning visual effect that could be synchronous or asynchronous to the patient's respiratory movements. Outcomes were assessed using questionnaires and breathing recordings. Results: Study enrolment was open between November 2020 and April 2021. 26 patients were enrolled (27% women; median age 55 years, interquartile range (IQR) 18 years). Data were available for 24 of 26 patients. The median rating on a 7-point Likert scale of breathing comfort improved from 1 (IQR 2) at baseline to 2 (IQR 1) for synchronous feedback, but remained unchanged at 1 (IQR 1.5) for asynchronous feedback (p<0.05 between iVR conditions). Moreover, 91.2% of all patients were satisfied with the intervention (p<0.0001) and 66.7% perceived it as beneficial for their breathing (p<0.05). Conclusion: Our iVR-based digital therapy presents a feasible and safe respiratory rehabilitation tool that improves breathing comfort in patients recovering from COVID-19 infection presenting with persistent dyspnoea. Future research should investigate the intervention's generalisability to persistent dyspnoea with other aetiologies and its potential for preventing chronification.
ABSTRACT
Background: Although hemorrhage remains the leading cause of survivable death in casualties, modern conflicts are becoming more austere, limiting available resources, including resuscitation products. Limited resources are also accompanied by prolonged evacuation times, leaving suboptimal prehospital field care conditions. When blood products are limited or unavailable, crystalloid becomes the resuscitation fluid of choice. However, there is concern about continuous crystalloid infusion over a prolonged period to achieve hemodynamic stability for a patient. This study evaluates the effect of hemodilution from a 6-hour prehospital hypotensive phase on coagulation in a porcine model of severe hemorrhagic shock. Methods: Adult male swine (n=5/group) were randomized into three experimental groups. The non-shock (NS)/normotensive group did not undergo injury and served as controls. The NS/permissive hypotension (PH) group was bled to the PH target of systolic blood pressure (SBP) 85±5 mm Hg for 6 hours of prolonged field care (PFC), with SBP maintained via crystalloid, then recovered. The experimental group underwent controlled hemorrhage to a mean arterial pressure of 30 mm Hg until decompensation (Decomp/PH), followed by PH resuscitation with crystalloid for 6 hours. Hemorrhaged animals were then resuscitated with whole blood and recovered. Blood samples were collected at defined time points for analysis of complete blood counts, coagulation function, and inflammation. Results: Throughout the 6-hour PFC, hematocrit, hemoglobin, and platelets showed significant decreases over time in the Decomp/PH group compared with the other groups, indicating hemodilution. However, this was corrected with whole blood resuscitation. Despite the appearance of hemodilution, coagulation and perfusion parameters were not severely compromised. Conclusions: Although significant hemodilution occurred, there was minimal impact on coagulation and endothelial function. This suggests that it is possible to maintain the SBP target at this hemodilution threshold to preserve perfusion of vital organs in resource-constrained environments. Future studies should address therapeutics that can mitigate potential hemodilutional effects, such as lack of fibrinogen or platelets. Level of evidence: Not applicable-Basic Animal Research.
ABSTRACT
ABSTRACT: Introduction: Traumatic shock and hemorrhage (TSH) is a leading cause of preventable death in military and civilian populations. Using a TSH model, we compared plasma with whole blood (WB) as prehospital interventions, evaluating restoration of cerebral tissue oxygen saturation (CrSO2), systemic hemodynamics, colloid osmotic pressure (COP), and arterial lactate, hypothesizing that plasma would function in a noninferior capacity to WB despite dilution of hemoglobin (Hgb). Methods: Ten anesthetized male rhesus macaques underwent TSH before randomization to receive a bolus of O(-) WB or AB(+) plasma at T0. At T60, simulating hospital arrival, injury repair began and shed blood (SB) was given to maintain MAP > 65 mm Hg. Hematologic data and vital signs were analyzed via t test and two-way repeated-measures ANOVA; data are presented as mean ± SD, with significance at P < 0.05. Results: There were no significant group differences in shock time, SB volume, or hospital SB. At T0, MAP and CrSO2 declined significantly from baseline, though not between groups, normalizing to baseline by T10. Colloid osmotic pressure declined significantly in each group from baseline at T0 but was restored by T30, despite significant differences in Hgb (WB 11.7 ± 1.5 vs. plasma 6.2 ± 0.8 g/dL). Peak lactate at T30 was significantly higher than baseline in both groups (WB 6.6 ± 4.9 vs. plasma 5.7 ± 1.6 mmol/L), declining equivalently by T60. Conclusions: Plasma restored hemodynamic support and CrSO2 in a capacity not inferior to WB, despite the absence of additional Hgb supplementation. This was substantiated by the return of physiologic COP levels, restoring oxygen delivery to the microcirculation, and demonstrates that restoring oxygenation after TSH involves more than simply increasing oxygen-carrying capacity.
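As a rough illustration of the group-by-time comparison described above, the sketch below fits a linear mixed-effects model as a stand-in for a two-way repeated-measures ANOVA; the animals, time points, and CrSO2 values are hypothetical, not study data.

```python
# Illustrative sketch only: a linear mixed-effects model used as a stand-in for
# a two-way repeated-measures ANOVA (group x time); all values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "animal": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 6],
    "group":  ["WB"] * 9 + ["plasma"] * 9,
    "time":   ["T0", "T10", "T30"] * 6,
    "crso2":  [47, 61, 64, 50, 63, 66, 46, 59, 65,
               49, 58, 62, 52, 60, 63, 48, 57, 61],
})

# A random intercept per animal accounts for repeated measures; the group x time
# interaction asks whether recovery trajectories differ between fluids.
fit = smf.mixedlm("crso2 ~ group * time", data=df, groups="animal").fit()
print(fit.summary())
```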
ABSTRACT
INTRODUCTION: Traumatic brain injury (TBI) is a major health issue for deployed service members and has become more common in recent conflicts; however, risk factors and trends are not well described. This study aims to characterize the epidemiology of TBI in U.S. service members and the potential impacts of changes in policy, care, equipment, and tactics over the 15 years studied. METHODS: Retrospective analysis of U.S. Department of Defense Trauma Registry data (2002-2016) was performed on service members treated for TBI at Role 3 medical treatment facilities in Iraq and Afghanistan. Risk factors and trends in TBI were examined in 2021 using Joinpoint regression and logistic regression. RESULTS: Nearly one third of the 29,735 injured service members (32.4%) reaching Role 3 medical treatment facilities had TBI. The majority sustained mild (75.8%), followed by moderate (11.6%) and severe (10.6%) TBI. The proportion with TBI was higher in males than in females (32.6% vs 25.3%; p<0.001), in Afghanistan than in Iraq (43.8% vs 25.5%; p<0.001), and in battle injuries than in nonbattle injuries (38.6% vs 21.9%; p<0.001). Patients with moderate or severe TBI were more likely to have polytrauma (p<0.001). The proportion of TBI increased over time, primarily in mild TBI (p=0.02) and slightly in moderate TBI (p=0.04), rising most rapidly between 2005 and 2011 with a 2.48% annual increase. CONCLUSIONS: One third of injured service members at Role 3 medical treatment facilities experienced TBI. Findings suggest that additional preventive measures may decrease TBI frequency and severity. Clinical guidelines for field management of mild TBI may reduce the burden on evacuation and hospital systems. Additional capabilities may be needed for military field hospitals.
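As a simplified illustration of the logistic-regression portion of such an analysis, the sketch below models TBI status on binary indicators for sex, theater, and battle injury; the columns and records are invented for illustration and are not registry data.

```python
# Simplified, hypothetical sketch of a logistic regression on TBI risk factors;
# the columns and records are illustrative, not DoD Trauma Registry data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

records = pd.DataFrame({
    "tbi":    [1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0],
    "male":   [1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1],
    "afghan": [1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1],  # theater indicator
    "battle": [1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0],  # battle vs. nonbattle injury
})

fit = smf.logit("tbi ~ male + afghan + battle", data=records).fit(disp=False)
print(np.exp(fit.params).round(2))  # odds ratios for each risk factor
```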
Subjects
Brain Concussion, Traumatic Brain Injuries, Military Personnel, Male, Female, Humans, United States/epidemiology, Retrospective Studies, Afghanistan/epidemiology, Iraq/epidemiology, Iraq War 2003-2011, Afghan Campaign 2001-, Traumatic Brain Injuries/epidemiology, Traumatic Brain Injuries/therapy
ABSTRACT
ABSTRACT: Hemorrhagic shock remains the leading cause of mortality in civilian trauma and battlefield settings. The ability of combat medics and other military medical personnel to obtain early identification and assessment of a bleeding casualty is hampered by the use of standard vital signs, which fail to provide early predictive indicators of the onset of shock because of compensatory mechanisms. Over the past decade, the emergence and application of new technologies that incorporate artificial intelligence have revealed that continuous, real-time arterial waveform analysis (AWFA) reflects the recruitment of such compensatory mechanisms. As such, AWFA can provide early hemorrhage detection and indication of the onset of overt shock compared with standard vital signs. In this review, we provide for the first time a summary of clinical data collected in patients with varying conditions of blood loss, sepsis, and resuscitation, with direct comparison of AWFA and standard vital signs. Receiver operating characteristic area under the curve data clearly demonstrate that AWFA provides greater accuracy and earlier indication of changes in blood volume than standard vital signs. The consistently greater sensitivity of AWFA compared with vital signs is associated with its ability to provide earlier hemorrhage detection, while its higher specificity reflects its propensity to distinguish "poor" compensators (i.e., those with relatively low tolerance to blood loss) from "good" compensators. The data presented in this review demonstrate that integration of AWFA into medical monitoring capabilities has the potential to improve clinical outcomes of casualties by providing earlier and individualized assessment of blood loss and resuscitation.
Subjects
Artificial Intelligence, Hemorrhagic Shock, Hemorrhage/diagnosis, Hemorrhage/etiology, Hemorrhage/therapy, Humans, Physiologic Monitoring, Resuscitation/adverse effects, Hemorrhagic Shock/etiology
ABSTRACT
BACKGROUND: Military involvement in Afghanistan ended in 2021, and while low-intensity troop engagements continue globally, casualty numbers are dwindling. To understand the clinical and operational connections between blood utilization and clinical paradigm shifts in resuscitation strategies, a review of blood product utilization and its changes over the last decade was conducted within the US Central Command area of responsibility. The intent of this review was to assess patterns of blood use during the last decade of the United States' involvement in the most recent major conflicts to potentially inform future blood requirements. METHODS: Blood products and types transfused between January 1, 2011, and December 31, 2020, were acquired from the Medical Situational Awareness in Theater blood reports. All reported blood usage data in the US Central Command area of responsibility were queried. RESULTS: Packed red blood cell and fresh frozen plasma (FFP) usage showed no statistically significant change over time (τb = 0.24, p = 0.3252; τb = -0.47, p = 0.0603). Fresh and stored whole blood (SWB) use increased over time (τb = 0.69, p = 0.0056; τb = 0.83, p = 0.0015). A strong inverse relationship was found between SWB and FFP usage (r = -0.68, p = 0.0309) and between liquid plasma and FFP usage (r = -0.65, p = 0.0407) over time. CONCLUSION: Whole blood usage increased significantly over time, with a preference for SWB. Component therapy is anticipated to remain a critical element of resuscitation in the event of large-scale combat operations, secondary to supply chain constraints and longer storage times. LEVEL OF EVIDENCE: Therapeutic/care management; Level III.
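A minimal sketch (hypothetical yearly totals, not the reported data) of the Kendall tau-b trend test used for these usage-over-time comparisons:

```python
# Minimal sketch (hypothetical yearly totals): testing for a monotonic trend in
# annual whole-blood usage with Kendall's tau-b, as reported in the abstract.
from scipy.stats import kendalltau

years = list(range(2011, 2021))
# Hypothetical annual stored whole blood units transfused, for illustration only.
swb_units = [5, 8, 12, 10, 20, 35, 60, 85, 120, 150]

tau, p_value = kendalltau(years, swb_units)  # tau-b handles ties by default
print(f"tau-b = {tau:.2f}, p = {p_value:.4f}")
```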
Subjects
Military Medicine, Military Personnel, Blood Transfusion, Humans, Plasma, Resuscitation, United States
ABSTRACT
ABSTRACT: Traumatic brain injury (TBI) is associated with increased morbidity and mortality in civilian trauma and battlefield settings. It has been classified across a continuum of dysfunction, with as much as 80% to 90% of cases in combat casualties diagnosed as mild to moderate. In this report, a framework is presented that focuses on the potential benefits of acute, noninvasive treatment of the reduced cerebral perfusion associated with mild TBI by harnessing the natural transfer of negative intrathoracic pressure during inspiration. This process is known as intrathoracic pressure regulation (IPR) therapy, which can be applied by having a patient breathe against a small inspiratory resistance created by an impedance threshold device. IPR therapy leverages two fundamental principles for improving blood flow to the brain: (1) greater negative intrathoracic pressure enhances venous return, cardiac output, and arterial blood pressure; and (2) lowering of intracranial pressure provides less resistance to cerebral blood flow. These two effects work together to produce a greater pressure gradient, resulting in an improvement in cerebral perfusion pressure. In this way, IPR therapy has the potential to counter hypotension and hypoxia, potentially significant contributing factors to secondary brain injury, particularly in conditions of multiple injuries that include severe hemorrhage. By implementing IPR therapy in patients with mild-to-moderate TBI, a potential exists to provide early neuroprotection at the point of injury and a bridge to more definitive care, particularly in settings of prolonged delays in evacuation such as those anticipated in future multidomain operations. LEVEL OF EVIDENCE: Report.