Results 1 - 20 of 94

1.
Mil Med; 185(Supplement_1): 413-419, 2020 Jan 07.
Article in English | MEDLINE | ID: mdl-32074349

ABSTRACT

INTRODUCTION: Musculoskeletal (MSK) conditions are commonly seen among military service members (SM) and Veterans. We explored correlates of award of MSK-related service-connected disability benefits (SCDB) among SM seeking care in Veterans Affairs (VA) hospitals. MATERIALS AND METHODS: Department of Defense data on SM who separated from October 1, 2001 to May 2017 were linked to VA administrative data. Using adjusted logistic regression models, we determined the odds of receiving MSK SCDB. RESULTS: A total of 1,558,449 (79% of separating SM) had at least one encounter in VA during the study period (7.8% disability separations). Overall, 51% of this cohort had at least one MSK SCDB (88% among disability separations, 48% among normal separations). Those with disability separations (as compared to normal separations) were significantly more likely to receive MSK SCDB (odds ratio 2.37), as were females (compared to males, odds ratio 1.15). CONCLUSIONS: Although active duty SM with disability separations were more likely to receive MSK-related service-connected disability ratings in the VA, those with normal separations also received such awards. Identifying those at highest risk for MSK-related disability could lead to improved surveillance and prevention strategies in the Department of Defense and VA health care systems to prevent further damage and disability.
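Abstracts like this one report adjusted odds ratios from multivariable logistic regression. As a hedged illustration only, here is a minimal sketch of that kind of model, assuming statsmodels is available; the column names and simulated records are ours, not the study's:

```python
# Minimal sketch of an adjusted logistic regression of award odds, assuming
# statsmodels; columns and data are invented stand-ins, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "msk_scdb": rng.integers(0, 2, n),        # 1 = received an MSK-related award
    "disability_sep": rng.integers(0, 2, n),  # 1 = disability separation
    "female": rng.integers(0, 2, n),
    "age": rng.normal(30, 8, n),
})

fit = smf.logit("msk_scdb ~ disability_sep + female + age", data=df).fit(disp=False)
print(np.exp(fit.params))  # exponentiated coefficients = adjusted odds ratios
```

The same template would cover the PTSD-award model in the next abstract, where the outcome is modeled conditional on a VA PTSD diagnosis.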

2.
Mil Med; 185(Supplement_1): 296-302, 2020 Jan 07.
Article in English | MEDLINE | ID: mdl-32074380

ABSTRACT

INTRODUCTION: We explore disparities in awarding post-traumatic stress disorder (PTSD) service-connected disability benefits (SCDB) to veterans based on gender, race/ethnicity, and misconduct separation. METHODS: Department of Defense data on service members who separated from October 1, 2001 to May 2017 were linked to Veterans Administration (VA) administrative data. Using adjusted logistic regression models, we determined the odds of receiving a PTSD SCDB conditional on a VA diagnosis of PTSD. RESULTS: A total of 1,558,449 (79% of separating service members) had at least one encounter in VA during the study period (12% female, 4.5% misconduct separations). Females (OR 0.72) and Blacks (OR 0.93) were less likely to receive a PTSD award and were nearly equally likely to receive a PTSD diagnosis (OR 0.97, 1.01). Other racial/ethnic minorities were more likely to receive an award and diagnosis, as were those with misconduct separations (award OR 1.3, diagnosis 2.17). CONCLUSIONS: Despite being diagnosed with PTSD at similar rates to their referent categories, females and Black veterans are less likely to receive PTSD disability awards. Other racial/ethnic minorities and those with misconduct separations were more likely to receive PTSD diagnoses and awards. Further study is merited to explore variation in awarding SCDB.

3.
Article in English | MEDLINE | ID: mdl-32044875

ABSTRACT

Hemorrhage remains the leading cause of death following traumatic injury in both civilian and military settings. Heart rate variability (HRV) and complexity (HRC) have been proposed as potential "new vital signs" for monitoring trauma patients; however, the added benefit of HRV or HRC for decision support remains unclear. Another new paradigm, the compensatory reserve measurement (CRM), represents the integration of all cardiopulmonary mechanisms responsible for compensation during relative blood loss and was developed to identify current physiologic status by estimating the progression toward hemodynamic decompensation. In the present study, we hypothesized that CRM would provide greater sensitivity and specificity to detect progressive reductions in central circulating blood volume and onset of decompensation as compared to measurements of HRV and HRC. Continuous, noninvasive measurements of compensatory reserve and electrocardiogram (ECG) signals were made on 101 healthy volunteers during lower body negative pressure (LBNP) to the point of decompensation. Measures of HRV and HRC were taken from ECG signal data. CRM demonstrated a superior sensitivity and specificity (receiver operator characteristic area under the curve [ROC AUC] = 0.93) compared with all HRV measures (ROC AUC ≤ 0.84) and all HRC measures (ROC AUC ≤ 0.86). Sensitivity and specificity values at the ROC optimal thresholds were greater for CRM (sensitivity = 0.84; specificity = 0.84) than for HRV (sensitivity ≤ 0.78; specificity ≤ 0.77) and HRC (sensitivity ≤ 0.79; specificity ≤ 0.77). With standardized values across all levels of LBNP, CRM had a steeper decline, less variability, and explained a greater proportion of the variation in the data than both HRV and HRC during progressive hypovolemia. These findings add to the growing body of literature describing the advantages of CRM for detecting reductions in central blood volume. Most importantly, these results provide further support for the potential use of CRM in the triage and monitoring of patients at highest risk for the onset of shock following blood loss.
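A rough sketch of the kind of ROC comparison reported here, assuming scikit-learn; the labels and scores below are simulated stand-ins, not CRM or ECG-derived measures:

```python
# ROC AUC plus sensitivity/specificity at the Youden-optimal threshold,
# assuming scikit-learn; data are simulated, not the study's LBNP recordings.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
decompensated = rng.integers(0, 2, 500)                # 1 = reached decompensation
score = decompensated * 1.5 + rng.normal(0, 1.0, 500)  # stand-in predictor (e.g., CRM)

auc = roc_auc_score(decompensated, score)
fpr, tpr, _ = roc_curve(decompensated, score)
best = (tpr - fpr).argmax()                            # index maximizing Youden's J
print(f"AUC={auc:.2f} sensitivity={tpr[best]:.2f} specificity={1 - fpr[best]:.2f}")
```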

4.
Article in English | MEDLINE | ID: mdl-32039975

ABSTRACT

BACKGROUND: Comprehensive analyses of battle-injured fatalities, incorporating a multi-disciplinary process with a standardized lexicon, are necessary to elucidate opportunities for improvement (OFI) to increase survivability. METHODS: A mortality review was conducted on United States Special Operations Command (USSOCOM) battle-injured fatalities who died from September 11, 2001 to September 10, 2018. Fatalities were analyzed by demographics, operational posture, mechanism of injury, cause of death, mechanism of death, classification of death, and injury severity. Injury survivability was determined by a subject matter expert panel and compared to injury patterns among Department of Defense Trauma Registry survivors. Death preventability and OFI were determined for fatalities with potentially survivable or survivable injuries (PS-S) using tactical data and documented medical interventions. RESULTS: Of 369 USSOCOM battle-injured fatalities (median age, 29 years; male, 98.6%), most were killed in action (89.4%) and more than half died from injuries sustained during mounted operations (52.3%). The causes of death were blast injury (45.0%), gunshot wound (39.8%), and multiple/blunt force injury (15.2%). The leading mechanism of death was catastrophic tissue destruction (73.7%). Most fatalities sustained non-survivable injuries (74.3%). For fatalities with PS-S injuries, most had hemorrhage as a component of the mechanism of death (88.4%); however, the mechanism of death was multifactorial in the majority of these fatalities (58.9%). Only 5.4% of all fatalities and 21.1% of fatalities with PS-S injuries had injury patterns comparable to those of survivors. Accounting for the tactical situation, a minority of deaths were potentially preventable (5.7%) and a few were preventable (1.1%). Time to surgery (93.7%) and prehospital blood transfusion (89.5%) were the leading OFI for PS-S fatalities. Most fatalities with PS-S injuries requiring blood (83.5%) also had an additional prehospital OFI. CONCLUSIONS: Comprehensive mortality reviews of battlefield fatalities can identify OFI in combat casualty care and prevention. A standardized lexicon is essential for translation to civilian trauma systems. LEVEL OF EVIDENCE: Performance Improvement and Epidemiological, level IV.

5.
J Hypertens; 2020 Jan 27.
Article in English | MEDLINE | ID: mdl-31990903

ABSTRACT

BACKGROUND: Although the long-term effects of combat injury are not well understood, there is emerging concern that exposure to combat environments and subsequent injury may increase the risk of hypertension through changes in inflammatory responses, psychological stress and mental health, and health behaviors. METHODS: Data from the Millennium Cohort Study and the Department of Defense Trauma Registry were used to identify combat-exposed and combat-injured participants. Incident hypertension diagnoses were ascertained from the Millennium Cohort survey. The associations between combat exposure/injury and hypertension risk were estimated using multivariable complementary log-log survival models. RESULTS: The final analysis sample consisted of 38 734 participants. Of these, 50.8% deployed but were not exposed to combat, 48.6% deployed and were exposed to combat, and 0.6% had a combat injury. Overall prevalence of hypertension was 7.6%. Compared with participants who deployed but did not experience combat (mild exposure), elevated odds of hypertension were observed among those who experienced combat but were not wounded (moderate exposure; AOR, 1.28; 95% CI, 1.19-1.38) and those wounded in combat (high exposure; AOR, 1.46; 95% CI, 1.07-2.00). Sleep duration of less than 4 h (AOR, 1.21; 95% CI, 1.03-1.43), sleep duration of 4-6 h (AOR, 1.16; 95% CI, 1.05-1.29), posttraumatic stress disorder (AOR, 1.54; 95% CI, 1.26-1.87), and overweight (AOR, 1.77; 95% CI, 1.61-1.95) and obese (AOR, 2.77; 95% CI, 2.45-3.12) status were also associated with higher odds of hypertension. CONCLUSION: Results support the hypotheses that combat exposure increases hypertension risk and that combat injury exacerbates this risk.
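One common implementation of a "complementary log-log survival model" is a discrete-time binomial GLM with a cloglog link fit to person-period records; the sketch below assumes that setup and statsmodels, with hypothetical variable names and simulated data:

```python
# Discrete-time survival sketch via a binomial GLM with a cloglog link,
# assuming statsmodels; one row per person-period, all data simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "hypertension": rng.integers(0, 2, n),    # event indicator in the period
    "combat_exposed": rng.integers(0, 2, n),
    "combat_injured": rng.integers(0, 2, n),
    "short_sleep": rng.integers(0, 2, n),
})

model = smf.glm(
    "hypertension ~ combat_exposed + combat_injured + short_sleep",
    data=df,
    family=sm.families.Binomial(link=sm.families.links.CLogLog()),
).fit()
print(np.exp(model.params))  # exponentiated cloglog coefficients, AOR-like
```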

6.
J Burn Care Res; 2020 Jan 21.
Article in English | MEDLINE | ID: mdl-31960038

ABSTRACT

Acute kidney injury (AKI) is associated with high mortality in burn patients. Urinary biomarkers can aid in the prediction of AKI and its consequences, such as death and the need for renal replacement therapy (RRT). The purpose of this study was to investigate a novel methodology for detecting urinary biomarkers, the NephroCheck® Test System, and assess its ability to predict death or the need for RRT in burn patients. Burn patients admitted to the United States Army Institute of Surgical Research (USAISR) burn intensive care unit were prospectively enrolled between March 2016 and April 2018. A urine sample was obtained from all study participants using the NephroCheck® system. Patient and injury characteristics were gathered; descriptive statistics were calculated, and multivariable logistic regression analyses were performed using these data. Of the 69 patients in this study, 15 (21.7%) met the composite outcome of death or need for RRT within 30 days of urine collection. NephroCheck® scores were higher for patients with the composite outcome (p = 0.06 for centrifuged scores and p = 0.04 for non-centrifuged scores). Centrifuged and non-centrifuged scores showed high agreement and correlation (R2 = 0.97, p < 0.0001). Non-centrifuged scores were significant in the unadjusted analysis but not in the adjusted analysis. Although these scores had lower sensitivity and negative predictive value compared with other parameters, they had the second highest specificity and positive predictive value. NephroCheck® scores were higher in burn patients with the composite outcome of death or RRT, and they demonstrated sensitivity and specificity comparable to creatinine and TBSA.
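The reported agreement between centrifuged and non-centrifuged scores (R2 = 0.97) is an ordinary regression fit; an illustrative version with simulated values, assuming scipy:

```python
# Agreement/correlation between two versions of a score, assuming scipy;
# the 69 simulated pairs mimic the study's sample size, not its measurements.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
centrifuged = rng.uniform(0.05, 3.0, 69)
non_centrifuged = centrifuged + rng.normal(0, 0.05, 69)  # near-identical scores

fit = linregress(centrifuged, non_centrifuged)
print(f"R^2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.2g}")
```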

7.
Environ Pollut; 255(Pt 3): 113350, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31671370

ABSTRACT

A study was undertaken to test the hypothesis that the presence of fly ash and other artifactual materials (AMs) significantly increases the toxicity of urban soil and street dust. AMs were distinguished as artifacts (artificial particles >2 mm in size) and particulate artifacts (≤2 mm in size); street dust was the <63 µm fraction of street sediments. Reference artifacts, street dusts, and topsoils representing different land use types in Detroit, Michigan were analyzed for miscellaneous radionuclides, trace elements, magnetic susceptibility (MS), and acetic acid-extractable (leachable) Pb. Background levels were established using native glacial sediments. Street sediments were found to have a roadside provenance; hence street dusts inherited their contamination primarily from local soils. All soils and dusts had radionuclide concentrations similar to background levels, and radiological hazard indices within the safe range. Artifacts, fly ash-impacted soils, and street dusts contained elevated concentrations of toxic trace elements, which varied with land use type, but none produced a significant amount of leachable Pb. It is inferred that toxic elements in AMs are not bioavailable because they are occluded within highly insoluble materials. Hence, these results do not support our hypothesis. Rather, AMs contribute to artificially elevated total concentrations, leading to an overestimation of toxicity. MS increased with increasing total concentration, hence proximal sensing can be used to map contamination level, but the weak correlation between total and leachable Pb suggests that such maps do not necessarily indicate the associated biohazard. Home site soils with total Pb concentrations >500 mg kg⁻¹ were sporadically toxic. Thus, these results argue against street dust as the local cause of seasonally elevated blood-Pb levels in children. Lead-bearing home site soil tracked directly indoors to form house dust is an alternative exposure pathway.


Subjects
Environmental Monitoring, Soil Pollutants/toxicity, Child, Cities, Dust/analysis, Humans, Heavy Metals/analysis, Michigan, Risk Assessment/methods, Soil, Soil Pollutants/analysis, Urbanization
8.
Article in English | MEDLINE | ID: mdl-31718019

ABSTRACT

BACKGROUND: School physical activity (PA) policy, physical education curriculum, teacher training, knowledge of physical fitness, and parental support are among the key issues underlying the declining trend of physical fitness in children and adolescents. The Chinese CHAMPS was a multi-faceted intervention program to maximize the opportunities for moderate-to-vigorous physical activity (MVPA) and increase physical fitness in middle school students. The purpose of the study was to test whether the levels of modification in school physical education policy and curriculum incrementally influenced changes in cardiorespiratory fitness (CRF) and other physical fitness outcomes. METHODS: This 8-month study was a cluster randomized controlled trial using a 2 × 2 factorial design. The participants were 680 7th-grade students (mean age = 12.66 years) enrolled in 12 middle schools that were randomly assigned to one of four treatment conditions: school physical education intervention (SPE), afterschool program intervention (ASP), SPE+ASP, and control. The targeted behaviors of the Chinese CHAMPS were the students' sedentary behavior and MVPA. The study outcomes were assessed by a physical fitness test battery at baseline and posttest. Sedentary behavior and MVPA were measured in randomly selected students using observations and accelerometry. RESULTS: The contrasts of the pooled effect of SPE, ASP, and SPE+ASP vs. control, the pooled effect of SPE and SPE+ASP vs. ASP only, and the effect of SPE+ASP vs. ASP on CRF and other physical fitness outcomes were all significant after adjusting for covariates, supporting the study hypothesis. Process evaluation demonstrated high fidelity of the intervention in the targeted student behaviors. CONCLUSIONS: Chinese CHAMPS demonstrated the impact of varying the amount of MVPA and vigorous physical activity (VPA) on physical fitness in middle school students, supporting the need to increase the opportunity for PA in schools and to introduce high-intensity exercises in school-based PA programs. Modification of school policy, quality of the physical education curriculum, and teacher training were important moderators of the improvement in physical fitness. (Trial registration: ChiCTR-IOR-14005388, the Childhood Health, Activity and Motor Performance Study.)

9.
Ann Surg; 2019 Nov 08.
Article in English | MEDLINE | ID: mdl-31714315

ABSTRACT

OBJECTIVE: To determine whether persistent opioid use after injury is associated with subsequent long-term development of clinically recognized opioid abuse. SUMMARY BACKGROUND DATA: Opioid abuse is an epidemic in the United States and trauma can initiate persistent use; however, it remains unclear whether persistent opioid use contributes to the subsequent development of opioid abuse. The care of combat casualties by the Departments of Defense and Veterans Affairs uniquely allows investigation of this long-term outcome. METHODS: This retrospective cohort study randomly selected 10,000 battle-injured United States military personnel. We excluded patients who died during initial hospitalization or within 180 days of discharge, had a preinjury opioid abuse diagnosis, or had missing data in a preselected variable. We defined persistent opioid use as filling an opioid prescription 3 to 6 months after discharge and recorded clinically recognized opioid abuse using relevant diagnosis codes. RESULTS: After exclusion, 9284 subjects were analyzed, 2167 (23.3%) of whom developed persistent opioid use. During a median follow-up time of 8 years, 631 (6.8%) patients developed clinically recognized opioid abuse with a median time to diagnosis of 3 years. Injury severity and discharge opioid prescription amount were associated with persistent opioid use after trauma. After adjusting for patient and injury-specific factors, persistent opioid use was associated with the long-term development of clinically recognized opioid abuse (adjusted hazard ratio, 2.39; 95% confidence interval, 1.99-2.86). CONCLUSIONS: Nearly a quarter of patients filled an opioid prescription 3 to 6 months after discharge, and this persistent use was associated with long-term development of opioid abuse.
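The adjusted hazard ratio here comes from a time-to-event model; a minimal sketch using a Cox proportional-hazards fit from lifelines (our choice of tool; the paper does not name its software), on invented data:

```python
# Cox proportional-hazards sketch of a long-term outcome analysis, assuming
# the lifelines package; covariates and data are invented, not the study's.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 9284                                      # analyzed cohort size from the abstract
df = pd.DataFrame({
    "years": rng.exponential(8.0, n),         # follow-up time
    "abuse_dx": rng.integers(0, 2, n),        # clinically recognized opioid abuse
    "persistent_use": rng.integers(0, 2, n),  # filled opioid Rx 3-6 months post-discharge
    "injury_severity": rng.normal(15, 8, n),
})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="abuse_dx")
print(cph.hazard_ratios_)                     # study reports aHR 2.39 for persistent use
```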

11.
Sci Rep; 9(1): 13767, 2019 Sep 24.
Article in English | MEDLINE | ID: mdl-31551454

ABSTRACT

A mortality review of death caused by injury requires a determination of injury survivability prior to a determination of death preventability. If injuries are nonsurvivable, only non-medical primary prevention strategies have potential to prevent the death. Therefore, objective measures are needed to empirically inform injury survivability from complex anatomic patterns of injury. As a component of injury mortality reviews, network structures show promise to objectively elucidate survivability from complex anatomic patterns of injury resulting from explosive and firearm mechanisms. In this network analysis of 5,703 critically injured combat casualties, patterns of injury among fatalities from explosive mechanisms were associated with both a higher number and severity of anatomic injuries to regions such as the extremities, abdomen, and thorax. Patterns of injuries from a firearm were more isolated to individual body regions with fatal patterns involving more severe injuries to the head and thorax. Each injury generates a specific level of risk as part of an overall anatomic pattern to inform injury survivability not always captured by traditional trauma scoring systems. Network models have potential to further elucidate differences between potentially survivable and nonsurvivable anatomic patterns of injury as part of the mortality review process relevant to improving both the military and civilian trauma care systems.
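One plausible way to assemble such an injury network, assuming networkx: body regions as nodes, with edge weights counting how often two regions are injured in the same casualty. The records below are invented:

```python
# Co-occurrence network of injured body regions, assuming networkx;
# each casualty record is a set of injured regions (invented examples).
import itertools
import networkx as nx

casualties = [
    {"head", "thorax"},
    {"extremities", "abdomen", "thorax"},
    {"extremities", "abdomen"},
    {"head"},
]

G = nx.Graph()
for regions in casualties:
    for a, b in itertools.combinations(sorted(regions), 2):
        weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=weight + 1)  # increment co-occurrence count

print(sorted(G.edges(data=True)))
```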

13.
Ethn Dis; 29(3): 451-462, 2019.
Article in English | MEDLINE | ID: mdl-31367165

ABSTRACT

Objective: To determine: 1) rates of cardiovascular disease (CVD) among individuals with and without prior US military service; and 2) variation in CVD outcomes by race/ethnicity. Methods: We performed a cross-sectional study of the 2011-2016 Behavioral Risk Factor Surveillance System during 2018-2019. Groups with (n=369,844) and without (n=2,491,784) prior service were compared overall and by race/ethnicity. CVD odds were compared using logistic regression. Rate-difference decomposition was used to estimate the relative contributions of covariates to differences in CVD prevalence. Results: CVD was associated with military service (OR=1.34; P<.001). Among non-Hispanic Blacks, prior service was associated with lower odds of CVD (OR=.69; P<.001), fully attenuating the net difference in CVD between individuals with and without prior service. Non-Hispanic Whites who served had the highest odds of CVD, while Hispanics with prior service had the same odds of CVD as non-Hispanic Whites without prior service. After age, smoking and body mass index were the largest contributors to CVD differences by race/ethnicity. Conclusions: Results from this study support an association between prior military service and CVD and highlight differences in this association by race/ethnicity. Knowledge of the modifiable health behaviors that contribute to differences in CVD outcomes could be used to guide prevention efforts.
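The rate-difference decomposition mentioned here can be illustrated with a Kitagawa-style split of a prevalence gap into composition and stratum-rate components (our reading of the method, not the authors' code); the strata and numbers are invented:

```python
# Kitagawa-style decomposition of a prevalence gap into composition and rate
# components; strata and numbers are invented for illustration only.
import numpy as np

rates_served = np.array([0.05, 0.12, 0.25])  # stratum CVD rates, with prior service
rates_none   = np.array([0.04, 0.10, 0.20])  # without prior service
share_served = np.array([0.20, 0.50, 0.30])  # stratum population shares, with service
share_none   = np.array([0.40, 0.40, 0.20])  # without prior service

composition = ((share_served - share_none) * (rates_served + rates_none) / 2).sum()
rate_part = ((rates_served - rates_none) * (share_served + share_none) / 2).sum()
print(f"gap = {composition + rate_part:.4f} "
      f"(composition {composition:.4f}, rates {rate_part:.4f})")
```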

14.
BMC Pediatr; 19(1): 190, 2019 Jun 10.
Article in English | MEDLINE | ID: mdl-31179916

ABSTRACT

BACKGROUND: One in three Head Start children is either overweight or obese. We will test the efficacy of an early childhood obesity prevention program, "¡Míranos! Look at Us, We Are Healthy!" (¡Míranos!), which promotes healthy growth and targets multiple energy balance-related behaviors in predominantly Latino children in Head Start. The ¡Míranos! intervention includes center-based (policy changes, staff development, gross motor program, and nutrition education) and home-based (parent engagement/education and home visits) interventions to address key enablers and barriers in obesity prevention in childcare. In partnership with Head Start, we have demonstrated the feasibility and acceptability of the proposed interventions to influence energy balance-related behaviors favorably in Head Start children. METHODS: Using a three-arm cluster randomized controlled design, 12 Head Start centers will be randomly assigned in equal number to one of three conditions: 1) a combined center- and home-based intervention, 2) center-based intervention only, or 3) comparison. The interventions will be delivered by trained Head Start staff during the academic year. A total of 444 3-year-old children (52% females; n = 37 per center at baseline) in two cohorts will be enrolled in the study and followed prospectively for 1 year post-intervention. Data collection will be conducted at baseline, immediately post-intervention, and at the one-year follow-up and will include height, weight, physical activity (PA) and sedentary behaviors, sleep duration and screen time, gross motor development, dietary intake, and food and activity preferences. Information on family background, parental weight, PA- and nutrition-related practices and behaviors, PA and nutrition policy and environment at center and home, intervention program costs, and treatment fidelity will also be collected. DISCUSSION: With the endorsement and collaboration of two local Head Start administrators, ¡Míranos!, as a culturally tailored obesity prevention program, is poised to provide evidence of efficacy and cost-effectiveness of a policy and environmental approach to prevent early onset of obesity in low-income Latino preschool children. ¡Míranos! can be disseminated to various organized childcare settings, as it is built on the Head Start program and its infrastructure, which set a gold standard for early childhood education, as well as current PA and nutrition recommendations for preschool children. TRIAL REGISTRATION: ClinicalTrials.gov (NCT03590834), July 18, 2018.
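The cluster randomization step is simple to state in code; a sketch with invented center identifiers (the protocol describes only equal allocation, not the actual procedure):

```python
# Equal-allocation randomization of 12 centers across three arms;
# center identifiers are invented stand-ins.
import random

centers = [f"center_{i:02d}" for i in range(1, 13)]
arms = ["center+home", "center-only", "comparison"] * 4  # 4 centers per arm

random.seed(42)
random.shuffle(centers)
for center, arm in sorted(zip(centers, arms)):
    print(center, "->", arm)
```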

15.
J Trauma Acute Care Surg; 87(3): 645-657, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31045733

ABSTRACT

BACKGROUND: Studies of fatalities from injury and disease guide prevention and treatment efforts for populations at risk. Findings can inform leadership and direct clinical practice guidelines, research, and personnel, training, and equipment requirements. METHODS: A retrospective review and descriptive analysis was conducted of United States Special Operations Command (USSOCOM) fatalities who died while performing duties from September 11, 2001, to September 10, 2018. Characteristics analyzed included subcommand, military activity, operational posture, and manner of death. RESULTS: Of 614 USSOCOM fatalities (median age, 30 years; male, 98.5%), the leading cause of death was injury (97.7%); specifically, multiple/blunt force injury (34.5%), blast injury (30.7%), gunshot wound (GSW; 30.3%), and other (4.5%). Most died outside the United States (87.1%), during combat operations (85.3%), in the prehospital environment (91.5%), and on the day of insult (90.4%). Most fatalities were with the US Army Special Operations Command (67.6%), followed by the Naval Special Warfare Command (16.0%), Air Force Special Operations Command (9.3%), and Marine Corps Forces Special Operations Command (7.2%). Of the 54.6% who died of injuries incurred during mounted operations, most were on ground vehicles (53.7%), followed by rotary-wing (37.3%) and fixed-wing (9.0%) aircraft. The manner of death was primarily homicide (66.0%) and accident (30.5%), followed by natural (2.1%), suicide (0.8%), and undetermined (0.7%). Specific homicide causes of death were GSW (43.7%), blast injury (42.2%), multiple/blunt force injury (13.8%), and other (0.2%). Specific accident causes of death were multiple/blunt force injury (80.7%), blast injury (6.4%), GSW (0.5%), and other (12.3%). Among accident fatalities with multiple/blunt force injury, the mechanism was mostly aircraft mishaps (62.9%), particularly rotary-wing (68.4%). CONCLUSION: Most USSOCOM fatalities died abroad from injury in the prehospital setting. To improve survival from military activities worldwide, leaders must continue to optimize prehospital capability and develop strategies that rapidly connect patients to advanced resuscitative and surgical care. LEVEL OF EVIDENCE: Epidemiological, level IV; therapeutic, level IV.


Subjects
Military Personnel/statistics & numerical data, Wounds and Injuries/mortality, Accidents/mortality, Aviation Accidents/mortality, Adolescent, Adult, Blast Injuries/mortality, Cause of Death, Female, Humans, Male, Middle Aged, Registries, Retrospective Studies, United States, Gunshot Wounds/mortality, Nonpenetrating Wounds/mortality, Young Adult
17.
Transfusion; 59(S2): 1499-1506, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30980742

ABSTRACT

BACKGROUND: The ability to rapidly administer whole blood (WB) at the point of injury is an important intervention to save lives. This can be accomplished using low titer group O WB donors. Titers of immunoglobulin M anti-A and anti-B might change over time. This study describes titer testing in a large series of donors. STUDY DESIGN AND METHODS: Data were collected retrospectively from the Armed Services Blood Program and the Theater Medical Data Store. Soldiers assigned to the 75th Ranger Regiment were screened and titered upon completion of training, before deployment, or during periodic unit readiness activities. A Ranger group O low-titer (ROLO) donor was defined as having titers of both anti-A and anti-B of less than 256 by immediate spin testing. RESULTS: Between May 2015 and January 2017, of 2237 participating soldiers, 1892 (84.5%) underwent antibody titering once, 266 (11.9%) twice, 62 (2.8%) three times, and 17 (0.8%) at least four times. The mean age was 26.5 ± 6.5 years, and 2197 (98.2%) were male. A total of 69.5% of donors met ROLO donor criteria on the first test. The percentage of donors meeting universal-donor criteria increased to 83.5% on the second test, 91.1% on the third test, and 100% on the fourth and fifth tests. CONCLUSIONS: With successive titer testing, individuals appear to trend toward lower titers. This suggests that titer testing may not be required after the second test for donors initially identified as low titer.


Subjects
ABO Blood-Group System/blood, Blood Donors, Immunoglobulin M/blood, Isoantibodies/blood, Military Personnel, Adult, Female, Humans, Male, Retrospective Studies, Time Factors
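The ROLO donor definition in this abstract reduces to a two-condition predicate; a plain-Python restatement with names of our choosing:

```python
# Restatement of the low-titer criterion from the abstract: both anti-A and
# anti-B IgM titers below 256 by immediate-spin testing. Names are ours.
def is_rolo_donor(anti_a_titer: int, anti_b_titer: int) -> bool:
    """Return True if the donor meets Ranger group O low-titer criteria."""
    return anti_a_titer < 256 and anti_b_titer < 256

assert is_rolo_donor(128, 64)       # qualifies
assert not is_rolo_donor(256, 64)   # anti-A titer too high
```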
18.
Curr Eye Res; 44(7): 770-780, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30947563

ABSTRACT

Purpose: Blast-related brain and ocular injuries can lead to acute and chronic visual dysfunction. The chronic visual consequences of blast exposure and their progression remain unclear. The goal of this study was to analyze the ocular functional response to four levels of blast exposure and identify a threshold of blast exposure leading to acute and chronic visual dysfunction. Methods: Anesthetized adult male Long-Evans rats received a single blast exposure at a peak overpressure of 78, 117, 164, or 213 kPa, delivered by a compressed air-driven shock tube. Clinical eye examination, intraocular pressure (IOP), flash electroretinography (fERG), and spectral-domain optical coherence tomography (SD-OCT) images were assessed prior to exposure and at multiple time points post-exposure. Results: No abnormal fERG responses were observed for the two lowest-level blast groups (78 kPa or 117 kPa). For the 164 kPa group, the a- and b-wave amplitudes of the fERG were decreased at 3 days post-exposure (p = 0.009 for a-wave, p = 0.010 for b-wave) but recovered to baseline levels by 7 days post-exposure. The IOP was unchanged for the 117 kPa and 164 kPa groups. The 78 kPa group demonstrated a small transient increase during week one (p = 0.046). For the highest blast group (213 kPa), the IOP was significantly elevated immediately post-exposure (p = 0.0001) but recovered by 24 hours. A bimodal depression in the fERG a- and b-wave amplitudes was observed for this group: the amplitudes were depressed at day 3 post-exposure (p = 0.007 for a-wave, p = 0.012 for b-wave) and recovered by day 7 post-exposure. However, the fERG amplitudes were again depressed at week 8 post-exposure, suggesting chronic retinal dysfunction. All retinae appeared normal in SD-OCT images. Conclusions: Our study demonstrates that a single blast exposure may result in acute and chronic fERG deficits and traumatic IOP elevation. Noninvasive functional tests may hold promise for identifying individuals at risk for developing chronic visual deficits and for indicating a time window for early clinical diagnosis, rehabilitation, and treatment.

20.
JAMA Surg; 154(7): 600-608, 2019 Jul 01.
Article in English | MEDLINE | ID: mdl-30916730

ABSTRACT

Importance: Although the Afghanistan and Iraq conflicts have the lowest US case-fatality rates in history, no comprehensive assessment of combat casualty care statistics, major interventions, or risk factors has been reported to date after 16 years of conflict. Objectives: To analyze trends in overall combat casualty statistics, to assess aggregate measures of injury and interventions, and to simulate how mortality rates would have changed had the interventions not occurred. Design, Setting, and Participants: Retrospective analysis of all available aggregate and weighted individual administrative data compiled from Department of Defense databases on all 56 763 US military casualties injured in battle in Afghanistan and Iraq from October 1, 2001, through December 31, 2017. Casualty outcomes were compared with period-specific ratios of the use of tourniquets, blood transfusions, and transport to a surgical facility within 60 minutes. Main Outcomes and Measures: Main outcomes were casualty status (alive, killed in action [KIA], or died of wounds [DOW]) and the case-fatality rate (CFR). Regression, simulation, and decomposition analyses were used to assess associations between covariates, interventions, and individual casualty status; estimate casualty transitions (KIA to DOW, KIA to alive, and DOW to alive); and estimate the contribution of interventions to changes in CFR. Results: In aggregate data for 56 763 casualties, CFR decreased in Afghanistan (20.0% to 8.6%) and Iraq (20.4% to 10.1%) from early stages to later stages of the conflicts. Survival for critically injured casualties (Injury Severity Score, 25-75 [critical]) increased from 2.2% to 39.9% in Afghanistan and from 8.9% to 32.9% in Iraq. Simulations using data from 23 699 individual casualties showed that without interventions assessed, CFR would likely have been higher in Afghanistan (15.6% estimated vs 8.6% observed) and Iraq (16.3% estimated vs 10.1% observed), equating to 3672 additional deaths (95% CI, 3209-4244 deaths), of which 1623 (44.2%) were associated with the interventions studied: 474 deaths (12.9%) (95% CI, 439-510) associated with the use of tourniquets, 873 (23.8%) (95% CI, 840-910) with blood transfusion, and 275 (7.5%) (95% CI, 259-292) with prehospital transport times. Conclusions and Relevance: Our analysis suggests that increased use of tourniquets, blood transfusions, and more rapid prehospital transport were associated with 44.2% of total mortality reduction. More critically injured casualties reached surgical care, with increased survival, implying improvements in prehospital and hospital care.
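The case-fatality-rate arithmetic underlying these comparisons is straightforward; a toy restatement follows (the study's weighted individual-level simulation is not reproduced, and the counts below are invented):

```python
# Case-fatality rate and a back-of-envelope counterfactual comparison; the
# 10,000-casualty cohort is a toy value, and the rates come from the abstract.
def case_fatality_rate(deaths: int, casualties: int) -> float:
    """CFR = battle-injury deaths (KIA + DOW) divided by all battle casualties."""
    return deaths / casualties

casualties = 10_000
observed = case_fatality_rate(1010, casualties)  # ~10.1%, late-conflict Iraq figure
counterfactual = 0.163                           # simulated CFR without interventions
print(f"Estimated deaths averted: {(counterfactual - observed) * casualties:.0f}")
```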
