1.
Anim Welf ; 33: e34, 2024.
Article in English | MEDLINE | ID: mdl-39315351

ABSTRACT

The objective of this study was to identify factors more commonly observed on farms with poor livestock welfare than on farms with good welfare. These factors could potentially be used to develop an animal welfare risk assessment tool (AWRAT) to identify livestock at risk of poor welfare. Identifying livestock at risk of poor welfare would facilitate early intervention and improve strategies to promptly resolve welfare issues. This study focuses on cattle, sheep and goats in non-dairy extensive farming systems in Australia. To assist with identifying potential risk factors, a survey was developed presenting 99 factors about the farm, farmers, animals and various aspects of management. Based on their experience, key stakeholders, including veterinarians, stock agents, consultants, and extension and animal welfare officers, were asked to consider a farm where the welfare of the livestock was either high or low and to rate the likelihood of observing these factors. Of the 141 responses, 65% were for farms with low welfare. Only 6% of factors had ratings that did not differ significantly between high- and low-welfare surveys, and these were not considered further. Factors from poor-welfare surveys with median ratings in the lowest 25% were considered potential risks (n = 49). Considering correlation, ease of verification and the different livestock farming systems in Australia, 18 risk factors relating to farm infrastructure, nutrition, treatment and husbandry were selected. The AWRAT requires validation in future studies.
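A minimal sketch of this screening logic (a hypothetical reconstruction in Python; the column names, the choice of test and the 0.05 threshold are our assumptions, not the authors' stated method):

import pandas as pd
from scipy.stats import mannwhitneyu

def screen_risk_factors(df: pd.DataFrame) -> list:
    """Keep factors rated differently on high- vs low-welfare farms,
    then take those with low-welfare median ratings in the lowest 25%."""
    medians = {}
    for factor, grp in df.groupby("factor"):
        high = grp.loc[grp.welfare_group == "high", "rating"]
        low = grp.loc[grp.welfare_group == "low", "rating"]
        _, p = mannwhitneyu(high, low)
        if p < 0.05:  # drop factors with no significant difference
            medians[factor] = low.median()
    medians = pd.Series(medians)
    return sorted(medians[medians <= medians.quantile(0.25)].index)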

2.
Anim Welf ; 33: e32, 2024.
Article in English | MEDLINE | ID: mdl-39315355

ABSTRACT

If livestock at risk of poor welfare could be identified using a risk assessment tool, enforcement agencies could develop more targeted response strategies to facilitate early intervention, prompt welfare improvement and a decrease in reoffending. This study aimed to test the ability of an Animal Welfare Risk Assessment Tool (AWRAT) to identify livestock at risk of poor welfare in extensive farming systems in Australia. Following farm visits for welfare- and non-welfare-related reasons, participants completed a single welfare rating (WR) and an assessment using the AWRAT for the farm just visited. A novel algorithm was developed to generate an AWRAT-Risk Rating (AWRAT-RR) based on the AWRAT assessment. Using linear regression, the relationship between the AWRAT-RR and the WR was tested. In this preliminary testing, the AWRAT was good at identifying farms with poor livestock welfare. As the AWRAT relies upon observation, intra- and inter-observer agreement were compared in an observation study, in which a set of photographs of farm features was rated on two occasions. Intra-observer reliability was good, with 83% of intra-class correlation coefficients (ICCs) for observers ≥ 0.8. Inter-observer reliability was moderate, with an ICC of 0.67. The AWRAT provides a structured framework to improve consistency in livestock welfare assessments. Further research is necessary to determine the AWRAT's ability to identify livestock at risk of poor welfare by studying animal welfare incidents and reoffending over time.
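Reliability figures of this kind can be reproduced on data of the same shape with the pingouin library (the library choice, file name and long-format layout are assumptions for illustration, not the authors' analysis code):

import pandas as pd
import pingouin as pg

# Long format: one row per (photo, observer, occasion) rating
df = pd.read_csv("awrat_photo_ratings.csv")  # hypothetical file

# Inter-observer reliability across all observers on occasion 1
inter = pg.intraclass_corr(
    data=df[df.occasion == 1],
    targets="photo", raters="observer", ratings="rating",
)
print(inter.loc[inter.Type == "ICC2", ["ICC", "CI95%"]])

# Intra-observer reliability: for each observer, treat the two rating
# occasions as "raters" of the same photographs
for obs, grp in df.groupby("observer"):
    intra = pg.intraclass_corr(
        data=grp, targets="photo", raters="occasion", ratings="rating",
    )
    print(obs, intra.loc[intra.Type == "ICC2", "ICC"].item())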

3.
Mil Med ; 2024 Sep 24.
Article in English | MEDLINE | ID: mdl-39316388

ABSTRACT

INTRODUCTION: Calcium derangements remain poorly characterized in the combat trauma population. We describe the incidence of emergency department (ED) calcium derangements, associated physiologic derangements, and 24-hour mortality in the deployed combat setting. MATERIALS AND METHODS: We analyzed adult U.S. military, U.S. contractor, and coalition casualties from 2007 to 2023 in the DoD Trauma Registry who had at least 1 ionized calcium value documented in the ED at a Role 2 or Role 3 military treatment facility. We constructed a series of multivariable logistic regression models to test for the association of hypocalcemia and hypercalcemia with physiological derangements, blood product consumption, and survival. Vital signs and other laboratory studies were based on the concurrent ED encounter. RESULTS: A total of 941 casualties met inclusion criteria for this analysis, with 26% (245) having at least 1 calcium derangement. Overall, 22% (211) had at least 1 episode of hypocalcemia and 5% (43) had at least 1 episode of hypercalcemia in the ED. The vast majority (97%, 917) received calcium at least once. Median composite injury severity scores were lower among those with no calcium derangement (8 versus 17, P < .001). Survival during the total hospitalization was higher among those without calcium derangements (98% versus 93%) but similar at 24 hours (99% versus 98%, P = .059). After adjusting for confounders, any hypocalcemic measurement was associated with an elevated international normalized ratio (odds ratio 1.94, 95% CI 1.19-3.16), acidosis (1.66, 1.17-2.37), tachycardia (2.11, 1.42-3.15), hypotension (1.92, 1.09-3.38), depressed Glasgow Coma Scale (3.20, 2.13-4.81), elevated shock index (2.19, 1.45-3.31), submassive transfusion (3.97, 2.60-6.05), massive transfusion (4.22, 2.66-6.70), supermassive transfusion (3.65, 2.07-6.43), and all-hospital-stay mortality (2.30, 1.00-5.29). Comparatively, any hypercalcemic measurement was associated with acidosis (2.96, 1.39-6.32), depressed Glasgow Coma Scale (4.28, 1.81-10.13), submassive transfusion (3.40, 1.37-8.43), massive transfusion (6.25, 2.63-14.83), and supermassive transfusion (13.00, 5.47-30.85). CONCLUSIONS: Both hypocalcemia and hypercalcemia in the ED were associated with physiological derangements and blood product use, to a greater extent among those with hypocalcemia than those with hypercalcemia. Prospective studies are underway to better explain and validate these findings.
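As a concrete illustration of the modelling approach (not the authors' code; the file name, variable names and confounder set below are hypothetical), a multivariable logistic regression of this kind can be fit in Python with statsmodels:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ed_calcium_cohort.csv")  # hypothetical registry extract

# One model per outcome; e.g. tachycardia as a function of any
# hypocalcemic measurement, adjusted for stand-in confounders.
model = smf.logit(
    "tachycardia ~ hypocalcemia + injury_severity_score + age", data=df
).fit()

# Adjusted odds ratios with 95% confidence intervals
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)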

4.
BMJ Open ; 14(9): e086352, 2024 Sep 18.
Article in English | MEDLINE | ID: mdl-39299790

ABSTRACT

INTRODUCTION: Successful organ transplantation in patients with end-stage organ failure improves long-term survival, improves quality of life and reduces costs to the NHS. Despite an increase in the number of deceased organ donors over the last decade, there remains a considerable shortfall of suitable organs available for transplantation. Over half of UK donors are certified dead by neurological criteria following brain stem compression, which leads to severe physiological stress in the donor, combined with a hyperinflammatory state. Brain stem death-related dysfunction is an important reason for poor organ function and hence poor utilisation. For example, more than 30% of donation after brain stem death cardiac transplant recipients need short-term mechanical cardiac support, reflecting donor heart dysfunction. A small, randomised study previously showed improved outcomes for cardiac transplant recipients if the donor was given simvastatin. SIGNET takes inspiration from that study and hypothesises a potential reduction in damage to the heart and other organs during the period after diagnosis of death and prior to organ retrieval in donors who receive simvastatin. METHODS AND ANALYSIS: SIGNET is a multicentre, single-blind, prospective, group sequential, randomised controlled trial to evaluate the benefits of a single high dose of simvastatin given to potential organ donors diagnosed dead by neurological criteria on outcomes in all organ recipients. The trial will run across a minimum of 89 UK sites with a recruitment target of 2600 donors over 4 years. ETHICS AND DISSEMINATION: SIGNET received a favourable opinion from the London, Queen Square Research Ethics Committee (Ref: 21/LO/0412) and, following approval of substantial amendment 1 in January 2023, the current protocol is version 2 (7 December 2022). Substantial amendment 1 clarified consent procedures and added additional sites and prescribers. Findings from the study will be publicly available and disseminated locally and internationally through manuscript publications in peer-reviewed journals and conference presentations at national and international platforms. TRIAL REGISTRATION NUMBER: ISRCTN11440354.


Subject(s)
Brain Death , Simvastatin , Tissue Donors , Humans , Simvastatin/administration & dosage , Simvastatin/therapeutic use , Single-Blind Method , Prospective Studies , Randomized Controlled Trials as Topic , Multicenter Studies as Topic , United Kingdom , Hydroxymethylglutaryl-CoA Reductase Inhibitors/administration & dosage , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Organ Transplantation
6.
Am J Emerg Med ; 85: 48-51, 2024 Aug 24.
Article in English | MEDLINE | ID: mdl-39226793

ABSTRACT

INTRODUCTION: Airway management is a key intervention during the resuscitation of critically ill trauma patients. Emergency surgical airway (ESA) placement is taught as a backup option when endotracheal intubation (ETI) fails. We sought to (1) describe the incidence of emergency department (ED) ESA, (2) compare ESA versus ETI-only recipients, and (3) determine which factors were associated with receipt of an ESA. METHODS: We searched the Trauma Quality Improvement Program datasets from 2017 to 2022 for all recipients of emergency department surgical airway placement and/or endotracheal intubation. We compared ESA versus ETI-only recipients. RESULTS: From 2017 to 2022, there were 6,477,759 records within the datasets, of which 238,128 met inclusion criteria for this analysis. Within that group, there were 236,292 ETIs and 2264 ESAs, with 428 (<1%) having documentation of both. Of the ESAs performed, 82 were documented in children <15 years of age, the youngest being 1 year of age. The ETI-only group had a lower proportion of serious injuries to the head/neck (52% versus 59%), face (2% versus 8%), and skin (3% versus 6%). However, the ETI-only group had a higher proportion of serious injuries to the abdomen (15% versus 9%) and the extremities (19% versus 12%). Survival at 24 hours was higher in the ETI-only group (83% versus 76%), as was survival to discharge (70% versus 67%). In the subanalysis of children <15 years (n = 82), 34% occurred in the 1-4 years age group, 35% in the 5-9 years age group, and 30% in the 10-14 years age group. In our multivariable logistic regression analysis, serious injuries to the head/neck (odds ratio [OR] 1.37, 95% CI 1.23-1.54), face (OR 3.41, 2.83-4.11), thorax (OR 1.19, 1.06-1.33), and skin (OR 1.53, 1.15-2.05) were all associated with receipt of cricothyrotomy. Firearm (OR 3.62, 3.18-4.12), stabbing (OR 2.85, 2.09-3.89), and other (OR 2.85, 2.09-3.89) mechanisms were associated with receipt of ESA when using collision as the reference category. CONCLUSIONS: ESA placement is rarely performed but, in this dataset, was frequently used as a primary airway intervention. Penetrating mechanisms and injuries to the face were most strongly associated with ESA placement. Our findings reinforce the need to maintain this critical airway skill for trauma management.
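The mechanism-of-injury comparison with collision as the reference category can be sketched the same way (again with hypothetical names, not the authors' code), using patsy's Treatment coding in a statsmodels formula:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tqip_airway.csv")  # hypothetical TQIP extract

# esa: 1 if the casualty received an emergency surgical airway
model = smf.logit(
    "esa ~ C(mechanism, Treatment(reference='collision'))"
    " + head_neck_serious + face_serious + thorax_serious + skin_serious",
    data=df,
).fit()
print(np.exp(model.params))  # ORs for firearm, stabbing, other vs collision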

7.
Foot Ankle Orthop ; 9(3): 24730114241270207, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39193450

ABSTRACT

Background: The spring ligament fibrocartilaginous complex (SLFC), which is essential for stabilizing the medial longitudinal arch, features a little-explored fibrocartilaginous facet within its superomedial aspect, articulating with the talar head. This research aimed to provide a detailed anatomical description of this facet, designated the spring ligament articular facet (SLAF). Methods: Nine normally aligned cadaveric lower limbs were dissected, approaching the SLFC from a superior direction. Following talus disarticulation, high-resolution images of the ligament complex were captured and analyzed. ImageJ software was used to determine the areas and dimensions of the superomedial calcaneonavicular (SMCN) spring and SLAF. Results: The fibrocartilage facet exhibited a trapezoid shape in all specimens. The mean area of the SMCN spring was 280.39 mm², and that of the SLAF was 200 mm². The proximal-to-distal length of the SLAF averaged 11.78 mm at its longest and 5.34 mm at its shortest. Attachments of the SLAF to the calcaneum and the navicular showed robust fibrous structures, with average measurements of 3.75 and 1.75 mm at the medial and lateral calcaneal margins, and 2.75 and 2.98 mm at the medial and lateral navicular margins, respectively. Conclusion: This study clearly delineated the individual structural components of the SLFC articulating with the talar head and detailed its dimensions, emphasizing the need for more specific anatomical terminology that respects the intricate anatomy of the SLFC. Level of Evidence: Level III, descriptive study.

8.
Mol Ther Nucleic Acids ; 35(3): 102284, 2024 Sep 10.
Article in English | MEDLINE | ID: mdl-39165563

ABSTRACT

Adenosine deaminases acting on RNA (ADARs) are enzymes that catalyze the hydrolytic deamination of adenosine to inosine. The editing feature of ADARs has garnered much attention as a therapeutic tool to repurpose ADARs to correct disease-causing mutations at the mRNA level, in a technique called site-directed RNA editing (SDRE). Administering a short guide RNA oligonucleotide that hybridizes to a mutant sequence forms the requisite dsRNA substrate, directing ADARs to edit the desired adenosine. However, much is still unknown about ADARs' selectivity and sequence-specific effects on editing. Atomic-resolution structures can help provide additional insight into ADARs' selectivity and lead to novel guide RNA designs. Indeed, recent structures of ADAR domains have expanded our understanding of RNA binding and the base-flipping catalytic mechanism. These efforts have enabled the rational design of improved ADAR guide strands and advanced the therapeutic potential of the SDRE approach. While no full-length structure of any ADAR is known, this review presents an exposition of the structural basis for function of the different ADAR domains, focusing on human ADAR2. Key insights are extrapolated to human ADAR1, which is of substantial interest because of its widespread expression in most human tissues.

10.
Am J Surg ; 238: 115887, 2024 Aug 03.
Article in English | MEDLINE | ID: mdl-39163762

ABSTRACT

BACKGROUND: The risk of venous thromboembolic events (VTE) associated with blood product administration remains unclear. We sought to determine which blood products were associated with the development of deep vein thrombosis (DVT) and pulmonary embolism (PE). METHODS: We analyzed data from patients ≥18 years of age in the Trauma Quality Improvement Program (TQIP) database who received ≥1 blood product and survived ≥24 h. RESULTS: There were 42,399 patients who met inclusion criteria, of whom 2086 had at least one VTE event. In our multivariable logistic regression model, we found that whole blood (WB) had a unit odds ratio (uOR) of 1.05 (95% CI 1.02-1.08) for DVT and 1.08 (1.05-1.12) for PE. Compared to WB, platelets had a higher uOR for DVT of 1.09 (1.04-1.13) but a similar uOR for PE of 1.08 (1.03-1.14). CONCLUSIONS: We found an association of both DVT and PE with early whole blood and platelet administration.
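On the usual reading of a unit odds ratio (the multiplicative change in adjusted odds per additional unit transfused; this interpretation is ours, not spelled out in the abstract), the reported figure compounds per unit:

\text{uOR} = e^{\hat{\beta}}, \qquad \frac{\text{odds}(\text{DVT} \mid n \text{ units WB})}{\text{odds}(\text{DVT} \mid 0 \text{ units})} = \text{uOR}^{\,n}

So, for example, a hypothetical 10-unit whole-blood transfusion would correspond to 1.05^10 ≈ 1.63, roughly a 63% increase in the adjusted odds of DVT, all else held equal.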

11.
Am J Surg ; 238: 115898, 2024 Aug 13.
Article in English | MEDLINE | ID: mdl-39173564

ABSTRACT

BACKGROUND: Use of resuscitative endovascular balloon occlusion of the aorta (REBOA) for temporary hemorrhage control in severe non-compressible torso trauma remains controversial, with limited data on patient selection and outcomes. This study aims to analyze nationwide trends in its use in emergency departments (EDs). METHODS: A retrospective analysis of the American College of Surgeons Trauma Quality Improvement Program (ACS-TQIP) from 2017 to 2022 was performed, focusing on REBOA placements in EDs. RESULTS: The analysis included 3398 REBOA procedures. The majority of patients were male (76%), with a median age of 40 years (27-58) and a median injury severity score of 20 (20-41). The most common mechanism was collision (64%), with emergency surgeries most frequently performed for pelvic trauma (14%). Level 1 trauma centers performed 82% of these procedures, with consistently low annual utilization (<200 facilities). Survival was 85% at 1 hour post-placement, decreasing significantly to 42% by discharge. CONCLUSIONS: REBOA usage remains limited but steady, primarily occurring at level 1 trauma center EDs. While short-term survival rates are favorable, they drop significantly by the time of discharge.

12.
BMJ Mil Health ; 2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39209758

ABSTRACT

INTRODUCTION: Emergency resuscitative thoracotomy (ERT) is a resource-intensive procedure that can deplete a combat surgical team's supplies and divert attention from casualties with more survivable injuries. An understanding of survival after ERT in the combat trauma population will inform surgical decision-making. METHODS: We requested all encounters from 2007 to 2023 from the Department of Defense Trauma Registry (DoDTR). We analysed any documented thoracotomy in the emergency department and excluded any case for which it was not possible to distinguish ERT from operating room thoracotomy. The primary outcome was 24-hour mortality. RESULTS: There were 48 301 casualties in the original dataset. Of those, 154 (0.3%) received ERT, with 114 non-survivors and 40 survivors at 24 hours. There were 26 (17%) survivors at 30 days. The majority of ERTs were performed at Role 3 facilities. US military personnel made up the largest proportion of both non-survivors and survivors. Explosives predominated in both groups (61% and 65%). Median Composite Injury Severity Scores were lower among the non-survivors (19 vs 33). Non-survivors had a lower proportion of serious head injuries (13% vs 40%) and thorax injuries (32% vs 58%). Median RBC consumption was lower among non-survivors (10 units vs 19 units), as was consumption of plasma (6 vs 16) and platelets (0 vs 3). The most frequent interventions and surgical procedures were exploratory thoracotomy (n=140), chest thoracostomy (n=137), open cardiac massage (n=131) and closed cardiac massage (n=121). CONCLUSION: ERT in this group of combat casualties resulted in 26% survival at 24 hours. Although this proportion is higher than that reported in civilian data, more rigorous prospective studies, or improvements in DoDTR data capture methods, would be needed to determine the utility of ERT in combat populations.

13.
Nat Commun ; 15(1): 7582, 2024 Aug 31.
Article in English | MEDLINE | ID: mdl-39217149

ABSTRACT

Free-electron lasers fill a critical gap in the space of THz sources, as they can reach high average and peak powers with spectral tunability. Using a waveguide in a THz FEL significantly increases the coupling between the relativistic electrons and the electromagnetic field, enabling large amounts of radiation to be generated in a single pass of the electrons through the undulator. In addition to transversely confining the radiation, the dispersive properties of the waveguide critically affect the velocity and slippage of the radiation pulse, which determine the central frequency and bandwidth of the generated radiation. In this paper, we characterize the spectral properties of a compact waveguide THz FEL, including simultaneous lasing at two different frequencies, and demonstrate tuning of the radiation wavelength in the high-frequency branch by varying the beam energy and ensuring that the electrons injected into the undulator are prebunched on the scale of the resonant radiation wavelength.
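The two-frequency lasing follows from intersecting the beam's phase-matching condition with the waveguide dispersion curve. In standard FEL notation (a sketch in our notation, not equations quoted from the paper), with waveguide cutoff frequency \omega_{co}, undulator wavenumber k_u = 2\pi/\lambda_u and mean longitudinal beam velocity \bar{v}_z:

k_z(\omega) = \frac{1}{c}\sqrt{\omega^2 - \omega_{co}^2}, \qquad \frac{\omega}{\bar{v}_z} - k_z(\omega) = k_u.

Eliminating k_z yields a quadratic in \omega with two roots \omega_{\pm}, the low- and high-frequency branches that can lase simultaneously; in the free-space limit \omega_{co} \to 0, the high-frequency root reduces to the familiar resonance \lambda = \frac{\lambda_u}{2\gamma^2}\left(1 + \frac{K^2}{2}\right).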

14.
Transplant Rev (Orlando) ; 38(4): 100872, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39029393

ABSTRACT

BACKGROUND: Measures of patient experience are increasingly valued as key to healthcare quality assessment. We aimed to identify and describe publicly available measures assessing patient-reported experience of solid organ transplantation healthcare, and to identify patient groups, healthcare settings, or aspects of patient experience underserved by existing measures. METHODS: We systematically searched MEDLINE, Embase, CINAHL, PsycINFO, Cochrane CENTRAL, Scopus and Web of Science from inception to 6th July 2023, supplemented with grey literature searches. Two reviewers independently screened search hits; outputs reporting patient-reported measures of multiple aspects of established solid organ transplantation healthcare were eligible. We abstracted measure context, characteristics, content (i.e., attributes of patient experience assessed), and development and validation processes. RESULTS: We identified nine outputs reporting eight measures of patient experience; these related only to kidney (n = 5) or liver (n = 3) transplantation, with no available measures relating to heart, lung, pancreas or intestinal transplantation. Of the identified measures, four were specific to solid organ transplant recipients. Measures sought to assess "patient satisfaction" (n = 4) and "patient experience" (n = 4) of healthcare. Measures mapped to between five and 16 of 20 attributes of patient experience, most often Information and education, Communication, and Access to care (all n = 7). Six measures reported a development process; only three reported a validation process. CONCLUSIONS: Publicly available patient-reported measures of organ transplantation healthcare experiences are limited to kidney and liver transplantation. There is heterogeneity in measure context, characteristics, and content, and insufficient clarity concerning how well measures capture the specific experiences of transplant recipients. Formalised measures of patient experience, specific to solid organ transplantation, with transparent reporting of development and validity are needed.

15.
J Trauma Acute Care Surg ; 97(2S Suppl 1): S91-S97, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-39049142

ABSTRACT

BACKGROUND: Damage-control resuscitation has come full circle, with the use of whole blood and balanced components. Lack of platelet availability may limit effective damage-control resuscitation. Platelets are typically stored and transfused at room temperature and have a short shelf-life, while cold-stored platelets (CSPs) have the advantage of a longer shelf-life. The US military introduced CSPs into the battlefield surgical environment in 2016. This study is a safety analysis for the use of CSPs in battlefield trauma. METHODS: The Department of Defense Trauma Registry and Armed Services Blood Program databases were queried to identify casualties who received room-temperature-stored platelets (RSPs) or both RSPs and CSPs between January 1, 2016, and February 29, 2020. Characteristics of recipients of RSPs and RSPs-CSPs were compared and analyzed. RESULTS: A total of 274 patients were identified; 131 (47.8%) received RSPs and 143 (52.2%) received RSPs-CSPs. The casualties were mostly male (97.1%) and similar in age (31.7 years), with a median Injury Severity Score of 22. There was no difference in survival for recipients of RSPs (88.5%) versus RSPs-CSPs (86.7%; p = 0.645). Adverse events were similar between the two cohorts. The volume of blood products received was higher in the RSPs-CSPs cohort than in the RSPs cohort. The RSPs-CSPs cohort had a higher rate of massive transfusion (53.5% vs. 33.5%, p = 0.001). A logistic regression model demonstrated that use of RSPs-CSPs was not associated with mortality, with an adjusted odds ratio of 0.96 (p > 0.9; 95% confidence interval, 0.41-2.25). CONCLUSION: In this safety analysis of RSPs-CSPs compared with RSPs in a combat setting, survival was similar between the two groups. Given the safety and logistical feasibility, the results support continued use of CSPs in military environments and further research into how to optimize resuscitation strategies. LEVEL OF EVIDENCE: Therapeutic/Care Management; Level IV.


Subject(s)
Blood Preservation , Feasibility Studies , Platelet Transfusion , Humans , Male , Female , Adult , Blood Preservation/methods , Platelet Transfusion/methods , Platelet Transfusion/statistics & numerical data , United States/epidemiology , Injury Severity Score , Registries , Resuscitation/methods , Cold Temperature , Retrospective Studies , Wounds and Injuries/therapy , Wounds and Injuries/mortality , Military Personnel/statistics & numerical data , War-Related Injuries/therapy , War-Related Injuries/mortality , Military Medicine/methods , Blood Platelets
16.
Mil Med ; 2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39073394

ABSTRACT

INTRODUCTION: Blood transfusions are common during combat casualty care, aiming to address the loss of blood volume that often accompanies severe battlefield injuries. This scoping review examines the existing military combat casualty data to analyze the efficacy, challenges, and advances in the use of massive and super-massive transfusions in the management of critically injured warfighters. MATERIALS AND METHODS: We performed a scoping review of combat-related literature published between 2006 and 2023 pertaining to massive transfusions used during combat deployments. We used PubMed to identify relevant studies and followed the PRISMA-ScR Checklist in conducting the review. RESULTS: We identified 53 studies that met the inclusion criteria, the majority being retrospective studies from registries used by the US, British, French, and Dutch militaries. Most of the studies focused on transfusion ratios, the movement of blood transfusions to more forward locations, implementation of massive transfusions with different fibrinogen-to-red blood cell ratios, the addition of recombinant factor VII, and the use of predictive models for transfusion. Lastly, we identified reports of improved survival for casualties with the rapid implementation of various blood products (warm fresh whole blood, cold-stored low-titer group O blood, freeze-dried plasma, and component therapy) and literature relating to pediatric casualties and submassive transfusions. Notable findings include the establishment of hemodynamic and complete blood count parameters as predictors of the requirement for massive transfusion, and the association of higher fibrinogen-to-red blood cell ratios with decreased mortality. CONCLUSIONS: We identified 53 studies focused on blood transfusions from the Global War on Terrorism conflicts. The majority related to transfusion ratios and the movement of blood transfusions to more forward locations. We highlight key lessons learned on the battlefield that have been translated into scientific developments and changes in civilian trauma care.

17.
PLoS One ; 19(6): e0303646, 2024.
Article in English | MEDLINE | ID: mdl-38861492

ABSTRACT

Due to the competitive nature of the construction industry, efficient requirement analysis is important for enhancing client satisfaction and a company's reputation. For example, determining the optimal configuration of panels (generally called panelization) that form the structure of a building is one aspect of cost estimation. However, existing methods typically rely on rule-based approaches that may lead to suboptimal material usage, particularly in complex designs featuring angled walls and openings. Such inefficiency can increase costs and environmental impact due to unnecessary material waste. To address these challenges, this research proposes a Panelization Algorithm for Architectural Designs, referred to as PAAD, which utilizes a genetic evolutionary strategy built on the 2D bin packing problem. The method is designed to balance strict adherence to manufacturing constraints against the objective of optimizing material usage. PAAD starts with multiple potential solutions within the predefined problem space, facilitating dynamic exploration of panel configurations. It treats structural rules as flexible constraints, making necessary corrections in post-processing, and through iterative development the algorithm refines panel sets to minimize material use. The methodology is validated through an analysis against an industry implementation and expert-derived solutions, highlighting PAAD's ability to surpass existing results and reduce the need for manual corrections. Additionally, to motivate future research, a synthetic data generator, the architectural drawing encodings used, and a preliminary interface are also introduced. This not only highlights the algorithm's practical applicability but also encourages its use in real-world scenarios.
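To give a flavour of the evolutionary loop such a method is built on, here is a deliberately simplified, self-contained one-dimensional sketch (the stock sizes, penalty weight and operators are illustrative assumptions, not the authors' implementation): cover a wall of given length with stock panels while minimizing offcut, treating full coverage as a soft constraint.

import random

STOCK = [1.2, 2.4, 3.6]   # available panel widths (m), hypothetical
WALL = 10.0               # wall length to cover (m)
PENALTY = 100.0           # weight for failing to cover the wall

def fitness(genome):      # lower is better
    total = sum(genome)
    shortfall = max(0.0, WALL - total)  # soft constraint: full coverage
    waste = max(0.0, total - WALL)      # objective: minimal offcut
    return waste + PENALTY * shortfall

def random_genome():
    g = []
    while sum(g) < WALL:
        g.append(random.choice(STOCK))
    return g

def crossover(a, b):
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def mutate(g, rate=0.2):
    g = g[:]
    if random.random() < rate:
        g[random.randrange(len(g))] = random.choice(STOCK)
    return g

def evolve(pop_size=40, generations=300):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]         # elitist selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, "offcut:", round(sum(best) - WALL, 3))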


Subject(s)
Algorithms , Architecture , Construction Materials , Construction Industry/methods , Humans
18.
J Spec Oper Med ; 24(2): 17-21, 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38866695

ABSTRACT

BACKGROUND: Thoracic trauma occurs frequently in combat and is associated with high mortality. Tube thoracostomy (chest tube) is the treatment for pneumothorax resulting from thoracic trauma, but little data exist to characterize combat casualties undergoing this intervention. We sought to describe the incidence of these injuries and procedures to inform training and materiel development priorities. METHODS: This is a secondary analysis of a Department of Defense Trauma Registry (DoDTR) data set from 2007 to 2020 describing prehospital care within all theaters in the registry. We described all casualties who received a tube thoracostomy within 24 hours of admission to a military treatment facility. Variables described included casualty demographics; abbreviated injury scale (AIS) score by body region, presented as binary serious (≥3) or not serious (<3); and prehospital interventions. RESULTS: The database identified 25,897 casualties, 2,178 (8.4%) of whom received a tube thoracostomy within 24 hours of admission. Of those casualties, the body regions with the highest proportions of serious injury (AIS ≥3) were thorax 62% (1,351), extremities 29% (629), abdomen 22% (473), and head/neck 22% (473). Of those casualties, 13% (276) had prehospital needle thoracostomies performed, and 19% (416) had limb tourniquets placed. Most of the patients were male (97%), partner force members or humanitarian casualties (70%), and survived to discharge (87%). CONCLUSIONS: Combat casualties with chest trauma often have multiple injuries complicating prehospital and hospital care. Explosions and gunshot wounds are common mechanisms of injury associated with the need for tube thoracostomy, and these interventions are often performed by enlisted medical personnel. Future efforts should examine the correlation between prehospital chest interventions and pneumothorax management in thoracic trauma.


Subject(s)
Chest Tubes , Emergency Medical Services , Military Personnel , Pneumothorax , Registries , Thoracic Injuries , Thoracostomy , Humans , Thoracostomy/methods , Thoracic Injuries/therapy , Pneumothorax/therapy , Pneumothorax/etiology , Male , Female , Military Personnel/statistics & numerical data , Adult , Abbreviated Injury Scale , Young Adult , United States , Military Medicine/methods
20.
Am J Transplant ; 24(9): 1567-1572, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38729612

ABSTRACT

Liver transplantation is lifesaving for patients with end-stage liver disease. Similarly, gender-affirming hormone therapy (GAHT) can be lifesaving for transgender and gender diverse (TGGD) patients who experience gender dysphoria. However, management of such hormone therapy during the perioperative period is poorly characterized and lacks clear guidelines. Profound strides can be made in improving care for TGGD patients through gender-affirming care and appropriate management of GAHT in liver transplantation. In this article, we call on the transplant community to acknowledge the integral role of GAHT in the care of TGGD liver transplant candidates and recipients. We review the current literature and describe how the transplant community is ethically obligated to address this health care gap. We suggest tangible steps that clinicians may take to improve health outcomes for this minoritized patient population.


Subject(s)
Liver Transplantation , Transgender Persons , Female , Humans , Male , End Stage Liver Disease/surgery , Gender Dysphoria/drug therapy , Hormone Replacement Therapy/adverse effects , Hormone Replacement Therapy/ethics , Hormone Replacement Therapy/methods , Hormone Replacement Therapy/standards , Liver Transplantation/adverse effects , Liver Transplantation/ethics , Liver Transplantation/methods , Liver Transplantation/standards