ABSTRACT
BACKGROUND: Trauma systems save lives by coordinating timely and effective responses to injury. However, trauma system effectiveness varies geographically, with worse outcomes observed in rural settings. Prior data suggest that undertriage may play a role in this disparity. Our aim was to explore potential driving factors for decision making among clinicians for undertriaged trauma patients. METHODS: We performed a retrospective analysis of the National Emergency Medical Services Information System database among patients who met physiologic or anatomic national field triage guideline criteria for transport to the highest level of trauma center. Undertriage was defined as transport to a non-level I/II trauma center. Multivariable logistic regression was used to determine demographic, injury, and system characteristics associated with undertriage. Undertriaged patients were then categorized into "recognized" and "unrecognized" groups using the documented reason for transport destination to identify underlying factors associated with undertriage. RESULTS: A total of 36,094 patients were analyzed. Patients in urban areas were more likely to be transported to a destination based on protocol rather than the closest available facility. As expected, patients injured in urban regions were less likely to be undertriaged than their suburban (adjusted odds ratio [aOR], 2.69; 95% confidence interval [95% CI], 2.21-3.31), rural (aOR, 2.71; 95% CI, 2.28-3.21), and wilderness counterparts (aOR, 3.99; 95% CI, 2.93-5.45). The strongest predictor of undertriage was patient/family choice (aOR, 6.29; 95% CI, 5.28-7.50), followed by closest facility (aOR, 5.49; 95% CI, 4.91-6.13) as the reason for hospital selection. Nonurban settings had over twice the odds of recognizing the presence of triage criteria among undertriaged patients (p < 0.05). CONCLUSION: Patients with injuries in nonurban settings and those with less apparent causes of severe injury are more likely to experience undertriage.
By analyzing how prehospital clinicians choose transport destinations, we identified patient and system factors associated with undertriage. Targeting these at-risk demographics and contributing factors may help alleviate regional disparities in undertriage. LEVEL OF EVIDENCE: Diagnostic; Level IV.
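Adjusted odds ratios like those reported above come from regression models, but the unadjusted version of the calculation is simple. A minimal sketch, using made-up counts rather than the study's data, of an odds ratio and its Wald 95% confidence interval from a 2x2 table:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Made-up counts: 40/100 undertriaged in one setting vs. 20/100 in another.
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 2.67 (95% CI 1.42-5.02)
```

Multivariable models produce the same kind of estimate while adjusting for covariates, which is why the abstracts report adjusted odds ratios (aOR) rather than this crude calculation.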
ABSTRACT
OBJECTIVE: Evaluate the association of survival with helicopter transport directly to a trauma center compared with ground transport to a non-trauma center (NTC) and subsequent transfer. SUMMARY BACKGROUND DATA: Helicopter transport improves survival after injury. One potential mechanism is direct transport to a trauma center when the patient would otherwise be transported to an NTC for subsequent transfer. METHODS: Scene patients 16 years and above with positive physiologic or anatomic triage criteria within the Pennsylvania Trauma Outcomes Study (PTOS), 2000-2017, were included. Patients transported directly to level I/II trauma centers by helicopter were compared with patients initially transported to an NTC by ground with a subsequent helicopter transfer to a level I/II trauma center. Propensity score matching was used to evaluate the association between direct helicopter transport and survival. Individual triage criteria were evaluated to identify patients most likely to benefit from direct helicopter transport. RESULTS: In all, 36,830 patients were included. Direct helicopter transport was associated with a nearly threefold increase in the odds of survival compared with NTC ground transport and subsequent transfer by helicopter (aOR 2.78; 95% CI 2.24-3.44, P < 0.01). Triage criteria identifying patients with a survival benefit from direct helicopter transport included GCS ≤ 13 (1.71; 1.22-2.41, P < 0.01), hypotension (2.56; 1.39-4.71, P < 0.01), abnormal respiratory rate (2.30; 1.36-3.89, P < 0.01), paralysis (8.01; 2.03-31.69, P < 0.01), hemothorax/pneumothorax (2.34; 1.36-4.05, P < 0.01), and multisystem trauma (2.29; 1.08-4.84, P = 0.03). CONCLUSIONS: Direct trauma center access is a mechanism driving the survival benefit of helicopter transport. First responders should consider helicopter transport for patients meeting these criteria who would otherwise be transported to an NTC.
Subjects
Air Ambulances, Emergency Medical Services, Wounds and Injuries, Humans, Retrospective Studies, Aircraft, Triage, Trauma Centers, Injury Severity Score, Wounds and Injuries/therapy
ABSTRACT
BACKGROUND: Hemorrhage is the leading cause of preventable death after injury. Others have shown that delays in massive transfusion cooler arrival increase mortality, while prehospital blood product resuscitation can reduce mortality. Our objective was to evaluate whether time to resuscitation initiation impacts mortality. METHODS: We combined data from the Prehospital Air Medical Plasma (PAMPer) trial, in which patients received prehospital plasma or standard care, and the Study of Tranexamic Acid during Air and ground Medical Prehospital transport (STAAMP) trial, in which patients received prehospital tranexamic acid or placebo. We defined time to early resuscitative intervention (TERI) as the time from emergency medical services arrival to initiation of packed red blood cells, plasma, or tranexamic acid in the field or within 90 minutes of trauma center arrival. For patients not receiving an early resuscitative intervention, TERI was calculated based on trauma center arrival, the earliest opportunity to receive a resuscitative intervention; these patients were propensity matched to those who received an intervention to account for selection bias. Mixed-effects logistic regression assessed the association of 30-day and 24-hour mortality with TERI, adjusting for confounders. We also evaluated a subgroup of only patients receiving an early resuscitative intervention as defined above. RESULTS: Among the 1,504 propensity-matched patients, every 1-minute delay in TERI was associated with a 2% increase in the odds of 30-day mortality (adjusted odds ratio [aOR], 1.020; 95% confidence interval [CI], 1.006-1.033; p < 0.01) and a 1.5% increase in the odds of 24-hour mortality (aOR, 1.015; 95% CI, 1.001-1.029; p = 0.03). Among the 799 patients receiving an early resuscitative intervention, every 1-minute increase in TERI was associated with a 2% increase in the odds of 30-day mortality (aOR, 1.021; 95% CI, 1.005-1.038; p = 0.01) and 24-hour mortality (aOR, 1.023; 95% CI, 1.005-1.042; p = 0.01).
CONCLUSION: Time to early resuscitative intervention is associated with mortality in trauma patients with hemorrhagic shock. Bleeding patients need resuscitation initiated early, whether at the trauma center in systems with short prehospital times or in the field when prehospital time is prolonged. LEVEL OF EVIDENCE: Therapeutic/Care Management; Level III.
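The per-minute odds ratios reported above compound multiplicatively on the odds scale; this is the usual reading of a logistic-regression coefficient for a continuous exposure. A small sketch, assuming the reported point estimate of 1.020 per minute, shows what longer delays imply:

```python
# Sketch: compounding a per-minute adjusted odds ratio over a longer delay.
# The 1.020/min figure is the abstract's point estimate for 30-day mortality;
# treating it as multiplicative per minute is a modeling assumption.

def cumulative_odds_ratio(per_minute_aor: float, minutes: float) -> float:
    """Odds ratio accumulated over `minutes` of delay."""
    return per_minute_aor ** minutes

print(round(cumulative_odds_ratio(1.020, 10), 2))  # 1.22 -> ~22% higher odds
print(round(cumulative_odds_ratio(1.020, 30), 2))  # 1.81 -> ~81% higher odds
```

This extrapolation assumes the per-minute effect is constant across the range of delays, which the abstract does not test; it is shown only to make the scale of the estimate concrete.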
Subjects
Emergency Medical Services, Hemorrhagic Shock, Tranexamic Acid, Wounds and Injuries, Humans, Blood Transfusion, Hemorrhage/therapy, Hemorrhage/complications, Resuscitation/adverse effects, Hemorrhagic Shock/etiology, Tranexamic Acid/therapeutic use, Wounds and Injuries/complications, Wounds and Injuries/therapy
ABSTRACT
Importance: Prehospital needle decompression (PHND) is a rare but potentially life-saving procedure. Prior studies on chest decompression in trauma patients have been small, limited to single institutions or emergency medical services (EMS) agencies, and lacked appropriate comparator groups, making the effectiveness of this intervention uncertain. Objective: To determine the association of PHND with early mortality in patients requiring emergent chest decompression. Design, Setting, and Participants: This was a retrospective cohort study conducted from January 1, 2000, to March 18, 2020, using the Pennsylvania Trauma Outcomes Study database. Patients older than 15 years who were transported from the scene of injury were included in the analysis. Data were analyzed between April 28, 2021, and September 18, 2021. Exposures: PHND; patients without PHND who underwent tube thoracostomy within 15 minutes of arrival at the trauma center served as the comparison group that may have benefited from PHND. Main Outcomes and Measures: Mixed-effects logistic regression was used to determine the variability in PHND between patient and EMS agency factors, as well as the association between risk-adjusted 24-hour mortality and PHND, accounting for clustering by center and year. Propensity score matching, instrumental variable analysis using EMS agency-level PHND proportion, and several sensitivity analyses were performed to address potential bias. Results: A total of 8469 patients were included in this study; 1337 patients (15.8%) had PHND (median [IQR] age, 37 [25-52] years; 1096 male patients [82.0%]), and 7132 patients (84.2%) had emergent tube thoracostomy (median [IQR] age, 32 [23-48] years; 6083 male patients [85.3%]). PHND rates were stable over the study period between 0.2% and 0.5%. Patient factors accounted for 43% of the variation in PHND rates, whereas EMS agency accounted for 57% of the variation.
PHND was associated with a 25% decrease in odds of 24-hour mortality (odds ratio [OR], 0.75; 95% CI, 0.61-0.94; P = .01). Similar results were found when the analysis was restricted to patients who survived their ED stay (OR, 0.68; 95% CI, 0.52-0.89; P < .01), when patients with severe traumatic brain injury were excluded (OR, 0.65; 95% CI, 0.45-0.95; P = .03), and when restricted to patients with severe chest injury (OR, 0.72; 95% CI, 0.55-0.93; P = .01). PHND was also associated with lower odds of 24-hour mortality after propensity matching (OR, 0.79; 95% CI, 0.62-0.98; P = .04), when restricting matches to the same EMS agency (OR, 0.74; 95% CI, 0.56-0.99; P = .04), and in instrumental variable probit regression (coefficient, -0.60; 95% CI, -1.04 to -0.16; P < .01). Conclusions and Relevance: In this cohort study, PHND was associated with lower 24-hour mortality compared with emergent trauma center chest tube placement in trauma patients. Although performed rarely, PHND can be a life-saving intervention and should be reinforced in EMS education for appropriately selected trauma patients.
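Propensity score matching, as used here, pairs each PHND patient with a comparison patient of similar estimated probability of receiving PHND. A minimal greedy 1:1 nearest-neighbor sketch on hypothetical precomputed scores (the study's actual matching algorithm and covariates are not specified in the abstract):

```python
# Greedy 1:1 nearest-neighbor matching on propensity scores within a caliper.
# Scores here are hypothetical; in the study they would come from a model of
# PHND receipt on patient and injury characteristics.

def greedy_match(treated, control, caliper=0.05):
    """Pair each treated id with the closest unused control id within caliper."""
    pairs = []
    available = dict(control)  # copy so used controls can be removed
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # match without replacement
    return pairs

treated = {"t1": 0.30, "t2": 0.55}
control = {"c1": 0.32, "c2": 0.54, "c3": 0.90}
print(greedy_match(treated, control))  # [('t1', 'c1'), ('t2', 'c2')]
```

Outcomes are then compared within the matched pairs, which balances measured confounders; the instrumental variable analysis in the abstract additionally targets unmeasured confounding.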
Subjects
Emergency Medical Services, Adult, Cohort Studies, Decompression, Emergency Medical Services/methods, Humans, Male, Retrospective Studies, Trauma Centers
ABSTRACT
BACKGROUND: Growing evidence supports improved survival with prehospital blood products. Recent trials show a benefit of prehospital tranexamic acid (TXA) administration in select subgroups. Our objective was to determine if receiving prehospital packed red blood cells (pRBC) in addition to TXA improved survival in injured patients at risk of hemorrhage. METHODS: We performed a secondary analysis of all scene patients from the Study of Tranexamic Acid during Air and ground Medical Prehospital transport trial. Patients were randomized to prehospital TXA or placebo. Some participating EMS services utilized pRBC. Four resuscitation groups resulted: TXA, pRBC, pRBC+TXA, and neither. Our primary outcome was 30-day mortality and secondary outcome was 24-hour mortality. Cox regression tested the association between resuscitation group and mortality while adjusting for confounders. RESULTS: A total of 763 patients were included. Patients receiving prehospital blood had higher Injury Severity Scores in the pRBC (22 [10, 34]) and pRBC+TXA (22 [17, 36]) groups than the TXA (12 [5, 21]) and neither (10 [4, 20]) groups (p < 0.01). Unadjusted 30-day mortality was higher in the pRBC+TXA (18.2%) and pRBC (28.6%) groups than in the TXA only (6.6%) and neither (7.4%) groups. Resuscitation with pRBC+TXA was associated with a 35% reduction in relative hazards of 30-day mortality compared with neither (hazard ratio, 0.65; 95% confidence interval, 0.45-0.94; p = 0.02). No survival benefit was observed in 24-hour mortality for pRBC+TXA, but pRBC alone was associated with a 61% reduction in relative hazards of 24-hour mortality compared with neither (hazard ratio, 0.39; 95% confidence interval, 0.17-0.88; p = 0.02). CONCLUSION: For injured patients at risk of hemorrhage, prehospital pRBC+TXA is associated with reduced 30-day mortality. Use of pRBC transfusion alone was associated with a reduction in early mortality.
Potential synergy appeared only in longer-term mortality and further work to investigate mechanisms of this therapeutic benefit is needed to optimize the prehospital resuscitation of trauma patients. LEVEL OF EVIDENCE: Therapeutic/Care Management; Level III.
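A hazard ratio like the 0.65 reported for pRBC+TXA translates into survival probabilities only under model assumptions. A toy sketch assuming a constant (exponential) hazard and a hypothetical 90% baseline 30-day survival, purely to make the scale of the estimate concrete:

```python
import math

def exp_survival(base_hazard: float, hr: float, t: float) -> float:
    """Survival at time t under a constant baseline hazard scaled by an HR."""
    return math.exp(-base_hazard * hr * t)

# Hypothetical per-day hazard chosen so reference-group 30-day survival is 90%:
base = -math.log(0.90) / 30
print(round(exp_survival(base, 1.00, 30), 3))  # 0.9   (reference group)
print(round(exp_survival(base, 0.65, 30), 3))  # 0.934 (HR 0.65, as for pRBC+TXA)
```

The study's Cox model makes no constant-hazard assumption; this sketch only illustrates that an HR below 1 shifts the whole survival curve upward, with the absolute gain depending on baseline risk.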
Subjects
Antifibrinolytic Agents, Emergency Medical Services, Tranexamic Acid, Antifibrinolytic Agents/therapeutic use, Blood Transfusion, Hemorrhage/drug therapy, Hemorrhage/therapy, Humans, Tranexamic Acid/therapeutic use
ABSTRACT
BACKGROUND: Social determinants of health (SDOH) impact patient outcomes in trauma. Census data are often used to account for SDOH; however, there is no consensus on which variables are most important. Social vulnerability indices offer the advantage of combining multiple constructs into a single variable. Our objective was to determine if incorporation of SDOH in patient-level risk-adjusted outcome modeling improved predictive performance. METHODS: We evaluated two social vulnerability indices at the zip code level: the Distressed Community Index (DCI) and the National Risk Index (NRI). Individual variable combinations from the Agency for Healthcare Research and Quality's SDOH data set were used for comparison. Patients were obtained from the Pennsylvania Trauma Outcomes Study 2000 to 2020. These measures were added to a validated base mortality prediction model with comparison of area under the curve and Bayesian information criterion. We performed center benchmarking using risk-standardized mortality ratios to evaluate change in rank and outlier status based on SDOH. Geospatial analysis identified geographic variation and autocorrelation. RESULTS: There were 449,541 patients included. The DCI and NRI were each associated with an increase in mortality (adjusted odds ratio, 1.02; 95% confidence interval, 1.01-1.03 per 10-percentile-rank increase; p < 0.01 for both). The DCI, NRI, and seven Agency for Healthcare Research and Quality variables also improved base model fit, but discrimination was similar. Two thirds of centers changed mortality ranking when accounting for SDOH compared with the base model alone. Outlier status changed in 7% of centers, most representing an improvement from worse-than-expected to nonoutlier or from nonoutlier to better-than-expected. There was significant geographic variation and autocorrelation of the DCI and NRI (DCI: Moran's I, 0.62, p = 0.01; NRI: Moran's I, 0.34, p = 0.01).
CONCLUSION: Social determinants of health are associated with an individual patient's risk of mortality after injury. Accounting for SDOH may be important in risk adjustment for trauma center benchmarking. LEVEL OF EVIDENCE: Prognostic/Epidemiologic, level IV.
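Risk-standardized center benchmarking of the kind described above is typically built on observed-to-expected ratios. A simplified sketch with hypothetical numbers, not the study's actual model:

```python
def risk_standardized_ratio(observed_deaths: int, expected_probs) -> float:
    """Observed-to-expected mortality ratio for one center; a value below 1
    means the center had fewer deaths than its case mix predicts."""
    return observed_deaths / sum(expected_probs)

# A center with 3 deaths among patients whose model-predicted risks sum to 4.0
# performs better than expected:
print(round(risk_standardized_ratio(3, [0.5, 0.9, 0.8, 0.7, 0.6, 0.5]), 2))  # 0.75
```

Because the expected counts come from the risk-adjustment model, adding SDOH to that model changes each center's denominator, which is why two thirds of centers changed rank in the study.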
Subjects
Social Determinants of Health, Wounds and Injuries/mortality, Adult, Aged, Bayes Theorem, Female, Humans, Male, Middle Aged, Pennsylvania, Retrospective Studies, Trauma Centers
ABSTRACT
BACKGROUND: The National Field Triage Guidelines were created to inform triage decisions by emergency medical services (EMS) providers and include eight anatomic injuries that prompt transportation to a Level I/II trauma center. It is unclear how accurately EMS providers recognize these injuries. Our objective was to compare EMS-identified anatomic triage criteria with International Classification of Diseases-10th revision (ICD-10) coding of these criteria, as well as their association with trauma center need (TCN). METHODS: Scene patients 16 years and older in the National Trauma Data Bank (NTDB) during 2017 were included. National Field Triage Guidelines anatomic criteria were classified based on EMS documentation and ICD-10 diagnosis codes. The primary outcome was TCN, a composite of Injury Severity Score greater than 15, intensive care unit admission, urgent surgery, or emergency department death. Prevalence of anatomic criteria and their association with TCN were compared in EMS-identified versus ICD-10-coded criteria. Diagnostic performance to predict TCN was compared. RESULTS: There were 669,795 patients analyzed. The ICD-10 coding demonstrated a greater prevalence of injury detection. EMS-identified versus ICD-10-coded anatomic criteria were less sensitive (31% vs. 59%) but more specific (91% vs. 73%) and accurate (71% vs. 68%) for predicting TCN. EMS providers demonstrated a marked reduction in false positives (9% vs. 27%) but higher rates of false negatives (69% vs. 42%) in predicting TCN from anatomic criteria. Odds of TCN were significantly greater for EMS-identified criteria (adjusted odds ratio, 4.5; 95% confidence interval, 4.46-4.58) versus ICD-10 coding (adjusted odds ratio, 3.7; 95% confidence interval, 3.71-3.79). Of EMS-identified injuries, penetrating injury, flail chest, and two or more proximal long bone fractures were associated with greater TCN than ICD-10 coding.
CONCLUSION: When evaluating the anatomic criteria, EMS demonstrate greater specificity and accuracy in predicting TCN, as well as reduced false positives compared with ICD-10 coding. Emergency medical services identification is less sensitive for anatomic criteria; however, EMS identify the most clinically significant injuries. Further study is warranted to identify the most clinically important anatomic triage criteria to improve our triage protocols. LEVEL OF EVIDENCE: Care management, Level IV; Prognostic, Level III.
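The diagnostic metrics above follow directly from a 2x2 confusion matrix of identified criteria against trauma center need. A minimal sketch, with illustrative counts chosen only to reproduce the reported EMS percentages (sensitivity 31%, specificity 91%, accuracy 71%), not drawn from the study's data:

```python
def triage_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix
    of identified criteria (test) against trauma center need (truth)."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # criteria found among those with TCN
        "specificity": tn / (tn + fp),   # criteria absent among those without TCN
        "accuracy": (tp + tn) / total,
    }

# Illustrative counts: 1,000 patients with TCN and 2,000 without.
m = triage_metrics(tp=310, fp=180, fn=690, tn=1820)
print(m)  # sensitivity 0.31, specificity 0.91, accuracy 0.71
```

The trade-off the abstract describes is visible here: raising sensitivity (fewer missed injuries) generally comes at the cost of more false positives, i.e., lower specificity.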
Subjects
Clinical Coding/statistics & numerical data, Emergency Medical Services/statistics & numerical data, Trauma Centers/statistics & numerical data, Triage/statistics & numerical data, Wounds and Injuries/diagnosis, Adult, Aged, Clinical Coding/standards, Emergency Medical Services/standards, Female, Humans, Male, Middle Aged, Practice Guidelines as Topic, Predictive Value of Tests, Retrospective Studies, Trauma Centers/standards, Trauma Severity Indices, Triage/standards
ABSTRACT
BACKGROUND AND AIMS: Liver transplantation is the most effective treatment for end-stage liver disease (ESLD). Whether moderately macrosteatotic livers (30%-60%) represent a risk for worsened graft function is controversial. The uncertainty, in large part, is owing to heterogeneous steatosis grading. Our aim was to determine the short- and long-term outcomes of moderately macrosteatotic allografts that were graded according to a standardized institutional protocol. METHODS: We performed a retrospective analysis of transplants performed between 1994 and 2014. All patients with allografts biopsied pretransplantation were included. Relevant donor and recipient variables were recorded. Moderately macrosteatotic livers were compared with mildly macrosteatotic and nonsteatotic livers. Primary outcomes of interest were patient survival at 90 days, 1 year, and 5 years. Cox regression analyses were carried out to compare survival between the 2 groups. RESULTS: We compared 65 allografts with moderate macrosteatosis and 810 with no or mild macrosteatosis. Patients with moderately macrosteatotic allografts were 2.69 times as likely to die within the first 90 days after transplant (75.1% vs 91.6% survival) after adjusting for donor age, donor race, recipient age, recipient race, recipient body mass index, recipient diabetes, presence of hepatocellular carcinoma, days on waitlist, Model for End-Stage Liver Disease (MELD) score at transplantation, and cold ischemia time. However, for recipients who survived 90 days, moderately macrosteatotic allografts had comparable long-term survival. CONCLUSION: Our study shows that moderate macrosteatosis is a strong predictor of early but not late mortality. Further studies are needed to distinguish the specific cohort of patients for whom moderately macrosteatotic allografts will lead to acceptable outcomes.
Subjects
End-Stage Liver Disease/mortality, Fatty Liver/pathology, Liver Transplantation, Adult, Aged, Body Mass Index, End-Stage Liver Disease/surgery, Female, Humans, Kaplan-Meier Estimate, Liver/pathology, Liver Transplantation/methods, Male, Middle Aged, Proportional Hazards Models, Retrospective Studies, Severity of Illness Index, Homologous Transplantation, Treatment Outcome
ABSTRACT
BACKGROUND: Social vulnerability indices were created to measure resiliency to environmental disasters based on socioeconomic and population characteristics of discrete geographic regions. They are composed of multiple validated constructs that can also potentially identify geographically vulnerable populations after injury. Our objective was to determine if these indices correlate with injury fatality rates in the US. METHODS: We evaluated three social vulnerability indices: the Hazards & Vulnerability Research Institute's Social Vulnerability Index (SoVI), the Centers for Disease Control and Prevention's Social Vulnerability Index (SVI), and the Economic Innovation Group's Distressed Community Index (DCI). We analyzed SVI subindices and common individual census variables as indicators of socioeconomic status. Outcomes included age-adjusted county-level overall, firearm, and motor vehicle collision deaths per 100,000 population. Linear regression determined the association of injury fatality rates with the SoVI, SVI, and DCI. Bivariate choropleth mapping identified geographic variation and spatial autocorrelation of overall fatality, SoVI, and DCI. RESULTS: A total of 3,137 US counties were included. Only 24.6% of counties fell into the same vulnerability quintile for all three indices. Despite this, all indices were associated with increasing fatality rates for overall, firearm, and motor vehicle collision fatality. The DCI performed best by model fit, explanation of variance, and diagnostic performance on overall injury fatality. There is significant geographic variation in SoVI, DCI, and injury fatality rates at the county level across the United States, with moderate spatial autocorrelation of SoVI (Moran's I, 0.35; p < 0.01) and high autocorrelation of injury fatality rates (Moran's I, 0.77; p < 0.01) and DCI (Moran's I, 0.53; p < 0.01).
CONCLUSION: While the indices contribute unique information, higher social vulnerability is associated with higher injury fatality across all indices. These indices may be useful in the epidemiologic and geographic assessment of injury-related fatality rates. Further study is warranted to determine if these indices outperform traditional measures of socioeconomic status and related constructs used in trauma research. LEVEL OF EVIDENCE: Epidemiological, level IV.
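Moran's I, used above to quantify spatial autocorrelation, measures whether similar values cluster geographically (values near +1 indicate clustering, near 0 spatial randomness, near -1 dispersion). A pure-Python sketch on toy county data, not the study's data:

```python
def morans_i(values, weights):
    """Moran's I for values x_i under spatial weight matrix weights[i][j]."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(sum(row) for row in weights)      # sum of all weights
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))  # cross-products of neighbors
    den = sum(d * d for d in dev)                   # total variance term
    return (n / w_total) * (num / den)

# Four counties on a line (adjacent counties are neighbors); high fatality
# rates cluster on one end:
vals = [1, 1, 5, 5]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(vals, w), 3))  # 0.333 -> positive spatial autocorrelation
```

Real analyses define the weight matrix from county adjacency or distance and assess significance by permutation; libraries such as PySAL implement both.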
Subjects
Traffic Accidents/mortality, Social Class, Vulnerable Populations, Wounds and Injuries/mortality, Gunshot Wounds/mortality, Aged, Aged 80 and Over, Female, Geographic Mapping, Humans, Linear Models, Male, Middle Aged, Spatial Analysis, United States/epidemiology
ABSTRACT
BACKGROUND: Despite evidence of benefit after injury, helicopter emergency medical services (HEMS) overtriage remains high. Scene and transfer overtriage are distinct processes. Our objectives were to identify geographic variation in overtriage and patient-level predictors, and determine if overtriage impacts population-level outcomes. METHODS: Patients 16 years or older undergoing scene or interfacility HEMS in the Pennsylvania Trauma Outcomes Study were included. Overtriage was defined as discharge within 24 hours of arrival. Patients were mapped to zip code, and rates of overtriage were calculated. Hot spot analysis identified regions of high and low overtriage. Mixed-effects logistic regression determined patient predictors of overtriage. High and low overtriage regions were compared for population-level injury fatality rates. Analyses were performed for scene and transfer patients separately. RESULTS: A total of 85,572 patients were included (37.4% transfers). Overtriage was 5.5% among scene and 11.8% among transfer HEMS (p < 0.01). Hot spot analysis demonstrated geographic variation in high and low overtriage for scene and transfer patients. For scene patients, overtriage was associated with distance (odds ratio [OR], 1.03; 95% confidence interval [CI], 1.01-1.06 per 10 miles; p = 0.04), neck injury (OR, 1.27; 95% CI, 1.01-1.60; p = 0.04), and single-system injury (OR, 1.37; 95% CI, 1.15-1.64; p < 0.01). For transfer patients, overtriage was associated with rurality (OR, 1.64; 95% CI, 1.22-2.21; p < 0.01), facial injury (OR, 1.22; 95% CI, 1.03-1.44; p = 0.02), and single-system injury (OR, 1.35; 95% CI, 1.18-2.19; p < 0.01). For scene patients, high overtriage was associated with higher injury fatality rate (coefficient, 1.72; 95% CI, 1.68-1.76; p < 0.01); low overtriage was associated with lower injury fatality rate (coefficient, -0.73; 95% CI, -0.78 to -0.68; p < 0.01). 
For transfer patients, high overtriage was not associated with injury fatality rate (p = 0.53); low overtriage was associated with lower injury fatality rate (coefficient, -2.87; 95% CI, -4.59 to -1.16; p < 0.01). CONCLUSION: Geographic overtriage rates vary significantly for scene and transfer HEMS, and are associated with population-level outcomes. These findings can help guide targeted performance improvement initiatives to reduce HEMS overtriage. LEVEL OF EVIDENCE: Therapeutic, level IV.
Subjects
Air Ambulances/statistics & numerical data, Emergency Medical Services/organization & administration, Triage/statistics & numerical data, Wounds and Injuries/therapy, Adult, Aged, Aircraft, Female, Geographic Mapping, Humans, Injury Severity Score, Logistic Models, Male, Middle Aged, Pennsylvania, Retrospective Studies, Time Factors, Trauma Centers
ABSTRACT
BACKGROUND: There are well-known disparities for patients injured in rural versus urban settings. Many cite access to care; however, the mechanisms are not defined. One potential factor is differences in field triage. Our objective was to evaluate differences in prehospital undertriage (UT) between rural and urban settings. METHODS: Adult patients in the Pennsylvania Trauma Outcomes Study (PTOS) registry from 2000 to 2017 were included. Rural/urban setting was defined by county according to the Pennsylvania Trauma Systems Foundation. Rural/urban classification was performed for patients and centers. Undertriage was defined as patients meeting physiologic or anatomic triage criteria from the National Field Triage Guidelines who were not initially transported to a Level I or Level II trauma center. Logistic regression determined the association between UT and rural/urban setting, adjusting for transport distance and prehospital time. Models were expanded to evaluate the effect of individual triage criteria, trauma center setting, and transport mode on UT. RESULTS: There were 453,112 patients included (26% rural). Undertriage was higher in rural patients (8.6% vs. 3.4%, p < 0.01). Rural setting was associated with UT after adjusting for distance and prehospital time (odds ratio [OR], 3.52; 95% confidence interval [CI], 1.82-6.78; p < 0.01). Different triage criteria were associated with UT in rural and urban settings. Rural setting was associated with UT for patients transferred to an urban center (OR, 3.32; 95% CI, 1.75-6.25; p < 0.01), but not a rural center (OR, 0.68; 95% CI, 0.08-5.53; p = 0.72). Rural setting was associated with UT for ground (OR, 5.01; 95% CI, 2.65-9.46; p < 0.01) but not air transport (OR, 1.18; 95% CI, 0.54-2.55; p = 0.68). CONCLUSION: Undertriage is more common in rural settings. Specific triage criteria are associated with UT in rural settings.
Lack of a rural trauma center requiring transfer to an urban center is a risk factor for UT of rural patients. Air medical transport mitigated the risk of UT in rural patients. Provider and system interventions may help reduce UT in rural settings. LEVEL OF EVIDENCE: Care Management, Level IV.
Subjects
Healthcare Disparities, Triage/standards, Adult, Aged, Female, Humans, Male, Middle Aged, Pennsylvania, Rural Health, Trauma Centers, Urban Health
ABSTRACT
BACKGROUND/AIMS: Transfusion rates in colon cancer surgery are traditionally very high. Allogeneic red blood cell (RBC) transfusions are reported to induce immunomodulation that contributes to infectious morbidity and adverse oncologic outcomes. In an effort to attenuate these effects, the study institution implemented a universal leukocyte reduction protocol. The purpose of this study was to examine the impact of leukocyte-reduced (LR) transfusions on postoperative infectious complications, recurrence-free survival, and overall survival (OS). METHODS: We retrospectively studied patients with stage I-III adenocarcinoma of the colon who underwent elective resection from 2003 to 2010. The primary outcome measures were postoperative infectious complications, recurrence-free survival, and OS among patients who received a transfusion. Bivariate and multivariable regression analyses were performed for each endpoint. RESULTS: Of 294 patients, 66 (22%) received a LR RBC transfusion. After adjustment, transfusion of LR RBCs was found to be independently associated with increased infectious complications (OR 3.10, 95% CI 1.24-7.73), increased odds of cancer recurrence (hazard ratio [HR] 3.74, 95% CI 1.94-7.21), and reduced OS when ≥3 units were administered (HR 2.24, 95% CI 1.12-4.48). CONCLUSION: Transfusion of LR RBCs is associated with an increased risk of infectious complications and worsened survival after elective surgery for colon cancer, irrespective of leukocyte reduction.
Subjects
Adenocarcinoma/surgery, Colonic Neoplasms/surgery, Erythrocyte Transfusion/adverse effects, Local Neoplasm Recurrence/etiology, Postoperative Care/adverse effects, Surgical Wound Infection/etiology, Adenocarcinoma/mortality, Adult, Aged, Aged 80 and Over, Colonic Neoplasms/mortality, Erythrocyte Transfusion/methods, Female, Follow-Up Studies, Humans, Male, Middle Aged, Local Neoplasm Recurrence/epidemiology, Postoperative Care/methods, Retrospective Studies, Risk Factors, Surgical Wound Infection/epidemiology, Survival Analysis, Treatment Outcome
ABSTRACT
PURPOSE: Perioperative blood transfusions are costly and linked to adverse clinical outcomes. We investigated the factors associated with variation in blood transfusion utilization following upper gastrointestinal cancer resection and its association with infectious complications. METHODS: The Statewide Planning and Research Cooperative System was queried for elective esophagectomy, gastrectomy, and pancreatectomy for malignancy in New York State from 2001 to 2013. Bivariate and hierarchical logistic regression analyses were performed to assess the factors associated with receiving a perioperative allogeneic red blood cell transfusion. Additional multivariable analysis examined the relationship between transfusion and infectious complications. RESULTS: Among 14,875 patients who underwent upper GI cancer resection, 32% received a perioperative blood transfusion. After controlling for patient, surgeon, and hospital-level factors, significant variation in transfusion rates was present across both surgeons (p < 0.0001) and hospitals (p < 0.0001). Receipt of a blood transfusion was also independently associated with wound infection (OR = 1.68, 95% CI 1.47-1.91), pneumonia (OR = 1.98, 95% CI 1.74-2.26), and sepsis (OR = 2.49, 95% CI 2.11-2.94). CONCLUSION: Significant variation in perioperative blood transfusion utilization is present at both the surgeon and hospital level. These findings are unexplained by patient-level factors and other known hospital characteristics, suggesting that variation is due to provider preferences and/or lack of standardized transfusion protocols. Implementing institutional transfusion guidelines is necessary to limit unwarranted variation and reduce infectious complication rates.
Subjects
Erythrocyte Transfusion/statistics & numerical data , Esophagectomy , Gastrectomy , Gastrointestinal Neoplasms/surgery , Pancreatectomy , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , New York
ABSTRACT
BACKGROUND: High BMI is often used as a proxy for obesity and has been considered a risk factor for the development of an incisional hernia after abdominal surgery. However, BMI does not accurately reflect fat distribution. OBJECTIVE: The purpose of this work was to investigate the relationship among different obesity measurements and the risk of incisional hernia. DESIGN: This was a retrospective cohort study. SETTINGS: The study was conducted at a single academic institution in New York from 2003 to 2010. PATIENTS: The study included 193 patients who underwent colorectal cancer resection. MAIN OUTCOME MEASURES: Preoperative CT scans were used to measure visceral fat volume, subcutaneous fat volume, total fat volume, and waist circumference. A diagnosis of incisional hernia was made either through physical examination documented in the medical chart or by CT scan. RESULTS: Forty-one patients (21.2%) developed an incisional hernia. The median time to hernia was 12.4 months. After adjusting for patient and surgical characteristics using Cox regression analysis, visceral obesity (HR 2.04, 95% CI 1.07-3.91) and history of an inguinal hernia (HR 2.40, 95% CI 1.09-5.25) were significant risk factors for incisional hernia. Laparoscopic resection using a transverse extraction site led to a >75% reduction in the risk of incisional hernia (HR 0.23, 95% CI 0.07-0.76). BMI > 30 kg/m² was not significantly associated with incisional hernia development. LIMITATIONS: Limitations include the retrospective design without standardized follow-up to detect hernias and the small sample size attributable to inadequate or unavailable CT scans. CONCLUSIONS: Visceral obesity, history of inguinal hernia, and location of specimen extraction site are significantly associated with the development of an incisional hernia, whereas BMI is poorly associated with hernia development.
These findings suggest that a lateral transverse location is the incision site of choice and that new strategies, such as prophylactic mesh placement, should be considered in viscerally obese patients.
Subjects
Adenocarcinoma/surgery , Body Mass Index , Colorectal Neoplasms/surgery , Digestive System Surgical Procedures , Hernia, Ventral/epidemiology , Obesity, Abdominal/epidemiology , Postoperative Complications/epidemiology , Aged , Cohort Studies , Female , Humans , Laparoscopy , Linear Models , Male , Middle Aged , Obesity/epidemiology , Proportional Hazards Models , Retrospective Studies , Risk Factors
ABSTRACT
BACKGROUND: Surgical cases that include trainees are associated with worse outcomes in comparison with those that include attending surgeons alone. OBJECTIVE: This study aimed to identify whether resident involvement in partial colectomy was associated with worse outcomes when evaluated by surgical approach and resident experience. DESIGN: This is a retrospective study using the National Surgical Quality Improvement Program database. SETTINGS: This study evaluates cases included in the National Surgical Quality Improvement Program database. PATIENTS: All patients were included who underwent partial colectomy, including both open and laparoscopic approaches. INTERVENTIONS: Residents were involved. MAIN OUTCOME MEASURES: The primary outcome measures were the association of resident involvement with major complication events, minor complication events, unplanned return to the operating room, and operative time. RESULTS: Cases with residents were associated with major complications (OR 1.18, CI 1.09-1.27, p < 0.001) on multivariate analysis. However, after including operative time in the model, only open cases involving fifth-year residents were still associated with major complications (OR 1.13, p = 0.037). Resident involvement was associated with increased likelihood of minor complications (OR 1.3, p < 0.001) and an increased risk of unplanned return to the operating room (OR 1.20, p < 0.001). Operative time was longer for cases with residents, on average by 33.7 minutes for open and 27 minutes for laparoscopic cases. LIMITATIONS: This study was limited by its retrospective design and lack of data on teaching status, case complexity, and intraoperative evaluation of technique. CONCLUSIONS: Resident involvement in partial colectomies is associated with increased major complications, minor complications, likelihood of return to the operating room, and operative time.
Subjects
Clinical Competence , Colectomy , General Surgery/education , Internship and Residency , Outcome Assessment, Health Care , Aged , Colectomy/adverse effects , Colonic Diseases/epidemiology , Colonic Diseases/surgery , Comorbidity , Female , Humans , Male , Middle Aged , Multivariate Analysis , Postoperative Complications/epidemiology , Quality Improvement , Retrospective Studies
ABSTRACT
BACKGROUND: There is a paucity of quality data on the effects of chronic kidney disease in abdominal surgery. The aim of this study was to define the risk and outcome predictors of bowel resection in stage 5 chronic kidney disease using a large national clinical database. METHODS: The American College of Surgeons National Surgical Quality Improvement Program database was queried from years 2005-2010 for major bowel resection in dialysis-dependent patients. Patient demographics, preoperative risk factors, and intraoperative variables were evaluated. Primary endpoints were 30-day mortality and morbidity. Predictors of outcome were assessed by multivariate regression. RESULTS: The study included 1,685 patients with chronic kidney disease undergoing bowel resection. Overall mortality and morbidity were 27.5% and 58.3%, respectively. Acute presentation was the strongest predictor of mortality (OR 2.39, CI 1.54-3.72, p < 0.001). Other predictors of mortality included hypoalbuminemia (OR 2.12, CI 1.39-3.24, p < 0.001), pulmonary comorbidity (OR 2.25, CI 1.67-3.03, p < 0.001), and cardiac comorbidity (OR 1.54, CI 1.16-2.05, p = 0.003). CONCLUSION: This study demonstrates that bowel resection in patients with chronic kidney disease confers a high mortality risk. Preoperative optimization of comorbid conditions may reduce mortality after bowel resection in dialysis-dependent patients. In addition, laparoscopy was associated with a reduction in postoperative morbidity, suggesting that it should be used preferentially.
Subjects
Colectomy , Intestinal Diseases/surgery , Intestine, Large/surgery , Intestine, Small/surgery , Renal Insufficiency, Chronic/complications , Adult , Aged , Aged, 80 and over , Colectomy/mortality , Databases, Factual , Female , Humans , Intestinal Diseases/complications , Logistic Models , Male , Middle Aged , Multivariate Analysis , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Renal Dialysis , Renal Insufficiency, Chronic/mortality , Renal Insufficiency, Chronic/therapy , Retrospective Studies , Risk Assessment , Risk Factors , Treatment Outcome
ABSTRACT
INTRODUCTION: Compared to subcutaneous fat, visceral fat is more metabolically active, leading to chronic inflammation and tumorigenesis. The aim of this study is to describe the effect of visceral obesity on colorectal cancer outcomes using computed tomography (CT) imaging to measure visceral fat. MATERIALS AND METHODS: We conducted a retrospective chart review of patients who underwent surgical resection for colorectal cancer. Visceral fat volume was measured by preoperative CT scans. Final analysis was performed by stratifying patients based on oncologic stage. RESULTS: Two hundred nineteen patients met the inclusion criteria, 111 viscerally obese and 108 nonobese. Body mass index (BMI) weakly correlated with visceral fat volume measurements (R² = 0.304). Whereas obese patients had no difference in survival when obesity was categorized by BMI, categorizing based on visceral fat volume resulted in significant differences in stage II and stage III patients. In stage II cancer, viscerally obese patients had a nearly threefold decrease in disease-free survival (hazard ratio (HR) = 2.72; 95% confidence interval (CI) = 1.21, 6.10). In stage III cancer, viscerally obese patients had a longer time to recurrence (HR = 0.39; 95% CI = 0.16, 0.99). CONCLUSION: This study shows that viscerally obese patients with stage II colorectal cancer are at higher risk for poor outcomes and should be increasingly considered for adjuvant chemotherapy.
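Disease-free survival comparisons like those above are built on survival curves estimated from censored follow-up data. Below is a minimal Kaplan-Meier estimator in plain Python as a sketch of that machinery; the follow-up times and event indicators are hypothetical, not taken from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times: follow-up duration (e.g., months);
    events: 1 = event observed (recurrence/death), 0 = censored.
    Returns a list of (event_time, estimated S(t)) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Events and total subjects (events + censored) at this time point
        deaths = sum(e for tt, e in data if tt == t)
        n_tied = sum(1 for tt, _ in data if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk  # multiply in this time point's survival fraction
            curve.append((t, s))
        n_at_risk -= n_tied  # everyone at this time leaves the risk set
        i += n_tied
    return curve


# Hypothetical disease-free survival times in months
curve = kaplan_meier([5, 8, 8, 12, 16, 20], [1, 1, 0, 1, 0, 1])
```

A Cox model then compares the hazard underlying such curves between groups, yielding the hazard ratios reported in the abstract.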
Subjects
Adenocarcinoma/surgery , Body Mass Index , Colectomy , Colorectal Neoplasms/surgery , Intra-Abdominal Fat/diagnostic imaging , Obesity, Abdominal/complications , Rectum/surgery , Adenocarcinoma/complications , Adenocarcinoma/mortality , Adenocarcinoma/pathology , Aged , Colorectal Neoplasms/complications , Colorectal Neoplasms/mortality , Colorectal Neoplasms/pathology , Female , Humans , Linear Models , Male , Middle Aged , Neoplasm Staging , Obesity, Abdominal/diagnostic imaging , Preoperative Care , Retrospective Studies , Survival Analysis , Tomography, X-Ray Computed , Treatment Outcome , Waist Circumference
ABSTRACT
OBJECTIVE: The aim of this study was to delineate the impact of smoking on postoperative outcomes after colorectal resection for malignant and benign processes. BACKGROUND: Studies to date have implicated smoking as a risk factor for increased postoperative complications. However, there is a paucity of data on the effects of smoking after colorectal surgery, and in particular for malignant compared with benign processes. METHODS: The American College of Surgeons National Surgical Quality Improvement Program (2005-2010) database was queried for patients undergoing elective major colorectal resection for colorectal cancer, diverticular disease, or inflammatory bowel disease. Risk-adjusted 30-day outcomes were assessed and compared between patient cohorts identified as never-smokers, ex-smokers, and current smokers. Primary outcomes of incisional infections, infectious and major complications, and mortality were evaluated using regression modeling adjusting for patient characteristics and comorbidities. RESULTS: A total of 47,574 patients were identified, of which 26,333 had surgery for colorectal cancer, 14,019 for diverticular disease, and 7,222 for inflammatory bowel disease. More than 60% of patients had never smoked, 20.4% were current smokers, and 19.2% were ex-smokers. After adjustment, current smokers were at a significantly increased risk of postoperative morbidity [odds ratio (OR), 1.3; 95% confidence interval (CI), 1.21-1.40] and mortality (OR, 1.5; 95% CI, 1.11-1.94) after colorectal surgery. This finding persisted across malignant and benign diagnoses and also demonstrated a significant dose-dependent effect when stratifying by pack-years of smoking. CONCLUSIONS: Smoking increases the risk of complications after all types of major colorectal surgery, with the greatest risk apparent for current smokers. A concerted effort should be made toward promoting smoking cessation in all patients scheduled for elective colorectal surgery.
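One interpretive caveat for odds ratios like the OR of 1.3 above: when the outcome is common, an OR overstates the relative risk, so OR 1.3 does not mean "30% higher risk." The sketch below converts a baseline risk plus an odds ratio into the implied risk for the exposed group; the 20% baseline morbidity is an assumed figure for illustration only.

```python
def risk_from_or(baseline_risk, odds_ratio):
    """Implied exposed-group risk given a baseline risk and an odds ratio.

    Converts probability -> odds, scales the odds by the OR,
    then converts back to a probability.
    """
    odds = baseline_risk / (1 - baseline_risk)
    exposed_odds = odds * odds_ratio
    return exposed_odds / (1 + exposed_odds)


# Assumed 20% baseline morbidity and the reported OR of 1.3 for current smokers
p_exposed = risk_from_or(0.20, 1.3)
relative_risk = p_exposed / 0.20  # noticeably below 1.3 because the outcome is common
```

The gap between OR and relative risk widens as the baseline risk grows, which is why ORs for frequent outcomes (like overall morbidity) should be read with care.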
Subjects
Colectomy , Colorectal Neoplasms/surgery , Diverticulosis, Colonic/surgery , Inflammatory Bowel Diseases/surgery , Postoperative Complications/etiology , Rectum/surgery , Smoking/adverse effects , Aged , Colectomy/mortality , Colorectal Neoplasms/mortality , Databases, Factual , Diverticulosis, Colonic/mortality , Female , Humans , Inflammatory Bowel Diseases/mortality , Logistic Models , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Postoperative Complications/epidemiology , Postoperative Complications/mortality , Risk Adjustment , Risk Factors , Self Report , Treatment Outcome
ABSTRACT
PURPOSE: Laparoscopy is an increasingly prevalent choice for elective splenectomy, but documentation of its complications is inconsistent. This study examines 30-day postoperative outcomes after open (OS) and laparoscopic (LS) splenectomy. METHODS: Elective splenectomies were extracted from the National Surgical Quality Improvement Program database. Multivariate analysis identified factors associated with complications and an increased postoperative length of stay (LOS). RESULTS: There were a total of 1,583 splenectomies, with 991 (63.0%) laparoscopic cases. On univariate analysis, the LS group had fewer major (10.6% vs. 18.8%, P < 0.0001) and minor complications (2.6% vs. 7.1%, P < 0.0001). Adjusting for baseline differences, LS was not associated with an increase in major complications [odds ratio (OR), 0.76; 95% confidence interval, 0.54-1.08; P = 0.1255] but offered a decrease in minor complications (OR, 0.41; 95% confidence interval, 0.24-0.69; P = 0.0010), coupled with a decrease in postoperative LOS of 1.89 ± 0.30 days (P < 0.0001) compared with OS. CONCLUSIONS: After accounting for comorbidities and intraoperative factors, laparoscopy remains a safe choice for elective splenectomy, with fewer complications and shorter LOS.
Subjects
Databases, Factual/statistics & numerical data , Elective Surgical Procedures/methods , Laparoscopy , Quality Improvement , Splenectomy/methods , Female , Humans , Male , Middle Aged , Multivariate Analysis , Operative Time
ABSTRACT
INTRODUCTION: The dynamic helical hip system (DHHS; Synthes, Paoli, Pennsylvania) differs from the standard dynamic sliding hip screw (SHS) in that in preparing for its insertion, reaming of the femoral head is not performed, thereby preserving bone stock. It also requires less torque for insertion of the helical screw. The associated plate has locking options to allow locking screw fixation in the femoral shaft, thereby decreasing the chance of the plate pulling off. While biomechanical studies have shown improved resistance to cutout and increased rotational stability of the femoral head fragment when compared with traditional hip lag screws, there is limited information on the clinical outcomes of this implant in the literature. METHODS: We report a single-surgeon series of 87 patients who were treated for pertrochanteric hip fractures with this implant to evaluate their clinical outcomes and compare them with a cohort of 344 patients who were treated with the standard SHS. All data were prospectively collected, most as part of a structured Geriatric Fracture Care Program. RESULTS: The 2 groups were similar demographically and medically, with similar rates of in-hospital complications and implant failure. Failure in the DHHS group was attributable to use of the implant outside its indications and repeated falls by the patient. CONCLUSION: This limited case series showed that DHHS outcomes are comparable with those of the SHS. Whether there is any benefit to its use will require larger, prospective randomized controlled trials.