ABSTRACT
BACKGROUND: Laparoscopic sleeve gastrectomy (LSG) has gained popularity for the treatment of morbid obesity as gastric banding (BAND) has fallen out of favor. As a result, simultaneous conversion (CONV) of BAND to LSG is commonly performed. We hypothesized that CONV is associated with higher 30-day risk-adjusted serious morbidity. METHODS: Preoperative characteristics and 30-day outcomes from the American College of Surgeons National Surgical Quality Improvement Program Participant Use Files 2010-2014 were selected for patients who underwent LSG. Patients undergoing CONV were identified. Descriptive comparisons were performed using chi-square and Wilcoxon rank-sum tests as appropriate. Multivariable logistic regression was performed to assess the association between CONV and a composite measure of 30-day serious morbidity and mortality. RESULTS: Overall, 35,307 patients met criteria for inclusion, of whom 943 (2.7%) underwent CONV. The median age of patients undergoing CONV was higher (46 vs 44 years, p < 0.001), and a greater proportion of CONV patients were female (84.8 vs 77.9%, p < 0.001) than LSG patients. CONV patients had lower rates of common comorbidities, including diabetes (14.9 vs 23.1%, p < 0.001), hypertension (41.9 vs 48.6%, p < 0.001), and tobacco use (7.2 vs 9.8%, p < 0.001), as well as lower median BMI (41 vs 44, p < 0.001). Individual unadjusted outcomes of serious 30-day complications were similar between the groups, as was a composite measure of serious morbidity (CONV 4.3% vs LSG 3.6%, p = 0.1). However, after controlling for demographics, comorbidities, and concurrent band removal, CONV was associated with increased odds of serious 30-day morbidity (OR 1.44, 95% CI 1.03-1.97; c-statistic: 0.60). CONCLUSIONS: Serious morbidity following LSG is uncommon; however, CONV is associated with a modest increase in risk-adjusted adverse 30-day outcomes. Patients being evaluated for CONV should be counseled about the added risks versus LSG alone. 
Further research is warranted to identify whether the incremental risks of CONV may be modifiable.
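The adjusted odds ratio above comes from a multivariable model, but the unadjusted contrast can be recovered from the reported rates. A minimal sketch of an odds ratio with a Wald 95% confidence interval, using approximate event counts back-calculated from the abstract's percentages (4.3% of 943 CONV and 3.6% of the remaining 34,364 LSG patients; the counts are a reconstruction, not reported values):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald CI from a 2x2 table:
    a/b = events/non-events in exposed, c/d = in unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Approximate counts reconstructed from the reported rates
conv_events, conv_total = 41, 943        # ~4.3% of 943 CONV patients
lsg_events, lsg_total = 1237, 34364      # ~3.6% of 34,364 LSG patients
or_, lo, hi = odds_ratio_ci(conv_events, conv_total - conv_events,
                            lsg_events, lsg_total - lsg_events)
```

With these reconstructed counts, the unadjusted OR is roughly 1.2 with a CI crossing 1, consistent with the non-significant univariate comparison (p = 0.1); the significant adjusted OR of 1.44 reflects covariate adjustment that this sketch does not reproduce.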
Subjects
Gastrectomy/methods, Gastroplasty, Laparoscopy, Obesity, Morbid/surgery, Reoperation, Adult, Aged, Databases, Factual, Female, Humans, Logistic Models, Male, Middle Aged, Postoperative Complications/epidemiology, Postoperative Complications/etiology, Quality Improvement, Treatment Outcome, United States
ABSTRACT
Cyclin-dependent kinase 4/6 inhibitors are targeted therapies demonstrated to significantly improve overall survival as adjuvant treatment of estrogen receptor-positive breast cancers. Although intended to preferentially arrest cell cycle transitions in tumor cells, these agents can have undesirable systemic side effects, including hepatotoxicity. We report the first case of cyclin-dependent kinase 4/6 inhibitor therapy leading to acute-on-chronic liver failure requiring liver transplantation. Our case highlights the multidisciplinary approach required to manage acute-on-chronic liver failure induced by cancer-directed therapies in those with extrahepatic malignancies.
ABSTRACT
BACKGROUND: Frailty is prevalent in patients with end-stage liver disease and predicts waitlist mortality, posttransplant mortality, and frequency of hospitalizations. The Liver Frailty Index (LFI) is a validated measure of frailty in liver transplant (LT) candidates but requires an in-person assessment. METHODS: We studied the association between patient-reported physical function and LFI in a single-center prospective study of adult patients with cirrhosis undergoing LT evaluation from October 2020 to December 2021. Frailty was assessed with the LFI and 4-m gait speed. Patient-reported physical function was evaluated using a brief Patient-Reported Outcomes Measurement Information System (PROMIS) survey. RESULTS: Eighty-one LT candidates were enrolled, with a mean Model for End-Stage Liver Disease-Sodium (MELD-Na) score of 17.6 (±6.3). The mean LFI was 3.7 (±0.77; 15% frail and 59% prefrail) and the mean PROMIS Physical Function score was 45 (±8.6). PROMIS Physical Function correlated with LFI (r = -0.54, P < 0.001) and 4-m gait speed (r = 0.48, P < 0.001). The mean hospitalization rate was 1.1 days admitted per month. After adjusting for age, sex, and MELD-Na, patient-reported physical function predicted hospitalization rate (P = 0.001). CONCLUSIONS: This study suggests that a brief patient-reported outcome measure can be used to screen for frailty and predict hospitalizations in patients with cirrhosis.
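The r values reported above are plain Pearson correlations; a self-contained sketch of the computation (the data here are toy values, not the study's):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x))
    sy = math.sqrt(sum((yi - my) ** 2 for yi in y))
    return cov / (sx * sy)

# Higher PROMIS Physical Function tracking lower (better) LFI shows up
# as a negative r, as in the study (-0.54); toy example:
promis = [30, 40, 45, 50, 60]
lfi = [4.8, 4.1, 3.9, 3.4, 2.9]
r = pearson_r(promis, lfi)
```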
Subjects
End Stage Liver Disease, Frailty, Liver Transplantation, Adult, Humans, Liver Transplantation/adverse effects, End Stage Liver Disease/diagnosis, End Stage Liver Disease/surgery, Frailty/diagnosis, Prospective Studies, Liver Cirrhosis/diagnosis, Liver Cirrhosis/surgery, Hospitalization, Sodium
ABSTRACT
Background: Living kidney donation is possible for people living with HIV (PLWH) in the United States within research studies under the HIV Organ Policy Equity (HOPE) Act. There are concerns that donor nephrectomy may carry an increased risk of end-stage renal disease (ESRD) in PLWH due to HIV-associated kidney disease and antiretroviral therapy (ART) nephrotoxicity. Here we report the first 3 cases of living kidney donors with HIV under the HOPE Act in the United States. Methods: Within the HOPE in Action Multicenter Consortium, we conducted a prospective study of living kidney donors with HIV. Pre-donation, we estimated the 9-year cumulative incidence of ESRD, performed genetic testing of apolipoprotein L1 (APOL1), excluding individuals with high-risk variants, and performed pre-donation kidney biopsies (a HOPE Act requirement). The primary endpoint was ≥grade 3 nephrectomy-related adverse events (AEs) in year one. Post-donation, we monitored glomerular filtration rate (measured by iohexol/Tc-99m DTPA [mGFR] or estimated with serum creatinine [eGFR]), HIV RNA, CD4 count, and ART. Findings: There were three donors with two to four years of follow-up: a 35-year-old female, a 52-year-old male, and a 47-year-old male. Pre-donation 9-year estimated cumulative incidence of ESRD was 3.01, 8.01, and 7.76 per 10,000 persons, respectively. In two donors with APOL1 testing, no high-risk variants were detected. Biopsies from all three donors showed no kidney disease. Post-donation, two donors developed nephrectomy-related ≥grade 3 AEs: a medically-managed ileus and a laparoscopically-repaired incisional hernia. GFR declined from 103 to 84 mL/min/1.73 m² at four years (mGFR) in donor 1, from 77 to 52 mL/min/1.73 m² at three years (eGFR) in donor 2, and from 65 to 39 mL/min/1.73 m² at two years (eGFR) in donor 3. HIV RNA remained <20 copies/mL and CD4 count remained stable in all donors. 
Interpretation: The first three living kidney donors with HIV under the HOPE Act in the United States have had promising outcomes at two to four years, providing proof-of-concept to support living donation from PLWH to recipients with HIV. Funding: National Institute of Allergy and Infectious Diseases, National Institutes of Health.
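The GFR trajectories reported above imply annualized rates of decline that are simple arithmetic; a sketch (the annualized rates are derived here, not reported by the study):

```python
def annual_gfr_decline(baseline, followup, years):
    """Mean GFR decline per year, in mL/min/1.73 m^2."""
    return (baseline - followup) / years

declines = [
    annual_gfr_decline(103, 84, 4),  # donor 1, mGFR over four years
    annual_gfr_decline(77, 52, 3),   # donor 2, eGFR over three years
    annual_gfr_decline(65, 39, 2),   # donor 3, eGFR over two years
]
```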
ABSTRACT
Background: Fever is a common response to both infectious and non-infectious physiologic insults in the critically ill, and in certain populations it appears to be protective. Fever is particularly common in trauma patients, and even more so in those with infections. The relationship between fever, trauma status, and mortality in patients with an infection is unclear. Patients and Methods: A review of a prospectively maintained institutional database over a 17-year period was performed. Surgical and trauma intensive care unit (ICU) patients with a nosocomial infection were extracted to compare in-hospital mortality among trauma and non-trauma patients with and without fever. Univariable analyses compared patient and infection characteristics between trauma and non-trauma patients. A multivariable logistic regression model was created to identify predictors of in-hospital mortality, with a focus on fever and trauma status. Results: Nine hundred forty-one trauma patients and 1,449 non-trauma patients with ICU-acquired infections were identified. Trauma patients were younger (48 vs. 59, p < 0.001), more likely to be male (73% vs. 56%, p < 0.001), more likely to require blood transfusion (74% vs. 47%, p < 0.001), had lower Acute Physiology and Chronic Health Evaluation (APACHE) II scores (18 vs. 19, p = 0.02), and had lower rates of comorbidities. Trauma patients were more likely to develop a fever (72% vs. 43%, p < 0.001) and had lower in-hospital mortality (9.6% vs. 22.6%, p < 0.001). In multivariable analysis, non-trauma patients with fever had a lower odds of mortality compared with non-trauma patients without fever (odds ratio [OR] 0.63, p = 0.004). Trauma patients with fever had the lowest odds ratio for mortality when compared to non-trauma patients without fever (OR 0.25, p < 0.001). 
Conclusions: In this large cohort of trauma and surgical ICU patients with ICU-acquired infections, fever was associated with a lower odds of mortality in both trauma and non-trauma patients. Further investigation is needed to determine the mechanisms behind the interplay between trauma status, fever, and mortality.
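The univariable fever comparison above (72% vs. 43%) is a 2x2 chi-square test; a sketch using the shortcut formula for 2x2 tables, with counts back-calculated from the reported percentages (a reconstruction, not the study's exact cell counts):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Approximate counts: ~72% of 941 trauma and ~43% of 1,449 non-trauma
# patients febrile vs. not
stat = chi2_2x2(678, 263, 623, 826)
```

A statistic this large far exceeds 10.83, the critical value for p < 0.001 at one degree of freedom, consistent with the reported p-value.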
Subjects
Critical Illness, Intensive Care Units, APACHE, Critical Care, Female, Hospital Mortality, Humans, Male
ABSTRACT
Background: Scoring systems have been proposed to select donation after circulatory death (DCD) donors and recipients for liver transplantation (LT). We hypothesized that complex scoring systems derived in large datasets might not predict outcomes locally. Methods: Based on 1-year DCD-LT graft survival predictors in multivariable logistic regression models, we designed, validated, and compared a simple index using the University of California, San Francisco (UCSF) cohort (n = 136) and a universal-comprehensive (UC)-DCD score using the United Network for Organ Sharing (UNOS) cohort (n = 5,792) against previously published DCD scoring systems. Results: The total warm ischemia time (WIT) index included donor WIT (dWIT) and donor hepatectomy time (dHep). The UC-DCD score included dWIT, dHep, recipient mechanical ventilation, transjugular intrahepatic portosystemic shunt, cause of liver disease, model for end-stage liver disease score, body mass index, donor/recipient age, and cold ischemia time. In the UNOS cohort, the UC-DCD score outperformed all previously published scores in predicting DCD-LT graft survival (AUC: 0.635 vs. ≤0.562). In the UCSF cohort, the total WIT index successfully stratified survival and biliary complications, whereas the other scores did not. Conclusion: DCD risk scores generated in large cohorts provide general guidance for safe recipient/donor selection, but they must be tailored to non- or partially-modifiable local circumstances to expand DCD utilization.
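The AUCs compared above are concordance statistics: the probability that a randomly chosen failed graft received a higher risk score than a randomly chosen surviving one. A minimal pairwise implementation (toy scores, not the study's):

```python
def c_statistic(pos_scores, neg_scores):
    """AUC via pairwise concordance; ties count as half."""
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy risk scores for failed vs. surviving grafts
auc = c_statistic([0.9, 0.8], [0.7, 0.8, 0.1])
```

An AUC of 0.635, as reported for the UC-DCD score, therefore indicates only modest discrimination: roughly a 64% chance of correctly ranking a random failed/surviving pair.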
ABSTRACT
BACKGROUND: Although hospital systems have largely halted elective surgical practices in preparing their response to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic, transplantation remains an essential and lifesaving surgical practice. To continue transplantation while protecting immunocompromised patients and health care workers, significant restructuring of normal patient care practice habits is required. METHODS: This is a nonrandomized, descriptive study of the abdominal transplant program at 1 academic center (University of California, San Francisco) and the programmatic changes undertaken to safely continue transplantations. Patient transfers, fellow use, and patient discharge education were identified as key areas requiring significant reorganization. RESULTS: The University of California, San Francisco abdominal transplant program took an early and aggressive approach to restructuring inpatient workflows and health care worker staffing. The authors formalized a coronavirus disease 2019 (COVID-19) transfer system to address patients in need of services at their institution while minimizing the risk of SARS-CoV-2 in their transplant ward and used technological approaches to provide virtual telehealth where possible. They also modified their transplant fellow staffing and responsibilities to develop an adequate backup system in case of potential exposures. CONCLUSION: Every transplant program is unique, and an individualized plan to adapt and modify standard clinical practices will be required to continue providing essential transplantation services. The authors' experience highlights areas of attention specific to transplant programs and may provide generalizable solutions to support continued transplantation in the COVID-19 era.
Subjects
Coronavirus Infections, Pandemics, Pneumonia, Viral, Transplantation/standards, Workflow, Betacoronavirus, COVID-19, Humans, Patient Care/methods, Patient Care/standards, SARS-CoV-2, San Francisco, Transplantation/methods
ABSTRACT
PURPOSE: Laparoscopic sleeve gastrectomy (LSG) has rapidly gained popularity as a single-stage operation for the treatment of morbid obesity, as patients undergoing LSG have been shown to achieve similar weight loss and resolution of obesity-related comorbidities in comparison to those undergoing Roux-en-Y gastric bypass (RYGB), the "gold standard" bariatric operation. Although LSG poses fewer technical challenges than RYGB, little is known about differences in short-term outcomes among patients undergoing LSG and RYGB. We hypothesized that LSG is associated with lower 30-day risk-adjusted serious morbidity. METHODS: Preoperative characteristics and 30-day outcomes from the American College of Surgeons National Surgical Quality Improvement Program (ACSNSQIP) Participant Use Files (PUF) 2010-2014 were selected for all patients who underwent LSG or RYGB. Descriptive comparisons were performed using chi-square and Wilcoxon's rank-sum tests as appropriate. The primary outcome was a risk-adjusted composite measure of 30-day serious morbidity and mortality. RESULTS: We analyzed records for 47,982 (42.0%) and 66,380 (58.0%) patients undergoing LSG and RYGB, respectively. On univariate analysis, LSG patients had a lower rate of organ space infection (0.45% vs. 0.68%, p < 0.001), lower rate of bleeding requiring transfusions (1.00% vs. 1.60%, p < 0.001), lower rate of sepsis (0.34% vs. 0.49%, p < 0.001), and septic shock (0.12% vs. 0.22%, p < 0.001) and required fewer unplanned reoperations (1.34% vs. 2.56%, p < 0.001) than RYGB patients. Both groups had similar rates of deep venous thrombosis (0.33% vs. 0.28%, p = 0.15) and pulmonary embolism (0.17% vs. 0.21%, p = 0.15). Mortality was lower among LSG patients (0.09% vs. 0.14%, p = 0.01). On multivariate analysis, RYGB was associated with higher risk-adjusted 30-day serious morbidity than LSG (odds ratio 1.61; 95% CI 1.52-1.71, p < 0.001). 
Older age, female gender, higher BMI, and insulin-dependent diabetes were also associated with risk of serious morbidity (C-statistic = 0.60). CONCLUSION: Serious morbidity following bariatric surgery is uncommon; however, LSG may be associated with modest protection from adverse 30-day outcomes in comparison to RYGB. Our conclusion is limited by the difference in baseline risk factors of the populations studied.
Subjects
Gastrectomy, Gastric Bypass, Laparoscopy, Obesity, Morbid, Postoperative Complications/epidemiology, Gastrectomy/adverse effects, Gastrectomy/mortality, Gastrectomy/statistics & numerical data, Gastric Bypass/adverse effects, Gastric Bypass/mortality, Gastric Bypass/statistics & numerical data, Humans, Laparoscopy/adverse effects, Laparoscopy/mortality, Laparoscopy/statistics & numerical data, Male, Morbidity, Obesity, Morbid/epidemiology, Obesity, Morbid/surgery, Quality Improvement
ABSTRACT
BACKGROUND: Recent anti-microbial exposure has been associated with poor outcomes after infection in a mixed population. We hypothesized that recent anti-microbial exposure would be associated with poor outcomes of elective surgery. METHODS: From August 2015 to August 2016, all elective surgical patients were questioned prospectively about anti-microbial exposure during the prior three months. Multivariable models were used to calculate risk-adjusted odds ratios for anti-microbial exposure controlling for surgeon influence. Primary outcomes were any serious complication, any complication, any infection, and surgical site infection. Secondary outcomes were length of stay, C. difficile infection, and death. A separate analysis of patients excluding those having colorectal surgery who had undergone an oral antibiotic bowel preparation also was performed. RESULTS: Ninety-four percent of eligible patients (n = 1,538) answered the exposure question, with a three-month anti-microbial exposure rate of 34.1%. Colorectal surgery patients had the highest exposure rate, whereas hernia patients had the lowest. Exposed patients had higher rates of any complication, any infection, and surgical site infection, as well as a median two-day longer hospital stay. There were no differences in C. difficile infection or death between the groups. After risk adjustment, anti-microbial exposure was independently associated with any serious complication for all patients as well as with complications and infection in patients having an operation other than colorectal surgery. CONCLUSION: Recent anti-microbial exposure is associated with more complications of elective surgery. Anti-microbial drug-induced alterations in microbiome-related inflammatory responses may play a role, highlighting an opportunity for pre-surgical intervention in this at-risk population.
Subjects
Anti-Infective Agents/therapeutic use, Elective Surgical Procedures/adverse effects, Surgical Wound Infection/epidemiology, Adult, Aged, Aged, 80 and over, Clostridioides difficile/isolation & purification, Female, Humans, Length of Stay, Male, Middle Aged, Prospective Studies, Risk Factors, Surgical Wound Infection/microbiology, Surgical Wound Infection/mortality, Surveys and Questionnaires, Survival Analysis, Young Adult
ABSTRACT
OBJECTIVE: Many general surgery residents interrupt clinical training for research pursuits or advanced degrees during dedicated research time (DRT). We hypothesized that the time required to obtain a second degree during DRT decreases resident publication productivity. DESIGN, SETTING, AND PARTICIPANTS: All consecutive categorical general surgery residents at the University of Virginia in Charlottesville, VA, graduating in 2007 to 2016 were evaluated. PubMed queries identified journal publications for residents during and after DRT, limited to 1 year postgraduation. DRT varied between 1 and 3 years and was standardized by dividing publication number by DRT plus remaining clinical years and 1 postgraduation year. Median publications were compared between residents by receipt of a second degree. RESULTS: Thirty-six residents were eligible for analysis. Of these, 8 obtained a Master's in Clinical Research, 3 received a Master of Public Health, and 1 completed a Doctor of Philosophy. Publications ranged from 2 to 76 for degree residents and 1 to 36 for nondegree residents. For the 12 degree residents, median publication number per year was 3.8 (interquartile range: 2.3, 5.2) compared with 2.6 (interquartile range: 1.6, 3.5) in residents not pursuing a postdoctoral degree (p = 0.04). There was no significant difference in median number of first and second author publications by degree status. CONCLUSION: More publications per year were seen among residents earning a second degree, with a statistically significant difference between residents obtaining postdoctoral degrees during DRT compared with their counterparts. Our study demonstrates that residents pursuing a second degree are not hindered in their publication productivity despite the time investment required by the degree program. Additional research is needed to determine whether formal research training through a second degree corresponds to sustained scholarly productivity beyond residency.
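The standardization described above divides publication count by DRT plus remaining clinical years plus one postgraduation year; a sketch with hypothetical numbers (the resident in the example is illustrative, not from the study):

```python
def pubs_per_year(n_pubs, drt_years, remaining_clinical_years):
    """Publications per year, standardized over DRT plus remaining
    clinical training plus 1 postgraduation year, per the study design."""
    return n_pubs / (drt_years + remaining_clinical_years + 1)

# Hypothetical resident: 20 papers, 2-year DRT, 3 clinical years remaining
rate = pubs_per_year(20, 2, 3)
```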
Subjects
Biomedical Research/education, Clinical Competence, Education, Medical, Graduate/organization & administration, General Surgery/education, Internship and Residency/organization & administration, Publishing/statistics & numerical data, Adult, Cohort Studies, Efficiency, Female, Humans, Male, Retrospective Studies, Task Performance and Analysis, United States
ABSTRACT
BACKGROUND: The long-term significance of early and prolonged antibiotic use in critically ill patients is yet to be described. Several studies suggest that antimicrobial exposure may have as yet unrecognized long-term effects on clinically meaningful outcomes. Our group previously conducted a quasi-experimental, before-and-after observational cohort study of hemodynamically stable surgical patients suspected of having an intensive care unit-acquired infection. This study demonstrated that aggressive initiation of antimicrobial therapy was associated with increased 30-day mortality. In a follow-up survival analysis, we hypothesized that aggressive antimicrobial treatment would not be a significant predictor of long-term death. METHODS: Survival data for the 201 patients included in the initial study were obtained from our clinical data repository. Univariable analysis, Kaplan-Meier, and Cox proportional hazards models were used. Survival was evaluated at one and four years. Age, gender, Acute Physiology and Chronic Health Evaluation (APACHE) II score, and co-morbidities were chosen a priori for potential inclusion in the model. Variables that met the model assumptions after testing were included in the final model. RESULTS: Follow-up data were available for 190 patients (95 in each group), representing 94.5% of the initial cohort. Twenty-four (25.3%) patients in the aggressive group had initial APACHE II scores of less than 15 compared with 13 (13.7%) patients in the conservative group (p = 0.04). There was a trend toward higher mortality in the aggressive group at four years (41.1% vs. 30.5%; p = 0.13). Kaplan-Meier analysis demonstrated a difference in survival at one year but not at four years. The Cox proportional hazards model showed higher long-term mortality for patients in the aggressive antimicrobial group at both one and four years (hazard ratio: 2.26 and 1.70, respectively). 
CONCLUSION: Aggressive initiation of antimicrobial therapy is independently associated with decreased long-term survival after critical illness. While further work is needed to confirm these findings, waiting for evidence of infection before initiation of antibiotic agents may be beneficial.
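The Kaplan-Meier analysis referenced above estimates survival as a product over event times; a compact pure-Python sketch (toy follow-up data, with event = 0 meaning censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: at each distinct event time t,
    multiply survival by (1 - deaths/at-risk). Censored subjects
    (event == 0) contribute to the risk set but trigger no step."""
    event_times = sorted(set(t for t, e in zip(times, events) if e == 1))
    s = 1.0
    curve = []
    for t in event_times:
        n_at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        s *= 1 - d / n_at_risk
        curve.append((t, s))
    return curve

# Toy data: deaths at times 1, 2, 3; censoring at 2 and 4
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```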
Subjects
Anti-Infective Agents/therapeutic use, Critical Illness/mortality, Critical Illness/therapy, Cross Infection/drug therapy, Cross Infection/mortality, APACHE, Aged, Aged, 80 and over, Female, Humans, Intensive Care Units, Kaplan-Meier Estimate, Male, Middle Aged
ABSTRACT
BACKGROUND: Surgical care is delivered 24 h a day at most institutions. Alarmingly, some authors have found that certain operative start times are associated with greater morbidity and mortality rates. This effect has been noted in both the public and private sectors. Although some of these differences may be related to process, they may also be caused by the human circadian rhythm and corresponding changes in host defenses. We hypothesized that the time of day of an operation would significantly impact the frequency of certain post-operative outcomes. METHODS: Cases at a single tertiary-care center reported to the American College of Surgeons National Surgical Quality Improvement Program over a 10-year period were identified. Operative start times were divided into six-hour blocks, with 6 am to noon serving as the reference. Standard univariable techniques were applied. Multivariable logistic regression with mixed effects modeling then was used to determine the relation between operative start times and infectious outcomes, controlling for surgeon clustering. Statistical significance was set at p < 0.01. RESULTS: A total of 21,985 cases were identified, of which 2,764 (12.6%) were emergency procedures. Overall, 9.7% (n = 2,142) of patients experienced some post-operative infectious complication. Seventy percent of these infections (n = 1,506) were surgical site infections. On univariable analysis considering all cases, nighttime and evening operations had higher rates of post-operative infections than those performed during the day (9.1% from 6 am to noon; 9.7% from noon to 6 pm; 14.8% from 6 pm to midnight; and 14.4% from midnight to 6 am; p < 0.001). On multivariable analysis, operative start time was not associated with the risk of post-operative infection, even when emergency cases were considered independently. CONCLUSION: Our data suggest that operative start times have no correlation with post-operative infectious complications. 
Further work is required to identify the source of the time-dependent outcome variability observed in previous studies.
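The six-hour start-time blocks described above amount to a simple bucketing of the operative start hour (the label strings here are illustrative, not the study's):

```python
def start_time_block(hour):
    """Map an operative start hour (0-23) to the study's six-hour
    block, with 6 am to noon as the reference group."""
    if not 0 <= hour <= 23:
        raise ValueError("hour must be 0-23")
    if 6 <= hour < 12:
        return "6am-noon (reference)"
    if 12 <= hour < 18:
        return "noon-6pm"
    if 18 <= hour < 24:
        return "6pm-midnight"
    return "midnight-6am"
```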
Subjects
Operative Time, Perioperative Period/statistics & numerical data, Surgical Wound Infection/epidemiology, Aged, Aged, 80 and over, Female, Humans, Male, Middle Aged, Preoperative Period, Retrospective Studies, Surgical Wound Infection/mortality, Time Factors
ABSTRACT
BACKGROUND: Vancomycin and piperacillin-tazobactam are commonly used first-line agents in the empiric management of critically ill patients. Current studies suggest an increased prevalence of acute kidney injury with concomitant use; however, these studies are few and limited by small sample sizes. The purpose of this study was to compare the prevalence of nephrotoxicity after treatment with vancomycin alone and concomitant vancomycin and piperacillin-tazobactam treatment at our institution. HYPOTHESIS: Concomitant vancomycin and piperacillin-tazobactam-treated patients will experience a greater prevalence of nephrotoxicity compared with vancomycin-only treated patients. METHODS: This was a retrospective cohort of patients treated with vancomycin for gram-positive or mixed infections in our facility from 2005 to 2009 who were not receiving hemodialysis at the time of admission. Included patients were stratified by treatment with vancomycin, vancomycin/piperacillin-tazobactam, or vancomycin/an alternative gram-negative rod (GNR) antibiotic. p values for categorical variables were computed using χ² while continuous variables were computed using Kruskal-Wallis. Variables deemed statistically significant (p < 0.05) were included in the multivariable, log-binomial regression model. Relative risks (RR), 95% confidence intervals (CI), and p values were computed using a generalized estimating equation (GEE) approach with robust standard errors (i.e., Huber-White "sandwich variance" estimates) to accommodate a correlated data structure corresponding to multiple episodes of infection per individual. RESULTS: A total of 530 patients with 1,007 episodes of infection were treated with vancomycin (150 patients/302 episodes of infection), vancomycin/piperacillin-tazobactam (213 patients/372 episodes of infection), or vancomycin/GNR alternative (167 patients/333 episodes of infection). 
Patient demographics, comorbidities, sites of infection, and organisms of infection were compared among groups. After adjusting for statistically significant variables, neither vancomycin/piperacillin-tazobactam (RR = 1.1; 95% CI = 0.99-1.2; p = 0.073) nor vancomycin/GNR alternative (RR = 1.1; 95% CI = 0.98-1.2; p = 0.097) was associated with an increased risk of nephrotoxicity compared with vancomycin alone. CONCLUSION: A difference in nephrotoxicity was not observed between vancomycin and vancomycin/piperacillin-tazobactam-treated patients at our institution. Concomitant use as empiric therapy is appropriate, although larger sample sizes are needed to closely analyze this relation among at-risk subsets of this population.
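The adjusted relative risks above come from a GEE log-binomial model; the robust-variance machinery is beyond a short sketch, but the crude relative risk and its Wald confidence interval from a 2x2 table illustrate the quantity being estimated (the counts below are hypothetical, not the study's):

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Crude relative risk with Wald CI: a events among n1 exposed
    vs. c events among n2 unexposed."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical: 30/100 nephrotoxicity events exposed vs. 20/100 unexposed
rr, lo, hi = relative_risk_ci(30, 100, 20, 100)
```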
Subjects
Anti-Bacterial Agents/adverse effects, Bacterial Infections/drug therapy, Drug-Related Side Effects and Adverse Reactions/epidemiology, Penicillanic Acid/analogs & derivatives, Renal Insufficiency/chemically induced, Renal Insufficiency/epidemiology, Vancomycin/adverse effects, Adult, Aged, Aged, 80 and over, Animals, Anti-Bacterial Agents/administration & dosage, Critical Illness, Female, Humans, Male, Middle Aged, Penicillanic Acid/administration & dosage, Penicillanic Acid/adverse effects, Piperacillin/administration & dosage, Piperacillin/adverse effects, Piperacillin, Tazobactam Drug Combination, Retrospective Studies, Risk Assessment, Vancomycin/administration & dosage
ABSTRACT
BACKGROUND: Obesity and commonly associated comorbidities are known risk factors for the development of infections. However, the intensity and duration of antimicrobial treatment are rarely conditioned on body mass index (BMI). In particular, the influence of obesity on failure of antimicrobial treatment for intra-abdominal infection (IAI) remains unknown. We hypothesized that obesity is associated with recurrent infectious complications in patients treated for IAI. METHODS: Five hundred eighteen patients randomized to treatment in the Surgical Infection Society Study to Optimize Peritoneal Infection Therapy (STOP-IT) trial were evaluated. Patients were stratified by obese (BMI ≥30) versus non-obese (BMI <30) status. Descriptive comparisons were performed using Chi-square test, Fisher exact test, or Wilcoxon rank-sum tests as appropriate. Multivariable logistic regression using a priori selected variables was performed to assess the independent association between obesity and treatment failure in patients with IAI. RESULTS: Overall, 198 patients (38.3%) were obese (BMI ≥30) versus 319 (61.7%) who were non-obese. Mean antibiotic days and total hospital days were similar between the groups. Unadjusted outcomes of surgical site infection (9.1% vs. 6.9%, p = 0.36), recurrent intra-abdominal infection (16.2% vs. 13.8%, p = 0.46), death (1.0% vs. 0.9%, p = 1.0), and a composite of all complications (25.3% vs. 19.8%, p = 0.14) were also similar between the groups. After controlling for appropriate demographics, comorbidities, severity of illness, treatment group, and duration of antimicrobial therapy, obesity was not independently associated with treatment failure (c-statistic: 0.64). CONCLUSIONS: Obesity is not associated with antimicrobial treatment failure among patients with IAI. These results suggest that obesity may not independently influence the need for longer duration of antimicrobial therapy in treatment of IAI versus non-obese patients.
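The stratification above uses the standard BMI cutoff of 30 kg/m²; a trivial sketch (function names and the example patient are illustrative):

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def obesity_stratum(weight_kg, height_m):
    """Obese vs. non-obese at the BMI >= 30 cutoff used in this analysis."""
    return "obese" if bmi(weight_kg, height_m) >= 30 else "non-obese"
```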
Subjects
Anti-Infective Agents/therapeutic use, Intraabdominal Infections/drug therapy, Obesity/complications, Adult, Aged, Body Mass Index, Drug Administration Schedule, Humans, Middle Aged, Recurrence, Regression Analysis, Treatment Failure
ABSTRACT
BACKGROUND: Numerous studies have demonstrated microorganism interaction through signaling molecules, some of which are recognized by other bacterial species. This interspecies synergy can prove detrimental to the human host in polymicrobial infections. We hypothesized that polymicrobial intra-abdominal infections (IAI) have worse outcomes than monomicrobial infections. METHODS: Data from the Study to Optimize Peritoneal Infection Therapy (STOP-IT), a prospective, multicenter, randomized controlled trial, were reviewed for all occurrences of IAI having culture results available. Patients in STOP-IT had been randomized to receive four days of antibiotics vs. antibiotics until two days after clinical symptom resolution. Patients with polymicrobial and monomicrobial infections were compared by univariable analysis using the Wilcoxon rank sum, χ², and Fisher exact tests. RESULTS: Culture results were available for 336 of 518 patients (65%). The durations of antibiotic therapy in polymicrobial (n = 225) and monomicrobial IAI (n = 111) were equal (p = 0.78). Univariable analysis demonstrated similar demographics in the two populations. The 37 patients (11%) with inflammatory bowel disease were more likely to have polymicrobial IAI (p = 0.05). Polymicrobial infections were not associated with a higher risk of surgical site infection, recurrent IAI, or death. CONCLUSION: Contrary to our hypothesis, polymicrobial IAI do not have worse outcomes than monomicrobial infections. These results suggest polymicrobial IAI can be treated the same as monomicrobial IAI.
Subjects
Anti-Bacterial Agents/administration & dosage, Bacterial Infections/drug therapy, Coinfection/drug therapy, Intra-Abdominal Infections/drug therapy, Adolescent, Adult, Aged, Aged, 80 and over, Bacterial Infections/microbiology, Coinfection/microbiology, Female, Humans, Intra-Abdominal Infections/microbiology, Male, Middle Aged, Prospective Studies, Recurrence, Surgical Wound Infection/epidemiology, Survival Analysis, Treatment Outcome, Young Adult
ABSTRACT
We have reviewed the literature regarding recent advances in the management of intra-abdominal sepsis, with a focus on antimicrobial agents, duration of therapy, and source control. Several important developments in these areas are discussed in this review. The introduction of a new antimicrobial agent, ceftolozane/tazobactam, marks the first novel agent for treating intra-abdominal infections in a number of years; its indications for use and supporting evidence are reviewed here. In addition, we discuss recent evidence that clarifies the importance of early source control for intra-abdominal infection and new data suggesting that an abbreviated course of antimicrobial therapy for intra-abdominal infection is as effective as prolonged therapy.
ABSTRACT
BACKGROUND: Many centers advocate aggressive lower extremity deep venous thrombosis (DVT) screening using ultrasound (LUS) for patients meeting high-risk criteria. We hypothesized that a high-risk screening protocol is impractical and costly to implement. METHODS: The University of Virginia's trauma database was queried to identify 6,656 patients admitted between 2009 and 2013. Patient characteristics and outcomes were recorded. Multivariate analyses were performed on patients who underwent LUS to assess the association between patient characteristics and the development of DVT. A predictive model for DVT was applied to the entire population to determine performance and resources required for implementation. RESULTS: Overall, 2,350 (35.3%) of admitted patients underwent LUS. A total of 146 patients (6.2%) developed DVT. Patients who underwent LUS were significantly older (54.5 years vs. 50.4 years, p < 0.0001), had higher Injury Severity Scores (ISSs) (13.5 vs. 8.6, p < 0.0001), and had longer admissions to the intensive care unit (5.6 days vs. 0.9 days, p < 0.0001). Backward selection multivariable logistic regression identified intensive care unit length of stay, transfusion of blood products, spinal cord injury, and pelvic fracture to be associated with DVT (c statistic, 0.76). The model was applied to the entire population to evaluate probability of DVT (c statistic, 0.87). Predictive performance and costs were determined using a cost per LUS of $228. The most sensitive threshold for screening would detect 53% of DVTs, require screening of 26% of all trauma patients, and cost nearly $600,000 to implement during the study period. CONCLUSION: Although a predictive model identified high-risk criteria for the development of DVT at our institution, the model demonstrated poor sensitivity and positive predictive value. 
These results suggest that implementing a high-risk screening protocol would require a costly and burdensome commitment of resources and that such protocols may not be practical or cost-effective for trauma patients. LEVEL OF EVIDENCE: Therapeutic/care management study, level IV.
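The c-statistics reported above measure concordance between predicted risk and observed DVT: the fraction of case-control pairs in which the case receives the higher predicted risk. A minimal sketch of that computation (illustrative only; the study presumably used standard statistical software):

```python
def c_statistic(labels, scores):
    """Concordance statistic (c-statistic / AUC): the proportion of
    case-control pairs in which the case has the higher predicted
    risk score; tied scores count as half-concordant."""
    cases = [s for y, s in zip(labels, scores) if y == 1]
    controls = [s for y, s in zip(labels, scores) if y == 0]
    concordant = 0.0
    for case_score in cases:
        for control_score in controls:
            if case_score > control_score:
                concordant += 1.0
            elif case_score == control_score:
                concordant += 0.5
    return concordant / (len(cases) * len(controls))
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination; the 0.76 and 0.87 reported above fall in the moderate-to-good range, which is compatible with poor sensitivity and positive predictive value at any single screening threshold when the outcome is rare.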
Subjects
Mass Screening/methods, Venous Thrombosis/diagnostic imaging, Female, Humans, Injury Severity Score, Male, Middle Aged, Predictive Value of Tests, Risk Assessment, Risk Factors, Sensitivity and Specificity, Ultrasonography, Venous Thrombosis/etiology, Wounds and Injuries/complications
ABSTRACT
BACKGROUND: Protein deficiency (PD) is a known risk factor for surgical complications; however, the risks of PD by weight class have not been well described. It was hypothesized that the combination of obesity and PD is associated with increased surgical complications compared with normal-weight, normoalbuminemic patients. METHODS: A total of 85,833 general surgery patients undergoing elective operations within the 2011 National Surgical Quality Improvement Program were analyzed. Patients with conditions that could potentially confound serum albumin (SA) were excluded. Patients were stratified by normal (>3.0 g/dL) versus low (<3.0 g/dL) SA. The relative impact of SA and body mass index (BMI), as individual and combined variables, on surgical morbidity and mortality was assessed. Multivariate analyses were performed to identify independent risk factors for morbidity and mortality. RESULTS: Overall, 2,088 patients (2.43%) had low preoperative SA. Of these, 587 (28.1%) were obese (BMI >30), versus 39,299 (46.9%) of patients with normal preoperative SA. Importantly, after controlling for appropriate demographic characteristics, co-morbidities, surgical wound classification, operation type, and complexity, the interaction of hypoalbuminemia and BMI was independently associated with all complications among hypoalbuminemic patients with BMI >40 and with mortality for patients with BMI >30 (c-statistics: 0.803 and 0.874, respectively). CONCLUSION: PD and obesity appear to synergistically increase the risk of surgical complications. Paradoxically, malnutrition may be less easily recognized in obese individuals, and surgeons may need to more carefully evaluate this population before surgery. Future studies should investigate therapy to correct PD specifically among obese patients before surgery.
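The hypoalbuminemia-by-BMI interaction described above corresponds to product terms in a regression design matrix. A hypothetical sketch of that feature construction, using the study's thresholds (SA < 3.0 g/dL defines low albumin; BMI strata above 30 and 40); the function and variable names are illustrative, not the study's code:

```python
def risk_features(albumin_g_dl, bmi):
    """Build indicator and interaction terms for an albumin-by-BMI
    interaction model. Thresholds follow the study: SA < 3.0 g/dL is
    'low'; obesity strata at BMI > 30 and BMI > 40."""
    low_sa = 1 if albumin_g_dl < 3.0 else 0
    obese = 1 if bmi > 30 else 0
    severely_obese = 1 if bmi > 40 else 0
    return {
        "low_sa": low_sa,
        "obese": obese,
        "low_sa_x_obese": low_sa * obese,            # interaction term
        "low_sa_x_severe": low_sa * severely_obese,  # interaction term
    }
```

In a logistic model, a significant coefficient on an interaction term means the effect of low albumin differs by BMI stratum, which is the synergy the abstract reports.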
Subjects
Bariatric Surgery/adverse effects, Elective Surgical Procedures/adverse effects, Hypoalbuminemia/complications, Obesity, Morbid/complications, Postoperative Complications/epidemiology, Aged, Aged, 80 and over, Body Mass Index, Female, Humans, Hypoalbuminemia/blood, Male, Morbidity/trends, Obesity, Morbid/surgery, Quality Improvement, Retrospective Studies, Risk Factors, Serum Albumin/metabolism, Survival Rate/trends, Treatment Outcome, United States/epidemiology
ABSTRACT
BACKGROUND: Broad-spectrum antibiotic therapy is critical in the management of necrotizing soft tissue infections (NSTI) in the emergency setting. Clindamycin often is included empirically to cover monomicrobial gram-positive pathogens but probably is of little value for polymicrobial infections and is associated with significant side effects, including the induction of Clostridium difficile colitis. However, there have been no studies predicting monomicrobial infections prior to obtaining cultures. The purpose of this study was to identify independent predictors of monomicrobial NSTI, for which the use of clindamycin would be most beneficial. We hypothesized that monomicrobial infections are characterized by involvement of the upper extremities and fewer co-morbid diseases. METHODS: We reviewed all cases of potential NSTI occurring between 1996 and 2013 in a single tertiary-care center. The infection was diagnosed by the finding of rapidly progressing necrotic fascia during debridement with positive tissue cultures. Univariable analysis was performed using the Student t-, Wilcoxon rank sum, χ², and Fisher exact tests as appropriate. Multivariable logistic regression was used to identify independent variables associated with outcomes. RESULTS: A total of 151 patients with confirmed NSTI and complete data were analyzed. Of the monomicrobial infections, 61.8% were caused by Group A streptococci, 20.1% by Staphylococcus aureus, and 12.7% by Escherichia coli. Of the polymicrobial infections, E. coli was involved 13.7% of the time, followed by Candida spp. at 12.9% and Bacteroides fragilis at 11.3%. On univariable analysis, immunosuppression, upper extremity infection, and elevated serum sodium concentration were associated with monomicrobial infection, whereas morbid obesity and a perineal infection site were associated with polymicrobial infection.
On multivariable analysis, the strongest predictor of monomicrobial infection was immunosuppression (odds ratio [OR] 7.0; 95% confidence interval [CI] 2.2-22.3) followed by initial serum sodium concentration (OR 1.1; 95% CI 1.0-1.2). Morbid obesity (OR 0.1; 95% CI 0.0-0.5) and perineal infection (OR 0.3; 95% CI 0.1-0.8) were independently associated with polymicrobial infection. CONCLUSION: We identified independent risk factors that may be helpful in differentiating monomicrobial from polymicrobial NSTI. We suggest empiric clindamycin coverage be limited to patients who are immunosuppressed, have an elevated serum sodium concentration, or have upper extremity involvement and be avoided in obese patients or those with perineal disease.
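The reported odds ratios can be applied on the log-odds scale to compare patients. A sketch assuming the ORs above (the model intercept is not reported, so only relative odds, not absolute probabilities, are recoverable; the sodium term is interpreted here as per mmol/L above a reference value):

```python
import math

def log_odds_shift(immunosuppressed, serum_na_delta_mmol):
    """Shift in log-odds of MONOmicrobial infection implied by the
    reported odds ratios: OR 7.0 for immunosuppression and OR 1.1 per
    mmol/L of serum sodium above a reference. Without the intercept,
    only relative odds between patients can be computed."""
    shift = 0.0
    if immunosuppressed:
        shift += math.log(7.0)
    shift += serum_na_delta_mmol * math.log(1.1)
    return shift

# exp(shift) is the odds multiplier relative to a reference patient;
# an immunosuppressed patient at the reference sodium has 7.0x the odds.
relative_odds = math.exp(log_odds_shift(True, 0))
```

Exponentiating the difference in shifts between two patients gives their odds ratio, which is how such a score could be used to triage empiric clindamycin coverage.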
Subjects
Bacterial Infections/diagnosis, Decision Support Techniques, Soft Tissue Infections/diagnosis, Bacteria/classification, Bacteria/isolation & purification, Bacterial Infections/microbiology, Coinfection/diagnosis, Coinfection/microbiology, Female, Fungi/classification, Fungi/isolation & purification, Humans, Male, Middle Aged, Mycoses/diagnosis, Mycoses/microbiology, Soft Tissue Infections/microbiology, Tertiary Care Centers
ABSTRACT
BACKGROUND: Disparate lower-extremity ultrasonography (LUS) screening practices among trauma institutions reflect a lack of consensus regarding screening indications and whether screening improves outcomes. We hypothesized that LUS screening for deep-vein thrombosis (DVT) is not associated with a reduced incidence of pulmonary embolism (PE). METHODS: The 2012 ACS National Trauma Data Bank Research Data Set was queried to identify 442,108 patients treated at institutions reporting at least one LUS and at least one DVT. Institutions performing LUS on more than 2% of admitted patients were designated high-screening (HS) facilities, and the remaining institutions were designated low-screening (LS) facilities. Patient characteristics and risk factors were used to develop a logistic regression model to assess the independent associations between LUS and DVT and between LUS and PE. RESULTS: Overall, DVT and PE were reported in 0.94% and 0.37% of the study population, respectively. DVT and PE were reported more commonly in HS than LS facilities (DVT: 1.12% vs 0.72%, P < .0001; PE: 0.40% vs 0.33%, P = .0004). Multivariable logistic regression demonstrated that LUS was associated independently with DVT (odds ratio 1.43, confidence interval 1.34-1.53) but not PE (odds ratio 1.01, confidence interval 0.92-1.12) (c-statistic 0.86 and 0.85, respectively). Sensitivity analyses performed at various threshold rates for designating HS facilities did not alter the significance of these relationships. CONCLUSION: LUS in trauma patients is not associated with a change in the incidence of PE. Aggressive LUS DVT screening protocols appear to detect many clinically insignificant DVTs for which subsequent therapeutic intervention may be unnecessary, and the use of these protocols should be questioned.
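The high-screening versus low-screening designation above is a simple rate threshold over each institution's admissions. A sketch of that classification rule (function and data names are hypothetical, not from the study's code):

```python
def classify_facilities(screen_counts, admit_counts, threshold=0.02):
    """Label each facility 'HS' (high-screening) if it performed LUS on
    more than `threshold` (default 2%) of admitted patients, as in the
    study's facility designation; otherwise 'LS' (low-screening).
    Facilities with no recorded screens default to a count of 0."""
    return {
        facility: "HS" if screen_counts.get(facility, 0) / admits > threshold
        else "LS"
        for facility, admits in admit_counts.items()
    }
```

The sensitivity analyses described above amount to re-running this classification with different `threshold` values and checking that the regression results are unchanged.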