ABSTRACT
OBJECTIVE: This study examined the predictive value of flavin mononucleotide (FMN) levels in the flush solution used during cold storage of donor livers for outcomes after transplantation. BACKGROUND: Static cold storage of liver grafts induces hypoxia, with subsequently impaired mitochondrial function and FMN release upon reperfusion. METHODS: This study enrolled 62 recipients who received whole liver grafts from donation after brain death (n=50) and donation after circulatory death (n=12) donors between June 2022 and July 2023. FMN concentrations were measured in flush solutions on the back-table. ROC-curve analysis identified an FMN cut-off for graft survival. Post-transplant outcomes were examined according to FMN levels. RESULTS: FMN level was significantly associated with graft survival, with an area under the curve (AUC) of 0.858 (95% CI: 0.754-0.963, P<0.001), outperforming the donor risk index (AUC 0.571, 95% CI: 0.227-0.915, P=0.686). The cohort was divided into low-FMN (<37.5 ng/mL, n=40) and high-FMN (≥37.5 ng/mL, n=22) groups. The low-FMN group had superior one-year graft survival compared with the high-FMN group (100% vs. 77%, P=0.003). Transaminase levels within 7 days post-transplant were significantly higher in the high-FMN group (P=0.003). The high-FMN group developed acute rejection (41% vs. 15%, P=0.023) and early allograft dysfunction (50% vs. 20%, P=0.014) more frequently. The median comprehensive complication index was significantly higher in the high-FMN group (54 [interquartile range, 40-78] vs. 42 [interquartile range, 28-52], P=0.017). CONCLUSION: The FMN level measured in the cold-storage flush solution of donor livers is a valid biomarker for predicting post-transplant outcomes. Liver grafts with high FMN levels may benefit from machine perfusion to improve outcomes.
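As an illustration of the ROC step above: a minimal Python sketch of deriving an FMN cut-off from measured concentrations and graft-loss labels. The abstract does not state which cut-off criterion was used, so Youden's J is assumed here; the data and variable names are synthetic placeholders, not the study's values.

```python
# Sketch: deriving an FMN cutoff for graft survival via ROC analysis.
# Synthetic illustrative data; the study's actual values are not reproduced.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# 1 = graft loss within one year, 0 = graft survival (hypothetical labels)
graft_loss = rng.integers(0, 2, size=62)
fmn = rng.normal(30, 10, size=62) + 15 * graft_loss   # ng/mL, synthetic

auc = roc_auc_score(graft_loss, fmn)
fpr, tpr, thresholds = roc_curve(graft_loss, fmn)
youden_j = tpr - fpr                       # Youden's J statistic (assumed)
best_cutoff = thresholds[np.argmax(youden_j)]
print(f"AUC = {auc:.3f}, optimal FMN cutoff ≈ {best_cutoff:.1f} ng/mL")
```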
ABSTRACT
OBJECTIVE: To assess cost and complication outcomes after liver transplantation (LT) using normothermic machine perfusion (NMP). BACKGROUND: End-ischemic NMP is often used to aid logistics, yet its impact on outcomes after LT remains unclear, as does its true impact on transplantation-associated costs. METHODS: Deceased-donor liver recipients at 2 centers (January 1, 2019, to June 30, 2023) were included. Retransplants, splits, and combined grafts were excluded. End-ischemic NMP (OrganOx-Metra) was implemented in October 2022 for extended-criteria donation after brain death donors (DBDs), all donation after circulatory death donors (DCDs), and logistics. NMP cases were matched 1:2 with static cold storage (SCS) controls using the Balance-of-Risk score (DBD grafts) and the UK-DCD Score (DCD grafts). RESULTS: Overall, 803 transplantations were included, of which 174 (21.7%) received NMP. Matching paired 118 NMP-DBD grafts with 236 SCS controls and 37 NMP-DCD grafts with 74 SCS controls. For both graft types, median inpatient comprehensive complication index values were comparable between groups. DCD-NMP grafts experienced a reduced cumulative 90-day comprehensive complication index (27.6 vs. 41.9, P=0.028). NMP also reduced the need for early relaparotomy and renal replacement therapy, with subsequently less frequent major complications (Clavien-Dindo ≥IVa). This effect was more pronounced in DCD transplants. NMP had no protective effect on early biliary complications. Organ acquisition/preservation costs were higher with NMP, yet NMP-treated grafts had lower 90-day pretransplant costs in the context of shorter waiting-list times. Overall costs were comparable for both cohorts. CONCLUSIONS: This is the first risk-adjusted outcome and cost analysis comparing NMP and SCS. In addition to logistical benefits, NMP was associated with a reduction in relaparotomy and bleeding in DBD grafts, and in overall complications and post-LT renal replacement therapy for DCDs. While organ acquisition/preservation was more costly with NMP, overall 90-day healthcare costs per transplantation were comparable.
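The 1:2 risk-score matching can be sketched as a greedy nearest-neighbor pairing on the Balance-of-Risk (or UK-DCD) score. The abstract does not describe the matching algorithm or any caliper, so the following is an assumed minimal implementation with made-up scores.

```python
# Sketch of 1:2 nearest-neighbor matching on a single risk score
# (e.g., BAR points for DBD grafts), without replacement. Hypothetical data.
# Greedy and order-dependent; real implementations often add a caliper.
import numpy as np

def match_1_to_2(case_scores, control_scores):
    """Greedily pair each case with its 2 closest unused controls."""
    available = dict(enumerate(control_scores))
    pairs = {}
    for i, s in enumerate(case_scores):
        # two controls with the smallest absolute score difference
        nearest = sorted(available, key=lambda j: abs(available[j] - s))[:2]
        pairs[i] = nearest
        for j in nearest:
            del available[j]           # matching without replacement
    return pairs

nmp_bar = np.array([9.0, 14.0, 7.5])                          # NMP cases
scs_bar = np.array([8.0, 9.5, 13.0, 15.0, 7.0, 7.6, 20.0])    # SCS pool
print(match_1_to_2(nmp_bar, scs_bar))   # {0: [1, 0], 1: [2, 3], 2: [5, 4]}
```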
Subjects
Liver Transplantation , Organ Preservation , Perfusion , Postoperative Complications , Humans , Male , Female , Liver Transplantation/economics , Middle Aged , Perfusion/methods , Organ Preservation/methods , Organ Preservation/economics , Postoperative Complications/economics , Postoperative Complications/epidemiology , Retrospective Studies , Adult , Aged , Graft Survival
ABSTRACT
We describe a novel pre-liver transplant (LT) approach for colorectal liver metastasis, allowing improved monitoring of tumor biology and reduction of disease burden before committing a patient to transplantation. Patients undergoing LT for colorectal liver metastasis at Cleveland Clinic were included. The described protocol involves intensive locoregional therapy with systemic chemotherapy, aiming to reach minimal disease burden as assessed by positron emission tomography scan and carcinoembryonic antigen. Patients with no detectable disease or irreversible treatment-induced liver injury undergo transplant. Nine of 27 evaluated patients (33.3%) received a liver transplant. The median follow-up was 700 days. Seven patients (77.8%) received a living-donor LT. Five had no detectable disease, and 4 had treatment-induced cirrhosis. Pretransplant management included chemotherapy (n = 9) ± bevacizumab (n = 6) and/or anti-EGFR (n = 6). The median number of pre-LT chemotherapy cycles was 16 (range 10-40). Liver-directed therapy included Yttrium-90 (n = 5), ablation (n = 4), resection (n = 4), and hepatic artery infusion pump (n = 3). Three patients recurred after LT. Actuarial 1- and 2-year recurrence-free survival (RFS) was 75% (n = 6/8) and 60% (n = 3/5). Recurrence occurred in the lungs (n = 1), liver graft (n = 1), and lungs plus para-aortic nodes (n = 1). Patients with pre-LT detectable disease had reduced RFS (p = 0.04). All patients with recurrence had histologically viable tumors in the liver explant. Patients treated in our protocol (n = 16) demonstrated improved survival versus those who were not candidates (n = 11), regardless of transplant status (p = 0.01). A protocol defined by aggressive pretransplant liver-directed treatment and transplant for patients with undetectable disease or treatment-induced liver injury may help prevent tumor recurrence.
ABSTRACT
Ex situ normothermic machine perfusion (NMP) helps increase the use of extended-criteria donor livers. However, the impact of an NMP program on waitlist times and mortality has not been evaluated. Adult patients listed for liver transplant (LT) at 2 academic centers from January 1, 2015, to September 1, 2023, were included (n=2773) to allow all patients ≥6 months of follow-up from listing. Routine NMP was implemented on October 14, 2022. Waitlist outcomes were compared across three eras: pre-NMP, pre-acuity circles (n=1460); pre-NMP with acuity circles (n=842); and with NMP (n=381). Median waitlist time was 79 days (IQR: 20-232 d) at baseline, 49 days (7-182) with acuity circles, and 14 days (5-56) with NMP (p<0.001). The transplant rate improved from 61 per 100 person-years at baseline to 99 per 100 person-years with acuity circles and 194 per 100 person-years with NMP (p<0.001). Crude mortality without transplant decreased from 18.3% (n=268/1460) to 13.3% (n=112/843) and to 6.3% (n=24/381) with NMP (p<0.001). The incidence of mortality without LT was 15 per 100 person-years before acuity circles, 19 per 100 person-years with acuity circles, and 9 per 100 person-years after NMP (p<0.001). Median Model for End-Stage Liver Disease (MELD) score at LT was lowest with NMP, but MELD at listing was highest in this era (p<0.0001). The median donor risk index of transplanted livers was 1.54 (1.27-1.82) at baseline, 1.66 (1.42-2.16) with acuity circles, and 2.06 (1.63-2.46) with NMP (p<0.001). Six-month post-LT survival did not differ between eras (p=0.322). The total cost of health care while waitlisted was lowest in the NMP era ($53,683 vs. $32,687 vs. $23,688, p<0.001); cost per day did not differ between eras (p=0.152). The implementation of a routine NMP program was associated with reduced waitlist time and mortality, without compromising short-term survival after liver transplant, despite increased use of riskier grafts. Routine NMP use enables better waitlist management with reduced health care costs.
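The per-100-person-years rates reported above follow from event counts divided by cumulative waitlist follow-up time; a trivial sketch (the person-years figure below is back-calculated purely for illustration, not a value from the study):

```python
# Sketch: incidence rate per 100 person-years from an event count and
# cumulative waitlist follow-up time. Inputs are illustrative.
def rate_per_100py(events: int, person_years: float) -> float:
    return 100.0 * events / person_years

# e.g., 268 deaths over ~1790 person-years of waitlist time (hypothetical)
print(f"{rate_per_100py(268, 1790):.0f} deaths per 100 person-years")  # ~15
```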
ABSTRACT
BACKGROUND: Open offers (OOs) in liver transplantation (LT) result from bypassing the traditional allocation system. Little is known about trends in OOs or differences in donor/recipient characteristics compared with traditionally placed organs. We aimed to quantify modern practices regarding OOs and to understand the impact of normothermic machine perfusion (NMP), focusing on social determinants of health, cost, and graft-associated risk. METHODS: LTs from 1/1/2018 to 12/31/2023 at a single center were included. NMP was implemented on 10/1/2022. The Centers for Disease Control and Prevention (CDC)-validated social vulnerability index (SVI) and the donor risk index (DRI) were calculated. Comprehensive complication index (CCI), Clavien-Dindo grades, patient and graft survival, and transplantation costs were included. RESULTS: 1162 LTs were performed, of which 193 (16.8%) resulted from OOs. OOs were more common in the post-NMP era (26.5% vs. 13.3%, p < 0.001). Pre-NMP, patients receiving OOs had longer waitlist times (118 vs. 69 days, p < 0.001), lower MELD scores (17 vs. 25 points, p < 0.001), and riskier grafts (DRI = 1.8 vs. 1.6, p = 0.004) compared with standard offers. Post-NMP, recipients receiving OOs demonstrated no difference in waitlist time (27 vs. 20 days, p = 0.21) or graft risk (DRI = 2.03 vs. 2.23, p = 0.17). OO recipient MELD remained lower (16 vs. 22, p < 0.001). OO recipients were more socially vulnerable pre-NMP (SVI 0.41 vs. 0.36, p = 0.004) but less vulnerable post-NMP (0.23 vs. 0.36, p = 0.019). Despite increased graft risk, pre-NMP OO-LTs were less expensive in the 90-day global period ($154,939 vs. $178,970, p = 0.002) and in the 180 days pre-/post-LT ($208,807 vs. $228,091, p = 0.021). Cost trends remained similar with NMP. CONCLUSION: OOs are increasingly utilized and may be appealing given the demonstrated cost reductions, even with NMP. Although most OO-related metrics in our center remain similar before and after machine perfusion, programs should take caution that increasing use does not worsen organ access for socially vulnerable populations.
Subjects
Graft Survival , Liver Transplantation , Tissue and Organ Procurement , Waiting Lists , Humans , Liver Transplantation/economics , Female , Male , Middle Aged , Follow-Up Studies , Tissue and Organ Procurement/economics , Prognosis , Perfusion , Tissue Donors/supply & distribution , Risk Factors , Retrospective Studies , Survival Rate , Adult , End Stage Liver Disease/surgery , Social Determinants of Health , Postoperative Complications/epidemiology
ABSTRACT
OBJECTIVE: We aim to report our institutional outcomes of single-stage combined liver transplantation (LT) and cardiac surgery (CS). SUMMARY BACKGROUND DATA: Concurrent LT and CS is a potential treatment for combined cardiac dysfunction and end-stage liver disease, yet only 54 cases have been reported in the literature. The outcomes of this approach are therefore relatively unknown, and it has been regarded as extremely risky. METHODS: Thirty-one patients at our institution underwent combined CS and LT; those with at least one year of follow-up were included. The leave-one-out cross-validation (LOOCV) machine-learning approach was used to generate a model for mortality. RESULTS: Median follow-up was 8.2 years (IQR 4.6-13.6 y). One- and five-year survival was 74.2% (N=23) and 55% (N=17), respectively. Negative predictors of survival included recipient age >60 years (P=0.036), NASH cirrhosis (P=0.031), coronary artery bypass graft (CABG)-based CS (P=0.046), and preoperative renal dysfunction (P=0.024). The final model demonstrated that renal dysfunction had a relative weighted impact of 3.2, versus CABG (1.7), age ≥60 y (1.7), and NASH (1.3). An elevated LT+CS risk score was associated with increased five-year mortality after surgery (AUC=0.731, P<0.001). Conversely, the widely accepted STS-PROM calculator was unable to stratify patients according to 1-year (P>0.99) or 5-year (P=0.695) survival. CONCLUSIONS: This is the largest series describing combined LT+CS, and joint surgical management appears feasible in highly selected patients. CABG and preoperative renal dysfunction are important negative predictors of mortality. The four-variable LT+CS score may help identify patients at high risk of postoperative mortality.
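A minimal sketch of the LOOCV approach named above, assuming a logistic model over the four reported predictors; scikit-learn is used and all data are synthetic stand-ins:

```python
# Sketch of leave-one-out cross-validation (LOOCV) for a 4-variable
# mortality model. Feature names follow the abstract; data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 31
# columns: renal dysfunction, CABG-based CS, age >=60 y, NASH (0/1 each)
X = rng.integers(0, 2, size=(n, 4)).astype(float)
y = rng.integers(0, 2, size=n)      # 1 = death within 5 years (synthetic)

# each patient is predicted by a model trained on the other n-1 patients
probs = cross_val_predict(LogisticRegression(), X, y,
                          cv=LeaveOneOut(), method="predict_proba")[:, 1]
print(f"LOOCV AUC: {roc_auc_score(y, probs):.3f}")
```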
ABSTRACT
BACKGROUND: Sepsis remains the leading cause of death in the surgical intensive care unit. Prior studies have demonstrated a survival benefit of remote ischemic conditioning (RIC) in many disease states. The aim of this study was to determine the effects of RIC on survival in sepsis in an animal model and to assess alterations in inflammatory biochemical profiles. We hypothesized that RIC alters inflammatory biochemical profiles, resulting in decreased mortality in a septic mouse model. MATERIALS AND METHODS: Eight- to 12-week-old C57BL/6 mice received an intraperitoneal injection of 12.5 mg/kg lipopolysaccharide (LPS). Septic animals in the experimental group underwent RIC at 0, 2, or 6 h after LPS by surgical exploration and alternating clamping of the femoral artery. Six 4-min cycles of ischemia-reperfusion were performed. The primary outcome was survival at 5 days after LPS injection. The secondary outcome was serum levels of the following cytokines: interferon-γ (IFN-γ), interleukin (IL)-10, IL-1β, and tumor necrosis factor-α (TNF-α), measured at baseline before LPS injection, immediately after LPS injection, and at 2, 4, and 24 h after induction of sepsis (RIC was performed at 2 h after LPS injection). Kaplan-Meier survival analysis and the log-rank test were used. ANOVA was used to compare cytokine measurements. RESULTS: We performed experiments on 44 mice: 14 sham and 30 RIC mice (10 at each time point). Overall survival was higher in the experimental group than in the sham group (57% versus 21%; P = 0.02), with the highest survival rate observed in the 2-h post-RIC group (70%). On Kaplan-Meier analysis, the 2-h post-RIC group had increased survival at 5 days after LPS (P = 0.04), with a hazard ratio of 0.3 (95% confidence interval = 0.09-0.98). In the RIC group, serum concentrations of IFN-γ, IL-10, IL-1β, and TNF-α peaked at 2 h after LPS and then decreased significantly over 24 h (P < 0.0001) compared with baseline. CONCLUSIONS: RIC improves survival in sepsis and has potential for implementation in clinical practice. Early implementation of RIC may play an immune-modulatory role in sepsis. Further studies are necessary to refine understanding of the observed survival benefit and its implications for sepsis management.
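The survival comparison can be reproduced in outline with the `lifelines` package; the sketch below assumes a 5-day observation window with censoring at day 5 and uses synthetic times-to-death, not the experimental data:

```python
# Sketch of the Kaplan-Meier + log-rank survival comparison.
# Synthetic data; requires the `lifelines` package.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
# time-to-death in days, capped at the 5-day observation window
t_ric = np.minimum(rng.exponential(8, 10), 5)    # RIC at 2 h post-LPS
t_sham = np.minimum(rng.exponential(3, 14), 5)   # sham group
e_ric = t_ric < 5      # event observed (death) vs. censored at day 5
e_sham = t_sham < 5

kmf = KaplanMeierFitter()
kmf.fit(t_ric, event_observed=e_ric, label="RIC 2h")
print(f"estimated day-5 survival, RIC 2h: {kmf.predict(5):.2f}")

result = logrank_test(t_ric, t_sham, event_observed_A=e_ric,
                      event_observed_B=e_sham)
print(f"log-rank p = {result.p_value:.3f}")
```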
Subjects
Ischemia , Lower Extremity/blood supply , Reperfusion/methods , Sepsis/therapy , Animals , Biomarkers/metabolism , Femoral Artery , Kaplan-Meier Estimate , Male , Mice , Mice, Inbred C57BL , Random Allocation , Sepsis/immunology , Sepsis/mortality , Treatment Outcome
ABSTRACT
OBJECTIVE: The aim of this study was to assess seasonal variation in emergency general surgery (EGS) admissions. BACKGROUND: Seasonal variation in medical conditions is well established; however, its impact on EGS cases remains unclear. METHODS: The National Inpatient Sample (NIS) database was queried over an 8-year period (2004-2011) for all patients with a diagnosis of acute appendicitis, acute cholecystitis, or diverticulitis. Elective admissions were excluded. The following data were recorded for each admission: age, sex, race, admission month, major operative procedure, hospital region, and mortality. Seasons were defined as follows: Spring (March, April, May), Summer (June, July, August), Fall (September, October, November), and Winter (December, January, February). The X-11 procedure and spectral analysis were performed to confirm seasonal variation. RESULTS: A total of 63,911,033 admission records were evaluated, of which 493,569 were appendicitis, 395,838 were cholecystitis, and 412,163 were diverticulitis. Seasonal variation was confirmed in EGS admissions (F = 159.12, P < 0.0001). In the subanalysis, seasonal variation was found in acute appendicitis (F = 119.62, P < 0.0001), acute cholecystitis (F = 37.13, P < 0.0001), and diverticulitis (F = 69.90, P < 0.0001). Average monthly EGS admissions in Winter were 11,322 ± 674. Average monthly EGS admissions in Summer were 13.6% higher than in Winter (n = 1542; 95% CI: 1180-1904, P < 0.001). CONCLUSIONS: Hospitalization for EGS follows a consistent cyclical pattern, with more admissions occurring during the Summer months. Although the reasons for this variability are unknown, this information may be useful for hospital resource reallocation and staffing.
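The X-11 procedure itself requires the Census Bureau's X-13ARIMA-SEATS binary, so the sketch below substitutes a plain periodogram (one form of spectral analysis) to check for an annual cycle in monthly counts; the series is synthetic, seeded with the reported Winter mean and Summer excess purely for illustration:

```python
# Sketch: spectral check for a 12-month cycle in monthly admission counts.
# A plain periodogram stands in for the study's X-11/spectral analysis.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(3)
months = np.arange(96)                           # 8 years of monthly counts
admissions = (11322 + 1542 * np.sin(2 * np.pi * months / 12)
              + rng.normal(0, 300, 96))          # summer peak, winter trough

# fs=12 samples/year, so frequencies come out in cycles per year
freqs, power = periodogram(admissions - admissions.mean(), fs=12)
dominant = freqs[np.argmax(power)]
print(f"dominant frequency: {dominant:.2f} cycles/year")   # ~1.0 = annual
```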
Subjects
Appendicitis/surgery , Cholecystitis/surgery , Diverticulitis/surgery , Emergency Treatment/statistics & numerical data , Patient Admission/statistics & numerical data , Seasons , Surgical Procedures, Operative/statistics & numerical data , Acute Disease , Adult , Appendicitis/epidemiology , Cholecystitis/epidemiology , Diverticulitis/epidemiology , Female , Humans , Male , Middle Aged
ABSTRACT
BACKGROUND: Multiple prior studies have suggested an association between beta-blocker administration and survival in patients with severe traumatic brain injury (TBI). However, it is unknown whether this benefit of beta-blockers depends on heart rate control. The aim of this study was to assess whether rate control affects survival in patients with severe TBI receiving metoprolol. Our hypothesis was that improved survival from beta-blockade would be associated with a reduction in heart rate. METHODS: We performed a 7-year retrospective analysis of all blunt TBI patients at a level 1 trauma center. Patients aged >16 years with a head Abbreviated Injury Scale score of 4 or 5 who were admitted to the intensive care unit (ICU) from the operating room or emergency room (ER) were included. Patients were stratified into two groups: metoprolol and no beta-blockers. Using propensity score matching, patients in the two groups were matched in a 1:1 ratio, controlling for age, gender, race, admission vital signs, Glasgow coma scale, injury severity score, mean heart rate during ICU admission, and standard deviation of heart rate during ICU admission. Our primary outcome measure was mortality. RESULTS: A total of 914 patients met our inclusion criteria, of whom 189 received beta-blockers. A propensity-matched cohort of 356 patients (178 metoprolol and 178 no beta-blockers) was created. Patients receiving metoprolol had higher survival than those who did not receive beta-blockers (78% versus 68%; P = 0.04); however, there was no difference in mean heart rate (89.9 ± 13.9 versus 89.9 ± 15; P = 0.99), nor in the mean standard deviation of heart rate (14.7 ± 6.3 versus 14.4 ± 6.5; P = 0.65) between the two groups. On Kaplan-Meier survival analysis, patients who received metoprolol had a survival advantage (P = 0.011) compared with patients who did not receive any beta-blockers. CONCLUSIONS: Our study shows an association with improved survival in patients with severe TBI receiving metoprolol, and this effect appears to be independent of any reduction in heart rate. We suggest that beta-blockers should be administered to all severe TBI patients regardless of any perceived beta-blockade effect on heart rate.
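A compact sketch of the propensity-matching step: fit a treatment model on the listed covariates, then pair each metoprolol patient with the nearest control on the propensity score. Matching details (caliper, with/without replacement) are not given in the abstract, so the version below matches with replacement for brevity; all data are synthetic:

```python
# Sketch of 1:1 propensity-score matching (metoprolol vs. no beta-blocker).
# Covariates and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 914
X = rng.normal(size=(n, 5))            # e.g., age, GCS, ISS, mean HR, HR SD
treated = rng.random(n) < 0.2          # received metoprolol

# propensity = P(treatment | covariates)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# nearest-neighbor match each treated patient to one control on the
# propensity score (with replacement here, for brevity)
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
controls = np.flatnonzero(~treated)[idx.ravel()]
print(f"{treated.sum()} treated matched to "
      f"{len(set(controls))} unique controls")
```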
Subjects
Adrenergic beta-Antagonists/pharmacology , Brain Injuries/drug therapy , Heart Rate/drug effects , Metoprolol/pharmacology , Adolescent , Adrenergic beta-Antagonists/therapeutic use , Adult , Aged , Aged, 80 and over , Brain Injuries/mortality , Female , Glasgow Coma Scale , Humans , Injury Severity Score , Male , Metoprolol/therapeutic use , Middle Aged , Propensity Score , Retrospective Studies , Treatment Outcome , Young Adult
ABSTRACT
INTRODUCTION: Early seizures after severe traumatic brain injury (TBI) have a reported incidence of up to 15%. A 1-week course of phenytoin is considered the standard of care for early seizure prophylaxis. However, many centers have substituted the anticonvulsant levetiracetam without good data on the efficacy of this approach. Our hypothesis was that treatment with levetiracetam is not effective in preventing early post-traumatic seizures. METHODS: All trauma patients sustaining a TBI from January 2007 to December 2009 at an urban level-one trauma center were retrospectively analyzed. Seizures were identified from a prospectively gathered morbidity database, and anticonvulsant use from the pharmacy database. Statistical comparisons were made by chi-square tests, t-tests, and logistic regression modeling. Patients who received levetiracetam prophylaxis were matched 1:1, using propensity score matching, with those who did not receive the drug. RESULTS: 5551 trauma patients suffered a TBI during the study period, with an overall seizure rate of 0.7% (39/5551). Of the total population, 1795 were diagnosed with severe TBI (head AIS score 3-5). The odds of seizures were 25 times higher in the severe TBI group than in the non-severe group [2.0% (36/1795) vs. 0.08% (3/3756); OR 25.6; 95% CI 7.8-83.2; p < 0.0001]. Of the patients who had seizures after severe TBI, 25% (9/36) received pharmacologic prophylaxis with levetiracetam, phenytoin, or fosphenytoin. In a cohort matched by propensity scores, no difference was seen in seizure rates between the levetiracetam and no-prophylaxis groups (1.9% vs. 3.4%, p = 0.50). CONCLUSIONS: In this propensity score-matched cohort analysis, levetiracetam prophylaxis was ineffective in preventing seizures: the seizure rate was similar whether or not patients received the drug. The incidence of post-traumatic seizures in severe TBI patients was only 2.0% in this study; we therefore question the benefit of routine prophylactic anticonvulsant therapy in patients with TBI.
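The reported odds ratio and confidence interval can be checked directly from the 2x2 counts in the abstract; a short sketch using statsmodels (`Table2x2` computes the OR and a normal-approximation 95% CI, so the bounds differ slightly from the published ones):

```python
# Sketch: odds ratio and 95% CI for seizures, severe vs. non-severe TBI,
# from the counts reported in the abstract.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                    seizure   no seizure
table = np.asarray([[36, 1795 - 36],    # severe TBI (head AIS 3-5)
                    [3, 3756 - 3]])     # non-severe TBI
t = Table2x2(table)
lo, hi = t.oddsratio_confint()
print(f"OR = {t.oddsratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")
# -> OR = 25.6 (95% CI 7.9-83.3), close to the abstract's 25.6 [7.8-83.2]
```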
Subjects
Anticonvulsants/therapeutic use , Brain Injuries, Traumatic/complications , Piracetam/analogs & derivatives , Seizures/prevention & control , Adolescent , Adult , Chemoprevention , Databases, Factual , Female , Humans , Levetiracetam , Male , Middle Aged , Phenytoin/analogs & derivatives , Phenytoin/therapeutic use , Piracetam/therapeutic use , Propensity Score , Retrospective Studies , Seizures/etiology , Treatment Failure , Young Adult
ABSTRACT
INTRODUCTION: Computed tomography angiography (CTA) is being used to identify traumatic intracranial aneurysms (TICA) in patients with findings such as skull fracture and intracranial haemorrhage on initial computed tomography (CT) scans after blunt traumatic brain injury (TBI). However, the incidence of TICA in patients with blunt TBI is unknown. The aim of this study was to report the incidence of TICA in patients with blunt TBI and to assess the utility of CTA in detecting these lesions. METHODS: A 10-year retrospective study (2003-2012) was performed at a level 1 trauma centre. All patients with blunt TBI who had an initial non-contrast head CT scan and a follow-up head CTA were included. Head CTAs were reviewed by a single investigator, and TICAs were identified. The primary outcome measure was the incidence of TICA in blunt TBI. RESULTS: A total of 10,257 patients with blunt TBI were identified, of whom 459 were included in the analysis. Mean age was 47.3 ± 22.5 years, the majority were male (65.1%), and median Injury Severity Score was 16 [IQR 9-25]. Thirty-six patients (7.8%) had an intracranial aneurysm, of whom three (0.65%) had TICAs. CONCLUSION: The incidence of traumatic intracranial aneurysm was exceedingly low (0.65%) over 10 years. This study adds to the growing literature questioning the empiric use of CTA for detecting vascular injuries in patients with blunt TBI.
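The abstract reports the 0.65% incidence without an interval; a one-liner sketch showing how an exact (Clopper-Pearson) 95% CI could be attached to the 3/459 estimate:

```python
# Sketch: exact (Clopper-Pearson) 95% CI around the 3/459 TICA incidence.
from statsmodels.stats.proportion import proportion_confint

lo, hi = proportion_confint(count=3, nobs=459, method="beta")  # exact CI
print(f"incidence 0.65% (95% CI {lo:.2%} to {hi:.2%})")
```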
Subjects
Head Injuries, Closed/diagnosis , Intracranial Aneurysm/diagnosis , Adult , Female , Head Injuries, Closed/complications , Head Injuries, Closed/epidemiology , Head Injuries, Closed/therapy , Humans , Intracranial Aneurysm/epidemiology , Intracranial Aneurysm/etiology , Male , Middle Aged , Retrospective Studies , Tomography, X-Ray Computed/methods , Trauma Centers/statistics & numerical data
ABSTRACT
BACKGROUND: Geriatric patients are at higher risk for adverse outcomes after injury because of their altered physiological reserve. Mortality after trauma laparotomy remains high; however, outcomes in geriatric patients after trauma laparotomy have not been well established. The aim of our study was to identify factors predicting mortality in geriatric trauma patients undergoing laparotomy. METHODS: A retrospective study was performed of all trauma patients undergoing laparotomy at our level 1 trauma center over a 6-year period (2006-2012). Patients aged ≥55 years who underwent a trauma laparotomy were included. Patients with a head abbreviated injury scale (AIS) score ≥3 or thorax AIS ≥3 were excluded. Our primary outcome measure was mortality. Factors significant in the univariate regression model were entered into a multivariate regression analysis to evaluate the factors predicting mortality. RESULTS: A total of 1150 patients underwent a trauma laparotomy, of whom 90 met the inclusion criteria. Mean age was 67 ± 10 years, 63% were male, and median abdominal AIS was 3 (2-4). The overall mortality rate was 23.3% (21/90) and progressively increased with age (P = 0.013). Age (P = 0.02) and lactate (P = 0.02) were independent predictors of mortality in geriatric patients undergoing laparotomy. CONCLUSIONS: The mortality rate after trauma laparotomy increases with age. Age and admission lactate were predictors of mortality in the geriatric population undergoing trauma laparotomy.
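A minimal sketch of the multivariate step, regressing mortality on age and admission lactate with statsmodels; the data are synthetic stand-ins, and the coefficients in the generating model are arbitrary:

```python
# Sketch of the multivariate logistic regression for mortality predictors.
# Synthetic data; the generating coefficients are arbitrary placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 90
age = rng.normal(67, 10, n)                  # years
lactate = rng.gamma(2.0, 1.5, n)             # admission lactate, mmol/L
logit = -9 + 0.08 * age + 0.5 * lactate      # synthetic true model
died = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([age, lactate]))
fit = sm.Logit(died, X).fit(disp=0)
print(fit.summary(xname=["const", "age", "lactate"]))
```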
Subjects
Laparotomy/mortality , Wounds and Injuries/surgery , Aged , Aged, 80 and over , Arizona/epidemiology , Female , Humans , Male , Middle Aged , Retrospective Studies , Wounds and Injuries/complications , Wounds and Injuries/mortality
ABSTRACT
BACKGROUND: Liver transplantation (LT) is a well-established treatment for hepatocellular carcinoma (HCC), but there are ongoing debates regarding outcomes and selection. This study examines the experience of LT for HCC at a high-volume centre. METHODS: A prospectively maintained database was used to identify HCC patients undergoing LT from 2000 to 2020 with ≥3 years of follow-up. Data were obtained from the centre database and electronic medical records. The Metroticket 2.0 HCC-specific 5-year survival scale was calculated for each patient. Kaplan-Meier and Cox regression analyses were employed to assess survival between groups based on Metroticket score and individual donor and recipient risk factors. RESULTS: Five hundred sixty-nine patients met criteria. Median follow-up was 96.2 months (8.12 years; interquartile range 59.9-147.8). Three-year recurrence-free survival (RFS) and overall survival (OS) were 88.6% (n=504) and 86.6% (n=493). Five-year RFS and OS were 78.9% (n=449) and 79.1% (n=450). Median Metroticket 2.0 score was 0.9 (interquartile range 0.9-0.95). Tumour size greater than 3 cm (P=0.012), increasing tumour number on imaging (P=0.001), and explant pathology (P<0.001) were associated with recurrence. Transplantation within Milan (P<0.001) or UCSF (P<0.001) criteria was associated with lower recurrence rates. Increasing alpha-fetoprotein (AFP) values were associated with more HCC recurrence (P<0.001) and reduced OS (P=0.008). Chemoembolization was predictive of recurrence in the overall population (P=0.043) and in those outside Milan criteria (P=0.038). A receiver operating characteristic curve using Metroticket 2.0 identified an optimal cut-off of projected survival ≥87.5% for predicting recurrence. This cut-off predicted RFS (P<0.001) in the total cohort and both RFS (P=0.007) and OS (P=0.016) outside Milan. Recipients of donation after brain death (DBD) grafts (55/478, 13%) or living-donor grafts (3/22, 13.6%) experienced better survival than recipients of donation after cardiac death (DCD) grafts (15/58, 25.6%, P=0.009). Donor age was associated with higher HCC recurrence (P=0.006). Both total ischaemia time (TIT) greater than 6 hours (P=0.016) and increasing TIT correlated with higher HCC recurrence (P=0.027). The use of DCD grafts for outside-Milan candidates was associated with increased recurrence (P=0.039) and reduced survival (P=0.033). CONCLUSION: This large two-centre analysis confirms favourable outcomes after LT for HCC. Tumour size and number, pre-transplant AFP, and Milan criteria remain important recipient HCC risk factors. Higher donor risk (i.e., donor age, DCD grafts, ischaemia time) was associated with poorer outcomes.
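A sketch of a Cox proportional-hazards model over the donor-risk variables highlighted above (donor age, DCD graft, total ischaemia time), using `lifelines` with synthetic data; the column names are illustrative, not the study's dataset schema:

```python
# Sketch of Cox regression relating donor factors to HCC recurrence.
# Synthetic stand-in data; requires `lifelines` and pandas.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 569
df = pd.DataFrame({
    "donor_age": rng.normal(45, 15, n),          # years
    "dcd_graft": rng.integers(0, 2, n),          # 1 = DCD, 0 = DBD/LD
    "total_ischaemia_h": rng.normal(6, 2, n),    # hours
    "months_to_event": rng.exponential(60, n),   # follow-up time
    "recurrence": rng.integers(0, 2, n),         # 1 = HCC recurrence
})
cph = CoxPHFitter().fit(df, duration_col="months_to_event",
                        event_col="recurrence")
cph.print_summary()   # hazard ratios = exp(coef) per covariate
```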
Subjects
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Humans , Carcinoma, Hepatocellular/surgery , Carcinoma, Hepatocellular/mortality , Carcinoma, Hepatocellular/pathology , Liver Transplantation/mortality , Liver Neoplasms/surgery , Liver Neoplasms/mortality , Liver Neoplasms/pathology , Male , Female , Middle Aged , Risk Assessment , Follow-Up Studies , Aged , Retrospective Studies , Adult , Risk Factors , Neoplasm Recurrence, Local , Kaplan-Meier Estimate
ABSTRACT
BACKGROUND: This study compares selection criteria for liver transplant (LT) for hepatocellular carcinoma (HCC) in terms of inclusivity and predictive ability, to identify the most permissive criteria that maintain patient outcomes. METHODS: The Scientific Registry of Transplant Recipients (SRTR) database was queried for deceased-donor LTs for HCC (2003-2020) with 3-year follow-up; these data were compared with a 2-center experience. Milan, University of California San Francisco (UCSF), 5-5-500, Up-to-seven (U7), HALT-HCC, and Metroticket 2.0 scores were calculated. RESULTS: Nationally, 26,409 patients were included, and 547 at the 2 institutions. Median SRTR follow-up was 6.8 years (interquartile range 3.9-10.1). Three criteria allowed expansion of candidacy versus Milan: UCSF (7.7%, n = 1898), Metroticket 2.0 (4.2%, n = 1037), and U7 (3.5%, n = 828). The absolute difference in 3-year overall survival (OS) between scores was 1.5%. HALT-HCC (area under the curve [AUC] = 0.559, 0.551-0.567) best predicted 3-year OS, although AUC was notably similar between criteria (0.506 < AUC < 0.527; Milan = 0.513, UCSF = 0.506, 5-5-500 = 0.522, U7 = 0.511, HALT-HCC = 0.559, and Metroticket 2.0 = 0.520), as was Harrell's c-statistic (0.507 < c-statistic < 0.532). All scores predicted survival at P < 0.001 on competing risk analysis. Median follow-up in our enterprise was 9.8 years (interquartile range 7.1-13.3). U7 (13.0%, n = 58), UCSF (11.1%, n = 50), HALT-HCC (6.4%, n = 29), and Metroticket 2.0 (6.3%, n = 28) allowed candidate expansion. HALT-HCC (AUC = 0.768, 0.713-0.823) and Metroticket 2.0 (AUC = 0.739, 0.677-0.801) were the most predictive of recurrence. All scores predicted recurrence and survival at P < 0.001 in competing risk analysis. CONCLUSIONS: Less restrictive criteria such as Metroticket 2.0, UCSF, or U7 allow broader application of transplantation for HCC without sacrificing outcomes. Thus, the criteria for Model for End-Stage Liver Disease exception points for HCC should be expanded to allow more patients to receive life-saving transplantation.
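For reference, the candidacy rules being compared can be written as simple predicates. The definitions below follow the commonly cited forms (Milan: single tumor ≤5 cm, or ≤3 tumors each ≤3 cm; UCSF: single ≤6.5 cm, or ≤3 tumors with largest ≤4.5 cm and total ≤8 cm; Up-to-seven: tumor count plus largest diameter ≤7; 5-5-500: ≤5 tumors, each ≤5 cm, AFP ≤500 ng/mL); the vascular-invasion and extrahepatic-spread exclusions common to all are omitted for brevity:

```python
# Sketch of the transplant candidacy criteria as predicates.
# Tumor sizes in cm, AFP in ng/mL.
def milan(sizes):
    return (len(sizes) == 1 and sizes[0] <= 5.0) or \
           (len(sizes) <= 3 and max(sizes) <= 3.0)

def ucsf(sizes):
    return (len(sizes) == 1 and sizes[0] <= 6.5) or \
           (len(sizes) <= 3 and max(sizes) <= 4.5 and sum(sizes) <= 8.0)

def up_to_seven(sizes):
    return len(sizes) + max(sizes) <= 7.0

def five_five_500(sizes, afp):
    return len(sizes) <= 5 and max(sizes) <= 5.0 and afp <= 500.0

tumors, afp = [3.2, 2.1], 120.0   # hypothetical candidate
print(milan(tumors), ucsf(tumors), up_to_seven(tumors),
      five_five_500(tumors, afp))   # False True True True
```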
ABSTRACT
Hepatocellular carcinoma (HCC) is the third leading cause of cancer-related death and the sixth most diagnosed malignancy worldwide. Serum alpha-fetoprotein (AFP) is the traditional, ubiquitous biomarker for HCC. However, there has been an increasing call for the use of multiple biomarkers to optimize care for these patients. AFP, AFP-L3, and des-gamma-carboxy prothrombin (DCP, also known as prothrombin induced by vitamin K absence-II) have described clinical utility for HCC, but unfortunately, they also have well-established and significant limitations. Circulating tumor DNA (ctDNA), genomic glycosylation, and even totally non-invasive salivary metabolomics and/or microRNAs demonstrate great promise for early detection and long-term surveillance, but still require large-scale prospective validation to definitively establish their clinical validity. This review aims to provide an update on clinically available and emerging biomarkers for HCC, focusing on their respective clinical strengths and weaknesses.
ABSTRACT
INTRODUCTION: Circulating tumor DNA (ctDNA) is emerging as a promising, non-invasive diagnostic and surveillance biomarker in solid organ malignancy. However, its utility before and after liver transplant (LT) for patients with primary and secondary liver cancers is still underexplored. METHODS: Patients undergoing LT for hepatocellular carcinoma (HCC), cholangiocarcinoma (CCA), and colorectal liver metastases (CRLM) with ctDNA testing were included. ctDNA testing was conducted pre-transplant, post-transplant, or both (sequential) from 11/2019 to 09/2023 using Guardant360, Guardant Reveal, and Guardant360 CDx. RESULTS: 21 patients with HCC (n = 9, 43%), CRLM (n = 8, 38%), CCA (n = 3, 14%), and mixed HCC/CCA (n = 1, 5%) were included in the study. The median follow-up time was 15 months (range: 1-124). The median time from pre-operative testing to surgery was 3 months (IQR: 1-4; range: 0-5), and from surgery to post-operative testing it was 9 months (IQR: 2-22; range: 0.4-112). A total of 13 (62%) patients had pre-transplant testing, of whom 8 (62%) had ctDNA detected (ctDNA+) and 5 (38%) did not (ctDNA-). A total of 18 (86%) patients had post-transplant testing, of whom 11 (61%) were ctDNA+ and 7 (39%) were ctDNA-. Absolute recurrence rates were 50% (n = 5) in those who were ctDNA+ vs. 25% (n = 1) in those who were ctDNA- in the post-transplant setting, though this difference was not statistically significant (p = 0.367). Six (29%) patients (HCC = 3, CCA = 1, CRLM = 2) experienced recurrence, with a median recurrence-free survival of 14 (IQR: 6-40) months. Four of these patients had positive post-transplant ctDNA collected following the diagnosis of recurrence, while one patient had positive post-transplant ctDNA collected preceding recurrence. A total of 10 (48%) patients had sequential ctDNA testing, of whom 5 (50%) achieved ctDNA clearance (+/-). The remainder were ctDNA+/+ (n = 3, 30%), ctDNA-/- (n = 1, 10%), and ctDNA-/+ (n = 1, 10%). Three (30%) patients showed acquisition of new genomic alterations following transplant, all without recurrence. Overall, the median tumor mutation burden (TMB) decreased from 1.23 mut/Mb pre-transplant to 0.00 mut/Mb post-transplant. CONCLUSIONS: Patients with ctDNA positivity experienced recurrence at a higher rate than ctDNA- patients, indicating a potential role for ctDNA in predicting recurrence after curative-intent transplant. Based on sequential testing, LT has the potential to clear ctDNA, demonstrating its capability in treating systemic disease. Transplant providers should be aware of the potential confounding effect of donor-derived cell-free DNA, and improved approaches are necessary to address this concern.
ABSTRACT
INTRODUCTION: Pediatric trauma centers (PTCs) were created to address the unique needs of injured children, with the expectation that outcomes would be improved. However, prior studies evaluating the impact of PTCs have had conflicting results. Our study was conducted to further clarify this question. We hypothesized that severely injured children ≤14 years of age have better outcomes at PTCs and that better survival may be due to higher emergency department (ED) survival rates than at adult trauma centers (ATCs). METHODS: A retrospective analysis of severely injured children (ISS >15) ≤18 years of age entered into the National Trauma Data Bank (NTDB) between 2011 and 2012 was performed. Subjects were stratified into 2 age cohorts: young children (0-14 years) and adolescents (15-18 years). Primary outcomes were ED and in-patient (IP) mortality. Secondary outcomes included in-hospital complications, hospital and ICU length of stay, and ventilator days. Outcome differences were assessed using multilevel logistic and negative binomial regression analyses. RESULTS: A total of 10,028 children were included. Median ISS was 22 (interquartile range 17-29). Adjusting for confounders on multivariate analysis, children ≤14 had lower odds of ED mortality (OR 0.42 [95% CI 0.25-0.71], p=0.001) and IP mortality (OR 0.73 [95% CI 0.5-0.9], p=0.02) at PTCs. There were no differences in odds of ED mortality (OR 0.81 [95% CI 0.5-1.3], p=0.4) or IP mortality (OR 1.01 [95% CI 0.8-1.2], p=0.88) for adolescents between center types. There were no differences in complication rates between PTCs and ATCs (OR 0.86 [95% CI 0.69-1.06], p=0.17), but children were more likely to be discharged home and had more ICU-free and ventilator-free days if treated at a PTC. CONCLUSION: Young children, but not adolescents, have better ED survival at PTCs compared with ATCs. Level of Evidence: Level IV, therapeutic.
ABSTRACT
INTRODUCTION: Optimization of surgical outcomes after colectomy continues to be actively studied, but most studies group right-sided and left-sided colectomies together. The aim of our study was to determine whether complication rates differ between right-sided and left-sided colectomies for cancer. METHODS: We identified patients who underwent laparoscopic colectomy for colon cancer between 2005 and 2010 in the American College of Surgeons National Surgical Quality Improvement Program database and stratified cases by side. The two groups were matched using propensity score matching for demographics, previous abdominal surgery, pre-operative chemotherapy and radiotherapy, and pre-operative laboratory data. Outcome measures were 30-day mortality and morbidity. RESULTS: We identified 2512 patients who underwent elective laparoscopic colectomy for right-sided or left-sided colon cancer. The two groups were similar in demographics and pre-operative characteristics. There was no difference in overall morbidity (15% vs. 17.7%; p = 0.08) or 30-day mortality (1.5% vs. 1.5%; p = 0.9) between the two groups. Sub-analysis revealed higher surgical site infection rates (9% vs. 6%; p = 0.04), a higher incidence of ureteral injury (0.6% vs. 0.4%; p = 0.04), a higher conversion rate to open colectomy (51% vs. 30%; p = 0.01), and a longer hospital length of stay (10.5 ± 4 vs. 7.1 ± 1.3 days; p = 0.02) in patients undergoing laparoscopic left colectomy. CONCLUSION: Our study highlights the difference in complications between right-sided and left-sided colectomies for cancer. Further research on outcomes after colectomy should incorporate right- vs. left-sided colon resection as a potential pre-operative risk factor.
ABSTRACT
INTRODUCTION: Whole-body CT (WBCT) scanning is known to be associated with significant radiation risk, especially in pediatric trauma patients. The aim of this study was to assess the use of WBCT across trauma centers in the management of pediatric trauma patients. METHODS: We performed a two-year (2011-2012) retrospective analysis of the National Trauma Data Bank. Pediatric (age ≤18 years) trauma patients managed in level I or II adult or pediatric trauma centers with a head, neck, thoracic, or abdominal CT scan were included. WBCT was defined as CT of the head, neck, thorax, and abdomen. Patients were stratified into two groups: patients managed in adult centers and patients managed in designated pediatric centers. The outcome measure was the use of WBCT. Multivariate logistic regression analysis was performed. RESULTS: A total of 30,667 pediatric trauma patients were included, of whom 38.3% (n=11,748) were managed in designated pediatric centers. Overall, 26.1% (n=8013) of patients received a WBCT. The use of WBCT was significantly higher in adult trauma centers than in pediatric centers (31.4% vs. 17.6%, p=0.001). There was no difference in mortality rate between the two groups (2.2% vs. 2.1%, p=0.37). After adjusting for all confounding factors, pediatric patients managed in adult centers were 1.8 times more likely to receive a WBCT than patients managed in pediatric centers (OR [95% CI]: 1.8 [1.3-2.1], p=0.001). CONCLUSIONS: Variability exists in the use of WBCT across trauma centers with no difference in patient outcomes. Pediatric patients managed in adult trauma centers were more likely to receive WBCT, increasing their radiation exposure without a difference in outcomes. Establishing guidelines to minimize the use of WBCT across centers is warranted.