Results 1 - 20 of 43
1.
Am J Transplant ; 2024 Jun 10.
Article in English | MEDLINE | ID: mdl-38866110

ABSTRACT

Medical literature highlights differences in liver transplantation (LT) waitlist experiences among ABO blood types. Type AB candidates reportedly have higher LT rates and reduced mortality. Despite liver offering guidelines, ABO disparities persist. This study examines LT access discrepancies among blood types, focusing on type AB, and seeks equitable strategies. Using the United Network for Organ Sharing database (2003-2022), 170,276 waitlist candidates were retrospectively analyzed. Dual predictive analyses (LT opportunity and survival studies) evaluated 1-year recipient pool survival, considering waitlist and post-LT survival, alongside anticipated allocation value per recipient, under 6 scenarios. Of the cohort, 97,670 patients (57.2%) underwent LT. Type AB recipients had the highest LT rate (73.7% vs 55.2% for O), shortest median waiting time (90 vs 198 days for A), and lowest waitlist mortality (12.9% vs 23.9% for O), with the lowest median Model for End-Stage Liver Disease-sodium (MELD-Na) score (20 vs 25 for A/O). The LT opportunity study revealed that reallocating type A (or A and O) donors originally for AB recipients to A recipients yielded the greatest reduction in disparities in anticipated value per recipient, from 0.19 (before modification) to 0.08. Meanwhile, the survival study showed that ABO-identical LTs reduced disparity the most (3.5% to 2.8%). Sensitivity analysis confirmed these findings were specific to the MELD-Na score < 30 population, indicating current LT allocation may favor certain blood types. Prioritizing ABO-identical LTs for MELD-Na score < 30 recipients could ensure uniform survival outcomes and mitigate disparities.
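The scenario comparison above reduces to a simple disparity metric over per-blood-type expected allocation values. As an illustration only, a minimal sketch in Python; the per-type values below are hypothetical, not the study's, and only the before/after spread (0.19 to 0.08) mirrors the reported figures:

```python
# Sketch: comparing blood-type disparity in expected allocation value per
# recipient across reallocation scenarios. All per-type values are
# hypothetical -- the actual scenario definitions live in the UNOS analysis.

def disparity(values_by_type):
    """Disparity = spread (max - min) of expected value across blood types."""
    return max(values_by_type.values()) - min(values_by_type.values())

# Hypothetical expected allocation value per recipient, by blood type.
before = {"A": 0.88, "B": 0.95, "AB": 1.05, "O": 0.86}
# Hypothetical scenario: type A donors once directed to AB recipients are
# reallocated to A recipients, narrowing the spread.
after = {"A": 0.94, "B": 0.95, "AB": 0.98, "O": 0.90}

print(round(disparity(before), 2))  # 0.19
print(round(disparity(after), 2))   # 0.08
```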

2.
Article in English | MEDLINE | ID: mdl-38908731

ABSTRACT

BACKGROUND AND AIMS: Continuous risk stratification of candidates and urgency-based prioritization have been utilized for liver transplantation (LT) in non-hepatocellular carcinoma (HCC) patients in the United States. For HCC patients, by contrast, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), a continuous oncological risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021), spanning multiple policy changes. The primary outcome was the ability to discriminate waitlist dropout and post-LT outcomes, comparing the HALT-HCC score with other HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5,196 (21.8%) dropped out of the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted five-year overall survival after LT. HALT-HCC demonstrated the highest AUC values for predicting dropout at various intervals post-listing (0.68 at six months, 0.66 at one year), with excellent calibration (R2 = 0.95 at six months, 0.88 at one year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the capability of a continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in HCC patients, independent of policy changes, and advocates integrating continuous scoring systems such as HALT-HCC into liver allocation decisions, balancing urgency, organ utility, and survival benefit.
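The discrimination statistic reported above (AUC) is the probability that a randomly chosen dropout case receives a higher risk score than a randomly chosen non-dropout. A minimal rank-based sketch with invented scores (not HALT-HCC output):

```python
# Minimal AUC (concordance) computation for a continuous risk score
# predicting a binary event (e.g., waitlist dropout). Illustrative only:
# the scores and outcomes below are made up.

def auc(scores, events):
    """Probability that a randomly chosen event case scores higher than a
    randomly chosen non-event case (ties count 0.5)."""
    pos = [s for s, e in zip(scores, events) if e]
    neg = [s for s, e in zip(scores, events) if not e]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
events = [1,   1,   0,   1,   0,    0,   1,   0]
print(round(auc(scores, events), 3))  # 0.75
```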

3.
Liver Transpl ; 2024 May 13.
Article in English | MEDLINE | ID: mdl-38727618

ABSTRACT

There is no recent update on the clinical course of retransplantation (re-LT) after living donor liver transplantation (LDLT) in the US using national data. The UNOS database (2002-2023) was used to explore patient characteristics at initial LT, comparing deceased donor liver transplantation (DDLT) and LDLT for graft survival (GS), reasons for graft failure, and GS after re-LT. Waitlist dropout and re-LT likelihood were assessed, with the re-LT cohort categorized by time to re-listing as acute (≤1 mo) or chronic (>1 mo). Of 132,323 DDLT and 5,955 LDLT initial transplants, 3,848 DDLT and 302 LDLT recipients underwent re-LT. Of the 302 re-LT following LDLT, 156 were acute and 146 chronic. Primary nonfunction (PNF) was more common in DDLT, although the difference was not statistically significant (17.4% vs. 14.8% for LDLT; p = 0.52). Vascular complications were significantly higher in LDLT (12.5% vs. 8.3% for DDLT; p < 0.01). Acute re-LT showed a larger difference in PNF between DDLT and LDLT (49.7% vs. 32.0%; p < 0.01). Status 1 patients were more common in DDLT (51.3% vs. 34.0% in LDLT; p < 0.01). In the acute cohort, Kaplan-Meier curves indicated superior GS after re-LT for initial LDLT recipients in both the short and long term (p = 0.02 and p < 0.01, respectively), with no significant difference in the chronic cohort. No significant differences in waitlist dropout were observed, but the initial LDLT group had a higher re-LT likelihood in the acute cohort (sHR 1.40, p < 0.01). A sensitivity analysis focusing on the most recent 10-year cohort revealed trends consistent with the overall findings. LDLT recipients had better GS after re-LT than DDLT recipients. Despite a higher severity of illness, the DDLT cohort was less likely to undergo re-LT.
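The Kaplan-Meier curves referenced above come from the product-limit estimator. A toy sketch of that estimator, with invented follow-up times rather than UNOS data:

```python
# Toy Kaplan-Meier product-limit estimator, the method behind the graft
# survival (GS) comparisons above. Times and censoring flags are invented.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time. events: 1=graft loss, 0=censored."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        losses = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        if losses:
            surv *= 1 - losses / at_risk        # product-limit step
            curve.append((t, round(surv, 3)))
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

# 6 grafts: losses at months 2, 5, 9; censored at 3, 7, 12.
print(kaplan_meier([2, 3, 5, 7, 9, 12], [1, 0, 1, 0, 1, 0]))
```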

4.
Liver Transpl ; 30(4): 376-385, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-37616509

ABSTRACT

With the increase in metabolic dysfunction-associated steatotic liver disease, the use of steatotic grafts in liver transplantation (LT) and their impact on postoperative graft survival (GS) need further exploration. Analyzing adult LT recipient data (2002-2022) from the United Network for Organ Sharing database, outcomes of LT using steatotic (≥30% macrosteatosis) and nonsteatotic donor livers, donors after circulatory death, and standard-risk older donors (age 45-50) were compared. GS predictors were evaluated using Kaplan-Meier and Cox regression analyses. Of the 35,345 LT donors, 8.9% (3,155) were steatotic. The initial 30-day postoperative period revealed significant challenges with steatotic livers, demonstrating inferior GS. However, the GS discrepancy between steatotic and nonsteatotic livers subsided over time (p = 0.10 at 5 y). Long-term GS outcomes were comparable or even superior in steatotic livers relative to nonsteatotic livers, conditional on surviving the initial 90 postoperative days (p = 0.90 at 1 y) or 1 year (p = 0.03 at 5 y). In multivariable Cox regression analysis, a high body surface area (BSA) ratio (≥1.1; HR 1.42, p = 0.02), calculated as donor BSA divided by recipient BSA, long cold ischemic time (≥6.5 h; HR 1.72, p < 0.01), and recipient medical condition (intensive care unit hospitalization; HR 2.53, p < 0.01) emerged as significant adverse prognostic factors. Among young (<40 y) steatotic donors, a high BSA ratio, diabetes, and intensive care unit hospitalization were significant indicators of a worse prognosis (p < 0.01). Our study emphasizes the 30-day survival challenge immediately after LT using steatotic livers. However, with careful donor-recipient matching, for example, avoiding the use of steatotic donors with long cold ischemic times and high BSA ratios for recipients in the intensive care unit, immediate GS can be enhanced, and over the longer term, outcomes comparable to those using nonsteatotic livers, donors after circulatory death, or standard-risk older donors can be anticipated. These insights into decision-making criteria for steatotic liver use provide guidance for clinicians.
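The donor-to-recipient BSA ratio flagged above (≥1.1) is straightforward to compute. A sketch using the Mosteller formula; which BSA formula the registry analysis actually used is an assumption here, and the heights/weights are invented:

```python
import math

# Donor-to-recipient body surface area (BSA) ratio, flagged in the study
# at >= 1.1. The Mosteller formula below is one common choice; whether the
# study used Mosteller or another formula is an assumption.

def bsa_mosteller(height_cm, weight_kg):
    return math.sqrt(height_cm * weight_kg / 3600)

def bsa_ratio(donor_h, donor_w, recip_h, recip_w):
    return bsa_mosteller(donor_h, donor_w) / bsa_mosteller(recip_h, recip_w)

r = bsa_ratio(185, 95, 160, 55)   # large donor, small recipient (invented)
print(round(r, 2), r >= 1.1)      # 1.41 True
```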


Subjects
Fatty Liver, Liver Transplantation, Humans, Middle Aged, Liver Transplantation/methods, Prognosis, Fatty Liver/etiology, Liver/metabolism, Tissue Donors, Graft Survival
5.
Liver Transpl ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38625836

ABSTRACT

The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. A total of 7,602 DCD LT cases from the United Network for Organ Sharing database (2003-2022) were reviewed. The impact of older DCD donors on graft survival was assessed using Kaplan-Meier and hazard ratio (HR) analyses. In all, 1,447 LT cases (19.0%) involved older DCD donors. Although their use decreased from 2003 to 2014, a resurgence was noted after 2015, reaching 21.9% of all LTs in the last 4 years (2019-2022). Initially, 90-day and 1-year graft survival for older DCDs was worse than for younger DCDs, but this difference decreased over time, with no statistical difference after 2015. Similarly, HRs for graft loss in older DCD LT have recently become insignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole to cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, while in the later phases, cold ischemic time (>5.5 h) was a significant predictor. LT outcomes using older DCD donors have become comparable to those from younger DCD donors, with recent HRs for graft loss becoming insignificant. The strategic approach of the recent period could mitigate risks, including managing cold ischemic time (≤5.5 h), reducing asystole to cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.

6.
Clin Transplant ; 38(7): e15379, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38952196

ABSTRACT

BACKGROUND: Introducing new liver transplantation (LT) practices, such as unconventional donor use, incurs higher costs, making evaluation of their prognostic justification crucial. This study reexamined the spread pattern of new LT practices and their prognosis across the United States. METHODS: The study investigated the spread pattern of new practices using the UNOS database (2014-2023). Practices included LT for hepatitis B/C (HBV/HCV) nonviremic recipients with viremic donors, LT for COVID-19-positive recipients, and LT using onsite machine perfusion (OMP). One-year post-LT patient and graft survival were also evaluated. RESULTS: LTs using HBV/HCV donors were common in the East, while LTs for COVID-19 recipients and those using OMP started predominantly in California, Arizona, Texas, and the Northeast. K-means cluster analysis identified three adoption groups: facilities with rapid, slow, and minimal adoption rates. Rapid adoption occurred mainly in high-volume centers, followed by a gradual increase in middle-volume centers, with little increase in low-volume centers. The current spread patterns did not significantly affect patient survival. Specifically, for LTs with HCV donors or COVID-19 recipients, patient and graft survival in the rapid-increasing group were comparable to the other groups. In LTs involving OMP, the rapid- and slow-increasing groups tended to have better patient survival (p = 0.05) and significantly improved graft survival rates (p = 0.02). Facilities adopting new practices often overlapped across different practices. DISCUSSION: Our analysis revealed three distinct adoption groups across all practices, correlating adoption aggressiveness with center LT volume. Aggressive adoption of new practices did not compromise patient or graft survival, supporting the current strategy. Understanding historical trends could help predict the rise in future LT cases under new practices, aiding resource distribution.
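The three adoption groups above come from K-means clustering of per-center adoption activity. A toy 1-D K-means sketch; the per-center counts and the deterministic initialization are illustrative assumptions, not the study's implementation:

```python
# Toy 1-D k-means, mirroring the three adoption groups (minimal / slow /
# rapid). Annual new-practice case counts per center are invented.

def kmeans_1d(xs, k, iters=50):
    xs = sorted(xs)
    # Initialize centroids spread across the sorted data (deterministic).
    cents = [xs[i * (len(xs) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in xs:
            clusters[min(range(k), key=lambda i: abs(x - cents[i]))].append(x)
        cents = [sum(c) / len(c) if c else cents[i]
                 for i, c in enumerate(clusters)]
    return cents, clusters

counts = [0, 1, 1, 2, 3, 8, 9, 11, 24, 27, 30]   # hypothetical per-center counts
cents, groups = kmeans_1d(counts, 3)
print(groups)  # minimal, slow, and rapid adopters
```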


Subjects
COVID-19, Graft Survival, Liver Transplantation, SARS-CoV-2, Humans, Liver Transplantation/statistics & numerical data, United States/epidemiology, COVID-19/epidemiology, Female, Male, Middle Aged, Tissue and Organ Procurement/statistics & numerical data, Tissue Donors/supply & distribution, Tissue Donors/statistics & numerical data, Adult, Survival Rate, Prognosis, Practice Patterns, Physicians'/statistics & numerical data
7.
Clin Transplant ; 38(4): e15316, 2024 04.
Article in English | MEDLINE | ID: mdl-38607291

ABSTRACT

BACKGROUND: Graft failure following liver transplantation (LTx) remains a consistent problem. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine ML's predictive limitations in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and were compared with a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies, and performance was assessed using the AUC across validation iterations. RESULTS: In comparing predictive models for 14-, 30-, and 90-day graft survival, LightGBM consistently outperformed the other models, achieving the highest AUCs of 0.740, 0.722, and 0.700 under shuffled CV. However, under rolling CV the accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two of the three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, due to changing medical practices and selection criteria, continuous model evaluation is essential. Future studies should focus on temporal variations, clinical implications, and model transparency for broader medical utility.
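The shuffled-vs-rolling CV distinction above is about how train/test indices are constructed: shuffled folds mix eras, while rolling (expanding-window) splits always train on earlier patients and test on later ones, mimicking deployment on future data. A minimal sketch of both split schemes (the exact windowing used in the study is an assumption):

```python
# Sketch of the two validation schemes compared above: shuffled K-fold CV
# mixes time periods, while rolling CV trains only on earlier records and
# tests on later ones.

def shuffled_folds(n, k, seed=0):
    import random
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]          # k test folds

def rolling_splits(n, k):
    """Expanding-window splits: train on [0, cut), test on [cut, next_cut)."""
    cuts = [n * i // (k + 1) for i in range(1, k + 2)]
    return [(list(range(cuts[i])), list(range(cuts[i], cuts[i + 1])))
            for i in range(k)]

for train, test in rolling_splits(10, 3):
    assert max(train) < min(test)                 # train always precedes test
    print(train, test)
```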


Subjects
Liver Transplantation, Adult, Humans, Liver Transplantation/adverse effects, Research Design, Algorithms, Bilirubin, Machine Learning
8.
Clin Transplant ; 38(1): e15155, 2024 01.
Article in English | MEDLINE | ID: mdl-37812571

ABSTRACT

BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia was categorized as high total bilirubin (3.0-5.0 mg/dL) and very high bilirubin (≥5.0 mg/dL) in brain-dead donors. We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes to donors after circulatory death (DCD). RESULTS: Of 138 622 donors, 3452 (2.5%) had high bilirubin and 1999 (1.4%) had very high bilirubin levels. Utilization rates for normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month and 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared to DCD (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that despite many grafts with hyperbilirubinemia being non-utilized, acceptable post-LT outcomes can be achieved using donors with hyperbilirubinemia. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcome.


Subjects
Liver, Tissue and Organ Procurement, Adult, Humans, Prognosis, Tissue Donors, Graft Survival, Hyperbilirubinemia/etiology, Bilirubin, Retrospective Studies
9.
HPB (Oxford) ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38879433

ABSTRACT

BACKGROUND: Cause of death (COD) is a predictor of liver transplant (LT) outcomes independent of donor age, yet it has not been recently reappraised. METHODS: Analyzing the UNOS database (2013-2022), the study explored COD trends and their impacts on one-year post-LT graft survival (GS) and hazard ratios (HR) for graft failure. RESULTS: Of 80,282 brain-death donors, 55,413 (69.0%) underwent initial LT. Anoxia became the predominant COD in 2015, increasing from 29.0% in 2013 to 45.1% in 2021, with notable increases in drug intoxication. Survival differences between anoxia and cerebrovascular accidents (CVA) have recently become insignificant (P = 0.95). Further analysis showed that GS for intracranial hemorrhage/stroke, previously worse (P < 0.01), has improved to parity (P = 0.70). HRs for post-1-year graft failure have recently lost significance for CVA (vs. anoxia) and for intracranial hemorrhage/stroke (vs. any other COD). Donors with intracranial hemorrhage/stroke, showing improved survival and HR, were allocated to recipients with lower MELD-Na scores, contrasting with the trend for drug intoxication CODs. DISCUSSION: CVA, traditionally linked with poorer outcomes, now shows improved GS and HRs (vs. anoxia). This could be due to rising drug intoxication cases and the allocation of donors with drug intoxication to recipients with higher MELD-Na scores, and of those with CVA to recipients with lower scores. While COD remains crucial in donor selection, proper matching can mitigate differences among CODs.

10.
Liver Transpl ; 29(8): 793-803, 2023 08 01.
Article in English | MEDLINE | ID: mdl-36847140

ABSTRACT

The current liver allocation system may be disadvantaging younger adult recipients as it does not incorporate the donor-recipient age difference. Given the longer life expectancy of younger recipients, the influence of older donor grafts on their long-term prognosis should be elucidated. This study sought to reveal the long-term prognostic influence of the donor-recipient age difference in young adult recipients. Adult patients who received initial liver transplants from deceased donors between 2002 and 2021 were identified from the UNOS database. Young recipients (patients 45 years old or below) were categorized into 4 groups: donor age younger than the recipient, 0-9 years older, 10-19 years older, or 20 years older or above. Older recipients were defined as patients 65 years old or above. To examine the influence of the age difference in long-term survivors, conditional graft survival analysis was conducted on both younger and older recipients. Among 91,952 transplant recipients, 15,170 patients (16.5%) were 45 years old or below; these were categorized into 6,114 (40.3%), 3,315 (21.9%), 2,970 (19.6%), and 2,771 (18.3%) for groups 1-4, respectively. Group 1 demonstrated the highest probability of survival, followed by groups 2, 3, and 4 in both the actual graft survival and conditional graft survival analyses. In younger recipients who survived at least 5 years post-transplant, inferior long-term survival was observed when there was an age difference of 10 years or above (86.9% vs. 80.6%, log-rank p < 0.01), whereas there was no difference in older recipients (72.6% vs. 74.2%, log-rank p = 0.89). In younger patients who are not in emergent need of a transplant, preferential allocation of younger donor offers would optimize organ utility by increasing postoperative graft survival time.
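The conditional graft survival analysis above rests on a simple identity: the probability of reaching time t given survival to time s is S(t)/S(s). A worked sketch with hypothetical survival values (not the study's):

```python
# Conditional graft survival, the quantity behind the "survived at least
# 5 years" comparison above: S(t | alive at s) = S(t) / S(s).
# The survival probabilities below are hypothetical.

def conditional_survival(s_t, s_s):
    """Probability of surviving to t given survival to s (inputs: S(t), S(s))."""
    return s_t / s_s

s5, s10 = 0.70, 0.56   # hypothetical 5- and 10-year graft survival
print(round(conditional_survival(s10, s5), 2))  # 0.8: 80% of 5-year
                                                # survivors reach 10 years
```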


Subjects
Kidney Transplantation, Liver Transplantation, Humans, Young Adult, Aged, Infant, Newborn, Infant, Child, Preschool, Child, Middle Aged, Liver Transplantation/adverse effects, Living Donors, Time Factors, Tissue Donors, Survival Analysis, Graft Survival, Age Factors, Retrospective Studies
11.
Ann Surg Oncol ; 30(6): 3402-3410, 2023 Jun.
Article in English | MEDLINE | ID: mdl-36808590

ABSTRACT

BACKGROUND: Currently used treatment algorithms were originally established based on the clinical outcomes of initial treatment for primary hepatocellular carcinoma (HCC), and no strong evidence yet exists on whether these algorithms are also applicable to patients with recurrent HCC after surgery. As such, this study sought to explore an optimal risk stratification method for recurrent HCC for better clinical management. METHODS: Among 1,616 patients who underwent curative resection for HCC, the clinical features and survival outcomes of the 983 patients who developed recurrence were examined in detail. RESULTS: Multivariate analysis confirmed that both the disease-free interval (DFI) from the previous surgery and tumor stage at recurrence were significant prognostic factors. However, the prognostic impact of DFI differed according to tumor stage at recurrence. While curative-intent treatment strongly influenced survival (hazard ratio [HR], 0.61; P < 0.001) regardless of the DFI in patients with stage 0 or stage A disease at recurrence, early recurrence (<6 months) was a poor prognostic marker in patients with stage B disease. The prognosis of patients with stage C disease was influenced more by tumor distribution and choice of treatment than by the DFI. CONCLUSIONS: The DFI complementarily predicts the oncological behavior of recurrent HCC, with its predictive value differing by tumor stage at recurrence. These factors should be considered when selecting the optimal treatment for patients with recurrent HCC after curative-intent surgery.


Subjects
Carcinoma, Hepatocellular, Liver Neoplasms, Humans, Carcinoma, Hepatocellular/pathology, Liver Neoplasms/pathology, Hepatectomy, Neoplasm Recurrence, Local/pathology, Retrospective Studies, Prognosis, Disease-Free Survival
12.
Clin Transplant ; 37(12): e15127, 2023 12.
Article in English | MEDLINE | ID: mdl-37772621

ABSTRACT

BACKGROUND: Despite advancements in liver transplantation (LT) over the past two decades, liver re-transplantation (re-LT) remains challenging. This study aimed to assess improvements in re-LT outcomes and contributing factors. METHODS: Data from the United Network for Organ Sharing database (2002-2021) were analyzed, with recipients categorized into four-year intervals. Trends in re-LT characteristics and postoperative outcomes were evaluated. RESULTS: Of 128,462 LT patients, 7,254 received re-LT. Graft survival (GS) for re-LT improved (91.3%, 82.1%, and 70.8% at 30 days, 1 year, and 3 years post-LT in 2018-2021). However, hazard ratios (HRs) for graft loss remained elevated compared to marginal donors, including donors after circulatory death (DCD), although the gap in HRs narrowed for long-term GS. Changes in re-LT causes included a reduction in hepatitis C recurrence and an increase in graft failure after primary LT involving DCD. Recent trends included decreased cold ischemic time (CIT) and increased distance from the donor hospital in the re-LT group, whereas the DCD cohort exhibited a less pronounced increase in distance and a more marked decrease in CIT. The shortest CIT was recorded in the urgent re-LT group. The highest Model for End-Stage Liver Disease score was observed in the urgent re-LT group, and the lowest in the DCD group. Analysis revealed that a shorter interval between the previous LT and re-listing led to worse outcomes, and that the cause of primary graft failure influenced overall survival after re-LT. DISCUSSION: While short-term re-LT outcomes improved, challenges persist compared to DCD. Further enhancements are required, with ongoing research focusing on optimizing risk stratification models and allocation systems for better LT outcomes.


Subjects
End Stage Liver Disease, Liver Transplantation, Tissue and Organ Procurement, Humans, End Stage Liver Disease/surgery, Severity of Illness Index, Tissue Donors, Graft Survival, Retrospective Studies
13.
World J Surg ; 47(4): 1042-1048, 2023 04.
Article in English | MEDLINE | ID: mdl-36622435

ABSTRACT

BACKGROUND: This study aimed to explore the efficacy of gadoxetic acid-enhanced (Gd-EOB) magnetic resonance imaging (MRI) in surgical risk estimation among patients with marginal hepatic function as estimated by the indocyanine green (ICG) clearance test. METHODS: This analysis focused on 120 patients with marginal hepatic functional reserve (ICG clearance rate of the future liver remnant [ICG-Krem] < 0.10). Preoperative Gd-EOB MRI was retrospectively reviewed, and the remnant hepatocyte uptake index (rHUI) was calculated as a quantitative measure of liver function. The predictive power of rHUI for posthepatectomy liver failure was compared with several clinical measures used in current preoperative risk estimation. RESULTS: Receiver operating characteristic curve analysis showed that rHUI had the best predictive power for posthepatectomy liver failure among the tested variables (ICG-R15, ICG-Krem, albumin + bilirubin score, and albumin + ICG-R15 score). Cross-validation showed that a threshold of 925 could be the best cut-off value for estimating postoperative risk of liver failure, with sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio of 0.689, 0.884, 5.94, and 0.352, respectively. CONCLUSION: rHUI could be a sensitive substitute measure for posthepatectomy liver failure risk estimation among patients with marginal hepatic functional reserve.
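The likelihood ratios above follow directly from the reported sensitivity and specificity: LR+ = sensitivity / (1 - specificity) and LR- = (1 - sensitivity) / specificity. A quick check that the abstract's figures are internally consistent:

```python
# Likelihood ratios from the reported sensitivity/specificity of the
# rHUI cut-off (925): LR+ = sens / (1 - spec), LR- = (1 - sens) / spec.

def likelihood_ratios(sens, spec):
    return sens / (1 - spec), (1 - sens) / spec

lr_pos, lr_neg = likelihood_ratios(0.689, 0.884)
print(round(lr_pos, 2), round(lr_neg, 3))  # 5.94 0.352 -- matches the abstract
```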


Subjects
Liver Failure, Liver Neoplasms, Humans, Retrospective Studies, Liver Neoplasms/diagnostic imaging, Liver Neoplasms/surgery, Liver Neoplasms/etiology, Liver Function Tests, Liver/surgery, Hepatocytes/pathology, Liver Failure/diagnosis, Liver Failure/etiology, Hepatectomy/adverse effects, Indocyanine Green, Albumins, Magnetic Resonance Imaging/methods, Risk Assessment
14.
Langenbecks Arch Surg ; 408(1): 66, 2023 Jan 25.
Article in English | MEDLINE | ID: mdl-36695913

ABSTRACT

BACKGROUND: Alpha-fetoprotein (AFP)-producing gastric cancer (AFPGC) is reported to have biologically aggressive features and a poor prognosis. However, a relatively large number of patients with AFPGC have achieved long-term survival after surgery at our institution. This study aimed to clarify the clinical features of AFPGC and re-evaluate its long-term outcomes. METHODS: This analysis involved 465 patients who underwent surgery for gastric cancer (GC) at our institute between 1996 and 2020. The clinical features and long-term outcomes of the 24 patients with AFPGC were assessed, and differences in clinicopathological characteristics between AFPGC and non-AFPGC patients were statistically analyzed. RESULTS: In patients with AFPGC, the median preoperative serum AFP level was 232 ng/mL. Clinical characteristics of AFPGC patients included nodal metastasis, simultaneous liver metastasis, malignant cells in ascites, and lymphatic and venous involvement. During postoperative surveillance, fourteen patients received adjuvant therapy, eight developed recurrence, and four died of GC. The 3- and 5-year overall survival (OS) rates were 85.2% and 75.7% in AFPGC patients and 79.6% and 77.7% in non-AFPGC patients, respectively; the log-rank test identified no significant difference in OS between the groups. Tumor depth, nodal involvement, and venous involvement differed significantly between AFPGC and non-AFPGC patients. CONCLUSIONS: AFPGC has aggressive biological features, but long-term prognosis after surgery does not seem to be as poor as claimed in previous studies. Therefore, it may be important to detect AFPGC and start treatment early, while surgery is feasible.


Subjects
Liver Neoplasms, Stomach Neoplasms, Humans, alpha-Fetoproteins, Stomach Neoplasms/pathology, Prognosis, Liver Neoplasms/secondary
15.
Langenbecks Arch Surg ; 408(1): 73, 2023 Feb 01.
Article in English | MEDLINE | ID: mdl-36725735

ABSTRACT

PURPOSE: Tumor sidedness (hepatic side vs. peritoneal side) reportedly predicts microvascular invasion and survival outcomes of T2 gallbladder cancer, although the actual histopathological mechanism is not fully understood. METHODS: The clinical relevance of tumor sidedness was revisited in 84 patients with gallbladder cancer using histopathological analysis of the vascular density of the gallbladder wall. RESULTS: Hepatic-side tumor location was associated with overall survival (OS) (hazard ratio [HR], 13.62; 95% confidence interval [CI], 2.09-88.93) and recurrence-free survival (RFS) (HR, 8.70; 95% CI, 1.36-55.69) in T2 tumors. The adjusted Kaplan-Meier curve indicated a clear survival difference between T2a (peritoneal side) and T2b (hepatic side) tumors (P = 0.006). A review of 56 pathological specimens with gallbladder cancer and 20 control specimens demonstrated that subserosal vascular density was significantly higher on the hepatic side of the gallbladder, regardless of the presence of cancer (P < 0.001). Multivariate analysis also confirmed that higher subserosal vascular density was significantly associated with poor OS (HR, 1.73; 95% CI, 1.10-2.73 per 10 microscopic fields) and poor RFS (HR, 1.62; 95% CI, 1.06-2.49) in T2 gallbladder cancer. CONCLUSION: Higher subserosal vascular density may account for the higher incidence of cancer spread and the poor prognosis of T2b gallbladder cancer.


Subjects
Gallbladder Neoplasms, Humans, Prognosis, Microvascular Density, Proportional Hazards Models, Neoplasm Staging, Retrospective Studies
16.
Langenbecks Arch Surg ; 408(1): 381, 2023 Sep 28.
Article in English | MEDLINE | ID: mdl-37770582

ABSTRACT

PURPOSE: The optimal choice of diuretics for perioperative management in enhanced recovery after liver surgery remains unclear. This study investigated the efficacy and safety of tolvaptan (an oral vasopressin V2-receptor antagonist) in the postoperative management of patients with liver injury and hepatocellular carcinoma. METHODS: Patients clinically diagnosed with liver cirrhosis were included in this study. Clinical outcomes of a prospective cohort of 51 patients managed with a modified postoperative protocol using tolvaptan (validation group) were compared with those of 83 patients treated with a conventional management protocol (control group). RESULTS: In the validation group, postoperative urine output was significantly larger and excessive body weight gain was reduced, with no impairment in renal function or serum sodium levels. Although the total amount of discharge and the trend in serum albumin levels did not differ significantly between the groups, the overall incidence of postoperative morbidity was lower (19.6% vs. 44.6%, P = 0.005) and the postoperative stay was significantly shorter (8 vs. 10 days, P = 0.008) in the validation group than in the control group. CONCLUSIONS: Tolvaptan can be safely used in the postoperative management of patients with injured livers after hepatectomy and is potentially advantageous in the era of enhanced recovery after surgery, given its strong diuretic effect and better fluid management.


Subjects
Carcinoma, Hepatocellular, Liver Neoplasms, Humans, Tolvaptan, Carcinoma, Hepatocellular/surgery, Antidiuretic Hormone Receptor Antagonists/adverse effects, Hepatectomy/adverse effects, Prospective Studies, Benzazepines/adverse effects, Liver Neoplasms/surgery, Liver Neoplasms/drug therapy, Diuretics/adverse effects, Liver Cirrhosis/complications, Liver Cirrhosis/surgery
17.
Ann Surg Oncol ; 28(11): 6738-6746, 2021 Oct.
Article in English | MEDLINE | ID: mdl-33554286

ABSTRACT

BACKGROUND: Body composition data are reportedly correlated with patient prognosis for various cancers. However, little is known about the prognostic impact of adipose tissue distribution in patients with hepatocellular carcinoma (HCC). METHODS: Data for 181 consecutive cirrhotic patients who underwent hepatectomy for HCC were retrospectively reviewed. The clinical significance of the visceral-to-subcutaneous adipose tissue ratio (VSR) was investigated through analysis of short- and long-term surgical outcomes. RESULTS: Of the 181 patients, 60 (33%) were classified into the high-VSR group and 121 (67%) into the low-VSR group. Although VSR was not correlated with the risk of postoperative morbidity, multivariate analysis confirmed that a higher VSR was significantly correlated with a shorter time to interventional failure (hazard ratio [HR] 2.24; P = 0.008) and shorter overall survival (HR 2.65; P = 0.001), independently of American Joint Committee on Cancer stage and preoperative nutritional status. Analysis of recurrence patterns showed that the proportion of unresectable recurrence at the initial recurrence event was significantly higher in the high-VSR group (39% vs. 18%; P = 0.025). The yearly transition probabilities, defined by a Markov model, from postoperative R0 status to advanced disease or death (7.6% vs. 1.5%, P < 0.001) and from early recurrence to advanced disease or death (15.4% vs. 2.8%, P = 0.004) were higher in the high-VSR group, suggesting that patients with a higher VSR are vulnerable to disease progression. CONCLUSION: A high VSR was an independent predictor of disease progression and poor prognosis in patients with underlying liver cirrhosis undergoing resection for HCC.
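The yearly transition probabilities reported above have a simple compounding interpretation. As a minimal sketch (not the authors' actual Markov model), a constant yearly probability p of leaving a state for advanced disease or death implies a (1 - p)^n chance of remaining event-free after n years:

```python
def prob_remaining(p_yearly: float, years: int) -> float:
    """Probability of still being in the starting state after `years` years,
    given a constant yearly transition probability out of it."""
    return (1.0 - p_yearly) ** years

# Illustration with the reported yearly R0 -> advanced disease/death
# probabilities (7.6% high-VSR vs. 1.5% low-VSR):
high_vsr_5y = prob_remaining(0.076, 5)  # ~0.67 still event-free at 5 years
low_vsr_5y = prob_remaining(0.015, 5)   # ~0.93 still event-free at 5 years
```

This compounding is what turns a modest-looking yearly difference into a large gap in longer-term event-free probability.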


Subjects
Hepatocellular Carcinoma, Liver Neoplasms, Hepatocellular Carcinoma/surgery, Hepatectomy, Humans, Liver Cirrhosis/complications, Liver Neoplasms/surgery, Neoplasm Local Recurrence/surgery, Prognosis, Retrospective Studies, Tissue Distribution
18.
World J Surg ; 45(6): 1906-1912, 2021 06.
Article in English | MEDLINE | ID: mdl-33721071

ABSTRACT

BACKGROUND: While anti-p53 antibody (p53-Ab) is a potential marker for the early detection of colorectal cancer, its clinical utility in patients with advanced colorectal cancer remains unknown. METHODS: The clinical significance of p53-Ab was investigated by analyzing data from 206 patients who underwent curative resection for colorectal liver metastases. RESULTS: Of the 206 patients, 60 (29%) were seropositive and 146 were seronegative for p53-Ab before surgery. The preoperative serum p53-Ab level showed no significant correlation with the serum CEA or CA19-9 levels. Perioperative changes in serum p53-Ab positivity were significantly correlated with the preoperative serum p53-Ab levels, and multivariate analysis confirmed that a higher preoperative p53-Ab level was independently associated with worse recurrence-free survival (hazard ratio [HR], 1.07 per +100 U/mL; 95% CI, 1.01-1.13; P = 0.033), even after adjustment for other oncological factors, including the preoperative serum CEA level. CONCLUSION: Higher preoperative p53-Ab levels were associated with a higher risk of recurrence after curative resection of colorectal liver metastases.


Subjects
Colorectal Neoplasms, Liver Neoplasms, Tumor Biomarkers, Carcinoembryonic Antigen, Colorectal Neoplasms/surgery, Hepatectomy, Humans, Liver Neoplasms/surgery, Neoplasm Local Recurrence, Prognosis
19.
Langenbecks Arch Surg ; 406(7): 2391-2398, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34196790

ABSTRACT

PURPOSE: The clinical impact of preoperative nutritional status on an aggressive surgical approach for stage IV colorectal cancer (CRC) has not been fully understood. METHODS: The clinical records of 399 patients with stage IV CRC who underwent surgery for the primary tumor were reviewed. The powers of reported nutritional/inflammatory indices for predicting postoperative morbidity were compared, and their correlations with both short- and long-term outcomes were investigated. RESULTS: Among the 10 tested nutritional/inflammatory indices, the Controlling Nutritional Status (CONUT) score showed the highest performance for predicting major morbidity (area under the curve [AUC], 0.605; P = 0.067) and any morbidity (AUC, 0.605; P = 0.001). When the population was stratified into 4 undernutrition grades based on the CONUT score, the CONUT undernutrition grade correlated well with the Clavien-Dindo grade of postoperative morbidity (P < 0.001) and the length of hospital stay (P < 0.001). Multivariate analysis confirmed that the CONUT undernutrition grade was significantly associated with survival outcomes in patients with stage IV CRC (light: hazard ratio [HR], 1.12; 95% CI, 0.80-1.58; moderate: HR, 1.54; 95% CI, 1.02-2.33; severe: HR, 3.61; 95% CI, 1.52-8.62). CONCLUSIONS: Preoperative nutritional status is a useful predictive marker for both the short- and long-term outcomes of surgical intervention for stage IV CRC.
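For reference, the CONUT score combines serum albumin, total lymphocyte count, and total cholesterol into a 0-12 total. The sketch below uses the commonly cited cutoffs and grade bands (an assumption here, not taken from this abstract; confirm against the original CONUT publication before any clinical use):

```python
def conut_score(albumin_g_dl: float, lymphocytes_per_ul: float,
                cholesterol_mg_dl: float) -> int:
    """CONUT total (0-12) using commonly cited cutoffs."""
    if albumin_g_dl >= 3.5:
        alb = 0
    elif albumin_g_dl >= 3.0:
        alb = 2
    elif albumin_g_dl >= 2.5:
        alb = 4
    else:
        alb = 6
    if lymphocytes_per_ul >= 1600:
        lym = 0
    elif lymphocytes_per_ul >= 1200:
        lym = 1
    elif lymphocytes_per_ul >= 800:
        lym = 2
    else:
        lym = 3
    if cholesterol_mg_dl >= 180:
        cho = 0
    elif cholesterol_mg_dl >= 140:
        cho = 1
    elif cholesterol_mg_dl >= 100:
        cho = 2
    else:
        cho = 3
    return alb + lym + cho

def conut_grade(score: int) -> str:
    """Map the total to the 4 undernutrition grades used in the abstract."""
    if score <= 1:
        return "normal"
    if score <= 4:
        return "light"
    if score <= 8:
        return "moderate"
    return "severe"

# Example: albumin 2.8 g/dL, lymphocytes 1000 /uL, cholesterol 120 mg/dL
# -> 4 + 2 + 2 = 8, i.e. "moderate" undernutrition.
```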


Subjects
Colorectal Neoplasms, Malnutrition, Colorectal Neoplasms/surgery, Humans, Malnutrition/diagnosis, Malnutrition/epidemiology, Nutrition Assessment, Nutritional Status, Prognosis, Retrospective Studies
20.
HPB (Oxford) ; 23(6): 907-914, 2021 06.
Article in English | MEDLINE | ID: mdl-33121854

ABSTRACT

BACKGROUND: There has been no solid evidence regarding the actual efficacy of adhesion barriers in liver surgery. METHODS: The difficulty grade of adhesiolysis was evaluated using the TORAD score in 122 patients who underwent repeat hepatectomy (ReHx). The technical difficulty of adhesiolysis and the incidence of complications were then compared between patients who had received a sheet-type adhesion barrier (Seprafilm®) at the previous hepatectomy (n = 70) and those who had not (n = 52), using the inverse probability weighting method. RESULTS: Use of Seprafilm was significantly associated with a lower difficulty grade of adhesiolysis according to the TORAD score (P < 0.001). In the propensity-score-adjusted population, the postoperative morbidity rate was lower and the postoperative stay shorter in the Seprafilm group (37% vs. 74%, P < 0.001, and median 12 days vs. 14 days, P = 0.048). Multivariate analysis confirmed that use of Seprafilm was an independent predictor of adhesion severity (odds ratio [OR] 0.24; 95% CI, 0.09-0.65; P = 0.005) and of a decreased incidence of postoperative morbidity at ReHx (OR, 0.34; 95% CI, 0.14-0.84; P = 0.020). CONCLUSIONS: Use of Seprafilm may be associated with decreased technical difficulty of adhesiolysis and may correlate with a lower risk of postoperative morbidity in patients undergoing ReHx.
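Inverse probability weighting, as used above, reweights each patient by the inverse of the probability of the exposure they actually received, so that the weighted groups resemble a randomized comparison. A minimal NumPy sketch with made-up treatment indicators, outcomes, and propensity scores (not the study's data or model):

```python
import numpy as np

# Toy data: t = adhesion-barrier use (1/0), y = morbidity (1/0),
# ps = propensity score (estimated probability of receiving the barrier).
t = np.array([1, 1, 1, 1, 0, 0, 0, 0])
y = np.array([1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
ps = np.array([0.8, 0.6, 0.7, 0.5, 0.3, 0.2, 0.4, 0.5])

# Weight treated patients by 1/ps and controls by 1/(1 - ps).
w = np.where(t == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# Weighted outcome rates and their difference (an average-treatment-effect
# estimate under the usual no-unmeasured-confounding assumptions).
treated_rate = np.sum(w * y * t) / np.sum(w * t)
control_rate = np.sum(w * y * (1 - t)) / np.sum(w * (1 - t))
ate = treated_rate - control_rate
```

In practice the propensity scores would come from a fitted model (e.g. logistic regression on baseline covariates) rather than being supplied directly.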


Subjects
Sodium Carboxymethylcellulose, Hepatectomy, Hepatectomy/adverse effects, Humans, Hyaluronic Acid, Postoperative Complications/etiology, Postoperative Complications/prevention & control, Tissue Adhesions/prevention & control