Results 1 - 20 of 58
1.
Am J Transplant ; 2024 Jun 10.
Article in English | MEDLINE | ID: mdl-38866110

ABSTRACT

Medical literature highlights differences in liver transplantation (LT) waitlist experiences among ABO blood types. Type AB candidates reportedly have higher LT rates and reduced mortality. Despite guidelines governing liver offers, ABO disparities persist. This study examines LT access discrepancies among blood types, focusing on type AB, and seeks equitable strategies. Using the United Network for Organ Sharing database (2003-2022), 170 276 waitlist candidates were retrospectively analyzed. Dual predictive analyses (LT opportunity and survival studies) evaluated 1-year recipient pool survival, considering waitlist and post-LT survival, alongside anticipated allocation value per recipient, under 6 scenarios. Of the cohort, 97 670 patients (57.2%) underwent LT. Type AB recipients had the highest LT rate (73.7% vs 55.2% for O), shortest median waiting time (90 vs 198 days for A), and lowest waitlist mortality (12.9% vs 23.9% for O), with the lowest median Model for End-Stage Liver Disease-sodium (MELD-Na) score (20 vs 25 for A/O). The LT opportunity study revealed that reallocating type A (or A and O) donors originally offered to AB recipients to A recipients yielded the greatest reduction in disparities in anticipated value per recipient, from 0.19 (before modification) to 0.08. Meanwhile, the survival study showed that ABO-identical LTs reduced disparity the most (3.5% to 2.8%). Sensitivity analysis confirmed these findings were specific to the MELD-Na score < 30 population, indicating that current LT allocation may favor certain blood types. Prioritizing ABO-identical LTs for MELD-Na score < 30 recipients could ensure uniform survival outcomes and mitigate disparities.

2.
Liver Transpl ; 2024 May 13.
Article in English | MEDLINE | ID: mdl-38727618

ABSTRACT

There is no recent update on the clinical course of retransplantation (re-LT) after living donor liver transplantation (LDLT) in the US based on national data. The UNOS database (2002-2023) was used to explore patient characteristics at initial LT, comparing deceased donor liver transplantation (DDLT) and LDLT for graft survival (GS), reasons for graft failure, and GS after re-LT. It also assessed waitlist dropout and re-LT likelihood, categorizing the re-LT cohort as acute or chronic based on time to re-listing (≤1 or >1 month). Of 132,323 DDLT and 5,955 LDLT initial transplants, 3,848 DDLT and 302 LDLT recipients underwent re-LT. Of the 302 re-LT following LDLT, 156 were acute and 146 chronic. Primary nonfunction (PNF) was more common in DDLT, although the difference was not statistically significant (17.4% vs. 14.8% for LDLT; p = 0.52). Vascular complications were significantly more frequent in LDLT (12.5% vs. 8.3% for DDLT; p < 0.01). Acute re-LT showed a larger difference in PNF between DDLT and LDLT (49.7% vs. 32.0%; p < 0.01). Status 1 patients were more common in DDLT (51.3% vs. 34.0% in LDLT; p < 0.01). In the acute cohort, Kaplan-Meier curves indicated superior GS after re-LT for initial LDLT recipients in both the short and the long term (p = 0.02 and p < 0.01, respectively), with no significant difference in the chronic cohort. No significant differences in waitlist dropout were observed, but the initial LDLT group had a higher re-LT likelihood in the acute cohort (sHR 1.40, p < 0.01). A sensitivity analysis focusing on the most recent 10-year cohort revealed trends consistent with the overall findings. Initial LDLT recipients had better GS after re-LT than initial DDLT recipients. Despite a higher severity of illness, the DDLT cohort was less likely to undergo re-LT.
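The re-LT likelihood above is reported as a subdistribution hazard ratio (sHR), which implies a competing-risks framework in which waitlist death or dropout competes with re-LT. The following minimal sketch, with hypothetical column names and event codes (none of them actual UNOS fields), estimates the cumulative incidence of re-LT per initial graft type with an Aalen-Johansen estimator; the regression-based sHR itself would require a dedicated Fine-Gray implementation.

```python
# Hypothetical sketch: cumulative incidence of re-LT with death/dropout as a
# competing event, stratified by initial graft type (LDLT vs. DDLT).
# Column names and event codes are assumptions for illustration.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("relist_cohort.csv")  # assumed file: one row per re-listed patient
# event: 0 = censored, 1 = re-LT (event of interest), 2 = death/dropout (competing)
ajf = AalenJohansenFitter()
for graft_type, grp in df.groupby("initial_graft_type"):  # "LDLT" / "DDLT"
    ajf.fit(grp["days_from_relisting"], grp["event"], event_of_interest=1)
    print(graft_type)
    print(ajf.cumulative_density_.tail(1))  # cumulative incidence of re-LT
```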

3.
Liver Transpl ; 30(4): 376-385, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-37616509

ABSTRACT

With the increasing prevalence of metabolic dysfunction-associated steatotic liver disease, the use of steatotic grafts in liver transplantation (LT) and their impact on postoperative graft survival (GS) need further exploration. Analyzing adult LT recipient data (2002-2022) from the United Network for Organ Sharing database, outcomes of LT using steatotic (≥30% macrosteatosis) and nonsteatotic donor livers, donors after circulatory death, and standard-risk older donors (age 45-50) were compared. GS predictors were evaluated using Kaplan-Meier and Cox regression analyses. Of the 35,345 donor livers, 8.9% (3,155) were steatotic. The initial 30-day postoperative period revealed significant challenges with steatotic livers, which demonstrated inferior GS. However, the GS discrepancy between steatotic and nonsteatotic livers subsided over time (p = 0.10 at 5 y). Long-term GS was comparable or even superior with steatotic livers relative to nonsteatotic livers, conditional on surviving the initial 90 postoperative days (p = 0.90 at 1 y) or 1 year (p = 0.03 at 5 y). In the multivariable Cox regression analysis, a high body surface area (BSA) ratio (≥1.1; HR 1.42, p = 0.02), calculated as donor BSA divided by recipient BSA, a long cold ischemic time (≥6.5 h; HR 1.72, p < 0.01), and recipient medical condition (intensive care unit hospitalization; HR 2.53, p < 0.01) emerged as significant adverse prognostic factors. Among young (<40 y) steatotic donors, a high BSA ratio, diabetes, and intensive care unit hospitalization were significant indicators of a worse prognosis (p < 0.01). Our study emphasizes the 30-day postoperative survival challenge of LT using steatotic livers. However, with careful donor-recipient matching, for example, avoiding steatotic donors with long cold ischemic times and high BSA ratios for recipients in the intensive care unit, immediate GS can be improved, and over the longer term, outcomes comparable to those using nonsteatotic livers, donation-after-circulatory-death livers, or standard-risk older donors can be anticipated. These insights into decision-making criteria for steatotic liver use provide valuable guidance for clinicians.
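The multivariable Cox model described above can be sketched as follows. This is an illustrative outline only: the data frame, file name, and column names are assumptions, not the study's actual UNOS variables; the derived covariates mirror the cutoffs quoted in the abstract (BSA ratio ≥1.1, cold ischemic time ≥6.5 h, ICU status).

```python
# Illustrative Cox proportional hazards model for graft survival among recipients
# of steatotic grafts; column names are assumptions, not actual UNOS fields.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("steatotic_grafts.csv")  # assumed extract of steatotic-graft LTs
df["high_bsa_ratio"] = (df["donor_bsa"] / df["recipient_bsa"] >= 1.1).astype(int)
df["long_cit"] = (df["cold_ischemic_hours"] >= 6.5).astype(int)
df["icu_at_lt"] = (df["medical_condition"] == "ICU").astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["graft_survival_days", "graft_failed", "high_bsa_ratio", "long_cit", "icu_at_lt"]],
    duration_col="graft_survival_days",
    event_col="graft_failed",
)
cph.print_summary()  # hazard ratios analogous to the HRs quoted in the abstract
```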


Subjects
Fatty Liver, Liver Transplantation, Humans, Middle Aged, Liver Transplantation/methods, Prognosis, Fatty Liver/etiology, Liver/metabolism, Tissue Donors, Graft Survival
4.
Liver Transpl ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38625836

ABSTRACT

The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. A total of 7,602 DCD LT cases from the United Network for Organ Sharing database (2003-2022) were reviewed. The impact of older DCD donors on graft survival was assessed using Kaplan-Meier and hazard ratio (HR) analyses. In all, 1,447 LT cases (19.0%) involved older DCD donors. Their use decreased from 2003 to 2014 but resurged after 2015, reaching 21.9% of all LTs in the last 4 years (2019-2022). Initially, 90-day and 1-year graft survival for older DCDs was worse than for younger DCDs, but this difference narrowed over time, with no statistically significant difference after 2015. Similarly, HRs for graft loss in older DCD LT have recently become nonsignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole-to-cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, whereas in the later phases, cold ischemic time (>5.5 h) was a significant predictor. LT outcomes using older DCD donors have become comparable to those using younger DCD donors, with recent HRs for graft loss becoming nonsignificant. The strategic approaches of the recent period could mitigate risks, including managing cold ischemic time (≤5.5 h), reducing asystole-to-cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.
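A minimal sketch of the era-stratified comparison of older versus younger DCD grafts, assuming a hypothetical extract of DCD transplants with illustrative column names (not actual UNOS fields):

```python
# Illustrative era-stratified comparison of graft survival for older (>=50 y)
# vs. younger DCD donors; column names are assumptions for illustration.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("dcd_lt.csv")  # assumed DCD liver transplant extract
recent = df[df["transplant_year"] >= 2015]
older = recent[recent["donor_age"] >= 50]
younger = recent[recent["donor_age"] < 50]

kmf = KaplanMeierFitter()
for label, grp in [("older DCD", older), ("younger DCD", younger)]:
    kmf.fit(grp["graft_survival_days"], grp["graft_failed"], label=label)
    print(label, "1-year graft survival:", kmf.predict(365))

result = logrank_test(
    older["graft_survival_days"], younger["graft_survival_days"],
    older["graft_failed"], younger["graft_failed"],
)
print("log-rank p =", result.p_value)
```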

5.
Clin Transplant ; 38(7): e15379, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38952196

ABSTRACT

BACKGROUND: Introducing new liver transplantation (LT) practices, such as the use of unconventional donors, incurs higher costs, making evaluation of their prognostic justification crucial. This study reexamined the spread pattern of new LT practices and the associated prognosis across the United States. METHODS: The study investigated the spread pattern of new practices using the UNOS database (2014-2023). Practices included LT for hepatitis B/C (HBV/HCV) nonviremic recipients with viremic donors, LT for COVID-19-positive recipients, and LT using onsite machine perfusion (OMP). One-year post-LT patient and graft survival were also evaluated. RESULTS: LTs using HBV/HCV donors were common in the East, while LTs for COVID-19 recipients and those using OMP started predominantly in California, Arizona, Texas, and the Northeast. K-means cluster analysis identified three adoption groups: facilities with rapid, slow, and minimal adoption rates. Rapid adoption occurred mainly in high-volume centers, followed by a gradual increase in middle-volume centers, with little increase in low-volume centers. The current spread patterns did not significantly affect patient survival. Specifically, for LTs with HCV donors or COVID-19 recipients, patient and graft survival in the rapid-adoption group were comparable to the other groups. In LTs involving OMP, the rapid- and slow-adoption groups tended to have better patient survival (p = 0.05) and significantly better graft survival (p = 0.02). Facilities adopting new practices often overlapped across different practices. DISCUSSION: Our analysis revealed three distinct adoption groups across all practices, correlating adoption aggressiveness with center LT volume. Aggressive adoption of new practices did not compromise patient or graft survival, supporting the current strategy. Understanding historical trends could help predict the rise in future LT cases with new practices, aiding resource distribution.
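The K-means step can be illustrated as follows; the input layout (one row per center-year count of new-practice LTs) and column names are assumptions, not the authors' actual pipeline:

```python
# Illustrative K-means clustering of transplant centers into adoption groups
# (rapid / slow / minimal) based on yearly counts of new-practice LTs.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

counts = pd.read_csv("center_year_counts.csv")  # assumed columns: center_id, year, n_new_practice_lt
trajectory = counts.pivot_table(
    index="center_id", columns="year", values="n_new_practice_lt", fill_value=0
)

X = StandardScaler().fit_transform(trajectory)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
trajectory["adoption_group"] = labels
print(trajectory.groupby("adoption_group").mean())  # mean yearly counts per cluster
```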


Subjects
COVID-19, Graft Survival, Liver Transplantation, SARS-CoV-2, Humans, Liver Transplantation/statistics & numerical data, United States/epidemiology, COVID-19/epidemiology, Female, Male, Middle Aged, Tissue and Organ Procurement/statistics & numerical data, Tissue Donors/supply & distribution, Tissue Donors/statistics & numerical data, Adult, Survival Rate, Prognosis, Practice Patterns, Physicians'/statistics & numerical data
6.
Clin Transplant ; 38(4): e15316, 2024 04.
Article in English | MEDLINE | ID: mdl-38607291

ABSTRACT

BACKGROUND: The incidence of graft failure following liver transplantation (LTx) has remained consistent. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine ML's predictive limitations in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and compared with a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies, and performance was assessed using the AUC across validation iterations. RESULTS: In comparing predictive models for 14-, 30-, and 90-day graft survival, LightGBM consistently outperformed the other models, achieving the highest AUCs of .740, .722, and .700 under shuffled CV. However, under rolling CV, accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two out of three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, because medical practices and selection criteria change over time, continuous model evaluation is essential. Future studies should focus on temporal variations and clinical implications, and ensure model transparency for broader medical utility.
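A minimal sketch of the shuffled versus rolling cross-validation comparison with LightGBM, using placeholder features and labels (the real study used UNOS STAR recipient and donor variables); rows are assumed to be ordered by transplant date so that TimeSeriesSplit approximates the rolling scheme:

```python
# Illustrative comparison of shuffled vs. rolling (time-ordered) cross-validation
# for a LightGBM graft-failure classifier; features/labels are placeholders.
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import KFold, TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))      # placeholder recipient/donor features
y = rng.integers(0, 2, size=5000)    # placeholder 90-day graft failure labels
# rows are assumed sorted by transplant date for the rolling scheme

clf = lgb.LGBMClassifier(n_estimators=300, learning_rate=0.05)

shuffled = cross_val_score(clf, X, y, scoring="roc_auc",
                           cv=KFold(n_splits=5, shuffle=True, random_state=0))
rolling = cross_val_score(clf, X, y, scoring="roc_auc",
                          cv=TimeSeriesSplit(n_splits=5))
print("shuffled CV AUC:", shuffled.mean())
print("rolling CV AUC:", rolling.mean())
```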


Subjects
Liver Transplantation, Adult, Humans, Liver Transplantation/adverse effects, Research Design, Algorithms, Bilirubin, Machine Learning
7.
Clin Transplant ; 38(1): e15155, 2024 01.
Article in English | MEDLINE | ID: mdl-37812571

ABSTRACT

BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) because of concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia in brain-dead donors was categorized as high total bilirubin (3.0-5.0 mg/dL) or very high bilirubin (≥5.0 mg/dL). We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes with those of donors after circulatory death (DCD). RESULTS: Of 138,622 donors, 3,452 (2.5%) had high bilirubin and 1,999 (1.4%) had very high bilirubin levels. Utilization rates for the normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month or 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared with DCD donors (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition in the intensive care unit (ICU) and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient medical condition in the ICU, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: Although many grafts from donors with hyperbilirubinemia go unused, the findings suggest that acceptable post-LT outcomes can be achieved with such donors. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcomes.


Subjects
Liver, Tissue and Organ Procurement, Adult, Humans, Prognosis, Tissue Donors, Graft Survival, Hyperbilirubinemia/etiology, Bilirubin, Retrospective Studies
8.
HPB (Oxford) ; 2024 Jun 04.
Article in English | MEDLINE | ID: mdl-38879433

ABSTRACT

BACKGROUND: Cause of death (COD) is a predictor of liver transplant (LT) outcomes independent of donor age, yet it has not been recently reappraised. METHODS: Analyzing the UNOS database (2013-2022), the study explored COD trends and their impact on one-year post-LT graft survival (GS) and hazard ratios (HRs) for graft failure. RESULTS: Of 80,282 brain-dead donors, 55,413 (69.0%) provided grafts for initial LT. Anoxia became the predominant COD in 2015, increasing from 29.0% in 2013 to 45.1% in 2021, with notable increases in drug intoxication. Survival differences between anoxia and cerebrovascular accident (CVA) donors recently became nonsignificant (P = 0.95). Further analysis showed that GS for intracranial hemorrhage/stroke donors, previously worse (P < 0.01), has improved (P = 0.70). HRs for post-1-year graft failure have recently shown reduced significance for CVA (vs. anoxia) and for intracranial hemorrhage/stroke (vs. any other COD). Donors with intracranial hemorrhage/stroke, showing improved survival and HRs, were allocated to recipients with lower MELD-Na scores, in contrast to the trend for drug intoxication CODs. DISCUSSION: CVA, traditionally linked with poorer outcomes, now shows improved GS and HRs (vs. anoxia). This could be due to rising drug intoxication cases and the allocation of donors with drug intoxication to recipients with higher MELD-Na scores, and of donors with CVA to recipients with lower scores. While COD remains crucial in donor selection, proper matching can mitigate differences among CODs.

9.
Liver Transpl ; 29(8): 793-803, 2023 08 01.
Article in English | MEDLINE | ID: mdl-36847140

ABSTRACT

The current liver allocation system may disadvantage younger adult recipients because it does not incorporate the donor-recipient age difference. Given the longer life expectancy of younger recipients, the influence of older donor grafts on their long-term prognosis should be elucidated. This study sought to reveal the long-term prognostic influence of the donor-recipient age difference in young adult recipients. Adult patients who received initial liver transplants from deceased donors between 2002 and 2021 were identified from the UNOS database. Young recipients (patients 45 years old or younger) were categorized into 4 groups: donor age younger than the recipient, 0-9 years older, 10-19 years older, or 20 years older or more. Older recipients were defined as patients 65 years old or older. To examine the influence of the age difference in long-term survivors, conditional graft survival analysis was conducted in both younger and older recipients. Among 91,952 transplant recipients, 15,170 (16.5%) were 45 years old or younger; these comprised 6,114 (40.3%), 3,315 (21.9%), 2,970 (19.6%), and 2,771 (18.3%) patients in groups 1-4, respectively. Group 1 demonstrated the highest probability of survival, followed by groups 2, 3, and 4, in both the actual and the conditional graft survival analyses. In younger recipients who survived at least 5 years post-transplant, inferior long-term survival was observed when the age difference was 10 years or more (86.9% vs. 80.6%, log-rank p < 0.01), whereas there was no difference in older recipients (72.6% vs. 74.2%, log-rank p = 0.89). In younger patients who are not in emergent need of a transplant, preferential allocation of younger donor offers would optimize organ utility by increasing postoperative graft survival time.
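Conditional graft survival, as used above, means survival measured only among grafts that have already survived a landmark period (here, 5 years). A minimal sketch under assumed column names (not UNOS field names):

```python
# Illustrative conditional graft survival: Kaplan-Meier among recipients whose
# grafts already survived 5 years, split by donor-recipient age difference.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("young_recipients.csv")  # assumed extract: recipients aged <= 45 y
survivors = df[df["graft_survival_years"] >= 5].copy()
survivors["years_beyond_5"] = survivors["graft_survival_years"] - 5
survivors["age_gap_10plus"] = survivors["donor_age"] - survivors["recipient_age"] >= 10

kmf = KaplanMeierFitter()
for gap_10plus, grp in survivors.groupby("age_gap_10plus"):
    kmf.fit(grp["years_beyond_5"], grp["graft_failed"],
            label=f"donor >=10 y older: {gap_10plus}")
    print(kmf.survival_function_.tail(1))  # conditional long-term survival
```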


Subjects
Kidney Transplantation, Liver Transplantation, Humans, Young Adult, Aged, Infant, Newborn, Infant, Child, Preschool, Child, Middle Aged, Liver Transplantation/adverse effects, Living Donors, Time Factors, Tissue Donors, Survival Analysis, Graft Survival, Age Factors, Retrospective Studies
10.
Ann Surg Oncol ; 30(5): 2769-2777, 2023 May.
Article in English | MEDLINE | ID: mdl-36719568

ABSTRACT

BACKGROUND: Current success in transplant oncology for select liver tumors, such as hepatocellular carcinoma, has ignited international interest in liver transplantation (LT) as a therapeutic option for nonresectable colorectal liver metastases (CRLM). In the United States, the CRLM LT experience is limited to reports from a handful of centers. This study was designed to summarize donor, recipient, and transplant center characteristics and posttransplant outcomes for the indication of CRLM. METHODS: Adult, primary LT patients listed between December 2017 and March 2022 were identified using the United Network for Organ Sharing database. LT for CRLM was identified from the free-text variables "DIAG_OSTXT," "DGN_OSTXT_TCR," "DGN2_OSTXT_TCR," and "MALIG_TY_OSTXT." RESULTS: During the study period, 64 patients were listed and 46 received LT for CRLM at 15 centers. Of the 46 patients who underwent LT for CRLM, 26 (56.5%) received living donor LT (LDLT) and 20 (43.5%) received deceased donor LT (DDLT). The median laboratory MELD-Na score at the time of listing was statistically similar between the LDLT and DDLT groups (8 vs. 9, P = 0.14), and this persisted at the time of LT (8 vs. 12, P = 0.06). The 1-, 2-, and 3-year disease-free survival rates were 75.1%, 53.7%, and 53.7%; the corresponding overall survival rates were 89.0%, 60.4%, and 60.4%. CONCLUSIONS: This first comprehensive U.S. analysis of LT for CRLM suggests a burgeoning interest in high-volume U.S. transplant centers. Strategies to optimize patient selection are limited by the scarce oncologic history provided in UNOS data, warranting a separate registry to study LT for CRLM.
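A minimal sketch of how listings might be flagged from the free-text UNOS variables named above; the keyword list, file layout, and the final manual-review step are assumptions, not the authors' published selection criteria:

```python
# Illustrative keyword screen of the UNOS free-text fields named in the abstract
# to flag LT candidates listed for colorectal liver metastases (CRLM).
import pandas as pd

TEXT_COLS = ["DIAG_OSTXT", "DGN_OSTXT_TCR", "DGN2_OSTXT_TCR", "MALIG_TY_OSTXT"]
KEYWORDS = ["colorectal", "colon", "rectal"]  # hypothetical search terms

df = pd.read_csv("unos_candidates.csv", dtype=str)
text = df[TEXT_COLS].fillna("").agg(" ".join, axis=1).str.lower()
crlm_mask = text.str.contains("|".join(KEYWORDS), regex=True) & text.str.contains("metasta")
crlm_listings = df[crlm_mask]
print(len(crlm_listings), "candidate listings flagged for manual review")
```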


Subjects
Colorectal Neoplasms, Liver Neoplasms, Liver Transplantation, Adult, Humans, United States, Retrospective Studies, Liver Neoplasms/surgery, Living Donors, Colorectal Neoplasms/surgery, Receptors, Antigen, T-Cell, Treatment Outcome
11.
Clin Transplant ; 37(12): e15127, 2023 12.
Article in English | MEDLINE | ID: mdl-37772621

ABSTRACT

BACKGROUND: Despite advancements in liver transplantation (LT) over the past two decades, liver re-transplantation (re-LT) remains challenging. This study aimed to assess improvements in re-LT outcomes and their contributing factors. METHODS: Data from the United Network for Organ Sharing database (2002-2021) were analyzed, with recipients categorized into four-year intervals. Trends in re-LT characteristics and postoperative outcomes were evaluated. RESULTS: Of 128,462 LT patients, 7,254 received re-LT. Graft survival (GS) for re-LT improved (91.3%, 82.1%, and 70.8% at 30 days, 1 year, and 3 years post-LT in 2018-2021). However, hazard ratios (HRs) for graft loss remained elevated compared with those for marginal donors, including donation after circulatory death (DCD), although the difference in HRs narrowed for long-term GS. Changes in the causes of re-LT included a reduction in hepatitis C recurrence and an increase in graft failure after primary LT involving DCD. Recent trends included a decrease in cold ischemic time (CIT) and an increase in distance from the donor hospital in the re-LT group, whereas the DCD cohort exhibited a smaller increase in distance and a more marked decrease in CIT. The shortest CIT and the highest Model for End-Stage Liver Disease score were recorded in the urgent re-LT group, while the lowest score was recorded in the DCD group. A shorter interval between the previous LT and re-listing was associated with worse outcomes, and the cause of primary graft failure influenced overall survival after re-LT. DISCUSSION: Although short-term re-LT outcomes have improved, challenges persist compared with DCD. Further enhancements are required, with ongoing research focusing on optimizing risk stratification models and allocation systems for better LT outcomes.


Subjects
End Stage Liver Disease, Liver Transplantation, Tissue and Organ Procurement, Humans, End Stage Liver Disease/surgery, Severity of Illness Index, Tissue Donors, Graft Survival, Retrospective Studies
12.
Int J Colorectal Dis ; 38(1): 21, 2023 Jan 21.
Article in English | MEDLINE | ID: mdl-36680603

ABSTRACT

PURPOSE: Abdominal aortic calcification (AAC) is a well-known risk marker for cardiovascular disease. However, its clinical effect in patients undergoing radical surgery for stage II-III colorectal cancer (CRC) is unclear. This study aimed to analyze the associations between AAC and the prognosis of patients with stage II-III CRC. METHODS: To evaluate the effect of AAC on clinical outcomes, prognosis, and metastatic patterns of CRC, we analyzed 362 patients who underwent radical surgery for stage II-III CRC between 2010 and 2018. RESULTS: The high-AAC group had significantly worse overall survival (OS), cancer-specific survival (CSS), and recurrence-free survival (RFS) after propensity score matching to adjust for differences in baseline patient and tumor characteristics. In the multivariate Cox regression analyses, a high AAC was an independent risk factor for poor OS (hazard ratio [HR], 2.38; 95% confidence interval [CI], 1.23-4.59; p = 0.01), poor CSS (HR, 5.22; 95% CI, 1.74-15.6; p < 0.01), and poor RFS (HR, 1.83; 95% CI, 1.19-2.83; p < 0.01). A high AAC was not associated with the risk of lung metastasis or local or peritoneal recurrence, but was associated with the risk of liver metastasis of CRC. CONCLUSION: A high AAC showed a strong relationship with poor OS, CSS, and RFS after curative resection for stage II-III CRC. A high AAC was also associated with the risk of liver metastasis, which may worsen prognosis in stage II-III CRC. AAC could be a new clinical tool for predicting the prognosis of patients with stage II-III CRC.
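The propensity score matching step can be sketched as below. The covariate list, column names, and the simplified 1:1 nearest-neighbor matching (with replacement, no caliper) are illustrative assumptions, not the study's actual matching protocol:

```python
# Illustrative 1:1 nearest-neighbor propensity score matching of high- vs. low-AAC
# patients before comparing survival; covariates and column names are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("crc_stage2_3.csv")
covars = ["age", "sex_male", "stage_iii", "asa_score"]  # hypothetical confounders
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["high_aac"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

treated = df[df["high_aac"] == 1]
control = df[df["high_aac"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])   # matching with replacement, no caliper
matched = pd.concat([treated, control.iloc[idx.ravel()]])
print(matched.groupby("high_aac")[["os_years", "recurrence_free_years"]].median())
```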


Subjects
Colorectal Neoplasms, Liver Neoplasms, Humans, Retrospective Studies, Prognosis, Proportional Hazards Models, Colorectal Neoplasms/pathology, Liver Neoplasms/surgery
13.
BMC Surg ; 23(1): 86, 2023 Apr 11.
Article in English | MEDLINE | ID: mdl-37041491

ABSTRACT

PURPOSE: The rate of postoperative morbidity, including infectious complications, remains high after major hepatobiliary pancreatic (HBP) surgery. Although surgery-related disseminated intravascular coagulation (DIC) occurs in some cases, its significance in HBP surgery has not been elucidated. This study aimed to evaluate the influence of surgery-related DIC on complication severity after HBP surgery. METHODS: We analyzed the records of 100 patients who underwent hepatectomy of two or more segments, hepatectomy with biliary tract reconstruction, or pancreaticoduodenectomy between 2010 and 2018. Baseline characteristics and complications were compared between patients with and without surgery-related DIC on postoperative day 1 (POD1). Complication severity was assessed using the Comprehensive Complication Index (CCI). RESULTS: The DIC group (surgery-related DIC on POD1) had predictive factors such as larger bleeding volume and higher liver enzyme levels. The DIC group exhibited significantly higher rates of surgical site infection, sepsis, prolonged intensive care unit stay, and blood transfusion, as well as a higher CCI. Furthermore, after adjustment for DIC, the odds ratios (ORs) of AST level and operation time for the risk of a high CCI decreased (OR of AST level: 1.25 to 1.19; OR of operation time: 1.30 to 1.23) and lost statistical significance. CONCLUSIONS: Surgery-related DIC on POD1 could partially mediate the effect of AST level and operation time on a higher CCI. Prevention or proper management of surgery-related DIC on POD1 may therefore be an important target for reducing the severity of postoperative complications.
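The mediation argument above rests on comparing odds ratios for a high CCI with and without adjustment for POD1 DIC. A minimal sketch, with an assumed high-CCI cutoff and hypothetical column names:

```python
# Illustrative check of partial mediation: compare odds ratios for a high CCI
# from logistic models with and without adjustment for POD1 DIC.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hbp_surgery.csv")             # assumed 100-patient cohort
df["high_cci"] = (df["cci"] >= 40).astype(int)  # hypothetical high-CCI cutoff

unadjusted = smf.logit("high_cci ~ ast_level + operation_time", data=df).fit(disp=0)
adjusted = smf.logit("high_cci ~ ast_level + operation_time + dic_pod1", data=df).fit(disp=0)

for name, model in [("without DIC", unadjusted), ("with DIC", adjusted)]:
    ors = np.exp(model.params)
    print(name, "OR(AST) =", round(ors["ast_level"], 2),
          "OR(operation time) =", round(ors["operation_time"], 2))
```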


Subjects
Disseminated Intravascular Coagulation, Humans, Disseminated Intravascular Coagulation/complications, Surgical Wound Infection/complications, Hemorrhage, Odds Ratio
14.
Am J Transplant ; 22(10): 2392-2400, 2022 10.
Article in English | MEDLINE | ID: mdl-35670552

ABSTRACT

Single nucleotide polymorphisms (SNPs) in FCGR3A can predict the susceptibility of liver transplant (LT) recipients to bloodstream infections (BSI) and clinical outcomes following living-donor LT (LDLT). Here, we retrospectively analyzed the relationship between adoptive immunotherapy with activated natural killer (NK) cells derived from liver allograft perfusate effluents and BSI following LDLT. Higher BSI incidence and lower survival were observed in LT recipients with FcγRIIIa (158F/F or F/V) (n = 81) who did not receive adoptive immunotherapy (n = 55) than in those who did (n = 26) (BSI frequency, 36.4% vs. 11.5%; p = .033; log-rank p = .047). After matching patient backgrounds using propensity scores, similar results were obtained (BSI ratio, 41.7% vs. 12.5%; p = .049; log-rank p = .039). The predominant BSI pathogens in patients who did and did not receive adoptive immunotherapy were gram-negative rods (n = 3, 100%) and gram-positive cocci (GPC) (n = 15, 65.2%), respectively. The proportion of NK cells administered to patients with BSI was significantly lower than that administered to patients without BSI (number: 80.3 [29.9-239.2] × 10⁶ vs. 37.1 [35.6-50.4] × 10⁶ cells, p = .033; percentage: 14.1% [13.3-17.8] vs. 34.6% [16.5-47], p = .0078). Therefore, adoptive immunotherapy with NK cells was associated with reduced post-transplant GPC-related BSI attributable to the FcγRIIIa SNP in LT recipients.


Subjects
Liver Transplantation, Sepsis, Genetic Predisposition to Disease/etiology, Humans, Immunologic Factors, Immunotherapy, Adoptive/adverse effects, Liver Transplantation/adverse effects, Living Donors, Polymorphism, Single Nucleotide, Retrospective Studies, Risk Factors, Sepsis/etiology
15.
Transpl Int ; 33(12): 1745-1753, 2020 12.
Article in English | MEDLINE | ID: mdl-32970890

ABSTRACT

Abdominal aortic calcification (AAC) has been reported as a poor prognostic factor in liver transplantation; however, donor AAC has not been sufficiently discussed. We analyzed the impact of the donor AAC level on graft function and outcomes following living donor liver transplantation (LDLT). A total of 133 consecutive patients who had undergone LDLT were divided into two groups (non-AAC and AAC groups) according to the donor AAC level on plain computed tomography. The rate of postoperative biliary complications (BC) was significantly higher in the AAC group (N = 17) than in the non-AAC group (N = 116; HR, 2.77; 95% CI, 1.32-5.83; P = 0.0008). The Cox proportional hazards regression model revealed that donor AAC (HR, 4.15; 95% CI, 1.93-8.97; P = 0.0003) and right lobe graft (HR, 2.81; 95% CI, 1.41-5.61; P = 0.003) independently increased the risk of BC, whereas splenectomy (HR, 0.39; 95% CI, 0.16-0.92; P = 0.03) independently decreased the risk of BC after LDLT. Long-term survival was also significantly worse in the AAC group than in the non-AAC group (HR, 2.25; 95% CI, 1.04-4.89; P = 0.04). Donor AAC was an independent prognostic factor for BC in patients undergoing LDLT. Although further investigations are needed to verify our results, the donor AAC level could be a useful tool for identifying the risk of BC and predicting outcomes following LDLT.


Subjects
Liver Transplantation, Graft Survival, Humans, Liver Transplantation/adverse effects, Living Donors, Postoperative Complications, Proportional Hazards Models, Retrospective Studies, Risk Factors, Treatment Outcome
16.
Liver Transpl ; 25(1): 79-87, 2019 01.
Article in English | MEDLINE | ID: mdl-30021054

ABSTRACT

Abdominal aortic calcification (AAC) is known as a risk factor for coronary artery disease, stroke, hyperphosphatemia, chronic inflammation, diabetes, and decreased estimated glomerular filtration rate. However, the clinical implications of incidental AAC findings in liver transplantation (LT) have not been evaluated in terms of posttransplantation survival and complications. Therefore, we analyzed the relationships between the AAC level and outcomes following LT. A total of 156 consecutive patients who underwent LT between January 2007 and December 2014 were divided into 2 groups according to their AAC level (<100 mm³ or ≥100 mm³), as calculated using the Agatston method. Even after propensity matching, survival was significantly longer in the low-AAC group than in the high-AAC group (median survival time, 4.5 versus 3.0 years; P < 0.01). A multivariate analysis identified a high AAC level (hazard ratio, 2.2) and older donor age (hazard ratio, 2.2) as prognostic factors for overall survival. In conclusion, high AAC is an independent unfavorable prognostic factor in LT.
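For reference, the conventional Agatston weighting cited above scores each calcified lesion as its in-plane area multiplied by a density weight derived from peak attenuation (lesions of at least 130 HU only). The sketch below shows that weighting in isolation; it is not the authors' measurement pipeline and does not reproduce the study's 100 mm³ cutoff:

```python
# Illustrative implementation of the conventional Agatston weighting:
# each calcified lesion contributes (area in mm^2) x (density weight from peak HU).
# Lesion segmentation (>=130 HU threshold) is assumed to have been done upstream.
from dataclasses import dataclass

@dataclass
class Lesion:
    area_mm2: float   # in-plane area of the calcified lesion on one slice
    peak_hu: float    # maximum attenuation within the lesion

def density_weight(peak_hu: float) -> int:
    if peak_hu >= 400:
        return 4
    if peak_hu >= 300:
        return 3
    if peak_hu >= 200:
        return 2
    return 1  # lesions are only scored if peak HU >= 130

def agatston_score(lesions: list[Lesion]) -> float:
    return sum(l.area_mm2 * density_weight(l.peak_hu) for l in lesions if l.peak_hu >= 130)

# Example: two hypothetical aortic lesions
print(agatston_score([Lesion(12.0, 250), Lesion(5.5, 430)]))  # 12*2 + 5.5*4 = 46.0
```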


Subjects
Aorta, Abdominal/pathology, End Stage Liver Disease/surgery, Liver Transplantation/adverse effects, Vascular Calcification/epidemiology, Adult, Age Factors, Aged, Aorta, Abdominal/diagnostic imaging, Computed Tomography Angiography, End Stage Liver Disease/complications, End Stage Liver Disease/mortality, Female, Humans, Male, Middle Aged, Preoperative Period, Prognosis, Retrospective Studies, Risk Factors, Survival Analysis, Tissue Donors/statistics & numerical data, Transplant Recipients/statistics & numerical data, Vascular Calcification/complications, Vascular Calcification/diagnosis
19.
Ann Nutr Metab ; 72(2): 112-116, 2018.
Article in English | MEDLINE | ID: mdl-29353284

ABSTRACT

BACKGROUND/AIMS: The need for totally implantable central venous access devices (TICVADs) has increased with the expanding use of chemotherapy and parenteral nutrition. This study aimed to determine the outcomes of TICVAD implantation and use in patients aged ≥85 years. METHODS: Between January 2010 and August 2016, 117 patients underwent TICVAD implantation, and their records were retrospectively reviewed. RESULTS: Participants were divided into 2 groups (plus-85 and sub-85 groups). Fifty-five patients (47.0%) had solid organ cancer alone; 35 patients (29.9%) had cerebrovascular or cranial nerve disease. The average follow-up period was 201 (range, 2-1,620) days. Major complications were identified in 6 (14.6%) plus-85 patients and 11 (14.5%) sub-85 patients (p = 0.9813). Catheter-related infections developed in 3 plus-85 (7.3%) and 4 sub-85 patients (5.3%; p = 0.6549). There were no significant group differences in hematoma, pneumothorax, occlusion, or removal rates. Among plus-85 patients examined just before surgery and one month after surgery, increases in serum albumin and in Onodera's prognostic nutritional index were observed in 48% (14/39) and 41% (12/39), respectively. CONCLUSIONS: The use of TICVADs in the plus-85 group yielded favorable outcomes. The results of this retrospective study support the wider use of TICVADs in patients aged ≥85 years.
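Onodera's prognostic nutritional index (PNI) referenced above is conventionally computed as 10 × serum albumin (g/dL) + 0.005 × total lymphocyte count (/mm³). A minimal sketch with hypothetical values illustrating the before/after comparison:

```python
# Onodera's prognostic nutritional index (PNI) as conventionally defined:
# PNI = 10 x serum albumin (g/dL) + 0.005 x total lymphocyte count (/mm^3).
# The pre/post comparison mirrors the abstract's before-surgery vs. 1-month check.
def onodera_pni(albumin_g_dl: float, lymphocytes_per_mm3: float) -> float:
    return 10.0 * albumin_g_dl + 0.005 * lymphocytes_per_mm3

pre_op = onodera_pni(3.2, 1100)     # hypothetical values: 32.0 + 5.5 = 37.5
one_month = onodera_pni(3.6, 1500)  # hypothetical values: 36.0 + 7.5 = 43.5
print("PNI improved" if one_month > pre_op else "PNI did not improve")
```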


Subjects
Catheter-Related Infections/epidemiology, Catheters, Indwelling, Central Venous Catheters, Aged, Aged, 80 and over, Central Venous Catheterization, Female, Humans, Male, Retrospective Studies