ABSTRACT
PURPOSE: Severe obesity is a barrier to listing for kidney transplantation due to concern for poor outcomes. This study aims to compare bariatric surgery with medical weight loss as a means of achieving weight loss and subsequent listing for renal transplant. We hypothesize that bariatric surgery will induce a greater frequency of listing for transplant within 18 months of study initiation. MATERIALS AND METHODS: We performed a randomized study of metabolic bariatric surgery (MBS) vs medical weight loss (MM) in patients on dialysis with a body mass index (BMI) of 40-55 kg/m2. The primary outcome was suitability for renal transplant within 18 months of initiating treatment. Secondary outcomes included weight loss, mortality, and complications. RESULTS: Twenty patients enrolled; only 9 (5 MBS, 4 MM) received treatment. Treated groups did not differ in age, gender, or race (P ≥ .44). There was no statistically significant difference in the primary endpoint: 2 MBS (40%) and 1 MM (25%) patients were listed for transplant within 18 months (P = 1.00). With additional time, 100% of MBS and 25% of MM patients achieved listing status (P = .048); 100% of MBS and 0% of MM patients received kidney transplants to date (P = .008). Weight, weight loss, and BMI trajectories differed between the groups (P ≤ .002). One death from COVID-19 occurred in the MM group, and 1 MBS patient had a myocardial infarction 3.75 years after baseline evaluation. CONCLUSION: These results suggest MBS is superior to MM in achieving weight loss prior to listing for kidney transplantation. Larger studies are needed to ensure the safety profile is acceptable in patients with ESRD undergoing bariatric surgery.
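With only 5 vs 4 treated patients, a comparison like 2/5 vs 1/4 listed is typically tested with Fisher's exact test rather than chi-square, and the reported P = 1.00 is consistent with that choice. A minimal pure-Python sketch, using a 2x2 layout reconstructed from the abstract's counts (the study's actual test is an assumption):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for the table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def p_table(x):
        # P(first cell = x) under fixed margins (hypergeometric)
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# MBS: 2 listed / 3 not; MM: 1 listed / 3 not (from the abstract)
p = fisher_exact_2x2(2, 3, 1, 3)
print(round(p, 2))  # -> 1.0, matching the reported P = 1.00
```

Here every table with these margins is at most as likely as the observed one, so the two-sided p-value sums to exactly 1 — a common outcome in samples this small.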
ABSTRACT
BACKGROUND: A major change to deceased-donor kidney allocation in the United States, Kidney Allocation System 250 (KAS250), was implemented on March 15, 2021. Evaluating the consequences of this policy on key system performance metrics is essential to determining its success. METHODS: We performed a retrospective analysis of critical performance measures of the kidney transplant system by reviewing all organs procured during a 4-y period in the United States. To mitigate possible effects of the COVID-19 pandemic, Scientific Registry of Transplant Recipients records were stratified into 2 pre- and 2 post-KAS250 eras: (1) 2019; (2) January 1, 2020-March 14, 2021; (3) March 15, 2021-December 31, 2021; and (4) 2022. Between-era differences in rates of key metrics were analyzed using chi-square tests with pairwise z-tests. Multivariable logistic regression and analysis of variance methods were used to evaluate the effects of the policy on rural and urban centers. RESULTS: Over the period examined, among kidneys recovered for transplant, nonuse increased from 19.7% to 26.4% (all between-era P < 0.05) and across all Kidney Donor Profile Index strata. Cold ischemia times increased (P < 0.001); however, the distance between donor and recipient hospitals decreased (P < 0.05). Kidneys from small-metropolitan or nonmetropolitan hospitals were more likely to go unused across all eras (P < 0.05). CONCLUSIONS: Implementation of KAS250 was associated with increased nonuse rates across all Kidney Donor Profile Index strata, increased cold ischemia times, and shorter distances traveled.
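The between-era comparisons rest on chi-square tests with pairwise two-proportion z-tests. A hedged sketch of both, using illustrative counts (1,000 kidneys per era at the reported 19.7% and 26.4% nonuse rates — not the study's actual denominators):

```python
from math import sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        exp = row * col / n          # expected count under independence
        chi2 += (obs - exp) ** 2 / exp
    return chi2

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic (the pairwise post-hoc test)."""
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (x2 / n2 - x1 / n1) / se

# Illustrative counts: 197/1000 nonused pre-era vs 264/1000 post-era
z = two_prop_z(197, 1000, 264, 1000)
chi2 = chi2_2x2(197, 803, 264, 736)
# For a 2x2 table the chi-square statistic equals z**2
print(round(z, 2), round(chi2, 2))  # -> 3.56 12.65
```

The identity chi2 = z² for a 2x2 table is why the pairwise z-tests are the natural follow-up to an omnibus chi-square across several eras.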
Subjects
COVID-19, Kidney Transplantation, Tissue and Organ Procurement, Humans, Kidney Transplantation/statistics & numerical data, Retrospective Studies, Tissue and Organ Procurement/legislation & jurisprudence, Tissue and Organ Procurement/statistics & numerical data, Tissue and Organ Procurement/trends, United States, COVID-19/epidemiology, COVID-19/prevention & control, Tissue Donors/supply & distribution, Female, Male, Registries/statistics & numerical data, Middle Aged, SARS-CoV-2, Adult, Cold Ischemia
ABSTRACT
Importance: A new liver allocation policy was implemented by the United Network for Organ Sharing (UNOS) in February 2020 with the stated intent of improving access to liver transplant (LT). There are growing concerns nationally regarding the implications this new system may have for LT costs, as well as for access to a chance for LT, which have not been captured at a multicenter level. Objective: To characterize LT volume and cost changes across the US and within specific center groups and demographics after the policy implementation. Design, Setting, and Participants: This cross-sectional study collected and reviewed LT volume from multiple centers across the US and cost data with attention to 8 specific center demographics. Two separate 12-month eras were compared, before and after the new UNOS allocation policy: March 4, 2019, to March 4, 2020, and March 5, 2020, to March 5, 2021. Data analysis was performed from May to December 2022. Main Outcomes and Measures: Center volume, changes in cost. Results: A total of 22 of 68 centers responded, comparing 1948 LTs before the policy change and 1837 LTs postpolicy, a 6% volume decrease. Transplants using local donations after brain death decreased 54% (P < .001) while imported donations after brain death increased 133% (P = .003). Imported fly-outs and dry runs increased 163% (median, 19; range, 1-75, vs 50; range, 2-91; P = .009) and 33% (median, 3; range, 0-16, vs 7; range, 0-24; P = .02). Overall hospital costs increased 10.9% to a total of $46,360,176 (P = .94) for participating centers. There was a 77% fly-out cost increase postpolicy ($10,600,234; P = .03). On subanalysis, centers with decreased LT volume postpolicy observed higher overall hospital costs ($41,720,365; P = .048) and, specifically, a 122% cost increase for liver imports ($6,508,480; P = .002). Transplant centers from low-income states showed a significant increase in hospital (12%) and import (94%) costs.
Centers serving populations with larger proportions of racial and ethnic minority candidates, and specifically Black candidates, significantly increased costs by more than 90% for imported livers, fly-outs, and dry runs despite lower LT volume. Similarly, costs increased significantly (>100%) for fly-outs and dry runs in centers from worse-performing health systems. Conclusions and Relevance: Based on this large multicenter effort, and contrary to current assumptions, the new liver distribution system appears to place a disproportionate burden on those segments of the LT community that already experience disparities in health care. The continuous allocation policies being promoted by UNOS may exacerbate these disparities.
Subjects
Liver Transplantation, Tissue and Organ Procurement, Liver Transplantation/economics, Humans, Cross-Sectional Studies, United States, Tissue and Organ Procurement/economics, Tissue and Organ Procurement/legislation & jurisprudence, Health Policy, Male, Female, Waiting Lists
ABSTRACT
The Liver Simulated Allocation Model (LSAM) is used to evaluate proposed organ allocation policies. Although LSAM has been shown to predict the directionality of changes in transplants and nonused organs, the magnitude is often overestimated. One reason is that policymakers and researchers using LSAM assume static levels of organ donation and center behavior because of challenges with predicting future behavior. We sought to assess the ability of LSAM to account for changes in organ donation and organ acceptance behavior using LSAM 2019. We ran 1-year simulations with the default model and then ran simulations changing donor arrival rates (ie, organ donation) and center acceptance behavior. Changing the donor arrival rate was associated with a progressive simulated increase in transplants, with corresponding simulated decreases in waitlist deaths. Changing parameters related to organ acceptance was associated with important changes in transplants, nonused organs, and waitlist deaths in the expected direction in data simulations, although to a much lesser degree than changing the donor arrival rate. Increasing the donor arrival rate was associated with a marked decrease in the travel distance of donor livers in simulations. In conclusion, we demonstrate that LSAM can account for changes in organ donation and organ acceptance in a manner aligned with historical precedent that can inform future policy analyses. As Scientific Registry of Transplant Recipients develops new simulation programs, the importance of considering changes in donation and center practice is critical to accurately estimate the impact of new allocation policies.
ABSTRACT
Importance: Availability of organs inadequately addresses the need of patients waiting for a transplant. Objective: To estimate the true number of donor patients in the United States and identify inefficiencies in the donation process as a way to guide system improvement. Design, Setting, and Participants: A retrospective cross-sectional analysis of organ donation was performed across 13 hospitals in 2 donor service areas covered by 2 organ procurement organizations (OPOs) in 2017 and 2018 to compare donor potential with actual donors. More than 2000 complete medical records for decedents were reviewed as a sample of nearly 9000 deaths. Data were analyzed from January 1, 2017, to December 31, 2018. Exposure: Deaths from causes consistent with donation according to medical record review, ventilated patient referrals, center acceptance practices, and actual deceased donors. Main Outcomes and Measures: Potential donors by medical record review vs actual donors and OPO performance at specific hospitals. Results: Compared with 242 actual donors, 931 potential donors were identified at these hospitals. This suggests a deceased donor potential of 3.85 times (95% CI, 4.23-5.32) the actual number of donors recovered. Conversion of potential donors into actual donors varied widely among the hospitals studied, from 0% to 51.0%. One OPO recovered 18.8% of the potential donors, whereas the second recovered 48.2%. The performance of the OPOs was moderately related to referrals of ventilated patients and not related to center acceptance practices. Conclusions and Relevance: In this cross-sectional study of hospitals served by 2 OPOs, wide variation was found in the performance of the OPOs, especially at individual hospitals. Addressing this opportunity could greatly increase the organ supply, affirming the importance of recent federal efforts to increase OPO accountability and transparency.
Subjects
Organ Transplantation, Tissue and Organ Procurement, Humans, United States, Cross-Sectional Studies, Retrospective Studies, Tissue Donors
ABSTRACT
BACKGROUND: Societal factors that influence wait-listing for transplantation are complex and poorly understood. Social determinants of health (SDOH) affect rates of and outcomes after transplantation. METHODS: This cross-sectional study investigated the impact of SDOH on additions to state-level, 2017-2018 kidney and liver wait-lists. Principal components analysis, starting with 127 variables among 3142 counties, was used to derive novel, comprehensive state-level composites, designated (1) health/economics and (2) community capital/urbanicity. Stepwise multivariate linear regression with backward elimination (n = 51; 50 states and DC) tested the effects of these composites, Medicaid expansion, and center density on adult disease burden-adjusted wait-list additions. RESULTS: SDOH related to increased community capital/urbanicity were independently associated with wait-listing (starting models: B = .40, P = .010 kidney; B = .36, P = .038 liver) (final models: B = .31, P = .027 kidney; B = .34, P = .015 liver). Notably, no other covariates were associated with wait-listing (P ≥ .122). CONCLUSIONS: These results suggest that deficits in community resources are important contributors to disparities in wait-list access. Our composite SDOH metrics may help identify at-risk communities, which can be the focus of local and national policy initiatives to improve access to organ transplantation.
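The composites here come from principal components analysis of standardized county variables. A toy sketch of the idea with just two standardized variables — whose first principal component is their equally weighted sum, with variance 1 + r — illustrates how a composite index is derived (synthetic data, not the study's 127 variables):

```python
from math import sqrt

def standardize(xs):
    """Center to mean 0 and scale to (population) variance 1."""
    n = len(xs)
    mu = sum(xs) / n
    sd = sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return [(x - mu) / sd for x in xs]

def pc1_two_vars(xs, ys):
    """First principal component scores for two standardized variables.

    For a 2-variable correlation matrix [[1, r], [r, 1]] the leading
    eigenvector is (1, 1)/sqrt(2) when r > 0, so PC1 scores are the
    scaled sums and the PC1 variance (leading eigenvalue) is 1 + r."""
    zx, zy = standardize(xs), standardize(ys)
    r = sum(a * b for a, b in zip(zx, zy)) / len(zx)
    scores = [(a + b) / sqrt(2) for a, b in zip(zx, zy)]
    return scores, r

# Synthetic county-level "health" and "economics" variables
health = [3.1, 4.0, 2.2, 5.1, 3.8, 4.4]
econ = [2.8, 3.9, 2.0, 4.8, 3.5, 4.6]
scores, r = pc1_two_vars(health, econ)
var_pc1 = sum(s * s for s in scores) / len(scores)
# PC1 variance equals the leading eigenvalue 1 + r
print(round(var_pc1, 6) == round(1 + r, 6))  # -> True
```

With 127 variables the eigenvector weights are no longer equal, but the principle is the same: the composite is the weighted sum of standardized inputs that captures the most shared variance.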
Subjects
Organ Transplantation, Social Determinants of Health, Adult, United States, Humans, Cross-Sectional Studies, Waiting Lists
ABSTRACT
Cytokines are secreted soluble glycoproteins that regulate cellular growth, proliferation, and differentiation. Suppressor of cytokine signaling (SOCS) proteins negatively regulate cytokine signaling, forming a classical negative feedback loop in these pathways. There are eight members of the SOCS family. All SOCS proteins are composed of a loosely conserved N-terminal domain, a central Src homology 2 (SH2) domain, and a highly conserved SOCS box at the C-terminus. SOCS proteins have been implicated in the regulation of cytokines and growth factors in liver diseases. SOCS1 and SOCS3 are involved in the immune response and inhibit protective interferon signaling in viral hepatitis. Decreased expression of SOCS3 is associated with advanced stage and poor prognosis in patients with hepatocellular carcinoma (HCC), and DNA methylation of SOCS1 and SOCS3 is found in HCC. Liver regeneration after partial hepatectomy (PH) is precisely regulated by stimulatory and inhibitory factors; in particular, SOCS2 and SOCS3 are induced early after PH. Evidence supporting an important role for SOCS signaling during liver regeneration also supports a role for SOCS signaling in HCC. Immuno-oncology drugs are now the first-line therapy for advanced HCC, and SOCS proteins are potential therapeutic targets in HCC through their roles in cell proliferation, cell differentiation, and the immune response. In this literature review, we summarize recent findings on the SOCS family proteins related to HCC and liver diseases.
ABSTRACT
To meet new Centers for Medicare and Medicaid Services (CMS) metrics, organ procurement organizations (OPOs) will benefit from understanding performance across decedent and hospital types. We sought to determine the utility of existing data-reporting structures for this purpose by reviewing Scientific Registry of Transplant Recipients (SRTR) OPO-Specific Reports (OSRs) from 2013 to 2019. OSRs contain both the standardized donation rate ratio (SDRR) metric and OPO-reported numbers of "eligible deaths" and donors by hospital. Donor hospitals were characterized using information from Homeland Infrastructure Foundation-Level Data, Dartmouth Atlas Hospital Service Area data, and the US Census Bureau. Hospital data reported by OPOs showed 51% more eligible-death donors and 140% more noneligible-death donors per 100 inpatient beds in CMS-ranked top- versus bottom-quartile OPOs. Top-quartile OPOs by the CMS metric recovered 78% more donors than those in the bottom quartile but were indistinguishable by SDRR rankings. These differences persisted across hospital sizes, trauma case mix, and area demographics. OPOs with divergent performance were indistinguishable over time by SDRR but showed changes in hospital-level recovery patterns in SRTR data. Contemporaneous recognition of underperformance across hospitals may provide important and actionable data for regulators and OPOs for focused quality-improvement projects.
Subjects
Tissue and Organ Procurement, Transplant Recipients, Aged, Humans, Medicare, Registries, Tissue Donors, United States
ABSTRACT
Recent changes to organ procurement organization (OPO) performance metrics have highlighted the need to identify opportunities to increase organ donation in the United States. Using data from the Organ Procurement and Transplantation Network (OPTN), Scientific Registry of Transplant Recipients (SRTR), and Veterans Health Administration Informatics and Computing Infrastructure Clinical Data Warehouse (VINCI CDW), we sought to describe historical donation performance at Veterans Administration Medical Centers (VAMCs). We found that over the period 2010-2019, only 33 donors were recovered from the 115 VAMCs with donor potential nationwide. VA donors had age-matched organ transplant yields similar to those of non-VA donors. Review of VAMC records showed a total of 8474 decedents with causes of death compatible with donation, of whom 5281 had no infectious or neoplastic comorbidities preclusive to donation. Relative to a single-state comparison of adult non-VA inpatient deaths, VAMC deaths were 20 times less likely to be characterized as an eligible death by SRTR. The rate of conversion of inpatient donation-consistent deaths without preclusive comorbidities to actual donors at VAMCs was 5.9% that of adult inpatients at non-VA hospitals. Overall, these findings suggest significant opportunities for growth in donation at VAMCs.
Subjects
Organ Transplantation, Tissue and Organ Procurement, Veterans, Adult, Humans, Tissue Donors, Transplant Recipients, United States
ABSTRACT
OBJECTIVES: Despite data showing equivalent outcomes between grafts from marginal versus standard criteria deceased liver donors, elevated donor transaminases constitute a frequent reason to decline potential livers. We assessed the effect of donor transaminase levels and other characteristics on graft survival. MATERIALS AND METHODS: We performed a retrospective cohort analysis of adult first deceased donor liver transplant recipients with available transaminase levels registered in the Organ Procurement and Transplantation Network database (2008-2018). We used Cox proportional hazards regression to determine the effects of donor characteristics on graft survival. RESULTS: Of 53,913 liver transplants, 52,158 were allografts from donors with low transaminases (≤500 U/L; group A) and 1755 were from donors with elevated transaminases (>500 U/L; group B). Group A recipients were more likely to be hospitalized (P = .01) or in intensive care (P < .001) or to have mechanical assistance (P < .001), portal vein thrombosis (P = .01), diabetes mellitus (P = .003), or dialysis in the week before liver transplant (P = .004). Multivariable analysis (controlling for recipient characteristics) showed that donor risk factors for graft failure included diabetes mellitus (P < .001), donation after cardiac death (P < .001), total bilirubin >3.5 mg/dL (P < .001), serum creatinine >1.5 mg/dL (P = .01), and cold ischemia time >6 hours (P < .001). Regional organ sharing was associated with a lower risk of graft failure (P = .02). Donor transaminases >500 U/L were not associated with graft failure (relative risk, 1.02; 95% CI, 0.91-1.14; P = .74). CONCLUSIONS: Donor transaminases >500 U/L should not preclude the use of liver grafts. Instead, donor total bilirubin >3.5 mg/dL and serum creatinine >1.5 mg/dL appear to be associated with a higher likelihood of graft failure after liver transplant.
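A reported relative risk and its 95% CI imply the test statistic behind the p-value: ratio CIs are usually built on the log scale, so the standard error is the log-CI width divided by 2 × 1.96. A hedged back-of-the-envelope check of the abstract's RR 1.02 (0.91-1.14), assuming a log-scale Wald CI, reproduces a p-value close to the reported .74:

```python
from math import erf, log, sqrt

def p_from_rr_ci(rr, lo, hi):
    """Two-sided p-value implied by a relative risk and its 95% CI.

    Assumes the CI was constructed on the log scale, so
    se = (log(hi) - log(lo)) / (2 * 1.96)."""
    se = (log(hi) - log(lo)) / (2 * 1.96)
    z = log(rr) / se
    # Two-sided normal tail: 2 * (1 - Phi(|z|)) = 1 - erf(|z| / sqrt(2))
    return 1 - erf(abs(z) / sqrt(2))

p = p_from_rr_ci(1.02, 0.91, 1.14)
print(round(p, 2))  # -> 0.73, close to the reported P = .74
```

The small residual gap (.73 vs .74) is expected from rounding the published RR and CI to two decimals.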
Subjects
Graft Survival, Liver Transplantation, Living Donors, Tissue and Organ Procurement, Transaminases/blood, Bilirubin/blood, Creatinine/blood, Diabetes Mellitus, Humans, Liver Transplantation/adverse effects, Retrospective Studies, Risk Factors
ABSTRACT
BACKGROUND: Trauma patients may present with nonsurvivable injuries yet be candidates for resuscitation aimed at preserving the potential for future organ transplantation. Trauma surgeons face an ethical dilemma in deciding whether, when, and how to resuscitate a patient who will not directly benefit from it. As there are no established guidelines to follow, we aimed to describe resuscitation practices for organ transplantation; we hypothesized that resuscitation practices vary regionally. METHODS: Over a 3-month period, we surveyed trauma surgeons practicing in Level I and II trauma centers within a single state using an instrument measuring resuscitation attitudes and practices for organ preservation. Descriptive statistics were calculated for practice patterns. RESULTS: The survey response rate was 51% (31/60). Many respondents (81%) had experience with resuscitations whose primary goal was to preserve the potential for organ transplantation, and most (90%) said they encountered this dilemma at least monthly. All respondents were willing to intubate; most were willing to start vasopressors (94%) and to transfuse blood (84%; range, 1 unit to >10 units). Of respondents, 29% would resuscitate for ≥24 hours, and 6% would perform a resuscitative thoracotomy. Respect for patients' dying process and future organ quality were the factors most frequently considered very important or important when deciding to stop or forgo resuscitation, followed closely by concerns about excessive resource use. CONCLUSION: Trauma surgeons' regional resuscitation practices vary widely for this patient population, implying a lack of professional consensus regarding the initiation and extent of resuscitation in this setting. These data suggest this is a common clinical challenge that would benefit from further study to determine national variability, areas of equipoise, and features amenable to practice guidelines.
Subjects
Practice Patterns, Physicians'/ethics, Resuscitation/ethics, Tissue Donors/ethics, Transplantation/ethics, Traumatology/ethics, Wounds and Injuries/therapy, Adult, Female, Humans, Male, Middle Aged, Practice Patterns, Physicians'/statistics & numerical data, Resuscitation/methods, Surveys and Questionnaires, Tennessee, Trauma Centers/ethics, Trauma Centers/statistics & numerical data, Traumatology/statistics & numerical data
ABSTRACT
OBJECTIVE: During the COVID-19 pandemic, health systems postponed non-essential medical procedures to accommodate a surge of critically ill patients. The long-term consequences of delaying procedures in response to COVID-19 remain unknown. We developed a high-throughput approach to understand the impact of delaying procedures on patient health outcomes using electronic health record (EHR) data. MATERIALS AND METHODS: We used EHR data from Vanderbilt University Medical Center's (VUMC) Research and Synthetic Derivatives. Elective procedures and non-urgent visits were suspended at VUMC between March 18, 2020, and April 24, 2020. Surgical procedure data from this period were compared to a similar timeframe in 2019. The potential adverse impact of delays in cardiovascular and cancer-related procedures was evaluated using EHR data collected from January 1, 1993, to March 17, 2020. For surgical procedure delay, outcomes included length of hospitalization (days), mortality during hospitalization, and readmission within six months. For screening procedure delay, outcomes included 5-year survival and cancer stage at diagnosis. RESULTS: We identified 416 surgical procedures that were negatively impacted during the COVID-19 pandemic compared to the same timeframe in 2019. Using retrospective data, we found 27 significant associations between procedure delay and adverse patient outcomes. Clinician review indicated that 88.9% of the significant associations were plausible and potentially clinically significant. Analytic pipelines for this study are available online. CONCLUSION: Our approach enables health systems to identify medical procedures affected by the COVID-19 pandemic and evaluate the effects of delay, allowing them to communicate effectively with patients and prioritize rescheduling to minimize adverse patient outcomes.
Subjects
COVID-19/epidemiology, Cardiovascular Diseases/diagnosis, Cardiovascular Diseases/surgery, Neoplasms/diagnosis, Neoplasms/surgery, Pandemics, Time-to-Treatment, Adult, COVID-19/virology, Female, Humans, Male, Middle Aged, Retrospective Studies, SARS-CoV-2/isolation & purification
ABSTRACT
A unique and complex microstructure underlies the diverse functions of the liver. Breakdown of this organization, as occurs in fibrosis and cirrhosis, impairs liver function and leads to disease. The role of integrin β1 was examined in both establishing liver microstructure and recreating it after injury. Embryonic deletion of integrin β1 in the liver disrupts the normal development of hepatocyte polarity, specification of cell-cell junctions, and canalicular formation. This in turn leads to the expression of transforming growth factor β (TGF-β) and widespread fibrosis. Targeted deletion of integrin β1 in adult hepatocytes prevents recreation of normal hepatocyte architecture after liver injury, with resultant fibrosis. In vitro, integrin β1 is essential for canalicular formation and is needed to prevent stellate cell activation by modulating TGF-β. Taken together, these findings identify integrin β1 as a key determinant of liver architecture with a critical role as a regulator of TGF-β secretion. These results suggest that disrupting the hepatocyte-extracellular matrix interaction is sufficient to drive fibrosis.
Subjects
Integrin beta1/metabolism, Liver Regeneration/physiology, Liver/metabolism, Transforming Growth Factor beta/metabolism, Animals, Extracellular Matrix/metabolism, Hepatocytes/metabolism, Liver Cirrhosis/metabolism, Mice, Mice, Transgenic
ABSTRACT
Liver fibrosis is one of the risk factors for hepatocellular carcinoma (HCC) development. Definitive staging of liver fibrosis requires a liver biopsy, an invasive procedure. Noninvasive methods for the diagnosis of liver fibrosis can be divided into morphological tests, such as elastography, and serum biochemical tests. Transient elastography is reported to have excellent performance in the diagnosis of liver fibrosis and has been accepted as a useful tool for the prediction of HCC development and other clinical outcomes. Two-dimensional shear wave elastography is a newer technique that provides a real-time stiffness image. Serum fibrosis markers have been studied based on the mechanisms of fibrogenesis and fibrolysis. In the healthy liver, homeostasis of the extracellular matrix is maintained by matrix metalloproteinases (MMPs) and their specific inhibitors, tissue inhibitors of metalloproteinases (TIMPs). MMPs and TIMPs could be useful serum biomarkers for liver fibrosis and promising candidates for the treatment of liver fibrosis. Further studies are required to establish liver fibrosis-specific markers based on further clinical and molecular research. In this review, we summarize noninvasive fibrosis tests used in current daily clinical practice and the molecular mechanisms of liver fibrosis.
Subjects
Biomarkers/blood, Elasticity Imaging Techniques/methods, Liver Cirrhosis/diagnosis, Antigens, Neoplasm/blood, Computer Systems, Extracellular Matrix Proteins/metabolism, Fibronectins/blood, Hepatitis, Viral, Human/blood, Hepatitis, Viral, Human/complications, Humans, Liver Cirrhosis/blood, Liver Cirrhosis/diagnostic imaging, Liver Cirrhosis/etiology, Magnetic Resonance Imaging/methods, Matrix Metalloproteinases/blood, Matrix Metalloproteinases/classification, Matrix Metalloproteinases/physiology, Membrane Glycoproteins/blood, Substrate Specificity, Tissue Inhibitors of Metalloproteinases/blood, Tissue Inhibitors of Metalloproteinases/physiology, Ultrasonography/methods
ABSTRACT
Recipients of donation after circulatory death (DCD) liver transplants (LTs) historically have an increased risk of graft failure. Antibody induction (AI) with antithymocyte globulin (ATG) or anti-interleukin 2 receptor (anti-IL2R) immunotherapy may decrease the incidence of graft failure by mitigating ischemia/reperfusion injury. A retrospective review of the United Network for Organ Sharing (UNOS) database for LTs between 2002 and 2015 was conducted to determine whether ATG or anti-IL2R AI was associated with graft survival in DCD. A secondary endpoint was postoperative renal function as measured by estimated glomerular filtration rate at 6 and 12 months. Among DCD recipients, ATG (hazard ratio [HR] = 0.71; P = 0.03), but not anti-IL2R (HR = 0.82; P = 0.10), was associated with a decrease in graft failure at 3 years when compared with recipients without AI. ATG (HR = 0.90; P = 0.02) and anti-IL2R (HR = 0.94; P = 0.03) were associated with a decreased risk of graft failure in donation after brain death (DBD) liver recipients at 3 years compared with no AI. When induction regimens were compared between DCD and DBD, only ATG (HR = 1.19; P = 0.19), and not anti-IL2R (HR = 1.49; P < 0.01) or no AI (HR = 1.77; P < 0.01), was associated with similar survival between DCD and DBD. In conclusion, AI therapy with ATG was associated with improved long-term liver allograft survival in DCD compared with no AI. ATG was associated with equivalent graft survival between DCD and DBD, suggesting a beneficial role of immune cell depletion in DCD outcomes.
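Under the proportional-hazards assumption behind these Cox models, a hazard ratio maps directly onto survival curves: S_treated(t) = S_ref(t)^HR. A hedged illustration of what the abstract's ATG effect (HR = 0.71) would mean for an assumed 80% 3-year baseline graft survival (the baseline figure is illustrative, not from the abstract):

```python
from math import exp, log

def survival_under_hr(s_ref, hr):
    """Survival probability implied by a hazard ratio.

    Proportional hazards scale the cumulative hazard by hr, so
    S_treated(t) = exp(hr * log(S_ref(t))) = S_ref(t) ** hr."""
    return exp(hr * log(s_ref))

# Assumed 3-y baseline graft survival of 0.80 for DCD without induction;
# the abstract's ATG effect (HR = 0.71) would then imply:
s_atg = survival_under_hr(0.80, 0.71)
print(round(s_atg, 3))  # -> 0.853
```

So an HR of 0.71 is not a 29% absolute survival gain; at this assumed baseline it translates to roughly a 5-percentage-point improvement in 3-year graft survival.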
Subjects
Liver Transplantation, Tissue and Organ Procurement, Brain Death, Death, Graft Rejection/epidemiology, Graft Rejection/prevention & control, Graft Survival, Humans, Immunosuppression Therapy, Liver Transplantation/adverse effects, Retrospective Studies, Tissue Donors
ABSTRACT
BACKGROUND: Living donor liver transplantation (LDLT) and donation after circulatory death (DCD) can expand the donor pool for cholestatic liver disease (CLD) patients. We sought to compare the outcomes of deceased donor liver transplant (DDLT) vs LDLT in CLD patients. METHODS: We performed a retrospective cohort analysis of adult CLD recipients registered in the OPTN database who received a primary LT between 2002 and 2018. Cox proportional hazards regression models with mixed effects were used to determine the impact of graft type on patient and graft survival. RESULTS: A total of 5999 DDLT recipients (5730 donation after brain death [DBD], 269 DCD) and 912 LDLT recipients were identified. Ten-year patient/graft survival rates were DBD: 73.8%/67.9%; DCD: 74.7%/60.7%; and LDLT: 82.5%/73.9%. Higher rates of biliary complications as a cause of graft failure were seen in DCD (56.8%) than in LDLT (30.5%) or DBD (18.7%) recipients. On multivariable analysis, graft type was not associated with patient mortality, while DCD was independently associated with graft failure (P = .046). CONCLUSION: DBD, DCD, and LDLT were associated with comparable overall patient survival. No difference in the risk of graft failure could be observed between LDLT and DBD. DCD can be an acceptable alternative to DBD, with equivalent patient survival but inferior graft survival, likely related to the high rate of biliary complications.