ABSTRACT
RATIONALE & OBJECTIVE: The US Kidney Allocation System (KAS) prioritizes candidates with a ≤20% estimated posttransplant survival (EPTS) score to receive high-longevity kidneys, defined by a ≤20% Kidney Donor Profile Index (KDPI). The use of EPTS in the KAS deprioritizes candidates with older age, diabetes, and longer dialysis durations. We assessed whether this use also disadvantages racial and ethnic minority candidates, who are younger but more likely to have diabetes and longer durations of kidney failure requiring dialysis. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Adult candidates for and recipients of kidney transplantation represented in the Scientific Registry of Transplant Recipients from January 2015 through December 2020. EXPOSURE: Race and ethnicity. OUTCOME: Age-adjusted assignment of a ≤20% EPTS score, transplantation of a ≤20% KDPI kidney, and posttransplant survival in longevity-matched recipients, by race and ethnicity. ANALYTIC APPROACH: Multivariable logistic regression, Fine-Gray competing-risks survival analysis, and Kaplan-Meier and Cox proportional hazards methods. RESULTS: The cohort included 199,444 candidates (7% Asian, 29% Black, 19% Hispanic or Latino, and 43% White) listed for deceased donor kidney transplantation. Non-White candidates were younger than White candidates but had significantly higher rates of diabetes and longer dialysis durations. Adjusted for age, Asian, Black, and Hispanic or Latino candidates had significantly lower odds of having an EPTS score of ≤20% (odds ratios, 0.86 [95% CI, 0.81-0.91], 0.52 [95% CI, 0.50-0.54], and 0.49 [95% CI, 0.47-0.51], respectively) and were less likely to receive a ≤20% KDPI kidney (subdistribution hazard ratios, 0.70 [0.66-0.75], 0.89 [0.87-0.92], and 0.73 [0.71-0.76]) compared with White candidates.
Among recipients with ≤20% EPTS scores transplanted with a ≤20% KDPI deceased donor kidney, Asian and Hispanic recipients had lower posttransplant mortality (HR, 0.45 [0.27-0.77] and 0.63 [0.47-0.86], respectively), and Black recipients had higher but not statistically significant posttransplant mortality (HR, 1.22 [0.99-1.52]), compared with White recipients. LIMITATIONS: Provider-reported race and ethnicity data and a 5-year posttransplant follow-up period. CONCLUSIONS: The US kidney allocation system is less likely to identify racial and ethnic minority candidates as having a ≤20% EPTS score, which triggers allocation of high-longevity deceased donor kidneys. These findings should inform the Organ Procurement and Transplantation Network about how to remedy the race and ethnicity disparities introduced by the KAS's current approach of allocating allografts with longer predicted longevity to recipients with longer estimated posttransplant survival. PLAIN-LANGUAGE SUMMARY: The US Kidney Allocation System prioritizes giving high-longevity, high-quality kidneys to patients on the waiting list who have a high estimated posttransplant survival (EPTS) score. EPTS is calculated based on the patient's age, whether the patient has diabetes, whether the patient has a history of organ transplantation, and the number of years spent on dialysis. Our analyses show that Asian, Black or African American, and Hispanic or Latino patients were less likely to receive high-longevity kidneys compared with White patients, despite having similar or better posttransplant survival outcomes.
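The age-adjusted odds ratios reported above come from multivariable logistic regression. As a minimal illustration of the underlying quantity only, the sketch below computes an unadjusted odds ratio with a Wald 95% confidence interval from a 2x2 table; the counts are invented, and a real analysis would adjust for age and other covariates as the study did.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = group 1 with outcome, b = group 1 without,
    c = group 2 with outcome, d = group 2 without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts of candidates with/without a <=20% EPTS score
# in two groups (illustrative only, not from the study).
or_, lo, hi = odds_ratio_ci(520, 1480, 900, 1100)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 0.43 (95% CI 0.38-0.49)
```

An OR below 1 here means the first group has lower odds of the outcome, which is how the abstract's 0.86, 0.52, and 0.49 should be read.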
Subjects
Kidney Transplantation; Tissue and Organ Procurement; Humans; Male; Female; Middle Aged; United States/epidemiology; Adult; Cohort Studies; Tissue Donors; Kidney Failure, Chronic/surgery; Kidney Failure, Chronic/ethnology; Kidney Failure, Chronic/mortality; Graft Survival; Aged; Ethnicity; Longevity; Registries; Racial Groups
ABSTRACT
In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares observed with expected mortality, has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (i.e., each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurate evaluation of centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid for comparisons between centers, unlike methods based on indirect standardization, such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes rather than fitted-value-based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.
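The paper's motivating point — that a ratio metric can exaggerate center effects when baseline survival is high — can be made concrete with a toy calculation. All numbers below are hypothetical, not from the paper.

```python
# Toy contrast between the SMR (observed/expected deaths) and a
# survival-probability difference at one year (illustrative numbers).
n = 500                      # transplants at one center
observed_deaths = 20
expected_deaths = 10         # from a risk-adjusted reference model

smr = observed_deaths / expected_deaths          # ratio scale: 2.0
center_survival = 1 - observed_deaths / n        # 0.96
expected_survival = 1 - expected_deaths / n      # 0.98
prob_diff = center_survival - expected_survival  # about -0.02

# The center "doubles" expected mortality (SMR = 2.0), yet its survival
# probability is only ~2 percentage points below the reference.
print(f"SMR = {smr:.1f}, survival difference = {prob_diff:+.2%}")
```

This is the gap the probability-difference metric is meant to close: the same center looks alarming on the ratio scale but clinically modest on the survival scale.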
Subjects
Graft Survival; Kidney Transplantation; Models, Statistical; Kidney Transplantation/mortality; Humans; Prognosis; Survival Analysis; United States; Probability; Computer Simulation
ABSTRACT
BACKGROUND: The Organ Procurement and Transplantation Network (OPTN) Final Rule guides national organ transplantation policies, mandating equitable organ allocation and organ-specific priority stratification systems. Current allocation scores rely on mortality predictions. METHODS: We examined the alignment between the ethical priorities across organ prioritization systems and the statistical design of the risk models in question. We searched PubMed for literature on organ allocation history, policy, and ethics in the United States. RESULTS: We identified 127 relevant articles, covering kidney (19), liver (60), lung (24), and heart transplants (23), and transplant accessibility (1). Current risk scores emphasize model performance and overlook ethical concerns in variable selection. The inclusion of race, sex, and geographical limits as categorical variables lacks a biological basis, thereby blurring the line between evidence-based models and discrimination. Comprehensive ethical and equity evaluation of risk scores is lacking, with only limited discussion of the algorithmic fairness of the Model for End-Stage Liver Disease (MELD) and the Kidney Donor Risk Index (KDRI) in some literature. We uncovered inconsistent ethical standards underlying organ allocation scores in the United States. Specifically, we highlighted the exception points in MELD, the inclusion of race in the KDRI, the geographical limit in the Lung Allocation Score, and the inadequacy of risk stratification in the Heart Tier system, all of which create obstacles for medically underserved populations. CONCLUSIONS: We encourage efforts to address statistical and ethical concerns in organ allocation models and urge standardization and transparency in policy development to ensure fairness, equitability, and evidence-based risk predictions.
Subjects
Algorithms; Tissue and Organ Procurement; Humans; United States; Tissue and Organ Procurement/ethics; Organ Transplantation/ethics; Health Care Rationing/ethics; Resource Allocation/ethics; Tissue Donors/ethics; Risk Assessment
ABSTRACT
BACKGROUND: The 2014 Kidney Allocation System (KAS) introduced longevity matching for adult candidates using the Estimated Post-Transplant Survival (EPTS) score, which includes candidate age, time on dialysis, diabetes status, and number of previous solid organ transplants. The proposed continuous distribution framework may expand the use of this attribute to pediatric candidates, but there are no data on its performance among pediatric kidney transplant recipients. METHODS: We performed a retrospective cohort study of 6800 pediatric kidney transplant recipients from 2001 to 2011 using Organ Procurement and Transplantation Network (OPTN) data. An EPTS score was calculated for each patient and compared to reported patient survival to assess the validity of the score in children. RESULTS: The median age of patients was 14.01 years (IQR 9.29-16.37 years), and the median dialysis vintage was 0.67 years (IQR 0-1.82 years). Of the cohort, 18.2% had a prior transplant and 1% had diabetes. The median EPTS score was 2 (IQR 1-2). Seven percent of patients died during the study period, and 54.7% of the cohort was censored prior to 10 years. The c-statistic was 0.505 (95% CI: 0.49-0.53). CONCLUSION: Overall, EPTS is not a valid predictor of patient survival among pediatric kidney transplant recipients.
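The c-statistic reported above is a concordance index for censored survival data, where 0.5 means chance-level discrimination. A minimal sketch of Harrell's C, on invented toy data, follows; real implementations (e.g., in survival analysis libraries) also handle tied times and other edge cases.

```python
def harrell_c(times, events, scores):
    """Harrell's concordance index for right-censored data.
    times: follow-up times; events: 1 = death observed, 0 = censored;
    scores: predicted risk (higher = higher predicted risk of death).
    A pair is comparable when the subject with the shorter follow-up
    died; it is concordant when that subject also has the higher score."""
    conc = ties = comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    conc += 1
                elif scores[i] == scores[j]:
                    ties += 1
    return (conc + 0.5 * ties) / comparable

# Toy data (illustrative only): higher C means better discrimination;
# a C near 0.5, as the study found for EPTS in children, is chance level.
times  = [2, 5, 7, 9]
events = [1, 1, 0, 1]
scores = [0.9, 0.4, 0.5, 0.1]
print(round(harrell_c(times, events, scores), 2))  # 0.8
```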
ABSTRACT
INTRODUCTION: Survival following lung transplant is low. With limited donor lung availability, predicting post-transplant survival is key. We investigated the predictive value of pre-transplant CT biomarkers for survival. METHODS: In this single-center retrospective cohort study of adults in a diverse, underserved, urban lung transplant program (11/8/2017-5/20/2022), chest CTs were analyzed using TeraRecon to assess musculature, fat, and bone. Erector spinae and pectoralis muscle area and attenuation were analyzed. Sarcopenia thresholds were 34.3 (women) and 38.5 (men) Hounsfield units (HU). Visceral and subcutaneous fat area and HU, and vertebral body HU, were measured. Demographics and pre-transplant metrics were recorded. Survival analyses included Kaplan-Meier and Cox proportional hazards methods. RESULTS: The study cohort comprised 131 patients, including 50 women, with a mean age of 60.82 (SD 10.15) years and mean follow-up of 1.78 (SD 1.23) years. Twenty-nine percent were White. Mortality was 32.1%. Kaplan-Meier curves did not satisfy the proportional hazards assumption for sex, so analyses were stratified by sex. Pre-transplant electronic medical record metrics did not predict survival. Women without sarcopenia at the erector spinae or pectoralis had 100% survival (p = 0.007). Sarcopenia did not predict survival in men, and muscle area did not predict survival in either sex. Men with higher visceral fat area and HU had decreased survival (p = 0.02). Higher vertebral body density predicted improved survival in men (p = 0.026) and women (p = 0.045). CONCLUSION: Pre-transplantation CT biomarkers had predictive value for lung transplant survival that varied by sex. The absence of sarcopenia in women, lower visceral fat attenuation and area in men, and higher vertebral body density in both sexes predicted survival in our diverse, urban population.
Subjects
Lung Transplantation; Sarcopenia; Male; Adult; Humans; Female; Middle Aged; Sarcopenia/diagnostic imaging; Retrospective Studies; Urban Population; Biomarkers; Lung Transplantation/adverse effects; Tomography, X-Ray Computed
ABSTRACT
BACKGROUND: Intra-aortic balloon pumps (IABPs) are used to bridge select end-stage heart disease patients to heart transplant (HT). IABP use and exception requests both increased dramatically after the UNOS policy change (PC). The purpose of this study was to evaluate the effect of the PC and exception status requests on waitlist and post-transplant outcomes in patients bridged to HT with IABP support. METHODS: We analyzed adult, first-time, single-organ HT recipients from the UNOS Registry who were on IABP support either at the time of registration for HT or at the time of HT. We compared waitlist and post-HT outcomes between patients from the PRE (October 18, 2016 to May 30, 2018) and POST (October 18, 2018 to May 30, 2020) eras using Kaplan-Meier curves and time-to-event analyses. RESULTS: A total of 1267 patients underwent HT from IABP (261 pre-policy/1006 post-policy). On multivariable analysis, the PC was associated with an increase in HT (subdistribution hazard ratio [sdHR]: 2.15, p < .001) and a decrease in death/deterioration (sdHR: 0.55, p = .011) on the waitlist, with no effect on 1-year post-HT survival (p = .8). Exception status among patients undergoing HT was seen predominantly in the POST era (29%, 293/1006); only four PRE-era patients had exception status. Exception requests in the POST era did not alter patient outcomes. CONCLUSIONS: In patients bridged to heart transplant with an IABP, the policy change was associated with decreased rates of death/deterioration and increased rates of heart transplantation on the waitlist, without affecting 1-year post-transplant survival. While exception status use has markedly increased post-PC, it is not associated with patient outcomes.
Subjects
Heart Failure; Heart Transplantation; Heart-Assist Devices; Adult; Heart Failure/surgery; Heart-Assist Devices/adverse effects; Humans; Intra-Aortic Balloon Pumping/adverse effects; Policies; Retrospective Studies; Waiting Lists
ABSTRACT
The impact of hyponatremia on waitlist and post-transplant outcomes following the implementation of MELD-Na-based liver allocation remains unclear. We investigated waitlist and post-liver transplant (LT) outcomes in patients with hyponatremia before and after implementation of MELD-Na-based allocation. Adult patients registered for a primary LT between 2009 and 2021 were identified in the OPTN/UNOS database. Two eras were defined: pre-MELD-Na and post-MELD-Na. Extreme hyponatremia was defined as a serum sodium concentration ≤120 mEq/l. Ninety-day waitlist outcomes and post-LT survival were compared using Fine-Gray proportional hazards and mixed-effects Cox proportional hazards models. A total of 118,487 patients were eligible (n = 64,940: pre-MELD-Na; n = 53,547: post-MELD-Na). In the pre-MELD-Na era, extreme hyponatremia at listing was associated with an increased risk of 90-day waitlist mortality ([ref: 135-145] HR: 3.80; 95% CI: 2.97-4.87; P < 0.001) and a higher transplant probability (HR: 1.67; 95% CI: 1.38-2.01; P < 0.001). In the post-MELD-Na era, patients with extreme hyponatremia had a proportionally lower relative risk of waitlist mortality (HR: 2.27; 95% CI 1.60-3.23; P < 0.001) and a proportionally higher transplant probability (HR: 2.12; 95% CI 1.76-2.55; P < 0.001) compared with patients with normal serum sodium levels (135-145 mEq/l). Extreme hyponatremia was associated with worse 90-, 180-, and 365-day post-LT survival compared with patients with normal serum sodium levels. With the introduction of MELD-Na-based allocation, waitlist outcomes have improved in patients with extreme hyponatremia, but these patients continue to have worse short-term post-LT survival.
Subjects
Hyponatremia; Liver Transplantation; Adult; Humans; Hyponatremia/etiology; Risk Factors; Sodium; Waiting Lists
ABSTRACT
Patients with hepatocellular carcinoma (HCC) are at high risk of second primary malignancies. As HCC has become the leading indication for liver transplant (LT), the aim of this study was to investigate whether the presence of HCC before LT could influence the onset of de novo malignancies (DNM). A cohort study was conducted on 2653 LT recipients. Hazard ratios (HRs) of DNM development for patients transplanted for HCC (HCC patients) were compared with those of patients without any previous malignancy (non-HCC patients). All models were adjusted for sex, age, calendar year at transplant, and liver disease etiology. Over 17,903 person-years, 6.6% of HCC patients and 7.4% of non-HCC patients developed a DNM (202 cases). The median time from LT to first DNM diagnosis was shorter for solid tumors in HCC patients (2.7 vs 4.5 years for HCC and non-HCC patients, respectively, P < 0.01). HCC patients were at higher risk of bladder cancer and skin melanoma. There were no differences in cumulative DNM-specific mortality by HCC status. This study suggests that primary HCC could be a risk factor for DNM in LT recipients, allowing for risk stratification and individualized screening.
Subjects
Carcinoma, Hepatocellular; Liver Neoplasms; Liver Transplantation; Carcinoma, Hepatocellular/etiology; Cohort Studies; Humans; Incidence; Liver Neoplasms/etiology; Liver Transplantation/adverse effects; Retrospective Studies; Risk Factors
ABSTRACT
BACKGROUND: The impact of donor quality on post-kidney transplant survival may vary by candidate condition. OBJECTIVE: To analyze the combined use of the Kidney Donor Profile Index (KDPI) and the Estimated Post-Transplant Survival (EPTS) scale and their correlation with the decline in estimated glomerular filtration rate (eGFR) in deceased-donor kidney recipients (DDKRs). METHODS: This was a retrospective, observational cohort study. We included DDKRs between 2015 and 2017 at a national third-level hospital. RESULTS: We analyzed 68 DDKRs. The mean age at transplant was 41 ± 14 years; 47 (69%) had sensitization events, 18 (26%) had delayed graft function, and 16 (23%) had acute rejection. Graft survival at 12 and 36 months was 98.1% (95% CI 94-100) and 83.7% (95% CI 65-100), respectively. The Pearson correlation coefficient between the percentage reduction in annual eGFR and the sum of the EPTS and KDPI scales was r = 0.61, p < 0.001. The correlation coefficients between the percentage reduction in annual eGFR and the EPTS and KDPI scales separately were r = 0.55, p < 0.001, and r = 0.53, p < 0.001, respectively. CONCLUSIONS: The sum of the EPTS and KDPI scales may better characterize the donor-recipient match and shows a moderately positive correlation with the decrease in eGFR in DDKRs.
Subjects
Graft Survival; Kidney Transplantation; Tissue Donors; Adult; Glomerular Filtration Rate; Humans; Kidney; Middle Aged; Retrospective Studies; Survival Analysis; Transplant Recipients
ABSTRACT
Donation after circulatory death (DCD) liver transplantation is associated with higher rates of graft loss. In this paper, we explored whether the Model for Early Allograft Function (MEAF) predicted outcome in DCD liver transplantation. We performed a retrospective analysis of prospectively collected data from all adult DCD (Maastricht 3) livers transplanted in Cambridge and Edinburgh between 1 January 2011 and 30 June 2017, excluding those undergoing any form of machine perfusion. A total of 187 DCD liver transplants were performed during the study period. DCD liver transplants with a lower MEAF score had significantly better survival compared to those with a high MEAF score (Mantel-Cox P < .0001); this was largely due to early graft loss. Beyond 28 days post-transplant, there were no significant long-term graft or patient survival differences irrespective of the grade of MEAF (Mantel-Cox P = .64 and P = .43, respectively). The MEAF score correlated with the length of ICU (P = .0011) and hospital stay (P = .0007) but did not predict the requirement for retransplantation for ischemic cholangiopathy (P = .37) or readmission (P = .74). In this study, a high MEAF score predicted early graft loss, but not the subsequent need for retransplantation or late graft failure as a result of intrahepatic ischemic bile duct pathology.
Subjects
Liver Transplantation; Tissue and Organ Procurement; Adult; Allografts; Graft Survival; Humans; Liver Transplantation/adverse effects; Retrospective Studies; Tissue Donors
ABSTRACT
The kidney allocation system (KAS) aims to improve deceased donor kidney transplant outcomes by matching donor allografts with kidney recipients using the kidney donor risk index (KDRI) and the recipient estimated post-transplant survival (EPTS) index. In this single-center study, the KAS was retroactively applied to 573 adult deceased donor kidney transplants (2004-2012) performed in the extended criteria/standard criteria donor (ECD/SCD) era. Donor KDRI and recipient EPTS were calculated, and transplants were analyzed to identify KAS fits, defined as allocation of top 20% allografts to top 20% recipients and bottom 80% allografts to bottom 80% recipients. On retroactive calculation, 70.2% of all transplants fit the KAS. Transplants that fit the KAS had inferior 1- and 5-yr patient survival (95.5% vs. 98.8%, p = 0.048, and 83.4% vs. 91.7%, p = 0.018) and similar 1- and 5-yr graft survival compared to transplants that did not fit the KAS (91.3% vs. 94.1%, p = 0.276, and 72.7% vs. 73.9%, p = 0.561). While EPTS correlated with recipient survival (HR = 2.96, p < 0.001), KDRI correlated with both recipient (HR = 3.56, p < 0.001) and graft survival (HR = 3.23, p < 0.001). Overall, retroactive application of the KAS to transplants performed in the ECD/SCD era did not identify superior patient survival for kidneys allocated in accordance with the KAS.
Subjects
Graft Survival; Kidney Failure, Chronic/surgery; Kidney Transplantation; Resource Allocation/trends; Tissue and Organ Procurement/trends; Adult; Age Factors; Female; Follow-Up Studies; Glomerular Filtration Rate; Humans; Kidney Function Tests; Male; Middle Aged; Prognosis; Registries; Retrospective Studies; Risk Factors; Survival Rate; Time Factors; Tissue Donors
ABSTRACT
Diabetes and post-transplant survival have been linked. However, the impact of diabetes on post-transplant survival in patients supported on continuous-flow (CF) axial left ventricular assist devices (LVADs) as a bridge to transplant (BTT) has not been widely studied. This study assesses the impact of type II diabetes mellitus (DM type II) as a comorbidity influencing survival patterns in the post-cardiac transplant population supported on LVADs and tests whether the presence of a pre-transplant durable LVAD acts as an independent risk factor for long-term post-transplant survival. The UNOS database from 2004 to 2015 was used to construct the cohorts. A total of 21,032 patients were transplanted during this period. The transplant data were further queried to extract CF axial-flow pump BTT (HMII-BTT) patients and patients who did not have VAD support before the transplant. A total of 4224 transplant recipients had an HMII at the time of transplant, and 13,131 did not have VAD support. Propensity analysis was performed, and 4107 recipients with patient characteristics similar to those in the BTT group were selected for comparison. Patients with a VAD had significantly reduced survival at 2 years post-transplant (p = 0.00514), but this trend did not persist at 5 years (p = 0.0617) or 10 years post-transplant (p = 0.183). Patients with both diabetes and a VAD had significantly decreased survival at 2 years (p = 0.00204), 5 years (p = 0.00029), and 10 years (p = 0.00193). The presence of a durable LVAD is not an independent risk factor for long-term survival. Diabetes has a longstanding effect on the post-transplant survival of BTT patients.
Subjects
Databases, Factual; Heart Transplantation; Heart-Assist Devices; Propensity Score; Humans; Male; Female; Middle Aged; Heart Transplantation/adverse effects; Adult; Risk Factors; Heart Failure/mortality; Heart Failure/surgery; Diabetes Mellitus, Type 2/complications; Diabetes Mellitus, Type 2/mortality; Treatment Outcome; Retrospective Studies; Aged
ABSTRACT
BACKGROUND: The age profiles of organ donors and of patients on lung transplantation (LT) waiting lists have changed over time. In Europe, the donor population has aged much more rapidly than the recipient population, making allocation decisions on lungs from older donors common. In this study we assessed the impact of donor-recipient age discrepancy on LT outcomes in the UK and France. METHODS: A retrospective analysis of all adult single or bilateral LTs in France and the UK between 2010 and 2021. Recipients were stratified into 3 age groups: young (≤30 years), middle-aged (30-60), and older (≥60). Their donors were stratified into 2 groups: <60 and ≥60 years. Primary graft dysfunction (PGD) rates and recipient survival were compared between matched and mismatched donor and recipient age groups. Propensity matching was employed to minimize covariate imbalances and to improve the internal validity of our results. RESULTS: Our study cohort comprised 4,696 lung transplant recipients (LTRs). In young and older LTRs, there was no significant difference in 1- and 5-year post-transplant survival by donor age category. Young LTRs who received older donor grafts had a higher risk of severe grade 3 PGD. CONCLUSION: Our findings show that clinically usable organs from older donors can be utilized safely in LT, even for younger recipients. Further research is needed to assess whether the higher rate of PGD3 associated with the use of older donors affects long-term outcomes.
Subjects
Lung Transplantation; Tissue Donors; Humans; Lung Transplantation/mortality; Middle Aged; Retrospective Studies; Male; Female; Adult; Age Factors; France/epidemiology; Waiting Lists/mortality; Graft Survival; Transplant Recipients/statistics & numerical data; Survival Rate/trends; Europe/epidemiology; United Kingdom/epidemiology; Tissue and Organ Procurement; Primary Graft Dysfunction/epidemiology
ABSTRACT
Acute graft-versus-host disease (aGvHD) remains a major cause of morbidity and mortality after allogeneic hematopoietic stem cell transplantation (HSCT). We performed RNA analysis of 1408 candidate genes in bone marrow samples obtained from 167 patients undergoing HSCT. RNA expression data were used in a machine learning algorithm to predict the presence or absence of aGvHD using either random forest or extreme gradient boosting algorithms. Patients were randomly divided into training (2/3 of patients) and validation (1/3 of patients) sets. Using post-HSCT RNA data, the machine learning algorithm selected 92 genes for predicting aGvHD that appear to play a role in PI3K/AKT, MAPK, and FOXO signaling, as well as microRNA. The algorithm selected 20 genes for predicting survival, including genes involved in MAPK and chemokine signaling. Using pre-HSCT RNA data, the machine learning algorithm selected 400 and 700 genes predicting aGvHD and overall survival, respectively, but candidate signaling pathways could not be specified in this analysis. These data show that NGS analyses of RNA expression using machine learning algorithms may provide useful biomarkers of aGvHD and overall survival for patients undergoing HSCT, allowing for the identification of major signaling pathways associated with HSCT outcomes and helping to dissect the complex steps involved in the development of aGvHD. The analysis of pre-HSCT bone marrow samples may lead to pre-HSCT interventions, including the choice of remission induction regimens and modifications in patient health before HSCT.
ABSTRACT
BACKGROUND: Recently, several centers in the United States have begun performing donation after circulatory death (DCD) heart transplants (HTs) in adults. We sought to characterize the recent use of DCD HT, waitlist time, and outcomes compared to donation after brain death (DBD). METHODS: Using the United Network for Organ Sharing database, 10,402 adult (aged >18 years) HT recipients from January 2019 to June 2022 were identified: 425 (4%) were DCD and 9,977 (96%) were DBD recipients. Posttransplant outcomes in matched and unmatched cohorts and waitlist times were compared between groups. RESULTS: DCD and DBD recipients were of similar age (57 years for both, p = 0.791). DCD recipients were more likely to be White (67% vs 60%, p = 0.002), to be on a left ventricular assist device (LVAD; 40% vs 32%, p < 0.001), and to be listed as status 4 to 6 (60% vs 24%, p < 0.001), but less likely to require inotropes (22% vs 40%, p < 0.001) or preoperative extracorporeal membrane oxygenation (0.9% vs 6%, p < 0.001). DCD donors were younger (29 vs 32 years, p < 0.001) and had less renal dysfunction (15% vs 39%, p < 0.001), diabetes (1.9% vs 3.8%, p = 0.050), or hypertension (9.9% vs 16%, p = 0.001). In matched and unmatched cohorts, early survival was similar (p = 0.22). Adjusted waitlist time was shorter in the DCD group than in the DBD cohort (21 vs 31 days, p < 0.001) and 5-fold shorter (DCD: 22 days vs DBD: 115 days, p < 0.001) for candidates in status 4 to 6, who made up 60% of the DCD cohort. CONCLUSIONS: The community is using DCD mostly for recipients who are expected to have extended waitlist times (e.g., those with durable LVADs or status ≥4). DCD recipients had similar early posttransplant survival and shorter adjusted waitlist times compared with the DBD group. Given this early success, efforts should be made to expand the donor pool using DCD, especially for traditionally disadvantaged recipients on the waitlist.
Subjects
Heart Transplantation; Tissue and Organ Procurement; Adult; Humans; Tissue Donors; Brain Death; Time Factors; Graft Survival; Retrospective Studies; Death
ABSTRACT
BACKGROUND: The long-term survival of liver transplant (LT) recipients is essential for optimizing organ allocation and estimating mortality outcomes. While models like the Model for End-Stage Liver Disease (MELD) predict 90-day mortality on the waiting list, they do not predict post-LT survival accurately. There is a need for predictive models that can forecast post-LT survival beyond the immediate period after transplantation. METHODS: This study introduces new temporal variation features, derived during the waiting list period, for predicting post-LT survival. Cox proportional hazards regression (CoxPH), Random Survival Forest (RSF), and Extreme Gradient Boosting (XGB) models are utilized, along with patient demographics and waiting list duration. Data from 716 LT patients from the University of Minnesota CTSI (2011-2021) are used to develop, evaluate, and compare post-LT survival prediction models. RESULTS: The temporal variation features, particularly when combined with the RSF model, proved most effective in predicting post-LT survival, with a C-index of 0.71 and an IBS of 0.151. This outperformed the predictive capability of the most recent MELD score, which had a C-index of <0.51 in the same cohort. CONCLUSIONS: Incorporating temporal variation features with the RSF model enhances long-term post-LT survival predictions. These insights can assist clinicians and patients in making more informed decisions about organ allocation and understanding the utility of LT, ultimately leading to improved patient outcomes.
ABSTRACT
Background: Alcohol use disorder (AUD) is a significant source of end-stage liver disease and liver failure and an indication for liver transplant (LT). Historically, LT for alcoholic liver disease (ALD) required 6 months of alcohol abstinence. Recently, it has been demonstrated that early LT (<6 months of abstinence) in a strictly selected group of patients provides a survival benefit while keeping relapse to harmful drinking at acceptable levels. This practice has been reflected in the Dallas consensus, but more data are needed to appropriately risk stratify patients with respect to return to harmful alcohol drinking post-transplant. The "6-month rule" has been highly debated, and recent data demonstrate that the duration of pre-transplant sobriety is not related to an increased risk of relapse to alcohol post-transplant. We performed a meta-analysis to compare the rate of alcohol relapse in individuals having standard vs. early LT. Methods: MEDLINE and SCOPUS were searched for randomized controlled trials (RCTs), observational studies, and case-control studies from their inception through June 2022. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2009 checklist guidelines were followed for this meta-analysis. Studies comparing post-transplant outcomes, such as alcohol relapse, in individuals following standard vs. early LT were included. Reviews, case studies, conference abstracts, clinical trials with only an abstract, and studies with inadequate data for extraction were excluded. The data were retrieved, gathered, and examined. A random-effects model was used to generate forest plots. For the analysis, a P-value of <0.05 was considered significant. Results: Thirty-four studies were identified in the initial search. Three studies were included in this systematic review and meta-analysis, incorporating 367 patients. Mean age was 51.7 years. Of the 367 patients, 173 (47%) underwent early LT.
Of the three included studies, one demonstrated a decreased probability of alcohol relapse in patients undergoing early LT, whereas the other two showed the opposite result. All included studies were assessed and had minimal risk of bias. Pooled analysis demonstrated that the difference in alcohol relapse between early vs. standard LT was not significant (odds ratio: 1.24, 95% confidence interval: 0.75-2.06, P = 0.40). Conclusion: Our results show that early LT is not associated with an increased risk of alcohol relapse post-transplant compared with a mandatory 6-month abstinence period. Hence, individuals with ALD should not be categorically rejected for LT merely on the criterion of 6 months of abstinence. Other selection criteria based on need and post-transplant outcomes should be utilized.
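The pooled odds ratio above comes from a random effects model. As an illustration of the mechanics, here is a minimal DerSimonian-Laird random-effects pooling sketch in pure Python; the 2x2 counts in the test data are invented for demonstration and are not the study's data:

```python
import math

def pooled_or_random_effects(studies):
    """DerSimonian-Laird random-effects pooled odds ratio.

    studies: list of 2x2 tables (a, b, c, d), e.g.
      (relapse/early LT, no relapse/early LT,
       relapse/standard LT, no relapse/standard LT)
    Returns (pooled OR, 95% CI lower bound, 95% CI upper bound).
    Requires at least two studies and no zero cells.
    """
    # per-study log odds ratio and within-study variance
    log_ors = [math.log((a * d) / (b * c)) for a, b, c, d in studies]
    variances = [1/a + 1/b + 1/c + 1/d for a, b, c, d in studies]
    w = [1 / v for v in variances]          # fixed-effect inverse-variance weights

    # heterogeneity: Cochran's Q and between-study variance tau^2
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(studies) - 1
    c_term = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c_term)

    # random-effects weights fold tau^2 into each study's variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))
```

When the confidence interval of the pooled OR spans 1.0, as in the 0.75-2.06 interval reported above, the difference between arms is not statistically significant.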
ABSTRACT
Background: Several genetic variants are associated with chronic liver disease. The role of these variants in outcomes after liver transplantation (LT) is uncertain. The aim of this study was to determine if donor genotype at risk-associated variants in PNPLA3 (rs738409 C>G, p.I148M) and HSD17B13 (rs72613567 T>TA; rs80182459, p.A192Lfs∗8) influences post-LT survival. Methods: In this retrospective cohort study, data on 2346 adults who underwent first-time LT between January 1, 1999 and June 30, 2020 and who had donor DNA samples available at five large Transplant Immunology Laboratories in Texas, USA, were obtained from the United Network for Organ Sharing (UNOS). Duplicates, patients with insufficient donor DNA for genotyping, those who were <18 years of age at the time of transplant, those who had had a previous transplant, and those with missing genotype data were excluded. The primary outcomes were patient and graft survival after LT. The association between donor genotype and post-LT survival was examined using the Kaplan-Meier method and multivariable-adjusted Cox proportional hazards models. Findings: Median age of LT recipients was 57 [interquartile range (IQR), 50-62] years; 837 (35.7%) were women; 1362 (58.1%) were White, 713 (30.4%) Hispanic, and 182 (7.8%) Black/African-American. Median follow-up time was 3.95 years. Post-LT survival was not affected by donor PNPLA3 genotype but was significantly reduced among recipients of livers with two HSD17B13 loss-of-function (LoF) variants compared to those receiving livers with no HSD17B13 LoF alleles (unadjusted one-year survival: 82.6% vs 93.9%, P < 0.0001; five-year survival: 73.1% vs 82.9%, P = 0.0017; adjusted hazard ratio [HR], 2.25; 95% CI, 1.61-3.15 after adjustment for recipient age, sex, and self-reported ethnicity).
Excess mortality was restricted to those receiving steroid induction immunosuppression (crude 90-day post-LT mortality, 9.3% [95% CI, 1.9%-16.1%] vs 1.9% [95% CI, 0.9%-2.9%] in recipients of livers with two vs no HSD17B13 LoF alleles, P = 0.0012; age-, sex-, and ethnicity-adjusted HR, 2.85; 95% CI, 1.72-4.71, P < 0.0001). No excess mortality was seen among patients who did not receive steroid induction (90-day mortality 3.1% [95% CI, 0%-7.3%] vs 2% [95% CI, 0.9%-3.1%], P = 0.65; adjusted HR, 1.17; 95% CI, 0.66-2.08, P = 0.60). Interpretation: Donor HSD17B13 genotype adversely affects post-LT survival in patients receiving steroid induction. Additional studies are required to confirm this association. Funding: The National Institutes of Health and American Society of Transplant Surgeons Collaborative Scientist Grant.
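The unadjusted survival comparisons in this abstract rest on the Kaplan-Meier product-limit estimator. As an illustration only (the study used standard statistical software, not this code), here is a minimal sketch of the computation:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times: observed times per patient; events: 1 = event observed, 0 = censored.
    Returns a list of (time, survival probability) steps at event times.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        # deaths at time t, and the number still at risk just before t
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        n_at_risk = sum(1 for ti in times if ti >= t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # multiply conditional survival
            curve.append((t, surv))
    return curve
```

Censored patients contribute to the at-risk denominator until they drop out but never trigger a step, which is how estimates like the one-year 82.6% vs 93.9% comparison remain valid under incomplete follow-up.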
ABSTRACT
Recurrent mutations in TP53, RAS pathway genes, and JAK2 have been shown to be highly prognostic of allogeneic hematopoietic cell transplant (alloHCT) outcomes in myelodysplastic syndromes (MDS). However, a significant proportion of MDS patients have no such mutations. Whole-genome sequencing (WGS) enables the discovery of novel prognostic genetic alterations. We conducted WGS on pre-alloHCT whole-blood samples from 494 MDS patients. To nominate genomic candidates and subgroups associated with overall survival, we ran genome-wide association tests via gene-based, sliding-window, and cluster-based multivariate proportional hazards models. We used a random survival forest (RSF) model with built-in cross-validation to develop a prognostic model from the identified genomic candidates and subgroups together with patient-, disease-, and HCT-related clinical factors. Twelve novel regions and three molecular signatures were identified with significant associations with overall survival. Mutations in two novel genes, CHD1 and DDX11, demonstrated a negative impact on survival in AML/MDS and lymphoid cancer data from The Cancer Genome Atlas (TCGA). From unsupervised clustering of recurrent genomic alterations, a genomic subgroup with TP53/del5q was significantly associated with inferior overall survival, a finding replicated in an independent dataset. From supervised clustering of all genomic variants, additional molecular signatures related to myeloid malignancies were characterized, including Fc-receptor FCGRs, catenin complex CDHs, and the B-cell receptor regulators MTUS2/RFTN1. The RSF model combining genomic candidates and subgroups with clinical variables achieved superior performance compared with models that included only clinical variables.
Subjects
Hematopoietic Stem Cell Transplantation, Myelodysplastic Syndromes, Humans, Genome-Wide Association Study, Myelodysplastic Syndromes/genetics, Mutation, Prognosis, DNA Helicases/genetics, DEAD-box RNA Helicases/genetics

ABSTRACT
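Genome-wide survival association testing of the kind described above ultimately reduces to comparing survival between carriers and non-carriers of a genomic alteration. A common building block for such comparisons is the two-group log-rank test; here is a minimal sketch (illustrative only, not the study's multivariate proportional hazards code):

```python
def logrank_statistic(times1, events1, times2, events2):
    """Two-group log-rank chi-square statistic (1 degree of freedom).

    Compares survival between, e.g., mutation carriers (group 1)
    and non-carriers (group 2). Larger values indicate stronger
    evidence that the survival curves differ (>3.84 ~ P < 0.05).
    """
    pairs = list(zip(times1, events1)) + list(zip(times2, events2))
    event_times = sorted({t for t, e in pairs if e})
    o_minus_e = 0.0   # observed minus expected events in group 1
    var = 0.0
    for t in event_times:
        n1 = sum(1 for ti in times1 if ti >= t)   # at risk, group 1
        n2 = sum(1 for ti in times2 if ti >= t)   # at risk, group 2
        d1 = sum(1 for ti, ei in zip(times1, events1) if ti == t and ei)
        d2 = sum(1 for ti, ei in zip(times2, events2) if ti == t and ei)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        o_minus_e += d1 - d * n1 / n              # hypergeometric expectation
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var
```

In a genome-wide setting, a statistic like this is computed per gene or window, and the resulting P-values must be corrected for the very large number of tests performed.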
OBJECTIVE: Many pediatric Fontan patients require heart transplant, but this cohort is understudied given the difficulty of identifying these patients in national registries. We sought to characterize post-transplant survival in a large cohort of pediatric patients who had undergone the Fontan. METHODS: The United Network for Organ Sharing and Pediatric Health Information System were used to identify Fontan heart transplant recipients aged less than 18 years (n = 241) between 2005 and 2022. Decompensation was defined as the presence of extracorporeal membrane oxygenation, ventilation, hepatic/renal dysfunction, paralytics, or total parenteral nutrition at transplant. RESULTS: Median age at transplant was 9 (interquartile range, 5-12) years. Median waitlist time was 107 (37-229) days. Median volume across 32 centers was 8 (3-11) cases. Approximately half (n = 107, 45%) of recipients had 1A/1 initial listing status. Sixty-four patients (28%) were functionally impaired at transplant, 10 patients (4%) were ventilated, and 18 patients (8%) had ventricular assist device support. Fifty-nine patients (25%) had hepatic dysfunction, and 15 patients (6%) had renal dysfunction. Twenty-one patients (9%) were dependent on total parenteral nutrition. Median postoperative stay was 24 (14-46) days, and in-hospital mortality was 7%. Kaplan-Meier analysis showed 1- and 5-year survivals of 89% (95% CI, 85-94) and 74% (95% CI, 81-86), respectively. Kaplan-Meier analysis of Fontan patients without decompensation (n = 154) at transplant demonstrated 1- and 5-year survivals of 93% (95% CI, 88-97) and 88% (95% CI, 82-94), respectively. In-hospital mortality was higher in decompensated patients (11% vs 4%, P = .023). Multivariable analysis showed that decompensation predicted worse post-transplant survival (hazard ratio, 2.47; 95% CI, 1.16-5.22; P = .018), whereas older age at transplant predicted superior post-transplant survival (hazard ratio, 0.89/year; 95% CI, 0.80-0.98; P = .019).
CONCLUSIONS: Pediatric Fontan post-transplant outcomes are promising, although early mortality remains high. For nondecompensated patients without end-organ disease at transplant (>63% of the cohort), early mortality is largely avoided and post-transplant survival is excellent and similar to that of pediatric transplantation overall.
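The hazard ratios reported above come from multivariable Cox models. As a rough, assumption-laden analogue, a crude person-time hazard ratio can be computed directly from event counts and follow-up; this sketch assumes constant hazards in each group and uses invented follow-up data in the test, not the study's:

```python
def crude_hazard_ratio(times1, events1, times2, events2):
    """Crude (person-time) hazard ratio of group 1 vs group 2.

    times: follow-up duration per patient; events: 1 = death, 0 = censored.
    Assumes a constant hazard in each group, so this is only a rough
    analogue of a Cox model HR (e.g. decompensated vs nondecompensated)
    and carries no covariate adjustment.
    """
    rate1 = sum(events1) / sum(times1)   # events per unit of follow-up
    rate2 = sum(events2) / sum(times2)
    return rate1 / rate2
```

Unlike this crude ratio, a Cox model makes the comparison within risk sets at each event time and can adjust for covariates such as age at transplant, which is why the abstract's HR of 2.47 for decompensation is reported from a multivariable analysis.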