Results 1 - 20 of 53
1.
Am J Transplant ; 24(4): 606-618, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38142955

ABSTRACT

Kidney transplantation from blood type A2/A2B donors to type B recipients (A2→B) has increased dramatically under the current Kidney Allocation System (KAS). Among living donor transplant recipients, A2-incompatible transplants are associated with an increased risk of all-cause and death-censored graft failure. In light of this, we used data from the Scientific Registry of Transplant Recipients from December 2014 through June 2022 to evaluate the association between A2→B listing and time to deceased donor kidney transplantation (DDKT) and post-DDKT outcomes for A2→B recipients. Among 53 409 type B waitlist registrants, only 12.6% were listed as eligible to accept A2→B offers ("A2-eligible"). The rates of DDKT at 1, 3, and 5 years were 32.1%, 61.4%, and 72.1% among A2-eligible candidates and 14.1%, 29.9%, and 44.1% among A2-ineligible candidates, with the former experiencing a 133% higher rate of DDKT (Cox weighted hazard ratio [wHR] = 2.33, 95% CI: 2.19-2.47; P < .001). The 7-year adjusted mortality was comparable between A2→B and B-ABOc (type B/O donors to B recipients) recipients (wHR = 0.94, 95% CI: 0.78-1.13; P = .5). Moreover, there was no difference between A2→B and B-ABOc DDKT recipients with regard to death-censored graft failure (wHR = 1.00, 95% CI: 0.77-1.29; P > .9) or all-cause graft loss (wHR = 0.96, 95% CI: 0.82-1.12; P = .6). Following its broader adoption under the current KAS, A2→B DDKT appears to be a safe and effective transplant modality for eligible candidates. As such, A2→B listing should be expanded for eligible type B candidates.
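As a worked illustration of the weighted Cox model behind the reported wHRs, the sketch below fits a hazard ratio for time to DDKT by A2-eligibility. It is a minimal sketch, not the authors' code: the DataFrame, its column names (years_to_ddkt, got_ddkt, a2_eligible), and the balancing weights w are hypothetical stand-ins for the SRTR analysis file.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical candidate-level data; 'w' plays the role of the
# analysis weights that make the wHR a *weighted* hazard ratio.
df = pd.DataFrame({
    "years_to_ddkt": [0.8, 2.5, 4.1, 1.2, 3.3, 5.0],
    "got_ddkt":      [1,   1,   0,   1,   0,   1],
    "a2_eligible":   [1,   0,   0,   1,   0,   1],
    "w":             [1.1, 0.9, 1.0, 1.2, 0.8, 1.0],
})

cph = CoxPHFitter()
# robust=True requests sandwich standard errors, which are
# appropriate when observations carry non-integer weights
cph.fit(df, duration_col="years_to_ddkt", event_col="got_ddkt",
        weights_col="w", robust=True)
cph.print_summary()  # exp(coef) for a2_eligible is the wHR (2.33 in the paper)
```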


Subject(s)
Kidney Transplantation , Humans , Kidney Transplantation/adverse effects , Tissue Donors , Living Donors , Transplant Recipients , Registries , Kidney , Graft Survival
2.
Am J Transplant ; 23(12): 1980-1989, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37748554

ABSTRACT

Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients, who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age differed for ILDKT recipients compared to 17 542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, older recipients had higher mortality (hazard ratio: 2.07, 95% CI: 1.63-2.65; P < .001), lower DCGF (hazard ratio: 0.53, 95% CI: 0.36-0.77; P = .001) and AR (odds ratio: 0.54, 95% CI: 0.39-0.74; P < .001), and similar DGF (odds ratio: 1.03, 95% CI: 0.46-2.33; P = .9) and LOS (incidence rate ratio: 0.98, 95% CI: 0.88-1.10; P = .8) compared to younger recipients. The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.
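The age-by-transplant-type comparison rests on a single interaction term, and the sketch below shows one way to code that test with lifelines. It is an assumption-laden toy, not the study's model: the columns (years, died, older, ildkt) and the tiny dataset are invented purely to make the interaction syntax concrete.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented recipient-level rows: older = age >= 60, ildkt = incompatible LDKT
df = pd.DataFrame({
    "years": [1.0, 2.0, 3.5, 0.5, 4.0, 2.5, 1.5, 3.0, 5.0, 0.8, 2.2, 4.5],
    "died":  [1,   1,   1,   1,   0,   1,   0,   1,   0,   1,   0,   0],
    "older": [1,   1,   0,   1,   0,   0,   1,   0,   1,   0,   1,   0],
    "ildkt": [1,   0,   1,   1,   0,   0,   0,   1,   1,   0,   1,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died",
        formula="older + ildkt + older:ildkt")
# The p-value on 'older:ildkt' answers the paper's question: does the
# effect of age on mortality differ between ILDKT and CLDKT recipients?
cph.print_summary()
```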


Subject(s)
Kidney Transplantation , Humans , Aged , Middle Aged , Adolescent , Young Adult , Adult , Kidney Transplantation/adverse effects , Living Donors , Graft Survival , Graft Rejection/etiology , HLA Antigens , Risk Factors
3.
Liver Transpl ; 29(3): 268-278, 2023 03 01.
Article in English | MEDLINE | ID: mdl-36651194

ABSTRACT

Steatotic livers represent a potentially underutilized resource to increase the donor graft pool; however, one barrier to increased utilization of such grafts is heterogeneity in the definition and measurement of macrovesicular steatosis (MaS). Digital imaging software (DIS) may better standardize definitions for studying posttransplant outcomes. Using HALO, a DIS, we analyzed 63 liver biopsies from 3 transplant centers, transplanted between 2016 and 2018, and compared macrovesicular steatosis percentage (%MaS) as estimated by the transplant center, the donor hospital, and DIS. We also quantified the relationship between DIS characteristics and posttransplant outcomes using log-linear regression for peak aspartate aminotransferase, peak alanine aminotransferase, and total bilirubin on postoperative day 7, as well as logistic regression for early allograft dysfunction. Transplant centers and donor hospitals overestimated %MaS compared with DIS, with better agreement at lower %MaS and less agreement at higher %MaS. No DIS-analyzed liver biopsy was calculated to be >20% MaS; however, 40% of liver biopsies read by transplant center pathologists were read as >30%. %MaS read by HALO was positively associated with peak aspartate aminotransferase (regression coefficient = 1.08, 95% CI: 1.04-1.12; P < .001), peak alanine aminotransferase (regression coefficient = 1.08, 95% CI: 1.04-1.12; P < .001), and early allograft dysfunction (OR = 1.40, 95% CI: 1.10-1.78; P = .006). There was no association between HALO %MaS and total bilirubin on postoperative day 7 (regression coefficient = 1.01, 95% CI: 0.99-1.04; P = .3). DIS provides reproducible quantification of steatosis that could standardize MaS definitions and identify phenotypes associated with good clinical outcomes, thereby increasing the utilization of steatotic livers.
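Because the reported regression coefficients (e.g., 1.08 per 1% MaS) are multiplicative, the underlying model is linear on the log of the outcome. Below is a minimal sketch of that log-linear fit, with invented biopsy-level values for peak AST and HALO-read %MaS.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented data: one row per graft, outcome in U/L, exposure in %MaS
df = pd.DataFrame({
    "peak_ast": [180, 240, 310, 150, 420, 275],
    "mas_pct":  [2.0, 5.5, 9.0, 1.0, 14.0, 7.5],
})

fit = smf.ols("np.log(peak_ast) ~ mas_pct", data=df).fit()
# Exponentiating the slope gives the multiplicative change in peak AST
# per 1-point increase in %MaS (reported as 1.08 in the abstract)
print(np.exp(fit.params["mas_pct"]))
```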


Subject(s)
Fatty Liver , Image Processing, Computer-Assisted , Liver Transplantation , Humans , Alanine Transaminase , Aspartate Aminotransferases , Bilirubin , Biopsy , Fatty Liver/diagnostic imaging , Fatty Liver/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Transplantation/methods , Software , Image Processing, Computer-Assisted/methods
4.
Radiology ; 306(3): e212403, 2023 03.
Article in English | MEDLINE | ID: mdl-36283115

ABSTRACT

Background Pre-liver transplant (LT) sarcopenia is associated with poor survival. Methods exist for measuring body composition with use of CT scans; however, it is unclear which components best predict post-LT outcomes. Purpose To quantify the association between abdominal CT-based body composition measurements and post-LT mortality in a large North American cohort. Materials and Methods This was a retrospective cohort of adult first-time deceased-donor LT recipients from 2009 to 2018 who underwent pre-LT abdominal CT scans, including at the L3 vertebral level, at Johns Hopkins Hospital. Measurements included sarcopenia (skeletal muscle index [SMI] <50 in men and <39 in women), sarcopenic obesity, myosteatosis (skeletal muscle CT attenuation <41 mean HU for body mass index [BMI] <25 and <33 mean HU for BMI ≥25), visceral adipose tissue (VAT), subcutaneous adipose tissue (SAT), and VAT/SAT ratio. Covariates in the adjusted models were selected with use of least absolute shrinkage and selection operator (LASSO) regression, with lambda chosen by means of 10-fold cross-validation. Cox proportional hazards models were used to quantify associations with post-LT mortality, and model discrimination was quantified using the Harrell C-statistic. Results A total of 454 recipients (median age, 57 years [IQR, 50-62 years]; 294 men) were evaluated. In the adjusted model, pre-LT sarcopenia was associated with higher post-LT mortality (HR, 1.6 [95% CI: 1.1, 2.4]; C-statistic, 0.64; P = .02). SMI was significantly negatively associated with survival after adjustment for covariates. There was no evidence that myosteatosis was associated with mortality (HR, 1.3 [95% CI: 0.86, 2.1]; C-statistic, 0.64; P = .21). There was no evidence that BMI (HR, 1.2 [95% CI: 0.95, 1.4]), VAT (HR, 1.0 [95% CI: 0.98, 1.1]), SAT (HR, 1.0 [95% CI: 0.97, 1.0]), or VAT/SAT ratio (HR, 1.1 [95% CI: 0.90, 1.4]) was associated with mortality (P = .15-.77). Conclusions Sarcopenia, as assessed on routine pre-LT abdominal CT scans, was the only factor significantly associated with post-LT mortality.
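The covariate-selection step described above (LASSO with lambda chosen by 10-fold cross-validation) can be sketched as follows. This is a generic illustration on simulated data, not the study's pipeline; in the paper the selected covariates then enter a Cox model rather than the linear fit used to drive selection here.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Simulated stand-in for the candidate covariate matrix and outcome
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = 0.5 * X[:, 0] - 0.3 * X[:, 3] + rng.normal(size=200)

# cv=10 reproduces the 10-fold cross-validated choice of lambda (alpha)
lasso = LassoCV(cv=10).fit(StandardScaler().fit_transform(X), y)
selected = np.flatnonzero(lasso.coef_ != 0)
print("covariates kept for the adjusted model:", selected)
```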


Subject(s)
Liver Transplantation , Sarcopenia , Adult , Male , Humans , Female , Middle Aged , Sarcopenia/complications , Sarcopenia/diagnostic imaging , Retrospective Studies , Living Donors , Body Composition , Muscle, Skeletal , Tomography, X-Ray Computed/methods
6.
Transplant Direct ; 8(11): e1388, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36284928

ABSTRACT

ABO type B and O kidney transplant candidates have increased difficulty identifying a compatible donor for living donor kidney transplantation (LDKT) and are harder to match in kidney paired donation registries. A2-incompatible (A2i) LDKT increases access to LDKT for these patients. To better inform living donor selection, we evaluated the association between A2i LDKT and patient and graft survival. Methods: We used weighted Cox regression to compare mortality, death-censored graft failure, and all-cause graft loss in A2i versus ABO-compatible (ABOc) recipients. Results: Using Scientific Registry of Transplant Recipients data from 2000 to 2019, we identified 345 A2i LDKT recipients. Mortality was comparable between A2i and ABOc recipients; weighted 1-/5-/10-y mortality was 0.9%/6.5%/24.2% among A2i LDKT recipients versus 1.4%/7.7%/22.2% among ABOc LDKT recipients (weighted hazard ratio [wHR] = 1.04, 95% CI: 0.81-1.33; P = 0.8). However, A2i recipients faced a higher risk of death-censored graft failure; weighted 1-/5-/10-y graft failure was 5.7%/11.6%/22.4% for A2i versus 1.7%/7.5%/17.2% for ABOc recipients (wHR in year 1 = 3.56, 95% CI: 2.24-5.66; through year 5 = 1.78, 95% CI: 1.25-2.53; through year 10 = 1.55, 95% CI: 1.15-2.07). By comparison, the 1-/5-/10-y wHRs for A1-incompatible recipients were 1.96 (95% CI: 0.63-6.08), 0.94 (95% CI: 0.39-2.27), and 0.83 (95% CI: 0.39-1.74). Conclusions: A2i LDKT is generally safe, but A2i donor/recipient pairs should be counseled about the increased risk of graft failure and be monitored as closely as their A1-incompatible counterparts posttransplant.
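One simple way to obtain epoch-specific wHRs like those quoted above (year 1, through year 5, through year 10) is to administratively censor follow-up at each horizon and refit the weighted Cox model. The sketch below assumes hypothetical columns years, failed, a2i, and weights w; it illustrates the idea and is not the authors' code.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({  # invented recipient-level rows
    "years":  [0.4, 0.7, 6.0, 0.9, 8.5, 3.2, 1.1, 9.0],
    "failed": [1,   1,   0,   1,   1,   0,   1,   0],
    "a2i":    [1,   0,   0,   1,   0,   1,   1,   0],
    "w":      [1.0, 1.1, 0.9, 1.2, 1.0, 0.8, 1.0, 1.1],
})

def whr_through(data, horizon):
    """Weighted HR for graft failure with follow-up truncated at `horizon`."""
    d = data.copy()
    d["failed"] = ((d["failed"] == 1) & (d["years"] <= horizon)).astype(int)
    d["years"] = d["years"].clip(upper=horizon)  # administrative censoring
    cph = CoxPHFitter()
    cph.fit(d, duration_col="years", event_col="failed",
            weights_col="w", robust=True)
    return cph.hazard_ratios_["a2i"]

print([round(whr_through(df, h), 2) for h in (1, 5, 10)])
```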

7.
PNAS Nexus ; 1(3): pgac124, 2022 Jul.
Article in English | MEDLINE | ID: mdl-36003074

ABSTRACT

Human leukocyte antigen class I (HLA-I) molecules bind and present peptides at the cell surface to facilitate the induction of appropriate CD8+ T cell-mediated immune responses to pathogen- and self-derived proteins. The HLA-I peptide-binding cleft contains dominant anchor sites in the B and F pockets that interact primarily with amino acids at peptide position 2 and the C-terminus, respectively. Nonpocket peptide-HLA interactions also contribute to peptide binding and stability, but these secondary interactions are thought to be unique to individual HLA allotypes or to specific peptide antigens. Here, we show that two positively charged residues located near the top of the peptide-binding cleft facilitate interactions with negatively charged residues at position 4 of presented peptides, which occur at elevated frequencies across most HLA-I allotypes. Loss of these interactions was shown to impair HLA-I/peptide binding and complex stability, as demonstrated by both in vitro and in silico experiments. Furthermore, mutation of these arginine-65 (R65) and/or lysine-66 (K66) residues in HLA-A*02:01 and A*24:02 significantly reduced HLA-I cell surface expression while also reducing the diversity of the presented peptide repertoire by up to 5-fold. The impact of the R65 mutation demonstrates that nonpocket HLA-I/peptide interactions can constitute anchor motifs that exert an unexpectedly broad influence on HLA-I-mediated antigen presentation. These findings provide fundamental insights into peptide antigen binding that could broadly inform epitope discovery in the context of viral vaccine development and cancer immunotherapy.
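The paper's central observation, enrichment of negatively charged residues at peptide position 4, is easy to quantify on any set of eluted ligands. The toy below counts aspartate/glutamate at P4 in a short, made-up peptide list; a real analysis would use immunopeptidomics data for a given allotype.

```python
# Illustrative 9-mer list only; not real eluted-ligand data
peptides = ["SLYDTVATL", "GILGFVFTL", "KLNEPVLLL", "NLVPMVATV", "GLMDKVRTV"]

acidic = {"D", "E"}  # negatively charged residues
p4_hits = sum(p[3] in acidic for p in peptides if len(p) >= 4)
print(f"P4 acidic frequency: {p4_hits / len(peptides):.2f}")
```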

9.
Liver Transpl ; 28(4): 571-580, 2022 04.
Article in English | MEDLINE | ID: mdl-34559954

ABSTRACT

Despite a documented survival benefit, older liver donor (OLD, age ≥70 years) graft offers are frequently declined, with utilization worsening over the last decade. To understand how offer acceptance varies by center, we studied 1113 eventually transplanted OLD grafts from 2009 to 2017 using Scientific Registry of Transplant Recipients (SRTR) data and random-intercept multilevel logistic regression. To understand how center-level acceptance of OLD graft offers might be associated with waitlist and posttransplant outcomes, we studied all adult, actively listed, liver-only candidates and recipients during the study period using Poisson regression (transplant rate), competing risks regression (waitlist mortality), and Cox regression (posttransplant mortality). Among 117 centers, OLD offer acceptance ranged from 0 acceptances (23 centers) to 95, with a median odds ratio of 2.88. Thus, a candidate may be nearly 3 times as likely to receive an OLD graft simply by listing at a different center. Centers in the highest quartile (Q4) of OLD acceptance (which accepted 39% of OLD offers) accepted more nationally shared organs (Q4 versus Q1: 14.1% versus 0.0%, P < 0.001) and had higher annual liver transplant volume (Q4 versus Q1: 80 versus 21, P < 0.001). After adjustment, nationally shared OLD offers (adjusted odds ratio [aOR]: 0.16, 95% CI: 0.13-0.20) and offers to centers with higher median Model for End-Stage Liver Disease (MELD) score at transplant (aOR: 0.74, 95% CI: 0.62-0.87) were less likely to be accepted, whereas OLD offers to centers with higher annual transplant volume were more likely to be accepted (aOR: 1.21, 95% CI: 1.14-1.30). Additionally, candidates listed at centers in the highest quartile of OLD graft offer acceptance had higher deceased donor liver transplantation (DDLT) rates (adjusted incidence rate ratio: 1.45, 95% CI: 1.41-1.50), lower waitlist mortality (adjusted subhazard ratio: 0.76, 95% CI: 0.72-0.76), and similar posttransplant survival (adjusted hazard ratio: 0.93, 95% CI: 0.86-1.01) compared with those listed at centers in the lowest quartile. The wide variation in OLD offer acceptance supports the need to optimize the organ offer process and efficiently direct OLD offers to centers more likely to use them.
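The median odds ratio quoted above has a closed form under the random-intercept logistic model: MOR = exp(sqrt(2 * var_center) * Phi^-1(0.75)), where var_center is the between-center variance. A between-center variance near 1.23 reproduces the reported 2.88; that variance value is back-solved here for illustration, not taken from the paper.

```python
from math import exp, sqrt
from scipy.stats import norm

def median_odds_ratio(var_center: float) -> float:
    """MOR for a random-intercept logistic model (Larsen/Merlo formulation)."""
    return exp(sqrt(2 * var_center) * norm.ppf(0.75))

print(round(median_odds_ratio(1.23), 2))  # -> 2.88, as reported
```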


Subject(s)
End Stage Liver Disease , Liver Transplantation , Tissue and Organ Procurement , Adult , Aged , Aged, 80 and over , End Stage Liver Disease/surgery , Humans , Liver Transplantation/adverse effects , Living Donors , Severity of Illness Index , Tissue Donors , Waiting Lists
10.
Am J Transplant ; 22(4): 1031-1036, 2022 04.
Article in English | MEDLINE | ID: mdl-34464500

ABSTRACT

Donor/recipient incompatibility in kidney transplantation classically refers to ABO/HLA incompatibility. Kidney paired donation (KPD) was historically established to circumvent ABO/HLA incompatibility, with the goal of identifying ABO/HLA-compatible matches. However, a broad range of donor factors beyond ABO/HLA incompatibility, such as age and weight, are known to impact recipient outcomes, and quantitative tools are now available to empirically compare potential living donors across many of these factors, such as the living donor kidney donor profile index (LKDPI). Moreover, the detrimental impact of mismatch at other HLA antigens (such as DQ) and of epitope mismatching on posttransplant outcomes has become increasingly recognized. Thus, it is time for a new paradigm of incompatibility that considers all of these risk factors together in assessing donor/recipient compatibility and the potential utility of KPD. Under this new paradigm, we show how the LKDPI and other tools can be used to identify donor/recipient incompatibilities that could be improved through KPD, even for those with a traditionally "compatible" living donor.


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Tissue and Organ Procurement , Donor Selection , HLA Antigens , Humans , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Living Donors
11.
Transplantation ; 106(3): 543-551, 2022 03 01.
Article in English | MEDLINE | ID: mdl-34259435

ABSTRACT

BACKGROUND: Historically, donation after circulatory death (DCD) livers were frequently discarded because of higher mortality and graft loss after liver transplantation (LT). However, the demand for LT continues to outstrip the supply of "acceptable" organs. Additionally, changes in the donor pool, organ allocation, and clinical management of donors and recipients, and improved clinical protocols might have altered post-DCD-LT outcomes. METHODS: We studied 5975 recovered DCD livers using US Scientific Registry of Transplant Recipients data from 2005 to 2017, with a comparison group of 78 235 adult donation after brain death (DBD) livers recovered during the same time period. We quantified temporal trends in discard using adjusted multilevel logistic regression and temporal trends in post-LT mortality and graft loss for DCD LT recipients using adjusted Cox regression. RESULTS: DCD livers were more likely to be discarded than DBD livers across the entire study period, and the relative likelihood of discard increased over time (adjusted odds ratio [aOR] of discard, DCD versus DBD: 4.45, 95% CI: 3.85-5.14 in 2005-2007; 5.87, 95% CI: 5.22-6.59 in 2015-2017) despite improving outcomes after DCD LT. Mortality risk for DCD LTs decreased in each time period (compared with 2005-2007: aHR 2008-2011 = 0.84, 95% CI: 0.72-0.97; aHR 2012-2014 = 0.58, 95% CI: 0.48-0.70; aHR 2015-2017 = 0.43, 95% CI: 0.34-0.55), as did risk of graft loss (compared with 2005-2007: aHR 2008-2011 = 0.81, 95% CI: 0.69-0.94; aHR 2012-2014 = 0.55, 95% CI: 0.45-0.67; aHR 2015-2017 = 0.45, 95% CI: 0.36-0.56). CONCLUSIONS: Despite dramatic improvements in outcomes of DCD LT recipients, DCD livers remain substantially more likely to be discarded than DBD livers, and this discrepancy has actually increased over time. DCD livers are underutilized and have the potential to expand the donor pool.
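A stripped-down version of the discard model is sketched below: plain logistic regression of discard on donor type and one donor covariate, whereas the paper fit adjusted multilevel models on SRTR data. Rows and column names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({  # invented organ-level rows
    "discarded": [1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0],
    "dcd":       [1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0],
    "donor_age": [52, 41, 60, 48, 35, 57, 44, 30, 63, 55, 47, 38],
})

fit = smf.logit("discarded ~ dcd + donor_age", data=df).fit(disp=0)
# exp(coef) on 'dcd' is the adjusted odds ratio of discard, DCD vs DBD
print(np.exp(fit.params["dcd"]))
```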


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Adult , Brain Death , Death , Graft Survival , Humans , Liver/surgery , Liver Transplantation/adverse effects , Retrospective Studies , Tissue Donors , United States
12.
Cells ; 10(9)2021 09 10.
Article in English | MEDLINE | ID: mdl-34572028

ABSTRACT

Engineered T cell receptor T (TCR-T) cell therapy has enabled the generation of increasingly reliable tumor antigen-specific, adaptable cellular products for the treatment of human cancer. TCR-T cell therapies initially focused on targeting shared tumor-associated peptide targets, including melanoma differentiation and cancer-testis antigens. With recent technological developments, it has become feasible to target neoantigens derived from tumor somatic mutations, which represents a highly personalized therapy, since most neoantigens are patient-specific and rarely shared between patients. TCR-T therapies have been evaluated against solid tumors in many preclinical studies and clinical trials worldwide. However, the efficacy of TCR-T therapy for the treatment of solid tumors has been limited by a number of factors, including low TCR avidity, off-target toxicities, and target antigen loss leading to tumor escape. In this review, we discuss the process of deriving tumor antigen-specific TCRs, including the identification of appropriate tumor antigen targets, expansion of antigen-specific T cells, and TCR cloning and validation, including techniques and tools for TCR-T cell vector construction and expression. We highlight the achievements of recent clinical trials of engineered TCR-T cell therapies and discuss the current challenges and potential solutions for improving their safety and efficacy, insights that may help guide future TCR-T studies in cancer.


Subject(s)
CD8-Positive T-Lymphocytes/transplantation , Immunotherapy, Adoptive , Neoplasms/therapy , Receptors, Chimeric Antigen/genetics , Animals , CD8-Positive T-Lymphocytes/immunology , CD8-Positive T-Lymphocytes/metabolism , Humans , Immunotherapy, Adoptive/adverse effects , Neoplasms/genetics , Neoplasms/immunology , Neoplasms/metabolism , Receptors, Chimeric Antigen/metabolism , Treatment Outcome , Tumor Microenvironment
13.
J Immunother Cancer ; 9(7)2021 07.
Article in English | MEDLINE | ID: mdl-34244308

ABSTRACT

BACKGROUND: Neoantigen (NeoAg) peptides displayed at the tumor cell surface by human leukocyte antigen molecules show exquisite tumor specificity and can elicit T cell-mediated tumor rejection. However, few NeoAgs are predicted to be shared between patients, and none to date have demonstrated therapeutic value in the context of vaccination. METHODS: We report here a phase I trial of personalized NeoAg peptide vaccination (PPV) of 24 stage III/IV non-small cell lung cancer (NSCLC) patients who had previously progressed following multiple conventional therapies, including surgery, radiation, chemotherapy, and tyrosine kinase inhibitors (TKIs). Primary endpoints of the trial evaluated the feasibility, tolerability, and safety of the personalized vaccination approach, and secondary endpoints assessed tumor-specific immune reactivity and clinical responses. Of the 16 patients with epidermal growth factor receptor (EGFR) mutations, nine continued TKI therapy concurrently with PPV and seven received PPV alone. RESULTS: Of 29 patients enrolled in the trial, 24 were immunized with personalized NeoAg peptides. Aside from transient rash, fatigue, and/or fever observed in three patients, no other treatment-related adverse events were observed. Median progression-free survival and overall survival of the 24 vaccinated patients were 6.0 and 8.9 months, respectively. Within 3-4 months following initiation of PPV, seven RECIST-based objective clinical responses, including one complete response, were observed. Notably, all seven clinical responders had EGFR-mutated tumors, including four patients who had continued TKI therapy concurrently with PPV. Immune monitoring showed that five of the seven responding patients demonstrated vaccine-induced T cell responses against EGFR NeoAg peptides. Furthermore, two highly shared EGFR mutations (L858R and T790M) were shown to be immunogenic in four of the responding patients, all of whom demonstrated increases in peripheral blood neoantigen-specific CD8+ T cell frequencies during the course of PPV. CONCLUSIONS: These results show that personalized NeoAg vaccination is feasible and safe for advanced-stage NSCLC patients. The clinical and immune responses observed following PPV suggest that EGFR mutations constitute shared, immunogenic neoantigens with promising immunotherapeutic potential for large subsets of NSCLC patients. Furthermore, PPV with concurrent EGFR inhibitor therapy was well tolerated and may have contributed to the induction of PPV-induced T cell responses.


Subject(s)
Cancer Vaccines/therapeutic use , Carcinoma, Non-Small-Cell Lung/drug therapy , Lung Neoplasms/drug therapy , Aged , Aged, 80 and over , Cancer Vaccines/pharmacology , Carcinoma, Non-Small-Cell Lung/pathology , ErbB Receptors/metabolism , Humans , Lung Neoplasms/pathology , Male , Middle Aged , Mutation
14.
Am J Transplant ; 21(10): 3305-3311, 2021 10.
Article in English | MEDLINE | ID: mdl-33870635

ABSTRACT

Recently, model for end-stage liver disease (MELD)-based liver allocation in the United States has been questioned based on concerns that waitlist mortality for a given biologic MELD (bMELD), calculated using laboratory values alone, might be higher at certain centers in certain locations across the country. Therefore, we aimed to quantify the center-level variation in bMELD-predicted mortality risk. Using Scientific Registry of Transplant Recipients (SRTR) data from January 2015 to December 2019, we modeled mortality risk in 33 260 adult, first-time waitlisted candidates from 120 centers using multilevel Poisson regression, adjusting for sex and for time-varying age and bMELD. We calculated a "MELD correction factor" using each center's random intercept and bMELD coefficient; a MELD correction factor of +1 means that a center's candidates have a higher-than-average bMELD-predicted mortality risk equivalent to 1 bMELD point. We found that the median (IQR) MELD correction factor was 0.03 (-0.47, 0.52), indicating almost no center-level variation. The number of centers with MELD correction factors within ±0.5 points, between ±0.5 and ±1.0, between ±1.0 and ±1.5, and between ±1.5 and ±2.0 points was 62, 41, 13, and 4, respectively. No center had waitlisted candidates with a higher-than-average bMELD-predicted mortality risk beyond ±2 bMELD points. Given that bMELD similarly predicts waitlist mortality at centers across the country, our results support continued MELD-based prioritization of waitlisted candidates irrespective of center.
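The "MELD correction factor" is simply each center's random intercept re-expressed in bMELD points, i.e., the intercept divided by the model's bMELD slope. The numbers below are illustrative placeholders, not fitted SRTR estimates.

```python
import numpy as np

bmeld_coef = 0.16  # hypothetical log mortality-rate increase per bMELD point
center_intercepts = np.array([0.05, -0.08, 0.002, 0.11])  # random intercepts

# +1.0 would mean a center's candidates face the mortality risk the model
# predicts for otherwise-identical candidates with 1 extra bMELD point
correction = center_intercepts / bmeld_coef
print(np.round(correction, 2))
```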


Subject(s)
End Stage Liver Disease , Liver Transplantation , Tissue and Organ Procurement , End Stage Liver Disease/surgery , Humans , Severity of Illness Index , Waiting Lists
15.
JAMA Surg ; 156(4): e207083, 2021 04 01.
Article in English | MEDLINE | ID: mdl-33566079

ABSTRACT

Importance: Historically, deceased organ donation was lower among Black compared with White populations, motivating efforts to reduce racial disparities. The overarching effect of these efforts among Black and other racial/ethnic groups remains unclear. Objective: To examine changes in deceased organ donation over time. Design, Setting, and Participants: This population-based cohort study used data from January 1, 1999, through December 31, 2017, from the Scientific Registry of Transplant Recipients to quantify the number of actual deceased organ donors, and from the Centers for Disease Control and Prevention Wide-ranging Online Data for Epidemiologic Research Detailed Mortality File to quantify the number of potential donors (individuals who died under conditions consistent with organ donation). Data were analyzed from December 2, 2019, to May 14, 2020. Exposures: Race and ethnicity of deceased and potential donors. Main Outcomes and Measures: For each racial/ethnic group and year, a donation ratio was calculated as the number of actual deceased donors divided by the number of potential donors. Direct age and sex standardization was used to allow for group comparisons, and Poisson regression was used to quantify changes in the donation ratio over time. Results: A total of 141 534 deceased donors and 5 268 200 potential donors were included in the analysis. Among Black individuals, the donation ratio increased 2.58-fold from 1999 to 2017 (yearly change in adjusted incidence rate ratio [aIRR], 1.05; 95% CI, 1.05-1.05; P < .001). This increase was significantly greater than the 1.60-fold increase seen in White individuals. Nevertheless, substantial racial differences remained, with Black individuals still donating at only 69% the rate of White individuals in 2017 (P < .001). Among other racial minority populations, changes were less drastic. Deceased organ donation increased 1.80-fold among American Indian/Alaska Native and 1.40-fold among Asian or Pacific Islander populations, with substantial racial differences remaining in 2017 (American Indian/Alaska Native donation at 28% and Asian/Pacific Islander donation at 85% the rate of the White population). The difference in deceased organ donation between Hispanic/Latino and non-Hispanic/Latino populations grew over time, with the Hispanic/Latino rate 4% lower in 2017. Conclusions and Relevance: The findings of this cohort study suggest that differences in deceased organ donation between White and some racial minority populations have attenuated over time. The greatest gains were observed among Black individuals, who have been the primary targets of study and intervention. Despite improvements, substantial differences remain, suggesting that novel approaches are needed to understand and address relatively lower rates of deceased organ donation among all racial minorities.
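The donation ratio and its direct age standardization reduce to simple stratum arithmetic: within each stratum, divide actual donors by potential donors, then average the stratum ratios using a shared reference population. All counts and weights below are made up for illustration.

```python
import pandas as pd

strata = pd.DataFrame({
    "age_band":   ["18-39", "40-59", "60+"],
    "actual":     [120, 260, 90],     # actual deceased donors
    "potential":  [900, 2600, 2100],  # deaths consistent with donation
    "ref_weight": [0.3, 0.4, 0.3],    # standard population weights (sum to 1)
})

strata["ratio"] = strata["actual"] / strata["potential"]
standardized = (strata["ratio"] * strata["ref_weight"]).sum()
print(f"age-standardized donation ratio: {standardized:.3f}")
```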


Subject(s)
Ethnic and Racial Minorities , Tissue and Organ Procurement/statistics & numerical data , Female , Humans , Male , United States
16.
Am J Transplant ; 21(5): 1838-1847, 2021 05.
Article in English | MEDLINE | ID: mdl-33107180

ABSTRACT

COVID-19 has profoundly affected the American health care system; its effect on the liver transplant (LT) waitlist based on COVID-19 incidence has not been characterized. Using SRTR data, we compared observed LT waitlist registrations, waitlist mortality, deceased donor LTs (DDLT), and living donor LTs (LDLT) from 3/15/2020 to 8/31/2020 to expected values based on historical trends from 1/2016 to 1/2020, stratified by statewide COVID-19 incidence. Overall, from 3/15 to 4/30, new listings were 11% fewer than expected (IRR = 0.89, 95% CI: 0.84-0.93), LDLTs were 49% fewer (IRR = 0.51, 95% CI: 0.37-0.72), and DDLTs were 9% fewer (IRR = 0.91, 95% CI: 0.85-0.97). In May, new listings were 21% fewer (IRR = 0.79, 95% CI: 0.74-0.84), LDLTs were 42% fewer (IRR = 0.58, 95% CI: 0.39-0.85), and DDLTs were 13% more (IRR = 1.15, 95% CI: 1.07-1.23). Centers in states with the highest incidence from 3/15 to 4/30 had 59% more waitlist deaths (IRR = 1.59, 95% CI: 1.09-2.32) and 34% fewer DDLTs (IRR = 0.66, 95% CI: 0.50-0.86). By August, waitlist outcomes were occurring at expected rates, except for DDLT (13% more across all incidences). While states affected early by COVID-19 endured major changes in transplant practice, areas affected later in the pandemic were not impacted to the same extent. These results speak to the adaptability of the transplant community in addressing the pandemic and applying new knowledge to patient care.
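Each percentage above is an incidence rate ratio of observed to expected counts, with the expectation extrapolated from pre-pandemic trends. The toy below fits a linear trend to invented 2016-2019 listing counts and compares a hypothetical 2020 observation against it.

```python
import numpy as np

years    = np.array([2016, 2017, 2018, 2019])
listings = np.array([980, 1010, 1035, 1070])  # invented historical counts

slope, intercept = np.polyfit(years, listings, 1)  # linear trend
expected_2020 = slope * 2020 + intercept

observed_2020 = 977  # invented
print(observed_2020 / expected_2020)  # ~0.89, i.e. about 11% fewer than expected
```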


Subject(s)
COVID-19 , Liver Transplantation/statistics & numerical data , Humans , Liver Transplantation/trends , Pandemics , Retrospective Studies , United States/epidemiology , Waiting Lists
17.
Am J Transplant ; 21(1): 198-207, 2021 01.
Article in English | MEDLINE | ID: mdl-32506639

ABSTRACT

Infections remain a major threat to successful kidney transplantation (KT). To characterize the landscape and impact of post-KT infections in the modern era, we used United States Renal Data System (USRDS) data linked to the Scientific Registry of Transplant Recipients (SRTR) to study 141 661 Medicare-primary kidney transplant recipients from January 1, 1999 to December 31, 2014. Infection diagnoses were ascertained by International Classification of Diseases, Ninth Revision (ICD-9) codes. The cumulative incidence of a post-KT infection was 36.9% at 3 months, 53.7% at 1 year, and 78.0% at 5 years. The most common infections were urinary tract infection (UTI; 46.8%) and pneumonia (28.2%). Five-year mortality for kidney transplant recipients who developed an infection was 24.9% vs 7.9% for those who did not, and 5-year death-censored graft failure (DCGF) was 20.6% vs 10.1% (P < .001). This translated to a 2.22-fold higher mortality risk (adjusted hazard ratio [aHR]: 2.22, 95% CI: 2.15-2.29; P < .001) and a 1.92-fold higher DCGF risk (aHR: 1.91, 95% CI: 1.84-1.98; P < .001) for kidney transplant recipients who developed an infection, although the magnitude of this higher risk varied across infection types (for example, a 3.11-fold higher mortality risk for sepsis vs 1.62-fold for a UTI). Post-KT infections are common and substantially impact mortality and DCGF, even in the modern era. Kidney transplant recipients at high risk for infections might benefit from enhanced surveillance or follow-up to mitigate these risks.
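The cumulative incidence figures can be read as 1 - S(t) from a time-to-first-infection survival fit. The sketch below uses a Kaplan-Meier estimator on invented data and, for brevity, ignores the competing risk of death, which a registry analysis would need to handle.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({  # invented recipient-level rows
    "years_to_infection": [0.2, 0.9, 1.5, 3.0, 4.8, 5.0],
    "infected":           [1,   1,   1,   0,   1,   0],
})

kmf = KaplanMeierFitter().fit(df["years_to_infection"], df["infected"])
cum_inc = 1 - kmf.survival_function_at_times([0.25, 1, 5])
print(cum_inc)  # analogous to the reported 36.9% / 53.7% / 78.0%
```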


Subject(s)
Kidney Transplantation , Aged , Graft Rejection/epidemiology , Graft Rejection/etiology , Humans , Kidney Transplantation/adverse effects , Medicare , Risk Factors , Transplant Recipients , United States/epidemiology
18.
Transplantation ; 105(2): 436-442, 2021 02 01.
Article in English | MEDLINE | ID: mdl-32235255

ABSTRACT

BACKGROUND: Desensitization protocols for HLA-incompatible living donor kidney transplantation (ILDKT) vary across centers. The impact of these, as well as other practice variations, on ILDKT outcomes remains unknown. METHODS: We sought to quantify center-level variation in mortality and graft loss following ILDKT using a 25-center cohort of 1358 ILDKT recipients with linkage to the Scientific Registry of Transplant Recipients for accurate outcome ascertainment. We used multilevel Cox regression with shared frailty to determine the variation in post-ILDKT outcomes attributable to between-center differences and to identify any center-level characteristics associated with improved post-ILDKT outcomes. RESULTS: After adjusting for patient-level characteristics, only 6 centers (24%) had lower mortality and 1 (4%) had higher mortality than average. Similarly, only 5 centers (20%) had higher graft loss and 2 had lower graft loss than average. Only 4.7% of the differences in mortality (P < 0.01) and 4.4% of the differences in graft loss (P < 0.01) were attributable to between-center variation. These translated to a median hazard ratio of 1.36 for mortality and 1.34 for graft loss for similar candidates at different centers. Post-ILDKT outcomes were not associated with center-level characteristics including ILDKT volume or the proportion of highly sensitized, prior transplant, preemptive, or minority candidates transplanted. CONCLUSIONS: Unlike most aspects of transplantation, in which center-level variation and volume impact outcomes, we did not find substantial evidence for this in ILDKT. Our findings support the continued practice of ILDKT across these diverse centers.


Subject(s)
Graft Rejection/prevention & control , Graft Survival/drug effects , HLA Antigens/immunology , Healthcare Disparities , Histocompatibility , Immunosuppressive Agents/therapeutic use , Isoantibodies/blood , Kidney Transplantation , Living Donors , Practice Patterns, Physicians' , Adult , Female , Graft Rejection/blood , Graft Rejection/immunology , Graft Rejection/mortality , Humans , Immunosuppressive Agents/adverse effects , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Male , Middle Aged , Quality Indicators, Health Care , Registries , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States
19.
Am J Transplant ; 21(4): 1612-1621, 2021 04.
Article in English | MEDLINE | ID: mdl-33370502

ABSTRACT

Incompatible living donor kidney transplant recipients (ILDKTr) have preexisting donor-specific antibody (DSA) that, despite desensitization, may persist or reappear, with resulting consequences including delayed graft function (DGF) and acute rejection (AR). To quantify the risk of DGF and AR in ILDKT and their downstream effects, we compared 1406 ILDKTr to 17 542 compatible LDKT recipients (CLDKTr) using a 25-center cohort with novel SRTR linkage. We characterized DSA strength as positive Luminex, negative flow crossmatch (PLNF); positive flow, negative cytotoxic crossmatch (PFNC); or positive cytotoxic crossmatch (PCC). DGF occurred in 3.1% of CLDKT, 3.5% of PLNF, 5.7% of PFNC, and 7.6% of PCC recipients, which translated to higher DGF for PCC recipients (aOR = 1.68, 95% CI: 1.03-2.72). However, the impact of DGF on mortality and death-censored graft failure (DCGF) risk was no higher for ILDKT than CLDKT (interaction P > .1). AR developed in 8.4% of CLDKT, 18.2% of PLNF, 21.3% of PFNC, and 21.7% of PCC recipients, which translated to higher AR (aOR: PLNF = 2.09, 95% CI: 1.45-3.02; PFNC = 2.40, 95% CI: 1.67-3.46; PCC = 2.24, 95% CI: 1.48-3.37). Although the impact of AR on mortality was no higher for ILDKT than CLDKT (interaction P = .1), its impact on DCGF risk was less consequential for ILDKT (aHR = 1.62, 95% CI: 1.34-1.95) than CLDKT (aHR = 2.29, 95% CI: 1.96-2.67) (interaction P = .004). Providers should consider these risks during preoperative counseling, and strategies to mitigate them should be considered.


Subject(s)
Kidney Transplantation , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Living Donors , Retrospective Studies , Risk Factors
20.
Am J Transplant ; 21(4): 1564-1575, 2021 04.
Article in English | MEDLINE | ID: mdl-32949093

ABSTRACT

Desensitization has enabled incompatible living donor kidney transplantation (ILDKT) across HLA/ABO barriers, but the added immunomodulation might put patients at increased risk of infections. We studied 475 recipients from our center from 2010 to 2015, categorized by desensitization intensity: none/compatible (n = 260), low (0-4 plasmaphereses, n = 47), moderate (5-9, n = 74), and high (≥10, n = 94). The 1-year cumulative incidence of infection was 50.1%, 49.8%, 66.0%, and 73.5% for recipients who received no, low-, moderate-, and high-intensity desensitization, respectively (P < .001). The most common infections were UTI (33.5% of ILDKT vs. 21.5% of compatible), opportunistic (21.9% vs. 10.8%), and bloodstream (19.1% vs. 5.4%) (P < .001). In weighted models, a trend toward increased risk was seen in low-intensity (wIRR = 1.40, 95% CI: 0.77-2.56; P = .3) and moderate-intensity (wIRR = 1.35, 95% CI: 0.88-2.06; P = .2) desensitized recipients, with a statistically significant 2.22-fold (wIRR = 2.22, 95% CI: 1.33-3.72; P = .002) increased risk in highly desensitized recipients. Recipients with ≥4 infections were at higher risk of prolonged hospitalization (wIRR = 3.57, 95% CI: 2.62-4.88; P < .001) and death-censored graft loss (wHR = 4.01, 95% CI: 1.15-13.95; P = .03). Post-KT infections are more common in desensitized ILDKT recipients, and a subset of highly desensitized patients is at ultra-high risk for infections. Strategies should be designed to protect patients from the morbidity of recurrent infections and to extend the survival benefit of ILDKT across the spectrum of recipients.
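The wIRRs above come from weighted count-data models; a minimal stand-in is a Poisson GLM with analysis weights, where exp(coef) is the incidence rate ratio. Column names, counts, and weights below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({  # invented recipient-level rows
    "n_infections":   [0, 2, 1, 4, 3, 6, 1, 5],
    "high_intensity": [0, 0, 0, 0, 1, 1, 1, 1],  # >=10 plasmapheresis sessions
    "w":              [1.0, 1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1],
})

X = sm.add_constant(df[["high_intensity"]])
fit = sm.GLM(df["n_infections"], X, family=sm.families.Poisson(),
             freq_weights=df["w"]).fit()
print(np.exp(fit.params["high_intensity"]))  # wIRR, high vs none/compatible
```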


Subject(s)
Kidney Transplantation , ABO Blood-Group System , Blood Group Incompatibility , Graft Rejection/epidemiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney Transplantation/adverse effects , Living Donors , Transplant Recipients