1.
Am J Transplant ; 24(4): 669-680, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37923085

ABSTRACT

Medication nonadherence is a leading cause of graft loss. Adherence monitoring technologies (reminder texts, smart bottles, video-observed ingestion, and digestion-activated signaling pills) may support adherence. However, patient, care partner, and clinician perceptions of these tools are not well studied. We conducted qualitative individual semistructured interviews and focus groups among 97 participants at a single center: kidney and liver transplant recipients 2 weeks to 18 months posttransplant, their care partners, and transplant clinicians. We assessed adherence practices, reactions to monitoring technologies, and opportunities for care integration. One-size-fits-all approaches were deemed infeasible. Interviewees considered text messages the most acceptable approach; live video checks were the least acceptable and raised the most concerns about inconvenience and invasiveness. Digestion-activated signaling technology produced both excitement and apprehension. Patients and care partners generally aligned in perceptions of adherence monitoring integration into clinical care. Key themes were the importance of routine, ease of use, leveraging technology for actionable medication changes, and aversion to surveillance. Transplant clinicians similarly considered text messages most acceptable and video checks least acceptable. Clinicians reported that early posttransplant use and real-time adherence tracking with patient feedback may facilitate successful implementation. The study provides initial insights that may inform future adherence technology implementation.


Subject(s)
Caregivers , Kidney Transplantation , Humans , Transplant Recipients , Medication Adherence
2.
Am J Transplant ; 24(1): 46-56, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37739347

ABSTRACT

Kidney paired donation (KPD) is a major innovation that is changing the landscape of kidney transplantation in the United States. We used the 2006-2021 United Network for Organ Sharing data to examine trends over time. KPD is increasing, with 1 in 5 living donor kidney transplants (LDKTs) in 2021 facilitated by KPD. The proportion of LDKTs performed via KPD was comparable for non-Whites and Whites. An increasing proportion of KPD transplants are going to non-Whites. End-chain recipients are not identified in the database. To what extent these trends reflect how end-chain kidneys are allocated, as opposed to an increase in living donation among minorities, remains unclear. Half of the LDKTs in 2021 in sensitized (panel reactive antibody ≥ 80%) and highly sensitized (panel reactive antibody ≥ 98%) groups occurred via KPD. Yet, the proportion of KPD transplants performed in sensitized recipients has declined since 2013, likely due to changes in the deceased donor allocation policies and newer KPD strategies such as compatible KPD. In 2021, 40% of the programs reported not performing any KPD transplants. Our study highlights the need for understanding barriers to pursuing and expanding KPD at the center level and the need for more detailed and accurate data collection at the national level.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Humans , United States , Living Donors , Tissue and Organ Harvesting , Kidney
3.
Am J Transplant ; 24(6): 983-992, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38346499

ABSTRACT

Some United States organ procurement organizations transfer deceased organ donors to donor care units (DCUs) for recovery procedures. We used Organ Procurement and Transplantation Network data, from April 2017 to June 2021, to describe the proximity of adult deceased donors after brain death to DCUs and understand the impact of donor service area (DSA) boundaries on transfer efficiency. Among 19 109 donors (56.1% of the cohort) in 25 DSAs with DCUs, a majority (14 593 [76.4%]) were in hospitals within a 2-hour drive. In areas with DCUs detectable in the study data set, a minority of donors (3582 of 11 532 [31.1%]) were transferred to a DCU; transfer rates varied between DSAs (median, 27.7%, range, 4.0%-96.5%). Median hospital-to-DCU driving times were not meaningfully shorter among transferred donors (50 vs 51 minutes for not transferred, P < .001). When DSA boundaries were ignored, 3241 cohort donors (9.5%) without current DCU access were managed in hospitals within 2 hours of a DCU and thus potentially eligible for transfer. In summary, approximately half of United States deceased donors after brain death are managed in hospitals in DSAs with a DCU. Transfer of donors between DSAs may increase DCU utilization and improve system efficiency.


Subject(s)
Organ Transplantation , Tissue Donors , Tissue and Organ Procurement , Humans , Tissue Donors/supply & distribution , Tissue and Organ Procurement/statistics & numerical data , Tissue and Organ Procurement/organization & administration , United States , Organ Transplantation/statistics & numerical data , Brain Death , Adult , Patient Transfer , Female , Male , Middle Aged
4.
Liver Transpl ; 30(1): 10-19, 2024 01 01.
Article in English | MEDLINE | ID: mdl-37379030

ABSTRACT

Frailty and impaired functional status are associated with adverse outcomes on the liver transplant (LT) waitlist and after transplantation. Prehabilitation prior to LT has rarely been tested. We conducted a 2-arm patient-randomized pilot trial to evaluate the feasibility and efficacy of a 14-week behavioral intervention to promote physical activity prior to LT. Thirty patients were randomized 2:1 to intervention (n = 20) versus control (n = 10). The intervention arm received financial incentives and text-based reminders linked to wearable fitness trackers. Daily step goals were increased by 15% in 2-week intervals. Weekly check-ins with study staff assessed barriers to physical activity. The primary outcomes were feasibility and acceptability. Secondary outcomes included mean end-of-study step counts, short physical performance battery, grip strength, and body composition by phase angle. We fit regression models for secondary outcomes with the arm as the exposure adjusting for baseline performance. The mean age was 61, 47% were female, and the median Model for End-stage Liver Disease sodium (MELD-Na) was 13. One-third were frail or prefrail by the liver frailty index, 40% had impaired mobility by short physical performance battery, nearly 40% had sarcopenia by bioimpedance phase angle, 23% had prior falls, and 53% had diabetes. Study retention was 27/30 (90%; 2 unenrolled from intervention, 1 lost to follow-up in control arm). Self-reported adherence to exercise during weekly check-ins was about 50%; the most common barriers were fatigue, weather, and liver-related symptoms. End-of-study step counts were nearly 1000 steps higher for intervention versus control: adjusted difference 997, 95% CI, 147-1847; p = 0.02. On average, the intervention group achieved daily step targets 51% of the time. A home-based intervention with financial incentives and text-based nudges was feasible, highly accepted, and increased daily steps in LT candidates with functional impairment and malnutrition.


Subject(s)
End Stage Liver Disease , Frailty , Liver Transplantation , Humans , Female , Middle Aged , Male , Liver Transplantation/adverse effects , Preoperative Exercise , End Stage Liver Disease/diagnosis , End Stage Liver Disease/surgery , Severity of Illness Index
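The escalation rule in the trial above (daily step goals raised by 15% at 2-week intervals over the 14-week program) can be made concrete with a short sketch. The 4,000-step baseline, the compounding of increases, and the rounding are illustrative assumptions; the abstract does not specify them.

```python
# Illustrative step-goal schedule: goals rise by 15% every 2 weeks.
# Baseline, compounding, and rounding are assumptions, not trial parameters.
baseline_steps = 4000  # assumed starting point measured during run-in

goal = baseline_steps
for period in range(1, 8):  # seven 2-week periods span the 14-week program
    goal = round(goal * 1.15)
    weeks = f"{2 * period - 1}-{2 * period}"
    print(f"Weeks {weeks:>5}: daily step goal = {goal:,}")
```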
5.
Am J Kidney Dis ; 84(5): 646-650, 2024 Nov.
Article in English | MEDLINE | ID: mdl-38670253

ABSTRACT

Advocates for improved equity in kidney transplants in the United States have recently focused their efforts on initiatives to increase referral for transplant evaluation. However, because donor kidneys remain scarce, increased referrals are likely to result in an increasing number of patients proceeding through the evaluation process without ultimately receiving a kidney. Unfortunately, the process of referral and evaluation can be highly resource-intensive for patients, families, transplant programs, and payers. Patients and families may incur out-of-pocket expenses and be required to complete testing and treatments that they might not have chosen in the course of routine clinical care. Kidney transplant programs may struggle with insufficient capacity, inefficient workflow, and challenging programmatic finances, and payers will need to absorb the increased expenses of upfront pretransplant costs. Increased referral in isolation may risk simply transmitting system stress and resulting disparities to downstream processes in this complex system. We argue that success in efforts to improve access through increased referrals hinges on adaptations to the pretransplant process more broadly. We call for an urgent re-evaluation and redesign at multiple levels of the pretransplant system in order to achieve the aim of equitable access to kidney transplantation for all patients with kidney failure.


Subject(s)
Health Services Accessibility , Kidney Transplantation , Referral and Consultation , Humans , United States , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/surgery , Tissue and Organ Procurement/organization & administration
6.
Am J Kidney Dis ; 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39032679

ABSTRACT

RATIONALE & OBJECTIVE: The clinical trajectory of normoalbuminuric chronic kidney disease (CKD), particularly in the absence of diabetes, has not yet been well-studied. This study evaluated the association of kidney and cardiovascular outcomes with levels of albuminuria in a cohort of patients with nondiabetic CKD. STUDY DESIGN: Prospective cohort study. SETTING & PARTICIPANTS: 1,463 adults with nondiabetic CKD without known glomerulonephritis and diagnosed with hypertensive nephrosclerosis or unknown cause of CKD participating in the Chronic Renal Insufficiency Cohort (CRIC) Study. EXPOSURE: Albuminuria stage at study entry. OUTCOME: Primary outcome: Composite kidney (halving of estimated glomerular filtration rate [eGFR], kidney transplantation, or dialysis), Secondary outcomes: (1) eGFR slope, (2) composite cardiovascular disease events (hospitalization for heart failure, myocardial infarction, stroke, or all-cause death), (3) all-cause death. ANALYTICAL APPROACH: Linear mixed effects and Cox proportional hazards regression analyses. RESULTS: Lower levels of albuminuria were associated with female sex and older age. For the primary outcome, compared with normoalbuminuria, those with moderate and severe albuminuria had higher rates of kidney outcomes (adjusted hazard ratio [AHR], 3.3 [95% CI, 2.4-4.6], and AHR, 8.6 [95% CI, 6.0-12.0], respectively) and cardiovascular outcomes (AHR, 1.5 [95% CI, 1.2-1.9], and AHR, 1.5 [95% CI, 1.1-2.0], respectively). Those with normoalbuminuria (<30µg/mg; n=863) had a slower decline in eGFR (-0.46mL/min/1.73m2 per year) compared with those with moderate (30-300µg/mg, n=372; 1.41mL/min/1.73m2 per year) or severe albuminuria (>300µg/mg, n=274; 2.63mL/min/1.73m2 per year). In adjusted analyses, kidney outcomes occurred, on average, sooner among those with moderate (8.6 years) and severe (7.3 years) albuminuria compared with those with normoalbuminuria (9.3 years) whereas the average times to cardiovascular outcomes were similar across albuminuria groups (8.2, 8.1, and 8.6 years, respectively). LIMITATIONS: Self-report of CKD etiology without confirmatory kidney biopsies; residual confounding. CONCLUSIONS: Participants with normoalbuminuric nondiabetic CKD experienced substantially slower CKD progression but only modestly lower cardiovascular risk than those with high levels of albuminuria. These findings inform the design of future studies investigating interventions among individuals with lower levels of albuminuria. PLAIN-LANGUAGE SUMMARY: Diabetes and hypertension are the leading causes of chronic kidney disease (CKD). Urine albumin levels are associated with cardiovascular and kidney disease outcomes among individuals with CKD. However, previous studies of long-term clinical outcomes in CKD largely included patients with diabetes. As well, few studies have evaluated long-term outcomes across different levels of urine albumin among people without diabetes. In this study, we found individuals with nondiabetic CKD and low urine albumin had much slower decline of kidney function but only a modestly lower risk of a cardiovascular events compared with those with high levels of urine albumin. Individuals with low urine albumin were much more likely to have a cardiovascular event than progression of their kidney disease. These findings inform the design of future studies investigating treatments among individuals with lower levels of albuminuria.

7.
Am J Kidney Dis ; 84(4): 416-426, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38636649

ABSTRACT

RATIONALE & OBJECTIVE: The US Kidney Allocation System (KAS) prioritizes candidates with a≤20% estimated posttransplant survival (EPTS) to receive high-longevity kidneys defined by a≤20% Kidney Donor Profile Index (KDPI). Use of EPTS in the KAS deprioritizes candidates with older age, diabetes, and longer dialysis durations. We assessed whether this use also disadvantages race and ethnicity minority candidates, who are younger but more likely to have diabetes and longer durations of kidney failure requiring dialysis. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Adult candidates for and recipients of kidney transplantation represented in the Scientific Registry of Transplant Recipients from January 2015 through December 2020. EXPOSURE: Race and ethnicity. OUTCOME: Age-adjusted assignment to≤20% EPTS, transplantation of a≤20% KDPI kidney, and posttransplant survival in longevity-matched recipients by race and ethnicity. ANALYTIC APPROACH: Multivariable logistic regression, Fine-Gray competing risks survival analysis, and Kaplan-Meier and Cox proportional hazards methods. RESULTS: The cohort included 199,444 candidates (7% Asian, 29% Black, 19% Hispanic or Latino, and 43% White) listed for deceased donor kidney transplantation. Non-White candidates had significantly higher rates of diabetes, longer dialysis duration, and were younger than White candidates. Adjusted for age, Asian, Black, and Hispanic or Latino candidates had significantly lower odds of having a ETPS score of≤20% (odds ratio, 0.86 [95% CI, 0.81-0.91], 0.52 [95% CI, 0.50-0.54], and 0.49 [95% CI, 0.47-0.51]), and were less likely to receive a≤20% KDPI kidney (sub-hazard ratio, 0.70 [0.66-0.75], 0.89 [0.87-0.92], and 0.73 [0.71-0.76]) compared with White candidates. Among recipients with≤20% EPTS scores transplanted with a≤20% KDPI deceased donor kidney, Asian and Hispanic recipients had lower posttransplant mortality (HR, 0.45 [0.27-0.77] and 0.63 [0.47-0.86], respectively) and Black recipients had higher but not statistically significant posttransplant mortality (HR, 1.22 [0.99-1.52]) compared with White recipients. LIMITATIONS: Provider reported race and ethnicity data and 5-year post transplant follow-up period. CONCLUSIONS: The US kidney allocation system is less likely to identify race and ethnicity minority candidates as having a≤20% EPTS score, which triggers allocation of high-longevity deceased donor kidneys. These findings should inform the Organ Procurement and Transplant Network about how to remedy the race and ethnicity disparities introduced through KAS's current approach of allocating allografts with longer predicted longevity to recipients with longer estimated posttransplant survival. PLAIN-LANGUAGE SUMMARY: The US Kidney Allocation System prioritizes giving high-longevity, high-quality kidneys to patients on the waiting list who have a high estimated posttransplant survival (EPTS) score. EPTS is calculated based on the patient's age, whether the patient has diabetes, whether the patient has a history of organ transplantation, and the number of years spent on dialysis. Our analyses show that Asian, Black or African American, and Hispanic or Latino patients were less likely to receive high-longevity kidneys compared with White patients, despite having similar or better posttransplant survival outcomes.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Humans , Male , Female , Middle Aged , United States/epidemiology , Adult , Cohort Studies , Tissue Donors , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/mortality , Graft Survival , Aged , Ethnicity , Longevity , Registries , Racial Groups
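A minimal sketch of the age-adjusted logistic model described in the analytic approach above, fit to synthetic data with statsmodels; the column names, synthetic prevalences, and coefficients are assumptions for illustration, not SRTR data or the study's code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic candidate-level data (illustration only, not SRTR data).
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(50, 13, n).clip(18, 80),
    "race": rng.choice(["White", "Black", "Hispanic or Latino", "Asian"], n),
})
# Synthetic indicator for assignment of an EPTS score <= 20% (younger -> more likely).
linpred = 3.0 - 0.08 * df["age"]
df["low_epts"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

# Age-adjusted odds of a <=20% EPTS assignment by race and ethnicity.
fit = smf.logit("low_epts ~ age + C(race, Treatment(reference='White'))", data=df).fit()
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```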
8.
Am J Kidney Dis ; 84(5): 567-581.e1, 2024 Nov.
Article in English | MEDLINE | ID: mdl-38851446

ABSTRACT

RATIONALE & OBJECTIVE: Developing strategies to improve home dialysis use requires a comprehensive understanding of barriers. We sought to identify the most important barriers to home dialysis use from the perspective of patients, care partners, and providers. STUDY DESIGN: This is a convergent parallel mixed-methods study. SETTING & PARTICIPANTS: We convened a 7-member advisory board of patients, care partners, and providers who collectively developed lists of major patient/care partner-perceived barriers and provider-perceived barriers to home dialysis. We used these lists to develop a survey that was distributed to patients, care partners, and providers-through the American Association of Kidney Patients and the National Kidney Foundation. The surveys asked participants to (1) rank their top 3 major barriers (quantitative) and (2) describe barriers to home dialysis (qualitative). ANALYTICAL APPROACH: We compiled a list of the top 3 patient/care partner-perceived and top 3 provider-perceived barriers (quantitative). We also conducted a directed content analysis of open-ended survey responses (qualitative). RESULTS: There were 522 complete responses (233 providers; 289 patients/care partners). The top 3 patient/care partner-perceived barriers were fear of performing home dialysis; lack of space; and the need for home-based support. The top 3 provider-perceived barriers were poor patient education; limited mechanisms for home-based support staff, mental health, and education; and lack of experienced staff. We identified 9 themes through qualitative analysis: limited education; financial disincentives; limited resources; high burden of care; built environment/structure of care delivery that favors in-center hemodialysis; fear and isolation; perceptions of inequities in access to home dialysis; provider perspectives about patients; and patient/provider resiliency. LIMITATIONS: This was an online survey that is subject to nonresponse bias. CONCLUSIONS: The top 3 barriers to home dialysis for patient/care partners and providers incompletely overlap, suggesting the need for diverse strategies that simultaneously address patient-perceived barriers at home and provider-perceived barriers in the clinic. PLAIN-LANGUAGE SUMMARY: There are many barriers to home dialysis use in the United States. However, we know little about which barriers are the most important to patients and clinicians. This makes it challenging to develop strategies to increase home dialysis use. In this study, we surveyed patients, care partners, and clinicians across the country to identify the most important barriers to home dialysis, namely (1) patients/care partners identified fear of home dialysis, lack of space, and lack of home-based support; and (2) clinicians identified poor patient education, limited support for staff and patients, and lack of experienced staff. These findings suggest that patients and clinicians perceive different barriers and that both sets of barriers should be addressed to expand home dialysis use.


Subject(s)
Caregivers , Hemodialysis, Home , Humans , Hemodialysis, Home/psychology , Male , Female , Middle Aged , Aged , Caregivers/psychology , United States , Kidney Failure, Chronic/therapy , Health Services Accessibility , Surveys and Questionnaires , Adult
9.
Stat Med ; 43(16): 3036-3050, 2024 Jul 20.
Article in English | MEDLINE | ID: mdl-38780593

ABSTRACT

In evaluating the performance of different facilities or centers on survival outcomes, the standardized mortality ratio (SMR), which compares observed with expected mortality, has been widely used, particularly in the evaluation of kidney transplant centers. Despite its utility, the SMR may exaggerate center effects in settings where survival probability is relatively high. An example is one-year graft survival among U.S. kidney transplant recipients. We propose a novel approach to estimate center effects in terms of differences in survival probability (ie, each center versus a reference population). An essential component of the method is a prognostic score weighting technique, which permits accurately evaluating centers without necessarily specifying a correct survival model. Advantages of our approach over existing facility-profiling methods include a metric based on survival probability (greater clinical relevance than ratios of counts/rates); direct standardization (valid to compare between centers, unlike indirect standardization-based methods such as the SMR); and less reliance on correct model specification (since the assumed model is used to generate risk classes as opposed to fitted-value-based 'expected' counts). We establish the asymptotic properties of the proposed weighted estimator and evaluate its finite-sample performance under a diverse set of simulation settings. The method is then applied to evaluate U.S. kidney transplant centers with respect to graft survival probability.


Subject(s)
Graft Survival , Kidney Transplantation , Models, Statistical , Kidney Transplantation/mortality , Humans , Prognosis , Survival Analysis , United States , Probability , Computer Simulation
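To make the contrast above concrete, here is a small numerical sketch of indirect standardization (the SMR) versus the directly standardized survival-probability difference the paper proposes. All numbers are invented for illustration; the paper's prognostic-score weighting is more involved than this three-class toy example.

```python
import numpy as np

# Indirect standardization: SMR = observed / expected deaths for the
# center's own case mix (expected counts come from a fitted risk model).
observed_deaths = 12
expected_deaths = 18.5
smr = observed_deaths / expected_deaths

# Direct standardization: weight risk-class-specific survival by the
# *reference population's* case mix, then take the difference.
reference_mix      = np.array([0.50, 0.30, 0.20])   # share of patients per risk class
center_survival    = np.array([0.97, 0.93, 0.85])   # center's 1-year graft survival
reference_survival = np.array([0.96, 0.91, 0.82])   # reference 1-year graft survival

survival_difference = float(reference_mix @ (center_survival - reference_survival))

print(f"SMR (indirect): {smr:.2f}")
print(f"Survival-probability difference (direct): {survival_difference:+.3f}")
```

When baseline survival is high, an SMR far from 1 can correspond to an absolute survival difference of only a percentage point or two, which is the exaggeration of center effects the abstract describes.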
10.
BMC Nephrol ; 25(1): 183, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38807063

ABSTRACT

BACKGROUND: Structured Problem Solving (SPS) is a patient-centered approach to promoting behavior change that relies on productive collaboration between coaches and participants and reinforces participant autonomy. We aimed to describe the design, implementation, and assessment of SPS in the multicenter Prevention of Urinary Stones with Hydration (PUSH) randomized trial. METHODS: In the PUSH trial, individuals with a history of urinary stone disease and low urine output were randomized to control versus a multicomponent intervention including SPS that was designed to promote fluid consumption and thereby prevent recurrent stones. We provide details specifically about training and fidelity assessment of the SPS coaches. We report on implementation experiences related to SPS during the initial conduct of the trial. RESULTS: With training and fidelity assessment, coaches in the PUSH trial applied SPS to help participants overcome barriers to fluid consumption. In some cases, coaches faced implementation barriers such as variable participant engagement that required tailoring their work with specific participants. The coaches also faced challenges including balancing rapport with problem solving, and role clarity for the coaches. CONCLUSIONS: We adapted SPS to the setting of kidney stone prevention and overcame challenges in implementation, such as variable patient engagement. Tools from the PUSH trial may be useful to apply to other health behavior change settings in nephrology and other areas of clinical care. TRIAL REGISTRATION: ClinicalTrials.gov Identifier NCT03244189.


Subject(s)
Drinking , Problem Solving , Urinary Calculi , Humans , Urinary Calculi/prevention & control , Male , Female , Drinking Behavior
11.
J Am Soc Nephrol ; 34(2): 205-219, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36735375

ABSTRACT

BACKGROUND: National guidelines recommend twice-yearly hepatitis C virus (HCV) screening for patients receiving in-center hemodialysis. However, studies examining the cost-effectiveness of HCV screening methods or frequencies are lacking. METHODS: We populated an HCV screening, treatment, and disease microsimulation model with a cohort representative of the US in-center hemodialysis population. Clinical outcomes, costs, and cost-effectiveness of the Kidney Disease Improving Global Outcomes (KDIGO) 2018 guidelines-endorsed HCV screening frequency (every 6 months) were compared with less frequent periodic screening (yearly, every 2 years), screening only at hemodialysis initiation, and no screening. We estimated expected quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) between each screening strategy and the next less expensive alternative strategy, from a health care sector perspective, in 2019 US dollars. For each strategy, we modeled an HCV outbreak occurring in 1% of centers. In sensitivity analyses, we varied mortality, linkage to HCV cure, screening method (ribonucleic acid versus antibody testing), test sensitivity, HCV infection rates, and outbreak frequencies. RESULTS: Screening only at hemodialysis initiation yielded HCV cure rates of 79%, with an ICER of $82,739 per QALY saved compared with no testing. Compared with screening at hemodialysis entry only, screening every 2 years increased cure rates to 88% and decreased liver-related deaths by 52%, with an ICER of $140,193. Screening every 6 months had an ICER of $934,757; in sensitivity analyses using a willingness-to-pay threshold of $150,000 per QALY gained, screening every 6 months was never cost-effective. CONCLUSIONS: The KDIGO-recommended HCV screening interval (every 6 months) does not seem to be a cost-effective use of health care resources, suggesting that re-evaluation of less-frequent screening strategies should be considered.


Subject(s)
Hepatitis C, Chronic , Hepatitis C , Humans , Hepacivirus , Cost-Benefit Analysis , Hepatitis C/diagnosis , Hepatitis C/epidemiology , Mass Screening , Renal Dialysis , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/epidemiology , Hepatitis C, Chronic/drug therapy , Antiviral Agents/therapeutic use
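The ICERs reported above compare each screening strategy with the next less expensive alternative. A minimal sketch of that calculation follows; the costs and QALYs are invented placeholders (chosen only to give ratios of a similar order of magnitude), not outputs of the study's microsimulation model.

```python
# Illustrative ICER calculation; all cost and QALY values are placeholders.
strategies = [
    ("no screening",             {"cost": 100_000, "qaly": 6.000}),
    ("screen at dialysis start", {"cost": 100_500, "qaly": 6.006}),
    ("screen every 2 years",     {"cost": 101_300, "qaly": 6.012}),
    ("screen every 6 months",    {"cost": 103_000, "qaly": 6.014}),
]

# ICER of each strategy versus the next less expensive alternative:
#   ICER = (cost_more - cost_less) / (QALY_more - QALY_less)
for (less_name, less), (more_name, more) in zip(strategies, strategies[1:]):
    icer = (more["cost"] - less["cost"]) / (more["qaly"] - less["qaly"])
    print(f"{more_name} vs {less_name}: ${icer:,.0f} per QALY gained")
```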
12.
JAMA ; 332(3): 215-225, 2024 07 16.
Article in English | MEDLINE | ID: mdl-38780515

ABSTRACT

Importance: Recipient outcomes after kidney transplant from deceased donors who received dialysis prior to kidney donation are not well described. Objective: To compare outcomes of transplant recipients who received kidneys from deceased donors who underwent dialysis prior to kidney donation vs recipients of kidneys from deceased donors who did not undergo dialysis. Design, Setting, and Participants: A retrospective cohort study was conducted including data from 58 US organ procurement organizations on deceased kidney donors and kidney transplant recipients. From 2010 to 2018, 805 donors who underwent dialysis prior to kidney donation were identified. The donors who underwent dialysis prior to kidney donation were matched 1:1 with donors who did not undergo dialysis using a rank-based distance matrix algorithm; 1944 kidney transplant recipients were evaluated. Exposure: Kidney transplants from deceased donors who underwent dialysis prior to kidney donation compared with kidney transplants from deceased donors who did not undergo dialysis. Main Outcomes and Measures: The 4 study outcomes were delayed graft function (defined as receipt of dialysis by the kidney recipient ≤1 week after transplant), all-cause graft failure, death-censored graft failure, and death. Results: From 2010 to 2018, 1.4% of deceased kidney donors (805 of 58 155) underwent dialysis prior to kidney donation. Of these 805 individuals, 523 (65%) donated at least 1 kidney. A total of 969 kidneys (60%) were transplanted and 641 kidneys (40%) were discarded. Among the donors with kidneys transplanted, 514 (mean age, 33 years [SD, 10.8 years]; 98 had hypertension [19.1%] and 36 had diabetes [7%]) underwent dialysis prior to donation and were matched with 514 (mean age, 33 years [SD, 10.9 years]; 98 had hypertension [19.1%] and 36 had diabetes [7%]) who did not undergo dialysis. Kidney transplants from donors who received dialysis prior to donation (n = 954 kidney recipients) were associated with a higher risk of delayed graft function compared with kidney transplants from donors who did not receive dialysis (n = 990 kidney recipients) (59.2% vs 24.6%, respectively; adjusted odds ratio, 4.17 [95% CI, 3.28-5.29]). The incidence rates did not significantly differ at a median follow-up of 34.1 months for all-cause graft failure (43.1 per 1000 person-years for kidney transplants from donors who received dialysis prior to donation vs 46.9 per 1000 person-years for kidney transplants from donors who did not receive dialysis; adjusted hazard ratio [HR], 0.90 [95% CI, 0.70-1.15]), for death-censored graft failure (22.5 vs 20.6 per 1000 person-years, respectively; adjusted HR, 1.18 [95% CI, 0.83-1.69]), or for death (24.6 vs 30.8 per 1000 person-years; adjusted HR, 0.76 [95% CI, 0.55-1.04]). Conclusions and Relevance: Compared with receiving a kidney from a deceased donor who did not undergo dialysis, receiving a kidney from a deceased donor who underwent dialysis prior to kidney donation was associated with a significantly higher incidence of delayed graft function, but no significant difference in graft failure or death at follow-up.


Subject(s)
Delayed Graft Function , Kidney Transplantation , Renal Dialysis , Tissue Donors , Humans , Retrospective Studies , Female , Male , Adult , Middle Aged , Delayed Graft Function/epidemiology , Tissue and Organ Procurement
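The donors above were matched 1:1 using a rank-based distance matrix algorithm. As a rough illustration of distance-matrix matching in general (not the study's specific algorithm), the sketch below pairs exposed donors with controls by minimizing total Mahalanobis distance via optimal assignment; the covariates and data are synthetic assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.optimize import linear_sum_assignment

# Synthetic donor covariates (e.g., age, creatinine, other KDPI components);
# values are invented for illustration.
rng = np.random.default_rng(2)
exposed  = rng.normal(size=(50, 3))    # donors who received dialysis
controls = rng.normal(size=(400, 3))   # candidate donors who did not

# Mahalanobis distance matrix between every exposed donor and every control.
cov_inv = np.linalg.inv(np.cov(np.vstack([exposed, controls]), rowvar=False))
dist = cdist(exposed, controls, metric="mahalanobis", VI=cov_inv)

# Optimal 1:1 assignment minimizes the total matched distance.
row_idx, col_idx = linear_sum_assignment(dist)
print(f"Matched {len(row_idx)} pairs; "
      f"mean matched distance = {dist[row_idx, col_idx].mean():.2f}")
```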
13.
Kidney Int ; 103(2): 239-242, 2023 02.
Article in English | MEDLINE | ID: mdl-36332727

ABSTRACT

Over the past year, 3 scientific teams conducted experiments transplanting genetically edited porcine organs into human recipients, 3 of whom were deceased and 1 of whom was living. In this editorial, we describe challenges for the design of initial xenotransplantation clinical trials and focus on patient selection, consent, and requisite post-transplant follow-up. Given the uncertain clinical benefit of xenotransplantation, we propose that patient selection criteria might include novel elements, such as approaching patients who have a low quality of life and a strong aversion to continued dialysis therapy. We set expectations related to the importance of informing and protecting family members and medical teams who could be exposed to zoonotic viral infection from the donor organ and/or receive unwanted publicity. Meeting these challenges in trial design and oversight will require multidisciplinary expertise, a conceptual model that extends beyond the individual patient, and creative collaboration between scientists and regulatory agencies.


Subject(s)
Quality of Life , Renal Dialysis , Humans , Animals , Swine , Transplantation, Heterologous , Kidney , Tissue Donors
14.
Kidney Int ; 103(4): 762-771, 2023 04.
Article in English | MEDLINE | ID: mdl-36549364

ABSTRACT

Although hypothermic machine perfusion (HMP) is associated with improved kidney graft viability and function, the underlying biological mechanisms are unknown. Untargeted metabolomic profiling may identify potential metabolites and pathways that can help assess allograft viability and contribute to organ preservation. Therefore, in this multicenter study, we measured all detectable metabolites in perfusate collected at the beginning and end of deceased-donor kidney perfusion and evaluated their associations with graft failure. In our cohort of 190 kidney transplants, 33 (17%) had death-censored graft failure over a median follow-up of 5.0 years (IQR 3.0-6.1 years). We identified 553 known metabolites in perfusate and characterized their experimental and biological consistency through duplicate samples and unsupervised clustering. After perfusion-time adjustment and false discovery correction, six metabolites in post-HMP perfusate were significantly associated with death-censored graft failure, including alpha-ketoglutarate, 3-carboxy-4-methyl-5-propyl-2-furanpropanoate, 1-carboxyethylphenylalanine, and three glycerol-phosphatidylcholines. All six metabolites were associated with an increased risk of graft failure (hazard ratios per median absolute deviation ranging from 1.04 to 1.45). Four of six metabolites also demonstrated significant interaction with donation after cardiac death, with notably greater risk in the donation after cardiac death group (hazard ratios up to 1.69). Discarded kidneys did not have significantly different levels of any death-censored graft failure-associated metabolites. On pathway analysis, production of reactive oxygen species and increased metabolism of fatty acids were upregulated in kidneys that subsequently developed death-censored graft failure. Thus, further understanding the role of these metabolites may inform the HMP process and help improve the objective evaluation of allograft offers, thereby reducing the discard of potentially viable organs.


Subject(s)
Kidney Transplantation , Kidney , Humans , Kidney Transplantation/adverse effects , Perfusion , Tissue Donors , Death , Allografts , Graft Survival
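The abstract notes a false discovery correction across the 553 tested metabolites but does not name the procedure; the Benjamini-Hochberg step-up method sketched below is a common choice and is shown here only as an assumption. The reported hazard ratios are scaled per median absolute deviation, which is typically achieved by dividing each metabolite by its MAD before model fitting so that effect sizes are comparable across metabolites.

```python
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Return a boolean array marking which hypotheses are discoveries under
    Benjamini-Hochberg false-discovery-rate control at level `alpha`."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds            # step-up comparison on sorted p-values
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    keep = np.zeros(m, dtype=bool)
    keep[order[:k]] = True                    # reject the k smallest p-values
    return keep

# Example: p-values for a handful of metabolite-graft-failure associations
# (invented numbers; the study tested 553 metabolites).
p_vals = [0.0004, 0.003, 0.012, 0.04, 0.20, 0.55]
print(benjamini_hochberg(p_vals, alpha=0.05))
```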
15.
Am J Transplant ; 23(9): 1290-1299, 2023 09.
Article in English | MEDLINE | ID: mdl-37217005

ABSTRACT

In June 2022, the US Food and Drug Administration Center for Biologics Evaluation and Research held the 73rd meeting of the Cellular, Tissue, and Gene Therapies Advisory Committee for public discussion of regulatory expectations for xenotransplantation products. The members of a joint American Society of Transplant Surgeons/American Society of Transplantation committee on xenotransplantation compiled a meeting summary focusing on 7 topics believed to be key by the committee: (1) preclinical evidence supporting progression to a clinical trial, (2) porcine kidney function, (3) ethical aspects, (4) design of initial clinical trials, (5) infectious disease issues, (6) industry perspectives, and (7) regulatory oversight.


Subject(s)
Motivation , Surgeons , United States , Animals , Swine , Humans , Transplantation, Heterologous , United States Food and Drug Administration
16.
Am J Transplant ; 23(3): 316-325, 2023 03.
Article in English | MEDLINE | ID: mdl-36906294

ABSTRACT

Solid organ transplantation provides the best treatment for end-stage organ failure, but significant sex-based disparities in transplant access exist. On June 25, 2021, a virtual multidisciplinary conference was convened to address sex-based disparities in transplantation. Common themes contributing to sex-based disparities were noted across kidney, liver, heart, and lung transplantation, specifically the existence of barriers to referral and wait listing for women, the pitfalls of using serum creatinine, the issue of donor/recipient size mismatch, approaches to frailty and a higher prevalence of allosensitization among women. In addition, actionable solutions to improve access to transplantation were identified, including alterations to the current allocation system, surgical interventions on donor organs, and the incorporation of objective frailty metrics into the evaluation process. Key knowledge gaps and high-priority areas for future investigation were also discussed.


Subject(s)
Frailty , Organ Transplantation , Tissue and Organ Procurement , Female , Humans , Healthcare Disparities , Kidney , Tissue Donors , United States , Waiting Lists
17.
J Urol ; 209(5): 971-980, 2023 05.
Article in English | MEDLINE | ID: mdl-36648152

ABSTRACT

PURPOSE: The STudy to Enhance uNderstanding of sTent-associated Symptoms sought to identify risk factors for pain and urinary symptoms, as well as how these symptoms interfere with daily activities, after ureteroscopy for stone treatment. MATERIALS AND METHODS: This prospective observational cohort study enrolled patients aged ≥12 years undergoing ureteroscopy with ureteral stent placement for stone treatment at 4 clinical centers. Participants reported symptoms at baseline; on postoperative days 1, 3, and 5; at stent removal; and 30 days after stent removal. Outcomes of pain intensity, pain interference, urinary symptoms, and bother were captured with multiple instruments. Multivariable analyses using mixed-effects linear regression models identified characteristics associated with increased stent-associated symptoms. RESULTS: A total of 424 participants were enrolled. Mean age was 49 years (SD 17); 47% were female. Participants experienced a marked increase in stent-associated symptoms on postoperative day 1. While pain intensity decreased ∼50% from postoperative day 1 to postoperative day 5, interference due to pain remained persistently elevated. In multivariable analysis, older age was associated with lower pain intensity (P = .004). Having chronic pain conditions (P < .001), prior severe stent pain (P = .021), and depressive symptoms at baseline (P < .001) were each associated with higher pain intensity. Neither sex, stone location, ureteral access sheath use, nor stent characteristics were drivers of stent-associated symptoms. CONCLUSIONS: In this multicenter cohort, interference persisted even as pain intensity decreased. Patient factors (eg, age, depression) rather than surgical factors were associated with symptom intensity. These findings provide a foundation for patient-centered care and highlight potential targets for efforts to mitigate the burden of stent-associated symptoms.


Subject(s)
Ureteral Calculi , Urinary Calculi , Urolithiasis , Humans , Female , Middle Aged , Male , Ureteroscopy/adverse effects , Ureteroscopy/methods , Ureteral Calculi/surgery , Prospective Studies , Urinary Calculi/surgery , Urinary Calculi/etiology , Urolithiasis/etiology , Stents/adverse effects , Pain, Postoperative/etiology , Risk Factors
18.
Am J Kidney Dis ; 2023 Nov 20.
Article in English | MEDLINE | ID: mdl-37992981

ABSTRACT

Two of the greatest challenges facing kidney transplantation are the lack of donated organs and inequities in who receives a transplant. Xenotransplantation holds promise as a treatment approach that could solve the supply problem. Major advances in gene-editing procedures have enabled several companies to raise genetically engineered pigs for organ donation. These porcine organs lack antigens and have other modifications that should reduce the probability of immunological rejection when transplanted into humans. The US Food and Drug Administration and transplantation leaders are starting to chart a path to test xenotransplants in clinical trials and later integrate them into routine clinical care. Here we provide a framework that industry, regulatory authorities, payers, transplantation professionals, and patient groups can implement to promote equity during every stage in this process. We also call for immediate action. Companies developing xenotransplant technology should assemble patient advocacy boards to bring the concerns of individuals with end-stage kidney disease to the forefront. For trials, xenotransplantation companies should partner with transplant programs with substantial patient populations of racial and ethnic minority groups and that have reciprocal relationships with those communities. Those companies and transplant programs should reach out now to those communities to inform them about xenotransplantation and try to address their concerns. These actions have the potential to make these communities full partners in the promise of xenotransplantation.

19.
Am J Kidney Dis ; 81(2): 222-231.e1, 2023 02.
Article in English | MEDLINE | ID: mdl-36191727

ABSTRACT

RATIONALE & OBJECTIVE: Donor acute kidney injury (AKI) activates innate immunity, enhances HLA expression in the kidney allograft, and provokes recipient alloimmune responses. We hypothesized that injury and inflammation that manifested in deceased-donor urine biomarkers would be associated with higher rates of biopsy-proven acute rejection (BPAR) and allograft failure after transplantation. STUDY DESIGN: Prospective cohort. SETTING & PARTICIPANTS: 862 deceased donors for 1,137 kidney recipients at 13 centers. EXPOSURES: We measured concentrations of interleukin 18 (IL-18), kidney injury molecule 1 (KIM-1), and neutrophil gelatinase-associated lipocalin (NGAL) in deceased donor urine. We also used the Acute Kidney Injury Network (AKIN) criteria to assess donor clinical AKI. OUTCOMES: The primary outcome was a composite of BPAR and graft failure (not from death). A secondary outcome was the composite of BPAR, graft failure, and/or de novo donor-specific antibody (DSA). Outcomes were ascertained in the first posttransplant year. ANALYTICAL APPROACH: Multivariable Fine-Gray models with death as a competing risk. RESULTS: Mean recipient age was 54 ± 13 (SD) years, and 82% received antithymocyte globulin. We found no significant associations between donor urinary IL-18, KIM-1, and NGAL and the primary outcome (subdistribution hazard ratio [HR] for highest vs lowest tertile of 0.76 [95% CI, 0.45-1.28], 1.20 [95% CI, 0.69-2.07], and 1.14 [95% CI, 0.71-1.84], respectively). In secondary analyses, we detected no significant associations between clinically defined AKI and the primary outcome or between donor biomarkers and the composite outcome of BPAR, graft failure, and/or de novo DSA. LIMITATIONS: BPAR was ascertained through for-cause biopsies, not surveillance biopsies. CONCLUSIONS: In a large cohort of kidney recipients who almost all received induction with thymoglobulin, donor injury biomarkers were associated with neither graft failure and rejection nor a secondary outcome that included de novo DSA. These findings provide some reassurance that centers can successfully manage immunological complications using deceased-donor kidneys with AKI.


Subject(s)
Acute Kidney Injury , Kidney Transplantation , Humans , Adult , Middle Aged , Aged , Lipocalin-2 , Interleukin-18 , Prospective Studies , Acute Kidney Injury/pathology , Tissue Donors , Biomarkers , Graft Rejection/epidemiology , Graft Survival
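The primary analysis above uses Fine-Gray subdistribution-hazard models with death as a competing risk. Fine-Gray regression is not part of the common Python survival libraries, so as a lighter illustration of the competing-risks setup, the sketch below estimates the cumulative incidence of the primary event with the Aalen-Johansen estimator from lifelines on synthetic data; the event codes, sample size, and distributions are invented, and the lifelines package is assumed to be installed.

```python
import numpy as np
from lifelines import AalenJohansenFitter

# Competing-risks illustration: primary event (BPAR or graft failure) coded 1,
# death as a competing event coded 2, and 0 for administrative censoring at 1 year.
rng = np.random.default_rng(3)
n = 500
durations = rng.exponential(scale=400, size=n).clip(max=365)        # days of follow-up
events = np.where(durations < 365, rng.choice([1, 2], size=n, p=[0.7, 0.3]), 0)

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail())   # cumulative incidence of the primary event
```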
20.
Hepatology ; 76(3): 700-711, 2022 09.
Article in English | MEDLINE | ID: mdl-35278226

ABSTRACT

BACKGROUND AND AIMS: Cirrhosis is a major cause of death and is associated with extensive health care use. Patients with cirrhosis face complex treatment choices due to risks of morbidity and mortality. Optimally counseling and treating patients with cirrhosis requires tools to predict their longer-term liver-related survival. We sought to develop and validate a risk score to predict longer-term survival of patients with cirrhosis. APPROACH AND RESULTS: We conducted a retrospective cohort study of adults with cirrhosis and no major life-limiting comorbidities. Adults with cirrhosis within the Veterans Health Administration were used for model training and internal validation, and external validation used the OneFlorida Clinical Research Consortium. We used four model-building approaches including variables predictive of cirrhosis-related mortality, focusing on discrimination at key time points (1, 3, 5, and 10 years). Among 30,263 patients with cirrhosis ≤75 years old without major life-limiting comorbidities and complete laboratory data during the baseline period, the boosted survival tree models had the highest discrimination, with 1-year, 3-year, 5-year, and 10-year discrimination of 0.77, 0.81, 0.84, and 0.88, respectively. The 1-year, 3-year, and 5-year discrimination was nearly identical in external validation. Secondary analyses with imputation of missing data and subgroups by etiology of liver disease had similar results to the primary model. CONCLUSIONS: We developed and validated (internally and externally) a risk score to predict longer-term survival of patients with cirrhosis. This score would transform management of patients with cirrhosis in terms of referral to specialty care and treatment decision-making for non-liver-related care.


Subject(s)
Liver Cirrhosis , Adult , Aged , Humans , Prognosis , Retrospective Studies , Risk Factors , Survival Rate
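The abstract above evaluates boosted survival tree models by their discrimination at fixed horizons. A minimal sketch of that style of evaluation using scikit-survival follows; the synthetic data, features, and tuning are assumptions for illustration and do not reproduce the study's Veterans Health Administration or OneFlorida models.

```python
import numpy as np
from sksurv.ensemble import GradientBoostingSurvivalAnalysis
from sksurv.metrics import cumulative_dynamic_auc
from sksurv.util import Surv

# Synthetic survival data (illustration only; not VA or OneFlorida data).
rng = np.random.default_rng(4)
n = 1000
X = rng.normal(size=(n, 5))
linear_risk = X @ np.array([0.8, 0.5, 0.0, -0.4, 0.3])
event_time = rng.exponential(scale=8.0 * np.exp(-linear_risk))   # years to death
censor_time = rng.uniform(0.5, 12.0, size=n)
observed_time = np.minimum(event_time, censor_time)
event = event_time <= censor_time
y = Surv.from_arrays(event=event, time=observed_time)

# Fit a boosted survival model on a training split.
train, test = slice(0, 700), slice(700, n)
model = GradientBoostingSurvivalAnalysis(n_estimators=200, learning_rate=0.05,
                                         random_state=0)
model.fit(X[train], y[train])

# Discrimination (time-dependent AUC) at the horizons highlighted in the abstract.
horizons = np.array([1.0, 3.0, 5.0, 10.0])
auc, mean_auc = cumulative_dynamic_auc(y[train], y[test],
                                       model.predict(X[test]), horizons)
for t, a in zip(horizons, auc):
    print(f"AUC at {t:>4.0f} years: {a:.2f}")
```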