Results 1 - 20 of 57
1.
Am J Kidney Dis ; 81(2): 222-231.e1, 2023 02.
Article in English | MEDLINE | ID: mdl-36191727

ABSTRACT

RATIONALE & OBJECTIVE: Donor acute kidney injury (AKI) activates innate immunity, enhances HLA expression in the kidney allograft, and provokes recipient alloimmune responses. We hypothesized that injury and inflammation that manifested in deceased-donor urine biomarkers would be associated with higher rates of biopsy-proven acute rejection (BPAR) and allograft failure after transplantation. STUDY DESIGN: Prospective cohort. SETTING & PARTICIPANTS: 862 deceased donors for 1,137 kidney recipients at 13 centers. EXPOSURES: We measured concentrations of interleukin 18 (IL-18), kidney injury molecule 1 (KIM-1), and neutrophil gelatinase-associated lipocalin (NGAL) in deceased donor urine. We also used the Acute Kidney Injury Network (AKIN) criteria to assess donor clinical AKI. OUTCOMES: The primary outcome was a composite of BPAR and graft failure (not from death). A secondary outcome was the composite of BPAR, graft failure, and/or de novo donor-specific antibody (DSA). Outcomes were ascertained in the first posttransplant year. ANALYTICAL APPROACH: Multivariable Fine-Gray models with death as a competing risk. RESULTS: Mean recipient age was 54 ± 13 (SD) years, and 82% received antithymocyte globulin. We found no significant associations between donor urinary IL-18, KIM-1, and NGAL and the primary outcome (subdistribution hazard ratio [HR] for highest vs lowest tertile of 0.76 [95% CI, 0.45-1.28], 1.20 [95% CI, 0.69-2.07], and 1.14 [95% CI, 0.71-1.84], respectively). In secondary analyses, we detected no significant associations between clinically defined AKI and the primary outcome or between donor biomarkers and the composite outcome of BPAR, graft failure, and/or de novo DSA. LIMITATIONS: BPAR was ascertained through for-cause biopsies, not surveillance biopsies. 
CONCLUSIONS: In a large cohort of kidney recipients, nearly all of whom received induction with antithymocyte globulin, donor injury biomarkers were associated with neither the primary composite of rejection and graft failure nor the secondary outcome that included de novo DSA. These findings provide some reassurance that centers can successfully manage the immunological risks of transplanting deceased-donor kidneys with AKI.


Subject(s)
Acute Kidney Injury , Kidney Transplantation , Humans , Adult , Middle Aged , Aged , Lipocalin-2 , Interleukin-18 , Prospective Studies , Acute Kidney Injury/pathology , Tissue Donors , Biomarkers , Graft Rejection/epidemiology , Graft Survival
2.
Clin Transplant ; 37(5): e14947, 2023 05.
Article in English | MEDLINE | ID: mdl-36811329

ABSTRACT

BACKGROUND: Early post-kidney transplantation (KT) changes in physiology, medications, and health stressors likely impact body mass index (BMI) and, in turn, all-cause graft loss and mortality. METHODS: We estimated 5-year post-KT BMI trajectories (n = 151,170; SRTR) using an adjusted mixed effects model. We estimated long-term mortality and graft loss risks by 1-year BMI change quartile (decrease [1st quartile]: change < -0.07 kg/m2/month; stable [2nd quartile]: -0.07 ≤ change ≤ 0.09 kg/m2/month; increase [3rd, 4th quartiles]: change > 0.09 kg/m2/month) using adjusted Cox proportional hazards models. RESULTS: BMI increased in the 3 years post-KT (0.64 kg/m2/year, 95% CI: 0.63-0.64) and decreased in years 3-5 (-0.24 kg/m2/year, 95% CI: -0.26 to -0.22). A 1-year post-KT BMI decrease was associated with elevated risks of all-cause mortality (aHR = 1.13, 95% CI: 1.10-1.16), all-cause graft loss (aHR = 1.13, 95% CI: 1.10-1.15), death-censored graft loss (aHR = 1.15, 95% CI: 1.11-1.19), and mortality with a functioning graft (aHR = 1.11, 95% CI: 1.08-1.14). Among recipients with obesity (pre-KT BMI ≥ 30 kg/m2), BMI increase was associated with higher risks of all-cause mortality (aHR = 1.09, 95% CI: 1.05-1.14), all-cause graft loss (aHR = 1.05, 95% CI: 1.01-1.09), and mortality with a functioning graft (aHR = 1.10, 95% CI: 1.05-1.15), but not death-censored graft loss, relative to stable weight. Among individuals without obesity, BMI increase was associated with lower risks of all-cause graft loss (aHR = 0.97, 95% CI: 0.95-0.99) and death-censored graft loss (aHR = 0.93, 95% CI: 0.90-0.96), but not all-cause mortality or mortality with a functioning graft. CONCLUSIONS: BMI increases in the 3 years post-KT, then decreases in years 3-5. BMI loss in all adult KT recipients and BMI gain in those with obesity should be carefully monitored post-KT.
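The quartile-based categorization of 1-year BMI change described above can be sketched as a simple threshold function. This is an illustrative sketch, not the authors' code; the function name is ours, and the cutoffs are the reported quartile boundaries (-0.07 and 0.09 kg/m2/month):

```python
def bmi_change_category(delta_per_month: float) -> str:
    """Classify 1-year post-KT BMI change (kg/m^2 per month) into the
    reported quartile-based categories: decrease (1st quartile),
    stable (2nd quartile), or increase (3rd-4th quartiles)."""
    if delta_per_month < -0.07:
        return "decrease"   # 1st quartile
    elif delta_per_month <= 0.09:
        return "stable"     # 2nd quartile
    else:
        return "increase"   # 3rd and 4th quartiles

# Example: a recipient losing 0.1 kg/m^2 per month falls in the
# "decrease" category, the group with elevated mortality risk.
```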


Subject(s)
Kidney Transplantation , Adult , Humans , Kidney Transplantation/adverse effects , Risk Factors , Body Mass Index , Treatment Outcome , Obesity/surgery , Graft Survival
3.
Am J Transplant ; 21(3): 958-967, 2021 03.
Article in English | MEDLINE | ID: mdl-33151614

ABSTRACT

Kidney transplantation prior to dialysis, known as "preemptive transplant," enables patients to live longer and avoid the substantial quality of life burdens of chronic dialysis. Deceased donor kidneys are a public resource that ought to provide health benefits equitably. Unfortunately, White, better educated, and privately insured patients enjoy disproportionate access to preemptive transplantation using deceased donor kidneys. This problem has persisted for decades and is exacerbated by the first-come, first-served approach to kidney allocation for predialysis patients. In this Personal Viewpoint, we describe the diverse barriers to preemptive waitlisting and kidney transplantation. The analysis focuses on healthcare system features that particularly disadvantage Black patients, such as the waitlisting eligibility criterion of a single glomerular filtration rate or creatinine clearance ≤20 ml/min, and neglect of wide variation in the rate of progression to end-stage kidney disease (ESKD) in allocating preemptive transplants. We propose initiatives to improve equity, including: (1) standardization of waitlisting eligibility criteria related to kidney function; (2) aggressive education for clinicians about early transplant referral; (3) innovations in electronic medical record capabilities; and (4) rapid status 7 listing by centers. If those initiatives fail, the transplant field should consider eliminating preemptive waitlisting and transplantation with deceased donor kidneys.


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Humans , Kidney , Kidney Failure, Chronic/surgery , Quality of Life , Waiting Lists
4.
Nephrol Dial Transplant ; 36(10): 1927-1936, 2021 09 27.
Article in English | MEDLINE | ID: mdl-33895851

ABSTRACT

BACKGROUND: Weight loss before kidney transplant (KT) is a known risk factor for weight gain and mortality; however, while unintentional weight loss is a marker of vulnerability, intentional weight loss might improve health. We tested whether pre-KT unintentional and intentional weight loss have differing associations with post-KT weight gain, graft loss and mortality. METHODS: Among 919 KT recipients from a prospective cohort study, we used adjusted mixed-effects models to estimate post-KT BMI trajectories, and Cox models to estimate death-uncensored graft loss, death-censored graft loss and all-cause mortality by 1-year pre-KT weight change category [stable weight (change within ±5%), intentional weight loss (loss > 5%), unintentional weight loss (loss > 5%) and weight gain (gain > 5%)]. RESULTS: The mean age was 53 years, 38% were Black and 40% were female. In the pre-KT year, 62% of recipients had stable weight, 15% had weight gain, 14% had unintentional weight loss and 10% had intentional weight loss. In the first 3 years post-KT, BMI increases were similar among those with pre-KT weight gain and intentional weight loss, and lower compared with those with unintentional weight loss {difference +0.79 kg/m2/year [95% confidence interval (CI) 0.50-1.08], P < 0.001}. Only unintentional weight loss was independently associated with higher death-uncensored graft loss [adjusted hazard ratio (aHR) 1.80 (95% CI 1.23-2.62)], death-censored graft loss [aHR 1.91 (95% CI 1.12-3.26)] and mortality [aHR 1.72 (95% CI 1.06-2.79)] relative to stable pre-KT weight. CONCLUSIONS: This study suggests that unintentional, but not intentional, pre-KT weight loss is an independent risk factor for adverse post-KT outcomes.


Subject(s)
Kidney Transplantation , Female , Graft Survival , Humans , Kidney Transplantation/adverse effects , Middle Aged , Prospective Studies , Risk Factors , Transplant Recipients , Weight Loss
5.
Clin Transplant ; 35(11): e14437, 2021 11.
Article in English | MEDLINE | ID: mdl-34297878

ABSTRACT

The coronavirus disease 2019 (COVID-19) pandemic has created unprecedented challenges for solid organ transplant programs. While transplant activity has largely recovered, appropriate management of deceased donor candidates who are asymptomatic but have positive nucleic acid testing (NAT) for SARS-CoV-2 is unclear, as this result may reflect active infection or prolonged viral shedding. Furthermore, candidates who are unvaccinated or partially vaccinated continue to receive donor offers. In the absence of robust outcomes data, transplant professionals at US adult kidney transplant centers were surveyed (February 13, 2021 to April 29, 2021) to determine community practice (N: 92 centers, capturing 41% of centers and 57% of transplants performed). The majority (97%) of responding centers declined organs for asymptomatic NAT+ patients without documented prior infection. However, 32% of centers proceed with kidney transplant in NAT+ patients who were at least 30 days from initial diagnosis with negative chest imaging. Less than 7% of programs reported inactivating patients who were unvaccinated or partially vaccinated. In conclusion, despite national recommendations to wait for negative testing, many centers are proceeding with kidney transplant in patients with positive SARS-CoV-2 NAT results due to presumed viral shedding. Furthermore, few centers are requiring COVID-19 vaccination prior to transplantation at this time.


Subject(s)
COVID-19 , Adult , Asymptomatic Infections , COVID-19 Vaccines , Humans , SARS-CoV-2 , Vaccination
6.
BMC Nephrol ; 22(1): 26, 2021 01 12.
Article in English | MEDLINE | ID: mdl-33435916

ABSTRACT

BACKGROUND: Post-transplant erythrocytosis (PTE) has not been studied in large recent cohorts. In this study, we evaluated the incidence, risk factors, and outcomes of PTE under current transplant practices, using the present World Health Organization criteria to define erythrocytosis. We also tested the hypothesis that the risk of PTE is greater with higher-quality kidneys. METHODS: We utilized the Deceased Donor Study, an ongoing, multicenter, observational study of deceased donors and their kidney recipients transplanted between 2010 and 2013 across 13 centers. Erythrocytosis is defined by hemoglobin > 16.5 g/dL in men and > 16 g/dL in women. Kidney quality is measured by the Kidney Donor Profile Index (KDPI). RESULTS: Of the 1123 recipients qualified to be in this study, PTE was observed at a median of 18 months in 75 (6.6%) recipients. Compared to recipients without PTE, those with PTE were younger [mean 48 ± 11 vs 54 ± 13 years, p < 0.001], more likely to have polycystic kidney disease [17% vs 6%, p < 0.001], to have received kidneys from younger donors [36 ± 13 vs 41 ± 15 years], and to be on RAAS inhibitors [35% vs 22%, p < 0.001]. Recipients with PTE were less likely to have received kidneys from donors with hypertension [16% vs 32%, p = 0.004], diabetes [1% vs 11%, p = 0.008], or a cerebrovascular event [24% vs 36%, p = 0.036]. Higher KDPI was associated with decreased PTE risk [HR 0.98 (95% CI: 0.97-0.99)]. Over 60 months of follow-up, only 17 (36%) recipients had sustained PTE. There was no association between PTE and graft failure or mortality. CONCLUSIONS: The incidence of PTE was low in our study, and PTE resolved in the majority of patients. Lower KDPI was associated with increased PTE risk. The underutilization of RAAS inhibitors in PTE patients raises the possibility of under-recognition of this phenomenon and should be explored in future studies.


Subject(s)
Kidney Transplantation , Polycythemia/epidemiology , Postoperative Complications/epidemiology , Adult , Female , Humans , Incidence , Male , Middle Aged , Risk Factors , Tissue Donors
7.
Nephrol Dial Transplant ; 35(7): 1099-1112, 2020 07 01.
Article in English | MEDLINE | ID: mdl-32191296

ABSTRACT

The construct of frailty was first developed in gerontology to help identify older adults with increased vulnerability when confronted with a health stressor. This article is a review of studies in which frailty has been applied to pre- and post-kidney transplantation (KT) populations. Although KT is the optimal treatment for end-stage kidney disease (ESKD), KT candidates often must overcome numerous health challenges associated with ESKD before receiving KT. After KT, the impacts of surgery and immunosuppression represent additional health stressors that disproportionately impact individuals with frailty. Frailty metrics could improve the ability to identify KT candidates and recipients at risk for adverse health outcomes and those who could potentially benefit from interventions to improve their frail status. The Physical Frailty Phenotype (PFP) is the most commonly used frailty metric in ESKD research, and KT recipients who are frail at KT (~20% of recipients) are twice as likely to die as nonfrail recipients. In addition to the PFP, many other metrics are currently used to assess pre- and post-KT vulnerability in research and clinical practice, underscoring the need for a disease-specific frailty metric that can be used to monitor KT candidates and recipients. Although frailty is an independent risk factor for post-transplant adverse outcomes, it is not factored into the current transplant program risk-adjustment equations. Future studies are needed to explore pre- and post-KT interventions to improve or prevent frailty.


Subject(s)
Frailty/physiopathology , Kidney Failure, Chronic/surgery , Kidney Transplantation/standards , Aged , Humans , Risk Factors
8.
Transpl Infect Dis ; 22(2): e13253, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31994821

ABSTRACT

BACKGROUND: HIV-positive kidney transplant (KT) recipients have similar outcomes to HIV-negative recipients. However, HIV-positive patients with advanced kidney disease might face additional barriers to initiating the KT evaluation process. We sought to characterize comorbidities, viral control and management, viral resistance, and KT evaluation appointment rates in a cohort of KT evaluation-eligible HIV-positive patients. METHODS: We included patients seen between January 1, 2008, and December 31, 2015, at a primary care HIV clinic who met KT evaluation eligibility by an estimated glomerular filtration rate ≤20 mL/min/1.73 m2 or dialysis dependence. The primary outcome was a documented appointment for KT evaluation. RESULTS: Of 3735 patients evaluated at the HIV primary clinic during the study period, 42 (1.6%) were KT evaluation-eligible. The median age was 47 years, 77% were male, and 95% were black. Median CD4 count was 328 cells/mm3 (IQR 175-461). Among the 63% with an antiretroviral therapy (ART) prescription, 40% had viral loads >200 copies. Among patients with HIV resistance profiles (50%, n = 21), 52% had resistance to at least one class of ART. A majority (60%, n = 25) were scheduled for a KT evaluation appointment, but of those, only 8% (n = 2) had evidence of appointments before dialysis dependence. Those without appointments had more schizophrenia (29% vs 4%, P = .02), resistance (78% vs 33%, P = .04), ART prescription (76% vs 48%, P = .04), and more kidney disease of unknown etiology (53% vs 8%, P = .02). CONCLUSION: Kidney transplant evaluation-eligible HIV-positive patients had a high rate of evaluation appointments, but a low rate of preemptive evaluation appointments. Schizophrenia and viral resistance disproportionately affected patients without evaluation appointments.
These data precede the recommendation for universal ART for all HIV+ patients, regardless of CD4 count and viral load, and must be interpreted in the context of this limitation.


Subject(s)
Eligibility Determination , HIV Infections/complications , Kidney Diseases/virology , Kidney Transplantation/adverse effects , Adult , Anti-Retroviral Agents/therapeutic use , CD4 Lymphocyte Count , Electronic Health Records , Female , Glomerular Filtration Rate , HIV Infections/drug therapy , Humans , Kidney Diseases/complications , Kidney Transplantation/standards , Male , Middle Aged , Retrospective Studies , Viral Load
9.
J Ren Nutr ; 30(6): 561-566, 2020 11.
Article in English | MEDLINE | ID: mdl-32144072

ABSTRACT

OBJECTIVES: Over 40% of individuals in the United States with end-stage kidney disease have obesity. Little is known about renal dietitian perspectives on obesity management in the setting of dialysis dependence. DESIGN AND METHODS: An online 21-item survey was distributed to 118 renal dietitians via individual outreach and a professional organization e-mail listserv. Four themes were explored: the burden of obesity among dialysis patients, concepts of healthy weight loss, weight loss approaches, and challenges of obesity management in dialysis settings. Respondents were asked to rank approaches and biomarkers for obesity management from 0 (least important or not used) to 100 (most important). Free text fields were provided in each category for additional comments. RESULTS: Thirty-one renal dietitians responded to the survey (26% response rate). The majority of respondents (90%) indicated that access to kidney transplantation was the main reason that dialysis patients with obesity desired weight loss. Calorie restriction was rated as the most common weight loss approach, and dry weight as the most important weight loss biomarker. Nearly 40% of respondents do not alter their nutritional approach when dialysis patients with obesity are losing weight, and 42% of respondents do not monitor changes in waist circumference. Exercise, diet counseling, and stress management were variably prioritized as weight loss management strategies. Barriers to obesity management in dialysis settings included lack of time, lack of training in weight loss counseling, and gaps in current renal nutritional guidelines. CONCLUSION: Despite the high prevalence of obesity among individuals with end-stage kidney disease, the results of this survey suggest that current approaches to obesity management in dialysis settings are highly variable. Many renal dietitians lack time to counsel patients on healthy weight loss strategies. 
Nutritional guidelines are also needed to support people with dialysis dependence and obesity who desire or require weight loss.


Subject(s)
Dietetics/methods , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/therapy , Obesity Management/methods , Obesity/complications , Obesity/therapy , Renal Dialysis , Adult , Female , Humans , Male , Middle Aged , Pilot Projects
10.
Kidney Int ; 95(1): 199-209, 2019 01.
Article in English | MEDLINE | ID: mdl-30470437

ABSTRACT

Deceased-donor acute kidney injury (AKI) is associated with organ discard and delayed graft function, but data on longer-term allograft survival are limited. We performed a multicenter study to determine associations between donor AKI (from none to severe based on AKI Network stages) and all-cause graft failure, adjusting for donor, transplant, and recipient factors. We examined whether any of the following factors modified the relationship between donor AKI and graft survival: kidney donor profile index, cold ischemia time, donation after cardiac death, expanded-criteria donation, kidney machine perfusion, donor-recipient gender combinations, or delayed graft function. We also evaluated the association between donor AKI and a 3-year composite outcome of all-cause graft failure or estimated glomerular filtration rate ≤ 20 mL/min/1.73 m2 in a subcohort of 30% of recipients. Among 2,430 kidneys transplanted from 1,298 deceased donors, 585 (24%) were from donors with AKI. Over a median follow-up of 4.0 years, there were no significant differences in graft survival by donor AKI stage. We found no evidence that pre-specified variables modified the effect of donor AKI on graft survival. In the subcohort, donor AKI was not associated with the 3-year composite outcome. Donor AKI was not associated with graft failure in this well-phenotyped cohort. Given the organ shortage, the transplant community should consider measures to increase utilization of kidneys from deceased donors with AKI.


Subject(s)
Acute Kidney Injury/physiopathology , Graft Rejection/epidemiology , Kidney Transplantation/adverse effects , Tissue and Organ Procurement/standards , Adult , Aged , Allografts/physiopathology , Allografts/supply & distribution , Female , Follow-Up Studies , Glomerular Filtration Rate/physiology , Graft Rejection/physiopathology , Graft Survival , Humans , Kidney/physiopathology , Kidney Transplantation/methods , Longitudinal Studies , Male , Middle Aged , Time Factors , Tissue Donors , Tissue and Organ Procurement/methods , Transplantation, Homologous/adverse effects , Transplantation, Homologous/methods , Treatment Outcome
11.
Am J Transplant ; 19(4): 984-994, 2019 04.
Article in English | MEDLINE | ID: mdl-30506632

ABSTRACT

A consensus conference on frailty in kidney, liver, heart, and lung transplantation sponsored by the American Society of Transplantation (AST) and endorsed by the American Society of Nephrology (ASN), the American Society of Transplant Surgeons (ASTS), and the Canadian Society of Transplantation (CST) took place on February 11, 2018 in Phoenix, Arizona. Input from the transplant community through scheduled conference calls enabled wide discussion of current concepts in frailty, exploration of best practices for frailty risk assessment of transplant candidates and for management after transplant, and development of ideas for future research. A current understanding of frailty was compiled by each of the solid organ groups and is presented in this paper. Frailty is a common entity in patients with end-stage organ disease who are awaiting organ transplantation, and affects mortality on the waitlist and in the posttransplant period. The optimal methods by which frailty should be measured in each organ group are yet to be determined, but studies are underway. Interventions to reverse frailty vary among organ groups and appear promising. This conference achieved its intent to highlight the importance of frailty in organ transplantation and to plant the seeds for further discussion and research in this field.


Subject(s)
Frailty , Organ Transplantation , Societies, Medical , Health Care Rationing , Humans , United States
12.
Am J Kidney Dis ; 73(1): 112-118, 2019 01.
Article in English | MEDLINE | ID: mdl-29705074

ABSTRACT

Hahnemann University Hospital has performed 120 kidney transplantations in human immunodeficiency virus (HIV)-positive individuals during the last 16 years. Our patient population represents ∼10% of the entire US population of HIV-positive kidney recipients. In our earlier years of HIV transplantation, we noted increased rejection rates, often leading to graft failure. We have established a multidisciplinary team and over the years have made substantial protocol modifications based on lessons learned. These modifications affected our approach to candidate evaluation, donor selection, perioperative immunosuppression, and posttransplantation monitoring and resulted in excellent posttransplantation outcomes, including 100% patient and graft survival at 1 year and patient and graft survival at 3 years of 100% and 96%, respectively. We present key clinical data, including a granular patient-level analysis of the associations of antiretroviral therapy regimens with long-term survival, cellular and antibody-mediated rejection rates, and the causes of allograft failures. In summary, we provide details on the evolution of our approach to HIV transplantation during the last 16 years, including strategies that may improve outcomes among HIV-positive kidney transplantation candidates throughout the United States.


Subject(s)
HIV Seropositivity/complications , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/surgery , Kidney Transplantation , Aged , Female , Hospitals, University , Humans , Male , Retrospective Studies , Time Factors
13.
Am J Kidney Dis ; 72(4): 499-508, 2018 10.
Article in English | MEDLINE | ID: mdl-29728316

ABSTRACT

BACKGROUND: Advanced chronic kidney disease is associated with elevated risk for cognitive impairment. However, it is not known whether and how cognitive impairment is associated with planning and preparation for end-stage renal disease. STUDY DESIGN: Retrospective observational study. SETTING & PARTICIPANTS: 630 adults participating in the CRIC (Chronic Renal Insufficiency Cohort) Study who had cognitive assessments in late-stage CKD, defined as estimated glomerular filtration rate ≤ 20 mL/min/1.73 m2, and subsequently initiated maintenance dialysis therapy. PREDICTOR: Predialysis cognitive impairment, defined as a score on the Modified Mini-Mental State Examination lower than previously derived age-based threshold scores. Covariates included age, race/ethnicity, educational attainment, comorbid conditions, and health literacy. OUTCOMES: Peritoneal dialysis (PD) as first dialysis modality, preemptive permanent access placement, venous catheter avoidance at dialysis therapy initiation, and preemptive wait-listing for a kidney transplant. MEASUREMENTS: Multivariable-adjusted logistic regression. RESULTS: Predialysis cognitive impairment was present in 117 (19%) participants. PD was the first dialysis modality among 16% of participants (n=100), 75% had preemptive access placed (n=473), 45% avoided using a venous catheter at dialysis therapy initiation (n=279), and 20% were preemptively wait-listed (n=126). Predialysis cognitive impairment was independently associated with 78% lower odds of PD as the first dialysis modality (adjusted OR [aOR], 0.22; 95% CI, 0.06-0.74; P=0.02) and 42% lower odds of venous catheter avoidance at dialysis therapy initiation (aOR, 0.58; 95% CI, 0.34-0.98; P=0.04). Predialysis cognitive impairment was not independently associated with preemptive permanent access placement or wait-listing. LIMITATIONS: Potential unmeasured confounders; single measure of cognitive function.
CONCLUSIONS: Predialysis cognitive impairment is associated with a lower likelihood of PD as a first dialysis modality and of venous catheter avoidance at dialysis therapy initiation. Future studies may consider addressing cognitive function when testing strategies to improve patient transitions to dialysis therapy.


Subject(s)
Cognitive Dysfunction/epidemiology , Renal Dialysis/adverse effects , Renal Insufficiency, Chronic/psychology , Renal Insufficiency, Chronic/therapy , Transitional Care/organization & administration , Adult , Age Factors , Aged , Cognitive Behavioral Therapy/methods , Cognitive Dysfunction/diagnosis , Cohort Studies , Disease Progression , Female , Humans , Incidence , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/pathology , Kidney Failure, Chronic/therapy , Logistic Models , Male , Middle Aged , Multivariate Analysis , Neuropsychological Tests , Predictive Value of Tests , Prognosis , Renal Dialysis/methods , Renal Dialysis/psychology , Renal Insufficiency, Chronic/diagnosis , Retrospective Studies , Risk Assessment , Severity of Illness Index , Sex Factors , Treatment Outcome
14.
Clin Transplant ; 32(10): e13386, 2018 10.
Article in English | MEDLINE | ID: mdl-30132986

ABSTRACT

BACKGROUND: It is unknown whether the new kidney transplant allocation system (KAS) has attenuated the advantages of preemptive wait-listing as a strategy to minimize pretransplant dialysis exposure. METHODS: We performed a retrospective study of adult US deceased donor kidney transplant (DDKT) recipients between December 4, 2011-December 3, 2014 (pre-KAS) and December 4, 2014-December 3, 2017 (post-KAS). We estimated pretransplant dialysis durations by preemptive listing status in the pre- and post-KAS periods using multivariable gamma regression models. RESULTS: Among 65 385 DDKT recipients, preemptively listed recipients (21%, n = 13 696) were more likely to be white (59% vs 34%, P < 0.001) and have private insurance (64% vs 30%, P < 0.001). In the pre- and post-KAS periods, average adjusted pretransplant dialysis durations for preemptively listed recipients were <2 years in all racial groups. Compared to recipients who were listed after starting dialysis, preemptively listed recipients experienced 3.85 (95% Confidence Interval [CI] 3.71-3.99) and 4.53 (95% CI 4.32-4.74) fewer average years of pretransplant dialysis in the pre- and post-KAS periods, respectively (P < 0.001 for all comparisons). CONCLUSIONS: Preemptively wait-listed DDKT recipients continue to experience substantially fewer years of pretransplant dialysis than recipients listed after dialysis onset. Efforts are needed to address both socioeconomic and racial disparities in preemptive wait-listing.


Subject(s)
Kidney Failure, Chronic/surgery , Kidney Transplantation , Renal Dialysis/statistics & numerical data , Resource Allocation , Tissue Donors/supply & distribution , Tissue and Organ Procurement/standards , Waiting Lists , Adult , Aged , Cadaver , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors
15.
Eur J Nutr ; 57(1): 191-198, 2018 Feb.
Article in English | MEDLINE | ID: mdl-27614626

ABSTRACT

PURPOSE: We hypothesized that anthropometrically predicted visceral adipose tissue (apVAT) accounts for more variance in blood-based biomarkers of glucose homeostasis, inflammation, and lipid metabolism than body mass index (BMI), waist circumference (WC), and the combination of BMI and WC (BMI + WC). METHODS: This was a cross-sectional analysis of 10,624 males and females who participated in the Third National Health and Nutrition Examination Survey (NHANES III; 1988-1994). apVAT was predicted from a validated regression equation that included age, height, weight, and waist and thigh circumferences. Bootstrapped linear regression models were used to compare the proportion of variance (R2) in biomarkers explained by apVAT, BMI, WC, and BMI + WC. RESULTS: apVAT accounted for more variance in biomarkers of glucose homeostasis than BMI (ΔR2 = 8.4-11.8%; P < 0.001), WC (ΔR2 = 5.5-8.4%; P < 0.001), and BMI + WC (ΔR2 = 5.1-7.7%; P < 0.001). apVAT accounted for more variance in biomarkers of inflammation than BMI (ΔR2 = 3.8%; P < 0.001), WC (ΔR2 = 3.1%; P < 0.001), and BMI + WC (ΔR2 = 2.9%; P < 0.001). apVAT accounted for more variance in biomarkers of lipid metabolism than BMI (ΔR2 = 2.9-9.2%; P < 0.001), WC (ΔR2 = 2.9-5.2%; P < 0.001), and BMI + WC (ΔR2 = 2.4-4.1%; P ≤ 0.01). CONCLUSIONS: apVAT, estimated with simple and widely used anthropometric measures, accounts for more variance in blood-based biomarkers than BMI, WC, and BMI + WC. Clinicians and researchers may consider utilizing apVAT to characterize cardio-metabolic health, particularly in settings with limited availability of imaging and laboratory data.
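The analytic approach above, comparing variance explained (R2) by competing predictors via bootstrapped linear regression, can be sketched as follows. This is an illustrative sketch on synthetic data: it does not reproduce the published apVAT prediction equation (whose coefficients are not given here), and all function names are ours.

```python
import numpy as np

def r_squared(x, y):
    """R^2 from an ordinary least-squares fit of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

def bootstrap_delta_r2(x1, x2, y, n_boot=1000, seed=0):
    """Bootstrap distribution of R^2(x1 -> y) - R^2(x2 -> y),
    resampling subjects with replacement, as in a bootstrapped
    comparison of variance explained by two predictors."""
    rng = np.random.default_rng(seed)
    n = len(y)
    deltas = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)  # resample subject indices
        deltas[b] = r_squared(x1[idx], y[idx]) - r_squared(x2[idx], y[idx])
    return deltas
```

A positive bootstrap distribution of the difference would indicate that the first predictor explains more biomarker variance than the second, mirroring the ΔR2 comparisons reported in the RESULTS.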


Subject(s)
Anthropometry/methods , Biomarkers/blood , Intra-Abdominal Fat , Adult , Aged , Aged, 80 and over , Blood Glucose/analysis , Body Composition , Body Height , Body Mass Index , Body Weight , Cross-Sectional Studies , Fasting , Female , Homeostasis , Humans , Inflammation/blood , Lipid Metabolism/physiology , Male , Metabolic Syndrome/diagnosis , Middle Aged , Nutrition Surveys , Thigh/anatomy & histology , Waist Circumference
16.
Transpl Infect Dis ; 19(6)2017 Dec.
Article in English | MEDLINE | ID: mdl-28921783

ABSTRACT

Human immunodeficiency virus (HIV)-infected patients have excellent outcomes following kidney transplantation (KT) but still might face barriers in the evaluation and listing process. The aim of this study was to characterize the patient population, referral patterns, and outcomes of HIV-infected patients who present for KT evaluation. We performed a single-center retrospective cohort study of HIV-infected patients who were evaluated for KT. The primary outcome was time to determination of eligibility for KT. Between 2011 and 2015, 105 HIV-infected patients were evaluated for KT. Of the 105 patients, 73 were listed for transplantation by the end of the study period. For those who were deemed ineligible, the most common reasons cited were active substance abuse (n = 7, 22%) and failure to complete the full transplant evaluation (n = 7, 22%). Our cohort demonstrated a higher proportion of HIV-infected patients eligible for KT than in previous studies, likely secondary to advances in HIV management. Among those who were denied access to transplantation, we identified that many were unable to complete the evaluation process, and that active substance abuse was common. Future prospective studies should examine reasons and potential interventions for the lack of follow-through and drug use we observed in this population.


Subject(s)
HIV Infections/complications , Kidney Failure, Chronic/surgery , Kidney Transplantation/legislation & jurisprudence , Patient Selection , Adult , Anti-Retroviral Agents/therapeutic use , Female , HIV Infections/drug therapy , HIV Infections/mortality , Humans , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/mortality , Kidney Transplantation/statistics & numerical data , Male , Middle Aged , Retrospective Studies
17.
Transpl Infect Dis ; 19(4)2017 Aug.
Article in English | MEDLINE | ID: mdl-28520146

ABSTRACT

BACKGROUND: Tenofovir disoproxil fumarate (TDF) is an antiretroviral agent frequently used to treat human immunodeficiency virus (HIV). There are concerns regarding its potential to cause acute kidney injury, chronic kidney disease, and proximal tubulopathy. Although TDF can effectively suppress HIV after kidney transplantation, it is unknown whether use of TDF-based antiretroviral therapy (ART) after kidney transplantation adversely affects allograft survival. METHODS: We examined 104 HIV+ kidney transplant (KT) recipients at our center between 2001 and 2014. We generated a propensity score for TDF treatment using recipient and donor characteristics. We then fit Cox proportional hazards models to investigate the association between TDF treatment and 3-year, death-censored primary allograft failure, adjusting for the propensity score and delayed graft function (DGF). RESULTS: Of the 104 HIV+ KT candidates who underwent transplantation during the study period, 23 (22%) were maintained on TDF-based ART at the time of transplantation, and 81 (78%) were on non-TDF-based ART. Median age of the cohort was 48 years; 87% were male; 88% were black; and median CD4 count at transplantation was 450 cells/mm³. Median kidney donor risk index was 1.2. At 3 years post transplantation, primary allograft failure occurred in 26% of patients on TDF-based ART and in 28% of patients on non-TDF-based ART (P=.5). TDF treatment was not associated with primary allograft failure at 3 years post transplant after adjusting for DGF and a propensity score for TDF use (hazard ratio 2.12, 95% confidence interval 0.41-10.9). CONCLUSIONS: In a large single-center experience of HIV+ kidney transplantation, TDF use following kidney transplantation was not significantly associated with primary allograft failure. These results may help inform management for HIV+ KT recipients in need of TDF therapy for adequate viral suppression.
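The two-stage design in METHODS (fit a propensity score for treatment, then adjust the outcome model for it) can be sketched in outline. Assumptions: the covariates and coefficients are synthetic, and a logistic model for 3-year graft failure stands in for the study's Cox proportional hazards model, which requires follow-up times as well as events.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Logistic regression fit by Newton-Raphson (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X1 @ beta))
        grad = X1.T @ (y - p)                      # score vector
        hess = (X1 * (p * (1 - p))[:, None]).T @ X1  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def predict_proba(X, beta):
    X1 = np.column_stack([np.ones(len(X)), X])
    return 1 / (1 + np.exp(-X1 @ beta))

# Hypothetical recipient/donor covariates and TDF exposure.
rng = np.random.default_rng(0)
n = 300
age = rng.normal(48, 10, n)
kdri = rng.normal(1.2, 0.3, n)
X_cov = np.column_stack([age, kdri])
tdf = rng.binomial(1, 1 / (1 + np.exp(-0.02 * (age - 48))), n).astype(float)

# Stage 1: propensity score for receiving TDF-based ART.
ps = predict_proba(X_cov, fit_logistic(X_cov, tdf))

# Stage 2: outcome model for 3-year graft failure, adjusted for
# the propensity score (simplified stand-in for the Cox model).
failure = rng.binomial(1, 0.27, n).astype(float)
beta = fit_logistic(np.column_stack([tdf, ps]), failure)
odds_ratio = np.exp(beta[1])  # adjusted association of TDF with failure
print(f"adjusted OR for TDF: {odds_ratio:.2f}")
```

Because the synthetic failure outcome is generated independently of TDF exposure, the adjusted odds ratio lands near 1, analogous to the null association the study reports.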


Subject(s)
Graft Survival/drug effects , HIV Infections/drug therapy , Kidney Transplantation/mortality , Tenofovir/therapeutic use , Adult , Allografts , Cohort Studies , Female , HIV Seropositivity , Humans , Male , Middle Aged , Retrospective Studies
18.
Am J Hum Biol ; 29(1)2017 Jan.
Article in English | MEDLINE | ID: mdl-27427402

ABSTRACT

OBJECTIVE: This study seeks to quantify the relationship between anthropometrically-predicted visceral adipose tissue (apVAT) and all-cause and cause-specific mortality among individuals of European descent in a population-based prospective cohort study of 10,624 participants. METHODS: apVAT was predicted with a validated regression equation that included age, body mass index, and waist and thigh circumferences. RESULTS: During a median follow-up of 18.8 years, 3531 participants died, with 1153 and 741 deaths attributable to cardiovascular disease and cancer, respectively. In multivariable-adjusted analyses that accounted for demographic, clinical, and behavioral characteristics, higher apVAT was associated with an increased risk of all-cause (Ptrend < .001), cardiovascular-specific (Ptrend < .001), and cancer-specific mortality (Ptrend = .007). Excluding participants with a history of cancer, myocardial infarction, heart failure, or diabetes at baseline did not substantively alter effect estimates. apVAT more accurately predicted all-cause, cardiovascular-specific, and cancer-specific mortality than body mass index (P < .001), waist circumference (P < .001), or the combination of body mass index and waist circumference (P < .001). CONCLUSIONS: These data provide evidence that apVAT is associated with all-cause and cause-specific mortality in a large population-based sample of men and women of European descent. These results support the use of apVAT to risk-stratify individuals for premature mortality when imaging data are not available, such as in routine clinical practice or in large clinical trials.


Subject(s)
Anthropometry , Intra-Abdominal Fat/metabolism , Mortality, Premature , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Nutrition Surveys , Prospective Studies , Regression Analysis , United States/epidemiology , Young Adult
19.
Aging Clin Exp Res ; 29(2): 257-263, 2017 Apr.
Article in English | MEDLINE | ID: mdl-27020695

ABSTRACT

BACKGROUND: It is unknown if physical activity and good diet quality modify the risk of poor outcomes, such as mortality, among older adults with sarcopenia. AIM: To examine if physical activity and good diet quality modify the risk of poor outcomes, such as mortality, among older adults with sarcopenia. METHODS: A population-based cohort study among 1618 older adults with sarcopenia from the Third National Health and Nutrition Examination Survey (NHANES III; 1988-1994). Sarcopenia was defined by the European Working Group on Sarcopenia in Older People. Physical activity was self-reported, and classified as sedentary (0 bouts per week), physically inactive (1-4 bouts per week), and physically active (≥5 bouts per week). Diet quality was assessed with the healthy eating index (a scale of 0-100 representing adherence to federal dietary recommendations), and classified as poor (<51), fair (51-80), and good (>80) diet quality. RESULTS: Compared to participants who were sedentary, those who were physically inactive were 16% less likely to die [HR 0.84 (95% CI 0.64-1.09)], and those who were physically active were 25% less likely to die [HR 0.75 (95% CI 0.59-0.97); P trend = 0.026]. Compared to participants with poor diet quality, those with fair diet quality were 37% less likely to die [HR 0.63 (95% CI 0.47-0.86)], and those with good diet quality were 45% less likely to die [HR 0.55 (95% CI 0.37-0.80); P trend = 0.002]. CONCLUSIONS: Participation in physical activity and consumption of a healthy diet correspond with a lower risk of mortality among older adults with sarcopenia. Randomized trials are needed in this population.
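Survival contrasts like the hazard ratios reported above rest on estimators such as Kaplan-Meier, which handles participants censored before death. A self-contained sketch with invented follow-up data (the group sizes, survival distributions, and censoring scheme are hypothetical, not NHANES III values):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival estimate; returns (event times, S(t))."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    n_at_risk = len(time)
    times, surv, s = [], [], 1.0
    for t in np.unique(time):
        mask = time == t
        d = event[mask].sum()            # deaths at time t
        if d > 0:
            s *= 1 - d / n_at_risk       # multiply in the survival factor
            times.append(t)
            surv.append(s)
        n_at_risk -= mask.sum()          # deaths + censored leave risk set
    return np.array(times), np.array(surv)

# Hypothetical follow-up (years) for two activity groups, death = 1.
rng = np.random.default_rng(1)
t_sed = rng.exponential(10, 200)   # sedentary: shorter survival on average
t_act = rng.exponential(14, 200)   # active: longer survival on average
cens = rng.uniform(0, 19, 200)     # one censoring draw reused for simplicity
time_sed = np.minimum(t_sed, cens); event_sed = (t_sed <= cens).astype(int)
time_act = np.minimum(t_act, cens); event_act = (t_act <= cens).astype(int)

_, s_sed = kaplan_meier(time_sed, event_sed)
_, s_act = kaplan_meier(time_act, event_act)
print(f"final KM survival, sedentary: {s_sed[-1]:.2f}, active: {s_act[-1]:.2f}")
```

The study's Cox models additionally adjust for covariates; this sketch only shows the unadjusted survival curves such models build on.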


Subject(s)
Aging , Exercise/physiology , Feeding Behavior/physiology , Sarcopenia , Aged , Aging/physiology , Aging/psychology , Cohort Studies , Female , Healthy Lifestyle/physiology , Humans , Male , Mortality , Nutrition Surveys , Risk Factors , Sarcopenia/diagnosis , Sarcopenia/epidemiology , Sarcopenia/physiopathology , Statistics as Topic , United States/epidemiology
20.
J Am Soc Nephrol ; 27(4): 973-80, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26369343

ABSTRACT

Kidney transplantation is a cost-saving treatment that extends the lives of patients with ESRD. Unfortunately, the kidney transplant waiting list has ballooned to over 100,000 Americans. Across large areas of the United States, many kidney transplant candidates spend over 5 years waiting and often die before undergoing transplantation. However, more than 2500 kidneys (>17% of the total recovered from deceased donors) were discarded in 2013, despite evidence that many of these kidneys would provide a survival benefit to wait-listed patients. Transplant leaders have focused attention on transplant center report cards as a likely cause for this discard problem, although that focus is too narrow. In this review, we examine the risks associated with accepting various categories of donated kidneys, including discarded kidneys, compared with the risk of remaining on dialysis. With the goal of improving access to kidney transplant, we describe feasible proposals to increase acceptance of currently discarded organs.


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement/statistics & numerical data , Tissue and Organ Procurement/standards , Humans , Medical Waste Disposal , Tissue Donors