Results 1 - 20 of 50
1.
Kidney Int Rep ; 9(6): 1571-1573, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38899210
2.
Clin Transplant ; 38(2): e15256, 2024 02.
Article in English | MEDLINE | ID: mdl-38400674

ABSTRACT

BACKGROUND: Post-transplant health-related quality of life (HRQOL) is associated with health outcomes for kidney transplant (KT) recipients. However, pretransplant predictors of improvements in post-transplant HRQOL remain incompletely understood. In particular, important pretransplant cultural factors, such as experience of discrimination, perceived racism in healthcare, and mistrust of the healthcare system, have not been examined as potential HRQOL predictors. In addition, few studies have examined predictors of decline in HRQOL post-transplant. METHODS: Using data from a prospective cohort study, we examined change in HRQOL from pre- to post-transplant and novel cultural predictors of that change. We measured physical, mental, and kidney-specific HRQOL as outcomes and used cultural factors as predictors, controlling for demographic, clinical, psychosocial, and transplant knowledge covariates. RESULTS: Among 166 KT recipients (57% male; mean age 50.6 years; 61.4% > high school graduates; 80% non-Hispanic White), mental and physical, but not kidney-specific, HRQOL significantly improved post-transplant. No culturally related factors other than medical mistrust significantly predicted change in any HRQOL outcome. Instead, demographic, knowledge, and clinical factors significantly predicted decline in each HRQOL domain: for physical HRQOL, older age, more post-KT complications, and higher pre-KT physical HRQOL; for mental HRQOL, having less information pre-KT and greater pre-KT mental HRQOL; and for kidney-specific HRQOL, poorer kidney function post-KT, lower expectations for physical condition to improve, and higher pre-KT kidney-specific HRQOL. CONCLUSIONS: Rather than cultural factors, predictors of HRQOL decline included demographic, knowledge, and clinical factors. These findings are useful for identifying patient groups at greater risk of poorer post-transplant outcomes, so that individualized support can be targeted to those patients.


Subject(s)
Kidney Transplantation , Humans , Male , Middle Aged , Female , Kidney Transplantation/psychology , Quality of Life/psychology , Prospective Studies , Trust , Kidney
3.
Am J Transplant ; 24(5): 781-794, 2024 May.
Article in English | MEDLINE | ID: mdl-38307416

ABSTRACT

We analyzed whether there is an interaction between the Kidney Donor Profile Index (KDPI) and cold ischemia time (CIT) in recipients of deceased donor kidney transplants (KTs). Adults who underwent KTs in the United States between 2014 and 2020 were included and divided into 3 KDPI groups (≤20%, 21%-85%, >85%) and 4 CIT strata (<12, 12-17.9, 18-23.9, ≥24 hours). Multivariate analyses were used to test the interaction between KDPI and CIT for the following outcomes: primary graft nonfunction (PGNF), delayed graft function (DGF), estimated glomerular filtration rate (eGFR) at 6 and 12 months, patient survival, graft survival, and death-censored graft survival (DCGS). A total of 69,490 recipients were analyzed: 18,241 (26.3%) received a graft with KDPI ≤20%, 46,953 (67.6%) with KDPI 21%-85%, and 4,296 (6.2%) with KDPI >85%. Increasing KDPI and CIT were associated with worse post-KT outcomes. Contrary to our hypothesis, however, the interaction between KDPI and CIT was statistically significant only for PGNF, DGF, and eGFR at 6 months. Paradoxically, the negative coefficient of the interaction suggested that increasing duration of CIT was more detrimental for low- and intermediate-KDPI organs than for high-KDPI grafts. Conversely, for mortality, graft survival, and DCGS, we found that the interaction between CIT and KDPI was not statistically significant. We conclude that high KDPI and prolonged CIT are independent risk factors for inferior outcomes after KT. Their interaction, however, is statistically significant only for short-term outcomes and is more pronounced for low- and intermediate-KDPI grafts than for high-KDPI kidneys.
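
A minimal sketch of this kind of interaction test is shown below, assuming a simplified single-outcome Cox model with hypothetical column names (kdpi_group, cit_stratum, years_to_event, graft_loss) and a placeholder file; it is an illustration of the product-term approach, not the authors' actual SRTR analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Placeholder dataset: one row per recipient, with KDPI group and CIT stratum
# coded as ordered integers and graft survival as the outcome.
df = pd.read_csv("ktx_cohort.csv")  # hypothetical file name
df["kdpi_x_cit"] = df["kdpi_group"] * df["cit_stratum"]  # product (interaction) term

cph = CoxPHFitter()
cph.fit(
    df[["years_to_event", "graft_loss", "kdpi_group", "cit_stratum", "kdpi_x_cit"]],
    duration_col="years_to_event",
    event_col="graft_loss",
)
cph.print_summary()  # a non-significant kdpi_x_cit term would match the long-term findings above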


Subject(s)
Cold Ischemia , Delayed Graft Function , Glomerular Filtration Rate , Graft Survival , Kidney Transplantation , Tissue Donors , Humans , Male , Female , Middle Aged , Tissue Donors/supply & distribution , Risk Factors , Adult , Follow-Up Studies , Delayed Graft Function/etiology , Prognosis , Survival Rate , Retrospective Studies , Kidney Failure, Chronic/surgery , Graft Rejection/etiology , Kidney Function Tests , Tissue and Organ Procurement , Postoperative Complications
4.
Clin Transplant ; 38(1): e15157, 2024 01.
Article in English | MEDLINE | ID: mdl-37792310

ABSTRACT

INTRODUCTION: Self-reported measures of immunosuppression adherence have largely been examined in research settings. METHODS: In this single-center study of 610 kidney transplant recipients, we examined whether a voluntary, non-anonymous self-report measure could identify non-adherence in a routine clinic setting and how patients perceived such a measure. Non-adherence was measured using the Basel Assessment of Adherence to Immunosuppressive Medications Scale (BAASIS), and patient perception was elicited using a customized questionnaire. RESULTS: Non-responders to the survey (15%) were younger, more likely to be Black, and less likely to have had a pre-emptive transplant. Among complete responders (n = 485), 38% reported non-adherence; non-adherent patients were younger (54 y vs. 60 y; p = .01), less likely to have been on dialysis pre-transplant (59% vs. 68%; p = .04), further out from transplant (37 vs. 22 months; p < .001), and had more rejections in the preceding year (8% vs. 3%; p = .02). Self-reported non-adherence was associated with higher calcineurin inhibitor intra-patient variability (27.4% vs. 24.5%; p = .02) but not with donor-specific antibody detection (27.8% vs. 21.2%, p = .15). Of the patients providing feedback (n = 500), most felt comfortable reporting adherence (92%), found the survey relevant to their visit (71%), and reported that it did not interfere with their clinic visit (88%). CONCLUSION: A self-reported questionnaire administered during clinic visits identified immunosuppression non-adherence in a significant proportion of patients and was well received. Integrating self-report measures into routine post-transplant care may enable early identification of non-adherence.
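
Calcineurin inhibitor intra-patient variability (IPV) is commonly computed as the coefficient of variation of repeated tacrolimus trough levels. The abstract does not state the study's exact formula, so the sketch below, with made-up trough values and column names, illustrates that common definition rather than the authors' code.

```python
import pandas as pd

# Hypothetical tacrolimus trough levels for two patients (ng/mL).
troughs = pd.DataFrame({
    "patient_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "trough_ng_ml": [7.1, 9.4, 5.8, 8.0, 6.2, 6.5, 6.0, 6.4],
})

# IPV as coefficient of variation: CV% = (SD / mean) * 100 of each patient's troughs.
ipv = (
    troughs.groupby("patient_id")["trough_ng_ml"]
    .agg(lambda x: 100 * x.std(ddof=1) / x.mean())
    .rename("ipv_percent")
)
print(ipv)  # patient 1 (~20%) is far more variable than patient 2 (~3.5%)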


Subject(s)
Kidney Transplantation , Humans , Self Report , Immunosuppressive Agents/therapeutic use , Surveys and Questionnaires , Immunosuppression Therapy , Transplant Recipients , Medication Adherence
6.
Am Surg ; 89(4): 1286-1289, 2023 Apr.
Article in English | MEDLINE | ID: mdl-33631945

ABSTRACT

Enteric hyperoxaluria (EH) is a known complication of Roux-en-Y gastric bypass (RYGB) and can lead to nephrolithiasis, oxalate-induced nephropathy, and end-stage renal disease. Recurrent EH-induced renal impairment has been reported after kidney transplantation and may lead to allograft loss. EH occurs in up to one quarter of patients following malabsorption-based bariatric operations. We report a case of medically refractory EH causing allograft dysfunction in a renal transplant recipient, successfully managed with reversal of RYGB. The patient developed renal failure requiring transplantation 7 years after gastric bypass. Following an uneventful living donor kidney transplant, she developed recurrent subacute allograft dysfunction. A diagnosis of oxalate nephropathy was made based on biopsy findings of renal tubular calcium oxalate deposition in conjunction with elevated serum oxalate levels and elevated 24-hr urinary oxalate excretion. Progressive renal failure ensued despite medical management. The patient underwent reversal of her RYGB, which resulted in recovery of allograft function. This report highlights an under-recognized, potentially treatable cause of renal allograft failure in patients with underlying gastrointestinal pathology or a history of bariatric surgery and proposes a strategy for managing patients with persistent hyperoxaluria based on a review of the literature.


Subject(s)
Gastric Bypass , Hyperoxaluria , Kidney Transplantation , Renal Insufficiency , Humans , Female , Gastric Bypass/adverse effects , Kidney Transplantation/adverse effects , Calcium Oxalate/urine , Oxalates , Hyperoxaluria/surgery , Hyperoxaluria/complications , Allografts
7.
J Am Soc Nephrol ; 34(1): 26-39, 2023 01 01.
Article in English | MEDLINE | ID: mdl-36302599

ABSTRACT

BACKGROUND: In March 2021, the United States implemented a new kidney allocation system (KAS250) for deceased donor kidney transplantation (DDKT), which eliminated donation service area-based allocation and replaced it with a system based on distance from the donor hospital to the transplant center, within or outside a radius of 250 nautical miles. The effect of this policy on kidney discards and logistics is unknown. METHODS: We examined discards, donor-recipient characteristics, cold ischemia time (CIT), and delayed graft function (DGF) during the first 9 months of KAS250 compared with a pre-KAS250 cohort from the preceding 2 years. Changes in discards and CIT after the onset of COVID-19 and the implementation of KAS250 were evaluated using an interrupted time-series model. Changes in allocation practices (biopsy, machine perfusion, and virtual crossmatch) were also evaluated. RESULTS: The post-KAS250 period saw a two-fold increase in kidneys imported from nonlocal organ procurement organizations (OPOs), a higher proportion of recipients with calculated panel reactive antibody (cPRA) 81%-98% (12% versus 8%; P <0.001), and a higher proportion with >5 years of pretransplant dialysis (35% versus 33%; P <0.001). CIT increased (by a mean of 2 hours), including among local OPO kidneys. DGF was similar on adjusted analysis. Discards after KAS250 did not immediately change, but we observed a statistically significant increase over time that was independent of donor quality. Machine perfusion use decreased, whereas reliance on virtual crossmatch increased, which was associated with shorter CIT. CONCLUSIONS: Early trends after KAS250 show an increase in transplant access for patients with cPRA >80% and those with longer dialysis duration, but this was accompanied by an increase in CIT and a suggestion of worsening kidney discards.
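
The interrupted time-series analysis mentioned in METHODS can be sketched as a segmented regression. The sketch below assumes monthly aggregated discard rates and hypothetical variable names (time, covid, kas250, and the corresponding post-change time counters); the paper's actual model specification is not given here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder: one row per month with the kidney discard rate and indicator
# variables for the two change points (COVID-19 onset, KAS250 implementation).
monthly = pd.read_csv("monthly_discards.csv")  # hypothetical file

# time: months since start of study period
# covid, kas250: 0/1 after each change point; time_since_*: months elapsed after it
model = smf.ols(
    "discard_rate ~ time + covid + time_since_covid + kas250 + time_since_kas250",
    data=monthly,
).fit()
print(model.summary())  # a positive time_since_kas250 slope mirrors the gradual rise in discards reported above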


Subject(s)
COVID-19 , Kidney Transplantation , Tissue and Organ Procurement , Humans , United States , Kidney , Tissue Donors , Antibodies , Graft Survival , Delayed Graft Function/epidemiology
8.
Hum Immunol ; 84(3): 214-223, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36581507

ABSTRACT

Virtual crossmatch (VXM) is used as an alternative to or in conjunction with a cell-based physical crossmatch (PXM) for assessing HLA (human leukocyte antigen) compatibility prior to deceased donor kidney transplantation (DDKT). Data on practice patterns and perceptions regarding VXM use in the US are limited. We performed a survey of US HLA directors and transplant surgeons regarding HLA testing and crossmatch strategies. 53 (56%) HLA directors and 68 surgeons (representing ~23% of US transplant centers) completed the survey. Both groups agreed that VXM could reduce cold ischemia time (CIT), reduce costs, and improve allocation efficiency. VXM use increased following the 2021 kidney allocation change. Reducing CIT was the primary reason for favoring VXM over PXM. Preference for VXM decreased as candidates' panel reactive antibodies increased. Regulations, program policies, and limitations of HLA technology were cited as important reasons for preferring PXM over VXM. Surgeons reported similar perceptions, but findings are limited by the low response rate. Finally, half the labs reported lacking specific protocols for VXM use. In conclusion, improved HLA technology and protocols, along with changes to institutional procedures and policy regulations, are needed for safer expansion of VXM in DDKT.


Subject(s)
Kidney Transplantation , Humans , United States , Kidney Transplantation/methods , Blood Grouping and Crossmatching , Histocompatibility Testing/methods , Kidney , HLA Antigens , Histocompatibility , Graft Rejection
9.
Kidney Int ; 102(6): 1371-1381, 2022 12.
Article in English | MEDLINE | ID: mdl-36049641

ABSTRACT

The long-term impact of early subclinical inflammation (SCI) detected on surveillance biopsy has not been well studied. To address this, we recruited a prospective observational cohort of 1000 sequential patients who received a kidney transplant from 2013-2017 at our center. A total of 586 patients who underwent a surveillance biopsy in their first year post-transplant were included, after excluding those with clinical rejections and those who were unable to undergo a surveillance biopsy. Patients were classified based on their biopsy findings: 282 with NSI (No Significant Inflammation) and 304 with SCI-T (SCI with Tubulitis), which was further subdivided into 182 with SC-BLR (Subclinical Borderline Changes) and 122 with SC-TCMR (Subclinical T Cell Mediated Rejection, Banff 2019 classification of 1A or more). We followed clinical and immunological events, including Clinical Biopsy Proven Acute Rejection [C-BPAR], long-term kidney function, and death-censored graft loss over a median follow-up of five years. Episodes of C-BPAR occurred at a median of two years post-transplant. The adjusted odds of a subsequent C-BPAR were significantly higher in the SCI-T group [SC-BLR and SC-TCMR] compared with NSI [3.8 (2.1-7.5)]. The adjusted hazard for death-censored graft loss was significantly higher with SCI-T compared with NSI [1.99 (1.04-3.84)]. Overall, SCI detected on surveillance biopsy within the first year post-transplant is a harbinger of subsequent immunological events and is associated with a significantly greater hazard of subsequent C-BPAR and death-censored graft loss. Our study therefore highlights the need to identify patients with SCI through surveillance biopsy and to develop strategies to prevent further alloimmune injury.
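
As a rough illustration of the adjusted-odds comparison above, a multivariable logistic regression of subsequent clinical rejection on biopsy group could look like the sketch below; the covariates, column names, and file are assumptions, not the authors' actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("surveillance_biopsy_cohort.csv")  # hypothetical file

# Logistic regression of subsequent C-BPAR on biopsy group, with NSI as the
# reference category and two illustrative covariates.
logit = smf.logit(
    "c_bpar ~ C(biopsy_group, Treatment(reference='NSI')) + recipient_age + hla_mismatch",
    data=df,
).fit()
print(np.exp(logit.params))  # exponentiated coefficients are adjusted odds ratios (cf. 3.8 above)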


Subject(s)
Graft Rejection , Graft Survival , Humans , Risk Factors , Biopsy , Inflammation/pathology , Allografts/pathology , Kidney/pathology
10.
Clin Transplant ; 36(9): e14768, 2022 09.
Article in English | MEDLINE | ID: mdl-35801650

ABSTRACT

BACKGROUND: Survival into the second decade after cardiothoracic transplantation (CTX) is no longer uncommon. Few data exist on any health-related quality of life (HRQOL) impairments survivors face, or whether they may even experience positive psychological outcomes indicative of "thriving" (e.g., personal growth). We provide such data in a long-term survivor cohort. METHODS: Among 304 patients prospectively studied across the first 2 years post-CTX, we re-interviewed patients ≥15 years post-CTX. We (a) examined levels of HRQOL and positive psychological outcomes (posttraumatic growth related to CTX, purpose in life, life satisfaction) at follow-up, (b) evaluated change since transplant with mixed-effects models, and (c) identified psychosocial and clinical correlates of study outcomes with multivariable regression. RESULTS: Of 77 survivors, 64 (83%) were assessed (35 heart, 29 lung recipients; 15-19 years post-CTX). Physical HRQOL was poorer than the general population norm and earlier post-transplant levels (P's < .001). Mental HRQOL exceeded the norm (P < .001), with little temporal change (P = .070). Mean positive psychological outcome scores exceeded scales' midpoints at follow-up. Life satisfaction, assessed longitudinally, declined over time (P < .001) but remained similar to the norm at follow-up. Recent hospitalization and dyspnea increased patients' likelihood of poor physical HRQOL at follow-up (P's ≤ .022). Lower sense of mastery and poorer caregiver support lessened patients' likelihood of positive psychological outcomes (P's ≤ .049). Medical comorbidities and type of CTX were not associated with study outcomes at follow-up. CONCLUSIONS: Despite physical HRQOL impairment, long-term CTX survivors otherwise showed favorable outcomes. Clinical attention to correlates of HRQOL and positive psychological outcomes may help maximize survivors' well-being.


Subject(s)
Lung Transplantation , Quality of Life , Cohort Studies , Humans , Lung Transplantation/psychology , Quality of Life/psychology , Survivors
11.
Clin Transplant ; 36(9): e14759, 2022 09.
Article in English | MEDLINE | ID: mdl-35778369

ABSTRACT

BACKGROUND: High kidney-donor profile index (KDPI) kidneys have shorter survival than grafts with lower KDPI values. It is still unclear, however, whether their shorter longevity reflects inferior baseline function, faster functional decline, or a combination of both. METHODS: We analyzed the estimated glomerular filtration rate (eGFR) of 605 consecutive recipients of deceased donor kidney transplants (KT) at 1, 3, 6, 12, 18, 24, 36, 48, and 60 months. Comparisons were performed among four groups based on KDPI quartile: Group I (KDPI ≤ 25%, n = 151), Group II (KDPI 26-50%, n = 182), Group III (KDPI 51-75%, n = 176), and Group IV (KDPI > 75%, n = 96). Linear mixed model analysis was subsequently used to assess whether KDPI was independently associated with the decline in eGFR during the first 5 years after KT. We also analyzed the incidence of delayed graft function (DGF), rejection within the first year after KT, patient survival, graft survival, and death-censored graft survival by KDPI group. FINDINGS: High-KDPI grafts had lower eGFR immediately after KT and a higher incidence of DGF and rejection. However, there were no significant differences in the adjusted rate (slope) of decline in eGFR among the four KDPI groups (P = .06). Although patient survival was significantly lower for recipients of high-KDPI grafts, death-censored graft survival was similar among the four KDPI groups (P = .33). CONCLUSIONS: The shorter functional survival of high-KDPI grafts appears to be due to their lower baseline eGFR rather than a more rapid functional decline after KT.
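
A linear mixed model of the kind described, comparing eGFR slopes across KDPI groups, can be sketched as follows; the long-format dataset, column names, and random-effects structure are assumptions for illustration, not the study's specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder long-format data: one row per patient per follow-up visit.
long = pd.read_csv("egfr_long.csv")  # hypothetical file

# Random intercept and slope per patient; the group-by-time interaction tests
# whether eGFR slopes differ across KDPI quartile groups.
mixed = smf.mixedlm(
    "egfr ~ months_post_tx * C(kdpi_group)",
    data=long,
    groups=long["patient_id"],
    re_formula="~months_post_tx",
).fit()
print(mixed.summary())  # non-significant interaction terms would mirror the similar slopes reported above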


Subject(s)
Kidney Transplantation , Tissue Donors , Glomerular Filtration Rate , Graft Survival , Humans , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors
12.
Clin Transplant ; 36(9): e14776, 2022 09.
Article in English | MEDLINE | ID: mdl-35821617

ABSTRACT

BACKGROUND: In kidney transplantation, delayed graft function (DGF) is associated with increased morbidity and a higher risk of graft failure. Prior research suggests that chronic hypotension increases DGF risk, but the relationship of preoperative blood pressure to DGF is unclear. METHODS: In this single-center study of adult deceased donor kidney transplant recipients transplanted between 2015 and 2019, we evaluated whether preoperative mean arterial pressure (MAP) affected DGF risk. Additionally, we investigated whether the risk of DGF was moderated by certain donor and recipient characteristics. For recipient characteristics associated with both increased DGF risk and preoperative MAP, we performed a mediation analysis to estimate the proportion of DGF risk mediated through preoperative MAP. RESULTS: Among 562 deceased donor kidney recipients, DGF risk decreased as preoperative MAP increased, with a 2% lower risk per 1 mm Hg increase in MAP. This association was consistent across subgroups, with no statistically significant interaction between preoperative MAP and either donor characteristics (donation after circulatory death) or recipient characteristics (diabetes, body mass index, and use of antihypertensive medications). Preoperative MAP was negatively correlated with recipient BMI and duration of pre-transplant dialysis. On mediation analysis, MAP accounted for 12% and 16% of the DGF risk associated with recipient BMI and pre-transplant dialysis duration, respectively. CONCLUSION: In deceased donor kidney transplantation, each 1 mm Hg increase in preoperative MAP was associated with a 2% lower DGF risk. Preoperative MAP was influenced by recipient BMI and dialysis duration and likely contributes to some of the high DGF risk associated with obesity and long dialysis vintage.
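
A mediation analysis of the kind described, asking how much of the BMI-DGF association runs through preoperative MAP, can be sketched as below; the models, covariate set, and column names are hypothetical, since the paper's software and exact specification are not reported here.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.mediation import Mediation

df = pd.read_csv("recipients.csv")  # hypothetical file

# Outcome model: DGF (0/1) on the mediator (preoperative MAP) and the exposure (BMI).
outcome_model = sm.GLM.from_formula(
    "dgf ~ map_preop + bmi", data=df, family=sm.families.Binomial()
)
# Mediator model: preoperative MAP on the exposure.
mediator_model = sm.OLS.from_formula("map_preop ~ bmi", data=df)

med = Mediation(outcome_model, mediator_model, exposure="bmi", mediator="map_preop")
result = med.fit(n_rep=500)
print(result.summary())  # the "prop. mediated" rows correspond to the ~12% reported for BMI above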


Subject(s)
Delayed Graft Function , Kidney Transplantation , Adult , Antihypertensive Agents , Blood Pressure , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Tissue Donors
13.
Transpl Int ; 35: 10253, 2022.
Article in English | MEDLINE | ID: mdl-35572466

ABSTRACT

Transplantation of kidneys from shorter donors into taller recipients may lead to suboptimal allograft survival. The effect of a discrepancy between donor and recipient heights (ΔHeight) on long-term transplant outcomes is not known. Adult patients ≥18 years undergoing living or deceased donor (LD or DD) kidney transplants alone from donors ≥18 years between 2000 and 2016 in the United States were included in this observational study. The cohort was divided into three groups based on a ΔHeight threshold of 5 inches: 1) Recipient < Donor (DD: 31,688, LD: 12,384), 2) Recipient = Donor (DD: 84,711, LD: 54,709), and 3) Recipient > Donor (DD: 21,741, LD: 18,753). Univariate analysis showed a higher risk of death-censored graft loss (DCGL) and mortality in both DD and LD transplants (p < 0.001 for both). The absolute difference in graft and patient survival between the two extremes of ΔHeight was 5.7% and 5.7% for DD, and 0.4% and 1.4% for LD. On multivariate analysis, the HRs of DCGL for Recipient < Donor and Recipient > Donor were 0.95 (p = 0.05) and 1.07 (p = 0.01) in DD and 0.98 (p = 0.55) and 1.14 (p < 0.001) in LD. Similarly, the corresponding HRs of mortality were 0.97 (p = 0.07) and 1.07 (p = 0.003) for DD and 1.01 (p < 0.001) and 1.05 (p = 0.13) for LD. For delayed graft function (DGF), the HRs were 1.04 (p = 0.1) and 1.01 (p = 0.7) for DD and 1.07 (p = 0.45) and 0.89 (p = 0.13) for LD. Height mismatch between donor and recipient influences kidney transplant outcomes.


Subject(s)
Kidney Transplantation , Adult , Cohort Studies , Graft Survival , Humans , Kidney , Living Donors , Tissue Donors , United States/epidemiology
14.
Kidney360 ; 3(3): 426-434, 2022 03 31.
Article in English | MEDLINE | ID: mdl-35582179

ABSTRACT

Background: Investigations of health-related quality of life (HRQoL) in AKI have been limited in number, size, and domains assessed. We surveyed AKI survivors to describe the range of AKI-related HRQoL experiences and examined potential differences in AKI effects by sex and age at the AKI episode. Methods: AKI survivors in the American Association of Kidney Patients completed an anonymous online survey in September 2020. We assessed: (1) sociodemographic characteristics; (2) physical, emotional, and social effects of AKI; and (3) perceptions about interactions with health care providers, using quantitative and qualitative items. Results: Respondents were 124 adult AKI survivors. Eighty-four percent reported that the AKI episode was very/extremely impactful on physical/emotional health. Fifty-seven percent reported being very/extremely concerned about AKI effects on work, and 67% were concerned about AKI effects on family. Only 52% of respondents rated medical team communication as very/extremely good. Individuals aged 22-65 years at the AKI episode were more likely than younger/older counterparts to rate the AKI episode as highly impactful overall (90% versus 63% younger and 75% older individuals; P=0.04), more impactful on family (78% versus 50% and 46%; P=0.008), and more impactful on work (74% versus 38% and 10%; P<0.001). Limitations of this work include convenience sampling, retrospective data collection, and unknown AKI severity. Conclusions: These findings are a critical step forward in understanding the range of AKI experiences and consequences. Future research should incorporate more comprehensive HRQoL measures, and health care professionals should consider providing more information about AKI and follow-up in their patient communication.


Subject(s)
Acute Kidney Injury/psychology , Patient Reported Outcome Measures , Quality of Life , Survivors/psychology , Acute Kidney Injury/epidemiology , Adult , Age Factors , Aged , Health Impact Assessment , Humans , Middle Aged , Quality of Life/psychology , Retrospective Studies , Sex Factors , United States/epidemiology , Young Adult
15.
Transpl Int ; 35: 10094, 2022.
Article in English | MEDLINE | ID: mdl-35368641

ABSTRACT

Anti-HLA donor-specific antibody (DSA) detection after kidney transplant has been associated with adverse outcomes, though the impact of early DSA screening on stable patients remains unclear. We analyzed the impact of DSA detected by first-year screening in stable patients (n = 736) on subsequent estimated glomerular filtration rate (eGFR), death-censored graft survival (DCGS), and graft failure (graft loss including return to dialysis or re-transplant, patient death, or eGFR < 20 ml/min at last follow-up). Patients were grouped by first-year screening into DSA+ (Class I, II; n = 131) or DSA- (n = 605). The DSA+ group was more DR-mismatched (p = 0.02), more highly sensitized (cPRA ≥90%, p = 0.002), less often Caucasian (p = 0.04), and had fewer pre-emptive (p = 0.04) and more deceased donor transplants (p = 0.03). DSA+ patients had similar eGFR (54.8 vs. 53.8 ml/min/1.73 m2, p = 0.56), DCGS (91% vs. 94%, p = 0.30), and graft-failure-free survival (76% vs. 82%, p = 0.11). DSA timing and type did not impact survival. Among those with a protocol biopsy (n = 515), DSA detected on first-year screening was a predictor of graft failure on multivariate analysis (1.91, 95% CI 1.03-3.55, p = 0.04). Overall, early DSA detection in stable patients was an independent risk factor for graft failure, though only among those who underwent a protocol biopsy.
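
The univariate graft-failure-free survival comparison above (76% vs. 82%) is the kind of result a Kaplan-Meier analysis with a log-rank test would produce; the sketch below assumes hypothetical column names and a placeholder file, and the multivariable step reported in the abstract would add a Cox model on top of it.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("dsa_cohort.csv")  # hypothetical file
dsa_pos = df[df["dsa_first_year"] == 1]
dsa_neg = df[df["dsa_first_year"] == 0]

kmf = KaplanMeierFitter()
for label, grp in [("DSA+", dsa_pos), ("DSA-", dsa_neg)]:
    kmf.fit(grp["years_to_failure"], event_observed=grp["graft_failure"], label=label)
    print(label, kmf.survival_function_.iloc[-1, 0])  # survival at end of follow-up

result = logrank_test(
    dsa_pos["years_to_failure"], dsa_neg["years_to_failure"],
    event_observed_A=dsa_pos["graft_failure"], event_observed_B=dsa_neg["graft_failure"],
)
print(result.p_value)  # a p-value near 0.11 would match the univariate comparison above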


Subject(s)
Kidney Transplantation , Graft Rejection , HLA Antigens , Humans , Kidney Transplantation/adverse effects , Tissue Donors , Transplant Recipients
16.
Transplantation ; 106(4): e219-e233, 2022 04 01.
Article in English | MEDLINE | ID: mdl-35135973

ABSTRACT

BACKGROUND: Racial/ethnic minorities face known disparities in likelihood of kidney transplantation. These disparities may be exacerbated when coupled with ongoing substance use, a factor also reducing likelihood of transplantation. We examined whether race/ethnicity in combination with ongoing substance use predicted incidence of transplantation. METHODS: Patients were enrolled between March 2010 and October 2012 at the time of transplant evaluation. Substance use data were retrieved from transplant evaluations. Following descriptive analyses, the primary multivariable analyses evaluated whether, relative to the referent group (White patients with no substance use), racial/ethnic minority patients using any substances at the time of evaluation were less likely to receive transplants by the end of study follow-up (August 2020). RESULTS: Among 1152 patients, 69% were non-Hispanic White, 23% non-Hispanic Black, and 8% Other racial/ethnic minorities. White, Black, and Other patients differed in percentages of current tobacco smoking (15%, 26%, and 18%, respectively; P = 0.002) and illicit substance use (3%, 8%, and 9%; P < 0.001) but not heavy alcohol consumption (2%, 4%, and 1%; P = 0.346). Black and Other minority patients using substances were each less likely to receive transplants than the referent group (hazard ratios ≤0.45, P ≤ 0.021). Neither White patients using substances nor racial/ethnic minority nonusers differed from the referent group in transplant rates. Additional analyses indicated that these effects reflected differences in waitlisting rates; once waitlisted, study groups did not differ in transplant rates. CONCLUSIONS: The combination of minority race/ethnicity and substance use may lead to unique disparities in likelihood of transplantation. To facilitate equity, strategies should be considered to remove any barriers to referral for and receipt of substance use care in racial/ethnic minorities.


Subject(s)
Kidney Transplantation , Substance-Related Disorders , Ethnic and Racial Minorities , Ethnicity , Healthcare Disparities , Humans , Minority Groups , United States/epidemiology
17.
Cureus ; 14(1): e21405, 2022 Jan.
Article in English | MEDLINE | ID: mdl-35198312

ABSTRACT

Arterial blood gas (ABG) analysis is a generally reliable and frequently employed test for evaluating blood oxygen content. False readings of low oxygen content are rare but can be expected in specific clinical scenarios such as leukemia patients with marked leukocytosis who can develop "leukocyte larceny," a phenomenon of excess oxygen consumption by leukocytes. Awareness of this phenomenon may lead to early recognition and avoidance of unnecessary diagnostic and therapeutic interventions. This case report presents a patient with marked leukocytosis from chronic myelogenous leukemia whose extubation was briefly delayed due to pseudohypoxemia on ABG measurements.

18.
Clin Transplant ; 36(4): e14582, 2022 04.
Article in English | MEDLINE | ID: mdl-35000234

ABSTRACT

Antithymocyte globulin (ATG) is a commonly used induction agent in kidney transplant recipients. However, the optimal dosing has not been well defined. Our protocol aims for a cumulative dose of 5-6 mg/kg. It is unclear whether a dose lower than 5 mg/kg is associated with more rejection. We performed a retrospective cohort study of patients who received a kidney transplant at our center between January 1, 2013 and December 31, 2016. The primary outcome was biopsy-proven acute rejection (clinical and subclinical) in the first 6 months after kidney transplant. CMV viremia in high-risk (D+/R-) recipients and BK viremia were compared as secondary endpoints. Of the 543 patients, the Low Dose (LD) group (n = 56) received <5 mg/kg ATG and the Regular Dose (RD) group (n = 487) received ≥5 mg/kg. Patients in RD were more sensitized (higher PRA and cPRA). LD received a dose of 4 ± 1.1 mg/kg ATG, whereas RD received 5.6 ± 0.3 mg/kg ATG (P < .001). TCMR (Banff 1A or greater) was present in 34% of patients in LD versus 22% in RD (P = .04) (OR 2.1; 95% CI 1.12-3.81; P = .019). There was no difference in the incidence of CMV or BK viremia. ATG doses lower than 5 mg/kg may be associated with a heightened risk of rejection despite a low degree of sensitization.
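
As a quick sanity check of the reported rejection comparison, an unadjusted odds ratio can be reconstructed from the percentages and group sizes given above; the counts below are approximations derived from those percentages, so the result is the crude OR, not the adjusted 2.1 reported in the abstract.

```python
import numpy as np
import statsmodels.api as sm

# Counts reconstructed from the reported percentages (34% of 56 LD vs. 22% of 487 RD).
low_tcmr = round(0.34 * 56)    # ~19
reg_tcmr = round(0.22 * 487)   # ~107
table = np.array([
    [low_tcmr, 56 - low_tcmr],    # low dose: TCMR, no TCMR
    [reg_tcmr, 487 - reg_tcmr],   # regular dose: TCMR, no TCMR
])

t22 = sm.stats.Table2x2(table)
print(t22.oddsratio, t22.oddsratio_confint())  # crude OR ~1.8; the 2.1 above is covariate-adjusted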


Subject(s)
Cytomegalovirus Infections , Kidney Transplantation , Antilymphocyte Serum , Cytomegalovirus Infections/diagnosis , Cytomegalovirus Infections/drug therapy , Cytomegalovirus Infections/etiology , Graft Rejection/diagnosis , Graft Rejection/drug therapy , Graft Rejection/etiology , Humans , Immunosuppressive Agents , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Retrospective Studies , Viremia/complications
19.
Transplant Direct ; 8(1): e1256, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34912945

ABSTRACT

Barriers to medication adherence may differ from barriers in other domains of adherence. In this study, we assessed the association between pre-kidney transplantation (KT) factors and nonadherent behaviors in 3 different domains post-KT. METHODS: We conducted a prospective cohort study with patient interviews at the initial KT evaluation (baseline nonadherence predictors in sociodemographic, condition-related, health system, and patient-related psychosocial factors) and at ≈6 mo post-KT (adherence outcomes: medications, healthcare follow-up, and lifestyle behavior). All patients who underwent KT at our institution and had a ≈6-mo follow-up interview were included in the study. We assessed nonadherence in 3 different domains using continuous composite measures derived from the Health Habit Survey. We built multiple linear and logistic regression models, adjusting for baseline characteristics, to predict adherence outcomes. RESULTS: We included 173 participants. Black race (mean difference in adherence score: -0.72; 95% confidence interval [CI], -1.12 to -0.32) and higher income (mean difference: -0.34; 95% CI, -0.67 to -0.02) predicted lower medication adherence. Experience of racial discrimination predicted lower adherence (odds ratio, 0.31; 95% CI, 0.12-0.76), and an internal locus of control predicted better adherence (odds ratio, 1.46; 95% CI, 1.06-2.03), to healthcare follow-up. In the lifestyle domain, higher education (mean difference: 0.75; 95% CI, 0.21-1.29) and lower body mass index (mean difference: -0.08; 95% CI, -0.13 to -0.03) predicted better adherence to dietary recommendations, but no risk factors predicted exercise adherence. CONCLUSIONS: Different nonadherence behaviors may stem from different motivations and risk factors (eg, clinic nonattendance due to experiencing racial discrimination). Adherence interventions should therefore be individualized to target at-risk populations (eg, bias-reduction training for medical staff to improve patient adherence to clinic visits).

20.
PLoS One ; 16(8): e0254115, 2021.
Article in English | MEDLINE | ID: mdl-34437548

ABSTRACT

Due to the donor shortage, kidney transplants (KTs) from donors with acute kidney injury (AKI) are expanding. Although previous studies comparing clinical outcomes between AKI and non-AKI donors in KT have shown comparable results, data from high-volume analyses of KT outcomes with AKI donors are limited. This study aimed to analyze selection trends for AKI donors and to investigate the impact of AKI on graft failure using United States cohort data. We analyzed a total of 52,757 KTs collected in the Scientific Registry of Transplant Recipients (SRTR) from 2010 to 2015. The sample included 4,962 (9.4%) KTs with AKI donors (creatinine ≥ 2 mg/dL). Clinical characteristics of AKI and non-AKI donors were analyzed, and outcomes of both groups were compared. We also analyzed risk factors for graft failure in AKI-donor KTs. Although the incidence of delayed graft function was higher in recipients of AKI donors than of non-AKI donors, graft and patient survival were not significantly different between the two groups. We found that donor hypertension, cold ischemic time, the proportion of African American donors, and high KDPI were risk factors for graft failure in AKI-donor KTs. KTs from deceased donors with AKI showed comparable outcomes; thus, donors with AKI should be considered more actively to expand the donor pool. Caution is still needed, however, when donors have additional risk factors for graft failure.


Subject(s)
Acute Kidney Injury , Donor Selection , Graft Rejection/mortality , Kidney Transplantation/mortality , Registries , Acute Kidney Injury/mortality , Acute Kidney Injury/surgery , Adult , Female , Humans , Male , Retrospective Studies , Risk Management