ABSTRACT
BACKGROUND: The heterogeneous clinical presentation of graft microvascular inflammation poses a major challenge to successful kidney transplantation. The effect of microvascular inflammation on allograft outcomes is unclear. METHODS: We conducted a cohort study that included kidney-transplant recipients from more than 30 transplantation centers in Europe and North America who had undergone allograft biopsy between 2004 and 2023. We integrated clinical and pathological data to classify biopsy specimens according to the 2022 Banff Classification of Renal Allograft Pathology, which includes two new diagnostic categories: probable antibody-mediated rejection and microvascular inflammation without evidence of an antibody-mediated response. We then assessed the association between the newly recognized microvascular inflammation phenotypes and allograft survival and disease progression. RESULTS: A total of 16,293 kidney-transplant biopsy specimens from 6798 patients were assessed. We identified the newly recognized microvascular inflammation phenotypes in 788 specimens, of which 641 were previously categorized as specimens with no evidence of rejection. As compared with patients without rejection, the hazard ratio for graft loss was 2.1 (95% confidence interval [CI], 1.5 to 3.1) among patients with microvascular inflammation without evidence of an antibody-mediated response and 2.7 (95% CI, 2.2 to 3.3) among patients with antibody-mediated rejection. Patients with a diagnosis of probable antibody-mediated rejection had a higher risk of graft failure beyond year 5 after biopsy than those without rejection (hazard ratio, 1.7; 95% CI, 0.8 to 3.5). Patients with a diagnosis of either newly recognized microvascular inflammation phenotype had a higher risk of progression of transplant glomerulopathy during follow-up than patients without microvascular inflammation. 
CONCLUSIONS: Microvascular inflammation in kidney allografts comprises distinct phenotypes with differing patterns of disease progression and allograft outcomes. Our findings support the clinical use of additional rejection phenotypes to standardize diagnostics for kidney allografts. (Funded by OrganX. ClinicalTrials.gov number, NCT06496269.)
ABSTRACT
RATIONALE & OBJECTIVE: Exposure to extreme heat events has been linked to increased morbidity and mortality in the general population. Patients receiving maintenance dialysis may be vulnerable to greater risks from these events, but this is not well understood. We characterized the association of extreme heat events and the risk of death among patients receiving dialysis in the United States. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Data from the US Renal Data System were used to identify adults living in US urban settlements prone to extreme heat who initiated maintenance dialysis between 1997 and 2016. EXPOSURE: An extreme heat event, defined as a time-updated heat index (a humid-heat metric) exceeding 40.6°C for ≥2 days or 46.1°C for ≥1 day. OUTCOME: Death. ANALYTICAL APPROACH: Cox proportional hazards regression to estimate the elevation in risk of death during a humid-heat event, adjusted for age, sex, year of dialysis initiation, dialysis modality, poverty level, and climate region. Interactions between humid-heat events and these same factors were explored. RESULTS: Among 945,251 adults in 245 urban settlements, the mean age was 63 years, and 44% were female. During a median follow-up period of 3.6 years, 498,049 adults were exposed to at least 1 of 7,154 extreme humid-heat events, and 500,025 deaths occurred. In adjusted models, there was an increased risk of death (hazard ratio 1.18 [95% CI, 1.15-1.20]) during extreme humid-heat exposure. The relative mortality risk was higher among patients living in the Southeast (P<0.001) compared with the Southwest. LIMITATIONS: Possibility of exposure misclassification; land use and air pollution co-exposures were not accounted for. CONCLUSIONS: This study suggests that patients receiving dialysis face an increased risk of death during extreme humid-heat exposure.
PLAIN-LANGUAGE SUMMARY: Patients who receive dialysis are vulnerable to extreme weather events, and rising global temperatures may bring more frequent extreme heat events. We sought to determine whether extreme heat exposure was associated with an increased risk of death in urban-dwelling patients receiving dialysis across the United States. We found that people receiving dialysis were more likely to die during extreme humid-heat events, defined by a heat index exceeding 40.6°C (105°F) for ≥2 days or 46.1°C (115°F) for ≥1 day. These findings inform the nephrology community about the potential importance of protecting patients receiving maintenance dialysis from the risks associated with extreme heat.
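The humid-heat event definition above reduces to a simple rule over consecutive daily heat-index values. A minimal sketch of that rule (hypothetical helper name; not the study's actual exposure-construction code, which worked from time-updated location data):

```python
def is_extreme_heat_event(heat_index_c, moderate=40.6, severe=46.1):
    """Flag an extreme humid-heat event from a run of consecutive daily
    heat-index readings (degrees C): index >= 40.6 C on >= 2 consecutive
    days, or >= 46.1 C on >= 1 day."""
    run = 0  # current streak of days at or above the moderate threshold
    for hi in heat_index_c:
        if hi >= severe:
            return True  # a single day >= 46.1 C qualifies
        run = run + 1 if hi >= moderate else 0
        if run >= 2:
            return True  # two consecutive days >= 40.6 C qualify
    return False
```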
ABSTRACT
INTRODUCTION: The Centers for Medicare & Medicaid Services introduced an End-Stage Renal Disease Prospective Payment System (PPS) in 2011 to increase the utilization of home dialysis modalities, including peritoneal dialysis (PD). Several studies have shown a significant increase in PD utilization after PPS implementation. However, its impact on patients with kidney allograft failure remains unknown. METHODS: We conducted an interrupted time series analysis using data from the US Renal Data System (USRDS) that included all adult kidney transplant recipients with allograft failure who started dialysis between 2005 and 2019. We compared PD utilization in the pre-PPS period (2005-2010) to the fully implemented post-PPS period (2014-2019) for early (within 90 days) and late (91-365 days) PD experience. RESULTS: A total of 27,507 adult recipients with allograft failure started dialysis during the study period. There was no difference in early PD utilization between the pre-PPS and the post-PPS period in either immediate change (0.3% increase; 95% CI: -1.95%, 2.54%; p = 0.79) or rate of change over time (0.28% increase per year; 95% CI: -0.16%, 0.72%; p = 0.18). Subgroup analyses revealed a trend toward higher PD utilization post-PPS in for-profit and large-volume dialysis units. There was a significant increase in PD utilization in the post-PPS period in units with low PD experience in the pre-PPS period. Similar findings were seen for the late PD experience. CONCLUSION: PPS did not significantly increase the overall utilization of PD in patients initiating dialysis after allograft failure.
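The interrupted time series design above is commonly analyzed with segmented regression: an intercept, a baseline time trend, a level-change term at the policy start, and a post-policy trend change. A sketch using ordinary least squares (illustrative only; the study's actual model and covariates are not shown here):

```python
import numpy as np

def its_fit(years, outcome, start_year):
    """Segmented (interrupted time-series) regression:
    outcome ~ b0 + b1*time + b2*post + b3*time_since_start.
    b2 is the immediate level change at the policy start;
    b3 is the change in annual trend afterward."""
    yrs = np.asarray(years, dtype=float)
    t = yrs - yrs[0]                              # time since series start
    post = (yrs >= start_year).astype(float)      # post-policy indicator
    t_since = post * (yrs - start_year)           # years since policy start
    X = np.column_stack([np.ones_like(t), t, post, t_since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(outcome, float), rcond=None)
    return beta
```

Here the coefficient on `post` plays the role of the reported immediate change (0.3%) and the coefficient on `t_since` the change in yearly trend (0.28% per year).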
ABSTRACT
OBJECTIVE: In systemic lupus erythematosus, poor disease outcomes occur in young adults, patients identifying as Black or Hispanic, and socioeconomically disadvantaged patients. These identities and social factors differentially shape care access and quality, contributing to lupus health disparities in the US. Thus, our objective was to measure markers of care access and quality, including rheumatology visits (longitudinal care retention) and lupus-specific serology testing, by race and ethnicity, neighborhood disadvantage, and geographic context. METHODS: This cohort study used a geo-linked 20% national sample of young adult Medicare beneficiaries (ages 18-35) with lupus-coded encounters and a 1-year assessment period. Retention in lupus care required a rheumatology visit in each 6-month period, and serology testing required ≥1 complement or dsDNA antibody test within the year. Multivariable logistic regression models were fit for visit-based retention and serology testing to determine associations with race and ethnicity, neighborhood disadvantage, and geography. RESULTS: Among 1,036 young adults with lupus, 39% saw a rheumatologist every 6 months and 28% had serology testing. White beneficiaries from the least disadvantaged quintile of neighborhoods had higher visit-based retention than other beneficiaries (64% vs 30%-60%). Serology testing decreased with increasing neighborhood disadvantage quintile (aOR 0.80; 95% CI 0.71, 0.90) and in the Midwest (aOR 0.46; 95% CI 0.30, 0.71). CONCLUSION: Disparities in care, measured by rheumatology visits and serology testing, exist by neighborhood disadvantage, race and ethnicity, and region among young adults with lupus, despite uniform Medicare coverage. Findings support evaluating lupus care quality measures and their impact on US lupus outcomes.
ABSTRACT
BACKGROUND: Delayed graft function (DGF) after kidney transplantation is associated with adverse patient and allograft outcomes. A longer duration of DGF is predictive of worse graft outcomes compared to a shorter duration. Posttransplant serum β2-microglobulin (B2M) is associated with long-term graft outcomes, but its relationship with DGF recovery is unknown. METHODS: We included all kidney-only transplant recipients with DGF enrolled in the E-DGF trial. Duration of DGF was defined as the interval between the transplant and the last dialysis session. We analyzed the association of standardized serum creatinine (Scr) and B2M measured on postoperative days (POD) 1-7 with the recovery of DGF. RESULTS: A total of 97 recipients with DGF were included. The mean duration of DGF was 11.0 ± 11.2 days. Higher Scr was not associated with the duration of DGF in unadjusted or adjusted models. Higher standardized B2M, in contrast, was associated with a prolonged duration of DGF. This association remained in models adjusting for baseline characteristics from POD 2 (3.19 days longer, 95% CI: 0.46-5.93; p = 0.02) through Day 6 of DGF (4.97 days longer, 95% CI: 0.75-9.20; p = 0.02). There was minimal change in mean Scr (0.01 ± 0.10 mg/dL per day; p = 0.32), while B2M significantly decreased as the time to recovery approached (-0.14 ± 0.05 mg/L per day; p = 0.006), among recipients with DGF. CONCLUSION: B2M is more strongly associated with DGF recovery than Scr. Posttransplant B2M may be an important biomarker to monitor during DGF. TRIAL REGISTRATION: ClinicalTrials.gov identifier: NCT03864926.
ABSTRACT
BACKGROUND: While presumably less common with modern molecular diagnostic and imaging techniques, fever of unknown origin (FUO) remains a challenge in kidney transplant recipients (KTRs). Additionally, the impact of FUO on patient and graft survival is poorly described. METHODS: A cohort of adult KTRs transplanted between January 1, 1995 and December 31, 2018 was followed at the University of Wisconsin Hospital. Patients transplanted from January 1, 1995 to December 31, 2005 were included in the "early era"; patients transplanted from January 1, 2006 to December 31, 2018 were included in the "modern era". The primary objective was to describe the epidemiology and etiology of FUO diagnoses over time. Secondary outcomes included rejection, graft survival, and patient survival. RESULTS: There were 5590 kidney transplants at our center during the study window. FUO was identified in 323 patients, with an overall incidence rate of 0.8/100 person-years. Considering only the first 3 years after transplant, the incidence of FUO was significantly lower in the modern era than in the early era (incidence rate ratio [IRR]: 0.48; 95% CI: 0.35-0.63; p < .001). A total of 102 (31.9%) of 323 patients had an etiology determined within 90 days after FUO diagnosis: 100 were infectious, and two were malignancies. In the modern era, FUO remained significantly associated with rejection (HR = 44.1; 95% CI: 16.6-102; p < .001) but not graft failure (HR = 1.21; 95% CI: 0.68-2.18; p = .52), total graft loss (HR = 1.17; 95% CI: 0.85-1.62; p = .34), or death (HR = 1.17; 95% CI: 0.79-1.76; p = .43). CONCLUSIONS: FUO is less common in KTRs during the modern era. Our study suggests infection remains the most common etiology. FUO remains associated with a significant increase in the risk of rejection, warranting further inquiry into the management of immunosuppressive medications in solid organ transplant recipients in the setting of FUO.
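The incidence figures above are ordinary person-time quantities. As a quick illustration of the arithmetic (hypothetical helper names):

```python
def incidence_rate(events, person_years, per=100.0):
    """Incidence rate expressed per `per` person-years of follow-up."""
    return per * events / person_years

def incidence_rate_ratio(events_a, py_a, events_b, py_b):
    """Ratio of two incidence rates (e.g., modern era vs. early era)."""
    return (events_a / py_a) / (events_b / py_b)
```

For example, 8 events over 1000 person-years is a rate of 0.8 per 100 person-years, and halving that rate in a second group gives an IRR of about 0.5, comparable in spirit to the 0.48 reported above.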
ABSTRACT
Describing risk factors and outcomes in kidney transplant recipients with oxalate nephropathy (ON) may help elucidate the pathogenesis and guide treatment strategies. We used a large single-center database to identify patients with ON and categorized them into delayed graft function with ON (DGF-ON) and late ON. Incidence density sampling was used to select controls. A total of 37 ON cases were diagnosed between 1/2011 and 1/2021. DGF-ON (n = 13) was diagnosed in 1.05% of the DGF population. Pancreatic atrophy on imaging (36.4% vs. 2.9%; p = 0.002) and gastric bypass history (7.7% vs. 0%; p = 0.06) were more common in DGF-ON than in controls with DGF requiring biopsy but without evidence of ON. DGF-ON was not associated with worse graft survival (p = 0.98) or death-censored graft survival (p = 0.48). Late ON (n = 24) was diagnosed after a mean of 78.2 months. Late ON patients were older (mean age 55.1 vs. 48.4 years; p = 0.02) and more likely to be women (61.7% vs. 37.5%; p = 0.03), to have a gastric bypass history (8.3% vs. 0.8%; p = 0.02), and to have pancreatic atrophy on imaging (38.9% vs. 13.3%; p = 0.02). Late ON was associated with an increased risk of graft failure (HR 2.0; p = 0.07) and death-censored graft loss (HR 2.5; p = 0.10). We describe two phenotypes of ON after kidney transplantation: DGF-ON and late ON. To our knowledge, our study is the first to compare DGF-ON with DGF controls without ON. Although limited by small sample size, DGF-ON was not associated with adverse outcomes when compared with controls. Late ON predicted worse allograft outcomes.
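Incidence density sampling, used above to select controls, draws controls for each case from subjects still at risk at that case's diagnosis time. A heavily simplified sketch (single risk-set draw, no matching factors; hypothetical data layout):

```python
import random

def incidence_density_controls(case_time, cohort, k=2, seed=0):
    """Incidence-density sampling: draw up to k controls per case from
    the risk set, i.e., subjects whose follow-up extends beyond the
    case's diagnosis time.  `cohort` maps subject id -> follow-up time."""
    risk_set = [sid for sid, t in cohort.items() if t > case_time]
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    return rng.sample(risk_set, min(k, len(risk_set)))
```

In a real analysis the same subject can serve as a control for one case and later become a case, which this sampling scheme permits by construction.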
ABSTRACT
BACKGROUND: Cholesterol reduction is considered one mechanism through which cholesterol-lowering drugs, including statins, are associated with a reduced risk of aggressive prostate cancer. While prior cohort studies found positive associations between total cholesterol and more advanced stage and grade in White men, whether associations for total cholesterol, low (LDL)- and high (HDL)-density lipoprotein cholesterol, apolipoprotein B (LDL particle), apolipoprotein A1 (HDL particle), and triglycerides are similar for fatal prostate cancer and in Black men, who experience a disproportionate burden of total and fatal prostate cancer, is unknown. METHODS: We conducted a prospective study of 1553 Black and 5071 White cancer-free men attending visit 1 (1987-1989) of the Atherosclerosis Risk in Communities Study. A total of 885 incident prostate cancer cases were ascertained through 2015, and 128 prostate cancer deaths through 2018. We estimated multivariable-adjusted hazard ratios (HRs) of total and fatal prostate cancer per 1-standard-deviation increment and for tertiles (T1-T3) of time-updated lipid biomarkers, overall and in Black and White men. RESULTS: Greater total cholesterol (HR per 1 SD = 1.25; 95% CI = 1.00-1.58) and LDL cholesterol (HR per 1 SD = 1.26; 95% CI = 0.99-1.60) were associated with higher fatal prostate cancer risk in White men only. Apolipoprotein B was nonlinearly associated with fatal prostate cancer overall (T2 vs. T1: HR = 1.66; 95% CI = 1.05-2.64) and in Black men (HR = 3.59; 95% CI = 1.53-8.40) but not White men (HR = 1.13; 95% CI = 0.65-1.97). Tests for interaction by race were not statistically significant. CONCLUSIONS: These findings may improve the understanding of lipid metabolism in prostate carcinogenesis by disease aggressiveness and by race, while emphasizing the importance of cholesterol control.
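The "per 1-standard-deviation increment" hazard ratios above presuppose that each biomarker is standardized before model fitting; the transformation is simply a z-score. A minimal sketch:

```python
import statistics

def per_sd(values):
    """Standardize a biomarker so that a regression coefficient (and
    hence a hazard ratio) refers to a 1-standard-deviation increment."""
    m = statistics.fmean(values)
    s = statistics.stdev(values)  # sample standard deviation
    return [(v - m) / s for v in values]
```

A hazard ratio reported on this scale, e.g. 1.25 per 1 SD, means the hazard multiplies by 1.25 for each standard deviation of the (time-updated) biomarker.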
ABSTRACT
OBJECTIVE: Recent studies suggest young adults with systemic lupus erythematosus (SLE) have high 30-day readmission rates, which may necessitate tailored readmission reduction strategies. To aid in risk stratification for future strategies, we measured 30-day rehospitalization and mortality rates among Medicare beneficiaries with SLE and determined rehospitalization predictors by age. METHODS: In a 2014 20% national Medicare sample of hospitalizations, rehospitalization risk and mortality within 30 days of discharge were calculated for young (aged 18-35 yrs), middle-aged (aged 36-64 yrs), and older (aged 65+ yrs) beneficiaries with and without SLE. Multivariable generalized estimating equation models were used to predict rehospitalization rates among patients with SLE by age group using patient, hospital, and geographic factors. RESULTS: Among 1.39 million Medicare hospitalizations, 10,868 involved beneficiaries with SLE. Hospitalized young adult beneficiaries with SLE were more racially diverse, were living in more disadvantaged areas, and had more comorbidities than older beneficiaries with SLE and those without SLE. Thirty-day rehospitalization was 36% among young adult beneficiaries with SLE, 40% higher than among peers without SLE and 85% higher than among older beneficiaries with SLE. Longer length of stay and higher comorbidity risk score increased odds of rehospitalization in all age groups, whereas specific comorbid condition predictors and their effects varied. Our models, which incorporated neighborhood-level socioeconomic disadvantage, had moderate-to-good predictive value (C statistics 0.67-0.77), outperforming administrative data models lacking comprehensive social determinants in other conditions. CONCLUSION: Young adults with SLE on Medicare had very high 30-day rehospitalization at 36%. Considering socioeconomic disadvantage and comorbidities provided good prediction of rehospitalization risk, particularly in young adults.
Young beneficiaries with SLE with comorbidities should be a focus of programs aimed at reducing rehospitalizations.
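The C statistics quoted above (0.67-0.77) measure concordance between predicted risk and observed outcome. For a binary outcome such as 30-day rehospitalization, the C statistic can be computed by comparing every event/non-event pair (an O(n²) sketch for clarity; production implementations sort instead):

```python
def c_statistic(outcomes, predicted_risks):
    """Concordance (C) statistic for a binary outcome: the probability
    that a randomly chosen event carries a higher predicted risk than a
    randomly chosen non-event, counting ties as one half."""
    events = [p for y, p in zip(outcomes, predicted_risks) if y]
    nonevents = [p for y, p in zip(outcomes, predicted_risks) if not y]
    pairs = concordant = ties = 0
    for pe in events:
        for pn in nonevents:
            pairs += 1
            concordant += pe > pn
            ties += pe == pn
    return (concordant + 0.5 * ties) / pairs
```

A value of 0.5 is no better than chance; 0.67-0.77, as reported above, is conventionally read as moderate-to-good discrimination.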
ABSTRACT
Joint modeling and landmark modeling are two mainstream approaches to dynamic prediction in longitudinal studies, that is, the prediction of a clinical event using longitudinally measured predictor variables available up to the time of prediction. Understanding which approach produces more accurate predictions is an important question both for the methodological research field and for practical users. Few previous studies have addressed this topic, and the majority of their results seemed to favor joint modeling. However, these studies were conducted in scenarios where the data were simulated from the joint models, partly due to the widely recognized methodological difficulty of whether there exists a general joint distribution of longitudinal and survival data such that the landmark models, which consist of infinitely many working regression models for survival, hold simultaneously. As a result, the landmark models always worked under misspecification, which caused difficulty in interpreting the comparison. In this paper, we solve this problem by using a novel algorithm to generate longitudinal and survival data that satisfy the working assumptions of the landmark models. This innovation makes possible a "fair" comparison of joint modeling and landmark modeling in terms of model specification. Our simulation results demonstrate that the relative performance of these two modeling approaches depends on the data settings, and one does not always dominate the other in terms of prediction accuracy. These findings stress the importance of methodological development for both approaches. The related methodology is illustrated with a kidney transplantation dataset.
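Landmark modeling, one of the two approaches compared above, refits a survival model at a chosen landmark time using only subjects still event-free at that time and their most recent biomarker values, with the clock reset to the landmark. A minimal data-preparation sketch (hypothetical record layout; the paper's working models are far more general):

```python
def landmark_dataset(records, landmark):
    """Build a landmark dataset: keep subjects still at risk at the
    landmark time, carry forward the last biomarker value observed at
    or before the landmark, and reset the time origin.
    `records` maps subject id -> dict with 'time' (event/censoring
    time), 'event' (1/0), and 'obs' (list of (t, value) readings)."""
    out = []
    for sid, r in records.items():
        if r['time'] <= landmark:
            continue  # event or censoring before the landmark: excluded
        past = [v for (t, v) in r['obs'] if t <= landmark]
        if not past:
            continue  # no biomarker history available at the landmark
        out.append({'id': sid, 'z': past[-1],
                    'time': r['time'] - landmark, 'event': r['event']})
    return out
```

A separate survival regression is then fit to each such dataset, one per landmark time, which is why the landmark approach implicitly involves infinitely many working models.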
ABSTRACT
INTRODUCTION: Serum albumin is an indicator of overall health status, but it remains unclear how pre-transplant hypoalbuminemia is associated with early post-transplant outcomes. METHODS: This study included all adult kidney transplant recipients (KTRs) at our center from 01/01/2001-12/31/2017 with serum albumin measured within 30 days before transplantation. KTRs were grouped based on pretransplant albumin level: normal (≥4.0 g/dL), mild (3.5 to <4.0 g/dL), moderate (3.0 to <3.5 g/dL), or severe hypoalbuminemia (<3.0 g/dL). Outcomes of interest included: length of hospital stay (LOS), readmission within 30 days, delayed graft function (DGF), and re-operation related to post-transplant surgical complications. We also analyzed rejection, graft failure, and death within 6 months post-transplant. RESULTS: A total of 2807 KTRs were included: 43.6% had normal serum albumin, 35.3% mild, 16.6% moderate, and 4.5% severe hypoalbuminemia. Mild and moderate hypoalbuminemia were associated with a shorter LOS, by 1.22 (p < 0.001) and 0.80 days (p = 0.01), respectively, compared to normal albumin. Moderate (HR: 0.58; 95% CI: 0.37-0.91; p = 0.02) and severe hypoalbuminemia (HR: 0.21; 95% CI: 0.07-0.68; p = 0.01) were associated with significantly lower rates of acute rejection within 6 months post-transplant. CONCLUSION: Patients with pre-transplant hypoalbuminemia have post-transplant outcomes similar to those with normal serum albumin, but with a lower risk of acute rejection that varies with the degree of hypoalbuminemia.
ABSTRACT
PURPOSE: Studies conducted in the northern United States found that cytomegalovirus (CMV) disease after liver transplantation follows a seasonal pattern, with increased incidence in fall and winter. This has not been evaluated in kidney transplant recipients. Improved understanding of CMV seasonality may help guide use of preventative therapies. METHODS: We evaluated adult patients receiving a kidney transplant at our center in Wisconsin from January 1, 1995 to December 31, 2018. A CMV event was defined as quantifiable viral replication with clinical signs or symptoms suspicious for CMV per current consensus recommendations. Seasons were divided as follows: winter (December-February), spring (March-May), summer (June-August), and fall (September-November). The primary objective was to evaluate the annual distribution of CMV disease and determine whether this differed by season. RESULTS: There were 6151 kidney transplants in the study period. A total of 913 patients had 1492 episodes of CMV. Median time from transplant to first detection was 5.51 months (interquartile range [IQR] 2.87-11.7). The observed overall incidence exceeded the expected incidence in winter (+0.7%), spring (+5.5%), and fall (+3.4%) and was less than expected in summer (-9.5%) (p = .18). The incidence of CMV during summer, however, was 21% less than expected (p = .001) in recipients who were CMV positive (R+) at the time of transplantation. No such difference was observed in CMV negative recipients (R-; p = .58). CONCLUSION: CMV after kidney transplant appears to be less common during the summer season in patients who were R+ at transplant but does not follow seasonal variation in R- patients. Reasons for this are unclear but are likely related to CMV-specific cell-mediated immunity. These findings may have clinical implications, particularly for the use of non-pharmacologic strategies to improve response to antiviral therapy.
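The observed-versus-expected seasonal comparison above allocates expected events to each season in proportion to follow-up time and reports the percent deviation of observed counts. A sketch of that arithmetic (hypothetical helper; the study's analysis also involved significance testing):

```python
def seasonal_excess(counts_by_season, follow_up_by_season):
    """Observed-vs-expected seasonal incidence: expected counts are
    allocated in proportion to person-time within each season; returns
    the percent deviation of observed from expected per season."""
    total_events = sum(counts_by_season.values())
    total_fu = sum(follow_up_by_season.values())
    excess = {}
    for season, observed in counts_by_season.items():
        expected = total_events * follow_up_by_season[season] / total_fu
        excess[season] = 100.0 * (observed - expected) / expected
    return excess
```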
ABSTRACT
RATIONALE & OBJECTIVE: Evidence is mixed regarding the optimal choice of the first permanent vascular access for elderly patients receiving hemodialysis (HD). Lacking data from randomized controlled trials, we used a target trial emulation approach to compare arteriovenous fistula (AVF) versus arteriovenous graft (AVG) creation among elderly patients receiving HD. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Elderly patients included in the US Renal Data System who initiated HD with a catheter and had an AVF or AVG created within 6 months of starting HD. EXPOSURE: Creation of an AVF versus an AVG as the incident arteriovenous access. OUTCOMES: All-cause mortality, all-cause and cause-specific hospitalization, and sepsis. ANALYTICAL APPROACH: Target trial emulation approach, high-dimensional propensity score and inverse probability of treatment weighting, and instrumental variable analysis using the proclivity of the operating physician to create a fistula as the instrumental variable. RESULTS: A total of 19,867 patients were included, with 80.1% receiving an AVF and 19.9% an AVG. In unweighted analysis, AVF creation was associated with significantly lower risks of mortality and hospitalization, especially within 6 months after vascular access creation. In inverse probability of treatment weighting analysis, AVF creation was associated with lower incidences of mortality and hospitalization within 6 months after creation (hazard ratios of 0.82 [95% CI, 0.75-0.91] and 0.82 [95% CI, 0.78-0.87] for mortality and all-cause hospitalization, respectively), but not between 6 months and 3 years after access creation. No association between AVF creation and mortality, sepsis, or all-cause, cardiovascular disease-related, or infection-related hospitalization was found in instrumental variable analyses. However, AVF creation was associated with a lower risk of access-related hospitalization not due to infection. 
LIMITATIONS: Potential for unmeasured confounding, analyses limited to elderly patients, and absence of data on actual access use during follow-up. CONCLUSIONS: In these analyses of observational data emulating a target randomized controlled trial, the type of initial arteriovenous access created was not associated with the risks of mortality, sepsis, or all-cause, cardiovascular disease-related, or infection-related hospitalization among elderly patients who initiated HD with a catheter.
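The inverse probability of treatment weighting used above reweights each patient by the inverse of the probability of the treatment actually received, so that the weighted AVF and AVG groups resemble each other on measured covariates. A minimal sketch given precomputed propensity scores (hypothetical helper; the study used high-dimensional propensity scores):

```python
def iptw_weights(treated, propensity, stabilized=True):
    """Inverse-probability-of-treatment weights.  Unstabilized:
    1/ps for treated, 1/(1-ps) for controls.  Stabilized weights put
    the marginal treatment probability in the numerator, which reduces
    weight variability without changing the target population."""
    p_treat = sum(treated) / len(treated)  # marginal P(treated)
    weights = []
    for a, ps in zip(treated, propensity):
        if a:
            weights.append((p_treat if stabilized else 1.0) / ps)
        else:
            weights.append(((1 - p_treat) if stabilized else 1.0) / (1 - ps))
    return weights
```

The weighted sample is then analyzed with a (weighted) Cox model; hazard ratios such as the 0.82 reported above come from that weighted fit.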
ABSTRACT
Frailty is commonly assessed during kidney transplant recipient (KTR) evaluation. However, individual frailty components may have varying impact on post-transplant outcomes. In this single-center study of 825 KTRs, we determined the association between the individual components of a modified Fried frailty score and delayed graft function (DGF), early hospital readmission (EHR), cardiovascular (CV) events, acute rejection (AR), death-censored graft failure (DCGF), and death. A summed frailty score ≥3 was significantly associated with EHR (aOR = 3.62; 95% CI: 1.21-10.80). Among individual components, only grip strength was significantly associated with EHR (aOR = 1.54; 95% CI: 1.03-2.31). The addition of grip strength to a model with the other four components resulted in a Net Reclassification Improvement (NRI) of 20.51% (p = .01). Similarly, only grip strength was significantly associated with CV events (aOR = 1.79; 95% CI: 1.12-2.86); adding grip strength to a model with the other four components resulted in an NRI of 27.37% (p = .006). No other frailty components were associated with the outcomes of interest. Based on our findings, handgrip strength may be an important component of frailty assessment, mainly for predicting early readmission and cardiovascular events post-transplant.
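The Net Reclassification Improvement used above rewards a new model for moving predicted risk up in patients who had the event and down in those who did not. A category-free (continuous) NRI sketch (illustrative only; the study's exact NRI variant is not specified here):

```python
def nri(events, p_old, p_new):
    """Category-free Net Reclassification Improvement: among events,
    credit predicted risk moving up under the new model; among
    non-events, credit risk moving down.  Ranges from -2 to +2."""
    up_e = down_e = up_n = down_n = n_e = n_n = 0
    for y, po, pn in zip(events, p_old, p_new):
        if y:
            n_e += 1
            up_e += pn > po
            down_e += pn < po
        else:
            n_n += 1
            up_n += pn > po
            down_n += pn < po
    return (up_e - down_e) / n_e + (down_n - up_n) / n_n
```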
ABSTRACT
BACKGROUND: Access flow dysfunction, often associated with stenosis, is a common problem in hemodialysis access and may result in progression to thrombosis. Timely identification of accesses in need of evaluation is critical to preserving a functioning access. We hypothesized that a risk score using measurements obtained from the Vasc-Alert surveillance device could be used to predict subsequent interventions. METHODS: Measurements of five factors over the preceding 28 days from 1.46 million hemodialysis treatments (6163 patients) were used to develop a score associated with interventions over the subsequent 60 days. The score was validated in a separate dataset of 298,620 treatments (2641 patients). RESULTS: Interventions in arteriovenous fistulae (AVF; n = 4125) were much more common in those with the highest score (36.2%) than in those with the lowest score (11.0%). The score was also strongly associated with interventions in patients with an arteriovenous graft (AVG; n = 2038; 43.2% vs. 21.1%). There was excellent agreement in the validation dataset for AVF (OR = 2.67 comparing the highest to the lowest score) and good agreement for AVG (OR = 1.92). CONCLUSIONS: This simple risk score based on surveillance data may be useful for prioritizing patients for physical examination and potentially early referral for intervention.
Subjects
Arteriovenous Shunt, Surgical , Arteriovenous Shunt, Surgical/adverse effects , Constriction, Pathologic/etiology , Graft Occlusion, Vascular/diagnosis , Graft Occlusion, Vascular/etiology , Graft Occlusion, Vascular/therapy , Humans , Renal Dialysis/adverse effects , Risk Factors , Treatment Outcome , Vascular Patency
ABSTRACT
RATIONALE & OBJECTIVE: Creation of an arteriovenous fistula (AVF), compared with placement of an arteriovenous graft (AVG), is associated with longer initial catheter dependence after starting hemodialysis (HD) but longer access survival and lower long-term catheter dependence. The extent of these potential long-term benefits in elderly patients is unknown. We assessed catheter dependence after AVF or AVG placement among elderly patients who initiated HD without a permanent access in place. STUDY DESIGN: Retrospective cohort study. SETTING & PARTICIPANTS: Patients ≥67 years of age identified in the US Renal Data System who had a first AVF (n = 14,532) or AVG (n = 3,391) placed within 1 year after HD initiation between May 2012 and May 2017. EXPOSURE: AVF versus AVG placement in the first year of HD. OUTCOME: Catheter dependence after AVF or AVG placement, assessed using CROWNWeb data. ANALYTICAL APPROACH: Generalized estimating equations and negative binomial regression for catheter use over time, and Cox proportional hazards models for mortality. RESULTS: AVF creation, compared with AVG placement, was associated with greater catheter dependence at 1 month (95.6% vs 92.5%) and 3 months (82.8% vs 41.2%), but lower catheter dependence at 12 months (14.2% vs 15.8%) and 36 months (8.2% vs 15.0%). AVF creation nonetheless remained significantly associated with greater cumulative catheter-dependent days (80.1 vs 54.6 days per person-year) and a lower proportion of catheter-free survival time (78.1% vs 85.1%) after 3 years of follow-up. LIMITATIONS: Potential for unmeasured confounding; analyses limited to elderly patients. CONCLUSIONS: Creation of an AVF was associated with significantly greater cumulative catheter dependence than placement of an AVG in an elderly population initiating HD without a permanent access.
Because the long-term catheter-dependence benefits of an AVF are not realized by many elderly patients, specific patient characteristics should be considered when making decisions regarding vascular access.
Subjects
Arteriovenous Shunt, Surgical/adverse effects , Catheters , Graft Occlusion, Vascular/epidemiology , Kidney Failure, Chronic/therapy , Renal Dialysis/adverse effects , Risk Assessment/methods , Age Factors , Aged , Female , Follow-Up Studies , Graft Occlusion, Vascular/etiology , Humans , Incidence , Male , Retrospective Studies , Risk Factors , Survival Rate/trends , Time Factors , Treatment Outcome , United States/epidemiology
ABSTRACT
Anti-glomerular basement membrane (GBM) disease causes rapidly progressive glomerulonephritis and end-stage kidney disease (ESKD). Studies of post-transplant outcomes in patients with ESKD due to anti-GBM disease in the United States are lacking. To better characterize outcomes of transplant recipients with a history of anti-GBM disease, we compared patient survival and graft survival among recipients with anti-GBM disease with those among recipients with IgA nephropathy (IgAN) at a single center in the United States. We analyzed patient survival, graft survival, disease recurrence, and malignancy rates for kidney transplant recipients with ESKD due to biopsy-proven anti-GBM disease who underwent kidney transplantation at our center between 1994 and 2015. Twenty-six patients with biopsy-proven anti-GBM disease and 314 patients with IgAN underwent kidney transplantation during this period. The incidence of graft loss was 6.2 per 100 person-years for anti-GBM disease, similar to that for IgAN (4.08 per 100 person-years, p = .09). Patient mortality for anti-GBM disease was 0.03 per 100 person-years, similar to that for IgAN (0.02 per 100 person-years, p = .12). Disease recurrence occurred in one of the 26 anti-GBM patients. Four of 26 patients (15%) developed malignancy, most commonly skin cancer. Long-term graft and patient survival after kidney transplantation for patients with ESKD due to anti-GBM disease was similar to that for IgAN.
Subjects
Anti-Glomerular Basement Membrane Disease , Glomerulonephritis, IGA , Kidney Failure, Chronic , Kidney Transplantation , Anti-Glomerular Basement Membrane Disease/etiology , Graft Survival , Humans , Kidney , Kidney Failure, Chronic/etiology , Kidney Failure, Chronic/surgery , Neoplasm Recurrence, Local , Recurrence , Transplant Recipients
ABSTRACT
Hypomagnesemia is common in kidney transplant recipients (KTRs). We sought to explore the relationship between serum magnesium (Mg) and outcomes in KTRs, as Mg may be associated with mortality and thus may represent a potential intervention target to improve outcomes. We followed KTRs transplanted between 01/2000 and 6/2016 at a large US transplant center from 6 months post-transplant until graft failure, death, or loss to follow-up. Treating Mg as a time-dependent variable, we evaluated associations between Mg and outcomes at any time after 6 months post-transplant. A total of 3680 KTRs with 50,413 Mg measurements met the inclusion criteria, and 657 deaths occurred over a median follow-up of 5.1 years. Compared with an Mg of 1.5-1.8 mg/dl, both lower (HR 1.17, 95% confidence interval (CI): 1.07-1.28) and higher (HR 1.16, 95% CI: 1.09-1.23) Mg levels were associated with greater risk of mortality. Similar U-shaped associations were observed for cardiovascular disease-related mortality (HR for Mg ≤1.5 mg/dl: 1.31; CI: 1.03-1.68) and infection-related mortality (HR for Mg ≤1.5 mg/dl: 1.28; CI: 1.09-1.51), although the relationships for Mg >1.8 mg/dl were not statistically significant. Mg exhibits a U-shaped association with mortality in KTRs, with levels between 1.5 and 1.8 mg/dl associated with the lowest risk.
Subjects
Kidney Transplantation , Magnesium , Cohort Studies , Humans , Risk Factors , Transplant Recipients
ABSTRACT
Delayed graft function (DGF) is a common complication associated with significant untoward effects in kidney-alone transplantation. The incidence and outcomes following kidney delayed graft function (K-DGF) among patients undergoing simultaneous pancreas-kidney (SPK) transplantation are less certain. We analyzed SPK recipients transplanted at our center between January 1994 and December 2017. A total of 632 recipients fulfilled the selection criteria, including 69 (11%) with K-DGF and 563 without. The incidence of K-DGF was significantly higher in recipients of organs from older donors and donation after circulatory death (DCD). The presence of K-DGF was significantly associated with an increased risk of pancreas graft failure during the first 90 days (n = 9, incidence rate [IR] 2.45/100 person-months), but not with late pancreas failure (n = 32, IR 0.84/100 person-months), kidney graft failure, or patient death. Although DCD was associated with K-DGF, it was not associated with either pancreas (hazard ratio [HR] 0.91, 95% CI 0.58-1.44, P = .69) or kidney (HR 1.09, 95% CI 0.66-1.82, P = .74) graft failure after adjustment for potential confounders. We found K-DGF to be a significant risk factor for pancreas graft failure but not kidney graft failure, with the major risk period being early (<90 days) posttransplant, and the major donor risk factor being older donor age.
Subjects
Kidney Transplantation , Allografts , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Pancreas , Retrospective Studies , Risk Factors , Tissue Donors
ABSTRACT
Third-party vascular allografts (VAs) are an invaluable resource in kidney and pancreas transplantation when vascular reconstruction is needed and additional vessels from the organ donor are not available. We report the largest single-center experience to date of VA use, at a high-volume U.S. transplant center. Over a 7-year period, VAs were used for vascular reconstruction of 65 kidneys and 5 pancreases in 69 recipients. The renal vein required reconstruction more often with right kidney transplantation (72.5% vs 27.5%, P < .001), and the renal artery required reconstruction more often with left kidney transplantation (67.6% vs 32.4%, P = .003). Eleven patients (15.9%) developed anti-VA de novo HLA donor-specific antibodies (dnDSAs) at a median of 19.0 months after transplantation. A higher number of HLA mismatches between the VA donor and the recipient and the development of anti-organ allograft dnDSAs were significant predictors of anti-VA dnDSA development. Patients with anti-VA dnDSAs had a higher rate of organ allograft rejection (45.4% vs 13.8%, P = .03) than those without, but there was no significant difference in the incidence of vascular complications or graft outcomes. VAs can help circumvent challenging surgical situations. Anti-VA dnDSAs do not adversely affect organ allograft outcomes; however, they can contribute to HLA sensitization in recipients.