1.
medRxiv ; 2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39252893

ABSTRACT

Background: Studies of middle-aged individuals or those with cognitive or cardiovascular impairments have established that intensive blood pressure (BP) control reduces the risk of cognitive decline. However, uncertainty remains about differential effects of antihypertensive medication (AHM) classes on this risk, independent of BP-lowering efficacy, particularly in community-dwelling hypertensive older adults. Methods: A post-hoc analysis of the ASPREE study, a randomized trial of low-dose aspirin in adults aged 70+ years (65+ for US minorities) without baseline dementia, who were followed for two years post-trial. Cox proportional-hazards regression models were used to estimate associations between baseline and time-varying AHM exposure and incident dementia (an adjudicated primary trial endpoint) in participants with baseline hypertension. Subgroup analyses included prespecified factors, APOE ε4 carrier status, and monotherapy AHM use. Results: Most hypertensive participants (9,843/13,916; 70.7%) used AHMs. Overall, 'any' AHM use was not associated with lower incident dementia risk compared with untreated participants (HR 0.84, 95% CI 0.70-1.02, p=0.08), but risk was decreased when angiotensin receptor blockers (ARBs) were included (HR 0.73, 95% CI 0.59-0.92, p=0.007). ARBs and β-blockers decreased dementia risk, whereas angiotensin-converting enzyme inhibitors (ACEIs) and diuretics increased risk. There was no association between RAS-modulating or blood-brain-barrier-crossing AHMs and dementia risk. Conclusions: Overall, AHM exposure in hypertensive older adults was not associated with decreased dementia risk; however, specific AHM classes were, with the direction of risk determined by class: ARBs and β-blockers were superior to ACEIs and other classes in decreasing risk. Our findings emphasize the importance of considering effects beyond BP-lowering efficacy when choosing an AHM in older adults.
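The key analytical step above is a Cox model with antihypertensive exposure entered as a time-varying covariate. The sketch below shows, in Python with the lifelines package, how such a model is typically fit on long-format data; the column names, synthetic data, and effect sizes are illustrative assumptions, not the ASPREE dataset or the authors' actual specification.

```python
# Illustrative sketch only: a Cox model with a time-varying binary exposure,
# fit on synthetic long-format data (one row per participant per interval).
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for pid in range(300):
    switch = rng.uniform(0.5, 3.0)          # hypothetical time an ARB is started
    end = rng.uniform(3.5, 7.0)             # end of follow-up (years)
    on_arb = int(rng.random() < 0.5)        # whether an ARB is ever started
    age = int(rng.integers(70, 90))
    # lower toy event probability while exposed, to mimic a protective HR
    event = int(rng.random() < (0.10 if on_arb else 0.18))
    rows.append((pid, 0.0, switch, 0, age, 0))
    rows.append((pid, switch, end, on_arb, age, event))

df = pd.DataFrame(rows, columns=["id", "start", "stop", "arb", "age", "dementia"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="dementia", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratios (exp(coef)) for 'arb' and 'age'
```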

2.
Clin Kidney J ; 17(8): sfae217, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39139183

ABSTRACT

Background: Very low calorie diets (VLCDs) are an obesity treatment option in the general population, but their efficacy and safety in patients on haemodialysis (HD) are unknown. Methods: Prospective single-arm study of VLCD in haemodialysis patients. All participants received 2.5-3.3 MJ/day for 12 weeks. Weekly assessment of VLCD, pre- and post-dialysis weight, inter-dialytic weight gain, and blood electrolytes occurred for the first 4 weeks, then fortnightly for another 8 weeks. Linear mixed models compared the change in weight over time as well as biochemical outcomes including potassium. Results: Twenty-two participants [nine home HD (HHD) and 13 satellite HD (SHD)] enrolled, with 19 completing the 12-week intervention. Mean post-dialysis weight declined from 121.1 kg at baseline to 109.9 kg at week 12, an average decline of 0.88 kg per week (95% CI 0.71, 1.05, P < .001), with a 12-week mean percentage weight loss of 9.3% (SD 3.5). Mean post-dialysis body mass index declined from 40.9 kg/m2 at baseline to 37.1 kg/m2 at week 12 (95% CI 0.25, 0.35, P < .001). Serum potassium rose from week 1 to 3, stabilized during weeks 4 to 6, and fell from week 8, returning to near baseline by week 12. Six of the nine (66.6%) HHD participants and seven of the 13 (70%) SHD participants had at least one episode of hyperkalaemia (K > 6 mmol/l). There were no clinically significant changes in serum sodium, corrected calcium, or phosphate levels during the study. Conclusion: VLCD with dietitian supervision was effective in producing significant weight reduction, with an acceptable safety profile, in patients treated with haemodialysis.
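A linear mixed model with a random intercept per participant is one standard way to estimate the average weekly weight change from repeated measurements like these. The sketch below uses statsmodels on synthetic data; the visit schedule, column names, and effect size are assumptions for illustration, not the study data.

```python
# Illustrative sketch: random-intercept mixed model for repeated weight measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_patients = 22
weeks = np.array([0, 1, 2, 3, 4, 6, 8, 10, 12])
ids = np.repeat(np.arange(n_patients), weeks.size)
week = np.tile(weeks, n_patients)
baseline = rng.normal(121, 15, n_patients)[ids]
weight = baseline - 0.9 * week + rng.normal(0, 1.5, ids.size)  # toy ~0.9 kg/week decline

df = pd.DataFrame({"id": ids, "week": week, "weight": weight})
fit = smf.mixedlm("weight ~ week", df, groups=df["id"]).fit()
print(fit.summary())  # the 'week' coefficient is the estimated mean weekly change
```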

3.
Article in English | MEDLINE | ID: mdl-39185614

ABSTRACT

High-quality randomized trial evidence is lacking on whether low-dose aspirin exerts significant effects on blood pressure (BP) in older adults. The authors assessed longitudinal BP changes in participants enrolled in ASPirin in Reducing Events in the Elderly (ASPREE), a randomized, placebo-controlled trial of 100 mg daily aspirin in 19 114 community-dwelling Australian and U.S. adults without cardiovascular disease (CVD), dementia, or independence-limiting physical disability. Participants' BP was recorded at baseline and annual study visits and managed by their usual care provider. BP trajectories for aspirin versus placebo during 4.7 years of follow-up were examined for systolic and diastolic BP separately, using linear mixed models to account for between- and within-individual variability in BP. Subgroup analyses were also explored with inclusion of interaction terms in the models. The difference in mean change in systolic BP between aspirin and placebo (aspirin minus placebo) during study follow-up was -0.03 mm Hg (95% confidence interval [CI]: -0.13, 0.07; p = .541), while the mean difference for change in diastolic BP was -0.05 mm Hg (95% CI: -0.11, 0.01; p = .094). These small, non-significant differences in BP change between the aspirin and placebo groups were consistent across baseline levels of BP and antihypertensive treatment status (treated/untreated). Likewise, subgroups of age, sex, chronic kidney disease, diabetes, and frailty revealed no interaction effect between subgroup, aspirin treatment, and time. Interval-censored Cox proportional hazards regression showed no difference in rates of incident treated hypertension between aspirin- and placebo-treated participants. The authors conclude that daily low-dose aspirin does not significantly affect BP in older adults when managed by usual care.
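The comparison of BP trajectories between arms rests on a treatment-by-time interaction in a linear mixed model. Below is a minimal sketch of that structure in statsmodels with simulated annual visits; the variable names and the null treatment effect are assumptions, not the trial data or the authors' exact model.

```python
# Illustrative sketch: treatment x time interaction for repeated BP measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n, visits = 500, 5
ids = np.repeat(np.arange(n), visits)
year = np.tile(np.arange(visits), n)
aspirin = rng.integers(0, 2, n)[ids]            # randomized arm (0/1)
# no true interaction simulated: both arms drift identically over time
sbp = 139 + rng.normal(0, 8, n)[ids] - 0.2 * year + rng.normal(0, 6, ids.size)

df = pd.DataFrame({"id": ids, "year": year, "aspirin": aspirin, "sbp": sbp})
fit = smf.mixedlm("sbp ~ year * aspirin", df, groups=df["id"]).fit()
print(fit.summary())  # 'year:aspirin' row = between-arm difference in annual SBP change
```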

4.
Clin Transl Immunology ; 13(7): e1523, 2024.
Article in English | MEDLINE | ID: mdl-39055736

ABSTRACT

Objectives: Despite vaccination strategies, people with chronic kidney disease, particularly kidney transplant recipients (KTRs), remained at high risk of poor COVID-19 outcomes. We assessed serological responses to the three-dose COVID-19 vaccine schedule in KTRs and people on dialysis, as well as predictors of seroresponse and the relationship between responses and breakthrough infection. Methods: Plasma from 30 KTRs and 17 people receiving dialysis was tested for anti-Spike receptor binding domain (RBD) IgG and neutralising antibodies (NAb) to the ancestral strain and the Omicron BA.2 variant after Doses 2 and 3 of vaccination. Results: After three doses, KTRs achieved lower anti-Spike RBD IgG levels (P < 0.001) and NAb titres than people receiving dialysis (P = 0.002). Seropositive cross-reactive Omicron neutralisation levels were achieved in 11/27 (40.7%) KTRs and 11/14 (78.6%) dialysis recipients. ChAdOx1/viral-vector vaccine type, higher mycophenolate dose (>1 g per day) and lower absolute B-cell counts predicted poor serological responses in KTRs; ChAdOx1 vaccine type and higher monocyte counts were negative predictors in dialysis recipients. Among ancestral NAb seroresponders, higher NAb levels correlated positively with Omicron neutralisation (R = 0.9, P < 0.001). More KTRs contracted SARS-CoV-2 infection (14/30; 47%) than dialysis recipients (5/17; 29%) and had more severe disease. Those with breakthrough infections had significantly lower median interdose incremental changes in anti-Spike RBD IgG and ancestral NAb titres. Conclusion: Serological responses to COVID-19 vaccines in KTRs lag behind those of their dialysis counterparts. KTRs remained at high risk of breakthrough infection after their primary vaccination schedule, underlining their need for booster doses, strict infection prevention measures and close surveillance.
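The reported relationship between ancestral neutralising antibody levels and Omicron neutralisation is a simple pairwise correlation. The abstract does not state which correlation coefficient was used; the sketch below uses a Spearman rank correlation on synthetic titres as one plausible choice, with made-up sample values.

```python
# Illustrative sketch: rank correlation between two antibody measures (synthetic data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
ancestral_nab = rng.lognormal(mean=5.0, sigma=1.0, size=27)
omicron_nab = ancestral_nab * rng.lognormal(mean=-1.5, sigma=0.4, size=27)

rho, p = spearmanr(ancestral_nab, omicron_nab)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```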

6.
Clin Kidney J ; 17(5): sfae103, 2024 May.
Article in English | MEDLINE | ID: mdl-38938326

ABSTRACT

Background: Worldwide, most people requiring kidney replacement therapy receive haemodialysis (HD) three times per week. Greater HD time and/or frequency may improve survival, but implementation requires understanding potential benefits across the range of patients. Methods: Using data from the Australia and New Zealand Dialysis and Transplant Registry, we assessed whether quotidian HD (defined as >3 sessions/week and/or >5 h/session) was associated with reduced mortality in adult patients. The primary outcome of all-cause mortality was analysed by a time-varying Cox proportional hazards model with quotidian HD as the exposure of interest. Results: Of 24 138 people who received HD between 2011 and 2019, 2632 (10.9%) received quotidian HD at some stage. These patients were younger, more likely male and more likely to receive HD at home. Overall, quotidian versus standard HD was associated with a decreased risk for all-cause mortality {crude hazard ratio [HR] 0.50 [95% confidence interval (CI) 0.45-0.56]}, but an interaction between quotidian HD and age was identified (P = .005). Stratified by age groups and splitting follow-up time where proportional hazards were violated, the corresponding HR compared with standard HD was 2.43 (95% CI 1.56-3.79) for people >75 years of age in the first year of quotidian HD, 1.52 (95% CI 0.89-2.58) for 1-3 years and 0.95 (95% CI 0.51-1.78) for ≥3 years. There was no significant survival advantage in younger people. Conclusions: Although quotidian HD conferred survival benefit in crude analyses, people ≥75 years of age had greater mortality with quotidian HD than standard HD. The mortality benefit in younger people was attenuated when adjusted for known confounders.

7.
Transplant Direct ; 10(7): e1659, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38881745

ABSTRACT

Background: Mycophenolate dose reduction (MDR) is associated with acute rejection and transplant failure in kidney transplant recipients (KTRs). The optimal dose to prevent rejection and reduce complications remains poorly defined in tacrolimus-based regimens. Methods: We assessed adult KTRs from 2005 to 2017 initiated on mycophenolate mofetil 2 g/d, tacrolimus, and prednisolone from the Australia and New Zealand Dialysis and Transplant Registry. KTRs with rejection within the first 30 d posttransplant were excluded. The primary outcome was time to first rejection between 30 d and 2 y posttransplant. Mycophenolate dose was modeled as a time-varying covariate using Cox proportional hazards regression. Secondary outcomes included assessment of early MDR to <1.5 g/d within the first 6 mo posttransplant and subsequent patient and death-censored graft survival. Results: In the primary analysis, 3590 KTRs were included. Compared with mycophenolate dose of ≥2 g/d, both 1.0-<1.5 and <1 g/d were associated with an increased risk of rejection during the 2 y posttransplant (hazard ratio [HR] 1.67; 95% confidence interval [CI], 1.29-2.16; P < 0.001 and HR 2.06; 95% CI, 1.36-3.13; P = 0.001, respectively) but not 1.5-<2 g/d (HR 1.20; 95% CI, 0.94-1.53; P = 0.14). Early MDR to <1.5 g/d occurred in 45.3% of KTRs and was an independent risk factor for death-censored graft failure (HR 1.32; 95% CI, 1.05-1.66; P = 0.016) but not death (HR 1.18; 95% CI, 0.97-1.44; P = 0.10), during a median follow-up of 5.0 (interquartile range, 2.6-8.5) y. Conclusions: Early MDR was a risk factor for subsequent rejection and graft failure in KTRs receiving contemporary tacrolimus-based regimens.

8.
Clin Kidney J ; 17(3): sfad245, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38468698

ABSTRACT

Background: Diabetes mellitus (DM) is associated with a greater risk of mortality in kidney transplant patients, primarily driven by a greater risk of cardiovascular disease (CVD)-related mortality. However, the associations between diabetes status at time of first allograft loss and mortality on dialysis remain unknown. Methods: All patients with failed first kidney allografts transplanted in Australia and New Zealand between 2000 and 2020 were included. The associations between diabetes status at first allograft loss, all-cause and cause-specific mortality were examined using competing risk analyses, separating patients with diabetes into those with pre-transplant DM or post-transplant diabetes mellitus (PTDM). Results: Of 3782 patients with a median (IQR) follow-up duration of 2.7 (1.1-5.4) years, 539 (14%) and 390 (10%) patients had pre-transplant DM or developed PTDM, respectively. In the follow-up period, 1336 (35%) patients died, with 424 (32%), 264 (20%) and 199 (15%) deaths attributed to CVD, dialysis withdrawal and infection, respectively. Compared to patients without DM, the adjusted subdistribution HRs (95% CI) for pre-transplant DM and PTDM for all-cause mortality on dialysis were 1.47 (1.17-1.84) and 1.47 (1.23-1.76), respectively; for CVD-related mortality were 0.81 (0.51-1.29) and 1.02 (0.70-1.47), respectively; for infection-related mortality were 1.84 (1.02-3.35) and 2.70 (1.73-4.20), respectively; and for dialysis withdrawal-related mortality were 1.71 (1.05-2.77) and 1.51 (1.02-2.22), respectively. Conclusions: Patients with diabetes at the time of kidney allograft loss have a significant survival disadvantage, with the excess mortality risk attributed to infection and dialysis withdrawal.
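The cause-specific mortality estimates above come from competing-risk analyses. The abstract reports subdistribution hazard ratios (a Fine-Gray-style model), which is not sketched here; instead, the snippet below shows the related nonparametric Aalen-Johansen estimate of cumulative incidence for one cause of death using lifelines, on synthetic data with an assumed event coding.

```python
# Illustrative sketch: cumulative incidence of one cause of death under competing risks.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(4)
n = 1000
time = rng.exponential(5.0, n).clip(max=10.0)          # years on dialysis after graft loss
# assumed event coding: 0 = censored, 1 = CVD death, 2 = infection death, 3 = other death
event = rng.choice([0, 1, 2, 3], size=n, p=[0.45, 0.20, 0.15, 0.20])

ajf = AalenJohansenFitter()
ajf.fit(time, event, event_of_interest=2)              # infection-related mortality
print(ajf.cumulative_density_.tail())                  # cumulative incidence over time
```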

10.
Am J Kidney Dis ; 83(4): 445-455, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38061534

ABSTRACT

RATIONALE & OBJECTIVE: Hemodialysis catheter dysfunction is an important problem for patients with kidney failure. The optimal design of the tunneled catheter tip is unknown. This study evaluated the association of catheter tip design with the duration of catheter function. STUDY DESIGN: Observational cohort study using data from the nationwide REDUCCTION trial. SETTING & PARTICIPANTS: 4,722 adults who each received hemodialysis via 1 or more tunneled central venous catheters in 37 Australian nephrology services from December 2016 to March 2020. EXPOSURE: Design of tunneled hemodialysis catheter tip, classified as symmetrical, step, or split. OUTCOME: Time to catheter dysfunction requiring removal due to inadequate dialysis blood flow assessed by the treating clinician. ANALYTICAL APPROACH: Mixed, 3-level accelerated failure time model, assuming a log-normal survival distribution. Secular trends, the intervention, and baseline differences in service, patient, and catheter factors were included in the adjusted model. In a sensitivity analysis, survival times and proportional hazards were compared among participants' first tunneled catheters. RESULTS: Among the study group, 355 of 3,871 (9.2%), 262 of 1,888 (13.9%), and 38 of 455 (8.4%) tunneled catheters with symmetrical, step, and split tip designs, respectively, required removal due to dysfunction. Step tip catheters required removal for dysfunction at a rate 53% faster than symmetrical tip catheters (adjusted time ratio, 0.47 [95% CI, 0.33-0.67]) and 76% faster than split tip catheters (adjusted time ratio, 0.24 [95% CI, 0.11-0.51]) in the adjusted accelerated failure time models. Only symmetrical tip catheters had performance superior to step tip catheters in unadjusted and sensitivity analyses. Split tip catheters were infrequently used and had risks of dysfunction similar to symmetrical tip catheters. The cumulative incidence of other complications requiring catheter removal, routine removal, and death before removal was similar across the 3 tip designs. LIMITATIONS: Tip design was not randomized. CONCLUSIONS: Symmetrical and split tip catheters had a lower risk of catheter dysfunction requiring removal than step tip catheters. FUNDING: Grants from government (Queensland Health, Safer Care Victoria, Medical Research Future Fund, National Health and Medical Research Council, Australia), academic (Monash University), and not-for-profit (ANZDATA Registry, Kidney Health Australia) sources. TRIAL REGISTRATION: Registered at ANZCTR with study number ACTRN12616000830493. PLAIN-LANGUAGE SUMMARY: Central venous catheters are widely used to facilitate vascular access for life-sustaining hemodialysis treatments but often fail due to blood clots or other mechanical problems that impede blood flow. A range of adaptations to the design of tunneled hemodialysis catheters have been developed, but it is unclear which designs have the greatest longevity. We analyzed data from an Australian nationwide cohort of patients who received hemodialysis via a tunneled catheter and found that catheters with a step tip design failed more quickly than those with a symmetrical tip. Split tip catheters performed well but were infrequently used and require further study. Use of symmetrical rather than step tip hemodialysis catheters may reduce mechanical failures and unnecessary procedures for patients.
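The adjusted time ratios above come from an accelerated failure time model with a log-normal survival distribution. The sketch below fits a single-level log-normal AFT model with lifelines on synthetic catheter data; it omits the study's three-level mixed structure and adjustment covariates, and all column names and effect sizes are assumptions.

```python
# Illustrative sketch: log-normal accelerated failure time model (single level only).
import numpy as np
import pandas as pd
from lifelines import LogNormalAFTFitter

rng = np.random.default_rng(5)
n = 600
tip = rng.choice(["symmetrical", "step", "split"], size=n, p=[0.6, 0.3, 0.1])
scale = np.where(tip == "step", 150.0, 300.0)          # toy: step tips fail sooner
duration = rng.lognormal(mean=np.log(scale), sigma=0.8)
event = (rng.random(n) < 0.12).astype(int)             # removal for dysfunction

df = pd.DataFrame({
    "duration": duration,
    "event": event,
    "step": (tip == "step").astype(int),
    "split": (tip == "split").astype(int),
})

aft = LogNormalAFTFitter()
aft.fit(df, duration_col="duration", event_col="event")
print(aft.summary)  # exp(coef) is the time ratio relative to the symmetrical reference
```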


Subject(s)
Central Venous Catheterization, Central Venous Catheters, Adult, Humans, Central Venous Catheterization/adverse effects, Cohort Studies, Indwelling Catheters/adverse effects, Australia, Renal Dialysis, Central Venous Catheters/adverse effects
11.
J Hypertens ; 42(2): 244-251, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38009310

ABSTRACT

INTRODUCTION: In healthy older adults, the relationship between long-term, visit-to-visit variability in blood pressure (BP) and frailty is uncertain. METHODS: Secondary analysis of blood pressure variability (BPV) and incident frailty in >13 000 participants aged ≥70 years (≥65 years for US minority participants) enrolled in the ASPirin in Reducing Events in the Elderly (ASPREE) trial and its observational follow-up (ASPREE-XT). Participants were without dementia, physical disability, or cardiovascular disease at baseline. BPV was estimated using the standard deviation of mean BP from three annual visits (baseline through the second annual follow-up). Frailty was defined using the Fried phenotype and a frailty deficit accumulation index (FDAI). Participants with frailty during the BPV estimation period were excluded from the main analysis. Adjusted Cox proportional hazards regression evaluated the association between BPV and incident frailty, and linear mixed models evaluated change in frailty scores, through a maximum of 9 years of follow-up. RESULTS: Participants in the highest systolic BPV tertile were at higher risk of frailty compared with those in the lowest (referent) tertile of systolic BPV [Fried hazard ratio (HR) 1.17, 95% confidence interval (CI) 1.04-1.31; FDAI HR 1.18, 95% CI 1.07-1.30]. Findings were consistent when adjusted for multiple covariates and when stratified by antihypertensive use. Linear mixed models showed that higher systolic BPV was associated with increasing frailty score over time. Diastolic BPV was not consistently associated with frailty. CONCLUSIONS: High systolic BPV, independent of mean BP, is associated with increased risk of frailty in healthy older adults. Variability of BP across visits, even in healthy older adults, can convey important risk information beyond mean BP. TRIAL REGISTRATION: ClinicalTrials.gov NCT01038583 and ISRCTN83772183.
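Operationally, visit-to-visit BPV here is just the standard deviation of each participant's three annual BP values, then grouped into tertiles. A minimal pandas sketch of that derivation is below, using simulated systolic readings and assumed column names.

```python
# Illustrative sketch: visit-to-visit BP variability as the SD of three annual visits.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n = 1000
visits = pd.DataFrame({
    "sbp_baseline": rng.normal(139, 12, n),
    "sbp_year1": rng.normal(139, 12, n),
    "sbp_year2": rng.normal(139, 12, n),
})

bpv = visits.std(axis=1, ddof=1)                       # per-participant SD across visits
tertile = pd.qcut(bpv, 3, labels=["low", "mid", "high"])
print(tertile.value_counts())                          # groups used for hazard comparisons
```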


Subject(s)
Blood Pressure, Frailty, Aged, Humans, Antihypertensive Agents/therapeutic use, Blood Pressure/physiology, Frailty/epidemiology, Hypertension/drug therapy, Follow-Up Studies
12.
Kidney Int Rep ; 8(10): 1941-1950, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37849996

ABSTRACT

Introduction: Effective strategies to prevent hemodialysis (HD) catheter dysfunction are lacking, and there is wide variation in practice. Methods: In this post hoc analysis of the REDUcing the burden of dialysis Catheter ComplicaTIOns: a national (REDUCCTION) stepped-wedge cluster randomized trial, encompassing 37 Australian nephrology services, 6361 participants, and 9872 catheters, we investigated whether the trial intervention, which promoted a suite of evidence-based practices for HD catheter insertion and management, reduced the incidence of catheter dysfunction, defined as catheter removal due to inadequate dialysis blood flow. We also analyzed outcomes among tunneled cuffed catheters and sources of event variability. Results: A total of 873 HD catheters were removed because of dysfunction over 1.12 million catheter-days. The raw incidence was 0.91 events per 1000 catheter-days during the baseline phase and 0.68 events per 1000 catheter-days during the intervention phase. The service-wide incidence of catheter dysfunction was 33% lower during the intervention after adjustment for calendar time (incidence rate ratio = 0.67; 95% confidence interval [CI], 0.50-0.89; P = 0.006). Results were consistent among tunneled cuffed catheters (adjusted incidence rate ratio = 0.68; 95% CI, 0.49-0.94), which accounted for 75% of catheters (n = 7403), 97.4% of catheter exposure time, and 88.2% of events (n = 770). Among tunneled catheters that survived for 6 months (21.5% of tunneled catheters), between 2% and 5% of the unexplained variation in the number of catheter dysfunction events was attributable to service-level differences, and 18% to 36% was attributable to patient-level differences. Conclusion: Multifaceted interventions that promote evidence-based catheter care may prevent dysfunction, and patient factors are an important source of variation in events.
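The incidence rate ratio above reflects a count model of dysfunction events against catheter-days of exposure. The sketch below shows the generic form of such a model, a Poisson GLM with a log exposure offset in statsmodels on synthetic service-level counts; it ignores the stepped-wedge calendar-time adjustment and clustering used in the trial, and all numbers are made up.

```python
# Illustrative sketch: Poisson rate model with log(catheter-days) offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
services = 37
df = pd.DataFrame({
    "service": np.repeat(np.arange(services), 2),
    "phase": np.tile([0, 1], services),                # 0 = baseline, 1 = intervention
    "catheter_days": rng.integers(5_000, 40_000, services * 2),
})
rate = np.where(df["phase"] == 1, 0.68 / 1000, 0.91 / 1000)   # toy rates per catheter-day
df["events"] = rng.poisson(rate * df["catheter_days"])

fit = smf.glm("events ~ phase", data=df, family=sm.families.Poisson(),
              offset=np.log(df["catheter_days"])).fit()
print(np.exp(fit.params["phase"]))                     # incidence rate ratio
```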

13.
Health Expect ; 26(6): 2584-2593, 2023 12.
Article in English | MEDLINE | ID: mdl-37635378

ABSTRACT

BACKGROUND: Little is known about the relationship between patients' cultural and linguistic backgrounds and patient activation, especially in people with diabetes and chronic kidney disease (CKD). We examined the association between culturally and linguistically diverse (CALD) background and patient activation, and evaluated the impact of a codesigned integrated kidney and diabetes model of care on patient activation by CALD status in people with diabetes and CKD. METHODS: This longitudinal study recruited adults with diabetes and CKD (Stage 3a or worse) who attended a new diabetes and kidney disease service at a tertiary hospital. All completed the patient activation measure at baseline and after 12 months and had demographic and clinical data collected. Patients from CALD backgrounds included individuals who spoke a language other than English at home, while those from non-CALD backgrounds spoke English only as their primary language. Paired t-tests compared baseline and 12-month patient activation scores by CALD status. RESULTS: Patients from CALD backgrounds had lower activation scores (52.1 ± 17.6) than those from non-CALD backgrounds (58.5 ± 14.6) at baseline. Within-group comparisons showed that patient activation scores for patients from CALD backgrounds significantly improved by approximately 7 points from baseline to 12-month follow-up (from 52.1 ± 17.6 to 59.4 ± 14.7), whereas no significant change was observed for those from non-CALD backgrounds (from 58.5 ± 14.6 to 58.8 ± 13.6). CONCLUSIONS: Among patients with diabetes and CKD, those from CALD backgrounds reported lower activation scores. Interventions that support people from CALD backgrounds with comorbid diabetes and CKD, such as the integrated kidney and diabetes model of care, may address racial and ethnic disparities that exist in patient activation and thus improve clinical outcomes. PATIENT OR PUBLIC CONTRIBUTION: Patients, caregivers and national consumer advocacy organisations (Diabetes Australia and Kidney Health Australia) codesigned a new model of care in partnership with healthcare professionals and researchers. The development of the model of care was informed by focus groups of patients and healthcare professionals and semi-structured interviews of caregivers and healthcare professionals. Patients and caregivers also provided a rigorous evaluation of the new model of care, highlighting its strengths and weaknesses.
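The within-group change in activation scores is a paired comparison of baseline and 12-month measurements. Below is a minimal scipy sketch of that paired t-test on simulated scores; the sample size and score distributions are assumptions, not the study data.

```python
# Illustrative sketch: paired t-test of baseline vs 12-month activation scores.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(8)
baseline = rng.normal(52.1, 17.6, 60)
followup = baseline + rng.normal(7.0, 10.0, 60)        # toy ~7-point mean improvement

t, p = ttest_rel(followup, baseline)
print(f"t = {t:.2f}, p = {p:.3g}")
```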


Subject(s)
Diabetes Mellitus, Chronic Renal Insufficiency, Adult, Humans, Patient Participation, Longitudinal Studies, Cultural Diversity, Diabetes Mellitus/therapy, Chronic Renal Insufficiency/therapy, Kidney
14.
Am J Kidney Dis ; 82(5): 608-616, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37487818

ABSTRACT

RATIONALE & OBJECTIVE: Trends in end-stage kidney disease (ESKD) among people with diabetes may inform clinical management and public health strategies. We estimated trends in the incidence of ESKD among people with type 1 and type 2 diabetes in Australia from 2010 to 2019 and evaluated their associated factors. STUDY DESIGN: Cohort study. SETTING & PARTICIPANTS: 71,700 people with type 1 and 1,112,690 people with type 2 diabetes registered on the Australian National Diabetes Services Scheme (NDSS). We estimated the incidence of kidney replacement therapy (KRT) via linkage to the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA), and the incidence of KRT or death from ESKD by linking the NDSS to the ANZDATA and the National Death Index for Australia. PREDICTORS: Calendar time, sex, age, and duration of diabetes. OUTCOME: Incidence of KRT and of KRT or death from ESKD. ANALYTICAL APPROACH: Incidence of ESKD, trends over time, and associations with factors related to these trends were modeled using Poisson regression stratified by diabetes type and sex. RESULTS: The median duration of diabetes increased from 15.3 to 16.8 years in type 1 diabetes and from 7.6 to 10.2 years in type 2 diabetes between 2010 and 2019. The incidence of KRT and of KRT or death from ESKD did not significantly change over this time interval among people with type 1 diabetes. Conversely, the age-adjusted incidence of KRT and of KRT or death from ESKD increased among males with type 2 diabetes (annual percent changes [APCs]: 2.52% [95% CI, 1.54 to 3.52] and 1.27% [95% CI, 0.53 to 2.03], respectively), with no significant change among females (0.67% [95% CI, -0.68 to 2.04] and 0.07% [95% CI, -0.81 to 0.96], respectively). After further adjustment for duration of diabetes, the incidence of ESKD fell between 2010 and 2019, with APCs of -0.09% (95% CI, -1.06 to 0.89) and -2.63% (95% CI, -3.96 to -1.27) for KRT, and -0.97% (95% CI, -1.71 to -0.23) and -2.75% (95% CI, -3.62 to -1.87) for KRT or death from ESKD, among males and females, respectively. LIMITATIONS: The NDSS captures only 80%-90% of people with diabetes; lack of clinical covariates limits understanding of trends. CONCLUSIONS: While the age-adjusted incidence of ESKD increased for males and was stable for females over the last decade, after adjusting for increases in duration of diabetes the risk of developing ESKD decreased for both males and females. PLAIN-LANGUAGE SUMMARY: Previous studies showed an increase in new cases of kidney failure among people with type 2 diabetes, but more recent data have not been available. Here, we report trends in the rate of kidney failure for people with type 2 diabetes from 2010 to 2019 and show that, while more people with type 2 diabetes are developing kidney failure, after accounting for the fact that they are also living longer with diabetes (and therefore have more time in which to develop kidney failure), the growth is not explained by a higher underlying risk of kidney failure. Nevertheless, more people are developing kidney failure than before, which will affect health care systems for years to come.
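The annual percent changes (APCs) reported above follow from a Poisson model of yearly event counts with a person-time offset: if b is the coefficient on calendar year, APC = (exp(b) - 1) x 100. The sketch below demonstrates that calculation in statsmodels on simulated registry-style counts; the rates, person-years, and variable names are illustrative assumptions.

```python
# Illustrative sketch: annual percent change (APC) from a Poisson trend model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
years = np.arange(2010, 2020)
person_years = rng.integers(400_000, 500_000, years.size)
rate = 0.0008 * 1.025 ** (years - 2010)                # toy rate rising ~2.5% per year

df = pd.DataFrame({
    "year": years - 2010,
    "events": rng.poisson(rate * person_years),
    "py": person_years,
})
fit = smf.glm("events ~ year", data=df, family=sm.families.Poisson(),
              offset=np.log(df["py"])).fit()
apc = (np.exp(fit.params["year"]) - 1) * 100
print(f"APC = {apc:.2f}% per year")
```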

15.
Respirology ; 28(9): 860-868, 2023 09.
Article in English | MEDLINE | ID: mdl-37400102

ABSTRACT

BACKGROUND AND OBJECTIVE: Raised blood lactate secondary to high-dose β2-agonist treatment has been reported in asthma exacerbations but has not been investigated during acute exacerbations of COPD (AECOPD). We explored associations of blood lactate measurements with disease outcomes and β2-agonist treatments during AECOPD. METHODS: Retrospective (n = 199) and prospective (n = 142) studies of patients hospitalized with AECOPD were conducted. The retrospective cohort was identified via medical records and the prospective cohort was recruited during hospitalization for AECOPD. Baseline demographics, comorbidities, β2-agonist treatment, biochemical measurements and clinical outcomes were compared between patients with normal (≤2.0 mmol/L) versus elevated lactate (>2.0 mmol/L). Regression analyses examined associations of lactate measurements with β2-agonist dosages. RESULTS: Demographic data and comorbidities were similar between the high and normal lactate groups in both cohorts. The populations were elderly (mean >70 years) and predominantly male (>60%), with reduced FEV1 (48.2 ± 19%, prospective cohort). Lactate was elevated in approximately 50% of patients during AECOPD and was not related to evidence of sepsis. In the prospective cohort, patients with high lactate had more tachypnoea, tachycardia, acidosis and hyperglycaemia (p < 0.05) and received more non-invasive ventilation (37% vs. 9.7%, p < 0.001). There was a trend to longer hospitalization (6 vs. 5 days, p = 0.06, prospective cohort). Higher cumulative β2-agonist dosages were linked to elevated lactate levels (OR 1.04, p = 0.01). CONCLUSION: Elevated lactate during AECOPD was common, unrelated to sepsis and correlated with high cumulative doses of β2-agonists. Raised lactate may indicate excessive β2-agonist treatment and should now be investigated as a possible biomarker.
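The reported link between cumulative β2-agonist dose and elevated lactate (OR 1.04 per unit of dose) corresponds to a logistic regression. A minimal statsmodels sketch on simulated data is below; the dose units, distributions, and coefficient are assumptions for illustration only.

```python
# Illustrative sketch: logistic regression of elevated lactate on cumulative dose.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n = 142
dose = rng.gamma(shape=2.0, scale=10.0, size=n)        # hypothetical cumulative dose units
p_high = 1 / (1 + np.exp(-(-1.0 + 0.04 * dose)))       # toy OR ~1.04 per unit of dose
high_lactate = (rng.random(n) < p_high).astype(int)

df = pd.DataFrame({"dose": dose, "high_lactate": high_lactate})
fit = smf.logit("high_lactate ~ dose", data=df).fit(disp=0)
print(np.exp(fit.params["dose"]))                      # odds ratio per unit of dose
```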


Subject(s)
Adrenergic beta-2 Receptor Agonists, Chronic Obstructive Pulmonary Disease, Humans, Male, Aged, Female, Adrenergic beta-2 Receptor Agonists/adverse effects, Prospective Studies, Retrospective Studies, Chronic Obstructive Pulmonary Disease/drug therapy, Lactates/therapeutic use
16.
Transplantation ; 107(11): 2424-2432, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37322595

ABSTRACT

BACKGROUND: Antibody-mediated rejection (AMR) is a major cause of kidney allograft failure and demonstrates different properties depending on whether it occurs early (<6 mo) or late (>6 mo) posttransplantation. We aimed to compare graft survival and treatment approaches for early and late AMR in Australia and New Zealand. METHODS: Transplant characteristics were obtained for patients with an AMR episode reported to the Australia and New Zealand Dialysis and Transplant Registry from January 2003 to December 2019. The primary outcome of time to graft loss from AMR diagnosis, with death considered a competing risk, was compared between early and late AMR using flexible parametric survival models. Secondary outcomes included treatments used, response to treatment, and time from AMR diagnosis to death. RESULTS: After adjustment for other explanatory factors, late AMR was associated with twice the risk of graft loss relative to early AMR. The risk was nonproportional over time, with early AMR carrying an increased risk early on. Late AMR was also associated with an increased risk of death. Early AMR was treated more aggressively than late AMR, with more frequent use of plasma exchange and monoclonal/polyclonal antibodies. There was substantial variation in the treatments used by transplant centers. Early AMR was reported to be more responsive to treatment than late AMR. CONCLUSIONS: Late AMR is associated with an increased risk of graft loss and death compared with early AMR. The marked heterogeneity in the treatment of AMR highlights the need for effective, new therapeutic options for these conditions.

17.
Am J Kidney Dis ; 82(4): 429-442.e1, 2023 10.
Article in English | MEDLINE | ID: mdl-37178814

ABSTRACT

RATIONALE & OBJECTIVE: Central venous catheters (CVCs) are widely used for hemodialysis but are prone to burdensome and costly bloodstream infections. We determined whether multifaceted quality improvement interventions in hemodialysis units can prevent hemodialysis catheter-related bloodstream infections (HDCRBSI). STUDY DESIGN: Systematic review. SETTING & STUDY POPULATIONS: PubMed, EMBASE, and CENTRAL were searched from inception to April 23, 2022, to identify randomized trials, time-series analyses, and before-after studies that examined the effect of multifaceted quality improvement interventions on the incidence of HDCRBSI or access-related bloodstream infections (ARBSI) among people receiving hemodialysis outside of the intensive care unit (ICU). DATA EXTRACTION: Two people independently extracted data and assessed the risk of bias and quality of evidence using validated tools. ANALYTICAL APPROACH: Intervention effects, validity, and characteristics of studies with the same design were compared. Differences between study designs were described. RESULTS: We included 21 studies from the 8,824 records identified by our search. Among 15 studies that measured HDCRBSI, 2 methodologically heterogeneous cluster randomized trials reported discordant intervention effects, 2 interrupted time-series analyses reported favorable interventions with discordant patterns of effect, and 11 before-after studies reported favorable interventions with a very high risk of bias. Among 6 studies that only measured ARBSI, 1 time-series analysis and 1 before-after study did not find a favorable intervention effect, and 4 before-after studies reported a favorable effect with a very high risk of bias. The overall quality of evidence was low for HDCRBSI and very low for ARBSI. LIMITATIONS: Nine definitions of HDCRBSI were used. Ten studies included hospital-based and satellite facilities but did not report separate intervention effects for each type of facility. CONCLUSIONS: Multifaceted quality improvement interventions may prevent HDCRBSI outside the ICU. However, evidence supporting them is of low quality, and further carefully conducted studies are warranted. REGISTRATION: Registered at PROSPERO with registration number CRD42021252290. PLAIN-LANGUAGE SUMMARY: People with kidney failure rely on central venous catheters to facilitate life-sustaining hemodialysis treatments. Unfortunately, hemodialysis catheters are a common source of problematic bloodstream infections. Quality improvement programs have effectively prevented catheter-related infections in intensive care units, but it is unclear whether they can be adapted to patients using hemodialysis catheters in the community. In a systematic review that included 21 studies, we found that most quality improvement programs were reported to be successful. However, the findings were mixed among higher-quality studies, and overall the quality of evidence was low. Ongoing quality improvement programs should be complemented by more high-quality research.


Subject(s)
Catheter-Related Infections, Central Venous Catheters, Sepsis, Humans, Quality Improvement, Central Venous Catheters/adverse effects, Intensive Care Units, Catheter-Related Infections/epidemiology, Catheter-Related Infections/prevention & control
18.
Kidney Int Rep ; 8(4): 737-745, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37069989

ABSTRACT

Introduction: Data on the association between chronic kidney disease (CKD) and major hemorrhage in older adults are lacking. Methods: We used data from a double-blind randomized controlled trial of aspirin in persons aged ≥70 years with prospective capture of bleeding events, including hemorrhagic stroke and clinically significant bleeding. CKD was defined as an estimated glomerular filtration rate (eGFR) <60 ml/min per 1.73 m2 and/or a urinary albumin-to-creatinine ratio (UACR) ≥3 mg/mmol (26.6 mg/g). We compared bleeding rates in those with and without CKD, undertook multivariable analyses, and explored effect modification with aspirin. Results: Of 19,114 participants, 17,976 (94.0%) had CKD status recorded, of whom 4952 (27.5%) had CKD. Participants with CKD had an increased rate of major bleeding events compared with those without CKD (10.4/1000 vs. 6.3/1000 person-years [py], respectively) and increased bleeding risk (risk ratio [RR] 1.60; 95% confidence interval [CI]: 1.40, 1.90 for eGFR <60 ml/min per 1.73 m2; RR 2.10; 95% CI: 1.70, 2.50 for albuminuria). In adjusted analyses, CKD was associated with a 35% increased risk of bleeding (hazard ratio [HR] 1.37; 95% CI: 1.15, 1.62; P < 0.001). Other risk factors were older age, hypertension, smoking, and aspirin use. There was no differential effect of aspirin on bleeding by CKD status (test of interaction, P = 0.65). Conclusion: CKD is independently associated with an increased risk of major hemorrhage in older adults. Increased awareness of modifiable risk factors, such as discontinuation of unnecessary aspirin, blood pressure control, and smoking cessation, is warranted in this group.
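The crude bleeding rates and risk ratio quoted above are simple person-time calculations. The arithmetic is shown below with hypothetical event counts and person-years chosen only to produce figures of the same order; the actual counts are not given in the abstract.

```python
# Illustrative arithmetic: crude event rates per 1000 person-years and their ratio.
events_ckd, py_ckd = 230, 22_000            # hypothetical counts / person-years
events_nonckd, py_nonckd = 410, 65_000

rate_ckd = events_ckd / py_ckd * 1000
rate_nonckd = events_nonckd / py_nonckd * 1000
print(f"{rate_ckd:.1f} vs {rate_nonckd:.1f} per 1000 py; ratio = {rate_ckd / rate_nonckd:.2f}")
```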

19.
Kidney Int ; 103(6): 1156-1166, 2023 06.
Article in English | MEDLINE | ID: mdl-37001602

ABSTRACT

Risk of chronic kidney disease (CKD) is influenced by environmental and genetic factors and increases sharply in individuals 70 years and older. Polygenic scores (PGS) for kidney disease-related traits have shown promise but require validation in well-characterized cohorts. Here, we assessed the performance of recently developed PGSs for CKD-related traits, at baseline and longitudinally, in a cohort of healthy older individuals enrolled in the Australian ASPREE randomized controlled trial of daily low-dose aspirin. Among 11,813 genotyped participants aged 70 years or more with baseline eGFR measures, we tested associations between PGSs and measured eGFR at baseline, the clinical phenotype of CKD, and the longitudinal rate of eGFR decline spanning up to six years of follow-up per participant. A PGS for eGFR was associated with baseline eGFR, with a change of -3.9 mL/min/1.73 m2 (95% confidence interval -4.17 to -3.68) per standard deviation (SD) increase in the PGS. This PGS, as well as a PGS for CKD stage 3, was associated with higher risk of baseline CKD stage 3 in cross-sectional analysis (odds ratio 1.75 per SD, 95% confidence interval 1.66-1.85, and odds ratio 1.51 per SD, 95% confidence interval 1.43-1.59, respectively). Longitudinally, two separate PGSs for eGFR slope were associated with significant kidney function decline during follow-up. Thus, our study demonstrates that kidney function has a considerable genetic component in older adults, and that new PGSs for kidney disease-related phenotypes may have utility for CKD risk prediction in advanced age.
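Reporting a PGS effect "per standard deviation" means the score is standardized before regression, so the coefficient is the expected difference in eGFR per 1-SD increase in the score. The sketch below illustrates this with an ordinary least squares model in statsmodels on simulated data; the effect size mirrors the abstract, but the data, covariates, and variable names are assumptions.

```python
# Illustrative sketch: association between a standardized polygenic score and eGFR.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 5000
pgs = rng.normal(0, 1, n)
egfr = 75 - 3.9 * pgs + rng.normal(0, 12, n)           # toy: -3.9 mL/min/1.73m2 per SD

df = pd.DataFrame({"pgs_sd": (pgs - pgs.mean()) / pgs.std(), "egfr": egfr})
fit = smf.ols("egfr ~ pgs_sd", data=df).fit()
print(fit.params["pgs_sd"], fit.conf_int().loc["pgs_sd"].values)
```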


Subject(s)
Chronic Renal Insufficiency, Humans, Longitudinal Studies, Cross-Sectional Studies, Glomerular Filtration Rate, Disease Progression, Australia, Chronic Renal Insufficiency/diagnosis, Chronic Renal Insufficiency/genetics, Chronic Renal Insufficiency/complications, Phenotype
20.
Kidney Med ; 5(2): 100583, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36794000

ABSTRACT

Rationale & Objective: Variability in estimated glomerular filtration rate (eGFR) over time is often observed, but it is unknown whether this variation is clinically important. We investigated the association between eGFR variability and survival free of dementia or persistent physical disability (disability-free survival) and cardiovascular disease (CVD) events (myocardial infarction, stroke, hospitalization for heart failure, or CVD death). Study Design: Post hoc analysis. Setting & Participants: 12,549 participants of the ASPirin in Reducing Events in the Elderly trial. Participants were without documented dementia, major physical disability, previous CVD, and major life-limiting illness at enrollment. Predictors: eGFR variability. Outcomes: Disability-free survival and CVD events. Analytical Approach: eGFR variability was estimated using the standard deviation of eGFR measurements obtained from participants' baseline, first, and second annual visits. Associations between tertiles of eGFR variability with disability-free survival and CVD events occurring after the eGFR variability estimation period were examined. Results: During median follow-up of 2.7 years after the second annual visit, 838 participants died, developed dementia, or acquired a persistent physical disability; 379 had a CVD event. The highest tertile of eGFR variability had an increased risk of death/dementia/disability (HR, 1.35; 95% CI, 1.14-1.59) and CVD events (HR, 1.37; 95% CI, 1.06-1.77) compared with the lowest tertile after covariate adjustment. These associations were present in patients with and without chronic kidney disease at baseline. Limitations: Limited representation of diverse demographics. Conclusions: In older, generally healthy adults, higher variability in eGFR over time predicts increased risk of future death/dementia/disability and CVD events.
