ABSTRACT
RATIONALE & OBJECTIVE: Hemodialysis catheter dysfunction is an important problem for patients with kidney failure. The optimal design of the tunneled catheter tip is unknown. This study evaluated the association of catheter tip design with the duration of catheter function. STUDY DESIGN: Observational cohort study using data from the nationwide REDUCCTION trial. SETTING & PARTICIPANTS: 4,722 adults who each received hemodialysis via 1 or more tunneled central venous catheters in 37 Australian nephrology services from December 2016 to March 2020. EXPOSURE: Design of tunneled hemodialysis catheter tip, classified as symmetrical, step, or split. OUTCOME: Time to catheter dysfunction requiring removal due to inadequate dialysis blood flow assessed by the treating clinician. ANALYTICAL APPROACH: Mixed, 3-level accelerated failure time model, assuming a log-normal survival distribution. Secular trends, the intervention, and baseline differences in service, patient, and catheter factors were included in the adjusted model. In a sensitivity analysis, survival times and proportional hazards were compared among participants' first tunneled catheters. RESULTS: Among the study group, 355 of 3,871 (9.2%), 262 of 1,888 (13.9%), and 38 of 455 (8.4%) tunneled catheters with symmetrical, step, and split tip designs, respectively, required removal due to dysfunction. Step tip catheters required removal for dysfunction at a rate 53% faster than symmetrical tip catheters (adjusted time ratio, 0.47 [95% CI, 0.33-0.67]) and 76% faster than split tip catheters (adjusted time ratio, 0.24 [95% CI, 0.11-0.51]) in the adjusted accelerated failure time models. Only symmetrical tip catheters had performance superior to step tip catheters in unadjusted and sensitivity analyses. Split tip catheters were infrequently used and had risks of dysfunction similar to symmetrical tip catheters.
The cumulative incidences of other complications requiring catheter removal, routine removal, and death before removal were similar across the 3 tip designs. LIMITATIONS: Tip design was not randomized. CONCLUSIONS: Symmetrical and split tip catheters had a lower risk of catheter dysfunction requiring removal than step tip catheters. FUNDING: Grants from government (Queensland Health, Safer Care Victoria, Medical Research Future Fund, National Health and Medical Research Council, Australia), academic (Monash University), and not-for-profit (ANZDATA Registry, Kidney Health Australia) sources. TRIAL REGISTRATION: Registered at ANZCTR with study number ACTRN12616000830493. PLAIN-LANGUAGE SUMMARY: Central venous catheters are widely used to facilitate vascular access for life-sustaining hemodialysis treatments but often fail due to blood clots or other mechanical problems that impede blood flow. A range of adaptations to the design of tunneled hemodialysis catheters have been developed, but it is unclear which designs have the greatest longevity. We analyzed data from an Australian nationwide cohort of patients who received hemodialysis via a tunneled catheter and found that catheters with a step tip design failed more quickly than those with a symmetrical tip. Split tip catheters performed well but were infrequently used and require further study. Use of symmetrical rather than step tip hemodialysis catheters may reduce mechanical failures and unnecessary procedures for patients.
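The "percent faster" figures quoted above follow from the adjusted time ratios by simple arithmetic: in an accelerated failure time model, a time ratio below 1 shortens expected time-to-event by (1 − TR) × 100 percent. A minimal sketch of that conversion (the helper name is illustrative, not from the study):

```python
def tr_to_pct_faster(time_ratio: float) -> int:
    """Convert an accelerated failure time model's time ratio (TR < 1)
    into the percentage by which time-to-event is shortened."""
    return round((1 - time_ratio) * 100)

# Step vs symmetrical tip: TR 0.47 -> removal for dysfunction 53% faster
# Step vs split tip:       TR 0.24 -> removal for dysfunction 76% faster
print(tr_to_pct_faster(0.47), tr_to_pct_faster(0.24))
```

The same conversion applies to both comparisons reported in the abstract; it does not carry over to hazard ratios, which scale instantaneous risk rather than survival time.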
Subject(s)
Catheterization, Central Venous , Central Venous Catheters , Adult , Humans , Catheterization, Central Venous/adverse effects , Cohort Studies , Catheters, Indwelling/adverse effects , Australia , Renal Dialysis , Central Venous Catheters/adverse effects
ABSTRACT
BACKGROUND: Understanding the patient perspective of frailty is critical to offering holistic patient-centred care. Rehabilitation strategies for patients with advanced chronic kidney disease (CKD) and frailty are limited in their ability to overcome patient-perceived barriers to participation, resulting in high rates of drop-out and non-adherence. The aim of this study was to explore patient perspectives and preferences regarding experiences with rehabilitation to inform a CKD/frailty rehabilitation model. METHODS: This qualitative study involved two focus groups, six individual semi-structured interviews and three caregiver semi-structured interviews with participants with lived experience of advanced kidney disease and frailty. Interviews were recorded, transcribed, and coded for meaningful concepts, then analysed using inductive thematic analysis with a constant comparative method, informed by Social Cognitive Theory. RESULTS: Six major themes emerged, including: accommodating frailty is an act of resilience; exercise is endorsed for rehabilitation, but existing programs have failed to meet end-users' needs; rehabilitation goals were framed around a return to normative behaviours; and rehabilitation should have a social dimension, offering understanding for "people like us". Participants reported on barriers and disruptors to frailty rehabilitation in the CKD context. Participants valued peer-to-peer education, the camaraderie of socialisation and the benefit of feedback for maintaining motivation. Patients undertaking dialysis described the commodity of time and the burden of unresolved symptoms as barriers to participation. Participants reported difficulty envisioning strategies for frailty rehabilitation, maintaining a focus on the immediate and avoiding future uncertainty. CONCLUSIONS: Frailty rehabilitation efforts in CKD should leverage shared experiences, address comorbidity and symptom burden, and focus on goals with normative value.
Subject(s)
Focus Groups , Frailty , Patient Preference , Qualitative Research , Renal Insufficiency, Chronic , Humans , Female , Male , Aged , Renal Insufficiency, Chronic/rehabilitation , Renal Insufficiency, Chronic/psychology , Frailty/rehabilitation , Frailty/psychology , Aged, 80 and over , Middle Aged
ABSTRACT
Frailty is a multidimensional clinical syndrome characterised by low physical activity, reduced strength, accumulation of multi-organ deficits, decreased physiological reserve and vulnerability to stressors. Frailty pathogenesis and 'inflammageing' are augmented by uraemia, leading to a high prevalence of frailty that potentially contributes to adverse outcomes in patients with advanced chronic kidney disease (CKD), including end-stage kidney disease (ESKD). The presence of frailty is a stronger predictor of CKD outcomes than estimated glomerular filtration rate and more closely aligned with dialysis outcomes than age. Frailty assessment should form part of the routine assessment of patients with CKD and inform key medical transitions. Frailty screening and interventions in CKD/ESKD should be a research priority.
Subject(s)
Frailty , Kidney Failure, Chronic , Nephrology , Renal Insufficiency, Chronic , Humans , Frailty/diagnosis , Frailty/epidemiology , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/epidemiology , Renal Insufficiency, Chronic/therapy , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/diagnosis , Renal Dialysis
ABSTRACT
BACKGROUND: Frailty is a clinical syndrome of accelerated aging associated with adverse outcomes. Frailty is prevalent among patients with chronic kidney disease but is infrequently assessed in clinical settings, due to lack of consensus regarding frailty definitions and diagnostic tools. This study aimed to review the practice of frailty assessment in nephrology populations and evaluate the context and timing of frailty assessment. METHODS: The search included published reports of frailty assessment in patients with chronic kidney disease, undergoing dialysis or in receipt of a kidney transplant, published between January 2000 and November 2021. Medline, CINAHL, Embase, PsycINFO, PubMed and Cochrane Library databases were examined. A total of 164 articles were included for review. RESULTS: We found that studies were most frequently set within developed nations. Overall, 161 studies assessed frailty as part of an observational study design, and 3 within an interventional study. Studies favoured assessment of participants with chronic kidney disease (CKD) and transplant candidates. A total of 40 different frailty metrics were used. The most frequently utilised tool was the Fried frailty phenotype. Frailty prevalence varied across populations and research settings from 2.8% among participants with CKD to 82% among patients undergoing haemodialysis. Studies of frailty in conservatively managed populations were infrequent (N = 4). We verified that frailty predicts higher rates of adverse patient outcomes. There is sufficient literature to justify future meta-analyses. CONCLUSIONS: There is increasing recognition of frailty in nephrology populations and the value of assessment in informing prognostication and decision-making during transitions in care. The Fried frailty phenotype is the most frequently utilised assessment, reflecting the feasibility of incorporating objective measures of frailty and vulnerability into nephrology clinical assessment.
Further research examining frailty in low- and middle-income countries as well as First Nations peoples is required. Future work should focus on interventional strategies exploring frailty rehabilitation.
Subject(s)
Frailty , Nephrology , Humans , Frailty/diagnosis , Frailty/epidemiology , Aging , Consensus , Databases, Factual , Observational Studies as Topic
ABSTRACT
BACKGROUND: Effective interpersonal communication is critical for shared decision-making (SDM). Previous SDM communication training in nephrology has lacked context-specific evidence from ethnographic analysis of SDM interactions with older patients considering treatment options for end-stage kidney disease (ESKD). This study explores communication strategies in SDM discussions in nephrology, specifically focusing on older patients considering dialysis as kidney replacement therapy (KRT). METHODS: We conducted a qualitative study analysing naturally-occurring audio-recorded clinical interactions (n = 12) between Australian kidney doctors, patients aged 60+, and carers. Linguistic ethnography and qualitative socially-oriented functional approaches were used for analysis. RESULTS: Two types of communication strategies emerged: (1) managing and advancing treatment decisions, involving active checking of knowledge, clear explanations of options, and local issue resolution; and (2) pulling back, deferring or delaying decisions through mixed messaging. Specifically for non-English speaking patients, pulling back was further characterised by communication challenges that deferred decision-making, including ineffective issue management and reliance on family as interpreters. Age was not an explicit topic of discussion among participants when it came to making decisions about KRT but was highly relevant to treatment decision-making. Doctors appeared reluctant to broach non-dialysis conservative management, even when it appeared clinically appropriate. Conservative care, an alternative to KRT suitable for older patients with co-morbidities, was only explicitly discussed when prompted by patients or carers. CONCLUSIONS: The findings highlight the impact of different communication strategies on SDM discussions in nephrology.
This study calls for linguistic-informed contextualised communication training and provides foundational evidence for nephrology-specific communication skills training in SDM for KRT among older patients. There is urgent need for doctors to become confident and competent in discussing non-dialysis conservative management. Further international research should explore naturally-occurring SDM interactions in nephrology with other vulnerable groups to enhance evidence and training integration.
Subject(s)
Decision Making, Shared , Renal Insufficiency, Chronic , Humans , Renal Dialysis , Physician-Patient Relations , Australia , Renal Insufficiency, Chronic/therapy , Communication , Patient Participation , Decision Making
ABSTRACT
OBJECTIVE: To estimate the incidence and postoperative mortality rates of surgery, and variations by age, diabetes, kidney replacement therapy (KRT) modality, and time over a 15-year period. BACKGROUND: Patients with kidney failure receiving chronic KRT (dialysis or kidney transplantation) have increased risks of postoperative mortality and morbidity. Contemporary data on the incidence and types of surgery these patients undergo are lacking. METHODS: This binational population cohort study evaluated all incident and prevalent patients receiving chronic KRT using linked data between Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry and jurisdictional hospital admission datasets between 2000 and 2015. Patients were categorized by their KRT modality (hemodialysis, peritoneal dialysis, home hemodialysis, and kidney transplant) for each calendar year. Incidence rates for overall surgery and subtypes were estimated using Poisson models. Logistic regression was used to estimate 30-day/in-hospital mortality risk. RESULTS: Overall, 46,497 patients over a median (interquartile range) follow-up of 6.3 years (3.5-10.2 years) underwent 81,332 surgeries. The median incidence rate of surgery remained stable over this period with a median of 14.9 surgeries per 100 patient-years. Annual incidence rate was higher in older people and those with diabetes mellitus. Patients receiving hemodialysis had a higher incidence rate of surgery compared with kidney transplant recipients (15.8 vs 10.0 surgeries per 100 patient-years, respectively). Overall adjusted postoperative mortality rates decreased by >70% over the study period, and were lowest in kidney transplant recipients (1.7%, 95% confidence interval, 1.4-2.0). Postoperative mortality following emergency surgery was >3-fold higher than elective surgery (8.4% vs 2.3%, respectively). CONCLUSIONS: Patients receiving chronic KRT have high rates of surgery and morbidity. 
Further research into strategies to mitigate perioperative risk remains a priority.
Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Humans , Aged , Cohort Studies , Renal Replacement Therapy , Renal Dialysis , Registries
ABSTRACT
BACKGROUND: Secondary hyperparathyroidism (SHPT) in chronic kidney disease is associated with cardiovascular and bone pathology. Measures to achieve parathyroid hormone (PTH) target values and control biochemical abnormalities associated with SHPT require complex therapies, and severe SHPT often requires parathyroidectomy or the calcimimetic cinacalcet. In Australia, cinacalcet was publicly funded for dialysis patients from 2009 to 2015, when funding was withdrawn following publication of the EVOLVE study, which resulted in most patients on cinacalcet ceasing therapy. We examined the clinical and biochemical outcomes associated with this change at Australian renal centres. AIM: To assess changes to biochemical and clinical outcomes in dialysis patients following cessation of cinacalcet. METHODS: We conducted a retrospective study of dialysis patients who ceased cinacalcet after August 2015 in 11 Australian units. Clinical outcomes and changes in biochemical parameters were assessed over a 24- and 12-month period, respectively, from cessation of cinacalcet. RESULTS: A total of 228 patients were included (17.7% of all dialysis patients from the units). Patients were aged 63 ± 15 years, with 182 patients on haemodialysis and 46 on peritoneal dialysis. Over 24 months following cessation of cinacalcet, we observed 26 parathyroidectomies, 3 episodes of calciphylaxis, 8 fractures and 50 deaths. Eight patients recommenced cinacalcet, meeting criteria under a special access scheme. Biochemical changes from baseline to 12 months after cessation included increased levels of serum PTH from 54 (interquartile range 27-90) pmol/L to 85 (interquartile range 41-139) pmol/L (P < 0.0001), serum calcium from 2.3 ± 0.2 mmol/L to 2.5 ± 0.1 mmol/L (P < 0.0001) and alkaline phosphatase from 123 (92-176) IU/L to 143 (102-197) IU/L (P < 0.0001). CONCLUSION: Significant increases in serum PTH, calcium and alkaline phosphatase occurred over a 12-month period following withdrawal of cinacalcet.
Longer-term follow up will determine if these biochemical and therapeutic changes are associated with altered rates of parathyroidectomies and cardiovascular mortality and morbidity.
Subject(s)
Calcimimetic Agents/administration & dosage , Cinacalcet/administration & dosage , Hyperparathyroidism, Secondary/blood , Kidney Failure, Chronic/therapy , Renal Dialysis/trends , Withholding Treatment/trends , Aged , Alkaline Phosphatase/blood , Australia , Biomarkers/blood , Calcium/blood , Female , Follow-Up Studies , Humans , Hyperparathyroidism, Secondary/diagnosis , Hyperparathyroidism, Secondary/therapy , Kidney Failure, Chronic/complications , Male , Middle Aged , Parathyroid Hormone/blood , Parathyroidectomy , Renal Dialysis/adverse effects , Retrospective Studies
ABSTRACT
BACKGROUND: Calcineurin inhibitors (CNI) can reduce acute transplant rejection and immediate graft loss but are associated with significant adverse effects such as hypertension and nephrotoxicity which may contribute to chronic rejection. CNI toxicity has led to numerous studies investigating CNI withdrawal and tapering strategies. Despite this, uncertainty remains about minimisation or withdrawal of CNI. OBJECTIVES: This review aimed to assess the benefits and harms of CNI tapering or withdrawal in terms of graft function and loss, incidence of acute rejection episodes, treatment-related side effects (hypertension, hyperlipidaemia) and death. SEARCH METHODS: We searched the Cochrane Kidney and Transplant Specialised Register to 11 October 2016 through contact with the Information Specialist using search terms relevant to this review. Studies contained in the Specialised Register are identified through search strategies specifically designed for CENTRAL, MEDLINE, and EMBASE; handsearching conference proceedings; and searching the International Clinical Trials Register (ICTRP) Search Portal and ClinicalTrials.gov. SELECTION CRITERIA: All randomised controlled trials (RCTs) where drug regimens containing CNI were compared to alternative drug regimens (CNI withdrawal, tapering or low dose) in the post-transplant period were included, without age or dosage restriction. DATA COLLECTION AND ANALYSIS: Two authors independently assessed studies for eligibility, risk of bias, and extracted data. Results were expressed as risk ratio (RR) or mean difference (MD) with 95% confidence intervals (CI). MAIN RESULTS: We included 83 studies that involved 16,156 participants. Most were open-label studies; less than 30% of studies reported randomisation method and allocation concealment. Sixty percent of studies were analysed as intention-to-treat, and all pre-specified outcomes were reported in 54 studies.
The attrition and reporting bias were unclear in the remainder of the studies as factors used to judge bias were reported inconsistently. We also noted that 50% (47 studies) were funded by the pharmaceutical industry. We classified studies into four groups: CNI withdrawal or avoidance with or without substitution with mammalian target of rapamycin inhibitors (mTOR-I); and low dose CNI with or without mTOR-I. The withdrawal groups were further stratified as avoidance and withdrawal subgroups for major outcomes. CNI withdrawal may lead to rejection (RR 2.54, 95% CI 1.56 to 4.12; moderate certainty evidence), may make little or no difference to death (RR 1.09, 95% CI 0.96 to 1.24; moderate certainty), and probably slightly reduces graft loss (RR 0.85, 95% CI 0.74 to 0.98; low quality evidence). Hypertension was probably reduced in the CNI withdrawal group (RR 0.82, 95% CI 0.71 to 0.95; low certainty), while CNI withdrawal may make little or no difference to malignancy (RR 1.10, 95% CI 0.93 to 1.30; low certainty), and probably makes little or no difference to cytomegalovirus (CMV) (RR 0.87, 95% CI 0.52 to 1.45; low certainty). CNI avoidance may result in increased acute rejection (RR 2.16, 95% CI 0.85 to 5.49; low certainty) but little or no difference in graft loss (RR 0.96, 95% CI 0.79 to 1.16; low certainty). Late CNI withdrawal increased acute rejection (RR 3.21, 95% CI 1.59 to 6.48; moderate certainty) but probably reduced graft loss (RR 0.84, 95% CI 0.72 to 0.97, low certainty). Results were similar when CNI avoidance or withdrawal was combined with the introduction of mTOR-I; acute rejection was probably increased (RR 1.43; 95% CI 1.15 to 1.78; moderate certainty) and there was probably little or no difference in death (RR 0.96; 95% CI 0.69 to 1.36, moderate certainty).
mTOR-I substitution may make little or no difference to graft loss (RR 0.94, 95% CI 0.75 to 1.19; low certainty), probably makes little or no difference to hypertension (RR 0.86, 95% CI 0.64 to 1.15; moderate certainty), and probably reduced the risk of cytomegalovirus (CMV) (RR 0.60, 95% CI 0.44 to 0.82; moderate certainty) and malignancy (RR 0.69, 95% CI 0.47 to 1.00; low certainty). Lymphoceles were increased with mTOR-I substitution (RR 1.45, 95% CI 0.95 to 2.21; low certainty). Low dose CNI combined with mTOR-I probably increased glomerular filtration rate (GFR) (MD 6.24 mL/min, 95% CI 3.28 to 9.119; moderate certainty), reduced graft loss (RR 0.75, 95% CI 0.55 to 1.02; moderate certainty), and made little or no difference to acute rejection (RR 1.13; 95% CI 0.91 to 1.40; moderate certainty). Hypertension was decreased (RR 0.98, 95% CI 0.80 to 1.20; low certainty) as was CMV (RR 0.41, 95% CI 0.16 to 1.06; low certainty). Low dose CNI plus mTOR-I probably makes little or no difference to malignancy (RR 1.22, 95% CI 0.42 to 3.53; low certainty) and may make little or no difference to death (RR 1.16, 95% CI 0.71 to 1.90; moderate certainty). AUTHORS' CONCLUSIONS: CNI avoidance increased acute rejection, and CNI withdrawal increased acute rejection but reduced graft loss, at least over the short term. Low dose CNI with induction regimens reduced acute rejection and graft loss with no major adverse events, also in the short term. The use of mTOR-I reduced CMV infections but increased the risk of acute rejection. These conclusions must be tempered by the lack of long-term data in most of the studies, particularly with regards to chronic antibody-mediated rejection, and the suboptimal methodological quality of the included studies.
Subject(s)
Calcineurin Inhibitors/administration & dosage , Calcineurin Inhibitors/adverse effects , Graft Rejection/etiology , Graft Survival , Kidney Transplantation , TOR Serine-Threonine Kinases/antagonists & inhibitors , Withholding Treatment , Acute Disease , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/prevention & control , Drug Substitution , Graft Rejection/epidemiology , Graft Rejection/prevention & control , Humans , Hypertension/epidemiology , Immunosuppression Therapy/methods , Immunosuppressive Agents/therapeutic use , Intention to Treat Analysis , Kidney , Kidney Transplantation/mortality , Neoplasms/epidemiology , Randomized Controlled Trials as Topic , Time Factors
ABSTRACT
BACKGROUND: Adequate haemodialysis (HD) in people with end-stage kidney disease (ESKD) is reliant upon establishment of vascular access, which may consist of arteriovenous fistula, arteriovenous graft, or central venous catheters (CVC). Although discouraged due to high rates of infectious and thrombotic complications as well as technical issues that limit their life span, CVC have the significant advantage of being immediately usable and are the only means of vascular access in a significant number of patients. Previous studies have established the role of thrombolytic agents (TLA) in the prevention of catheter malfunction. Systematic review of different thrombolytic agents has also identified their utility in restoration of catheter patency following catheter malfunction. To date the use and efficacy of fibrin sheath stripping and catheter exchange have not been evaluated against thrombolytic agents. OBJECTIVES: This review aimed to evaluate the benefits and harms of TLA, preparations, doses and administration as well as fibrin-sheath stripping, over-the-wire catheter exchange or any other intervention proposed for management of tunnelled CVC malfunction in patients with ESKD on HD. SEARCH METHODS: We searched the Cochrane Kidney and Transplant Specialised Register up to 17 August 2017 through contact with the Information Specialist using search terms relevant to this review. Studies in the Specialised Register are identified through searches of CENTRAL, MEDLINE, and EMBASE, conference proceedings, the International Clinical Trials Register (ICTRP) Search Portal, and ClinicalTrials.gov. SELECTION CRITERIA: We included all studies conducted in people with ESKD who rely on tunnelled CVC for either initiation or maintenance of HD access and who require restoration of catheter patency following late-onset catheter malfunction and evaluated the role of TLA, fibrin sheath stripping or over-the-wire catheter exchange to restore catheter function. 
The primary outcome was restoration of line patency, defined as ≥ 300 mL/min or adequate to complete a HD session or as defined by the study authors. Secondary outcomes included dialysis adequacy and adverse outcomes. DATA COLLECTION AND ANALYSIS: Two authors independently assessed retrieved studies to determine which studies satisfied the inclusion criteria and carried out data extraction. Included studies were assessed for risk of bias. Summary estimates of effect were obtained using a random-effects model, and results were expressed as risk ratios (RR) and their 95% confidence intervals (CI) for dichotomous outcomes, and mean difference (MD) and 95% CI for continuous outcomes. Confidence in the evidence was assessed using GRADE. MAIN RESULTS: Our search strategy identified 8 studies (580 participants) as eligible for inclusion in this review. Interventions included: thrombolytic therapy versus placebo (1 study); low versus high dose thrombolytic therapy (1); alteplase versus urokinase (1); short versus long thrombolytic dwell (1); thrombolytic therapy versus percutaneous fibrin sheath stripping (1); fibrin sheath stripping versus over-the-wire catheter exchange (1); and over-the-wire catheter exchange versus exchange with and without angioplasty sheath disruption (1). No two studies compared the same interventions. Most studies had a high risk of bias due to poor study design, broad inclusion criteria, low patient numbers and industry involvement. Based on low certainty evidence, thrombolytic therapy may restore catheter function when compared to placebo (149 participants: RR 4.05, 95% CI 1.42 to 11.56) but there are no data available to suggest an optimal dose or administration method.
The certainty of this evidence is reduced due to the fact that it is based on only a single study with wide confidence limits, high risk of bias and imprecision in the estimates of adverse events (149 participants: RR 2.03, 95% CI 0.38 to 10.73). Based on the available evidence, physical disruption of a fibrin sheath using interventional radiology techniques appears to be equally efficacious as the use of a pharmaceutical thrombolytic agent for the immediate management of dysfunctional catheters (57 participants: RR 0.92, 95% CI 0.80 to 1.07). Catheter patency is poor following use of thrombolytic agents with studies reporting median catheter survival rates of 14 to 42 days and was reported to improve significantly by fibrin sheath stripping or catheter exchange (37 participants: MD -27.70 days, 95% CI -51.00 to -4.40). Catheter exchange was reported to be superior to sheath disruption with respect to catheter survival (30 participants: MD 213.00 days, 95% CI 205.70 to 220.30). There is insufficient evidence to suggest any specific intervention is superior in terms of ensuring either dialysis adequacy or reduced risk of adverse events. AUTHORS' CONCLUSIONS: Thrombolysis, fibrin sheath disruption and over-the-wire catheter exchange are effective and appropriate therapies for immediately restoring catheter patency in dysfunctional cuffed and tunnelled HD catheters. On current data there is no evidence to support physical intervention over the use of pharmaceutical agents in the acute setting. Pharmacological interventions appear to have a bridging role and long-term catheter survival may be improved by fibrin sheath disruption and is probably superior following catheter exchange. There is no evidence favouring any of these approaches with respect to dialysis adequacy or risk of adverse events. The current review is limited by the small number of available studies with limited numbers of patients enrolled.
Most of the studies included in this review were judged to have a high risk of bias and were potentially influenced by pharmaceutical industry involvement. Further research is required to adequately address the question of the most efficacious and clinically appropriate technique for HD catheter dysfunction.
ABSTRACT
BACKGROUND: Sleep apnea is common and associated with poor outcomes in severe chronic kidney disease, but validated screening tools are not available. Our objectives were to determine the prevalence of sleep apnea in this population, to assess the validity of screening for sleep apnea using an ApneaLink device, and to investigate the relationship of sleep apnea to symptoms, spirometry and body water. METHODS: Patients with glomerular filtration rate ≤30 mL/min/1.73 m2, whether or not they were receiving haemodialysis, were eligible for enrolment. Participants completed symptom questionnaires, performed an ApneaLink recording and had total body water measured using bioimpedance. This was followed by a multi-channel polysomnography recording, the gold-standard diagnostic test for sleep apnea. RESULTS: Fifty-seven participants were enrolled and had baseline data collected, of whom only 2 did not have sleep apnea. An apnea hypopnea index ≥30/h was found in 66% of haemodialysis and 54% of non-dialysis participants. A central apnea index ≥5/h was present in 11 patients, with only one dialysis patient having predominantly central sleep apnea. ApneaLink underestimated sleep apnea severity, particularly in the non-dialysis group. Total body water corrected for body size, spirometry, subjective sleepiness and overall symptom scores were not associated with sleep apnea severity. CONCLUSIONS: This study demonstrates a very high prevalence of severe sleep apnea in patients with chronic kidney disease. Sleep apnea severity was not associated with quality of life or sleepiness scores and was unrelated to total body water corrected for body size. Routine identification of sleep apnea with polysomnography rather than screening is more appropriate in this group due to the high prevalence.
Subject(s)
Body Water , Renal Insufficiency, Chronic/epidemiology , Sleep Apnea Syndromes/epidemiology , Aged , Australia/epidemiology , Female , Glomerular Filtration Rate , Humans , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/metabolism , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Polysomnography , Prevalence , Quality of Life , Renal Dialysis , Renal Insufficiency, Chronic/metabolism , Renal Insufficiency, Chronic/physiopathology , Severity of Illness Index , Sleep Apnea Syndromes/physiopathology , Surveys and Questionnaires
ABSTRACT
Frailty is a multidimensional clinical syndrome characterized by low physical activity, reduced strength, accumulation of multiorgan deficits, decreased physiological reserve, and vulnerability to stressors. Frailty has key social, psychological, and cognitive implications. Frailty is accelerated by uremia, leading to a high prevalence of frailty in patients with advanced chronic kidney disease (CKD) and end-stage kidney disease (ESKD) as well as contributing to adverse outcomes in this patient population. Frailty assessment is not routine in patients with CKD; however, a number of validated clinical assessment tools can assist in prognostication. Frailty assessment in nephrology populations supports shared decision-making and advanced communication and should inform key medical transitions. Frailty screening and interventions in CKD or ESKD are a developing research priority with a rapidly expanding literature base.
ABSTRACT
OBJECTIVES: This study aims to describe the prevalence, characteristics, and longitudinal changes in frailty among outpatient chronic kidney disease (CKD) and haemodialysis (HD) populations and their impact on survival. DESIGN: Prospective observational cohort study. SETTING: Single-centre ambulatory tertiary care setting, metropolitan Australian teaching hospital. PARTICIPANTS: Adult patients with advanced CKD (defined as estimated glomerular filtration rate <20 mL/min) or undergoing maintenance HD, enrolled under an informed opt-out consent model. INTERVENTIONS: Fried frailty assessment at baseline, 6 months, and 12 months of longitudinal follow-up. PRIMARY OUTCOMES: All-cause mortality and kidney transplantation events. RESULTS: Frailty was identified in 36.3% of the 256 participants, and a further 46.5% were prefrail. Frailty was equally common in the CKD and HD cohorts. Frailty outperformed age, comorbidity, and laboratory parameters in predicting mortality risk (HR 2.83, 95% CI 1.44 to 5.56, p<0.001) and substantially reduced access to transplantation. While most participants exhibited a static Fried phenotype over longitudinal assessment, improvements in frailty were observed as frequently as frailty progression. Female gender and symptom burden predicted frailty progression. CONCLUSIONS: Frailty is highly prevalent and closely aligned with survival outcomes. Frailty among patients attending routine outpatient care may respond to intervention, with subsequent improvements in mortality and other patient-level outcomes.
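The Fried frailty assessment used in this study counts how many of five phenotype criteria (unintentional weight loss, exhaustion, low physical activity, slow gait speed, weak grip strength) a participant meets. A minimal classification sketch in Python, using the standard Fried cut-offs (≥3 frail, 1-2 prefrail, 0 robust), which the abstract itself does not restate:

```python
def fried_category(criteria_met: int) -> str:
    """Classify frailty status from the number of Fried phenotype
    criteria met (0-5): unintentional weight loss, exhaustion,
    low physical activity, slow gait, weak grip."""
    if not 0 <= criteria_met <= 5:
        raise ValueError("criteria_met must be between 0 and 5")
    if criteria_met >= 3:
        return "frail"
    if criteria_met >= 1:
        return "prefrail"
    return "robust"

# Example: a participant meeting two criteria is classified prefrail
status = fried_category(2)  # "prefrail"
```

The categorical cut-offs, not the individual criterion thresholds, are what the abstract's 36.3% frail / 46.5% prefrail figures rest on.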
Subject(s)
Frailty , Renal Dialysis , Renal Insufficiency, Chronic , Humans , Female , Male , Frailty/mortality , Prospective Studies , Aged , Middle Aged , Renal Insufficiency, Chronic/mortality , Renal Insufficiency, Chronic/therapy , Australia/epidemiology , Kidney Transplantation , Longitudinal Studies , Prevalence , Glomerular Filtration Rate
ABSTRACT
OBJECTIVE: To describe and analyse the perspectives and communication practices of kidney clinicians and older patients (aged 60+) during collaborative education and decision-making about dialysis. METHODS: This qualitative study drew on pluralistic data sources and analytical approaches, investigating elicited semi-structured interviews (n = 31) with doctors (n = 8), nurses (n = 8), and patients (n = 15), combined with ethnographic observations, written artefacts, and audio-recorded naturally occurring interactions (n = 23: education sessions n = 4; consultations n = 19) in a tertiary Australian kidney outpatient clinic. Data were analysed for themes and linguistic discourse features. RESULTS: Five themes were identified across all data sources: 1) lost opportunity in education; 2) persistent disease knowledge gaps; 3) putting up with dialysis; 4) perceived and real involvement in decision-making; and 5) the complex role of family as decision-making brokers. CONCLUSION: As the first study to complement interviews with evidence from naturally occurring kidney interactions, this study balances how older patients and their clinicians view chronic kidney disease education with how decision-making about dialysis unfolds in practice. PRACTICE IMPLICATIONS: The study suggests contextualized, multi-perspective formal and informal training to improve decision-making about dialysis, spanning from boosting communication efficiency to reducing unexplained jargon, incorporating patient navigators, and exploring different dialysis modalities.
Subject(s)
Renal Dialysis , Renal Insufficiency, Chronic , Humans , Decision Making , Australia , Decision Making, Shared , Qualitative Research
ABSTRACT
BACKGROUND: Cardiovascular disease is a major cause of death in patients with stage 4-5 chronic kidney disease (CKD, eGFR <30). There are only limited data on the risk factors predicting these complications in CKD patients. Our aim was to determine the role of clinical and echocardiographic parameters in predicting mortality and cardiovascular complications in CKD patients. METHODS: We conducted a prospective observational cohort study of 153 CKD patients between 2007 and 2009. All patients underwent echocardiography at baseline and were followed for a mean of 2.6 years using regular clinic visits and review of files and hospital presentations to record the incidence of cardiovascular events and death. RESULTS: Of the 153 patients enrolled, 57 (37%) were on dialysis, 45 (78%) of whom were on haemodialysis. An enlarged LV was present in 32% of patients, and in 22% the LVEF was below 55%. LV mass index was increased in 75% of patients. Some degree of diastolic dysfunction was present in 85% of patients, and 35% had grade 2 or higher diastolic dysfunction. During follow-up, 41 patients (27%) died, 15 (39%) from cardiovascular causes. Mortality was 24.0% in non-dialysis patients versus 31.6% in patients on dialysis (P = not significant). On multivariate analysis, age >75 years, previous history of MI, diastolic dysfunction, and detectable serum troponin T were significant independent predictors of mortality (P < 0.01). CONCLUSION: Patients with stage 4-5 CKD had a mortality rate of 27% over a mean follow-up of 2.6 years. Age >75 years, history of MI, diastolic dysfunction, and troponin T were independent predictors of mortality.
Subject(s)
Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/mortality , Stroke Volume , Survival Analysis , Ventricular Dysfunction, Left/diagnostic imaging , Ventricular Dysfunction, Left/mortality , Aged , Australian Capital Territory/epidemiology , Comorbidity , Female , Humans , Incidence , Male , Prognosis , Reproducibility of Results , Risk Factors , Sensitivity and Specificity , Survival Rate , Ultrasonography
ABSTRACT
Introduction: Effective strategies to prevent hemodialysis (HD) catheter dysfunction are lacking, and there is wide variation in practice. Methods: In this post hoc analysis of the REDUcing the burden of dialysis Catheter ComplicaTIOns: a national (REDUCCTION) stepped-wedge cluster randomized trial, encompassing 37 Australian nephrology services, 6361 participants, and 9872 catheters, we investigated whether the trial intervention, which promoted a suite of evidence-based practices for HD catheter insertion and management, reduced the incidence of catheter dysfunction, defined as catheter removal due to inadequate dialysis blood flow. We also analyzed outcomes among tunneled cuffed catheters and sources of event variability. Results: A total of 873 HD catheters were removed because of dysfunction over 1.12 million catheter days. The raw incidence was 0.91 events per 1000 catheter days during the baseline phase and 0.68 events per 1000 catheter days during the intervention phase. The service-wide incidence of catheter dysfunction was 33% lower during the intervention after adjustment for calendar time (incidence rate ratio = 0.67; 95% confidence interval [CI], 0.50-0.89; P = 0.006). Results were consistent among tunneled cuffed catheters (adjusted incidence rate ratio = 0.68; 95% CI, 0.49-0.94), which accounted for 75% of catheters (n = 7403), 97.4% of catheter exposure time, and 88.2% of events (n = 770). Among tunneled catheters that survived for 6 months (21.5% of tunneled catheters), between 2% and 5% of the unexplained variation in the number of catheter dysfunction events was attributable to service-level differences, and 18% to 36% was attributable to patient-level differences. Conclusion: Multifaceted interventions that promote evidence-based catheter care may prevent dysfunction, and patient factors are an important source of variation in events.
ABSTRACT
Background and Objectives: Progression of chronic kidney disease to ESKD is associated with a marked increase in mortality and morbidity. Progression is highly variable and difficult to predict. Methods: This is an observational, retrospective, single-centre study. The cohort comprised patients attending the hospital and nephrology clinic at The Canberra Hospital from September 1996 to March 2018. Demographic data, vital signs, kidney function tests, proteinuria, and serum glucose were extracted. The model was trained on the featurised time-series data with XGBoost. Its performance was compared against six nephrologists and the Kidney Failure Risk Equation (KFRE). Results: A total of 12,371 patients were included, of whom 2,388 had adequate data density (three eGFR data points in the first 2 years) for subsequent analysis. Patients were split 80%/20% into training and testing datasets. The ML model outperformed the nephrologists in predicting ESKD within 2 years, with 93.9% accuracy, 60% sensitivity, 97.7% specificity, and 75% positive predictive value, and was superior on all performance metrics to the KFRE 4- and 8-variable models. eGFR and glucose contributed most strongly to ESKD prediction performance. Conclusions: The computational predictions had higher accuracy, specificity, and positive predictive value, indicating potential for integration into clinical workflows for decision support.
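The reported performance figures (93.9% accuracy, 60% sensitivity, 97.7% specificity, 75% positive predictive value) all derive from a single binary confusion matrix. A sketch of that derivation in Python; the counts below are hypothetical, chosen only to approximately reproduce the reported metrics, since the abstract does not give the raw confusion matrix:

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard binary classification metrics from confusion-matrix counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),  # recall on true ESKD cases
        "specificity": tn / (tn + fp),  # recall on non-ESKD cases
        "ppv": tp / (tp + fp),          # positive predictive value
    }

# Hypothetical counts that roughly match the reported figures
m = classification_metrics(tp=30, fp=10, tn=425, fn=20)
# sensitivity 0.60, ppv 0.75, specificity ~0.977, accuracy ~0.938
```

Note the asymmetry the abstract reports: high specificity with modest sensitivity, a common profile when the positive class (progression to ESKD within 2 years) is rare in the test set.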
ABSTRACT
OBJECTIVE: To identify whether multifaceted interventions, or care bundles, reduce catheter-related bloodstream infections (CRBSIs) from central venous catheters used for haemodialysis. DESIGN: Stepped wedge, cluster randomised design. SETTING: 37 renal services across Australia. PARTICIPANTS: All adults (age ≥18 years) under the care of a renal service who required insertion of a new haemodialysis catheter. INTERVENTIONS: After a baseline observational phase, a service-wide, multifaceted intervention bundle that included elements of catheter care (insertion, maintenance, and removal) was implemented at one of three randomly assigned time points (12 services at the first time point, 12 at the second, and 13 at the third) between 20 December 2016 and 31 March 2020. MAIN OUTCOME MEASURE: The primary endpoint was the rate of CRBSI in the baseline phase compared with the intervention phase at the renal service level, using the intention-to-treat principle. RESULTS: 1.14 million haemodialysis catheter days of use were monitored across 6364 patients. Patient characteristics were similar across baseline and intervention phases. 315 CRBSIs occurred (158 in the baseline phase and 157 in the intervention phase), with a rate of 0.21 per 1000 days of catheter use in the baseline phase and 0.29 per 1000 days in the intervention phase, giving an incidence rate ratio of 1.37 (95% confidence interval 0.85 to 2.21; P=0.20). This translates to one in 10 patients who undergo dialysis for a year with a catheter experiencing an episode of CRBSI. CONCLUSIONS: Among patients who require a haemodialysis catheter, the implementation of a multifaceted intervention did not reduce the rate of CRBSI. Multifaceted interventions to prevent CRBSI might not be effective in clinical practice settings. TRIAL REGISTRATION: Australia New Zealand Clinical Trials Registry ACTRN12616000830493.
Subject(s)
Catheter-Related Infections , Central Venous Catheters , Sepsis , Adolescent , Adult , Catheter-Related Infections/epidemiology , Catheter-Related Infections/etiology , Catheter-Related Infections/prevention & control , Central Venous Catheters/adverse effects , Humans , Incidence , Renal Dialysis/adverse effects , Sepsis/complications
ABSTRACT
BACKGROUND: Peritoneal dialysis (PD)-associated peritonitis is treated by administration of antibiotics mixed with the PD solution. Data on antibiotic stability for solutions in current use are limited. The aim of this study was to determine the stability of cefepime, cephazolin, and ampicillin in three commercial PD solutions. METHODS: Antibiotics were added to the non-glucose compartment of the Gambro (Gambrosol®) and Fresenius (Balance®) multi-compartment systems and the Baxter (Dianeal®) single-compartment (glucose 2.5%) PD solution in a sterile suite. Antibiotic stability over 3 weeks was determined using both a bioassay of bacterial inhibition and measured antibiotic concentrations. The influence of storage conditions on stability and sterility was also determined. RESULTS: The bioassay demonstrated the stability of all antibiotics for 9 days at room temperature and 3 weeks when refrigerated, except ampicillin in the Gambro solution, which displayed no bioactivity after 4 days. However, a ceiling effect in bacterial inhibition at higher antibiotic concentrations limited the ability of the bioassay to detect antibiotic degradation at relevant concentrations. Antibiotic concentrations varied with time but were comparable to the bioassay and supported stability in refrigerated solutions, except ampicillin in the Gambro solution. No bacterial contamination, marked colour change, or precipitation occurred. CONCLUSIONS: This study supports the stability of cephazolin and cefepime in all three PD solutions, and of ampicillin in only the Baxter and Fresenius PD solutions. Antibiotic stability studies should ideally be conducted prior to registration and marketing of new PD solutions.
Subject(s)
Ampicillin/chemistry , Anti-Bacterial Agents/chemistry , Cefazolin/chemistry , Cephalosporins/chemistry , Dialysis Solutions/chemistry , Peritoneal Dialysis/adverse effects , Peritonitis/drug therapy , Bacteria/drug effects , Bacteria/growth & development , Cefepime , Chemistry, Pharmaceutical , Dialysis Solutions/standards , Drug Stability , Drug Storage , Humans , Peritonitis/etiology
ABSTRACT
Monitoring of blood flow in arteriovenous fistulae and arteriovenous grafts is recommended to predict access thrombosis. The ultrasound dilution technique (UDT) is the gold standard. We compared a recently described haemoglobin dilution technique (HDT) with UDT for measurement of vascular access flow. Access blood flow was measured in 67 stable dialysis patients using HDT with both a bedside HemoCue device (HemoCue AB, Ängelholm, Sweden) and laboratory haemoglobin measurement, and then by UDT in the same dialysis session. Median flow rate was 950 ml/min by UDT (IQR 490-1,440 ml/min), 935 ml/min by HemoCue HDT (IQR 475-1,395 ml/min, p = 0.534), and 920 ml/min by laboratory haemoglobin HDT (IQR 463-1,378 ml/min). Bland-Altman plots demonstrated poor agreement between UDT and HDT (limits of agreement -22.7 to 20.1% for HemoCue HDT and -21.2 to 20.4% for laboratory HDT). HDT can be used to measure vascular access flow but requires validation against clinical outcomes before being recommended as an alternative to UDT.
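The limits of agreement quoted above come from a Bland-Altman analysis of percentage differences between paired UDT and HDT measurements. A minimal stdlib-Python sketch of that computation; the paired flows below are hypothetical, since the abstract does not include patient-level data:

```python
from statistics import mean, stdev

def bland_altman_pct_limits(a, b):
    """Bias and 95% limits of agreement for percent differences,
    each pairwise difference expressed relative to the pair's mean."""
    diffs = [200 * (x - y) / (x + y) for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias - spread, bias, bias + spread

# Hypothetical paired access flows (ml/min): UDT vs bedside HDT
udt = [950, 490, 1440, 800, 1100]
hdt = [935, 475, 1395, 820, 1150]
lo, bias, hi = bland_altman_pct_limits(udt, hdt)
```

Limits of agreement of roughly ±21%, as reported, mean an individual HDT reading may differ from UDT by about a fifth of the measured flow, which is why the authors judge agreement poor despite near-identical medians.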
Subject(s)
Hematologic Tests , Hemodynamics , Hemoglobins/analysis , Indicator Dilution Techniques/instrumentation , Renal Dialysis/methods , Aged , Arteriovenous Fistula/physiopathology , Blood Pressure , Female , Hemoglobins/chemistry , Humans , Male , Middle Aged , Prospective Studies , Regression Analysis , Renal Dialysis/instrumentation , Thrombosis/diagnostic imaging , Ultrasonography
ABSTRACT
The Stent Placement in Patients with Atherosclerotic Renal Artery Stenosis and Impaired Renal Function (STAR) and Revascularization versus Medical Therapy for Renal-Artery Stenosis (ASTRAL) trials concluded that renal artery angioplasty was not superior to medical management in delaying progression to renal failure or controlling blood pressure in a selected population. (1,2) There were several criticisms of the STAR trial's methodology, and an important criticism of ASTRAL was that patients were excluded if their clinician was uncertain of the value of correcting the stenosis. Anuric renal failure caused by renal artery stenosis is a rare occurrence and falls outside these criteria.