ABSTRACT
BACKGROUND: Diet and exercise are important components of treatment for complex chronic conditions; however, access to allied health support is limited. When available, support is often siloed and fragmented. Digital health incorporating patient choice may help to align health care services with preferences and goals. This study evaluated the implementation of a ubiquitously accessible patient-centred digital health diet and exercise service. METHODS: U-DECIDE was a single-centre, 26-week randomised controlled trial set in kidney and liver disease clinics in a tertiary hospital in Brisbane, Australia. Participants were adults with a complex chronic condition referred for dietetic consultation with at least one feature of the metabolic syndrome. All participants received a dietary consultation, an activity monitor and usual care. Intervention participants were offered one text message per week and access to additional digital health options (increased text message frequency, nutrition app, exercise app, group-based diet and/or exercise video consultations). The primary outcome of feasibility was determined by safety (study-related serious adverse events: SRSAEs), recruitment (≥ 50% of eligible patients), retention (≥ 70%), exposure uptake (≥ 75% of the intervention group had greater access to health professional contact than the comparator) and video consultation adherence (≥ 80% attendance). Secondary outcomes included process evaluation metrics and clinical outcomes. RESULTS: Of 67 participants (intervention n = 33, comparator n = 34), 37 (55%) were men, and median (IQR) age was 51 (41-58) years. The most chosen digital health options were the nutrition app (n = 29, 88%) and exercise video consultations (n = 26, 79%). Only one participant chose no additional digital health options. The intervention group had no SRSAEs. The study exceeded targets for recruitment (52%), retention (81%) and exposure uptake (94%). Video consultation adherence was 42%.
Engagement across digital health options was inconsistent. CONCLUSIONS: Digital health options incorporating patient choice were feasible and can be offered to people with complex chronic disease as a service model option. TRIAL REGISTRATION: Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12620001282976. Registered 27 November 2020.
Subject(s)
Feasibility Studies , Humans , Male , Female , Middle Aged , Chronic Disease/therapy , Adult , Text Messaging , Australia , Exercise , Aged , Mobile Applications , Exercise Therapy/methods , Telemedicine
ABSTRACT
BACKGROUND: Supervised lifestyle interventions have the potential to significantly improve physical activity and fitness in patients with chronic kidney disease (CKD). METHODS: To assess the efficacy of a lifestyle intervention in patients with CKD to improve cardiorespiratory fitness and exercise capacity over 36 months, we conducted a randomized clinical trial, enrolling 160 patients with stage 3-4 CKD, with 81 randomized to usual care and 79 to a 3-year lifestyle intervention. The lifestyle intervention comprised care from a multidisciplinary team, including a nephrologist, nurse practitioner, exercise physiologist, dietitian, diabetes educator, psychologist, and social worker. The exercise training component consisted of an 8-week individualized and supervised gym-based exercise intervention followed by 34 months of a predominantly home-based program. Self-reported physical activity (metabolic equivalent of tasks [METs] minutes per week), cardiorespiratory fitness (peak O2 consumption [VO2peak]), exercise capacity (maximum METs and 6-minute walk distance) and neuromuscular fitness (grip strength and get-up-and-go test time) were evaluated at 12, 24, and 36 months. RESULTS: The intervention increased the percentage of patients meeting physical activity guideline targets of 500 MET min/wk from 29% at baseline to 63% at 3 years. At 12 months, both VO2peak and METs increased significantly in the lifestyle intervention group by 9.7% and 30%, respectively, without change in the usual care group. Thereafter, VO2peak declined to near baseline levels, whereas METs remained elevated in the lifestyle intervention group at 24 and 36 months. After 3 years, the intervention had increased the 6-minute walk distance and blunted declines in the get-up-and-go test time. CONCLUSIONS: A 3-year lifestyle intervention doubled the percentage of CKD patients meeting physical activity guidelines, improved exercise capacity, and ameliorated losses in neuromuscular and cardiorespiratory fitness.
Subject(s)
Healthy Lifestyle , Renal Insufficiency, Chronic/therapy , Aged , Exercise , Exercise Test , Exercise Therapy , Female , Heart Disease Risk Factors , Humans , Longitudinal Studies , Male , Middle Aged , Oxygen Consumption , Physical Fitness , Renal Insufficiency, Chronic/nursing , Renal Insufficiency, Chronic/physiopathology , Walking
ABSTRACT
BACKGROUND: Infections are a common complication following kidney transplantation, but are reported inconsistently in clinical trials. This study aimed to identify the infection outcomes of highest priority for patients/caregivers and health professionals to inform a core outcome set to be reported in all kidney transplant clinical trials. METHODS: In an international online survey, participants rated the absolute importance of 16 infections and eight severity dimensions on 9-point Likert scales, with 7-9 being critically important. Relative importance was determined using a best-worst scale. Means and proportions of the Likert-scale ratings and best-worst preference scores were calculated. RESULTS: A total of 353 healthcare professionals (19 who identified as both patients/caregivers and healthcare professionals) and 220 patients/caregivers (190 patients, 22 caregivers, eight who identified as both) from 55 countries completed the survey. Both healthcare professionals and patients/caregivers rated bloodstream (mean 8.4 and 8.5, respectively; aggregate 8.5), kidney/bladder (mean 7.9 and 8.4; aggregate 8.1), and BK virus (mean 8.1 and 8.6; aggregate 8.3) infections as the top three most critically important infection outcomes, whilst infectious death (mean 8.8 and 8.6; aggregate 8.7), impaired graft function (mean 8.4 and 8.7; aggregate 8.5) and admission to the intensive care unit (mean 8.2 and 8.3; aggregate 8.2) were the top three severity dimensions. Relative importance (best-worst) scores were consistent. CONCLUSIONS: Healthcare professionals and patients/caregivers consistently identified bloodstream infection, kidney/bladder infections, and BK virus as the three most important infection outcomes, and infectious death, admission to intensive care unit and infection impairing graft function as the three most important infection severity outcomes.
Subject(s)
Caregivers , Kidney Transplantation , Delphi Technique , Health Personnel , Humans , Kidney Transplantation/adverse effects , Surveys and Questionnaires
ABSTRACT
AIM: The benefits of dialysis in the older population remain highly debated, particularly for certain dialysis modalities. This study aimed to explore the dialysis modality utilization patterns between in-centre haemodialysis (ICHD), peritoneal dialysis (PD) and home haemodialysis (HHD) and their association with outcomes in older persons. METHODS: Older persons (≥75 years) initiating dialysis in Australia and New Zealand from 1999 to 2018 reported to the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry were included. The main aim of the study was to characterize dialysis modality utilization patterns and describe individual characteristics of each pattern. Relationships between identified patterns and survival, causes of death and withdrawal were examined as secondary analyses, where the pattern was considered as the exposure. RESULTS: A total of 10 306 older persons initiated dialysis over the study period. Of these, 6776 (66%) and 1535 (15%) were exclusively treated by ICHD and PD, respectively, while 136 (1%) ever received HHD during their dialysis treatment course. The remainder received both ICHD and PD: 906 (9%) started dialysis on ICHD and 953 (9%) on PD. Different individual characteristics were seen across dialysis modality utilization patterns. Median survival time was 3.0 (95%CI 2.9-3.1) years. Differences in survival were seen across groups and varied depending on the time period following dialysis initiation. Dialysis withdrawal was an important cause of death and varied according to individual characteristics and utilization patterns. CONCLUSION: This study showed that dialysis modality utilization patterns in older persons are associated with mortality, independent of individual characteristics.
Subject(s)
Kidney Failure, Chronic , Peritoneal Dialysis , Aged , Aged, 80 and over , Hemodialysis, Home/adverse effects , Humans , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/therapy , New Zealand/epidemiology , Peritoneal Dialysis/adverse effects , Registries , Renal Dialysis/adverse effects
ABSTRACT
OBJECTIVES: Modulating the large intestinal microbiome of kidney transplant recipients (KTRs) may reduce infectious complications. The aim of this study was to assess the feasibility of a randomized controlled trial of prebiotics in reducing infections and gastrointestinal symptoms in KTRs. DESIGN AND METHODS: Acute KTRs were recruited to a double-blind, placebo-controlled, randomized trial at a single kidney transplant center. Patients were provided with prebiotics or placebo for 7 weeks. The primary outcome was feasibility, defined as recruitment of ≥80% of eligible people within 6 months. Secondary outcomes included adherence and tolerability, participant retention in trial, proportions of participants providing serum and stool specimens, self-reported quality of life, gastrointestinal symptoms, and infection events. RESULTS: During the recruitment period, 72 patients met eligibility criteria, of whom 60 (83%) consented to participate (mean ± standard deviation age 53 ± 12 years; 62% males). Fifty-six (78%) participants were randomized (27 intervention, 29 control). Although participants receiving the intervention experienced reduced gastrointestinal symptoms (-0.28 [interquartile range, IQR -0.67 to 0.08] vs. -0.07 [IQR -0.27 to 0], P = .03), the control and intervention groups were similar in adherence (67% vs. 72%, P = .36), tolerability (56% vs. 62%, P = .64), quality of life (-0.2 [IQR -0.6 to 0] vs. -0.2 [IQR -0.8 to 0], P = .82), and infection events (33% vs. 34%, P = .83). Blood and stool samples were collected from ≥90% of participants in both groups. CONCLUSIONS: It is feasible to recruit and retain acute KTRs in a randomized, placebo-controlled trial examining the effect of prebiotics on infections and gastrointestinal symptoms. This study also showed that prebiotics significantly reduced gastrointestinal symptoms.
Subject(s)
Gastrointestinal Microbiome , Kidney Transplantation , Male , Humans , Adult , Middle Aged , Aged , Female , Prebiotics , Feasibility Studies , Quality of Life , Double-Blind Method
ABSTRACT
BACKGROUND: In the era of organ shortage, home hemodialysis (HHD) has been identified as a possible preferential bridge to kidney transplantation. Data are conflicting regarding the comparability of HHD and transplantation outcomes. This study aimed to compare patient and treatment survival between HHD patients and kidney transplant recipients. METHODS: The Australia and New Zealand Dialysis and Transplant Registry was used to include incident HHD patients on Day 90 after initiation of kidney replacement therapy and first kidney-only transplant recipients in Australia and New Zealand from 1997 to 2017. Survival times were analyzed using the Kaplan-Meier product-limit method comparing HHD patients with subtypes of kidney transplant recipients using the log-rank test. Adjusted analyses were performed with multivariable Cox proportional hazards regression models for time to all-cause mortality. Time-to-treatment failure or death was assessed as a composite secondary outcome. RESULTS: The study compared 1411 HHD patients with 4960 living donor (LD) recipients, 6019 standard criteria donor (SCD) recipients and 2427 expanded criteria donor (ECD) recipients. While LD and SCD recipients had reduced risks of mortality compared with HHD patients [LD adjusted hazard ratio (HR) = 0.57, 95% confidence interval (CI) 0.46-0.71; SCD HR = 0.65, 95% CI 0.52-0.79], the risk of mortality was comparable between ECD recipients and HHD patients (HR = 0.90, 95% CI 0.73-1.12). LD, SCD and ECD kidney recipients each experienced superior time-to-treatment failure or death compared with HHD patients. CONCLUSIONS: This large registry study showed that kidney transplant offers a survival benefit compared with HHD but that this advantage is not significant for ECD recipients.
Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Australia/epidemiology , Graft Survival , Hemodialysis, Home , Humans , Kidney Failure, Chronic/surgery , Living Donors , New Zealand/epidemiology , Registries , Renal Dialysis , Transplant Recipients , Treatment Outcome
ABSTRACT
BACKGROUND: Bayesian forecasting-based limited sampling strategies (LSSs) for tacrolimus have not been evaluated for the prediction of subsequent tacrolimus exposure. This study examined the predictive performance of Bayesian forecasting programs/services for the estimation of future tacrolimus area under the curve (AUC) from 0 to 12 hours (AUC0-12) in kidney transplant recipients. METHODS: Tacrolimus concentrations were measured in 20 adult kidney transplant recipients, 1 month post-transplant, on 2 occasions one week apart. Twelve samples were taken predose and 13 samples were taken postdose at the specified times on the first and second sampling occasions, respectively. The predicted AUC0-12 (AUCpredicted) was estimated using Bayesian forecasting programs/services and data from both sampling occasions for each patient and compared with the fully measured AUC0-12 (AUCmeasured) calculated using the linear trapezoidal rule on the second sampling occasion. The bias (median percentage prediction error [MPPE]) and imprecision (median absolute prediction error [MAPE]) were determined. RESULTS: Three programs/services were evaluated using different LSSs (C0; C0, C1, C3; C0, C1, C2, C4; and all available concentrations). MPPE and MAPE for the prediction of fully measured AUC0-12 were <15% for each program/service (with the exclusion of when only C0 was used), when using estimated AUC from data on the same (second) occasion. The MPPE and MAPE for the prediction of a future fully measured AUC0-12 were <15% for 2 programs/services (and for the third when participants who had a tacrolimus dose change between sampling days were excluded), when the occasion 1-AUCpredicted, using C0, C1, and C3, was compared with the occasion 2-AUCmeasured. 
CONCLUSIONS: All 3 Bayesian forecasting programs/services evaluated had acceptable bias and imprecision for predicting a future AUC0-12, using tacrolimus concentrations at C0, C1, and C3, and could be used for the accurate prediction of tacrolimus exposure in adult kidney transplant recipients.
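For readers reproducing the comparison above: the reference AUCmeasured is the linear trapezoidal rule applied to timed concentrations, and bias/imprecision are the median (absolute) percentage prediction errors. A minimal Python sketch of these three computations, using made-up concentration values rather than study data:

```python
from typing import Sequence

def trapezoidal_auc(times: Sequence[float], concs: Sequence[float]) -> float:
    """AUC by the linear trapezoidal rule: sum the area of each
    trapezoid between consecutive (time, concentration) points."""
    pairs = list(zip(times, concs))
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(pairs, pairs[1:]))

def _median(values: Sequence[float]) -> float:
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2.0

def mppe(predicted: Sequence[float], measured: Sequence[float]) -> float:
    """Median percentage prediction error (bias)."""
    return _median([100.0 * (p - m) / m for p, m in zip(predicted, measured)])

def mape(predicted: Sequence[float], measured: Sequence[float]) -> float:
    """Median absolute percentage prediction error (imprecision)."""
    return _median([abs(100.0 * (p - m) / m) for p, m in zip(predicted, measured)])

# Illustrative 0-12 h profile (hypothetical tacrolimus concentrations, ng/mL)
times = [0.0, 1.0, 2.0, 4.0, 6.0, 12.0]
concs = [5.0, 18.0, 14.0, 10.0, 8.0, 6.0]
auc_measured = trapezoidal_auc(times, concs)
```

The <15% acceptance threshold for MPPE and MAPE used in the study can then be checked directly against the returned values.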
Subject(s)
Immunosuppressive Agents/pharmacokinetics , Kidney Transplantation , Tacrolimus , Adult , Area Under Curve , Bayes Theorem , Drug Monitoring , Humans , Tacrolimus/pharmacokinetics , Transplant Recipients
ABSTRACT
BACKGROUND: There are few studies that have examined whether dysbiosis occurs in kidney donors and transplant recipients following kidney transplant surgery. AIM: To ascertain whether changes occur in the gastrointestinal microbiota of the kidney donor and recipient following kidney transplantation. METHODS: Kidney transplant recipients and their donors were prospectively enrolled in a pilot study to collect one faecal sample prior to surgery and another between four and eight weeks following surgery. Gastrointestinal microbiota richness, Shannon diversity measures and functional assessments of kidney donors and recipients were analysed via metagenomic sequencing. RESULTS: The study included 12 donors (median age 56 years, 6 females) and 12 recipients (median age 51 years, 3 females). Donor microbiota showed no significant changes in gastrointestinal microbiota richness, Shannon diversity, or functional assessments before and after nephrectomy. Recipient microbiota was altered post-transplant, reflected in reductions of the mean (±SD) richness values (156 ± 46.5 to 116 ± 38.6, p = 0.002) and Shannon diversity (3.57 ± 0.49 to 3.14 ± 0.52, p = 0.007), and a dramatic increase in Roseburia spp. abundance post-transplant (26-fold increase from 0.16 ± 0.0091 to 4.6 ± 0.3; p = 0.006; FDR = 0.12). Functionally, the post-transplant microbial community shifted towards those taxa using the glycolysis pathway (1.2-fold increase; p = 0.02; FDR = 0.26) for energy metabolism, while those functions involved with reactive oxygen species degradation decreased (2.6-fold; p = 0.006; FDR = 0.14). CONCLUSION: Live donor kidney transplantation and standard care post-transplant result in significant alterations in gut microbiota richness, diversity, composition and functional parameters in kidney transplant recipients but not in their kidney donors.
Subject(s)
Gastrointestinal Microbiome , Kidney Transplantation , Adult , Cohort Studies , Female , Humans , Living Donors , Male , Middle Aged , Transplant Recipients
ABSTRACT
AIM: With improved life expectancy over time, the burden of kidney failure resulting in kidney replacement therapy (KRT) in older persons is increasing. This study aimed to describe the age distribution at dialysis initiation in Australia and New Zealand (ANZ) across centres and over time. METHODS: Adults initiating dialysis as first KRT in ANZ from 1999 to 2018 reported to the Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry were included. The primary outcomes were the age distribution and the proportion of older persons (75 years and older) initiating dialysis across centres and over time. Secondary outcomes were characterization of the older population compared with younger people and differences in dialysis modality and treatment trajectories between groups. RESULTS: Over the study period, 55 382 people initiated dialysis as first KRT, including 10 306 older persons, in 100 centres. Wide variation in age distribution across states/countries was noted. The proportion of older persons at dialysis initiation rose from 13% in 1999 to 19% in 2003 and remained stable thereafter. Older persons were less likely to be treated with home therapies compared with younger people. Older persons were mostly Caucasian; had higher socioeconomic position, more cardiovascular comorbidities and higher eGFR at baseline; and resided in major cities. Higher proportions of older persons per centre were noted in privately funded facilities. CONCLUSION: Wide variations were noted in the proportions of older persons initiating dialysis across centres and states/country, which were associated with different case-mix across regions, particularly in terms of ethnicity, remoteness and socioeconomic advantage.
Subject(s)
Kidney Failure, Chronic/therapy , Renal Dialysis/statistics & numerical data , Age Distribution , Aged , Aged, 80 and over , Australia , Female , Humans , Male , Middle Aged , New Zealand , Time Factors
ABSTRACT
AIM: Haemodialysis treatment prescription varies widely internationally. This study explored patient- and centre-level characteristics associated with weekly haemodialysis hours. METHODS: Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry data were analysed. Characteristics associated with weekly duration were evaluated in the 2017 prevalent in-centre haemodialysis (ICHD) and home haemodialysis (HHD) cohorts using mixed-effects linear regression models, with patient- and centre-level covariates as fixed effects and dialysis centre and state as random effects. Patterns of weekly duration over time were evaluated in the 2000 to 2017 incident ICHD and HHD cohorts. RESULTS: Overall, 12 494 ICHD and 1493 HHD prevalent patients in 2017 were included. Median weekly treatment duration was 13.5 (interquartile range [IQR] 12-15) hours for ICHD and 16 (IQR 15-20) hours for HHD. Male sex, younger age, higher body mass index, arteriovenous fistula/graft use, Aboriginal and Torres Strait Islander ethnicity and longer dialysis vintage were associated with longer weekly duration for both ICHD and HHD. No centre characteristics were associated with duration. Variability in duration across centres was very limited in ICHD compared with HHD, with variation in HHD being associated with state. Duration did not vary significantly over time for ICHD, whereas longer weekly HHD treatments were reported between 2006 and 2012 compared with before and after this period. CONCLUSION: This study in the Australian and New Zealand haemodialysis population showed that weekly duration was primarily associated with patient characteristics. No centre effect was demonstrated. Practice patterns seemed to differ across states/countries, with more variability in HHD than ICHD.
Subject(s)
Ambulatory Care Facilities/trends , Nephrologists/trends , Practice Patterns, Physicians'/trends , Renal Dialysis/trends , Renal Insufficiency, Chronic/therapy , Adult , Aged , Australia , Female , Healthcare Disparities/trends , Hemodialysis, Home/trends , Humans , Incidence , Male , Middle Aged , Native Hawaiian or Other Pacific Islander , New Zealand/epidemiology , Prevalence , Registries , Renal Insufficiency, Chronic/diagnosis , Renal Insufficiency, Chronic/ethnology , Time Factors
ABSTRACT
BACKGROUND: Home-based dialysis therapies, namely home hemodialysis (HHD) and peritoneal dialysis (PD), are underutilized in many countries, and significant variation in the uptake of home dialysis exists across dialysis centers. This study aimed to evaluate the patient- and center-level characteristics associated with uptake of home dialysis. METHODS: The Australia and New Zealand Dialysis and Transplant (ANZDATA) Registry was used to include incident dialysis patients in Australia and New Zealand from 1997 to 2017. Uptake of home dialysis was defined as any HHD or PD treatment reported to ANZDATA within 6 months of dialysis initiation. Characteristics associated with home dialysis uptake were evaluated using mixed effects logistic regression models with patient- and center-level covariates, era as a fixed effect and dialysis center as a random effect. RESULTS: Overall, 54 773 patients were included. Uptake of home-based dialysis was reported in 24 399 (45%) patients but varied between 0 and 87% across the 76 centers. Patient-level factors associated with lower uptake included male sex, ethnicity (particularly indigenous peoples), older age, presence of comorbidities, late referral to a nephrology service, remote residence and obesity. Center-level predictors of lower uptake included small center size, smaller proportion of patients with permanent access at dialysis initiation and lower weekly facility hemodialysis hours. The variation in odds of home dialysis uptake across centers increased by 3% after adjusting for the era and patient-level characteristics but decreased by 24% after adjusting for center-level characteristics. CONCLUSION: Center-specific factors are associated with the variation in uptake of home dialysis across centers in Australia and New Zealand.
Subject(s)
Hemodialysis, Home/statistics & numerical data , Kidney Failure, Chronic/therapy , Peritoneal Dialysis/statistics & numerical data , Registries/statistics & numerical data , Adult , Aged , Australia , Female , Humans , Male , Middle Aged , New Zealand
ABSTRACT
This registry-based study evaluated the contribution of center characteristics to kidney transplant outcomes in adult first kidney transplant recipients in Australia and New Zealand between 2004 and 2014. Primary outcomes were mortality and graft failure, and secondary outcomes were transplant complications. Overall, 6970 transplants from 17 centers were included. For deceased donor transplants, 5-year patient and graft survival rates varied considerably (81.0-93.9% and 72.2-88.3%, respectively). Variations in mortality and graft failure were partially reduced after adjustment for patient characteristics (1% and 20% reductions) and more markedly reduced after adjustment for center characteristics (41% and 55% reductions). For living donor transplants, 5-year patient and graft survival rates varied (89.7-100% and 79.2-96.9%, respectively). Centers with high average total ischemic times (>14 h) were associated with higher mortality for both deceased (adjusted hazard ratio [AHR] 2.24, 95% CI 1.21-4.13) and living donor transplants (AHR 1.76, 95% CI 1.02-3.04). Small center size (<35 new kidney transplants/year) was associated with a lower hazard of mortality for living donor kidney transplants (AHR 0.48, 95% CI 0.28-0.81). No center characteristic was associated with graft failure. The appreciable variations in deceased donor kidney transplant recipient and graft survival outcomes across centers were attributable to center effects.
Subject(s)
Kidney Transplantation , Adult , Australia/epidemiology , Graft Rejection , Graft Survival , Humans , Living Donors , New Zealand/epidemiology , Registries
ABSTRACT
BACKGROUND: Solid organ transplant recipients are at high risk for infections due to the complexity of surgical procedures combined with the impact of immunosuppression. No consensus exists on the role of antibiotics for surgical site infections in solid organ transplant recipients. OBJECTIVES: To assess the benefits and harms of prophylactic antimicrobial agents for preventing surgical site infections in solid organ transplant recipients. SEARCH METHODS: The Cochrane Kidney and Transplant Register of Studies was searched up to 21 April 2020 through contact with the Information Specialist using search terms relevant to this review. Studies in the Register are identified through searches of CENTRAL, MEDLINE, and EMBASE, conference proceedings, the International Clinical Trials Register (ICTRP) Search Portal, and ClinicalTrials.gov. SELECTION CRITERIA: All randomised controlled trials (RCTs) and quasi-RCTs in any language assessing prophylactic antibiotics in preventing surgical site infections in solid organ transplant recipients at any time point after transplantation. DATA COLLECTION AND ANALYSIS: Two authors independently determined study eligibility, assessed quality, and extracted data. Primary outcomes were surgical site infections and antimicrobial resistance. Other outcomes included urinary tract infections, pneumonias and septicaemia, death (any cause), graft loss, graft rejection, graft function, adverse reactions to antimicrobial agents, and outcomes identified by the Standardised Outcomes of Nephrology Group (SONG), specifically graft health, cardiovascular disease, cancer and life participation. Summary effect estimates were obtained using a random-effects model and results were expressed as risk ratios (RR) and 95% confidence intervals (CI). The quality of the evidence was assessed using the risk of bias and the GRADE approach. MAIN RESULTS: We identified eight eligible studies (718 randomised participants). 
Overall, five studies (248 randomised participants) compared antibiotics versus no antibiotics, and three studies (470 randomised participants) compared extended duration versus short duration antibiotics. Risk of bias was assessed as high for performance bias (eight studies), detection bias (eight studies) and attrition bias (two studies). It is uncertain whether antibiotics reduce the incidence of surgical site infections as the certainty of the evidence has been assessed as very low (RR 0.42, 95% CI 0.21 to 0.85; 5 studies, 226 participants; I2 = 25%). The certainty of the evidence was very low for all other reported outcomes (death, graft loss, and other infections). It is uncertain whether extended duration antibiotics reduce the incidence of surgical site infections in either solid organ transplant recipients (RR 1.19, 95% CI 0.58 to 2.48; 2 studies, 302 participants; I2 = 0%) or kidney-only transplant recipients (RR 0.50, 95% CI 0.05 to 5.48; 1 study, 205 participants) as the certainty of the evidence has been assessed as very low. The certainty of the evidence was very low for all other reported outcomes (death, graft loss, and other infections). None of the eight included studies evaluated antimicrobial agent adverse reactions, graft health, cardiovascular disease, cancer, life participation, biochemical and haematological parameters, intervention cost, hospitalisation length, or overall hospitalisation costs. AUTHORS' CONCLUSIONS: Due to methodological limitations, risk of bias and significant heterogeneity, the current evidence for the use of prophylactic perioperative antibiotics in transplantation is of very low quality. Further high quality, adequately powered RCTs would help better inform clinical practice.
Subject(s)
Anti-Bacterial Agents/therapeutic use , Antibiotic Prophylaxis , Surgical Wound Infection/prevention & control , Transplant Recipients , Bias , Graft Survival , Humans , Pneumonia/epidemiology , Randomized Controlled Trials as Topic , Sepsis/epidemiology , Surgical Wound Infection/mortality
ABSTRACT
Infectious complications are common following kidney transplantation and rank in the top five causes of death in patients with allograft function. Over the last 5 years, there has been emerging evidence that changes in the gastrointestinal microbiota following kidney transplantation may play a key role in the pathogenesis of transplant-associated infections. Different factors have emerged which may disrupt the interaction between the gastrointestinal microbiota and the immune system, which may lead to infective complications in kidney transplant recipients. This review will discuss the structure and function of the gastrointestinal microbiota, the changes that occur in the gastrointestinal microbiota following kidney transplantation and the factors underpinning these changes, how these changes may lead to transplant-associated infectious complications and potential treatments which may be instituted to mitigate this risk.
Subject(s)
Bacterial Infections/microbiology , Gastrointestinal Microbiome , Gastrointestinal Tract/microbiology , Kidney Transplantation/adverse effects , Opportunistic Infections/microbiology , Animals , Bacterial Infections/immunology , Bacterial Infections/prevention & control , Dysbiosis , Host-Pathogen Interactions , Humans , Immunocompromised Host , Immunosuppressive Agents/adverse effects , Opportunistic Infections/immunology , Opportunistic Infections/prevention & control , Prebiotics/administration & dosage , Probiotics/administration & dosage , Risk Factors , Synbiotics/administration & dosage , Treatment Outcome
ABSTRACT
BACKGROUND: Prednisolone displays significant pharmacokinetic variability and exposure-outcome relationships in renal transplant recipients, suggesting a role for drug monitoring in some scenarios. It is highly protein-bound, and the free form is pharmacologically active but cumbersome to measure. Saliva concentrations might reflect free plasma prednisolone and present an alternative measurement. The aim of this study was to examine the correlation between total and free plasma and saliva prednisolone in adult renal transplant recipients. METHODS: Total and free plasma and saliva prednisolone concentrations were measured in 20 patients receiving oral prednisolone 1-2 months after transplant, between pre-dose and 12 hours post-dose. Prednisolone was determined using high-performance liquid chromatography with mass spectrometric detection. The Pearson coefficient was used to assess the association between plasma and salivary prednisolone concentrations and area under the concentration-time curves (AUC0-12). RESULTS: When considering all time points, the total and free plasma prednisolone concentrations correlated well (r = 0.81), but there was poor correlation between saliva and free (r = 0.003) and total (r = 0.01) plasma concentrations. When concentrations before the maximum free prednisolone plasma value were excluded, the correlation between free plasma and saliva concentrations improved (r = 0.57). There was a moderate correlation between free and total plasma prednisolone AUC0-12 (r = 0.62) using all time points, but a poor correlation between free and total plasma prednisolone AUC0-12 and saliva AUC0-12 (r = 0.07; r = 0.17). CONCLUSIONS: Total and free plasma prednisolone measurements correlated poorly with saliva measurements; however, correlation improved when concentrations early in the dosing interval were excluded.
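The association measure throughout this analysis is the Pearson product-moment coefficient. A minimal sketch of its computation (the paired values below are illustrative only, not study concentrations):

```python
import math
from typing import Sequence

def pearson_r(x: Sequence[float], y: Sequence[float]) -> float:
    """Pearson product-moment correlation between two equal-length series."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Covariance term and the two standard-deviation terms (unnormalized;
    # the shared 1/n factors cancel in the ratio).
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical paired total and free plasma prednisolone concentrations
total_plasma = [250.0, 180.0, 120.0, 80.0, 40.0]
free_plasma = [60.0, 45.0, 28.0, 20.0, 9.0]
r = pearson_r(total_plasma, free_plasma)  # near-linear pairs, so r is close to 1
```

The same routine applied to saliva versus free plasma pairs would reproduce the kind of r values reported above.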
Subject(s)
Glucocorticoids/pharmacokinetics , Kidney Transplantation , Prednisolone/pharmacokinetics , Saliva/chemistry , Adult , Aged , Area Under Curve , Female , Glucocorticoids/blood , Glucocorticoids/chemistry , Glucocorticoids/therapeutic use , Humans , Immunosuppressive Agents/therapeutic use , Male , Middle Aged , Mycophenolic Acid/therapeutic use , Prednisolone/blood , Prednisolone/chemistry , Prednisolone/therapeutic use , Tacrolimus/therapeutic use , Transplant Recipients , Young Adult
ABSTRACT
INTRODUCTION: High-intensity interval training (HIIT) increases mitochondrial biogenesis and cardiorespiratory fitness in chronic disease populations, but it has not been studied in people with chronic kidney disease (CKD). The aim of this study was to compare the feasibility, safety, and efficacy of HIIT with moderate-intensity continuous training (MICT) in people with CKD. METHODS: Fourteen individuals with stage 3-4 CKD were randomized to 12 weeks of supervised exercise training, 3 sessions/wk, as either HIIT (n = 9, 4 × 4-minute intervals at 80%-95% peak heart rate [PHR]) or MICT (n = 5, 40 minutes at 65% PHR). Feasibility was assessed via session attendance and adherence to the exercise intensity. Safety was examined by adverse event reporting. Efficacy was determined from changes in cardiorespiratory fitness (VO2 peak), exercise capacity (METs), and markers of mitochondrial biogenesis (PGC1α protein levels), muscle protein catabolism (MuRF1), and muscle protein synthesis (p-P70S6K Thr389). RESULTS: Participants completed a similar number of sessions in each group (HIIT = 33.0 [7.0] vs MICT = 33.5 [3.3] sessions), and participants in both groups adhered to the target heart rates. There were no adverse events attributable to exercise training. There was a significant time effect for exercise capacity (HIIT = +0.8 ± 1.2; MICT = +1.3 ± 1.6 METs; P = 0.01) and muscle protein synthesis (HIIT = +0.6 ± 1.1; MICT = +1.4 ± 1.7 au; P = 0.04). However, there were no significant (P > 0.05) group × time effects for any outcomes. CONCLUSION: This pilot study demonstrated that HIIT is a feasible and safe option for people with CKD, with similar benefits of HIIT and MICT on exercise capacity and skeletal muscle protein synthesis. These data support a larger trial to further evaluate the effectiveness of HIIT.
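The intensity prescriptions above (80%-95% of PHR for HIIT intervals, 65% of PHR for MICT) reduce to simple arithmetic on a participant's measured peak heart rate; a minimal sketch, with the peak-HR value purely illustrative:

```python
def hiit_target_zone(peak_hr, low=0.80, high=0.95):
    """Target heart-rate band (beats/min) for a HIIT work interval,
    per the 80%-95% PHR prescription described above."""
    return round(peak_hr * low), round(peak_hr * high)

def mict_target(peak_hr, frac=0.65):
    """Continuous-training target heart rate (65% PHR)."""
    return round(peak_hr * frac)

# Example: a participant with a measured peak heart rate of 160 bpm.
zone = hiit_target_zone(160)   # → (128, 152)
steady = mict_target(160)      # → 104
```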
Subject(s)
Cardiorespiratory Fitness , Exercise Therapy , High-Intensity Interval Training , Renal Insufficiency, Chronic/therapy , Aged , Female , Heart Rate , Humans , Male , Middle Aged , Muscle Proteins/metabolism , Muscle, Skeletal/metabolism , Organelle Biogenesis , Oxygen Consumption , Peroxisome Proliferator-Activated Receptor Gamma Coactivator 1-alpha/metabolism , Pilot Projects , Ribosomal Protein S6 Kinases, 70-kDa/metabolism , Tripartite Motif Proteins/metabolism , Ubiquitin-Protein Ligases/metabolism
ABSTRACT
Compared with both the general population and recipients' own pre-transplant status, the incidence of infectious complications increases substantially following kidney transplantation, causing significant morbidity and mortality. The potent immunosuppressive therapy given to prevent graft rejection in kidney transplant recipients results in an increased susceptibility to a wide range of opportunistic infections, including bacterial, viral and fungal infections. Over the last five years, several advances have occurred that may have changed the burden of infectious complications in kidney transplant recipients. The availability of direct-acting antivirals to manage donor-derived hepatitis C infection has opened the way for donors with hepatitis C infection to be considered in the donation process. In addition, medications targeting the growing burden of resistant cytomegalovirus have been developed, and the gastrointestinal microbiota has been found to play a potentially important role in the pathogenesis of post-transplant infection. In this narrative review, we discuss these three advances and their potential implications for clinical practice.
Subject(s)
Cytomegalovirus Infections/classification , Hepatitis C/complications , Kidney Transplantation/adverse effects , Adult , Cytomegalovirus/pathogenicity , Cytomegalovirus Infections/physiopathology , Female , Gastrointestinal Microbiome , Hepacivirus/pathogenicity , Hepatitis C/physiopathology , Humans , Incidence , Kidney Transplantation/methods , Male , Middle Aged , Postoperative Complications/etiology , Postoperative Complications/physiopathology
ABSTRACT
BACKGROUND: Although multiple linear regression-based limited sampling strategies (LSSs) have been published for enteric-coated mycophenolate sodium, none have been evaluated for the prediction of subsequent mycophenolic acid (MPA) exposure. This study aimed to examine the predictive performance of the published LSS for the estimation of future MPA area under the concentration-time curve from 0 to 12 hours (AUC0-12) in renal transplant recipients. METHODS: Total MPA plasma concentrations were measured in 20 adult renal transplant patients on 2 occasions a week apart. All subjects received concomitant tacrolimus and were approximately 1 month after transplant. Samples were taken at 0, 0.33, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 6, and 8 hours and 0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, 2, 3, 4, 6, 9, and 12 hours after dose on the first and second sampling occasion, respectively. Predicted MPA AUC0-12 was calculated using 19 published LSSs and data from the first or second sampling occasion for each patient and compared with the second occasion full MPA AUC0-12 calculated using the linear trapezoidal rule. Bias (median percentage prediction error) and imprecision (median absolute prediction error) were determined. RESULTS: Median percentage prediction error and median absolute prediction error for the prediction of full MPA AUC0-12 were <15% for 4 LSSs, using the data from the same (second) occasion. One equation (1.583C1 + 0.765C2 + 0.369C2.5 + 0.748C3 + 1.518C4 + 2.158C6 + 3.292C8 + 3.6690) showed bias and imprecision <15% for the prediction of future MPA AUC0-12, where the predicted AUC0-12 from the first occasion was compared with the full AUC0-12 from the second. All LSSs with an acceptable predictive performance included concentrations taken at least 6 hours after the dose. CONCLUSIONS: Only one LSS had an acceptable bias and precision for future estimation. 
Accurate dosage prediction using a multiple linear regression-based LSS was not possible without concentrations up to at least 8 hours after the dose.
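The one LSS equation reported above, the full-AUC calculation by the linear trapezoidal rule, and the percentage prediction error underlying the bias/imprecision metrics can be sketched as follows. The concentration values in the example are hypothetical; only the equation coefficients come from the abstract.

```python
def lss_auc(c1, c2, c2_5, c3, c4, c6, c8):
    """Predicted MPA AUC0-12 from the single published LSS equation
    that met the <15% bias/imprecision criterion (C(t) = plasma
    concentration at t hours post-dose)."""
    return (1.583 * c1 + 0.765 * c2 + 0.369 * c2_5 + 0.748 * c3
            + 1.518 * c4 + 2.158 * c6 + 3.292 * c8 + 3.6690)

def trapezoid_auc(times, concs):
    """Full AUC by the linear trapezoidal rule over sampled points."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def pct_prediction_error(predicted, observed):
    """Percentage prediction error; its median across patients gives
    bias, and the median of its absolute value gives imprecision."""
    return 100 * (predicted - observed) / observed

# Hypothetical concentrations (mg/L) at 1, 2, 2.5, 3, 4, 6, 8 h post-dose.
predicted = lss_auc(8.0, 4.5, 3.8, 3.2, 2.5, 1.8, 1.2)
```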
Subject(s)
Kidney Transplantation/statistics & numerical data , Mycophenolic Acid/pharmacokinetics , Tablets, Enteric-Coated/pharmacokinetics , Tablets, Enteric-Coated/therapeutic use , Transplant Recipients/statistics & numerical data , Australia , Female , Humans , Immunosuppressive Agents/administration & dosage , Linear Models , Male , Middle Aged , Mycophenolic Acid/administration & dosage , Mycophenolic Acid/blood , Sample Size , Tacrolimus/administration & dosage
ABSTRACT
Thrombotic microangiopathy (TMA) arises in a variety of clinical circumstances with the potential to cause significant dysfunction of the kidneys, brain, gastrointestinal tract and heart. TMA should be considered in all patients with thrombocytopenia and anaemia, with an immediate request to the haematology laboratory to look for red cell fragments on a blood film. Although TMA of any aetiology generally demands prompt treatment, this is especially so in thrombotic thrombocytopenic purpura (TTP) and atypical haemolytic uraemic syndrome (aHUS), where organ failure may be precipitous, irreversible and fatal. In all adults, urgent, empirical plasma exchange (PE) should be started within 4-8 h of presentation for a possible diagnosis of TTP, pending a result for ADAMTS13 (a disintegrin and metalloproteinase with a thrombospondin type 1 motif, member 13) activity. A sodium citrate plasma sample should be collected for ADAMTS13 testing prior to any plasma therapy. In children, Shiga toxin-associated haemolytic uraemic syndrome due to infection with Escherichia coli (STEC-HUS) is the commonest cause of TMA, and is managed supportively. If TTP and STEC-HUS have been excluded, a diagnosis of aHUS should be considered, for which treatment is with the monoclonal complement C5 inhibitor, eculizumab. Although early confirmation of aHUS is often not possible, except in the minority of patients in whom auto-antibodies against factor H are identified, genetic testing ultimately reveals a complement-related mutation in a significant proportion of aHUS cases. The presence of other TMA-associated conditions (e.g. infection, pregnancy/postpartum and malignant hypertension) does not exclude TTP or aHUS as the underlying cause of TMA.