ABSTRACT
Interest in the use of sorbents in chronic dialysis treatment has undergone a revival in recent decades, for which two major factors are responsible. The first is the potential of sorbents as adjunct therapy for the removal of substances that are difficult to remove by conventional dialysis therapies. The second is their use in the regeneration of dialysate, which is of pivotal importance in the design of portable or even wearable treatments, next to the potential for reducing water use during conventional dialysis treatment. Sorbent-enhanced dialysis with synthetic polymers has been associated with a reduction in inflammatory parameters compared with hemodialysis and, in smaller studies, even with improved survival, although this needs to be confirmed in large randomized trials. Incorporation of sorbents within a dialysis membrane (mixed matrix membrane) appears to be a promising way forward to reduce the complexity and costs of a dual therapy but needs to be tested in vivo. For regeneration of dialysate, at present, a combination of urease, zirconium-based sorbents, and activated charcoal is used. Sodium release by the sorbent in exchange for ammonium, together with the CO2 released by the hydrolysis of urea, has been a bottleneck in the design of wearable devices, although short-term trials have been performed. Still, for widespread and flexible application of sorbent-assisted portable or wearable devices, a direct urea sorbent would be a major asset. In the near future, it will likely become apparent whether sorbent-assisted dialysis techniques are feasible for routine implementation in clinical practice.
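As a hedged illustration of why ammonium handling, sodium release and CO2 generation arise in this sorbent cascade, the net reactions commonly described for urease/zirconium phosphate systems can be sketched as follows (a simplified scheme, not taken from this abstract: the carbamate intermediate and exact exchange stoichiometry are omitted, and the ZrP notation is purely schematic):

```latex
% Simplified sketch of the sorbent cascade chemistry: urease hydrolyses urea to
% ammonium and carbonate, zirconium phosphate (ZrP) exchanges ammonium for
% sodium/hydrogen ions, and the released protons convert carbonate to CO2.
\mathrm{CO(NH_2)_2 + 2\,H_2O \;\xrightarrow{\text{urease}}\; 2\,NH_4^{+} + CO_3^{2-}}
\qquad
\mathrm{NH_4^{+} + ZrP{:}(Na^{+},H^{+}) \;\longrightarrow\; ZrP{:}NH_4^{+} + Na^{+}\ (\text{or } H^{+})}
\qquad
\mathrm{CO_3^{2-} + 2\,H^{+} \;\longrightarrow\; CO_2\!\uparrow + H_2O}
```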
ABSTRACT
Introduction Arterio-venous fistula (AVF) maturation assessment is essential to reduce venous catheter residence. We introduced central venous oxygen saturation (ScvO2) and estimated upper-body blood flow (eUBBF) to monitor the maturation of newly created fistulas and recorded catheter time in patients with and without ScvO2-based fistula maturation monitoring. Methods From 2017 to 2019 we conducted a multicenter quality improvement project (QIP) in hemodialysis patients with the explicit goal of shortening catheter residence time post-AVF creation through ScvO2-based maturation monitoring. In patients with a catheter as vascular access, we tracked ScvO2 and eUBBF pre- and post-AVF creation. The primary outcome was catheter residence time post-AVF creation, which we compared between QIP patients and controls. One control group comprised concurrent patients; a second control group comprised historic controls (2014-2016). We conducted Kaplan-Meier analysis and constructed a Cox proportional hazards model with covariate adjustment to assess time to catheter removal. Results The QIP group comprised 44 patients (59±17 years), the concurrent control group 48 patients (59±16 years), and the historic control group 57 patients (58±15 years). Six months post-AVF creation, the fraction of non-censored patients with a catheter in place was 21% in the QIP cohort, 67% in the concurrent control group, and 68% in the historic control group. In unadjusted and adjusted analyses, catheter residence time post-fistula creation was shorter in QIP patients compared with either control group (P<0.001). Conclusion ScvO2-based assessment of fistula maturation is associated with shorter catheter residence post-AVF creation.
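A minimal sketch of how such a time-to-catheter-removal comparison could be set up in Python with the lifelines package is shown below; the file name, column names (days_to_removal, removed, group, age) and covariates are hypothetical stand-ins, not the study's actual data or code.

```python
# Sketch only: Kaplan-Meier curves and an adjusted Cox model for time to
# catheter removal after AVF creation, using the lifelines package.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical data: one row per patient.
# days_to_removal = follow-up (days), removed = 1 if catheter removed (0 = censored),
# group = "QIP", "concurrent" or "historic"; age is an example covariate.
df = pd.read_csv("catheter_followup.csv")

kmf = KaplanMeierFitter()
for name, grp in df.groupby("group"):
    kmf.fit(grp["days_to_removal"], event_observed=grp["removed"], label=name)
    print(name, "fraction with catheter still in place at 180 days:",
          round(float(kmf.predict(180)), 2))

# Adjusted comparison of time to catheter removal between groups
cox_df = pd.get_dummies(df[["days_to_removal", "removed", "age", "group"]],
                        columns=["group"], drop_first=True, dtype=float)
cph = CoxPHFitter()
cph.fit(cox_df, duration_col="days_to_removal", event_col="removed")
cph.print_summary()
```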
ABSTRACT
While physical activity (PA) is understood to promote vascular health, little is known about whether the daily and weekly patterns of PA accumulation are associated with vascular health. Accelerometer-derived (activPAL3) 6- or 7-day stepping was analyzed for 6430 participants in The Maastricht Study (50.4% women; 22.4% Type 2 diabetes mellitus (T2DM)). Multivariable regression models examined associations of stepping metrics (average step count, and time spent in slower- and faster-paced stepping) with arterial stiffness (measured as carotid-femoral pulse wave velocity (cfPWV)) and several indices of microvascular health (heat-induced skin hyperemia, retinal vessel reactivity and diameter), adjusting for confounders and moderators. PA pattern metrics were added to the regression models to identify associations with vascular health beyond those of the stepping metrics. Analyses were stratified by T2DM status if an interaction effect was present. Average step count and time spent in faster-paced stepping were associated with better vascular health, and the associations were stronger in those with T2DM than in those without. In fully adjusted models, a higher step count inter-daily stability was associated with a higher (worse) cfPWV in those without T2DM (std ß = 0.04, p = 0.007) and with retinal venular diameter in the whole cohort (std ß = 0.07, p = 0.002). A higher within-day variability in faster-paced stepping was associated with a lower (worse) heat-induced skin hyperemia in those with T2DM (std ß = -0.31, p = 0.008). Above and beyond PA volume, the daily and weekly patterns in which PA was accumulated were additionally associated with macro- and microvascular health, which may have implications for the prevention of vascular disease.
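The regression approach described above could, in outline, look like the following statsmodels sketch; the variable names (interdaily_stability, cfpwv, t2dm, sex) are assumed placeholders, and the real analysis adjusted for a larger set of confounders and moderators.

```python
# Sketch only: standardized association of a stepping-pattern metric with cfPWV,
# testing a T2DM interaction and stratifying only if it is present.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("steps_vascular.csv")  # hypothetical analysis extract

# z-standardize continuous variables so coefficients read as standardized betas
for col in ["interdaily_stability", "cfpwv", "age"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

full = smf.ols("cfpwv_z ~ interdaily_stability_z * t2dm + age_z + sex",
               data=df).fit()

# Stratify by T2DM status only when the interaction term is significant
if full.pvalues["interdaily_stability_z:t2dm"] < 0.05:
    for status, grp in df.groupby("t2dm"):
        m = smf.ols("cfpwv_z ~ interdaily_stability_z + age_z + sex", data=grp).fit()
        print("T2DM" if status else "No T2DM",
              "std beta:", round(m.params["interdaily_stability_z"], 3))
else:
    print("whole-cohort std beta:", round(full.params["interdaily_stability_z"], 3))
```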
Subject(s)
Diabetes Mellitus, Type 2 , Exercise , Vascular Stiffness , Humans , Female , Vascular Stiffness/physiology , Male , Middle Aged , Diabetes Mellitus, Type 2/physiopathology , Exercise/physiology , Aged , Hyperemia/physiopathology , Accelerometry , Carotid-Femoral Pulse Wave Velocity , Adult , Pulse Wave Analysis , Retinal Vessels/physiology
ABSTRACT
OBJECTIVES: The rising diversity of food preferences and the desire to provide better personalized care present challenges to renal dietitians working in dialysis clinics. To address this situation, we explored the use of a large language model, specifically ChatGPT using the GPT-4 model (openai.com), to support nutritional advice given to dialysis patients. METHODS: We tasked ChatGPT-4 with generating a personalized daily meal plan, including nutritional information. Virtual "patients" were generated through Monte Carlo simulation; data from a randomly selected virtual patient were presented to ChatGPT. We provided ChatGPT with patient demographics, food preferences, laboratory data, clinical characteristics, and available budget to generate a one-day sample menu with recipes and nutritional analyses. The resulting daily recipe recommendations, cooking instructions, and nutritional analyses were reviewed and rated on a five-point Likert scale by an experienced renal dietitian. In addition, the generated content was rated by a renal dietitian and compared with U.S. Department of Agriculture-approved nutrient analysis software. ChatGPT also analyzed nutrition information of two recipes published online. We also requested a translation of the output into Spanish, Mandarin, Hungarian, German, and Dutch. RESULTS: ChatGPT generated a daily menu with five recipes. The renal dietitian rated the recipes at 3 (3, 3) [median (Q1, Q3)], the cooking instructions at 5 (5, 5), and the nutritional analysis at 2 (2, 2) on the five-point Likert scale. ChatGPT's nutritional analysis underestimated calories by 36% (95% CI: 44-88%), protein by 28% (25-167%), fat by 48% (29-81%), phosphorus by 54% (15-102%), potassium by 49% (40-68%), and sodium by 53% (14-139%). The nutritional analysis of recipes available online differed by only 0 to 35%. The translations were rated as reliable by native speakers (4 on the five-point Likert scale). CONCLUSION: While ChatGPT-4 shows promise in providing personalized nutritional guidance for diverse dialysis patients, improvements are necessary. This study highlights the importance of thorough qualitative and quantitative evaluation of artificial intelligence-generated content, especially regarding medical use cases.
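Purely as an illustration of the prompting step, a menu-generation request might be sketched as below. The study itself used the ChatGPT web interface with the GPT-4 model; the API call (openai Python SDK v1 interface), prompt wording and patient fields shown here are assumptions, not the study's protocol.

```python
# Sketch only: one simulated patient profile passed to a GPT-4 chat completion.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

patient = {
    "age": 58, "sex": "female", "cuisine": "Mexican, vegetarian",
    "serum_potassium_mmol_l": 5.6, "serum_phosphate_mg_dl": 6.1,
    "albumin_g_dl": 3.4, "daily_food_budget_usd": 12,
}

prompt = (
    "Act as a renal dietitian. Create a one-day meal plan for a hemodialysis "
    "patient with the following characteristics, including recipes, cooking "
    "instructions and a nutritional analysis (energy, protein, fat, phosphorus, "
    f"potassium, sodium) per meal: {patient}"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```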
ABSTRACT
Repeated single-point measurements of thoracic bioimpedance at a single (low) frequency are strongly related to fluid changes during hemodialysis. Extension to semi-continuous measurements may provide longitudinal details in the time pattern of the bioimpedance signal, and multi-frequency measurements may add in-depth information on the distribution between intra- and extracellular fluid. This study aimed to investigate the feasibility of semi-continuous multi-frequency thoracic bioimpedance measurements by a wearable device in hemodialysis patients. Therefore, thoracic bioimpedance was recorded semi-continuously (i.e., every ten minutes) at nine frequencies (8-160 kHz) in 68 patients during two consecutive hemodialysis sessions, complemented by a single-point measurement at home in between the two sessions. On average, the resistance signals increased during both hemodialysis sessions and decreased during the interdialytic interval. The increase during dialysis was larger at 8 kHz (∆ 32.6 Ω during session 1 and ∆ 10 Ω during session 2) than at 160 kHz (∆ 29.5 Ω during session 1 and ∆ 5.1 Ω during session 2). Whereas the resistance at 8 kHz showed a linear time pattern, the evolution of the resistance at 160 kHz was significantly different (p < 0.0001). Measuring bioimpedance semi-continuously and with a multi-frequency current is a major step forward in the understanding of fluid dynamics in hemodialysis patients. This study paves the way towards remote fluid monitoring.
Subject(s)
Renal Dialysis , Wearable Electronic Devices , Humans , Feasibility Studies , Electric Impedance , Extracellular Fluid
ABSTRACT
BACKGROUND: The 5-year mortality rate for haemodialysis patients is over 50%. Acute and chronic disturbances in salt and fluid homeostasis contribute to poor survival and are established as individual mortality risk factors. However, their interaction in relation to mortality is unclear. METHODS: We used the European Clinical Database 5 to investigate, in a retrospective cohort analysis, the relationship between transient hypo- and hypernatremia, fluid status and mortality risk of 72 163 haemodialysis patients from 25 countries. Incident haemodialysis patients with at least one valid measurement of bioimpedance spectroscopy were followed until death or administrative censoring from 1 January 2010 to 4 December 2019. Fluid overload and fluid depletion were defined as >2.5 L above and more than 1.1 L below normal fluid status, respectively. A total of 2 272 041 recorded plasma sodium and fluid status measurements were available over a monthly time grid and were analysed in a Cox regression model for time to death. RESULTS: The mortality risk of hyponatremia (plasma sodium <135 mmol/L) was slightly increased when fluid status was normal [hazard ratio (HR) 1.26, 95% confidence interval (CI) 1.18-1.35], increased by half when patients were fluid depleted (HR 1.56, 95% CI 1.27-1.93) and nearly doubled during fluid overload (HR 1.97, 95% CI 1.82-2.12). CONCLUSIONS: Plasma sodium and fluid status act independently as risk factors on mortality. Patient surveillance of fluid status is especially important in the high-risk subpopulation of patients with hyponatremia. Prospective patient-level studies should examine the effects of chronic hypo- and hypernatremia, their risk determinants, and the associated outcome risk.
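A simplified sketch of such an analysis is given below using the lifelines package; it collapses the monthly time grid to baseline values per patient and uses hypothetical file and column names, whereas the actual study analysed repeated measurements over follow-up.

```python
# Sketch only: categorize plasma sodium and fluid status and relate them to
# time to death with a Cox model.
import pandas as pd
from lifelines import CoxPHFitter

obs = pd.read_csv("monthly_observations.csv")  # hypothetical per-patient-month extract

obs["hyponatremia"] = (obs["plasma_na_mmol_l"] < 135).astype(int)
obs["fluid_overload"] = (obs["fluid_status_l"] > 2.5).astype(int)    # >2.5 L above normal
obs["fluid_depletion"] = (obs["fluid_status_l"] < -1.1).astype(int)  # >1.1 L below normal

# Collapse to the first (baseline) observation per patient for this illustration
base = obs.sort_values("month").groupby("patient_id").first().reset_index()

cph = CoxPHFitter()
cph.fit(base[["time_to_death_months", "died", "hyponatremia",
              "fluid_overload", "fluid_depletion", "age"]],
        duration_col="time_to_death_months", event_col="died")
cph.print_summary()  # hazard ratios for hyponatremia and the fluid-status categories
```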
Subject(s)
Heart Failure , Hypernatremia , Hyponatremia , Water-Electrolyte Imbalance , Humans , Renal Dialysis/adverse effects , Prospective Studies , Retrospective Studies , Sodium , Water-Electrolyte Imbalance/complications , Heart Failure/complications
ABSTRACT
BACKGROUND: In maintenance hemodialysis patients, intradialytic hypotension (IDH) is a frequent complication that has been associated with poor clinical outcomes. Prediction of IDH may facilitate timely interventions and eventually reduce IDH rates. METHODS: We developed a machine learning model to predict IDH in in-center hemodialysis patients 15-75 min in advance. IDH was defined as systolic blood pressure (SBP) <90 mmHg. Demographic, clinical, treatment-related and laboratory data were retrieved from electronic health records and merged with intradialytic machine data that were sent in real time to the cloud. For model development, dialysis sessions were randomly split into training (80%) and testing (20%) sets. The area under the receiver operating characteristic curve (AUROC) was used as a measure of the model's predictive performance. RESULTS: We utilized data from 693 patients who contributed 42 656 hemodialysis sessions and 355 693 intradialytic SBP measurements. IDH occurred in 16.2% of hemodialysis treatments. Our model predicted IDH 15-75 min in advance with an AUROC of 0.89. Top IDH predictors were the most recent intradialytic SBP and IDH rate, as well as the mean nadir SBP of the previous 10 dialysis sessions. CONCLUSIONS: Real-time prediction of IDH during an ongoing hemodialysis session is feasible and has a clinically actionable predictive performance. Whether, and to what degree, this predictive information facilitates the timely deployment of preventive interventions and translates into lower IDH rates and improved patient outcomes warrants prospective studies.
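In outline, the session-level split and AUROC evaluation could be sketched as follows; the gradient-boosting classifier, file and column names are stand-ins (the abstract does not state the model type), and the real feature set was far richer.

```python
# Sketch only: session-level train/test split and AUROC for predicting
# intradialytic hypotension (SBP < 90 mmHg) from tabular features.
import pandas as pd
from sklearn.model_selection import GroupShuffleSplit
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("idh_windows.csv")  # hypothetical table, one row per prediction window
features = ["last_sbp", "recent_idh_rate", "mean_nadir_sbp_prev10", "uf_rate", "age"]

# Split by dialysis session so windows from one session never appear in both sets
splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
train_idx, test_idx = next(splitter.split(df, groups=df["session_id"]))
train, test = df.iloc[train_idx], df.iloc[test_idx]

clf = GradientBoostingClassifier().fit(train[features], train["idh_within_15_75_min"])
auroc = roc_auc_score(test["idh_within_15_75_min"],
                      clf.predict_proba(test[features])[:, 1])
print(f"AUROC: {auroc:.2f}")
```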
Subject(s)
Hypotension , Kidney Failure, Chronic , Humans , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/complications , Prospective Studies , Cloud Computing , Hypotension/diagnosis , Hypotension/etiology , Renal Dialysis/adverse effects , Blood Pressure
ABSTRACT
OBJECTIVE: Dietary protein and physical activity interventions are increasingly implemented during hemodialysis to support muscle maintenance in patients with end-stage renal disease (ESRD). Although muscle maintenance is important, adequate removal of uremic toxins throughout hemodialysis is the primary concern for patients. It remains to be established whether intradialytic protein ingestion and/or exercise modulate uremic toxin removal during hemodialysis. METHODS: We recruited 10 patients with ESRD (age: 65 ± 16 y, BMI: 24.2 ± 4.8 kg/m2) on chronic hemodialysis treatment to participate in this randomized cross-over trial. During hemodialysis, patients were assigned to ingest 40 g protein or a nonprotein placebo both at rest (protein [PRO] and placebo [PLA], respectively) and following 30 min of exercise (PRO + exercise [EX] and PLA + EX, respectively). Blood and spent dialysate samples were collected throughout hemodialysis to assess reduction ratios and removal of urea, creatinine, phosphate, cystatin C, and indoxyl sulfate. RESULTS: The reduction ratios of urea and indoxyl sulfate were higher during PLA (76 ± 6% and 46 ± 9%, respectively) and PLA + EX interventions (77 ± 5% and 45 ± 10%, respectively) when compared to PRO (72 ± 4% and 40 ± 8%, respectively) and PRO + EX interventions (73 ± 4% and 43 ± 7%, respectively; protein effect: P = .001 and P = .023, respectively; exercise effect: P = .25 and P = .52, respectively). Nonetheless, protein ingestion resulted in greater urea removal (P = .046) during hemodialysis. Reduction ratios and removal of creatinine, phosphate, and cystatin C during hemodialysis did not differ following intradialytic protein ingestion or exercise (protein effect: P > .05; exercise effect: P > .05). Urea, creatinine, and phosphate removal were greater throughout the period with intradialytic exercise during PLA + EX and PRO + EX interventions when compared to the same period during PLA and PRO interventions (exercise effect: P = .034, P = .039, and P = .022, respectively). CONCLUSION: The removal of uremic toxins is not compromised by protein feeding and/or exercise implementation during hemodialysis in patients with ESRD.
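For reference, the reduction ratio reported above conventionally relates pre- and post-dialysis plasma concentrations, while dialysate-side removal is obtained from the spent-dialysate collection; possible corrections (e.g. for hemoconcentration or post-dialysis rebound) are not shown here.

```latex
% Conventional definitions used for solute reduction ratio and dialysate-side removal:
\mathrm{RR} \;=\; \frac{C_{\mathrm{pre}} - C_{\mathrm{post}}}{C_{\mathrm{pre}}} \times 100\%,
\qquad
\mathrm{removal} \;=\; C_{\mathrm{spent\ dialysate}} \times V_{\mathrm{spent\ dialysate}}
```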
Subject(s)
Cystatin C , Kidney Failure, Chronic , Humans , Middle Aged , Aged , Aged, 80 and over , Uremic Toxins , Creatinine , Indican , Renal Dialysis/methods , Kidney Failure, Chronic/therapy , Exercise , Urea , Phosphates , Eating , Polyesters
ABSTRACT
Telemedicine and digitalised healthcare have recently seen exponential growth, led, in part, by increasing efforts to improve patient flexibility and autonomy, as well as drivers from financial austerity and concerns over climate change. Nephrology is no exception, and daily innovations are underway to provide digitalised alternatives to current models of healthcare provision. Wearable technology already exists commercially, and advances in nanotechnology and miniaturisation mean that clinical interest is also growing. Here, we outline the current existing wearable technology pertaining to the diagnosis and monitoring of patients with a spectrum of kidney disease, give an overview of wearable dialysis technology, and explore wearables that do not yet exist but would be of great interest. Finally, we discuss challenges and potential pitfalls with utilising wearable technology and the factors associated with successful implementation.
Subject(s)
Nephrology , Telemedicine , Wearable Electronic Devices , Humans , Delivery of Health Care , Biological Transport
ABSTRACT
Bioimpedance spectroscopy (BIS) has proven to be a promising non-invasive technique for fluid monitoring in haemodialysis (HD) patients. While current BIS-based monitoring of pre- and post-dialysis fluid status utilizes benchtop devices designed for intramural use, advancements in micro-electronics have enabled the development of wearable bioimpedance systems. Wearable systems can meanwhile offer a frequency range for current injection similar to that of commercially available benchtop devices. This opens opportunities for unobtrusive longitudinal fluid status monitoring, including transcellular fluid shifts, with the ultimate goal of improving fluid management, thereby lowering mortality and improving quality of life for HD patients. Ultra-miniaturized wearable devices can also offer simultaneous acquisition of multiple other parameters, including haemodynamic parameters. The combination of wearable BIS and additional longitudinal multiparametric data may aid in the prevention of both haemodynamic instability and fluid overload. The opportunity to also acquire data during interdialytic periods using wearable devices will likely yield novel pathophysiological insights, and the development of smart (predictive) algorithms could contribute to personalizing dialysis schemes and, ultimately, to autonomous (nocturnal) home dialysis. This review provides an overview of current research regarding wearable bioimpedance, with special attention to applications in end-stage kidney disease patients. Furthermore, we present an outlook on the future use of wearable bioimpedance within dialysis practice.
Subject(s)
Kidney Failure, Chronic , Water-Electrolyte Imbalance , Wearable Electronic Devices , Humans , Renal Dialysis/methods , Quality of Life , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/etiology , Water-Electrolyte Imbalance/etiology , Electric Impedance
ABSTRACT
Patients treated with hemodialysis (HD) repeatedly undergo intradialytic low arterial oxygen saturation and low central venous oxygen saturation, reflecting an imbalance between upper body systemic oxygen supply and demand, which are associated with increased mortality. Abnormalities along the entire oxygen cascade, with impaired diffusive and convective oxygen transport, contribute to the reduced tissue oxygen supply. HD treatment impairs pulmonary gas exchange and reduces ventilatory drive, whereas ultrafiltration can reduce tissue perfusion due to a decline in cardiac output. In addition to these factors, capillary rarefaction and reduced mitochondrial efficacy can further affect the balance between cellular oxygen supply and demand. Whereas it has been convincingly demonstrated that a reduced perfusion of heart and brain during HD contributes to organ damage, the significance of systemic hypoxia remains uncertain, although it may contribute to oxidative stress, systemic inflammation, and accelerated senescence. These abnormalities along the oxygen cascade of patients treated with HD appear to be diametrically opposite to the situation in Tibetan highlanders and Sherpa, whose physiology adapted to the inescapable hypobaric hypoxia of their living environment over many generations. Their adaptation includes pulmonary, vascular, and metabolic alterations with enhanced capillary density, nitric oxide production, and mitochondrial efficacy without oxidative stress. Improving the tissue oxygen supply in patients treated with HD depends primarily on preventing hemodynamic instability by increasing dialysis time/frequency or prescribing cool dialysis. Whether dietary or pharmacological interventions, such as the administration of L-arginine, fermented food, nitrate, nuclear factor erythroid 2-related factor 2 agonists, or prolyl hydroxylase 2 inhibitors, improve clinical outcome in patients treated with HD warrants future research.
Subject(s)
Acclimatization , Altitude , Hypoxia/blood , Kidney Failure, Chronic/therapy , Kidney/physiopathology , Oxygen Consumption , Oxygen/blood , Renal Dialysis , Animals , Biomarkers/blood , Hemodynamics , Humans , Hypoxia/mortality , Hypoxia/physiopathology , Hypoxia/prevention & control , Kidney/metabolism , Kidney Failure, Chronic/blood , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/physiopathology , Renal Dialysis/adverse effects , Renal Dialysis/mortality , Risk Factors , Treatment Outcome
ABSTRACT
PURPOSE OF REVIEW: Poor nutritional status is prevalent among end-stage renal disease patients undergoing hemodialysis. Chronic hemodialysis patients show an accelerated decline in skeletal muscle mass and strength, which is associated with higher mortality rates and a reduced quality of life. The current review aims to summarize recent advances regarding the underlying causes of muscle loss and interventions that support muscle mass maintenance in chronic hemodialysis patients. RECENT FINDINGS: Muscle maintenance in chronic hemodialysis patients is compromised by low dietary protein intake levels, anabolic resistance of skeletal muscle tissue, sedentary behavior, and amino acid removal during hemodialysis. Studies assessing the effect of increased protein intake on nutritional status generally show beneficial results, especially in hypoalbuminemic chronic hemodialysis patients. The muscle protein synthetic response following protein ingestion in chronic hemodialysis patients may be enhanced through incorporation of structured physical activity and/or concurrent ketoacid ingestion. SUMMARY: A coordinated program that combines nutritional and physical activity interventions is likely required to attenuate the decline in muscle mass and strength of chronic hemodialysis patients. Nephrologists, dieticians, and exercise specialists should collaborate closely to establish guidelines regarding the appropriate quantity and timing of protein ingestion. In addition, they should provide tailored nutritional and physical activity interventions for chronic hemodialysis patients (see video, Supplemental Digital Content 1, Video abstract, http://links.lww.com/COCN/A14).
Subject(s)
Dietary Proteins , Kidney Failure, Chronic , Humans , Kidney Failure, Chronic/therapy , Nutritional Status , Quality of Life , Renal Dialysis
ABSTRACT
Artificial intelligence (AI) is considered the next natural progression of traditional statistical techniques. Advances in analytical methods and infrastructure enable AI to be applied in health care. While AI applications are relatively common in fields like ophthalmology and cardiology, their use is scarcely reported in nephrology. We present the current status of AI in kidney disease research and discuss future pathways for AI. The clinical applications of AI in progression to end-stage kidney disease and dialysis can be broadly subdivided into three main topics: (a) predicting events in the future such as mortality and hospitalization; (b) providing treatment and decision aids such as automating drug prescription; and (c) identifying patterns such as phenotypical clusters and arteriovenous fistula aneurysms. At present, the use of prediction models in treating patients with kidney disease is still in its infancy, and further evidence is needed to identify their relative value. Policies and regulations need to be addressed before implementing AI solutions at the point of care in clinics. AI is not anticipated to replace the nephrologists' medical decision-making, but instead to assist them in providing optimal personalized care for their patients.
Subject(s)
Kidney Diseases , Nephrology , Artificial Intelligence , Clinical Decision-Making , Humans , Renal Dialysis/adverse effects
ABSTRACT
The COVID-19 pandemic has greatly affected nephrology. Firstly, dialysis patients appear to be at increased risk for infection due to viral transmission, in addition to an enhanced risk for mortality compared with the general population, even in the face of an often apparently mild clinical presentation. Derangements in the innate and adaptive immune systems may be responsible for a reduced antiviral response, whereas chronic activation of the innate immune system and endothelial dysfunction provide a background for a more severe course. The presence of severe comorbidity, older age, and a reduction of organ reserve may lead to a rapid deterioration of the clinical situation of the patients in case of severe infection. Secondly, patients with COVID-19 are at increased risk of acute kidney injury (AKI), which is related to the severity of the clinical disease. The presence of AKI, and especially the need for renal replacement therapy (RRT), is associated with an increased risk of mortality. AKI in COVID-19 has a multifactorial origin, in which direct viral invasion of kidney cells, activation of the renin-angiotensin-aldosterone system, a hyperinflammatory response, hypercoagulability, and nonspecific factors such as hypotension and hypoxemia may be involved. Apart from logistic challenges and the need for strict hygiene within units, treatment of patients with ESRD and COVID-19 is not different from that of the general population. Extracorporeal treatment of patients with AKI with RRT can be complicated by frequent filter clotting due to the hypercoagulable state, for which regional citrate anticoagulation provides a reasonable solution. Also, acute peritoneal dialysis may be a reasonable option in these patients. Whether adjuncts to extracorporeal therapies, such as hemoadsorption, provide additional benefits in the case of severely ill COVID-19 patients needs to be addressed in controlled studies.
Subject(s)
Acute Kidney Injury/epidemiology , COVID-19/epidemiology , Kidney Failure, Chronic/epidemiology , Pandemics , SARS-CoV-2 , Acute Kidney Injury/etiology , Acute Kidney Injury/therapy , COVID-19/complications , COVID-19/physiopathology , COVID-19/transmission , Comorbidity , Cytokine Release Syndrome/etiology , Cytokine Release Syndrome/prevention & control , Disease Susceptibility , Hemadsorption , Humans , Hygiene , Immunocompromised Host , Immunologic Factors/therapeutic use , Infection Control , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/immunology , Kidney Failure, Chronic/therapy , Renal Replacement Therapy , Risk , Thrombophilia/etiology , Treatment Outcome
ABSTRACT
BACKGROUND: Inadequate refilling from extravascular compartments during hemodialysis can lead to intradialytic symptoms, such as hypotension, nausea, vomiting, and cramping/myalgia. Relative blood volume (RBV) plays an important role in adapting the ultrafiltration rate, which in turn has a positive effect on intradialytic symptoms. It has been clinically challenging to identify changes in RBV in real time so as to proactively intervene and reduce potential negative consequences of volume depletion. Leveraging advanced technologies to process large volumes of dialysis and machine data in real time and developing prediction models using machine learning (ML) is critical in identifying these signals. METHOD: We conducted a proof-of-concept analysis to retrospectively assess near real-time dialysis treatment data from in-center patients in six clinics using optical sensing devices (OSDs) from December 2018 to August 2019. The goal of this analysis was to use real-time OSD data to predict whether a patient's RBV decreases at a rate of at least -6.5% per hour within the next 15 min during a dialysis treatment, based on 10-second windows of data in the previous 15 min. A dashboard application was constructed to demonstrate how reporting structures may be developed to alert clinicians in real time of at-risk cases. Data were derived from three sources: (1) OSDs, (2) hemodialysis machines, and (3) patient electronic health records. RESULTS: Treatment data from 616 in-center dialysis patients in the six clinics were curated into a big data store and fed into an ML model developed and deployed within the cloud. The threshold for classifying observations as positive or negative was set at 0.08. Precision for the model at this threshold was 0.33 and recall was 0.94. The area under the receiver operating curve (AUROC) for the ML model was 0.89 using test data. CONCLUSIONS: The findings from our proof-of-concept analysis demonstrate the design of a cloud-based framework that can be used for making real-time predictions of events during dialysis treatments. Making real-time predictions has the potential to assist clinicians at the point of care during hemodialysis.
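The reported operating point can be illustrated with a short sketch that applies the 0.08 probability threshold to model outputs and computes precision, recall and AUROC; the saved arrays below are hypothetical placeholders, not the study's data.

```python
# Sketch only: evaluate a classification threshold on window-level predictions.
import numpy as np
from sklearn.metrics import precision_score, recall_score, roc_auc_score

# y_true: 1 if RBV fell at >= 6.5 %/h within the next 15 min; y_prob: model output
y_true = np.load("y_true.npy")   # hypothetical saved test labels
y_prob = np.load("y_prob.npy")   # hypothetical saved model probabilities

y_pred = (y_prob >= 0.08).astype(int)  # a low threshold favours recall over precision
print("precision:", round(precision_score(y_true, y_pred), 2))
print("recall:   ", round(recall_score(y_true, y_pred), 2))
print("AUROC:    ", round(roc_auc_score(y_true, y_prob), 2))
```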
Subject(s)
Blood Volume/physiology , Body Fluid Compartments , Hypotension , Kidney Failure, Chronic , Machine Learning , Muscle Cramp , Renal Dialysis , Vomiting , Cloud Computing , Early Diagnosis , Female , Humans , Hypotension/diagnosis , Hypotension/etiology , Hypotension/prevention & control , Kidney Failure, Chronic/physiopathology , Kidney Failure, Chronic/therapy , Male , Middle Aged , Muscle Cramp/diagnosis , Muscle Cramp/etiology , Muscle Cramp/prevention & control , Prognosis , Proof of Concept Study , Renal Dialysis/adverse effects , Renal Dialysis/methods , Vomiting/diagnosis , Vomiting/etiology , Vomiting/prevention & control
ABSTRACT
BACKGROUND: Poor nutritional status is frequently observed in end-stage renal disease patients and associated with adverse clinical outcomes and increased mortality. Loss of amino acids (AAs) during hemodialysis (HD) may contribute to protein malnutrition in these patients. OBJECTIVE: We aimed to assess the extent of AA loss during HD in end-stage renal disease patients consuming their habitual diet. METHODS: Ten anuric chronic HD patients (mean ± SD age: 67.9 ± 19.3 y, BMI: 23.2 ± 3.5 kg/m2), undergoing HD 3 times per week, were selected to participate in this study. Spent dialysate was collected continuously and plasma samples were obtained directly before and after a single HD session in each participant. AA profiles in spent dialysate and in pre-HD and post-HD plasma were measured through ultra-performance liquid chromatography to determine AA concentrations and, as such, net loss of AAs. In addition, dietary intake before and throughout HD was assessed using a 24-h food recall questionnaire during HD. Paired-sample t tests were conducted to compare pre-HD and post-HD plasma AA concentrations. RESULTS: During an HD session, 11.95 ± 0.69 g AAs were lost via the dialysate, of which 8.26 ± 0.46 g were nonessential AAs, 3.69 ± 0.31 g were essential AAs, and 1.64 ± 0.17 g were branched-chain AAs. As a consequence, plasma total and essential AA concentrations declined significantly from 2.88 ± 0.15 and 0.80 ± 0.05 mmol/L to 2.27 ± 0.11 and 0.66 ± 0.05 mmol/L, respectively (P < 0.05). AA profiles of pre-HD plasma and spent dialysate were similar. Moreover, AA concentrations in pre-HD plasma and spent dialysate were strongly correlated (Spearman's ρ = 0.92, P < 0.001). CONCLUSIONS: During a single HD session, ~12 g of AAs are lost into the dialysate, causing a significant decline in plasma AA concentrations. AA loss during HD can contribute substantially to protein malnutrition in end-stage renal disease patients. This study was registered at the Netherlands Trial Registry (NTR7101).
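A minimal sketch of the dialysate-side mass balance and the paired pre/post plasma comparison is given below; the file and column names are assumptions, and the actual AA profiling used ultra-performance liquid chromatography.

```python
# Sketch only: amino acid loss into spent dialysate and a paired t-test on
# pre- vs post-dialysis plasma concentrations.
import pandas as pd
from scipy import stats

plasma = pd.read_csv("plasma_aa.csv")        # hypothetical: total AA (mmol/L) pre and post HD
dialysate = pd.read_csv("dialysate_aa.csv")  # hypothetical: AA concentration and spent volume

# Mass removed into dialysate = concentration x total spent dialysate volume
dialysate["aa_loss_g"] = dialysate["aa_conc_g_per_l"] * dialysate["spent_volume_l"]
print("mean AA loss per session (g):", round(float(dialysate["aa_loss_g"].mean()), 2))

# Paired-sample t-test comparing pre- and post-HD plasma total AA concentrations
t, p = stats.ttest_rel(plasma["total_aa_pre"], plasma["total_aa_post"])
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")
```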
Subject(s)
Amino Acids/blood , Dialysis Solutions/analysis , Kidney Failure, Chronic/therapy , Protein-Energy Malnutrition/etiology , Renal Dialysis/adverse effects , Aged , Aged, 80 and over , Amino Acids/analysis , Diet , Dietary Proteins/administration & dosage , Female , Humans , Male , Middle Aged , Nutritional Status
ABSTRACT
BACKGROUND: It is a matter of debate whether sodium and potassium intake are associated with heart disease. Further, the mechanisms underlying associations of sodium and potassium intake with cardiac events, if any, are not fully understood. OBJECTIVES: We examined cross-sectional associations of 24-h urinary sodium excretion (UNaE) and potassium excretion (UKE), as estimates of their intakes, with high-sensitivity cardiac troponins T (hs-cTnT) and I (hs-cTnI), and N-terminal pro-B-type natriuretic peptide (NT-proBNP), which are markers of cardiomyocyte injury and cardiac dysfunction. METHODS: We included 2961 participants from the population-based Maastricht Study (mean ± SD age 59.8 ± 8.2 y, 51.9% men), who completed the baseline survey between November 2010 and September 2013. Associations were examined with restricted cubic spline linear regression analyses and ordinary linear regression analyses, adjusted for demographics, lifestyle, and cardiovascular disease (CVD) risk factors. RESULTS: Median [IQR] 24-h UNaE and UKE were 3.7 [2.8-4.7] g/24 h and 3.0 [2.4-3.6] g/24 h, respectively. After adjustment for potential confounders, 24-h UNaE was not associated with hs-cTnT, hs-cTnI, and NT-proBNP concentrations. In contrast, after adjustment for potential confounders, lower 24-h UKE was nonlinearly associated with higher hs-cTnT and NT-proBNP. For example, as compared with the third/median quintile of 24-h UKE (range: 2.8-3.2 g/24 h), participants in the first quintile (range: 0.5-2.3 g/24 h) had 1.05 (95% CI: 0.99, 1.11) times higher hs-cTnT and 1.14 (95% CI: 1.03, 1.26) times higher NT-proBNP. Associations were similar after further adjustment for estimated glomerular filtration rate, albuminuria, blood pressure, and serum potassium. CONCLUSIONS: Twenty-four-hour UNaE was not associated with the studied cardiac biomarkers. In contrast, lower 24-h UKE was nonlinearly associated with higher hs-cTnT and NT-proBNP. This finding supports recommendations to increase potassium intake in the general population. In addition, it suggests that cardiac dysfunction and/or cardiomyocyte injury may underlie previously reported associations of lower potassium intake with CVD mortality.
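One way such a nonlinear dose-response can be explored in Python is with a natural cubic spline basis, as sketched below; the variable names are placeholders, the log-transformed outcome is used only for illustration, and the study's restricted-cubic-spline specification and confounder set were more extensive.

```python
# Sketch only: nonlinear association of 24-h urinary potassium excretion with
# NT-proBNP using a natural cubic regression spline basis (patsy cr()).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("excretion_biomarkers.csv")  # hypothetical analysis extract
df["log_ntprobnp"] = np.log(df["ntprobnp"])   # cardiac biomarkers are typically right-skewed

# cr() provides a natural cubic regression spline, allowing a nonlinear dose-response
model = smf.ols("log_ntprobnp ~ cr(uke_g_per_24h, df=3) + age + sex + egfr",
                data=df).fit()
print(model.summary())
```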
Subject(s)
Heart/physiopathology , Potassium/urine , Sodium/urine , Aged , Biomarkers/urine , Cross-Sectional Studies , Female , Humans , Life Style , Male , Middle Aged , Natriuretic Peptide, Brain/blood , Netherlands , Peptide Fragments/blood , Prospective Studies , Troponin I/blood , Troponin T/blood
ABSTRACT
BACKGROUND: Protein-energy wasting, muscle mass (MM) loss and sarcopenia are highly prevalent and associated with poor outcome in haemodialysis (HD) patients. Monitoring of MM and/or muscle metabolism in HD patients is of paramount importance for timely detection of muscle loss and to intervene adequately. In this study we assessed the reliability and reproducibility of a simplified creatinine index (SCI) as a surrogate marker of MM and explored its predictive value for outcome. METHOD: We included all in-centre HD patients from 16 European countries with at least one SCI. The baseline period was defined as 30 days before and after the first multifrequency bioimpedance spectroscopy measurement; the subsequent 7 years constituted the follow-up. SCI was calculated by the Canaud equation. Multivariate Cox proportional hazards models were applied to assess the association of SCI with all-cause mortality. Using backward analysis, we explored the trends of SCI before death. Bland-Altman analysis was performed to analyse the agreement between estimated and measured MM. RESULTS: We included 23 495 HD patients; 3662 were incident. Females and older patients had lower baseline SCI. Higher SCI was associated with a lower risk of mortality [hazard ratio 0.81 (95% confidence interval 0.79-0.82)]. SCI decline accelerated ~5-7 months before death. Lean tissue index (LTI) estimated by SCI was correlated with measured LTI in both sexes (males: R2 = 0.94; females: R2 = 0.92; both P < 0.001). Bland-Altman analysis showed that measured LTI was 4.71 kg/m2 (±2 SD: -12.54 to 3.12) lower than estimated LTI. CONCLUSION: SCI is a simple, easily obtainable and clinically relevant surrogate marker of MM in HD patients.
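The Bland-Altman agreement analysis mentioned above follows a standard recipe (bias = mean difference, limits of agreement = bias ± 1.96 SD of the differences); the sketch below uses hypothetical arrays rather than the study data.

```python
# Sketch only: Bland-Altman plot comparing bioimpedance-measured lean tissue index
# with LTI estimated from the simplified creatinine index.
import numpy as np
import matplotlib.pyplot as plt

measured_lti = np.load("measured_lti.npy")    # hypothetical BIS-derived LTI, kg/m2
estimated_lti = np.load("estimated_lti.npy")  # hypothetical SCI-derived LTI, kg/m2

diff = measured_lti - estimated_lti
mean = (measured_lti + estimated_lti) / 2
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)  # half-width of the limits of agreement

plt.scatter(mean, diff, s=8)
for y in (bias, bias - loa, bias + loa):
    plt.axhline(y, linestyle="--")
plt.xlabel("Mean of measured and estimated LTI (kg/m$^2$)")
plt.ylabel("Measured - estimated LTI (kg/m$^2$)")
plt.title(f"Bland-Altman: bias = {bias:.2f} kg/m$^2$")
plt.show()
```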
Subject(s)
Creatinine/blood , Kidney Failure, Chronic/therapy , Renal Dialysis/adverse effects , Sarcopenia/diagnosis , Adolescent , Adult , Aged , Aged, 80 and over , Body Composition , Europe/epidemiology , Female , Humans , Kidney Failure, Chronic/epidemiology , Male , Middle Aged , Prognosis , Reproducibility of Results , Retrospective Studies , Sarcopenia/blood , Sarcopenia/etiology , Young Adult
ABSTRACT
Digitization of healthcare will be a major innovation driver in the coming decade. Enabled by technological advancements and electronics miniaturization, wearable health device (WHD) applications are also expected to grow exponentially. This, in turn, may make 4P medicine (predictive, precise, preventive and personalized) a more attainable goal within dialysis patient care. This article discusses different use cases where WHD could be of relevance for dialysis patient care, i.e. measurement of heart rate, arrhythmia detection, blood pressure, hyperkalaemia, fluid overload and physical activity. After adequate validation of the different WHD in this specific population, data obtained from WHD could form part of a body area network (BAN), which could serve different purposes, such as feedback on actionable parameters like physical inactivity, fluid overload, danger signalling or event prediction. For a BAN to become clinical reality, not only must technical issues, cybersecurity and data privacy be addressed, but adequate models based on artificial intelligence and mathematical analysis also need to be developed for signal optimization, data representation, data reliability labelling and interpretation. Moreover, the potential of WHD and BAN can only be fulfilled if they are part of a transformative healthcare system with a shared responsibility between patients, healthcare providers and the payors, using a step-up approach that may include digital assistants and dedicated 'digital clinics'. The coming decade will be critical in observing how these developments will impact and transform dialysis patient care and will undoubtedly require increased 'digital literacy' from all those involved in their care.
Subject(s)
Arrhythmias, Cardiac/diagnosis , Artificial Intelligence , Delivery of Health Care/standards , Renal Dialysis/mortality , Telemedicine/methods , Wearable Electronic Devices/statistics & numerical data , Heart Rate , Humans , Reproducibility of Results
ABSTRACT
BACKGROUND: Pre-dialysis systolic blood pressure (pre-HD SBP) and peridialytic SBP change have been associated with morbidity and mortality among hemodialysis (HD) patients in previous studies, but the nature of their interaction is not well understood. METHODS: We analyzed pre-HD SBP and peridialytic SBP change (calculated as post-HD SBP minus pre-HD SBP) between January 2001 and December 2012 in HD patients treated in US Fresenius Medical Care facilities. The baseline period was defined as Months 4-6 after HD initiation, and all-cause mortality was noted during follow-up. Only patients who survived baseline and had no missing covariates were included. Censoring events were renal transplantation, modality change or study end. We fitted a Cox proportional hazards model with bivariate spline functions for the primary predictors (pre-HD SBP and peridialytic SBP change), with adjustment for age, gender, race, diabetes, access type, relative interdialytic weight gain, body mass index, albumin, equilibrated normalized protein catabolic rate and ultrafiltration rate. RESULTS: A total of 172 199 patients were included. Mean age was 62.1 years, 61.6% were white and 55% were male. During a median follow-up of 25.0 months, 73 529 patients (42.7%) died. We found that a peridialytic SBP rise combined with high pre-HD SBP was associated with higher mortality. In contrast, when concurrent with low pre-HD SBP, a peridialytic SBP rise was associated with better survival. CONCLUSION: The association of pre-HD and peridialytic SBP change with mortality is complex. Our findings call for a joint, not isolated, interpretation of pre-HD SBP and peridialytic SBP change.
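As a simplification of the bivariate spline model described above (not the authors' specification), the joint interpretation of pre-HD SBP and peridialytic SBP change can be illustrated with coarse joint categories in a Cox model; the cut points, file and column names below are arbitrary illustrative choices.

```python
# Sketch only: joint categories of pre-HD SBP and peridialytic SBP change,
# so the two predictors are interpreted together rather than in isolation.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("baseline_bp.csv")  # hypothetical per-patient baseline summary
df["sbp_change"] = df["post_hd_sbp"] - df["pre_hd_sbp"]

df["pre_cat"] = pd.cut(df["pre_hd_sbp"], bins=[0, 120, 160, 300],
                       labels=["low", "mid", "high"])     # illustrative cut points
df["direction"] = (df["sbp_change"] > 0).map({True: "rise", False: "fall"})
df["joint"] = df["pre_cat"].astype(str) + "_" + df["direction"]

model_df = pd.get_dummies(df[["follow_up_months", "died", "age", "joint"]],
                          columns=["joint"], drop_first=True, dtype=float)
cph = CoxPHFitter().fit(model_df, duration_col="follow_up_months", event_col="died")
cph.print_summary()  # e.g. high pre-HD SBP with a rise vs low pre-HD SBP with a rise
```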