Results 1 - 20 of 379
1.
Stat Med; 43(5): 912-934, 2024 Feb 28.
Article in English | MEDLINE | ID: mdl-38122818

ABSTRACT

The population-attributable fraction (PAF) is commonly interpreted as the proportion of events that can be ascribed to a certain exposure in a certain population. Its estimation is sensitive to common forms of time-dependent bias in the face of a time-dependent exposure. Predominant estimation approaches based on multistate modeling fail to fully eliminate such bias and, as a result, do not permit a causal interpretation, even in the absence of confounding. While recently proposed multistate modeling approaches can successfully eliminate residual time-dependent bias, and moreover succeed in adjusting for time-dependent confounding by means of inverse probability of censoring weighting, inadequate application and misinterpretation prevail in the medical literature. In this paper, we therefore revisit recent work on previously proposed PAF estimands and estimators in settings with time-dependent exposures and competing events and extend this work in several ways. First, we critically revisit the interpretation and applied terminology of these estimands. Second, we further formalize the assumptions under which a causally interpretable PAF estimand can be identified and provide analogous weighting-based representations of the identifying functionals of other proposed estimands. This representation aims to enhance the applied statistician's understanding of the different sources of bias that may arise when the aim is to obtain a valid estimate of a causally interpretable PAF. To illustrate and compare these representations, we present a real-life application to observational data from the Ghent University Hospital ICUs to estimate the fraction of ICU deaths attributable to hospital-acquired infections.
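
For orientation, the quantity discussed above can be written, at a fixed time horizon, as the relative contrast between the observed event probability and the counterfactual event probability had the exposure been eliminated. The sketch below is a generic illustration of that contrast with made-up numbers; the counterfactual probability must itself come from an estimator that handles the time-dependent exposure, competing events, and confounding correctly (such as the weighted multistate estimators the paper discusses), and nothing here reproduces the paper's actual implementation.

```python
def population_attributable_fraction(p_event_observed: float,
                                     p_event_no_exposure: float) -> float:
    """Generic PAF = (P(D) - P(D0)) / P(D): the proportion of events that would
    not have occurred had the exposure been eliminated from the population.
    p_event_no_exposure is assumed to be a valid estimate of the counterfactual
    cumulative incidence at the chosen horizon; estimating it is the hard part."""
    return (p_event_observed - p_event_no_exposure) / p_event_observed

# Hypothetical numbers: 18% observed ICU mortality versus 15% estimated mortality
# had hospital-acquired infections been prevented -> PAF of about 0.17.
print(round(population_attributable_fraction(0.18, 0.15), 2))
```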


Subject(s)
Statistical Models, Humans, Probability, Time, Bias
2.
Pediatr Nephrol; 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38653885

ABSTRACT

BACKGROUND: This study evaluated parenting stress, anxiety, and depression symptoms and their associated factors in parents of children with chronic kidney disease (CKD). METHODS: This cross-sectional study compared parents of patients with CKD (0-18 years) with a matched control group of parents of healthy children. Both groups completed the Parenting Stress Index - Short Form, the Hospital Anxiety and Depression Scale, and a sociodemographic questionnaire. RESULTS: The study group consisted of 45 parents (median age 39; 32 mothers) of CKD patients (median age 8; 36% female). Nearly 75% of children had CKD stage 2, 3, or 4, and 44.5% had congenital anomalies of the kidney and urinary tract. Five children (11%) were on dialysis, and 4 (9%) had a functioning kidney graft. Parents of children with CKD reported more stress and anxiety symptoms than parents of healthy children. Since the CKD diagnosis, 47% of parents perceived a deterioration of their own health, and 40% had structurally reduced their working hours. Higher levels of stress, anxiety, and depression symptoms were associated with a more negative perception of the parents' own health, more medical comorbidities in the child, and more school absence. CONCLUSIONS: This study showed higher levels of parenting stress and anxiety symptoms in parents of children with CKD than in parents of healthy children. These symptoms were associated with a less positive perception of the parents' own health, especially if the child had more medical comorbidities or more absence from school. Psychosocial interventions to reduce the parental burden should be integrated into the standard care of pediatric nephrology departments.

3.
Eur J Pediatr; 182(4): 1483-1494, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36735061

ABSTRACT

Adolescents and young adults (AYAs) benefit from healthcare transition (HCT) programs. Despite the well-established literature reviewing HCT, there is considerable heterogeneity in which healthcare professionals are involved. This review explores systematic reviews of the practices and recommendations on which disciplines of professionals should be involved in HCT. An umbrella review was performed using the MEDLINE, EMBASE, and Web of Science databases. To be eligible, systematic reviews had to report on the composition and/or the rationale of members of a transition team. Seventeen reviews were included. A healthcare professional who coordinates HCT was identified as a key caregiver in all reviews. Other reported members of an HCT team were nurses (75% of the reviews), social workers (44%), and peers/mentors (35%). The reported key responsibilities of an HCT team were to (i) manage communication, (ii) ensure continuity of care, and (iii) maintain contact with community services. Conclusions: A team responsible for HCT should be active on the organizational, medical, and social levels. Key members of an HCT team vary little between diseases and include a coordinator, social worker, and nurse. A coordinating physician could facilitate transition in complex conditions. At all times, the condition and needs of the AYA should determine who is involved as a caregiver. What is Known: • The psychosocial needs of adolescents and young adults during healthcare transition are largely similar across chronic diseases. What is New: • Coordinators, nurses, and social workers were the most frequently involved professionals, independent of the condition. • A liaison team should be active on the organizational, medical, and social levels.


Subject(s)
Physicians, Transition to Adult Care, Humans, Adolescent, Young Adult, Health Personnel, Patient Transfer, Chronic Disease
4.
Article in English | MEDLINE | ID: mdl-36069344

ABSTRACT

Mass disasters are characterized by a disparity between health care demand and supply, which hampers complex therapies such as kidney transplantation. Considering the scarcity of publications on previous disasters, we reviewed transplantation practice during the recent COVID-19 pandemic and drew on this experience to guide transplantation strategies in future pandemic and non-pandemic catastrophes. We strongly suggest continuing transplantation programs during mass disasters, if medical and logistic operational circumstances are appropriate. Postponing transplantations from living donors and referral of urgent cases to safe regions or hospitals are justified. Specific preventive measures in anticipated disasters (such as vaccination programs during pandemics or evacuation in case of hurricanes or wars) may be useful to minimize risks. Immunosuppressive therapy should take risk stratification into account and avoid heavy immune suppression in patients with a low probability of therapeutic success. Discharging patients at the earliest convenience is justified during pandemics, whereas delaying discharge is reasonable in other disasters if infrastructural damage results in unhygienic living environments for the patients. In the outpatient setting, telemedicine is a useful approach to reduce the patient load on hospitals, to minimize the risk of nosocomial transmission in pandemics, and to reduce the need for transport in destructive disasters. When it comes down to saving as many lives as possible, some ethical principles may vary as a function of disaster circumstances, but elementary ethical rules are non-negotiable. Patient education is essential to minimize disaster-related complications and to allow for an efficient use of health care resources.

5.
Article in English | MEDLINE | ID: mdl-36066915

ABSTRACT

Mass disasters are characterized by a disparity between health care demand and supply, which hampers complex therapies such as kidney transplantation. Considering the scarcity of publications on previous disasters, we reviewed transplantation practice during the recent COVID-19 pandemic and drew on this experience to guide transplantation strategies in future pandemic and non-pandemic catastrophes. We strongly suggest continuing transplantation programs during mass disasters, if medical and logistic operational circumstances are appropriate. Postponing transplantations from living donors and referral of urgent cases to safe regions or hospitals are justified. Specific preventive measures in anticipated disasters (such as vaccination programs during pandemics or evacuation in case of hurricanes or wars) may be useful to minimize risks. Immunosuppressive therapy should take risk stratification into account and avoid heavy immune suppression in patients with a low probability of therapeutic success. Discharging patients at the earliest convenience is justified during pandemics, whereas delaying discharge is reasonable in other disasters if infrastructural damage results in unhygienic living environments for the patients. In the outpatient setting, telemedicine is a useful approach to reduce the patient load on hospitals, to minimize the risk of nosocomial transmission in pandemics, and to reduce the need for transport in destructive disasters. When it comes down to saving as many lives as possible, some ethical principles may vary as a function of disaster circumstances, but elementary ethical rules are non-negotiable. Patient education is essential to minimize disaster-related complications and to allow for an efficient use of health care resources.

6.
Stat Med; 41(26): 5258-5275, 2022 Nov 20.
Article in English | MEDLINE | ID: mdl-36055675

ABSTRACT

The optimal moment to start renal replacement therapy in a patient with acute kidney injury (AKI) remains a challenging problem in intensive care nephrology. Multiple randomized controlled trials have tried to answer this question, but they contrast only a limited number of treatment initiation strategies. In view of this, we use routinely collected observational data from the Ghent University Hospital intensive care units (ICUs) to investigate different prespecified timing strategies for renal replacement therapy initiation based on time-updated levels of serum potassium, pH, and fluid balance in critically ill patients with AKI, with the aim of minimizing 30-day ICU mortality. For this purpose, we apply statistical techniques for evaluating the impact of specific dynamic treatment regimes in the presence of ICU discharge as a competing event. We discuss two approaches: a nonparametric one, using an inverse probability weighted Aalen-Johansen estimator, and a semiparametric one, using dynamic-regime marginal structural models. Furthermore, we suggest an easy-to-implement cross-validation technique to assess the out-of-sample performance of the optimal dynamic treatment regime. Our work illustrates the potential of data-driven medical decision support based on routinely collected observational data.
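
As a rough illustration of the nonparametric approach mentioned above, the sketch below evaluates a single candidate regime by artificially censoring patients on the first day their treatment deviates from the regime and re-weighting the remaining person-time by inverse probabilities of staying "compliant". It is a simplified Horvitz-Thompson-style estimate of the 30-day ICU death risk, not the paper's Aalen-Johansen estimator, and all column names are hypothetical.

```python
import pandas as pd

def ipw_30day_death_risk(daily: pd.DataFrame, horizon: int = 30) -> float:
    """Inverse-probability-weighted 30-day death risk under one candidate regime.

    `daily` holds one row per patient-day with (hypothetical) columns:
      id           patient identifier
      day          day since meeting the eligibility criterion (1, 2, ...)
      death        1 if the patient died on this day, else 0
      compliant    1 if treatment so far is consistent with the regime, else 0
      p_compliant  model-based probability of still being consistent on this day
    Discharge alive acts as a competing event: discharged patients simply never
    contribute a death before the horizon.
    """
    d = daily.sort_values(["id", "day"]).copy()
    # keep only person-time before the first deviation from the regime
    # (artificial censoring at deviation)
    d["ok_so_far"] = d.groupby("id")["compliant"].cummin()
    d = d[d["ok_so_far"] == 1]
    # cumulative inverse-probability-of-censoring weight per patient-day
    d["w"] = 1.0 / d.groupby("id")["p_compliant"].cumprod()
    deaths = d[(d["death"] == 1) & (d["day"] <= horizon)]
    return deaths["w"].sum() / daily["id"].nunique()
```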


Subject(s)
Acute Kidney Injury, Renal Replacement Therapy, Humans, Renal Replacement Therapy/methods, Intensive Care Units, Critical Illness/therapy, Acute Kidney Injury/therapy, Potassium
7.
Crit Care; 26(1): 365, 2022 Nov 28.
Article in English | MEDLINE | ID: mdl-36443861

ABSTRACT

BACKGROUND AND OBJECTIVES: Defining the optimal moment to start renal replacement therapy (RRT) in acute kidney injury (AKI) remains challenging. Multiple randomized controlled trials (RCTs) have addressed this question using absolute criteria such as pH or serum potassium; however, the optimal cut-offs of these criteria still need to be identified. We conducted a causal analysis on routinely collected data (RCD) to compare the impact on 30-day ICU mortality of different pre-specified dynamic treatment regimes (DTRs) for RRT initiation based on time-updated levels of potassium, pH, and urinary output. DESIGN, SETTING, PARTICIPANTS, AND MEASUREMENTS: Patients in the ICU of Ghent University Hospital were included at the time they met KDIGO-AKI stage ≥ 2. We applied inverse-probability-of-censoring-weighted Aalen-Johansen estimators to evaluate 30-day survival under 81 DTRs prescribing RRT initiation at different thresholds of potassium, pH, or persisting oliguria. RESULTS: Out of 13,403 eligible patients (60.8 ± 16.8 years, SOFA 7.0 ± 4.1), 5622 (63.4 ± 15.3 years, SOFA 8.2 ± 4.2) met KDIGO-AKI stage ≥ 2. The DTR that delayed RRT until potassium ≥ 7 mmol/l, persisting oliguria for 24-36 h, and/or pH < 7.0 (non-oliguric) or < 7.2 (oliguric) despite maximal conservative treatment resulted in a reduced 30-day ICU mortality (from 12.7% [95% CI 11.9-13.6%] under current standard of care to 10.5% [95% CI 9.5-11.7%]; risk difference 2.2% [95% CI 1.3-3.8%]) with no increase in the number of patients starting RRT (from 471 [95% CI 430-511] to 475 [95% CI 342-572]). The fivefold cross-validation benchmark for the optimal DTR resulted in a 30-day ICU mortality of 10.7%. CONCLUSIONS: Our causal analysis of RCD comparing RRT initiation at different thresholds of refractory low pH, high potassium, and persisting oliguria identified a DTR that reduced 30-day ICU mortality without increasing the number of RRTs. Our results suggest that the current criteria to start RRT as implemented in most RCTs may be suboptimal. However, as our analysis is hypothesis-generating, this optimal DTR should ideally be validated in a multicentric RCT.
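
To make the "81 DTRs" concrete, the sketch below enumerates a hypothetical grid of initiation thresholds and selects the regime with the lowest estimated mortality; the threshold values and the placeholder mortality function are illustrative only and do not reproduce the study's grid or its weighted estimator.

```python
from itertools import product

# Hypothetical threshold grid: each regime starts RRT once potassium, refractory
# acidaemia, or the duration of persisting oliguria crosses its threshold despite
# maximal conservative treatment.
potassium_thresholds = [5.5, 6.0, 6.5, 7.0]   # mmol/l
ph_thresholds = [7.0, 7.1, 7.2]               # refractory low pH
oliguria_thresholds = [12, 24, 36]            # hours of persisting oliguria

regimes = [{"potassium": k, "ph": p, "oliguria_h": o}
           for k, p, o in product(potassium_thresholds, ph_thresholds, oliguria_thresholds)]

def estimated_mortality(regime: dict) -> float:
    """Placeholder: in the real analysis this would be the weighted (IPCW)
    estimate of 30-day ICU mortality under `regime`; here it returns a constant."""
    return 0.12

# The regime minimizing estimated mortality would then be selected, ideally with
# cross-validation to guard against overfitting the selection step.
best = min(regimes, key=estimated_mortality)
print(len(regimes), best)
```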


Subject(s)
Acute Kidney Injury, Routinely Collected Health Data, Humans, Acute Kidney Injury/therapy, Oliguria, Potassium, Renal Dialysis, Randomized Controlled Trials as Topic, Middle Aged, Aged
8.
Pediatr Nephrol; 37(5): 1087-1096, 2022 May.
Article in English | MEDLINE | ID: mdl-34599378

ABSTRACT

BACKGROUND: Children with chronic kidney disease (CKD) have a low quality of life (QoL). The PedsQL™ 4.0 Generic Core Scales are widely used to assess general QoL in children. The aim of this cross-sectional study was to translate the original version of the CKD-specific PedsQL™ 3.0 End Stage Renal Disease (ESRD) Module into Dutch and to evaluate its validity and reliability. METHODS: The forward-backward translation method based on the guidelines of the original developer was used to produce the Dutch version of the PedsQL™ 3.0 ESRD Module. Fifty-eight CKD patients (aged 8-18 years) and their parents (n = 31) filled in both the generic and disease-specific modules. The non-clinical control group consisted of the same number of healthy children (matched for gender and age) and their parents. RESULTS: Cronbach's alpha coefficients (α's) for the PedsQL™ 3.0 ESRD Module demonstrated excellent reliability for the Total Scale scores. For all 7 subscales, α's were greater than 0.60, except for Perceived Physical Appearance. Overall, intercorrelations with the PedsQL™ 4.0 Generic Core Scales were in the medium to large range, supporting construct validity. Parent proxy reports showed lower generic QoL for all domains in CKD patients compared with healthy children. Child self-reports only demonstrated lower QoL on the domain School Functioning in children with CKD compared with healthy children. CONCLUSIONS: This study shows good validity and reliability for the Dutch version of the PedsQL™ 3.0 ESRD Module. However, testing in a larger study group is recommended before drawing final conclusions about the psychometric qualities of this measure. A higher-resolution version of the Graphical abstract is available as Supplementary information.
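
The internal-consistency statistic reported above, Cronbach's alpha, follows directly from the item and total-score variances. The snippet below implements the textbook formula with made-up answers; it is not the study's own analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

# Toy example: 5 respondents answering a 4-item subscale.
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 2],
                   [3, 3, 4, 3]])
print(round(cronbach_alpha(scores), 2))
```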


Subject(s)
Chronic Kidney Failure, Chronic Renal Insufficiency, Belgium, Child, Cross-Sectional Studies, Female, Humans, Chronic Kidney Failure/diagnosis, Male, Parents, Quality of Life, Chronic Renal Insufficiency/diagnosis, Reproducibility of Results, Surveys and Questionnaires
9.
Pediatr Nephrol; 37(7): 1657-1665, 2022 Jul.
Article in English | MEDLINE | ID: mdl-34993603

ABSTRACT

BACKGROUND: Fruit and vegetable intake is commonly discouraged in children with chronic kidney disease (CKD) to avoid hyperkalemia. However, direct evidence in support of this widespread practice is lacking. Furthermore, the resultant restricted fiber exposure may deprive CKD patients from potential health benefits associated with the latter. Therefore, we investigated associations between dietary potassium intake, fiber intake, and serum potassium levels in pediatric CKD. METHODS: This study is a longitudinal analysis of a 2-year, prospective, multi-institutional study, following children with CKD at 3-month intervals. At each visit, dietary potassium and fiber intake were assessed, using 24-h recalls and 3-day food records. On the same occasion, serum potassium concentrations were determined. Associations between dietary potassium intake, dietary fiber intake, and serum potassium concentrations were determined using linear mixed models. RESULTS: Fifty-two CKD patients (7 transplant recipients, none on dialysis) aged 9 [4;14] years with an estimated glomerular filtration rate (eGFR) of 49 [25;68] mL/min/1.73 m2 were included. For every g/day decrease in dietary potassium intake, the estimated mean daily fiber intake was 5.1 g lower (95% confidence interval (CI), 4.3-5.9 g/day; p < 0.001). Neither dietary potassium intake (p = 0.40) nor dietary fiber intake (p = 0.43) was associated with circulating potassium in a model adjusted for time point, eGFR, treatment with a renin-angiotensin-aldosterone system blocker, serum bicarbonate concentration, and body surface area. CONCLUSIONS: Dietary potassium and fiber intake are closely related but were not associated with circulating potassium levels in pediatric CKD. A higher-resolution version of the graphical abstract is available as Supplementary information.
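
The "linear mixed models" used above account for repeated measurements within the same child via a random intercept per patient. The sketch below shows the general recipe on simulated data with hypothetical variable names and a reduced covariate set; it is not the study's model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy longitudinal data: 20 hypothetical patients with 8 visits each.
rng = np.random.default_rng(1)
n_pat, n_vis = 20, 8
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_pat), n_vis),
    "potassium_intake": rng.normal(2.0, 0.6, n_pat * n_vis),  # g/day
    "fiber_intake": rng.normal(12.0, 4.0, n_pat * n_vis),     # g/day
    "egfr": rng.normal(50.0, 15.0, n_pat * n_vis),            # mL/min/1.73 m2
})
df["serum_potassium"] = 4.3 + rng.normal(0, 0.3, len(df))     # mmol/l, no true effect

# Random intercept per patient; fixed effects for intakes and eGFR.
model = smf.mixedlm("serum_potassium ~ potassium_intake + fiber_intake + egfr",
                    data=df, groups=df["patient_id"])
print(model.fit().summary())
```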


Subject(s)
Dietary Potassium, Chronic Renal Insufficiency, Child, Dietary Fiber, Humans, Potassium, Prospective Studies, Renal Dialysis, Chronic Renal Insufficiency/therapy
10.
Blood Purif; 51(7): 577-583, 2022.
Article in English | MEDLINE | ID: mdl-34525474

ABSTRACT

INTRODUCTION: Hyperlactatemia is a common condition in the intensive care unit and is often associated with adverse outcomes. Control of the triggering condition is the most effective treatment of hyperlactatemia, but since this is often not readily possible, extracorporeal renal replacement therapy (RRT) is often tried as a last resort. The present study aims to evaluate the factors that may contribute to the decision whether or not to start RRT, and the potential impact of starting RRT on outcome, in patients with severe lactic acidosis (SLA) (lactate ≥ 5 mmol/L). MATERIALS AND METHODS: We conducted a retrospective single-center cohort analysis over a 3-year period including all patients with a lactate level ≥ 5 mmol/L. Patients were considered as treated with RRT because of SLA if RRT was started within 24 h after reaching a lactate level ≥ 5 mmol/L. RESULTS: Overall, 90-day mortality in patients with SLA was 34.5%. Of the 1,203 patients who matched the inclusion/exclusion criteria, 11% (n = 133) were dialyzed within 24 h. The propensity to receive RRT was related to the lactate level and to the SOFA renal and cardio scores. The most frequently used modality was continuous RRT. Patients who were started on RRT had 2.3-fold higher odds of mortality than those who were not, even after adjustment for the propensity to start RRT. CONCLUSIONS: Our analysis confirms the high mortality rate of patients with SLA. It adds that the odds of mortality are even higher in patients who were started on RRT than in those who were not. We suggest keeping an open mind to the factors that may influence the decision to start dialysis and bearing in mind that, unless it serves as a bridge to correction of the underlying condition, dialysis is unlikely to affect the outcome.
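
The phrase "adjustment for the propensity to start RRT" corresponds to the generic two-step recipe sketched below: model the probability of starting RRT from baseline severity, then include that score in the mortality model. Variable names, covariates, and the simulated data are placeholders, not the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for patients with severe lactic acidosis (lactate >= 5 mmol/L).
rng = np.random.default_rng(2)
n = 1200
df = pd.DataFrame({
    "lactate": rng.normal(7.5, 2.0, n),
    "sofa_renal": rng.integers(0, 5, n),
    "sofa_cardio": rng.integers(0, 5, n),
})
logit_rrt = -5 + 0.3 * df["lactate"] + 0.4 * df["sofa_renal"]
df["rrt_24h"] = rng.binomial(1, 1 / (1 + np.exp(-logit_rrt)))
df["death_90d"] = rng.binomial(1, 0.35, n)

# Step 1: propensity to start RRT within 24 h.
ps_model = smf.logit("rrt_24h ~ lactate + sofa_renal + sofa_cardio", data=df).fit(disp=False)
df["ps"] = ps_model.predict(df)

# Step 2: mortality odds for RRT versus no RRT, adjusted for the propensity score.
outcome = smf.logit("death_90d ~ rrt_24h + ps", data=df).fit(disp=False)
print(np.exp(outcome.params["rrt_24h"]))  # adjusted odds ratio for starting RRT
```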


Subject(s)
Lactic Acidosis, Acute Kidney Injury, Hyperlactatemia, Lactic Acidosis/etiology, Lactic Acidosis/therapy, Acute Kidney Injury/etiology, Acute Kidney Injury/therapy, Humans, Intensive Care Units, Lactic Acid, Renal Dialysis, Renal Replacement Therapy, Retrospective Studies
11.
Bioethics; 36(2): 113-120, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34374441

ABSTRACT

The use of artificial intelligence (AI) in healthcare comes with opportunities but also numerous challenges. A specific challenge that remains underexplored is the lack of clear and distinct definitions of the concepts used in and/or produced by these algorithms: how their real-world meaning is translated into machine language and, vice versa, how their output is understood by the end user. This "semantic" black box adds to the "mathematical" black box present in many AI systems, in which the underlying "reasoning" process is often opaque. In this way, whereas it is often claimed that the use of AI in medical applications will deliver "objective" information, the true relevance or meaning to the end user is frequently obscured. This is highly problematic, as AI devices are used not only for diagnostic and decision support by healthcare professionals but can also be used to deliver information to patients, for example to create visual aids for use in shared decision-making. This paper examines the range and extent of this problem and its implications on the basis of cases from the field of intensive care nephrology. We explore how the problematic terminology used in human communication about the detection, diagnosis, treatment, and prognosis of concepts of intensive care nephrology becomes a much more complicated affair when deployed in the form of algorithmic automation, with implications extending throughout clinical care and affecting norms and practices long considered fundamental to good clinical care.


Subject(s)
Artificial Intelligence, Semantics, Clinical Decision-Making, Delivery of Health Care, Health Facilities, Humans
12.
BMC Med Ethics; 23(1): 50, 2022 May 6.
Article in English | MEDLINE | ID: mdl-35524301

ABSTRACT

Research regarding the drivers of acceptance of clinical decision support systems (CDSS) by physicians is still rather limited. The literature that does exist, however, tends to focus on problems regarding the user-friendliness of CDSS. We have performed a thematic analysis of 24 interviews with physicians concerning specific clinical case vignettes, in order to explore their underlying opinions and attitudes regarding the introduction of CDSS in clinical practice, to allow a more in-depth analysis of factors underlying (non-)acceptance of CDSS. We identified three general themes from the results. First, 'the perceived role of the AI', including items referring to the tasks that may properly be assigned to the CDSS according to the respondents. Second, 'the perceived role of the physician', referring to the aspects of clinical practice that were seen as being fundamentally 'human' or non-automatable. Third, 'concerns regarding AI', including items referring to more general issues that were raised by the respondents regarding the introduction of CDSS in general and/or in clinical medicine in particular. Apart from the overall concerns expressed by the respondents regarding user-friendliness, we will explain how our results indicate that our respondents were primarily occupied by distinguishing between parts of their job that should be automated and aspects that should be kept in human hands. We refer to this distinction as 'the division of clinical labor.' This division is not based on knowledge regarding AI or medicine, but rather on which parts of a physician's job were seen by the respondents as being central to who they are as physicians and as human beings. Often the respondents' view that certain core parts of their job ought to be shielded from automation was closely linked to claims concerning the uniqueness of medicine as a domain. Finally, although almost all respondents claimed that they highly value their final responsibility, a closer investigation of this concept suggests that their view of 'final responsibility' was not that demanding after all.


Subject(s)
Clinical Decision Support Systems, Physicians, Artificial Intelligence, Attitude, Humans, Qualitative Research, Rome
13.
BMC Med Inform Decis Mak; 22(1): 185, 2022 Jul 16.
Article in English | MEDLINE | ID: mdl-35842722

ABSTRACT

BACKGROUND: There is increasing interest in incorporating clinical decision support (CDS) into electronic health records (EHR). Successful implementation of CDS systems depends on their acceptance by healthcare workers. We used a mix of quantitative and qualitative methods, starting from Q-sort methodology, to explore the expectations and perceptions of practicing physicians on the use of CDS incorporated in the EHR. METHODS: The study was performed in a large tertiary care academic hospital. We used a mixed approach combining a Q-sort-based classification of pre-defined reactions to clinical case vignettes with a thinking-aloud approach, taking into account COREQ recommendations. The open-source software Ken-Q Analysis version 1.0.6 was used for the quantitative analysis, using principal components and a Varimax rotation. For the qualitative analysis, a thematic analysis structured around the four main themes was performed on the audiotapes and field notes. RESULTS: Thirty physicians were interviewed (7 in training, 8 junior staff, and 15 senior staff; 16 females). Nearly all respondents were strongly averse to interruptive messages, especially when these were also obstructive. Obstructive interruption was considered acceptable only when it increases safety, is adjustable to the user's expertise level, and/or allows deviations when the end user explains why a deviation is desirable in the case at issue. Transparency was deemed an essential feature, which seems to boil down to providing sufficient clarification of the factors underlying the recommendations of the CDS, so that these can be compared against the physicians' existing knowledge, beliefs, and convictions. CONCLUSION: Avoiding disruption of workflows and ensuring transparency of the underlying decision processes are important points to consider when developing CDS systems incorporated in the EHR.
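
In Q-methodology, the respondents' rankings (rather than the items) are factor-analysed, so that physicians loading on the same rotated factor share a similar viewpoint. The sketch below shows that step with a varimax rotation on simulated Q-sorts; dimensions and data are made up, and the study itself used the Ken-Q Analysis software rather than this code.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n_statements, n_respondents = 40, 30
qsorts = rng.normal(size=(n_statements, n_respondents))  # one column per physician

# Persons-as-variables factor analysis with varimax rotation.
fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(qsorts)
loadings = fa.components_.T   # (n_respondents x 3): each physician's loading per factor
print(np.round(loadings[:5], 2))
```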


Subject(s)
Clinical Decision Support Systems, Physicians, Electronic Health Records, Female, Health Personnel, Humans, Motivation, Software
14.
Crit Rev Clin Lab Sci; 58(2): 131-152, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33045173

ABSTRACT

Machine learning (ML) is gaining increased interest in clinical laboratory medicine, mainly triggered by the decreased cost of generating and storing data through laboratory automation and computational power, and by the widespread accessibility of open-source tools. Nevertheless, only a handful of ML-based products are currently commercially available for routine clinical laboratory practice. In this review, we start with an introduction to ML by providing an overview of the ML landscape, its general workflow, and the most commonly used algorithms for clinical laboratory applications. Furthermore, we aim to illustrate recent evolutions (2018 to mid-2020) of the techniques used in the clinical laboratory setting and discuss the associated challenges and opportunities. In the field of clinical chemistry, the reviewed applications of ML algorithms include quality review of lab results, automated urine sediment analysis, disease or outcome prediction from routine laboratory parameters, and interpretation of complex biochemical data. In the hematology subdiscipline, we discuss the concepts of automated blood film reporting and malaria diagnosis. Finally, we cover a broad range of clinical microbiology applications, such as the reduction of diagnostic workload by laboratory automation, the detection and identification of clinically relevant microorganisms, and the detection of antimicrobial resistance.


Subject(s)
Clinical Laboratory Services, Laboratories, Algorithms, Humans, Machine Learning
15.
Nephrol Dial Transplant; 36(6): 998-1005, 2021 May 27.
Article in English | MEDLINE | ID: mdl-33508125

ABSTRACT

BACKGROUND: Several protein-bound uraemic toxins (PBUTs) have been associated with cardiovascular (CV) and all-cause mortality in chronic kidney disease (CKD), but the degree to which this applies to each individual PBUT and the underlying pathophysiological mechanism have only partially been unraveled. METHODS: We compared the prognostic value of both total and free concentrations of five PBUTs [p-cresyl sulfate (pCS), p-cresyl glucuronide, indoxyl sulfate, indole acetic acid and hippuric acid] in a cohort of 523 patients with non-dialysis CKD Stages G1-G5. Patients were followed prospectively for the occurrence of a fatal or non-fatal CV event as the primary endpoint and a number of other major complications as secondary endpoints. In addition, the association with and the prognostic value of nine markers of endothelial activation/damage were compared. RESULTS: After a median follow-up of 5.5 years, 149 patients developed the primary endpoint. In multivariate Cox regression models adjusted for age, sex, systolic blood pressure, diabetes mellitus and estimated glomerular filtration rate, and corrected for multiple testing, only free pCS was associated with the primary endpoint {hazard ratio [HR] 1.39 [95% confidence interval (CI) 1.14-1.71]; P = 0.0014}. Free pCS also correlated with a disintegrin and metalloproteinase with a thrombospondin type 1 motif, member 13 (r = -0.114, P < 0.05), angiopoietin-2 (ANGPT2) (r = 0.194, P < 0.001), matrix metallopeptidase 7 (MMP-7) (r = 0.238, P < 0.001) and syndecan 1 (r = 0.235, P < 0.001). Of these markers of endothelial activation/damage, ANGPT2 [HR 1.46 (95% CI 1.25-1.70); P < 0.0001] and MMP-7 [HR 1.31 (95% CI 1.08-1.59); P = 0.0056] were also predictive of the primary outcome. CONCLUSIONS: Among PBUTs, free pCS shows the strongest association with CV outcome in non-dialysed patients with CKD. Two markers of endothelial activation/damage that were significantly correlated with free pCS, ANGPT2 and MMP-7, were also associated with CV outcome. The hypothesis that free pCS exerts its CV toxic effects through an adverse effect on endothelial function deserves further exploration.
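
The hazard ratios above come from multivariate Cox proportional hazards models. The sketch below shows the generic form of such a model on simulated data using the lifelines library; column names, covariates, and values are hypothetical and not the study's dataset or code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "followup_years": rng.exponential(5.0, n).clip(0.1, 8.0),
    "cv_event": rng.binomial(1, 0.3, n),       # fatal or non-fatal CV event
    "free_pcs": rng.lognormal(0.0, 0.8, n),    # free p-cresyl sulfate
    "age": rng.normal(65, 12, n),
    "male": rng.binomial(1, 0.6, n),
    "systolic_bp": rng.normal(135, 18, n),
    "diabetes": rng.binomial(1, 0.25, n),
    "egfr": rng.normal(45, 20, n).clip(5, 110),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cv_event")
print(cph.hazard_ratios_["free_pcs"])  # adjusted hazard ratio for free pCS
```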


Subject(s)
Chronic Renal Insufficiency, Cresols, Humans, Indican, Sulfates, Sulfuric Acid Esters, Biological Toxins, Uremia
16.
Nephrol Dial Transplant; 36(5): 811-818, 2021 Apr 26.
Article in English | MEDLINE | ID: mdl-31837226

ABSTRACT

BACKGROUND: The urinary proteomic classifier chronic kidney disease 273 (CKD273) is predictive of the development and progression of chronic kidney disease (CKD) and/or albuminuria in type 2 diabetes. This study evaluates its role in the prediction of cardiovascular (CV) events in patients with CKD Stages G1-G5. METHODS: We applied the CKD273 classifier in a cohort of 451 patients with CKD Stages G1-G5 followed prospectively for a median of 5.5 years. Primary endpoints were all-cause mortality, CV mortality and the composite of non-fatal and fatal CV events (CVEs). RESULTS: In multivariate Cox regression models adjusting for age, sex, prevalent diabetes and CV history, the CKD273 classifier at baseline was significantly associated with total mortality and time to fatal or non-fatal CVE, but not with CV mortality. Because of a significant interaction between CKD273 and CV history (P = 0.018) and CKD stages (P = 0.002), a stratified analysis was performed. In the fully adjusted models, the CKD273 classifier was a strong and independent predictor of fatal or non-fatal CVE only in the subgroup of patients with CKD Stages G1-G3b and without a history of CV disease. In those patients, the highest tertile of CKD273 was associated with a >10-fold increased risk compared with the lowest tertile. CONCLUSIONS: The urinary CKD273 classifier provides additional independent information on CV risk in patients with early-stage CKD and no CV history. Determination of CKD273 scores on a random urine sample may improve the efficacy of intensified surveillance and preventive strategies by selecting the patients who will potentially benefit most from early risk management.


Subject(s)
Proteomics, Adult, Aged, Albuminuria/urine, Cardiovascular Diseases/complications, Cohort Studies, Diabetes Mellitus Type 2/complications, Humans, Male, Middle Aged, Chronic Renal Insufficiency/complications
17.
Pediatr Nephrol; 36(6): 1589-1595, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33387017

ABSTRACT

BACKGROUND: Chronic kidney disease (CKD) in children is a pro-inflammatory condition leading to high morbidity and mortality. Accumulation of organic metabolic waste products, termed uraemic toxins, parallels the decline in kidney function. Several of these uraemic toxins are protein-bound (PBUT) and gut-derived. Gut dysbiosis is a hallmark of CKD, resulting in a state of increased proteolytic fermentation that might be counteracted by dietary fibre. Data on fibre intake in children with CKD are lacking. We aimed to assess dietary fibre intake in a paediatric CKD cohort and define its relationship with PBUT concentrations. METHODS: In this multi-centre, cross-sectional observational study, 61 non-dialysis CKD patients (9 ± 5 years) were included. Dietary fibre intake was assessed through 24-h recalls or 3-day food records and coupled to total and free levels of 4 PBUTs (indoxyl sulfate (IxS), p-cresyl sulfate (pCS), p-cresyl glucuronide (pCG) and indole acetic acid (IAA)). RESULTS: In general, fibre intake was low, especially in advanced CKD: 10 ± 6 g/day/BSA in CKD 4-5 versus 14 ± 7 in CKD 1-3 (p = 0.017). Lower concentrations of both total (p = 0.036) and free (p = 0.036) pCG were observed in the group with the highest fibre intake, independent of kidney function. CONCLUSIONS: Fibre intake in paediatric CKD is low and is even lower in advanced CKD stages. Current dietary fibre recommendations for healthy children are not being achieved. Dietary management of CKD is complex, and overly restrictive diets carry the risk of nutritional deficiencies. The relation of fibre intake with PBUTs remains unclear and needs further investigation. A graphical abstract is available.


Subject(s)
Chronic Renal Insufficiency, Uremia, Adolescent, Child, Child Preschool, Cross-Sectional Studies, Dietary Fiber, Humans, Biological Toxins, Uremic Toxins
18.
BMC Med Inform Decis Mak; 21(1): 87, 2021 Mar 6.
Article in English | MEDLINE | ID: mdl-33676513

ABSTRACT

Over the last decades, the face of health care has changed dramatically, with big improvements in what is technically feasible. However, there are indicators that the current approach to evaluating evidence in health care is not holistic and hence in the long run, health care will not be sustainable. New conceptual and normative frameworks for the evaluation of health care need to be developed and investigated. The current paper presents a novel framework of justifiable health care and explores how the use of artificial intelligence and big data can contribute to achieving the goals of this framework.


Subject(s)
Artificial Intelligence, Big Data, Delivery of Health Care, Health Facilities, Humans
19.
Kidney Int; 97(6): 1230-1242, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32317112

ABSTRACT

Chronic kidney disease (CKD) is characterized by accumulation of protein-bound uremic toxins such as p-cresyl sulfate, p-cresyl glucuronide, indoxyl sulfate and indole-3-acetic acid, which originate in the gut. Intestinal bacteria metabolize aromatic amino acids into p-cresol and indole (which are further conjugated in the colon mucosa and liver) and indole-3-acetic acid. Here we measured fecal, plasma and urine metabolite concentrations, the contribution of gut bacterial generation to the accumulation of plasma protein-bound uremic toxins, and the influx of circulating protein-bound uremic toxins into the gut at different stages of CKD. Feces, blood and urine were collected from 14 control individuals and 141 patients with CKD. Solutes were quantified by ultra-high performance liquid chromatography. To assess the rate of bacterial generation of p-cresol, indole and indole-3-acetic acid, fecal samples were cultured ex vivo. With CKD progression, an increase in protein-bound uremic toxin levels was observed in plasma, whereas the levels of these toxins and their precursors remained the same in feces and urine. Anaerobic culture of fecal samples showed no difference in ex vivo p-cresol, indole and indole-3-acetic acid generation. Therefore, differences in plasma protein-bound uremic toxin levels between CKD stages cannot be explained by differences in bacterial generation rates in the gut, suggesting retention due to impaired kidney function as the main contributor to their increased plasma levels. As fractional clearance decreased with CKD progression, tubular clearance appeared to be more affected than the glomerular filtration rate, and there was no net increase in the influx of protein-bound uremic toxins into the gut lumen with increased plasma levels.
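
For readers unfamiliar with the term, fractional clearance expresses a solute's clearance relative to a filtration reference; one common spot-sample formulation relative to creatinine is sketched below with made-up values (the paper may normalize to measured GFR instead). Values well below 1 are compatible with restricted filtration or net reabsorption, values above 1 with net tubular secretion.

```python
def fractional_clearance(urine_solute: float, plasma_solute: float,
                         urine_creatinine: float, plasma_creatinine: float) -> float:
    """Fractional clearance relative to creatinine:
    (U_solute / P_solute) / (U_creatinine / P_creatinine).
    Units cancel as long as urine and plasma use the same unit per solute."""
    return (urine_solute / plasma_solute) / (urine_creatinine / plasma_creatinine)

# Hypothetical spot-sample values for a protein-bound solute.
print(round(fractional_clearance(urine_solute=120, plasma_solute=4,
                                 urine_creatinine=90, plasma_creatinine=1.1), 2))
```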


Subject(s)
Gastrointestinal Microbiome, Chronic Renal Insufficiency, Biological Toxins, Uremia, Feces, Humans, Indican, Chronic Renal Insufficiency/diagnosis
20.
Nephrol Dial Transplant; 35(6): 979-986, 2020 Jun 1.
Article in English | MEDLINE | ID: mdl-32227227

ABSTRACT

BACKGROUND: We compare reimbursement for haemodialysis (HD) and peritoneal dialysis (PD) in European countries to assess the impact on government healthcare budgets. We discuss strategies to reduce costs by promoting sustainable dialysis and kidney transplantation. METHODS: This was a cross-sectional survey among nephrologists conducted online July-December 2016. European countries were categorized by tertiles of gross domestic product per capita (GDP). Reimbursement data were matched to kidney replacement therapy (KRT) data. RESULTS: The prevalence per million population of patients being treated with long-term dialysis was not significantly different across tertiles of GDP (P = 0.22). The percentage of PD increased with GDP across tertiles (4.9, 8.2, 13.4%; P < 0.001). The HD-to-PD reimbursement ratio was higher in countries with the highest tertile of GDP (0.7, 1.0 versus 1.7; P = 0.007). Home HD was mainly reimbursed in countries with the highest tertile of GDP (15, 15 versus 69%; P = 0.005). The percentage of public health expenditure for reimbursement of dialysis decreased across tertiles of GDP (3.3, 1.5, 0.7%; P < 0.001). Transplantation as a proportion of all KRT increased across tertiles of GDP (18.5, 39.5, 56.0%; P < 0.001). CONCLUSIONS: In Europe, dialysis has a disproportionately high impact on public health expenditure, especially in countries with a lower GDP. In these countries, the cost difference between PD and HD is smaller, and home dialysis and transplantation are less frequently provided than in countries with a higher GDP. In-depth evaluation and analysis of influential economic and political measures are needed to steer optimized reimbursement strategies for KRT.


Subject(s)
Delivery of Health Care/standards, Health Care Costs/standards, Chronic Kidney Failure/economics, Chronic Kidney Failure/therapy, Reimbursement Mechanisms/standards, Renal Dialysis/economics, Renal Replacement Therapy/economics, Cost of Illness, Cross-Sectional Studies, Delivery of Health Care/economics, Europe, Health Expenditures, Humans, Renal Dialysis/methods, Renal Replacement Therapy/methods