ABSTRACT
INTRODUCTION: In the interferon era, the treatment of hepatitis C virus (HCV) infection in patients on haemodialysis (HD) was limited due to the significant number of treatment-related adverse events (AEs). Direct-acting antivirals (DAAs) have demonstrated their efficacy and safety in the treatment of HCV infection in patients with advanced chronic kidney disease on haemodialysis. The objective of the study was to evaluate the success in eliminating HCV infection from our dialysis unit using DAAs, and to assess the impact of HCV elimination on clinical and analytical outcomes. PATIENTS AND METHODS: This is a prospective, interventional, single-center study at Hospital Clínic de Barcelona. All HCV-RNA positive patients who received antiviral therapy with DAAs within a 3-year period (2014-2017) were analyzed (n=20). Data on virologic response, adverse events, and biochemical and hematological parameters during and after DAA therapy were analyzed. RESULTS: All patients achieved sustained virologic response (SVR) and only 40% of patients presented with mild AEs. None of the patients presented with HCV reinfection after a 1-year follow-up period, and thus HCV was eliminated from our HD unit. SVR was associated with a significant increase in hemoglobin and hematocrit, and a trend toward lower iron supplementation requirements, with no change in darbepoetin dose. CONCLUSION: HCV infection can be safely eliminated from HD units with the use of DAAs, preventing new infections in patients and healthcare staff. In the short term, the achievement of SVR is associated with an improvement in the control of anemia.
Subjects
Anemia/drug therapy, Antiviral Agents/therapeutic use, Hepatitis C/drug therapy, Renal Dialysis, Chronic Renal Insufficiency/therapy, Sustained Virologic Response, 2-Naphthylamine, Anemia/etiology, Anilides, Carbamates, Cyclopropanes, Darbepoetin alfa/administration & dosage, Female, Hematinics/administration & dosage, Hematocrit, Hemoglobin A, Humans, Macrocyclic Lactams, Macrocyclic Compounds/therapeutic use, Male, Middle Aged, Proline/analogs & derivatives, Prospective Studies, Chronic Renal Insufficiency/complications, Ritonavir/therapeutic use, Sulfonamides/therapeutic use, Uracil/analogs & derivatives, Uracil/therapeutic use, Valine
ABSTRACT
Pregnancy-associated atypical hemolytic uremic syndrome (aHUS) refers to the thrombotic microangiopathy resulting from uncontrolled complement activation during pregnancy or the postpartum period. Pregnancy-associated aHUS is a devastating disease for which clinical understanding and treatment experience are limited. Here we report a retrospective study analyzing the clinical and prognostic data of 22 cases of pregnancy-associated aHUS from the Spanish aHUS Registry under different treatments. Sixteen patients presented during the first pregnancy, and as many as nine patients required hemodialysis at diagnosis. Identification of inherited complement abnormalities explained nine of the 22 cases, with CFH mutations and CFH to CFHR1 gene conversion events being the most prevalent genetic alterations associated with this disorder (66%). In thirteen of the cases, pregnancy complications were sufficient to trigger a thrombotic microangiopathy in the absence of genetic or acquired complement alterations. The postpartum period was the time of highest risk for developing the disease, and cesarean section was associated with pregnancy-associated aHUS in this cohort. Seventeen patients underwent plasma treatments, with a positive renal response in only three cases. In contrast, ten patients received eculizumab, with an excellent renal response in all, regardless of whether they carried inherited complement abnormalities. Although the cohort is relatively small, the data suggest that pregnancy-associated aHUS does not differ from other types of aHUS and support the efficacy of eculizumab treatment over plasma therapies. This study may be useful to improve prognosis in this group of aHUS patients.
Subjects
Atypical Hemolytic Uremic Syndrome, Pregnancy Complications, Thrombotic Microangiopathies, Adult, Humanized Monoclonal Antibodies/therapeutic use, Atypical Hemolytic Uremic Syndrome/epidemiology, Atypical Hemolytic Uremic Syndrome/genetics, Atypical Hemolytic Uremic Syndrome/immunology, Atypical Hemolytic Uremic Syndrome/therapy, Cesarean Section, Complement Activation, Complement C3b Inactivator Proteins/genetics, Complement Factor H/genetics, Female, Gene Conversion, Humans, Immunosuppressive Agents/therapeutic use, Mutation, Parity, Plasma Exchange, Postpartum Period, Pregnancy, Pregnancy Complications/epidemiology, Pregnancy Complications/genetics, Pregnancy Complications/immunology, Pregnancy Complications/therapy, Registries, Renal Dialysis, Retrospective Studies, Risk Factors, Spain/epidemiology, Thrombotic Microangiopathies/epidemiology, Thrombotic Microangiopathies/genetics, Thrombotic Microangiopathies/immunology, Thrombotic Microangiopathies/therapy, Treatment Outcome
ABSTRACT
Proteinuria is the main predictor of kidney graft loss. However, there is little information regarding the consequences of nephrotic proteinuria (NP) and nephrotic syndrome (NS) after a kidney transplant. We aimed to describe the clinical and histopathological characteristics of kidney recipients with nephrotic-range proteinuria and to compare graft survival between those who developed NS and those who did not. A total of 204 patients (18.6% of kidney transplants in the study period) developed NP, and 68.1% of them had NS. Of the 110 patients who underwent a graft biopsy, 47.3% exhibited antibody-mediated rejection (ABMR), 21.8% recurrence of glomerulonephritis, 9.1% interstitial fibrosis and tubular atrophy (IFTA), and 7.3% de novo glomerulonephritis. After a median follow-up of 97.5 months, 64.1% experienced graft loss. Graft survival after the onset of NP declined from 75.8% at 12 months to 38% at 5 years, without significant differences between those with and those without NS. Patients who developed NS less than 3 months after the onset of NP had a significantly higher risk of death-censored graft loss (HR: 1.711, 95% CI: 1.147-2.553) than those without NS or those with late NS. In conclusion, NP and NS are frequent conditions after a kidney transplant, and they are associated with extremely poor graft outcomes. The time from the onset of NP to the development of NS is related to graft survival.
ABSTRACT
BACKGROUND: Recommendations on the use of antibody induction treatments in kidney transplant recipients (KTR) are based on moderate-quality, historical studies. This systematic review aims to reevaluate, based on current studies, the effects of different antibody preparations when used in specific KTR subgroups. METHODS: We searched MEDLINE and CENTRAL and selected randomized controlled trials (RCTs) and observational studies examining different antibody preparations used as induction in KTR. Comparisons were categorized into different KTR subgroups: standard, high risk of rejection, high risk of delayed graft function (DGF), living donor, and elderly KTR. Two authors independently assessed the risk of bias. RESULTS: Thirty-seven RCTs and 99 observational studies were ultimately included. Compared to anti-interleukin-2-receptor antibodies (IL2RA), anti-thymocyte globulin (ATG) reduced the risk of acute rejection at two years in standard KTR (RR 0.74, 95%CI 0.61-0.89) and in KTR at high risk of rejection (RR 0.55, 95%CI 0.43-0.72), but without decreasing the risk of graft loss. We did not find significant differences comparing ATG vs. alemtuzumab or different ATG dosages in any KTR group. CONCLUSIONS: Despite the many studies carried out on induction treatment in KTR, their heterogeneity and short follow-up preclude definitive conclusions about the optimal induction therapy. Compared with IL2RA, ATG reduced rejection in standard-risk, highly sensitized, and living donor graft recipients, but not in recipients at high risk of DGF or elderly recipients. More studies are needed to demonstrate beneficial effects in other KTR subgroups and on overall patient and graft survival.
Subjects
Antilymphocyte Serum, Kidney Transplantation, Humans, Aged, Antilymphocyte Serum/therapeutic use, Immunosuppressive Agents/therapeutic use, Kidney Transplantation/adverse effects, Alemtuzumab, Antibodies, Graft Rejection, Lymphocytes, Transplant Recipients, Graft Survival
ABSTRACT
Measuring the load of the non-pathogenic Torque Teno virus (TTV) allows the net immunosuppressive state after kidney transplantation (KTx) to be assessed. Currently, it is not known how exposure to maintenance immunosuppression affects TTV load. We hypothesized that TTV load is associated with exposure to mycophenolic acid (MPA) and tacrolimus. We performed a prospective study including 54 consecutive KTx recipients. Blood TTV load was measured by an in-house PCR at months 1 and 3. Together with doses and trough blood levels of tacrolimus and MPA, we calculated the coefficient of variability (CV), time in therapeutic range (TTR) and concentration/dose ratio (C/D) of tacrolimus, and the MPA area under the curve (AUC-MPA) at the third month. TTV load at the first and third month discriminated those patients at risk of developing opportunistic infections between months 1 and 3 (AUC-ROC 0.723, 95%CI 0.559-0.905, p = 0.023) and between months 3 and 6 (AUC-ROC 0.778, 95%CI 0.599-0.957, p = 0.028), respectively, but not those at risk of acute rejection. TTV load was not related to mean tacrolimus blood level, CV, TTR, C/D or AUC-MPA. To conclude, although TTV is a useful marker of net immunosuppressive status after KTx, it is not related to exposure to maintenance immunosuppression.
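For orientation, the tacrolimus exposure metrics named in this abstract are simple summary statistics of trough levels and doses. A minimal sketch of how two of them (CV and C/D ratio) can be computed is shown below; the function names and example values are illustrative assumptions, not the study's actual code.

```python
# Illustrative sketch of tacrolimus exposure metrics (not the study's actual code).
# Assumes trough levels in ng/mL and daily doses in mg, one value per visit.
from statistics import mean, stdev

def coefficient_of_variability(trough_levels):
    """Intrapatient variability, expressed as SD/mean in percent."""
    return 100.0 * stdev(trough_levels) / mean(trough_levels)

def concentration_dose_ratio(trough_level, daily_dose_mg):
    """C/D ratio: trough concentration divided by the daily dose at that visit."""
    return trough_level / daily_dose_mg

# Example with hypothetical values
levels = [7.2, 9.8, 6.5, 8.1]   # ng/mL
doses = [6.0, 6.0, 5.0, 5.0]    # mg/day
print(round(coefficient_of_variability(levels), 1))              # ~18% with these values
print(round(concentration_dose_ratio(levels[0], doses[0]), 2))   # 1.2
```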
ABSTRACT
Background and objectives: Acute kidney injury (AKI) is common among hospitalized patients with COVID-19 and is associated with a worse prognosis. The Spanish Society of Nephrology created the AKI-COVID Registry to characterize the population admitted for COVID-19 that developed AKI in Spanish hospitals. The need for renal replacement therapy (RRT), the therapeutic modalities used, and mortality in these patients were assessed. Material and methods: In a retrospective study, we analyzed data from the AKI-COVID Registry, which included patients hospitalized in 30 Spanish hospitals from May 2020 to November 2021. Clinical and demographic variables, factors related to the severity of COVID-19 and AKI, and survival data were recorded. A multivariate regression analysis was performed to study factors related to RRT and mortality. Results: Data from 730 patients were recorded. A total of 71.9% were men, with a mean age of 70 years (60-78); 70.1% were hypertensive, 32.9% diabetic, 33.3% had cardiovascular disease and 23.9% had some degree of chronic kidney disease (CKD). Pneumonia was diagnosed in 94.6%, requiring ventilatory support in 54.2% and admission to the ICU in 44.1% of cases. The median time from the onset of COVID-19 symptoms to the appearance of AKI (37.1% KDIGO I, 18.3% KDIGO II, 44.6% KDIGO III) was 6 days (4-10). A total of 235 (33.9%) patients required RRT: 155 patients received continuous renal replacement therapy, 89 alternate-day dialysis, 36 daily dialysis, 24 extended hemodialysis and 17 hemodiafiltration. Smoking habit (OR 3.41), ventilatory support (OR 20.2), maximum creatinine value (OR 2.41) and time to AKI onset (OR 1.13) were predictors of the need for RRT; age was a protective factor (OR 0.95). The group without RRT was characterized by older age, less severe AKI, and shorter times to kidney injury onset and recovery (p < 0.05). During hospitalization, 38.6% of patients died; severe AKI and RRT were more frequent among those who died. In the multivariate analysis, age (OR 1.03), previous chronic kidney disease (OR 2.21), development of pneumonia (OR 2.89), ventilatory support (OR 3.34) and RRT (OR 2.28) were predictors of mortality, while chronic treatment with angiotensin receptor blockers (ARBs) was identified as a protective factor (OR 0.55). Conclusions: Patients with AKI during hospitalization for COVID-19 had a high mean age, comorbidities and severe infection. We identified two distinct clinical patterns: an early-onset AKI in older patients that resolved within a few days without the need for RRT, and a more severe, late-onset pattern with a greater need for RRT, related to greater severity of the infectious disease. The severity of the infection, age and the presence of CKD prior to admission were identified as risk factors for mortality in these patients. In addition, chronic treatment with ARBs was identified as a protective factor against mortality.
ABSTRACT
OBJECTIVES: The number of kidney transplants obtained from controlled donations after circulatory death is increasing, with long-term outcomes similar to those obtained with donations after brain death. Extraction using normothermic regional perfusion can improve results with controlled donors after circulatory death; however, information on the histological impact of this extraction procedure is scarce. MATERIALS AND METHODS: We retrospectively investigated all kidney transplants performed from October 2014 to December 2019 in which a surveillance kidney biopsy had been performed at 1 year of follow-up, comparing controlled donation after circulatory death with normothermic regional perfusion versus donation after brain death. Interstitial fibrosis/tubular atrophy was assessed by adding the values of interstitial fibrosis and tubular atrophy, according to the Banff classification of renal allograft pathology. RESULTS: When we compared histological data from 66 transplants from donations after brain death versus 24 transplants from donations after circulatory death with normothermic regional perfusion, no differences were found in the degree of fibrosis in the 1-year follow-up biopsy (1.7 ± 1.3 vs 1.7 ± 1.1; P = .971) or in the proportion of patients with increased fibrosis, calculated as interstitial fibrosis/tubular atrophy >2 (18% vs 13%; P = .522). In our multivariate analysis, which included acute rejection, expanded criteria donation, and the type of donation, no variable was independently related to an increased risk of interstitial fibrosis/tubular atrophy >2. CONCLUSIONS: The outcomes of kidney grafts procured in our center using controlled donation after circulatory death with normothermic regional perfusion were indistinguishable from those obtained from donors after brain death, showing the same degree of fibrosis in the 1-year posttransplant surveillance biopsy. Our data support the conclusion that normothermic regional perfusion should be the method of choice for extraction in donors after circulatory death.
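As described above, the fibrosis score sums the Banff interstitial fibrosis (ci) and tubular atrophy (ct) grades, each graded 0-3, and flags grafts with a sum above 2. A small illustrative helper (hypothetical names and values, not the study's code) makes the arithmetic explicit:

```python
# Hypothetical illustration of the IF/TA summary score used above: Banff ci
# (interstitial fibrosis) and ct (tubular atrophy) grades, each 0-3, are added,
# and a sum > 2 is taken as "increased fibrosis".

def ifta_score(ci: int, ct: int) -> int:
    assert 0 <= ci <= 3 and 0 <= ct <= 3, "Banff grades range from 0 to 3"
    return ci + ct

def increased_fibrosis(ci: int, ct: int) -> bool:
    return ifta_score(ci, ct) > 2

print(ifta_score(1, 1), increased_fibrosis(1, 1))  # 2 False
print(ifta_score(2, 1), increased_fibrosis(2, 1))  # 3 True
```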
Subjects
Kidney Transplantation, Tissue and Organ Procurement, Humans, Kidney Transplantation/adverse effects, Kidney Transplantation/methods, Brain Death, Retrospective Studies, Graft Survival, Organ Preservation/adverse effects, Organ Preservation/methods, Perfusion/adverse effects, Perfusion/methods, Tissue Donors, Fibrosis, Biopsy, Atrophy/etiology, Death
ABSTRACT
Tacrolimus has a narrow therapeutic margin. Maintaining tacrolimus blood levels in the appropriate range is difficult because of its intrapatient variability. In fact, greater blood level variability has been related to worse kidney graft outcome, but measuring variability alone does not take the therapeutic range goal into account. Determining the time in therapeutic range (TTR) using the Rosendaal method allows dose optimization by considering the adverse events associated with both supratherapeutic and subtherapeutic doses. Previous studies in kidney and lung transplantation have shown that TTR is related to subsequent graft outcome. We performed a single-center, observational study including 215 consecutive kidney transplants performed in our center. The percentage of time that each patient maintained levels above 6 ng/mL between months 3 and 12 (%TTR3-12) was calculated using the Rosendaal method. A lower %TTR3-12 was associated with a higher risk of acute rejection (area under the receiver operating characteristic curve, 0.614; 95% confidence interval [CI], 0.513-0.714; P = .018) and with a higher risk of a 1-year glomerular filtration rate < 30 mL/min/1.73 m2 (area under the receiver operating characteristic curve, 0.676; 95% CI, 0.542-0.811; P = .014). The lowest tertile of %TTR3-12 was independently associated with a higher risk of death-censored graft loss (hazard ratio, 10.773; 95% CI, 1.315-88.264; P = .027) after adjusting for 1-year glomerular filtration rate, expanded criteria donation, and acute rejection throughout the first year. To conclude, measuring TTR after kidney transplantation is an easy way to estimate the time of exposure to adequate tacrolimus levels and is related to kidney graft outcome.
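The Rosendaal method assumes that the blood level changes linearly between consecutive measurements, so the fraction of each interval spent in range can be obtained by interpolation. Below is a minimal sketch of a time-above-threshold calculation in that spirit for the 6 ng/mL cut-off used above; the function and the sample values are assumptions, not the study's implementation.

```python
# Rosendaal-style linear interpolation: percentage of time with the tacrolimus
# trough level above a threshold (6 ng/mL in the abstract above).
# Illustrative sketch only, not the study's actual code.

def percent_time_above(days, levels, threshold=6.0):
    """days: measurement times (days post-transplant); levels: trough levels (ng/mL)."""
    time_above = 0.0
    for i in range(1, len(days)):
        t0, t1 = days[i - 1], days[i]
        c0, c1 = levels[i - 1], levels[i]
        dt = t1 - t0
        if c0 >= threshold and c1 >= threshold:    # whole interval above threshold
            time_above += dt
        elif c0 < threshold and c1 < threshold:    # whole interval below threshold
            continue
        else:                                      # level crosses the threshold once
            time_above += dt * (max(c0, c1) - threshold) / abs(c1 - c0)
    return 100.0 * time_above / (days[-1] - days[0])

# Hypothetical follow-up between months 3 and 12
days = [90, 120, 180, 270, 365]
levels = [7.5, 5.0, 6.8, 8.0, 6.2]
print(round(percent_time_above(days, levels), 1))  # 83.5 with these made-up values
```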
Subjects
Graft Rejection, Tacrolimus, Humans, Tacrolimus/therapeutic use, Graft Rejection/prevention & control, Graft Survival, Immunosuppressive Agents/therapeutic use, Retrospective Studies, Kidney
ABSTRACT
BACKGROUND: Regulatory T (Treg) cells play a role in limiting kidney transplant rejection and can potentially promote long-term transplant tolerance. There are no large prospective studies demonstrating the utility of peripheral blood Treg cells as biomarkers for long-term graft outcome in kidney transplantation. The aim of our study was to analyze the influence of the absolute number of peripheral blood Treg cells after transplantation on long-term death-censored graft survival. METHODS: We monitored the absolute numbers of Treg cells by flow cytometry in nonfrozen samples of peripheral blood from 133 kidney transplant recipients, who were prospectively followed for up to 2 years after transplantation. Death-censored graft survival was determined retrospectively in January 2017. RESULTS: The mean clinical follow-up was 7.4 ± 2.9 years, and 24.1% of patients suffered death-censored graft loss (DCGL). Patients with Treg cell counts above the median value (14.57 cells/mm3) 1 year after transplantation showed better death-censored graft survival (5-year survival, 92.5% vs 81.4%, log-rank P = .030). One-year Treg cell counts showed a receiver operating characteristic area under the curve of 63.1% (95% confidence interval, 52.9-73.2%, P = 0.026) for predicting DCGL. In multivariate Cox regression analysis, an increased number of peripheral blood Treg cells was a protective factor for DCGL (hazard ratio, 0.961, 95% confidence interval, 0.924-0.998, P = 0.041), irrespective of 1-year proteinuria and renal function. CONCLUSIONS: Peripheral blood absolute numbers of Treg cells 1 year after kidney transplantation predict a better long-term graft outcome and may be used as prognostic biomarkers.
ABSTRACT
INTRODUCTION: Online haemodiafiltration (OL-HDF) has been associated with increased survival. To date, the influence of the inner diameter of the hollow fibres of the dialyser on convective volume has not been well established. The objective of the study was to evaluate the effect of increasing the inner diameter of the dialyser fibres on convective volume and removal capacity. MATERIAL AND METHODS: We included 16 patients on post-dilution OL-HDF with automatic substitution. Each patient was analysed over 4 sessions in which the inner diameter varied: 185 µm (FX60 Cordiax and FX80 Cordiax) versus 210 µm (FX600 Cordiax and FX800 Cordiax). Different solutes were measured at the beginning and end of each dialysis session. RESULTS: No differences in convective volume were found with an increased inner diameter: 32.3±3.1 vs. 31.8±3.6 L/session (FX60 vs. FX600) and 33.7±4.3 vs. 33.5±3.8 L/session (FX80 vs. FX800). The reduction percentages also did not differ: urea 83.7±4.5 vs. 84.1±3.4 for FX60 vs. FX600, and 82.7±4.1 vs. 83.6±3.8 for FX80 vs. FX800; creatinine 78.2±5.6 vs. 77.8±4.6, and 77.1±5.4 vs. 78.1±4.9; β2-microglobulin 82.2±4.3 vs. 82.9±4.2, and 82.9±4.7 vs. 84.0±3.8; myoglobin 71.0±10 vs. 70.2±9, and 72.8±11 vs. 75.0±10; prolactin 70.4±9 vs. 68.1±9, and 72.2±10 vs. 73.4±8.2; and α1-microglobulin 22.9±10 vs. 21.6±10, and 26.5±12 vs. 28.8±11, respectively. CONCLUSION: Increasing the inner diameter of the hollow fibres did not improve convective volume or removal capacity.
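The removal figures quoted above are reduction ratios, i.e. the percentage fall in each solute's concentration from the start to the end of the session. A minimal sketch of that calculation follows; the values are hypothetical and no correction for haemoconcentration is applied.

```python
# Reduction ratio (percent) for a solute over one dialysis session:
# 100 * (pre - post) / pre, using pre- and post-session concentrations.
# Illustrative only; no haemoconcentration/ultrafiltration correction is applied.

def reduction_ratio(pre: float, post: float) -> float:
    return 100.0 * (pre - post) / pre

# Hypothetical pre/post values (same units within each solute)
solutes = {"urea": (120.0, 19.5), "creatinine": (8.2, 1.8), "beta2-microglobulin": (28.0, 5.0)}
for name, (pre, post) in solutes.items():
    print(f"{name}: {reduction_ratio(pre, post):.1f}%")   # roughly 84%, 78%, 82%
```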
Subjects
Hemodiafiltration/instrumentation, Adult, Aged, Aged 80 and over, Blood Proteins/analysis, Convection, Creatinine/analysis, Equipment Design, Female, Humans, Chronic Kidney Failure/blood, Chronic Kidney Failure/therapy, Male, Middle Aged, Prolactin/analysis, Rheology, Urea/analysis
ABSTRACT
BACKGROUND: Acute kidney injury (AKI) occurs in more than half of critically ill patients admitted to intensive care units (ICUs) and increases the mortality risk. The main cause of AKI in the ICU is sepsis. AKI severity and other related variables, such as recurrence of AKI episodes, may influence mortality risk. While AKI recurrence after hospital discharge has recently been related to an increased risk of mortality, little is known about the rate and consequences of AKI recurrence during the ICU stay. Our hypothesis is that AKI recurrence during the ICU stay in septic patients may be associated with a higher mortality risk. METHODS: Over a period of 30 months, we prospectively enrolled all adult patients (n = 405) admitted to the ICU of our hospital with a diagnosis of severe sepsis/septic shock. Serum creatinine was measured daily. 'In-ICU AKI recurrence' was defined as a new spontaneous rise of ≥0.3 mg/dl within 48 h from the lowest serum creatinine after the previous AKI episode. RESULTS: After excluding 5 patients who suffered AKI after the initial ICU admission, 331 of the remaining 400 patients (82.8%) developed at least one AKI episode while in the ICU. Among them, 79 (19.8%) developed ≥2 AKI episodes. Excluding the 69 patients without AKI, in-hospital (adjusted HR = 2.48, 95% CI 1.47-4.19), 90-day (adjusted HR = 2.54, 95% CI 1.55-4.16) and end-of-follow-up (adjusted HR = 1.97, 95% CI 1.36-2.84) mortality rates were significantly higher in patients with recurrent AKI, independently of sex, age, need for mechanical ventilation, APACHE score, baseline estimated glomerular filtration rate, complete recovery and KDIGO stage. CONCLUSIONS: AKI recurred in about 20% of ICU patients after a first episode of sepsis-related AKI. This recurrence increases the mortality rate independently of sepsis severity and of the KDIGO stage of the initial AKI episode. ICU physicians must be aware of the risks related to AKI recurrence, and multiple episodes of AKI should be highlighted in electronic medical records and included among the variables of clinical risk scores.
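The recurrence definition above is operational: after each AKI episode, the lowest subsequent serum creatinine is tracked, and a new spontaneous rise of ≥0.3 mg/dL within 48 h of that nadir counts as a new episode. The sketch below illustrates one way daily creatinine values could be screened under that definition; the function, the simplified 48-h window handling and the example series are assumptions, not the study's code.

```python
# Screen a daily serum creatinine series for recurrent AKI episodes using the
# definition above: a rise of >= 0.3 mg/dL within 48 h from the lowest value
# reached after the previous episode. Simplification: the rise is measured only
# against the running nadir. Illustrative sketch, not the study's code.

def count_aki_episodes(creatinine, rise=0.3, window_days=2):
    """creatinine: list of daily values (mg/dL), one per ICU day."""
    episodes = 0
    nadir, nadir_day = creatinine[0], 0
    in_rise = False                      # True while the current episode is ongoing
    for day in range(1, len(creatinine)):
        value = creatinine[day]
        if value < nadir:                # creatinine has come down: new post-episode nadir
            nadir, nadir_day = value, day
            in_rise = False
        elif (not in_rise and value - nadir >= rise and day - nadir_day <= window_days):
            episodes += 1                # new spontaneous rise >= 0.3 mg/dL within 48 h
            in_rise = True
            nadir, nadir_day = value, day
    return episodes

# Hypothetical course: initial AKI, partial recovery, then a second episode
print(count_aki_episodes([1.0, 1.6, 2.1, 1.7, 1.3, 1.2, 1.6, 2.0]))  # 2
```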
ABSTRACT
BACKGROUND: Post-transplant proteinuria is associated with lower graft and patient survival. Renin-angiotensin-aldosterone system blockers are used to reduce proteinuria and improve renal outcome. Although it is known that a high salt intake blunts the antiproteinuric effect of angiotensin-converting enzyme inhibitor (ACEI) and angiotensin receptor blocker (ARB) drugs in non-transplant patients, this effect has not been studied in kidney transplant recipients. OBJECTIVE: To analyse the relationship between sodium intake and the antiproteinuric effect of ACEI/ARB drugs in kidney transplant recipients. METHODS: We selected 103 kidney transplant recipients receiving ACEI/ARB drugs for more than 6 months due to proteinuria >1 g/day. Proteinuria was analysed at baseline and at 6 months after starting ACEI/ARB treatment. Salt intake was estimated by the urinary sodium-to-creatinine ratio (uNa/Cr). RESULTS: Proteinuria fell to less than 1 g/day in 46 patients (44.7%). A high uNa/Cr was associated with a smaller proteinuria decrease (r=-0.251, P=.011). The percentage proteinuria reduction was significantly lower in patients in the highest uNa/Cr tertile [63.9% (IQR 47.1%), 60.1% (IQR 55.4%) and 38.9% (IQR 85.5%) across tertiles, P=.047]. A high uNa/Cr was independently related (OR 2.406 per 100 mEq/g, 95% CI: 1.008-5.745, P=.048) to an antiproteinuric response <50% after renin-angiotensin-aldosterone system blockade. CONCLUSIONS: A high salt intake results in a smaller proteinuria decrease in kidney transplant recipients with proteinuria treated with ACEI/ARB drugs.
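Two derived quantities drive the analysis above: the percentage fall in proteinuria after starting the blocker and the urinary sodium-to-creatinine ratio used as a surrogate of salt intake. A minimal sketch of both calculations follows, with hypothetical values; only the <50% response cut-off comes from the abstract itself.

```python
# Illustrative calculation of the two quantities analysed above (not the study's code):
# - percentage proteinuria reduction 6 months after starting ACEI/ARB treatment
# - urinary sodium-to-creatinine ratio (uNa/Cr) as a surrogate of salt intake

def proteinuria_reduction(baseline_g_day: float, month6_g_day: float) -> float:
    """Percentage fall in 24-h proteinuria from baseline to month 6."""
    return 100.0 * (baseline_g_day - month6_g_day) / baseline_g_day

def una_cr_ratio(urinary_na_meq_l: float, urinary_cr_g_l: float) -> float:
    """Urinary sodium (mEq/L) divided by urinary creatinine (g/L), in mEq/g."""
    return urinary_na_meq_l / urinary_cr_g_l

# Hypothetical patient
reduction = proteinuria_reduction(2.4, 1.1)        # ~54% reduction
poor_response = reduction < 50                     # response cut-off used in the abstract
print(round(reduction, 1), poor_response)          # 54.2 False
print(round(una_cr_ratio(140.0, 1.2), 1))          # 116.7 mEq/g
```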
Subjects
Angiotensin-Converting Enzyme Inhibitors/therapeutic use, Kidney Transplantation, Proteinuria/complications, Renin-Angiotensin System, Dietary Sodium/administration & dosage, Adult, Aged, Female, Humans, Male, Middle Aged
ABSTRACT
BACKGROUND: This observational study was conducted to investigate the use and effectiveness of calcium acetate/magnesium carbonate (CaMg) in the treatment of hyperphosphataemia in dialysis patients in real-world clinical practice. METHODS: 120 adult chronic kidney disease (CKD) patients on dialysis who received CaMg alone or in combination with other phosphate binders were followed up for 3-12 months. Serum phosphorus, calcium, magnesium, parathyroid hormone and albumin concentrations were measured at baseline and after 3, 6 and 12 months. In addition, CaMg dosage and the use of concurrent phosphate binders, vitamin D and cinacalcet were documented. Patients were evaluated in 2 subgroups: CaMg alone (n=79) vs. CaMg plus a concurrent phosphate binder (n=41). RESULTS: In both subgroups, serum phosphorus levels decreased significantly from baseline at 3, 6 and 12 months of CaMg treatment. The percentage of patients achieving recommended serum phosphorus targets improved after CaMg initiation. At month 6, a total of 78% were within the Kidney Disease Outcomes Quality Initiative (K/DOQI) target range. Total corrected serum calcium increased during CaMg treatment but mildly exceeded the upper limit of normal in only three patients. Asymptomatic but significant increases in magnesium (p<0.001) were observed in the monotherapy group at 3, 6 and 12 months. A total of 80 patients (67%) experienced episodes of mild hypermagnesaemia (>2.6 mg/dL, 1.05 mmol/L). CONCLUSIONS: This analysis of current clinical practice shows that, consistent with findings from a randomised controlled trial, CaMg treatment leads to a marked improvement in serum phosphorus levels, helping patients achieve K/DOQI and KDIGO (Kidney Disease: Improving Global Outcomes) targets.
Subjects
Acetates/therapeutic use, Hyperphosphatemia/drug therapy, Magnesium/therapeutic use, Renal Dialysis, Calcium Compounds/therapeutic use, Female, Follow-Up Studies, Humans, Male, Middle Aged, Prospective Studies, Time Factors, Treatment Outcome
ABSTRACT
The incidence of stroke is substantially higher among haemodialysis patients than in the general population. In this observational cohort study, we analysed data from incident haemodialysis patients at Valdecilla University Hospital in Santander (Spain) over a 40-year period (1971-2011). A total of 1453 patients started haemodialysis. The total follow-up was 4982.22 patient-years, during which 84 patients suffered a stroke. The cumulative incidence of stroke in our patients was 5.8%, with an incidence rate of 1686 strokes per 100 000 patient-years. The incidence rate in the first year was 1803 strokes per 100 000 patient-years, 6.5% higher than the average over the period studied. In the remaining period, the rates ranged between 356 and 1626 strokes per 100 000 patient-years. Significant factors related to stroke were diabetes, myocardial infarction or angina, hypertension, arteriosclerosis/intermittent claudication, history of stroke before starting haemodialysis, and atrial fibrillation. Haemoglobin levels in the stroke cohort were virtually identical to those of the non-stroke cohort (11.92±2.07 g/dL vs 11.68±2.12 g/dL). Finally, 60.7% of the stroke cohort received erythropoietin, with a mean dose of 9611 IU/week, compared with 51.9% and a mean dose of 9544 IU/week in the non-stroke cohort, without significant differences between groups. In conclusion, in the haemodialysis population the incidence of stroke is 7-10 times higher than in the general population. It is associated with well-known risk factors for stroke but not with haemoglobin levels or erythropoietin dose.
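The incidence figures above follow directly from the reported counts: cumulative incidence is events divided by patients, and the incidence rate is events divided by total follow-up, expressed per 100,000 patient-years. A quick worked check using the abstract's own numbers:

```python
# Worked check of the incidence figures quoted above, using the abstract's numbers.
strokes = 84
patients = 1453
follow_up_patient_years = 4982.22

cumulative_incidence = 100.0 * strokes / patients               # in percent
incidence_rate = 100_000 * strokes / follow_up_patient_years    # per 100,000 patient-years

print(round(cumulative_incidence, 1))   # 5.8 (%)
print(round(incidence_rate))            # 1686 strokes per 100,000 patient-years
```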