Results 1 - 20 of 28
1.
Kidney360 ; 3(5): 872-882, 2022 05 26.
Article in English | MEDLINE | ID: mdl-36128496

ABSTRACT

Background: Icodextrin has been shown in randomized controlled trials to benefit fluid management in peritoneal dialysis (PD). We describe international icodextrin prescription practices and their relationship to clinical outcomes. Methods: We analyzed data from the prospective, international PDOPPS, from Australia/New Zealand, Canada, Japan, the United Kingdom, and the United States. Membrane function and 24-hour ultrafiltration according to icodextrin and glucose prescription were determined at baseline. Using an instrumental variable approach, Cox regression, stratified by country, was used to determine any association of icodextrin use with death and permanent transfer to hemodialysis (HDT), adjusted for demographics, comorbidities, serum albumin, urine volume, transplant waitlist status, PD modality, center size, and study phase. Results: Icodextrin was prescribed for 1986 (35%) of 5617 patients: more than 43% of patients in every country except the United States, where it was used in only 17% and associated with far greater use of hypertonic glucose. Patients on icodextrin had more coronary artery disease and diabetes, longer dialysis vintage, lower residual kidney function, faster peritoneal solute transfer rates, and lower ultrafiltration capacity. Prescriptions with or without icodextrin achieved equivalent ultrafiltration (median 750 ml/d [interquartile range 300-1345 ml/d] versus 765 ml/d [251-1345 ml/d]). Icodextrin use was not associated with mortality (HR, 1.03; 95% CI, 0.72 to 1.48) or HDT (HR, 1.20; 95% CI, 0.92 to 1.57). Conclusions: There are large national and center differences in icodextrin prescription, with the United States using significantly less. Icodextrin was associated with avoidance of hypertonic glucose but equivalent ultrafiltration, which may affect any potential advantage in survival or HDT.


Subjects
Dialysis Solutions, Renal Dialysis, Dialysis Solutions/therapeutic use, Glucose/therapeutic use, Humans, Icodextrin, Prospective Studies, Serum Albumin
2.
Kidney Med ; 4(2): 100395, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35243307

ABSTRACT

RATIONALE & OBJECTIVE: Potential surrogate end points for kidney failure have been proposed in chronic kidney disease (CKD); however, they must be evaluated to ensure accurate, powerful, and harmonized research, particularly among patients with advanced CKD. The aim of the current study was to investigate the power and predictive ability of surrogate kidney failure end points in a population with moderate-to-advanced CKD. STUDY DESIGN: Analysis of longitudinal data of a large multinational CKD observational study (Chronic Kidney Disease Outcomes and Practice Patterns Study). SETTING & PARTICIPANTS: CKD stage 3-5 patients from Brazil, France, Germany, and the United States. OUTCOMES: Reaching an estimated glomerular filtration rate (eGFR) < 15 mL/min/1.73 m2 or eGFR decline of ≥40%, and composite end points of these individual end points. ANALYTICAL APPROACH: Each end point was used as a time-varying indicator in the Cox model to predict the time to kidney replacement therapy (KRT; dialysis or transplant) and was compared by the number of events and prediction accuracy. RESULTS: 8,211 patients had a median baseline eGFR of 27 mL/min/1.73 m2 (interquartile range, 21-36 mL/min/1.73 m2) and 1,448 KRT events over a median follow-up of 2.7 years (interquartile range, 1.2-3.0 years). Among CKD stage 4 patients, the eGFR < 15 mL/min/1.73 m2 end point had higher prognostic ability than 40% eGFR decline, but the end points were similar for CKD stage 3 patients. The combination of eGFR < 15 mL/min/1.73 m2 and 40% eGFR decline had the highest prognostic ability for predicting KRT, regardless of the CKD stage. Including KRT in the composite can increase the number of events and, therefore, the power. LIMITATIONS: Variable visit frequency resulted in variable eGFR measurement frequency. CONCLUSIONS: The composite end point can be useful for CKD progression studies among patients with advanced CKD. 
Harmonized use of this approach has the potential to accelerate the translation of new discoveries to clinical practice by identifying risk factors and treatments for kidney failure.

3.
J Cachexia Sarcopenia Muscle ; 12(4): 855-865, 2021 08.
Article in English | MEDLINE | ID: mdl-34060245

ABSTRACT

BACKGROUND: Wasting is a common complication of kidney failure that leads to weight loss and poor outcomes. Recent experimental data identified parathyroid hormone (PTH) as a driver of adipose tissue browning and wasting, but little is known about the relations among secondary hyperparathyroidism, weight loss, and risk of mortality in dialysis patients. METHODS: We included 42,319 chronic in-centre haemodialysis patients from the Dialysis Outcomes and Practice Patterns Study phases 2-6 (2002-2018). Linear mixed models were used to estimate the association between baseline PTH and percent weight change over 12 months, adjusting for country, demographics, comorbidities, and labs. Accelerated failure time models were used to assess 12 month weight loss as a mediator between baseline high PTH and mortality after 12 months. RESULTS: Baseline PTH was inversely associated with 12 month weight change: 12 month weight loss >5% was observed in 21%, 18%, 18%, 17%, 15%, and 14% of patients for PTH ≥600 pg/mL, 450-600, 300-450, 150-300, 50-150, and <50 pg/mL, respectively. In adjusted analyses, 12 month weight change compared with PTH 150-299 pg/mL was -0.60%, -0.12%, -0.10%, +0.15%, and +0.35% for PTH ≥600, 450-600, 300-450, 50-150, and <50 pg/mL, respectively. This relationship was robust regardless of recent hospitalization and was more pronounced in persons with preserved appetite. During follow-up after the 12 month weight measure [median, 1.0 (interquartile range, 0.6-1.7) years; 6125 deaths], patients with baseline PTH ≥600 pg/mL had 11% [95% confidence interval (CI), 9-13%] shorter lifespan, and 18% (95% CI, 14-23%) of this effect was mediated through weight loss ≥2.5%. CONCLUSIONS: Secondary hyperparathyroidism may be a novel mechanism of wasting, corroborating experimental data, and, among chronic dialysis patients, this pathway may be a mediator between elevated PTH levels and mortality. 
Future research should determine whether PTH-lowering therapy can limit weight loss and improve longer term dialysis outcomes.


Subjects
Secondary Hyperparathyroidism, Weight Loss, Humans, Secondary Hyperparathyroidism/epidemiology, Secondary Hyperparathyroidism/etiology, Parathyroid Hormone, Renal Dialysis/adverse effects
5.
J Am Soc Nephrol ; 30(1): 127-135, 2019 01.
Article in English | MEDLINE | ID: mdl-30559143

ABSTRACT

BACKGROUND: Population rates of obesity, hypertension, diabetes, age, and race can be used in simulation models to develop projections of ESRD incidence and prevalence. Such projections can inform long-range planning for ESRD resource needs. METHODS: We used an open compartmental simulation model to estimate the incidence and prevalence of ESRD in the United States through 2030 on the basis of wide-ranging projections of population obesity and ESRD death rates. Population trends in age, race, hypertension, and diabetes were based on data from the Centers for Disease Control and Prevention's National Health and Nutrition Examination Survey and the US Census. RESULTS: The increase in ESRD incidence rates within age and race groups has leveled off and/or declined in recent years, but our model indicates that population changes in age and race distribution, obesity and diabetes prevalence, and ESRD survival will result in an 11%-18% increase in the crude incidence rate from 2015 to 2030. This incidence trend, along with reductions in ESRD mortality, will increase the number of patients with ESRD by 29%-68% during the same period, to between 971,000 and 1,259,000 in 2030. CONCLUSIONS: The burden of ESRD will increase in the United States population through 2030 due to demographic, clinical, and lifestyle shifts in the population and improvements in RRT. Planning for ESRD resource allocation should allow for substantial continued growth in the population of patients with ESRD. Future interventions should be directed to preventing the progression of CKD to kidney failure.
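The open compartmental approach described in this abstract can be sketched in miniature: each simulated year adds incident ESRD cases to the prevalent pool and removes deaths. The function and parameters below are illustrative placeholders, not the study's model or USRDS estimates.

```python
def project_prevalence(prevalent, incidence, death_rate, years):
    """Toy open-compartment projection: yearly inflow of incident
    cases, outflow proportional to the prevalent pool."""
    trajectory = [prevalent]
    for _ in range(years):
        prevalent = prevalent + incidence - death_rate * prevalent
        trajectory.append(prevalent)
    return trajectory

# Illustrative run: 700k prevalent, 120k incident/yr, 17% annual mortality
print(project_prevalence(700_000, 120_000, 0.17, 3))
```

A full model of this kind would stratify the compartments by age, race, and diabetes status and let incidence and death rates vary over time, which is what drives the projected 29%-68% growth above.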


Subjects
Cause of Death, Diabetes Mellitus/epidemiology, Hypertension/epidemiology, Chronic Kidney Failure/epidemiology, Obesity/epidemiology, Aged, Aged 80 and over, Comorbidity, Diabetes Mellitus/diagnosis, Female, Humans, Hypertension/diagnosis, Incidence, Chronic Kidney Failure/physiopathology, Male, Middle Aged, Statistical Models, Obesity/diagnosis, Predictive Value of Tests, Prevalence, Survival Analysis, Time Factors, United States/epidemiology
6.
Biometrics ; 74(2): 734-743, 2018 06.
Article in English | MEDLINE | ID: mdl-28771674

ABSTRACT

We propose a C-index (index of concordance) applicable to recurrent event data. The present work addresses the dearth of measures for quantifying a regression model's ability to discriminate with respect to recurrent event risk. The data which motivated the methods arise from the Dialysis Outcomes and Practice Patterns Study (DOPPS), a long-running prospective international study of end-stage renal disease patients on hemodialysis. We derive the theoretical properties of the measure under the proportional rates model (Lin et al., 2000), and propose computationally convenient inference procedures based on perturbed influence functions. The methods are shown through simulations to perform well in moderate samples. Analysis of hospitalizations among a cohort of DOPPS patients reveals substantial improvement in discrimination upon adding country indicators to a model already containing basic clinical and demographic covariates, and further improvement upon adding a relatively large set of comorbidity indicators.
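For orientation, the classical single-event Harrell's C-index that this recurrent-event measure generalizes can be computed by a simple pairwise count. This is a sketch only; the paper's proportional-rates extension and its perturbed influence-function inference are not reproduced here.

```python
def harrell_c(times, risks, events):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when subject i has an observed event
    strictly before time j; it is concordant when i also carries the
    higher risk score. Ties in risk score count as half-concordant.
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # censored subjects cannot anchor a comparable pair
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable

# Perfectly anti-ranked risks give C = 1.0
print(harrell_c([1, 2, 3], [3, 2, 1], [1, 1, 1]))  # -> 1.0
```

The recurrent-event version replaces the single failure time per subject with a comparison of cumulative event counts under the Lin et al. (2000) proportional rates model.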


Subjects
Computer Simulation, Statistical Data Interpretation, Hospitalization/statistics & numerical data, Chronic Kidney Failure, Adult, Comorbidity, Demography/statistics & numerical data, Humans, Chronic Kidney Failure/therapy, Statistical Models, Recurrence, Regression Analysis, Renal Dialysis
7.
BMC Nephrol ; 18(1): 330, 2017 Nov 09.
Article in English | MEDLINE | ID: mdl-29121874

ABSTRACT

BACKGROUND: Anemia management protocols in hemodialysis (HD) units differ conspicuously regarding optimal intravenous (IV) iron dosing; consequently, patients receive markedly different cumulative exposures to IV iron and erythropoiesis-stimulating agents (ESAs). Complementary to IV iron safety studies, our goal was to gain insight into optimal IV iron dosing by estimating the effects of IV iron doses on Hgb, TSAT, ferritin, and ESA dose in common clinical practice. METHODS: 9,471 HD patients (11 countries, 2009-2011) in the DOPPS, a prospective cohort study, were analyzed. Associations of IV iron dose (3-month average, categorized as 0, <300, ≥300 mg/month) with 3-month change in Hgb, TSAT, ferritin, and ESA dose were evaluated using adjusted GEE models. RESULTS: Relative change: Monotonically positive associations between IV iron dose and Hgb, TSAT, and ferritin change, and inverse associations with ESA dose change, were observed across multiple strata of prior Hgb, TSAT, and ferritin levels. Absolute change: TSAT, ferritin, and ESA dose changes were nearest zero with IV iron <300 mg/month, rather than 0 mg/month or ≥300 mg/month by maintenance or replacement dosing. Findings were robust to numerous sensitivity analyses. CONCLUSIONS: Though residual confounding cannot be ruled out in this observational study, findings suggest that IV iron dosing <300 mg/month, as commonly seen with maintenance dosing of 100-200 mg/month, may be a more effective approach to support Hgb than the higher IV iron doses (300-400 mg/month) often given in many European and North American hemodialysis clinics. Alongside studies supporting the safety of IV iron in the 100-200 mg/month dose range, these findings help guide the rational dosing of IV iron in anemia management protocols for everyday hemodialysis practice.


Subjects
Anemia/blood, Anemia/drug therapy, Disease Management, Iron/administration & dosage, Iron/blood, Renal Dialysis/trends, Intravenous Administration, Aged, Anemia/epidemiology, Cohort Studies, Female, Humans, Internationality, Male, Middle Aged, Prospective Studies, Renal Dialysis/adverse effects, Treatment Outcome
9.
Transplantation ; 101(5): e170-e177, 2017 05.
Article in English | MEDLINE | ID: mdl-28221244

ABSTRACT

BACKGROUND: The association of HLA mismatching with kidney allograft survival has been well established. We examined whether amino acid (AA) mismatches (MMs) at the antigen recognition site of HLA molecules represent independent and incremental risk factors for kidney graft failure (GF) beyond those MMs assessed at the antigenic (2-digit) specificity. METHODS: Data on 240,024 kidney transplants performed between 1987 and 2009 were obtained from the Scientific Registry of Transplant Recipients. We imputed HLA-A, -B, and -DRB1 alleles and corresponding AA polymorphisms from antigenic specificity through the application of statistical and population genetics inferences. GF risk was evaluated using Cox proportional-hazards regression models adjusted for covariates including patient and donor risk factors and HLA antigen MMs. RESULTS: We show that estimated AA MMs at particular positions in the peptide-binding pockets of the HLA-DRB1 molecule account for a significant incremental risk that was independent of the well-known association of HLA antigen MMs with graft survival. A statistically significant linear relationship between the estimated number of AA MMs and risk of GF was observed for HLA-DRB1 in deceased donor and living donor transplants. This relationship was strongest during the first 12 months after transplantation (hazard ratio, 1.30 per 15 DRB1 AA MM; P < 0.0001). CONCLUSIONS: This study shows that independent of the well-known association of HLA antigen (2-digit specificity) MMs with kidney graft survival, estimated AA MMs at peptide-binding sites of the HLA-DRB1 molecule account for an important incremental risk of GF.


Subjects
Graft Rejection/immunology, Graft Survival/immunology, HLA Antigens/genetics, Kidney Transplantation, Genetic Polymorphism, Follow-Up Studies, Genetic Markers, HLA Antigens/immunology, Humans, Linear Models, Proportional Hazards Models
12.
Am J Kidney Dis ; 69(3): 367-379, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27866963

ABSTRACT

BACKGROUND: High interdialytic weight gain (IDWG) is associated with adverse outcomes in hemodialysis (HD) patients. We identified temporal and regional trends in IDWG, predictors of IDWG, and associations of IDWG with clinical outcomes. STUDY DESIGN: Analysis 1: sequential cross-sections to identify facility- and patient-level predictors of IDWG and their temporal trends. Analysis 2: prospective cohort study to assess associations between IDWG and mortality and hospitalization risk. SETTING & PARTICIPANTS: 21,919 participants on HD therapy for 1 year or longer in the Dialysis Outcomes and Practice Patterns Study (DOPPS) phases 2 to 5 (2002-2014). PREDICTORS: Analysis 1: study phase, patient demographics and comorbid conditions, HD facility practices. Analysis 2: relative IDWG, expressed as percentage of post-HD weight (<0%, 0%-0.99%, 1%-2.49%, 2.5%-3.99% [reference], 4%-5.69%, and ≥5.7%). OUTCOMES: Analysis 1: relative IDWG as a continuous variable using linear mixed models; analysis 2: mortality; all-cause and cause-specific hospitalization using Cox regression, adjusting for potential confounders. RESULTS: From phase 2 to 5, IDWG declined in the United States (-0.29 kg; -0.5% of post-HD weight), Canada (-0.25 kg; -0.8%), and Europe (-0.22 kg; -0.5%), with more modest declines in Japan and Australia/New Zealand. Among modifiable factors associated with IDWG, the most notable was facility mean dialysate sodium concentration: every 1-mEq/L greater dialysate sodium concentration was associated with 0.13 (95% CI, 0.11-0.16) greater relative IDWG. Compared to relative IDWG of 2.5% to 3.99%, there was elevated risk for mortality with relative IDWG ≥5.7% (adjusted HR, 1.23; 95% CI, 1.08-1.40) and elevated risk for fluid-overload hospitalization with relative IDWG ≥4% (HRs of 1.28 [95% CI, 1.09-1.49] and 1.64 [95% CI, 1.27-2.13] for relative IDWGs of 4%-5.69% and ≥5.7%, respectively). LIMITATIONS: Possible residual confounding. No dietary salt intake data.
CONCLUSIONS: Reductions in IDWG during the past decade were partially explained by reductions in dialysate sodium concentration. Focusing quality improvement strategies on reducing occurrences of high IDWG may improve outcomes in HD patients.


Subjects
Chronic Kidney Failure/therapy, Renal Dialysis, Weight Gain, Female, Hospitalization/statistics & numerical data, Humans, Chronic Kidney Failure/mortality, Male, Middle Aged, Outcome Assessment (Health Care), Physicians' Practice Patterns, Prospective Studies, Time Factors
13.
Semin Dial ; 27(1): 72-7, 2014.
Article in English | MEDLINE | ID: mdl-24400803

ABSTRACT

Hemodialysis (HD) catheter-related infection (CRI) and septicemia contribute to adverse outcomes. The impact of seasonality and prophylactic dialysis practices during high-risk periods remains unexplored. This multicenter study analyzed DOPPS data from 12,122 HD patients (from 442 facilities) to determine the association between seasonally related climatic variables and CRI and septicemia. Climatic variables were determined by linkage to National Climatic Data Center of National Oceanic and Atmospheric Administration data. Catheter care protocols were examined to determine if they could mitigate infection risk during high-risk seasons. Survival models were used to estimate the adjusted hazard ratio (AHR) of septicemia by season and by facility catheter dressing protocol. The overall catheter-related septicemia rate was 0.47 per 1000 catheter days. It varied by season, with an AHR for summer of 1.46 (95% CI: 1.19-1.80) compared with winter. Septicemia was associated with temperature (AHR = 1.07; 95% CI: 1.02-1.13; p < 0.001). Dressing protocols using chlorhexidine (AHR of septicemia = 0.55; 95% CI: 0.39-0.78) were associated with the fewest episodes of CRI or septicemia. Higher catheter-related septicemia in summer may be due to seasonal conditions (e.g., heat, perspiration) that facilitate bacterial growth and compromise protective measures. Extra vigilance and use of chlorhexidine-based dressing protocols may provide prophylaxis against CRI and septicemia.


Subjects
Catheter-Related Infections/epidemiology, Indwelling Catheters/adverse effects, Renal Dialysis, Seasons, Sepsis/epidemiology, Temperature, Local Anti-Infective Agents/administration & dosage, Bandages, Catheter-Related Infections/prevention & control, Chlorhexidine/administration & dosage, Factual Databases, Humans, Povidone-Iodine/administration & dosage, Proportional Hazards Models, Sepsis/prevention & control
14.
Kidney Int ; 85(1): 158-65, 2014 Jan.
Article in English | MEDLINE | ID: mdl-23802192

ABSTRACT

Mortality rates for maintenance hemodialysis patients are much higher than in the general population and are even greater soon after starting dialysis. Here we analyzed mortality patterns in 86,886 patients in 11 countries, focusing on the early dialysis period, using data from the Dialysis Outcomes and Practice Patterns Study, a prospective cohort study of in-center hemodialysis. The primary outcome was all-cause mortality, using time-dependent Cox regression, stratified by study phase and adjusted for age, sex, race, and diabetes. The main predictor was time since dialysis start, divided into early (up to 120 days), intermediate (121-365 days), and late (over 365 days) periods. Mortality rates (deaths/100 patient-years) were 26.7 (95% confidence intervals 25.6-27.9), 16.9 (16.2-17.6), and 13.7 (13.5-14.0) in the early, intermediate, and late periods, respectively. In each country, mortality was higher in the early compared to the intermediate period, with a range of adjusted mortality ratios from 3.10 (2.22-4.32) in Japan to 1.15 (0.87-1.53) in the United Kingdom. Adjusted mortality rates were similar for intermediate and late periods. The ratio of early to intermediate period mortality rates increased with age. Within each period, mortality was higher in the United States than in most other countries. Thus, internationally, the early hemodialysis period is a high-risk time for all countries studied, with substantial differences in mortality between countries. Efforts to improve outcomes should focus on the transition period and the first few months of dialysis.


Subjects
Chronic Kidney Failure/mortality, Renal Dialysis/mortality, Age Factors, Aged, Female, Humans, Internationality, Chronic Kidney Failure/therapy, Male, Middle Aged, Prospective Studies, Sex Factors
15.
Diabetes Care ; 35(12): 2527-32, 2012 Dec.
Article in English | MEDLINE | ID: mdl-22912431

ABSTRACT

OBJECTIVE: Lowering hemoglobin A(1c) to <7% reduces the risk of microvascular complications of diabetes, but the importance of maintaining this target in diabetes patients with kidney failure is unclear. We evaluated the relationship between A(1c) levels and mortality in an international prospective cohort study of hemodialysis patients. RESEARCH DESIGN AND METHODS: Included were 9,201 hemodialysis patients from 12 countries (Dialysis Outcomes and Practice Patterns Study 3 and 4, 2006-2010) with type 1 or type 2 diabetes and at least one A(1c) measurement during the first 8 months after study entry. Associations between A(1c) and mortality were assessed with Cox regression, adjusting for potential confounders. RESULTS: The association between A(1c) and mortality was U-shaped. Compared with an A(1c) of 7-7.9%, the hazard ratios (95% CI) for A(1c) levels were 1.35 (1.09-1.67) for <5%, 1.18 (1.01-1.37) for 5-5.9%, 1.21 (1.05-1.41) for 6-6.9%, 1.16 (0.94-1.43) for 8-8.9%, and 1.38 (1.11-1.71) for ≥9.0%, after adjustment for age, sex, race, BMI, serum albumin, years of dialysis, serum creatinine, 12 comorbid conditions, insulin use, hemoglobin, LDL cholesterol, country, and study phase. Diabetes medications were prescribed for 35% of patients with A(1c) <6% and not prescribed for 29% of those with A(1c) ≥9%. CONCLUSIONS: A(1c) levels strongly predicted mortality in hemodialysis patients with type 1 or type 2 diabetes. Mortality increased as A(1c) moved further from 7-7.9%; thus, target A(1c) in hemodialysis patients may encompass values higher than those recommended by current guidelines. Adjusting glucose-lowering regimens to target A(1c) levels within this range may improve outcomes in dialysis patients.


Subjects
Type 1 Diabetes Mellitus/metabolism, Type 1 Diabetes Mellitus/mortality, Type 2 Diabetes Mellitus/metabolism, Type 2 Diabetes Mellitus/mortality, Glycated Hemoglobin/metabolism, Renal Dialysis/mortality, Female, Humans, Male, Middle Aged
17.
Kidney Int ; 68(1): 330-7, 2005 Jul.
Article in English | MEDLINE | ID: mdl-15954924

ABSTRACT

BACKGROUND: The international Dialysis Outcomes and Practice Patterns Study (DOPPS I and II) allows description of variations in kidney transplantation and wait-listing from nationally representative samples of 18- to 65-year-old hemodialysis patients. The present study examines the health status and socioeconomic characteristics of United States patients, the role of for-profit versus not-for-profit status of dialysis facilities, and the likelihood of transplant wait-listing and transplantation rates. METHODS: Analyses of transplantation rates were based on 5267 randomly selected DOPPS I patients in dialysis units in the United States, Europe, and Japan who received chronic hemodialysis therapy for at least 90 days in 2000. Left-truncated Cox regression was used to assess time to kidney transplantation. Logistic regression determined the odds of being transplant wait-listed for a cross-section of 1323 hemodialysis patients in the United States in 2000. Furthermore, kidney transplant wait-listing was determined in 12 countries from cross-sectional samples of DOPPS II hemodialysis patients in 2002 to 2003 (N= 4274). RESULTS: Transplantation rates varied widely, from very low in Japan to 25-fold higher in the United States and 75-fold higher in Spain (both P values <0.0001). Factors associated with higher rates of transplantation included younger age, nonblack race, less comorbidity, fewer years on dialysis, higher income, and higher education levels. The likelihood of being wait-listed showed wide variation internationally and by United States region but not by for-profit dialysis unit status within the United States. CONCLUSION: DOPPS I and II confirmed large variations in kidney transplantation rates by country, even after adjusting for differences in case mix. Facility size and, in the United States, profit status were not associated with varying transplantation rates.
International results consistently showed higher transplantation rates for younger, healthier, better-educated, and higher income patients.


Subjects
Chronic Kidney Failure/economics, Chronic Kidney Failure/epidemiology, Kidney Transplantation/statistics & numerical data, Renal Dialysis/statistics & numerical data, Waiting Lists, Adolescent, Adult, Age Distribution, Aged, Europe/epidemiology, Female, Health Services Accessibility/statistics & numerical data, Health Status, Hospital Hemodialysis Units/statistics & numerical data, For-Profit Hospitals/statistics & numerical data, Voluntary Hospitals/statistics & numerical data, Humans, Japan/epidemiology, Chronic Kidney Failure/surgery, Chronic Kidney Failure/therapy, Kidney Transplantation/economics, Male, Middle Aged, Racial Groups/statistics & numerical data, Renal Dialysis/economics, Social Class, United States/epidemiology
18.
Am J Transplant ; 5(4 Pt 2): 950-7, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15760420

ABSTRACT

This article provides detailed explanations of the methods frequently employed in outcomes analyses performed by the Scientific Registry of Transplant Recipients (SRTR). All aspects of the analytical process are discussed, including cohort selection, post-transplant follow-up analysis, outcome definition, ascertainment of events, censoring, and adjustments. The methods employed for descriptive analyses are described, such as unadjusted mortality rates and survival probabilities, and the estimation of covariate effects through regression modeling. A section on transplant waiting time focuses on the kidney and liver waiting lists, pointing out the different considerations each list requires and the larger questions that such analyses raise. Additionally, this article describes specialized modeling strategies recently designed by the SRTR and aimed at specific organ allocation issues. The article concludes with a description of simulated allocation modeling (SAM), which has been developed by the SRTR for three organ systems: liver, thoracic organs, and kidney-pancreas. SAMs are particularly useful for comparing outcomes for proposed national allocation policies. The use of SAMs has already helped in the development and implementation of a new policy for liver candidates with high MELD scores to be offered organs regionally before the organs are offered to candidates with low MELD scores locally.


Subjects
Kidney Transplantation/statistics & numerical data, Liver Transplantation/statistics & numerical data, Research, Statistical Data Interpretation, Graft Survival, Humans, Patient Selection, Waiting Lists
19.
Am J Kidney Dis ; 44(5 Suppl 2): 39-46, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15486873

ABSTRACT

BACKGROUND: Nutritional markers are important predictors of morbidity and mortality in dialysis patients. The Clinical Practice Guidelines for Nutrition in Chronic Renal Failure provide guidelines for assessing nutritional status that were evaluated using data from the Dialysis Outcomes and Practice Patterns Study (DOPPS). METHODS: The levels of various nutritional markers (serum albumin, modified subjective global assessment, serum creatinine, normalized protein catabolic rate [nPCR], and body mass index) were described for representative samples of patients and facilities from 7 countries (France, Germany, Italy, Spain, Japan, United Kingdom, and United States) participating in the DOPPS. RESULTS: A strong inverse association was observed between mortality and serum albumin, with a mortality risk 1.38 times higher for patients with serum albumin concentration less than 3.5 g/dL (35 g/L). There were significant differences by country in the proportion of moderately and severely malnourished patients as determined by the modified subjective global assessment score. In the US sample, severely and moderately malnourished patients had a higher mortality risk compared with those not malnourished, 33% and 5% higher, respectively. An inverse relationship was observed between serum creatinine concentration and mortality, with a mortality risk 60% to 70% higher in the lowest quartile group compared with the highest quartile group in Europe and the United States. Levels of nPCR varied significantly between European countries, and there was no association between mortality and nPCR in US data. After adjustment for demographic and comorbidity factors, the mortality risk decreased as body mass index increased in both US and European samples. CONCLUSION: DOPPS data highlight the importance of routine assessment of nutritional status, using multiple parameters, in clinical practice to improve patient care.


Subjects
Nutritional Physiological Phenomena, Renal Dialysis, Body Mass Index, Creatinine/blood, Humans, Chronic Kidney Failure/therapy, Nutritional Status, Outcome and Process Assessment (Health Care), Practice Guidelines as Topic, Renal Dialysis/standards, Serum Albumin
20.
Am J Kidney Dis ; 44(5 Suppl 2): 47-53, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15486874

ABSTRACT

Analyses based on the National Cooperative Dialysis Study (NCDS) provided the impetus for routine quantification of delivered dialysis dose in hemodialysis practice throughout the world, by suggesting minimum targets for small solute (urea) clearance. Morbidity and mortality in dialysis populations remain high despite many technological advances in dialysis delivery. A number of observational studies reported an association between higher dialysis dose, as measured by Kt/V urea or urea reduction ratio, and lower mortality risk. During the 1990s, a steady increase in dialysis dose and a modest reduction in mortality on dialysis were observed. However, observational studies only reveal associations and are limited by selection bias and confounding. The Kidney Disease Outcomes Quality Initiative guidelines on dialysis adequacy are based on results of observational studies and expert opinion. Since the NCDS, the HEMO Study was the first major randomized clinical trial designed to study the effect of dose of dialysis and dialyzer flux on patient outcomes. Despite adequate separation of dose and flux, however, results of the trial did not prove a beneficial effect of higher dose. The Dialysis Outcomes and Practice Patterns Study (DOPPS), a major international effort designed to examine the effect of practice patterns on outcomes, has made significant contributions to the topic of dialysis dose. The following review critically examines data from observational studies, including the DOPPS, and from the HEMO Study, emphasizing important lessons from both, and discusses future paradigms for achieving dialysis adequacy to improve patient outcomes.


Subjects
Outcome and Process Assessment (Health Care), Renal Dialysis, Humans, Chronic Kidney Failure/therapy, Practice Guidelines as Topic, Renal Dialysis/standards