ABSTRACT
OBJECTIVE: The objective of this study was to examine whether more advanced kidney failure is associated with sedentary behavior and whether demographic, comorbidity, nutritional, and inflammatory markers explain this association. DESIGN: Observational study. SETTING: Outpatient clinics and dialysis units. SUBJECTS: One hundred sixty patients with chronic kidney disease (CKD) or receiving maintenance hemodialysis (MHD). METHODS: Standardized questionnaires, including the Baecke physical activity questionnaire, a standardized anthropometric examination, and a blood draw. MAIN OUTCOME MEASURE: Sedentary behavior (defined as answering "very often" to "during leisure time I watch television" or answering "never" to "during leisure time I walk") and being physically active (top 25th percentile of the total Baecke score). RESULTS: Nineteen percent of CKD and 50% of MHD patients were sedentary (P < .001), and 38.8% of CKD and 11.3% of MHD patients were physically active. In separate multivariable logistic regression models, compared with CKD patients, MHD patients were more sedentary (odds ratio 3.84; 95% confidence interval, 1.18-12.51) and less physically active (odds ratio 0.07; 95% confidence interval, 0.01-0.40) independent of demographics, comorbidity, smoking, body size, serum high-sensitivity C-reactive protein (hsCRP), and albumin. Congestive heart failure, peripheral vascular disease, and higher body mass index were independently associated with sedentary behavior, whereas younger age, lower body mass index, lower serum hsCRP, and higher serum albumin were associated with being physically active. CONCLUSIONS: Sedentary behavior is highly prevalent among diabetic CKD or MHD patients. The strong association of MHD status with sedentary behavior is not explained by demographic, smoking, comorbidity, nutritional, or inflammatory markers. Interventions targeting obesity might improve sedentary behavior and physical activity, whereas interventions targeting inflammation might improve physical activity in these populations.
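The outcome definitions and modeling described above can be illustrated with a brief sketch. This is not the study's code: the data are simulated and all variable names (watch_tv, walk_leisure, baecke_total, group, hscrp, and so on) are hypothetical; it simply shows how the two binary outcomes might be derived and entered into separate multivariable logistic models.

    # Hypothetical sketch (simulated data, invented column names) of the outcome
    # definitions and multivariable logistic regression described in the abstract.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 160  # mirrors the study's sample size; every value below is simulated

    df = pd.DataFrame({
        "watch_tv":     rng.choice(["never", "sometimes", "often", "very often"], n),
        "walk_leisure": rng.choice(["never", "sometimes", "often", "very often"], n),
        "baecke_total": rng.normal(7.5, 1.5, n),
        "group":        rng.choice(["CKD", "MHD"], n),
        "age":          rng.normal(60, 12, n),
        "bmi":          rng.normal(28, 5, n),
        "hscrp":        rng.lognormal(0.5, 1.0, n),
        "albumin":      rng.normal(4.0, 0.4, n),
    })

    # Outcome definitions from the abstract: sedentary = watching TV "very often"
    # or never walking during leisure time; active = top quartile of Baecke score.
    df["sedentary"] = ((df.watch_tv == "very often") | (df.walk_leisure == "never")).astype(int)
    df["active"] = (df.baecke_total >= df.baecke_total.quantile(0.75)).astype(int)

    # Separate model for each outcome, with MHD vs CKD as the exposure of interest.
    fit = smf.logit("sedentary ~ C(group, Treatment('CKD')) + age + bmi + hscrp + albumin",
                    data=df).fit(disp=False)
    print(np.exp(fit.params))      # odds ratios
    print(np.exp(fit.conf_int()))  # 95% confidence intervals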
Subjects
Diabetes Complications/complications, Renal Dialysis, Chronic Renal Insufficiency/complications, Sedentary Behavior, Aged, Comorbidity, Diabetes Complications/blood, Female, Humans, Inflammation/blood, Male, Middle Aged, Nutritional Status, Chronic Renal Insufficiency/blood, Surveys and Questionnaires
ABSTRACT
BACKGROUND: The effect of lipid-lowering therapy on clinical outcomes in peritoneal dialysis patients has not been carefully addressed. STUDY DESIGN: Secondary analysis of a retrospective cohort study. SETTING & PARTICIPANTS: Data from 1,053 incident peritoneal dialysis patients from the US Renal Data System prospective Dialysis Morbidity and Mortality Study Wave 2. PREDICTOR: Use of lipid-modifying medications (93% statins, 7% other medications). OUTCOMES & MEASUREMENTS: Cox regression with propensity score adjustment was used to evaluate time to cardiovascular or all-cause mortality during a 2-year follow-up period. Subgroups based on predefined cutoff values for serum total cholesterol or triglycerides, presence of diabetes, and comorbidity index were analyzed separately. RESULTS: Use of lipid-modifying medications was associated with decreased all-cause (hazard ratio [HR], 0.74; 95% confidence interval, 0.56 to 0.98) and cardiovascular (HR, 0.67; 95% confidence interval, 0.47 to 0.95) mortality compared with no use of lipid-modifying medications. In subgroup analyses, use of lipid-modifying medications was associated with decreased all-cause mortality in the subgroups with cholesterol levels of 226 to 275 mg/dL (HR, 0.46; 95% confidence interval, 0.22 to 0.95) and greater than 275 mg/dL (HR, 0.27; 95% confidence interval, 0.09 to 0.80), and with decreased cardiovascular mortality (HR, 0.31; 95% confidence interval, 0.11 to 0.85) in the subgroup with cholesterol levels of 226 to 275 mg/dL. Use of lipid-modifying medications also was associated with decreased cardiovascular mortality (HR, 0.64; 95% confidence interval, 0.41 to 0.99) in patients with diabetes and decreased all-cause (HR, 0.65; 95% confidence interval, 0.45 to 0.94) and cardiovascular (HR, 0.55; 95% confidence interval, 0.35 to 0.87) mortality in those with a Charlson Comorbidity Index score higher than 2. LIMITATIONS: Observational study with a retrospective design; a considerable amount of missing data and limited information at the extreme values of cholesterol and triglycerides. CONCLUSIONS: These observational data suggest that lipid-modifying medication therapy may be associated with improved clinical outcomes in peritoneal dialysis patients.
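A minimal sketch of the analytic approach named above (Cox regression with propensity score adjustment) is shown below. It is not the study's code: the data are simulated, the covariate set is illustrative, and a simple covariate-adjustment use of the propensity score is assumed rather than any specific matching or weighting scheme.

    # Hypothetical sketch: propensity-score-adjusted Cox regression for mortality
    # among users vs non-users of lipid-modifying medication (all data simulated).
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 1053  # mirrors the cohort size; every value below is simulated

    df = pd.DataFrame({
        "lipid_rx":    rng.integers(0, 2, n),   # 1 = lipid-modifying medication
        "age":         rng.normal(58, 14, n),
        "diabetes":    rng.integers(0, 2, n),
        "charlson":    rng.integers(0, 8, n),
        "cholesterol": rng.normal(210, 45, n),
        "time_years":  rng.exponential(1.5, n).clip(0.05, 2.0),  # 2-year follow-up
        "death":       rng.integers(0, 2, n),
    })

    # Step 1: propensity score = predicted probability of receiving treatment
    # given baseline covariates.
    covs = ["age", "diabetes", "charlson", "cholesterol"]
    ps_model = LogisticRegression(max_iter=1000).fit(df[covs], df["lipid_rx"])
    df["pscore"] = ps_model.predict_proba(df[covs])[:, 1]

    # Step 2: Cox model for all-cause mortality, adjusted for the propensity score.
    cph = CoxPHFitter()
    cph.fit(df[["time_years", "death", "lipid_rx", "pscore"]],
            duration_col="time_years", event_col="death")
    cph.print_summary()  # hazard ratio for lipid_rx with its 95% CI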
Subjects
Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use, Hypolipidemic Agents/therapeutic use, Chronic Kidney Failure/mortality, Peritoneal Dialysis, Adult, Aged, Cardiovascular Diseases/mortality, Comorbidity, Diabetic Nephropathies/epidemiology, Diabetic Nephropathies/mortality, Diabetic Nephropathies/therapy, Female, Humans, Hypercholesterolemia/drug therapy, Hypercholesterolemia/epidemiology, Chronic Kidney Failure/epidemiology, Chronic Kidney Failure/therapy, Male, Middle Aged, Retrospective Studies, Survival Analysis, Treatment Outcome
ABSTRACT
BACKGROUND: In observational studies, higher uric acid levels are associated with metabolic syndrome, diabetes, and kidney disease. OBJECTIVE: The objective of this study was to examine whether reduction of plasma uric acid with febuxostat, a xanthine oxidoreductase inhibitor, affects adipose tissue oxidative stress, adipokines, and markers of systemic inflammation or kidney fibrosis. DESIGN: This was a double-blinded randomized controlled trial. SETTING: Academic university setting. PATIENTS: Overweight or obese adults with hyperuricemia and type 2 diabetic nephropathy were included. MEASUREMENTS: Adipose tissue thiobarbituric acid-reactive substances (TBARS) and adiponectin concentrations and urinary transforming growth factor-β (TGF-β) were the primary endpoints. Plasma C-reactive protein, high-molecular-weight adiponectin, interleukin-6, tumor necrosis factor-α, TBARS, and albuminuria were among the predefined secondary endpoints. METHODS: Participants were randomly assigned to febuxostat (n = 40) or matching placebo (n = 40) and followed for 24 weeks. RESULTS: Baseline plasma uric acid levels were 426 ± 83 µmol/L; 95% of participants completed the study. Estimated glomerular filtration rate (eGFR) declined from 54 ± 17 mL/min/1.73 m2 at baseline to 51 ± 17 mL/min/1.73 m2 at 24 weeks (P = .05). In separate mixed-effects models, compared with placebo, febuxostat reduced uric acid by 50% (P < .001) but had no significant effects on subcutaneous adipose tissue TBARS (-7.4%; 95% confidence interval [CI], -57.4% to 101.4%) or adiponectin (6.7%; 95% CI, -26.0% to 53.8%) levels, on the urinary TGF-β/creatinine ratio (18.0%; 95% CI, -10.0% to 54.8%), or on the secondary endpoints. LIMITATIONS: Relatively modest sample size and short duration of follow-up. CONCLUSIONS: In this population with progressive diabetic nephropathy, febuxostat effectively reduced plasma uric acid. However, no detectable effects were observed for the prespecified primary or secondary endpoints. TRIAL REGISTRATION: The study was registered at clinicaltrials.gov (NCT01350388).
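The "separate mixed-effects models" mentioned in the results could be illustrated roughly as below. This is a hedged sketch, not the trial's analysis: the data are simulated, the endpoint is assumed to be log-transformed, and only a random intercept per participant with a treatment-by-visit interaction is modeled.

    # Hypothetical sketch: linear mixed-effects model for a repeated-measures
    # endpoint (e.g., log adipose TBARS at baseline and 24 weeks), febuxostat vs
    # placebo. All values and variable names are simulated/assumed.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 80  # 40 febuxostat + 40 placebo, as in the trial; data are simulated
    subjects = np.arange(n)
    treatment = np.repeat(["febuxostat", "placebo"], n // 2)

    long = pd.DataFrame({
        "subject":   np.tile(subjects, 2),
        "treatment": np.tile(treatment, 2),
        "week":      np.repeat([0, 24], n),
        "log_tbars": rng.normal(1.0, 0.3, 2 * n),
    })

    # Random intercept per subject; the treatment-by-week interaction estimates the
    # between-group difference in change from baseline to 24 weeks.
    model = smf.mixedlm("log_tbars ~ C(treatment, Treatment('placebo')) * week",
                        data=long, groups=long["subject"])
    print(model.fit().summary())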
ABSTRACT
BACKGROUND: The shortage of organ donors for kidney transplants has made expansion of the kidney donor pool a clinically significant issue. Previous studies suggest that kidneys from donors with a history of intravenous (IV) drug use, cigarette smoking, and/or alcohol dependency are considered a risky choice. However, these kidneys could potentially be used to expand the kidney supply pool if there is no evidence of an association with adverse transplant outcomes. METHODS: This study analyzed the United Network for Organ Sharing dataset from 1994 to 1999 using Kaplan-Meier survival analysis and Cox modeling. The effects on transplant outcome (graft and recipient survival) were examined with respect to the donors' IV drug use, cigarette smoking, and alcohol dependency. Recipient, donor, and transplant procedure covariates were included in the Cox models. RESULTS: The results show that a donor history of cigarette smoking is a statistically significant risk factor for both graft survival (hazard ratio [HR] = 1.05, P < 0.05) and recipient survival (HR = 1.06, P < 0.05), whereas neither IV drug use nor alcohol dependency had a significant adverse impact on graft or recipient survival. CONCLUSIONS: Assuming that adequate testing for potential infections is performed, there is no evidence to support avoiding kidneys from donors with a history of IV drug use or alcohol dependency in transplantation. Utilizing these kidneys would clearly expand the potential pool of donor organs.
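A rough sketch of the kind of analysis named above (Kaplan-Meier curves by donor history plus a Cox model with recipient, donor, and procedure covariates) follows. It is not the study's code; the dataset is simulated and all column names are invented.

    # Hypothetical sketch: Kaplan-Meier graft survival by donor smoking history and
    # a Cox model including donor-history covariates (simulated data throughout).
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(3)
    n = 5000  # simulated records; the actual UNOS analysis used far more

    df = pd.DataFrame({
        "time_years":    rng.exponential(6.0, n).clip(0.01, 10),
        "graft_failure": rng.integers(0, 2, n),
        "donor_smoked":  rng.integers(0, 2, n),
        "donor_ivdu":    rng.integers(0, 2, n),
        "donor_alcohol": rng.integers(0, 2, n),
        "recipient_age": rng.normal(48, 14, n),
        "cold_ischemia": rng.normal(18, 6, n),
    })

    # Kaplan-Meier graft survival stratified by donor smoking history.
    km = KaplanMeierFitter()
    for smoked, grp in df.groupby("donor_smoked"):
        km.fit(grp["time_years"], grp["graft_failure"], label=f"donor smoked = {smoked}")
        print(km.median_survival_time_)

    # Cox model: donor-history variables plus recipient/procedure covariates.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_years", event_col="graft_failure")
    cph.print_summary()  # HR per covariate, e.g., donor_smoked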
Subjects
Alcoholism/complications, Graft Rejection/etiology, Graft Survival, Kidney Transplantation, Smoking/adverse effects, Intravenous Substance Abuse/complications, Tissue Donors, Female, Follow-Up Studies, Graft Rejection/epidemiology, Humans, Incidence, Kidney Transplantation/mortality, Kidney Transplantation/statistics & numerical data, Male, Middle Aged, Proportional Hazards Models, Retrospective Studies, Risk Factors, Survival Rate, Time Factors, United States/epidemiology, Waiting Lists
ABSTRACT
Cardiovascular disease (CVD) leads to increased mortality rates among renal transplant recipients; however, its effect on allograft survival has not been well studied. Records from the United States Renal Data System and the United Network for Organ Sharing from January 1, 1995, through December 31, 2002, were examined in this retrospective study. The outcome variables were allograft survival time and recipient survival time. The primary variable of interest was CVD, defined as the presence of at least one of the following: cardiac arrest, myocardial infarction, dysrhythmia, congestive heart failure, ischemic heart disease, peripheral vascular disease, and unstable angina. Cox models were adjusted for potential confounding factors. Of the 105,181 patients in the data set, 20,371 had a diagnosis of CVD. The presence of CVD had an adverse effect on allograft survival time (HR 1.12, p < 0.001) and recipient survival time (HR 1.41, p < 0.001). Among the subcategories, congestive heart failure (HR 1.14, p < 0.005) and dysrhythmia (HR 1.26, p < 0.05) had adverse effects on allograft survival time. In addition to increasing mortality rates, CVD at the time of end-stage renal disease onset is also a significant risk factor for renal allograft failure. Further research is needed to evaluate the role of specific forms of CVD in allograft and recipient outcomes.
Subjects
Cardiovascular Diseases/complications, Kidney Transplantation/mortality, Kidney Transplantation/statistics & numerical data, Adult, Cardiac Arrhythmias/complications, Case-Control Studies, Child, Factual Databases, Female, Follow-Up Studies, Graft Survival, Heart Failure/complications, Humans, Living Donors, Male, Retrospective Studies, Risk Factors, Survival Analysis, Time Factors, Homologous Transplantation, Treatment Outcome, United States
ABSTRACT
BACKGROUND: End-stage renal disease is associated with illness-induced disruptions that challenge patients and their families to accommodate and adapt. However, the impact of patients' marital status on kidney transplant outcome has never been studied. This project, based on data from the United States Renal Data System (USRDS), examines how marriage affects renal transplant outcome. METHODS: Data were collected from the USRDS on all kidney/kidney-pancreas allograft recipients between January 1, 1995, and June 30, 2002, who were 18 years or older and had information about their marital status prior to kidney transplantation (n = 2,061). Survival analysis was performed using Kaplan-Meier methods and Cox proportional hazards modeling to control for confounding variables. RESULTS: The overall findings of this study suggest that being married has a significant protective effect on death-censored graft survival [hazard ratio (HR) 0.80, p < 0.05] but a non-significant effect on recipient survival (HR 0.85, p = 0.122). When stratified by gender, the effect was still present in males for death-censored graft survival (HR 0.75, p < 0.05), but not for recipient survival (HR 0.86, p = 0.24). The effect was not observed in females, in whom neither graft (HR 0.90, p = 0.55) nor recipient (HR 0.8, p = 0.198) survival was associated with marital status. In subgroup analysis, a similar association was found in recipients of a single transplant. CONCLUSION: Based on our analysis, being married in the pre-transplant period is associated with a positive outcome for graft, but not for recipient, survival. When analyzed separately, the effect is present in male, but not female, recipients.
Subjects
Kidney Transplantation/psychology, Marital Status, Adult, Ethnicity, Female, Humans, Chronic Kidney Failure/etiology, Chronic Kidney Failure/surgery, Kidney Transplantation/mortality, Male, Medical Records, Middle Aged, Proportional Hazards Models, Racial Groups, Survival Analysis, Tissue Donors/statistics & numerical data, Treatment Outcome
ABSTRACT
BACKGROUND: The role of traditional risk factors, including plasma lipids, in the pathogenesis of cardiovascular (CV) disease in chronic dialysis patients is unclear. Previous studies have suggested that lower serum total cholesterol (TC) is associated with higher mortality in patients on chronic haemodialysis (HD). Whether this relationship is specific to the HD population or is common to the uraemic state is unclear. The present study evaluated the association of serum TC and triglycerides with clinical outcomes in chronic peritoneal dialysis (PD) patients. METHODS: Data on 1053 PD patients from the United States Renal Data System (USRDS) prospective Dialysis Morbidity and Mortality Study Wave 2 were examined. Cox regression was used to evaluate the relationship between lipid levels and mortality. RESULTS: Patients with TC levels ≤125 mg/dl (3.24 mmol/l) had a statistically significantly increased risk of all-cause mortality, whether or not they were taking lipid-modifying medications, compared with the reference range of 176-225 mg/dl (4.54-5.83 mmol/l). In stratified analysis, this association was demonstrated in patients with serum albumin >3.0 g/dl (30 g/l), but not in those with albumin ≤3.0 g/dl. Compared with patients with triglyceride levels of 201-300 mg/dl (2.27-3.39 mmol/l), a statistically significant reduction of all-cause, but not CV, mortality was observed in patients with triglyceride levels of 101-200 mg/dl (1.14-2.26 mmol/l), as well as in the subgroup with serum albumin levels <3.0 g/dl (30 g/l) and triglycerides of ≤100 mg/dl (1.13 mmol/l) or 101-200 mg/dl (1.14-2.26 mmol/l). CONCLUSIONS: While confounding factors and causal pathways have not been clearly identified, aggressive lowering of plasma cholesterol in PD patients is not supported by this study; however, treatment of hypertriglyceridaemia may be warranted at triglyceride levels >200 mg/dl (2.26 mmol/l).
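One way to set up the kind of analysis described above (lipid levels modeled in bands against a chosen reference range, with a stratified look by serum albumin) is sketched below. This is not the study's code: the data are simulated, the covariate set is minimal, and the band edges simply echo those quoted in the abstract.

    # Hypothetical sketch: Cox model with total cholesterol in bands (reference
    # 176-225 mg/dl) and an albumin-stratified re-fit. All data are simulated.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(4)
    n = 1053  # mirrors the PD cohort size; every value below is simulated

    df = pd.DataFrame({
        "tc":         rng.normal(200, 50, n),
        "albumin":    rng.normal(3.5, 0.5, n),
        "time_years": rng.exponential(2.0, n).clip(0.05, 5),
        "death":      rng.integers(0, 2, n),
    })

    # Cholesterol bands as quoted in the abstract (mg/dl); reference = 176-225.
    bands = pd.cut(df["tc"], bins=[-np.inf, 125, 175, 225, 275, np.inf],
                   labels=["<=125", "126-175", "176-225", "226-275", ">275"])
    dummies = pd.get_dummies(bands, prefix="tc").drop(columns="tc_176-225").astype(float)

    cox_df = pd.concat([df[["time_years", "death", "albumin"]], dummies], axis=1)
    cph = CoxPHFitter()
    cph.fit(cox_df, duration_col="time_years", event_col="death")
    cph.print_summary()  # HR for each band relative to 176-225 mg/dl

    # Stratified analysis, e.g., restricted to patients with albumin > 3.0 g/dl.
    cph.fit(cox_df[df["albumin"] > 3.0], duration_col="time_years", event_col="death")
    cph.print_summary()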
Subjects
Chronic Kidney Failure/blood, Lipids/blood, Renal Dialysis/methods, Adult, Aged, Cholesterol/blood, Cohort Studies, Female, Humans, Kaplan-Meier Estimate, Chronic Kidney Failure/mortality, Chronic Kidney Failure/therapy, Male, Middle Aged, Proportional Hazards Models, Renal Dialysis/statistics & numerical data, Risk Factors, Survival Rate, Triglycerides/blood
ABSTRACT
BACKGROUND/AIM: While the familial nature of chronic kidney disease (CKD) has been recognized, it has primarily been defined from studies of first-degree relatives of selected sets of cases. The goal of this study was to evaluate the familial clustering of end-stage renal disease (ESRD) and CKD mortality using a population-based genealogy of Utah. This is the first population-based analysis of the familial component of ESRD and non-ESRD CKD. METHODS: We defined two distinct patient groups for this analysis: individuals whose death certificates in the Utah Population Database indicated ESRD (n = 192) or non-ESRD CKD (n = 335) as the cause of death. Two measures of familiality were used: (1) the relative risk (RR) of CKD or ESRD death in relatives of cases and (2) an average relatedness statistic, the Genealogical Index of Familiality. RESULTS: The RR for dying with ESRD among first-degree relatives of individuals dying with ESRD is estimated to be 10.1 (p = 0.0007; 95% confidence interval [CI], 2.76-25.95), but is not significantly elevated among second-degree relatives. The RR for dying with non-ESRD CKD among first- and second-degree relatives of individuals dying with non-ESRD CKD was 3.89 (p = 0.0051; 95% CI, 1.43-8.46) and 3.11 (p = 0.04; 95% CI, 0.85-7.95), respectively. The Genealogical Index of Familiality statistic demonstrated that individuals dying with ESRD are significantly more related than expected in this population (p = 0.013); significant excess relatedness was also observed for individuals dying with non-ESRD CKD (p = 0.006), suggesting a familial component for both, with evidence for common environmental and genetic effects. CONCLUSION: The results of this analysis of individuals dying with ESRD and non-ESRD CKD support a significant and independent familial component to both conditions, suggesting that a heritable factor plays a role.
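The first familiality measure above, the relative risk of the outcome among relatives of cases, can be illustrated with a simplified worked example. This is a hedged sketch only: the counts are invented, and the actual Utah Population Database analysis (including the Genealogical Index of Familiality, a kinship-based Monte Carlo statistic) is considerably more involved.

    # Hypothetical worked example (invented counts): familial RR as observed affected
    # first-degree relatives of cases divided by the number expected from the rate
    # among relatives of matched controls, with a simple Poisson test for excess.
    from scipy import stats

    affected_relatives_of_cases = 12
    relatives_of_cases = 900            # first-degree relatives of cases
    affected_relatives_of_controls = 11
    relatives_of_controls = 9000        # relatives of a 10:1 matched control set

    expected = relatives_of_cases * (affected_relatives_of_controls / relatives_of_controls)
    rr = affected_relatives_of_cases / expected
    print(f"familial RR = {rr:.1f}")    # ~10.9 with these invented counts

    # One-sided Poisson p-value for observing at least this many affected relatives.
    p = stats.poisson.sf(affected_relatives_of_cases - 1, expected)
    print(f"p = {p:.2g}")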
Subjects
Genealogy and Heraldry, Chronic Kidney Failure/genetics, Chronic Kidney Failure/mortality, Pedigree, Population, Causes of Death, Genetic Databases, Death Certificates, Evaluation Studies as Topic, Genetic Predisposition to Disease/epidemiology, Humans, Chronic Kidney Failure/epidemiology, Registries, Risk Factors, Utah/epidemiology
ABSTRACT
BACKGROUND: There has been a general trend towards shortened length of post-kidney transplant hospitalization (LOH). The decision regarding patients' discharge from the hospital may theoretically be based on several factors, including, but not limited to, patient well-being, insurance status, family situation, and other, mostly socio-economic, factors, as opposed to hard medical evidence. However, the appropriate LOH for kidney transplant recipients has not been well studied with respect to long-term outcomes. METHODS: This study retrospectively analysed the association between LOH and graft and recipient survival based on the United States Renal Data System dataset. In total, 100,762 patients who underwent transplantation during 1995-2002 were included. Kaplan-Meier survival analysis and Cox models were applied to the whole patient cohort and to sub-groups stratified by the presence of delayed graft function, patient comorbidity index, and donor type (deceased or living). RESULTS: For recipient survival, both short (<4 d) and long (>5 d) LOH showed a significant adverse effect (p < 0.01) on survival times. In the analysis of graft survival, long LOH (≥2 wk) also showed significant adverse effects (p < 0.001) on survival times. Short LOH (<4 d) did not reach statistical significance, although it was still associated with adverse effects on graft survival. These observations were consistent across the whole patient cohort and in sub-groups stratified by the presence of delayed graft function, patient comorbidity index, and donor type. CONCLUSION: Clinical considerations should be used to decide the appropriate time of discharge after kidney transplantation. Based on this study, post-transplant hospitalization shorter than 4 d may potentially be harmful to long-term graft and recipient survival.
Subjects
Graft Survival/physiology, Kidney Transplantation/physiology, Length of Stay, Adult, Female, Humans, Kidney Transplantation/mortality, Male, Middle Aged, Patient Discharge, Survival Analysis
ABSTRACT
There is controversy regarding the influence of genetic versus environmental factors on kidney transplant outcome in minority groups. The goal of this project was to evaluate the role of certain socioeconomic factors in allograft and recipient survival. Graft and recipient survival data from the United States Renal Data System were analyzed using Cox modeling, with primary variables of interest including recipient education level, citizenship, and primary source of payment for medical services. College (hazard ratio [HR] 0.93, P < 0.005) and postcollege education (HR 0.85, P < 0.005) improved graft outcome in the whole group and in white patients. Similar trends were observed for recipient survival (HR 0.9, P < 0.005 for college; HR 0.88, P = 0.09 for postcollege education) in the whole population and in white patients. Resident aliens had significantly better graft outcomes than US citizens in the entire patient population (HR 0.81, P < 0.001) and in white patients in subgroup analysis (HR 0.823, P < 0.001). A similar effect was observed for recipient survival. Using Medicare as the reference group, there was a statistically significant benefit to graft survival from having private insurance in the whole group (HR 0.87, P < 0.001) and in the black (HR 0.8, P < 0.001) and white (HR 0.89, P < 0.001) subgroups; a similar effect of private insurance was observed on recipient survival in the entire group of patients and across racial groups. Recipients with a higher education level, resident aliens, and patients with private insurance have an advantage in graft and recipient outcomes independent of racial differences.
Subjects
Kidney Transplantation/mortality, Adult, Female, Humans, Male, Socioeconomic Factors, Survival Rate, Treatment Outcome
ABSTRACT
Data comparing long-term outcomes of immunosuppressive protocols are lacking. The goal of this study was to compare kidney transplant outcomes across three common immunosuppressive protocols. A retrospective study of graft and recipient survival was performed using US Renal Data System data (n = 31,012) from January 1, 1995, through December 31, 1999, with follow-up through December 31, 2000, for patients receiving prednisone + cyclosporine + mycophenolate mofetil (PCM; n = 17,108), prednisone + tacrolimus + mycophenolate mofetil (PTM; n = 7,225), or prednisone + cyclosporine + azathioprine (PCA; n = 6,679). Compared with PCM, an increased risk for allograft failure was associated with PTM (hazard ratio [HR] 1.09; P < 0.05) and PCA (HR 1.15; P < 0.001). Similar associations were demonstrated in the following subgroups: early (before 1997) and late (in or after 1997) transplant periods, living-donor transplants, and adult and kidney-only recipients. This association was also found between the PCA regimen and graft survival in the entire patient population (HR 1.15; P < 0.001) and in the studied subgroups. The PCA regimen (HR 1.15; P < 0.005), but not the PTM regimen (HR 1.01; P = 0.816), was associated with increased recipient mortality in the entire patient population and in patient subgroups. Secondary outcomes (serum creatinine values at given time points, acute rejection rate, and posttransplantation malignancies) are also discussed. These data suggest that a PCM regimen is associated with lower risk for graft failure compared with a PTM regimen, and with lower risk for graft failure and recipient death compared with a PCA regimen.