1.
Am J Kidney Dis; 45(1): 127-35, 2005 Jan.
Article in English | MEDLINE | ID: mdl-15696452

ABSTRACT

BACKGROUND: Benefits in terms of reductions in mortality corresponding to improvements in Kidney Disease Outcomes Quality Initiative (K/DOQI) compliance for adequacy of dialysis dose and anemia control have not been documented in the literature. We studied changes in achieving K/DOQI guidelines at the facility level to determine whether those changes are associated with corresponding changes in mortality. METHODS: Adjusted mortality and fractions of patients achieving K/DOQI guidelines for urea reduction ratios (URRs ≥65%) and hematocrit levels (≥33%) were computed for 2,858 dialysis facilities from 1999 to 2002 using national data for patients with end-stage renal disease. Linear and Poisson regression were used to study the relationship between K/DOQI compliance and mortality and between changes in compliance and changes in mortality. RESULTS: In 2002, facilities in the lowest quintile of K/DOQI compliance for URR and hematocrit guidelines had 22% and 14% greater mortality rates (P < 0.0001) than facilities in the highest quintile, respectively. A 10-percentage-point increase in the fraction of patients with a URR of 65% or greater was associated with a 2.2% decrease in mortality (P = 0.0006), and a 10-percentage-point increase in the percentage of patients with a hematocrit of 33% or greater was associated with a 1.5% decrease in mortality (P = 0.003). Facilities in the highest tertiles of improvement for URR and hematocrit had a change in mortality rates that was 15% better than that observed for facilities in the lowest tertiles (P < 0.0001). CONCLUSION: Both current practice and changes in practice with regard to achieving anemia and dialysis-dose guidelines are significantly associated with mortality outcomes at the dialysis-facility level.


Subject(s)
Anemia/prevention & control , Renal Dialysis/mortality , Urea/blood , Guideline Adherence , Hematocrit/standards , Hematocrit/statistics & numerical data , Hemodialysis Units, Hospital/trends , Humans , Kidney Failure, Chronic/blood , Kidney Failure, Chronic/therapy , Practice Guidelines as Topic/standards , Proportional Hazards Models , Quality Assurance, Health Care/methods , Quality Assurance, Health Care/statistics & numerical data , Renal Dialysis/standards , Retrospective Studies , Survival Analysis , United States/epidemiology
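The compliance-to-mortality figures in entry 1 follow from the log-linear form of a Poisson regression model: a coefficient β per percentage point of compliance implies a multiplicative change of exp(10β) in the mortality rate for a 10-point increase. A minimal Python sketch of that arithmetic (the coefficient below is hypothetical, back-calculated to reproduce the reported 2.2% figure; it is not taken from the study's data):

```python
import math

def implied_mortality_change(beta_per_point, delta_points):
    """Percent change in the mortality rate implied by a log-linear
    (Poisson regression) model when the covariate moves by delta_points."""
    return (math.exp(beta_per_point * delta_points) - 1.0) * 100.0

# Hypothetical coefficient, chosen so that a 10-point rise in URR
# compliance maps to the reported 2.2% mortality decrease
beta = math.log(1.0 - 0.022) / 10.0
change = implied_mortality_change(beta, 10.0)  # roughly -2.2 (percent)
```

The same multiplicative reading applies to the hematocrit result: effects of covariate shifts compound via the exponential, rather than adding.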
2.
JAMA; 294(21): 2726-33, 2005 Dec 07.
Article in English | MEDLINE | ID: mdl-16333008

ABSTRACT

CONTEXT: Transplantation using kidneys from deceased donors who meet the expanded criteria donor (ECD) definition (age ≥60 years, or age 50 to 59 years with at least 2 of the following: history of hypertension, serum creatinine level >1.5 mg/dL [132.6 micromol/L], and cerebrovascular cause of death) is associated with a 70% higher risk of graft failure compared with non-ECD transplants. However, if ECD transplants offer improved overall patient survival, inferior graft outcome may represent an acceptable trade-off. OBJECTIVE: To compare mortality after ECD kidney transplantation vs that in a combined standard-therapy group of non-ECD recipients and those still receiving dialysis. DESIGN, SETTING, AND PATIENTS: Retrospective cohort study using data from a US national registry of mortality and graft outcomes among kidney transplant candidates and recipients. The cohort included 109,127 patients receiving dialysis and added to the kidney waiting list between January 1, 1995, and December 31, 2002, and followed up through July 31, 2004. MAIN OUTCOME MEASURE: Long-term (3-year) relative risk of mortality for ECD kidney recipients vs those receiving standard therapy, estimated using time-dependent Cox regression models. RESULTS: By end of follow-up, 7790 ECD kidney transplants were performed. Because of excess ECD recipient mortality in the perioperative period, cumulative survival did not equal that of standard-therapy patients until 3.5 years posttransplantation. Long-term relative mortality risk was 17% lower for ECD recipients (relative risk, 0.83; 95% confidence interval, 0.77-0.90; P<.001). Subgroups with significant ECD survival benefit included patients older than 40 years, both sexes, non-Hispanics, all races, unsensitized patients, and those with diabetes or hypertension.
In organ procurement organizations (OPOs) with long median waiting times (>1350 days), ECD recipients had a 27% lower risk of death (relative risk, 0.73; 95% confidence interval, 0.64-0.83; P<.001). In areas with shorter waiting times, only recipients with diabetes demonstrated an ECD survival benefit. CONCLUSIONS: ECD kidney transplants should be offered principally to candidates older than 40 years in OPOs with long waiting times. In OPOs with shorter waiting times, in which non-ECD kidney transplant availability is higher, candidates should be counseled that ECD survival benefit is observed only for patients with diabetes.


Subject(s)
Donor Selection/standards , Kidney Transplantation/mortality , Adolescent , Adult , Aged , Algorithms , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Male , Middle Aged , Proportional Hazards Models , Renal Dialysis , Retrospective Studies , Survival Analysis , Waiting Lists
3.
Am J Kidney Dis; 40(2): 381-4, 2002 Aug.
Article in English | MEDLINE | ID: mdl-12148112

ABSTRACT

The standardized mortality ratio (SMR) has been used to provide information about adjusted survival outcomes at dialysis facilities. There has been concern that high rates of transplantation could unjustly lead to unfavorable SMR profiles for individual dialysis units because healthier patients would be removed from dialysis therapy, leaving less healthy patients in the dialysis pool. We correlated 1999 overall adjusted SMR and 1999 standardized transplantation ratio (STR) weighted for mortality patient count and count of first transplantations of patients younger than 65 years. A total of 2,362 facilities were included in analyses. We found no correlation between rates of transplantation (by STR) and overall mortality profile (by SMR) based on Pearson's correlation coefficients (r), either unweighted, weighted by number of patients included in the 1999 mortality calculation (SMR), or weighted by number of patients included in the 1999 transplantation calculation (r = -0.016, r = -0.015, and r = -0.015, respectively; P > 0.40 for each). Sensitivity analyses using SMR and STR over 3- and 3.5-year periods (January 1997 to June 2000) also showed no correlation between SMR and STR, respectively. We conclude that reported standardized rates for transplantation do not correlate with those reported for mortality by dialysis facilities.


Subject(s)
Ambulatory Care Facilities/standards , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Kidney Transplantation/mortality , Kidney Transplantation/standards , Age Factors , Aged , Ambulatory Care Facilities/statistics & numerical data , Data Interpretation, Statistical , Health Services Research/standards , Health Services Research/statistics & numerical data , Humans , Kidney Failure, Chronic/therapy , Kidney Transplantation/statistics & numerical data , Quality of Health Care/statistics & numerical data , Renal Dialysis/mortality , Renal Dialysis/standards , Renal Dialysis/statistics & numerical data , Sex Factors , United States
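The weighted Pearson correlations used in entry 3 can be sketched in a few lines of Python. This is illustrative only; the facility-level SMR, STR, and weight values below are invented and do not come from the study:

```python
def weighted_pearson(x, y, w):
    """Pearson correlation of x and y with observation weights w
    (e.g. facility SMR vs. STR, weighted by facility patient counts)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sw
    var_x = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / sw
    var_y = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / sw
    return cov / (var_x * var_y) ** 0.5

# Toy facility-level data: SMR, STR, and patient-count weights (all invented)
smr = [0.9, 1.1, 1.0, 1.3, 0.8]
str_ = [1.2, 0.9, 1.0, 1.1, 0.7]
wts = [120, 80, 200, 50, 150]
r = weighted_pearson(smr, str_, wts)
```

With all weights equal this reduces to the ordinary Pearson r, which is how the unweighted and weighted variants in the abstract relate to one another.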
4.
Am J Kidney Dis; 43(6): 1014-23, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15168381

ABSTRACT

BACKGROUND: Several observational studies reported lower mortality risk among hemodialysis patients treated with doses greater than the standard dose. The present study evaluates, with observational data, the secondary randomized Hemodialysis (HEMO) Study finding that greater dialysis dose may benefit women, but not men. METHODS: Data from 74,120 US hemodialysis patients starting end-stage renal disease therapy were analyzed. Patients were classified into 1 of 5 categories of hemodialysis dose according to their average urea reduction ratio (URR), and their relative risk (RR) for mortality was evaluated by using Cox proportional hazards models. Similar analyses using equilibrated Kt/V were completed for 10,816 hemodialysis patients in the Dialysis Outcomes and Practice Patterns Study (DOPPS) in 7 countries. RESULTS: For both men and women, RR was substantially lower in the URR 70%-to-75% category compared with the URR 65%-to-70% category. Among women, RR in the URR greater-than-75% category was significantly lower compared with the URR 70%-to-75% group (P < 0.0001); however, no further association with mortality risk was observed for the greater-than-75% category among men (P = 0.22). RR associated with doses greater than the Kidney Disease Outcomes Quality Initiative guidelines (URR ≥65%) was significantly different for men compared with women (P < 0.01). Similar differences by sex were observed in DOPPS analyses. CONCLUSION: The agreement of these observational studies with the HEMO Study supports the existence of a survival benefit from greater dialysis doses for women, but not for men. Responses to greater dialysis dose by sex deserve additional study to explain these differences.


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/methods , Aged , Female , Humans , Male , Middle Aged , Outcome and Process Assessment, Health Care/methods , Outcome and Process Assessment, Health Care/statistics & numerical data , Proportional Hazards Models , Sex Distribution , Survival Rate
5.
Am J Transplant; 5(4 Pt 2): 934-49, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15760419

ABSTRACT

Using OPTN/SRTR data, this article reviews the state of thoracic organ transplantation in 2003 and the previous decade. Time spent on the heart waiting list has increased significantly over the last decade. The percentage of patients awaiting heart transplantation for >2 years increased from 23% in 1994 to 49% by 2003. However, there has been a general decline in heart waiting list death rates over the decade. In 2003, the lung transplant waiting list reached a record high of 3,836 registrants, up slightly from 2002 and more than threefold since 1994. One-year patient survival for those receiving lungs in 2002 was 82%, a statistically significant improvement from 2001 (78%). The number of patients awaiting a heart-lung transplant, declining since 1998, reached 189 in 2003. Adjusted patient survival for heart-lung recipients is consistently worse than the corresponding rate for isolated lung recipients, primarily due to worse outcomes for heart-lung recipients with congenital heart disease. A new lung allocation system, approved in June 2004, derives from the survival benefit of transplantation with consideration of urgency based on waiting list survival, instead of being based solely on waiting time. A goal of the policy is to minimize deaths on the waiting list.


Subject(s)
Heart Transplantation/statistics & numerical data , Lung Transplantation/statistics & numerical data , Female , Forecasting , Graft Survival , Heart Diseases/mortality , Heart Diseases/surgery , Humans , Male , Racial Groups/statistics & numerical data , Tissue and Organ Procurement/standards , Tissue and Organ Procurement/trends , United States , Waiting Lists
6.
Am J Transplant; 5(4 Pt 2): 950-7, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15760420

ABSTRACT

This article provides detailed explanations of the methods frequently employed in outcomes analyses performed by the Scientific Registry of Transplant Recipients (SRTR). All aspects of the analytical process are discussed, including cohort selection, post-transplant follow-up analysis, outcome definition, ascertainment of events, censoring, and adjustments. The methods employed for descriptive analyses are described, such as unadjusted mortality rates and survival probabilities, and the estimation of covariant effects through regression modeling. A section on transplant waiting time focuses on the kidney and liver waiting lists, pointing out the different considerations each list requires and the larger questions that such analyses raise. Additionally, this article describes specialized modeling strategies recently designed by the SRTR and aimed at specific organ allocation issues. The article concludes with a description of simulated allocation modeling (SAM), which has been developed by the SRTR for three organ systems: liver, thoracic organs, and kidney-pancreas. SAMs are particularly useful for comparing outcomes for proposed national allocation policies. The use of SAMs has already helped in the development and implementation of a new policy for liver candidates with high MELD scores to be offered organs regionally before the organs are offered to candidates with low MELD scores locally.


Subject(s)
Kidney Transplantation/statistics & numerical data , Liver Transplantation/statistics & numerical data , Research , Data Interpretation, Statistical , Graft Survival , Humans , Patient Selection , Waiting Lists
7.
Ann Surg; 242(3): 314-23, discussion 323-5, 2005 Sep.
Article in English | MEDLINE | ID: mdl-16135918

ABSTRACT

OBJECTIVE: The objective of this study was to characterize the patient population with respect to patient selection, assess surgical morbidity and graft failures, and analyze the contribution of perioperative clinical factors to recipient outcome in adult living donor liver transplantation (ALDLT). SUMMARY BACKGROUND DATA: Previous reports have been center-specific or from large databases lacking detailed variables. The Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) represents the first detailed North American multicenter report of recipient risk and outcome aiming to characterize variables predictive of graft failure. METHODS: Three hundred eighty-five ALDLT recipients transplanted at 9 centers were studied with analysis of over 35 donor, recipient, intraoperative, and postoperative variables. Cox regression models were used to examine the relationship of variables to the risk of graft failure. RESULTS: Ninety-day and 1-year graft survival were 87% and 81%, respectively. Fifty-one (13.2%) grafts failed in the first 90 days. The most common causes of graft failure were vascular thrombosis, primary nonfunction, and sepsis. Biliary complications were common (30% early, 11% late). Older recipient age and length of cold ischemia were significant predictors of graft failure. Center experience greater than 20 ALDLT was associated with a significantly lower risk of graft failure. Recipient Model for End-stage Liver Disease score and graft size were not significant predictors. CONCLUSIONS: This multicenter A2ALL experience provides evidence that ALDLT is a viable option for liver replacement. Older recipient age and prolonged cold ischemia time increase the risk of graft failure. Outcomes improve with increasing center experience.


Subject(s)
Liver Transplantation/statistics & numerical data , Living Donors , Postoperative Complications , Adult , Age Factors , Cohort Studies , Female , Graft Survival , Humans , Liver Failure/surgery , Male , Middle Aged , North America/epidemiology , Patient Selection , Retrospective Studies , Survival Analysis , Treatment Outcome
8.
Kidney Int; 62(1): 329-38, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12081595

ABSTRACT

BACKGROUND: Creating a functioning initial arteriovenous (AV) access for aging and diabetic end-stage renal disease (ESRD) hemodialysis patients has been a challenge. METHODS: This study describes 748 consecutive primary AV access creations and their primary (unassisted) and secondary (assisted) access survival at a single center. Twenty-four percent of the patients had diabetes as their cause of ESRD and the average age was 59.6 years. No patient receiving an initial AV access required synthetic graft material. All received an AV fistula. Three types of fistulae were created and their distribution varied significantly for diabetic and non-diabetic patients (respective percentages): forearm AV fistula (24%, 62%), perforating vein fistula (PVF) at the elbow (48%, 21%) and non-PVF at the elbow (29%, 17%). RESULTS: Results of access survival for age groups <65 and 65+ years, male and female, diabetic and non-diabetic subgroups ranged from 51 to 75% for unassisted and from 75 to 96% for assisted two year access survival. PVF appeared to be advantageous over non-PVF access at the elbow. First intervention for peripheral steal syndrome was required at a rate of 7 and 0.6 per 100 patient-years at risk for diabetic and non-diabetic patients, respectively. The thrombosis rates per patient year of 0.03 for non-diabetics and 0.07 for diabetics are superior to previously published results for AV fistulae or for a combined AV fistula-AV graft approach. CONCLUSIONS: Potential explanations for these excellent results among elderly and diabetic patients include preoperative evaluation, exclusive use of native vessels, a variable surgical approach including PVF, and the experience of a single operator.


Subject(s)
Arteriovenous Shunt, Surgical/methods , Renal Dialysis/methods , Age Factors , Aged , Arteriovenous Shunt, Surgical/adverse effects , Female , Humans , Kidney Failure, Chronic/therapy , Male , Middle Aged , Retrospective Studies
9.
Am J Transplant; 4(3): 373-7, 2004 Mar.
Article in English | MEDLINE | ID: mdl-14961989

ABSTRACT

We sought to determine which type of donor graft provides children and young adults with the best outcomes following liver transplantation. Using the US Scientific Registry of Transplant Recipients database, we identified 6467 recipients of first liver transplants during 1989-2000 aged <30 years. We used Cox models to examine adjusted patient and graft outcomes by age (<2, 2-10, 11-16, 17-29 years) and donor graft type [deceased donor full size (DD-F), split (DD-S), living donor (LD)]. For patients aged <2 years, LD grafts had a significantly lower risk of graft failure than DD-S (RR = 0.49, p < 0.0001) and DD-F (RR = 0.70, p = 0.02) and a lower mortality risk than DD-S (RR = 0.71, p = 0.08) during the first year post-transplant. In contrast, older children exhibited a higher risk of graft loss and a trend toward higher mortality associated with LD transplants. In young adults, DD-S transplants were associated with poor outcomes. Three-year follow-up yielded similar graft survival results but no significant differences in mortality risk by graft type within age group. For recipients aged <2 years, LD transplants provide superior graft survival compared with DD-F or DD-S and a trend toward better patient survival compared with DD-S. Living donor is the preferred donor source in the most common pediatric age group (<2 years) undergoing liver transplantation.


Subject(s)
Graft Survival/immunology , Liver Transplantation/immunology , Liver/surgery , Adolescent , Adult , Age Factors , Child , Child, Preschool , Humans , Infant , Liver Transplantation/mortality , Time Factors
10.
Am J Transplant; 4 Suppl 9: 54-71, 2004.
Article in English | MEDLINE | ID: mdl-15113355

ABSTRACT

Analysis of the OPTN/SRTR database demonstrates that, in 2002, pediatric recipients accounted for 7% of all recipients, while pediatric individuals accounted for 14% of deceased organ donors. For children fortunate enough to receive a transplant, there has been continued improvement in outcomes following all forms of transplantation. Current 1-year graft survival is generally excellent, with survival rates following transplantation in many cases equaling or exceeding those of all other recipients. In renal transplantation, despite excellent early graft survival, there is evidence that long-term graft survival for adolescent recipients is well below that of other recipients. A causative role for noncompliance is possible. While the significant improvements in graft and patient survival are laudable, waiting list mortality remains excessive. Pediatric candidates awaiting liver, intestine, and thoracic transplantation face mortality rates generally greater than those of their adult counterparts. This finding is particularly pronounced in patients aged 5 years and younger. While mortality awaiting transplantation is an important consideration in refining organ allocation strategies, it is important to realize that other issues, in addition to mortality, are critical for children. Consideration of the impact of end-stage organ disease on growth and development is often equally important, both while awaiting and after transplantation.


Subject(s)
Transplantation/statistics & numerical data , Age Distribution , Child , Databases, Factual , Heart Transplantation/mortality , Heart Transplantation/statistics & numerical data , Humans , Immunosuppression Therapy/methods , Intestines/transplantation , Kidney Transplantation/mortality , Kidney Transplantation/statistics & numerical data , Liver Transplantation/mortality , Liver Transplantation/statistics & numerical data , Lung Transplantation/mortality , Lung Transplantation/statistics & numerical data , Survival Analysis , Tissue Donors/statistics & numerical data , Transplantation/trends , Transplantation, Homologous/mortality , Transplantation, Homologous/statistics & numerical data , Waiting Lists
11.
Am J Transplant; 4 Suppl 9: 106-13, 2004.
Article in English | MEDLINE | ID: mdl-15113359

ABSTRACT

It is highly desirable to base decisions designed to improve medical practice or organ allocation policies on the analyses of the most recent data available. Yet there is often a need to balance this desire with the added value of evaluating long-term outcomes (e.g. 5-year mortality rates), which requires the use of data from earlier years. This article explains the methods used by the Scientific Registry of Transplant Recipients in order to achieve these goals simultaneously. The analysis of waiting list and transplant outcomes depends strongly on statistical methods that can combine data from different cohorts of patients that have been followed for different lengths of time. A variety of statistical methods have been designed to address these goals, including the Kaplan-Meier estimator, Cox regression models, and Poisson regression. An in-depth description of the statistical methods used for calculating waiting times associated with the various types of organ transplants is provided. Risk of mortality and graft failure, adjusted analyses, cohort selection, and the many complicating factors surrounding the calculation of follow-up time for various outcomes analyses are also examined.


Subject(s)
Research/trends , Transplantation/methods , Cohort Studies , Humans , Patient Selection , Research Design , Transplantation/mortality , Transplantation/statistics & numerical data , Treatment Failure , Treatment Outcome , Waiting Lists
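Of the estimators named in entry 11, the Kaplan-Meier estimator is the simplest to sketch. A minimal pure-Python version is shown below, illustrative only: registry analyses additionally involve adjustment, cohort selection, and the follow-up-time rules the article describes, and the toy data are invented.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. death) was observed, 0 if censored
    Returns [(t, S(t))] at each distinct event time.
    """
    data = sorted(zip(times, events))
    n = len(data)
    curve, s, i = [], 1.0, 0
    while i < n:
        t = data[i][0]
        tied = [e for tt, e in data[i:] if tt == t]  # all subjects with time t
        d = sum(tied)                  # events observed at time t
        if d:
            s *= 1 - d / (n - i)       # n - i subjects still at risk at t
            curve.append((t, s))
        i += len(tied)
    return curve

# Toy cohort: four subjects, one censored at t = 3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

Censored subjects leave the risk set without forcing the survival curve down, which is exactly what lets the methods described above combine cohorts followed for different lengths of time.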