Results 1 - 20 of 198
1.
Transplant Proc ; 44(7): 2223-6, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22974959

ABSTRACT

To maximize deceased donation, it is necessary to facilitate organ recovery from expanded criteria donors (ECDs). Utilization of donors meeting the kidney definition for ECDs increases access to kidney transplantation and reduces waiting times; however, ECDs often do not proceed to kidney recovery. Based on a prospective study of three Organ Procurement Organizations in the United States, we describe the characteristics of donors meeting the Organ Procurement and Transplant Network (OPTN) ECD kidney definition (donor age 60+ or donor age 50-60 years with two of the following: final serum creatinine > 1.5 mg/dL, history of hypertension, or death from cerebral vascular accident) who donated a liver without kidney recovery. ECDs with organs recovered between February 2003 and September 2005 by New England Organ Bank, Gift of Life Michigan, and LifeChoice Donor Services were studied (n = 324). All donors were declared dead by neurological criteria. Data on a wide range of donor characteristics were collected, including donor demographics, medical history, cause of death, donor status during hospitalization, serological status, and donor kidney quality. Logistic regression models were used to identify donor characteristics predictive of liver-alone donation. Seventy-four of the 324 donors fulfilling the ECD definition for kidneys donated a liver alone (23%). History of diabetes, final serum creatinine > 1.5 mg/dL, age 70+, and presence of proteinuria were associated with liver-alone donation in univariate models. On multivariate analysis, only final serum creatinine > 1.5 mg/dL and age 70+ were independently predictive of liver donation alone. Older age and elevated serum creatinine may be perceived as stronger contraindications to kidney donation than the remaining elements of the ECD definition. It is likely that at least a proportion of these liver-alone donors represent missed opportunities for kidney transplantation.
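For orientation, the OPTN ECD kidney rule quoted above can be written as a small classification helper. This is an illustrative sketch based only on the definition given in the abstract; the function and parameter names are hypothetical.

```python
def is_ecd_kidney_donor(age, final_creatinine_mg_dl, history_of_hypertension, death_from_cva):
    """Classify a deceased donor against the OPTN expanded criteria donor (ECD)
    kidney definition as stated in the abstract above (illustrative only)."""
    if age >= 60:
        return True
    if 50 <= age < 60:
        # Donors aged 50-60 qualify when at least two of the three listed
        # risk factors are present.
        risk_factors = [
            final_creatinine_mg_dl > 1.5,
            history_of_hypertension,
            death_from_cva,
        ]
        return sum(risk_factors) >= 2
    return False

# Example: a 55-year-old hypertensive donor who died of a cerebrovascular accident
print(is_ecd_kidney_donor(55, 1.2, True, True))  # True (two of three risk factors)
```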


Subject(s)
Kidney Transplantation , Tissue Donors , Cohort Studies , Humans , Middle Aged , Predictive Value of Tests , Prospective Studies
2.
Am J Transplant ; 11(8): 1712-8, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21672159

ABSTRACT

In 2003, the US kidney allocation system was changed to eliminate priority for HLA-B similarity. We report outcomes from before and after this change using data from the Scientific Registry of Transplant Recipients (SRTR). Analyses were based on 108 701 solitary deceased donor kidney recipients during the 6 years before and after the policy change. Racial/ethnic distributions of recipients in the two periods were compared (chi-square); graft failures were analyzed using Cox models. In the 6 years before and after the policy change, the overall number of deceased donor transplants rose 23%, with a larger increase for minorities (40%) and a smaller increase for non-Hispanic whites (whites) (8%). The increase in the proportion of transplants for non-whites versus whites was highly significant (p < 0.0001). Two-year graft survival improved for all racial/ethnic groups after implementation of this new policy. Findings confirmed prior SRTR predictions. Following elimination of allocation priority for HLA-B similarity, the deficit in transplantation rates among minorities compared with that for whites was reduced but not eliminated; furthermore, there was no adverse effect on graft survival.


Subject(s)
HLA-B Antigens/immunology , Health Policy , Histocompatibility Testing , Kidney Transplantation , Graft Survival , Humans , Population Groups , Tissue Donors , United States
5.
Am J Transplant ; 8(4 Pt 2): 988-96, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18336701

ABSTRACT

Transplant tourism, where patients travel to foreign countries specifically to receive a transplant, is poorly characterized. This study examined national data to determine the minimum scope of this practice. US national waiting list removal data were analyzed. Waiting list removals for transplant without a corresponding US transplant in the database were reviewed via a data validation query to transplant centers to identify foreign transplants. Additionally, waiting list removal records with text field entries indicating a transplant abroad were identified. We identified 373 foreign transplants (173 directly noted; 200 from data validation); most (89.3%) were kidney transplants. Between 2001 and 2006, the annual number of waiting list removals for transplant abroad increased. Male sex, Asian race, resident and nonresident alien status and college education were significantly and independently associated with foreign transplant. Recipients from 34 states, plus the District of Columbia, received foreign transplants in 35 countries, led by China, the Philippines and India. Transplants in foreign countries among waitlisted candidates in the US are increasingly performed. The data reported here represent the minimum number of cases and the full extent of this practice cannot be determined using existing data. Additional reporting requirements are needed.


Subject(s)
Transplantation/statistics & numerical data , Waiting Lists , Asia , Geography , Humans , Registries/statistics & numerical data , Tissue and Organ Procurement/statistics & numerical data , Travel , United States
6.
Am J Transplant ; 8(4): 783-92, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18294347

ABSTRACT

We examined factors associated with expanded criteria donor (ECD) kidney discard. Scientific Registry of Transplant Recipients (SRTR)/Organ Procurement and Transplantation Network (OPTN) data were examined for donor factors using logistic regression to determine the adjusted odds ratio (AOR) of discard of kidneys recovered between October 1999 and June 2005. Logistic and Cox regression models were used to determine associations with delayed graft function (DGF) and graft failure. Of the 12,536 recovered ECD kidneys, 5,139 (41%) were discarded. Both the performance of a biopsy (AOR = 1.21, p = 0.02) and the degree of glomerulosclerosis (GS) on biopsy were significantly associated with increased odds of discard. GS was not consistently associated with DGF or graft failure. The discard rate of pumped ECD kidneys was 29.7% versus 43.6% for unpumped (AOR = 0.52, p < 0.0001). Among pumped kidneys, those with resistances of 0.26-0.38 and >0.38 mmHg/mL/min were discarded more often than those with resistances of 0.18-0.25 mmHg/mL/min (AOR = 2.5 and 7.9, respectively). Among ECD kidneys, pumped kidneys were less likely to have DGF (AOR = 0.59, p < 0.0001) but not less likely to have graft failure (RR = 0.9, p = 0.27). Biopsy findings and machine perfusion are important correlates of ECD kidney discard; corresponding associations with graft failure require further study.
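As a rough illustration of the modelling approach described above (not the authors' actual SRTR/OPTN analysis), a logistic regression for discard with machine-perfusion resistance treated as a categorical factor might look like the sketch below; the column names are hypothetical stand-ins.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_discard_model(df: pd.DataFrame) -> pd.Series:
    """Fit a logistic regression for kidney discard and return adjusted odds ratios.

    Hypothetical columns: discarded (1 = discarded, 0 = transplanted), pumped (0/1),
    resistance_band (e.g. '0.18-0.25', '0.26-0.38', '>0.38' mmHg/mL/min),
    biopsy_done (0/1), gs_percent (glomerulosclerosis on biopsy, %).
    """
    model = smf.logit(
        "discarded ~ pumped + C(resistance_band) + biopsy_done + gs_percent",
        data=df,
    ).fit(disp=False)
    # Exponentiated coefficients approximate adjusted odds ratios (AOR).
    return np.exp(model.params)
```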


Subject(s)
Kidney , Patient Selection , Tissue Donors/supply & distribution , Biopsy , Cadaver , Death , Humans , Kidney/cytology , Kidney/pathology , Kidney Transplantation/statistics & numerical data , Liver , Liver Transplantation/statistics & numerical data , Living Donors/supply & distribution , Perfusion/methods , Registries , Treatment Outcome , United States , Waiting Lists
7.
Nefrologia ; 27(4): 496-504, 2007.
Article in Spanish | MEDLINE | ID: mdl-17944588

ABSTRACT

The increased mortality risk in hemodialysis (HD) patients unable to meet six targets in different areas of HD practice has been reported previously. We used a prevalent cross-sectional sample of Spanish HD patients (n = 613) from the second stage of the Dialysis Outcomes and Practice Patterns Study to determine the percentage with low dialysis dose, hyperphosphatemia, hypercalcemia, hypoalbuminemia, anemia, and catheter use; based on the mortality hazard ratios and the total HD population in Spain according to the Spanish Society of Nephrology Report, we then estimated the number of patient life years that could potentially be gained in our country. These characteristics of HD practice were selected because each is modifiable through changes in practice, each is associated with mortality, and each has a large number of patients outside the target guidelines. The targets that define "within guidelines" are as follows: dialysis dose (single pool Kt/V >1.2), anemia (hemoglobin >110 g/L), albumin after standardization (>40 g/L), serum phosphorus (1.1-1.5 mmol/L), serum calcium (2.1-2.4 mmol/L), and facility catheter use (<10%). Cox proportional hazards regression models were used to calculate the relative risk of mortality for all patients outside each guideline. In all models, calcium values were adjusted for low serum albumin. A separate Cox survival model adjusted for all six HD practices simultaneously to account for correlation that may exist between some facility practices. All models were adjusted for age, sex, race, time on ESRD, and 14 summary comorbid conditions. Patient years attributable to each of the six practice patterns were estimated and are reported here as the potential patient years gained. Comparison of the estimates by individual guideline shows that, in Spain, raising albumin above 40 g/L in all patients would lead to an estimated gain of 9,269 patient years (a 7.9% increase). Additionally, if all facilities could decrease catheter use to less than 10%, 2,842 patient years could be gained (a 2.4% increase). Though it may be an unrealistic goal, if all Spanish patients currently outside the guidelines achieved all six target levels, an estimated 17,300 life years could be gained over the next five years (a 15% increase). A more achievable goal of bringing 50% of patients who are currently outside targets within targets would result in 9,266 life years gained. In conclusion, this analysis suggests large opportunities to improve HD patient care in Spain.
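The abstract does not spell out the estimation procedure, but a crude population-attributable-fraction calculation conveys the flavour of how "life years gained" figures of this kind can be approximated. The sketch below is illustrative only, with hypothetical inputs, and is not the DOPPS methodology.

```python
def life_years_gained(pct_outside_target, hazard_ratio, annual_deaths, years=5):
    """Rough estimate of patient years gained if every patient outside a practice
    target were brought within it.

    Illustrative approximation only, not the DOPPS estimation procedure; it assumes
    the hazard ratio is causal and counts one patient year per averted death per
    year of the horizon.
    """
    p, hr = pct_outside_target, hazard_ratio
    # Population attributable fraction: share of deaths attributable to being
    # outside the target.
    paf = p * (hr - 1.0) / (p * (hr - 1.0) + 1.0)
    deaths_averted_per_year = paf * annual_deaths
    return deaths_averted_per_year * years

# Hypothetical inputs: 60% of patients below the albumin target, HR 1.4,
# 3,000 deaths per year in the dialysis population, 5-year horizon.
print(round(life_years_gained(0.60, 1.4, 3000, years=5)))
```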


Subject(s)
Kidney Failure, Chronic/therapy , Practice Patterns, Physicians' , Renal Dialysis/standards , Guideline Adherence , Humans , Kidney Failure, Chronic/mortality , Prospective Studies , Risk Assessment , Spain , Time Factors
8.
Acta Clin Belg ; 62(2): 102-10, 2007.
Article in English | MEDLINE | ID: mdl-17547291

ABSTRACT

BACKGROUND: Various organizations have published clinical practice guidelines for the care of haemodialysis patients. However, it is unknown to what extent improving or even reaching perfect compliance with guidelines would improve the survival of HD patients in Belgium. METHODS: Using data from the second phase of the Dialysis Outcomes and Practice Patterns Study (DOPPS), the proportion of haemodialysis patients failing to meet six key practice targets (Kt/V ≥ 1.2, haemoglobin ≥ 11 g/dl, phosphate 1.1-1.5 mmol/l, calcium 2.1-2.4 mmol/l, albumin ≥ 40 g/l, and facility catheter use ≤ 10%) was calculated along with the relative risk of mortality associated with being outside these targets. The life years potentially gained from adherence to the six targets, both separately and all six together, were then estimated. RESULTS: The percentages of patients outside the targets were as follows: 30.3%, Kt/V; 33.6%, haemoglobin; 56.2%, phosphate; 58.2%, calcium; 67.1%, albumin; and 91.1%, catheter use. The estimated patient life years gained with improved guideline compliance were highest for albumin (3,670) and catheter use (2,331) but still substantial for the other four targets (ranging from 551 to 1,258). If all six practices were brought within target in 100% of patients, the total gain would reach 7,516 patient years. A conservative estimate of bringing 50% of patients within all targets still yields a survival improvement of 3,958 patient years. CONCLUSION: This analysis suggests large opportunities to improve HD patient care in Belgium. The avoidance of HD catheters, with the use of AV fistulas whenever possible, should be given high priority. Admittedly, these calculations assume a causality, or partial causality, that has not been definitively proven. Still, even if causality is only partial, the results indicate that the gains from adherence to clinical guideline targets could be substantial, and Belgian nephrologists and dialysis unit staff should pursue every reasonable effort to achieve them.


Subject(s)
Guideline Adherence , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Life Expectancy , Renal Dialysis , Belgium , Cross-Sectional Studies , Humans , Practice Guidelines as Topic
9.
G Ital Nefrol ; 24(3): 221-9, 2007.
Article in Italian | MEDLINE | ID: mdl-17554734

ABSTRACT

Knowing the relative risk (RR) of mortality associated with being outside the guideline targets and the percentage of patients in this situation, it is possible to estimate the number of patient life years that could be gained from adhering to guideline recommendations. We used a prevalent cross-sectional sample of 576 Italian patients from the Dialysis Outcomes and Practices Patterns Study (DOPPS) phase II (2002-2004) to determine the percentage of patients who failed to meet the Italian Society of Nephrology's targets for dialysis dose (spKt/V ≥ 1.3), anemia management (hemoglobin ≥ 11 g/dL), and mineral metabolism (serum calcium and phosphorus: ≤ 2.6 and ≤ 1.8 mmol/L, respectively), and the National Kidney Foundation's Kidney Disease Outcomes Quality Initiative (K/DOQI) targets for nutritional status (serum albumin ≥ 4 g/dL) and vascular access (facility catheter use ≤ 10%). We used a larger random sample of DOPPS patients to establish the adjusted RRs of mortality associated with the 6 examined targets. The percentage of patients outside the targets and the adjusted RRs were 34% and 1.12 for dialysis dose, 37.7% and 1.20 for anemia management, 40.8% and 1.14 for phosphorus, 14.4% and 1.22 for calcium, 62.5% and 1.46 for albumin, and 40.1% and 1.20 for facility catheter use. The adjusted sum of life years potentially gained by complete adherence to all 6 guidelines was 25,156 over a period of 5 years (2006-2010); a more conservative estimate, modeling life years potentially gained by bringing half of all patients outside targets within them, was 13,382. In conclusion, this analysis suggests opportunities to improve hemodialysis patient care in Italy. The magnitude of potential savings in life years should encourage greater adherence to guidelines and practices that are significantly associated with better survival.


Subject(s)
Guideline Adherence , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Renal Dialysis/standards , Cross-Sectional Studies , Humans , Italy , Prospective Studies , Survival Rate
11.
Am J Transplant ; 7(5 Pt 2): 1412-23, 2007.
Article in English | MEDLINE | ID: mdl-17428289

ABSTRACT

This article focuses on geographic variability in patient access to kidney transplantation in the United States. It examines geographic differences and trends in access rates to kidney transplantation, in the component rates of wait-listing, and of living and deceased donor transplantation. Using data from Centers for Medicare and Medicaid Services and the Organ Procurement and Transplantation Network/Scientific Registry of Transplant Recipients, we studied 700,000+ patients under 75, who began chronic dialysis treatment, received their first living donor kidney transplant, or were placed on the waiting list pre-emptively. Relative rates of wait-listing and transplantation by State were calculated using Cox regression models, adjusted for patient demographics. There were geographic differences in access to the kidney waiting list and to a kidney transplant. Adjusted wait-list rates ranged from 37% lower to 64% higher than the national average. The living donor rate ranged from 57% lower to 166% higher, while the deceased donor transplant rate ranged from 60% lower to 150% higher than the national average. In general, States with higher wait-listing rates tended to have lower transplantation rates and States with lower wait-listing rates had higher transplant rates. Six States demonstrated both high wait-listing and deceased donor transplantation rates while six others, plus D.C. and Puerto Rico, were below the national average for both parameters.


Subject(s)
Health Services Accessibility , Kidney Transplantation/statistics & numerical data , Living Donors/statistics & numerical data , Tissue Donors/statistics & numerical data , Cadaver , Family , Geography , Humans , Racial Groups , United States , Waiting Lists
12.
Am J Transplant ; 7(5): 1140-7, 2007 May.
Article in English | MEDLINE | ID: mdl-17331109

ABSTRACT

Nearly one-quarter of the kidney transplant waiting list is composed of repeat transplantation candidates. Survival following retransplantation using expanded criteria donor (ECD) kidneys has not been adequately studied. Using data from the Scientific Registry of Transplant Recipients, we analyzed mortality after retransplantation with ECD and non-ECD deceased-donor kidneys. Adult patients who experienced graft failure and were relisted for transplantation between 1995 and 2004 were studied (n=9641). Follow-up began at the date of relisting and continued until death or the end of the observation period (December 31, 2004), with censoring at living-donor transplantation. Sequential stratification (an extension of Cox regression) was used to compare mortality between patients receiving an ECD retransplant and those remaining on the waiting list or receiving a non-ECD retransplant (conventional therapy). Of 2908 retransplantations, 292 used ECD kidneys. Survival after ECD retransplantation was approximately equal to that of conventional therapy, with an adjusted hazard ratio of 0.98 (p=0.88). In contrast, non-ECD retransplant recipients experienced a significant reduction in mortality (HR=0.44; p<0.0001). Based on these national data, recipients of ECD retransplantation do not have a survival advantage relative to conventional therapy, whereas non-ECD retransplantation is associated with a significant survival advantage.
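For readers unfamiliar with the modelling, a plain Cox proportional hazards fit (using lifelines) of mortality on retransplant indicators is sketched below. Note that this does not reproduce the sequential stratification the study actually used, and the column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

def fit_mortality_model(df: pd.DataFrame) -> CoxPHFitter:
    """Simplified stand-in for the study's survival analysis (illustrative only).

    Hypothetical columns: duration_days (follow-up from relisting), died (event
    indicator), ecd_retransplant / non_ecd_retransplant (treatment indicators,
    reference = remaining on the waiting list), plus adjustment covariates.
    """
    cols = ["duration_days", "died", "ecd_retransplant",
            "non_ecd_retransplant", "age", "diabetes"]
    cph = CoxPHFitter()
    cph.fit(df[cols], duration_col="duration_days", event_col="died")
    return cph  # cph.hazard_ratios_ holds the adjusted hazard ratios
```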


Subject(s)
Kidney Transplantation/mortality , Living Donors , Tissue Donors/classification , Adult , Age Factors , Eligibility Determination , Graft Survival , Humans , Kidney Transplantation/methods , Middle Aged , Registries/statistics & numerical data , Reoperation/mortality , Retrospective Studies , Survival Analysis , Tissue and Organ Procurement/methods , Treatment Outcome , United States , Waiting Lists
13.
Diabetologia ; 50(6): 1170-7, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17393134

ABSTRACT

AIMS/HYPOTHESIS: There are few data on the target level of glycaemic control among patients with diabetes on haemodialysis. We investigated the impact of glycaemic control on mortality risk among diabetic patients on haemodialysis. SUBJECTS AND METHODS: Data were analysed from the Dialysis Outcomes and Practice Patterns Study (DOPPS) for randomly selected patients on haemodialysis in Japan. The diagnosis of diabetes at baseline and information on clinical events during follow-up were abstracted from the medical records. A Cox proportional hazards model was used to evaluate the association between presence or absence of diabetes, glycaemic control (HbA1c quintiles) and mortality risk. RESULTS: Data from 1,569 patients with and 3,342 patients without diabetes on haemodialysis were analysed. Among patients on haemodialysis, those with diabetes had a higher mortality risk than those without (multivariable hazard ratio 1.37, 95% CI 1.08-1.74). Compared with those in the bottom quintile of HbA1c level, the multivariable-adjusted hazard ratio for mortality was not increased in the second to fourth quintiles of HbA1c (HbA1c 5.0-5.5% to 6.2-7.2%), but was significantly increased to 2.36 (95% CI 1.02-5.47) in the fifth quintile (HbA1c ≥ 7.3%). The effect of poor glycaemic control did not statistically correlate with baseline mortality risk (p = 0.27). CONCLUSIONS/INTERPRETATION: Among dialysis patients, poorer glycaemic control in those with diabetes was associated with higher mortality risk. This suggests a strong effect of poor glycaemic control above an HbA1c level of about 7.3% on mortality risk, and that this effect does not appear to be influenced by baseline comorbidity status.


Subject(s)
Blood Glucose/metabolism , Diabetic Nephropathies/blood , Diabetic Nephropathies/therapy , Kidney Failure, Chronic/blood , Aged , Body Mass Index , Diabetic Nephropathies/mortality , Female , Glycated Hemoglobin/analysis , Humans , Japan/epidemiology , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/therapy , Male , Middle Aged , Proportional Hazards Models , Risk Factors , Survival Analysis
14.
Am J Transplant ; 6(10): 2470-5, 2006 Oct.
Article in English | MEDLINE | ID: mdl-16939519

ABSTRACT

The ability of the model for end-stage liver disease (MELD) score to accurately predict death among liver transplant candidates allows for evaluation of geographic differences in transplant access for patients with similar death risk. Adjusted models of time to transplant and death for adult liver transplant candidates listed between 2002 and 2003 were developed to test for differences in MELD score among Organ Procurement and Transplantation Network (OPTN) regions and Donation Service Areas (DSA). The average MELD and relative risk (RR) of death varied somewhat by region (from 0.82 to 1.28), with only two regions having significant differences in RRs. Greater variability existed in adjusted transplant rates by region; 7 of 11 regions differed significantly from the national average. Simulation results indicate that an allocation system providing regional priority to candidates at MELD scores ≥ 15 would increase the median MELD score at transplant and reduce the total number of deaths across DSA quintiles. Simulation results also indicate that increasing priority to higher MELD candidates would reduce the percentage variation among DSAs of transplants to patients with MELD scores ≥ 15. The variation decrease was due to increasing the MELD score at time of transplantation in the DSAs with the lowest MELD scores at transplant.
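For context, the commonly cited (pre-sodium) MELD formula, which the article uses but does not restate, can be computed as in the sketch below. The implementation, bounding conventions, and example values are illustrative only and are not taken from the article.

```python
import math

def meld_score(creatinine_mg_dl, bilirubin_mg_dl, inr):
    """Commonly cited original MELD formula (illustrative, not from the article).

    Laboratory values below 1.0 are bounded at 1.0 and creatinine is capped at
    4.0 mg/dL, following the usual allocation conventions.
    """
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    score = 9.57 * math.log(cr) + 3.78 * math.log(bili) + 11.2 * math.log(inr) + 6.43
    return round(score)

# Hypothetical candidate near the MELD 15 threshold discussed above:
print(meld_score(1.4, 2.0, 1.3))  # 15
```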


Subject(s)
Liver Failure/epidemiology , Liver Transplantation/statistics & numerical data , Models, Statistical , Adult , Humans , Incidence , Liver Failure/surgery , Retrospective Studies , Survival Rate/trends , United States/epidemiology , Waiting Lists
15.
Kidney Int ; 69(7): 1222-8, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16609686

ABSTRACT

Longer treatment time (TT) and slower ultrafiltration rate (UFR) are considered advantageous for hemodialysis (HD) patients. The study included 22,000 HD patients from seven countries in the Dialysis Outcomes and Practice Patterns Study (DOPPS). Logistic regression was used to study predictors of TT > 240 min and UFR > 10 ml/h/kg bodyweight. Cox regression was used for survival analyses. Statistical adjustments were made for patient demographics, comorbidities, dose of dialysis (Kt/V), and body size. Europe and Japan had significantly longer (P < 0.0001) average TT than the US (232 and 244 min vs 211 in DOPPS I; 235 and 240 min vs 221 in DOPPS II). Kt/V increased concomitantly with TT in all three regions with the largest absolute difference observed in Japan. TT > 240 min was independently associated with significantly lower relative risk (RR) of mortality (RR = 0.81; P = 0.0005). Every 30 min longer on HD was associated with a 7% lower RR of mortality (RR = 0.93; P < 0.0001). The RR reduction with longer TT was greatest in Japan. A synergistic interaction occurred between Kt/V and TT (P = 0.007) toward mortality reduction. UFR > 10 ml/h/kg was associated with higher odds of intradialytic hypotension (odds ratio = 1.30; P = 0.045) and a higher risk of mortality (RR = 1.09; P = 0.02). Longer TT and higher Kt/V were independently as well as synergistically associated with lower mortality. Rapid UFR during HD was also associated with higher mortality risk. These results warrant a randomized clinical trial of longer dialysis sessions in thrice-weekly HD.
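The UFR threshold above is simply the fluid removed per hour of treatment per kilogram of body weight; a minimal helper makes the arithmetic explicit (the example values are hypothetical).

```python
def ultrafiltration_rate(fluid_removed_ml, treatment_time_min, body_weight_kg):
    """UFR in ml/h/kg: fluid removed divided by session length (hours) and body weight."""
    return fluid_removed_ml / (treatment_time_min / 60.0) / body_weight_kg

# Removing 3 L over a 210-minute session in a 75 kg patient:
print(round(ultrafiltration_rate(3000, 210, 75), 1))  # ~11.4 ml/h/kg, above the 10 ml/h/kg threshold
```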


Subject(s)
Renal Dialysis/methods , Ultrafiltration/methods , Adult , Databases, Factual , Humans , Renal Dialysis/mortality , Survival Analysis , Time Factors , Treatment Outcome
17.
Kidney Int ; 69(11): 2087-93, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16641921

ABSTRACT

Hemodiafiltration (HDF) is used sporadically for renal replacement therapy in Europe but not in the US. Characteristics and outcomes were compared for patients receiving HDF versus hemodialysis (HD) in five European countries in the Dialysis Outcomes and Practice Patterns Study. The study followed 2165 patients from 1998 to 2001, stratified into four groups: low- and high-flux HD, and low- and high-efficiency HDF. Patient characteristics including age, sex, 14 comorbid conditions, and time on dialysis were compared between each group using multivariate logistic regression. Cox proportional hazards regression assessed adjusted differences in mortality risk. Prevalence of HDF ranged from 1.8% in Spain to 20.1% in Italy. Compared to low-flux HD, patients receiving low-efficiency HDF had significantly longer average duration of end-stage renal disease (7.0 versus 4.7 years), more history of cancer (15.4 versus 8.7%), and lower phosphorus (5.3 versus 5.6 mg/dl); patients receiving high-efficiency HDF had significantly more lung disease (15.5 versus 10.2%) and received a higher single-pool Kt/V (1.44 versus 1.35). High-efficiency HDF patients had lower crude mortality rates than low-flux HD patients. After adjustment, high-efficiency HDF patients had a significant 35% lower mortality risk than those receiving low-flux HD (relative risk=0.65, P=0.01). These observational results suggest that HDF may improve patient survival independently of its higher dialysis dose. Owing to possible selection bias, the potential benefits of HDF must be tested by controlled clinical trials before recommendations can be made for clinical practice.


Subject(s)
Hemodiafiltration , Renal Dialysis/mortality , Europe , Female , Follow-Up Studies , Humans , Male , Middle Aged , Risk Factors
18.
Am J Transplant ; 6(1): 109-14, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16433764

ABSTRACT

There is a paucity of comparative studies on country-specific outcomes in kidney transplantation. We compared post-transplant mortality among primary, adult, solitary kidney transplant recipients (KTR) from the United States (n = 70 708) and Canada (n = 5773), between January 1, 1991 and December 31, 1998, using data from the Scientific Registry of Transplant Recipients and the Canadian Organ Replacement Register. Multivariable Cox regression revealed higher adjusted post-transplant mortality among U.S. (vs. Canadian) KTR (HR = 1.35 [95% CI 1.24, 1.47; p < 0.005]). Mortality risk in the first post-transplant year was similar in both countries but higher in the United States beyond the first year (HR = 1.49-1.53; p < 0.005). There was no difference in mortality among patients transplanted within 1 year of starting dialysis, but mortality was increased in U.S. (vs. Canadian) patients after 1-2 and 4+ years on dialysis (HR = 1.36-1.66; p < 0.005). Greater mortality was also seen in U.S. patients with diabetes mellitus and/or graft failure. In conclusion, there are considerable differences in the survival of KTR in the United States and Canada. A detailed examination of factors contributing to this variation may yield important insights into improving outcomes for all KTR.


Subject(s)
Kidney Transplantation/mortality , Adolescent , Adult , Aged , Canada , Female , Humans , Male , Middle Aged , United States
19.
Am J Transplant ; 6(2): 281-91, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16426312

ABSTRACT

A national conference on organ donation after cardiac death (DCD) was convened to expand the practice of DCD in the continuum of quality end-of-life care. This national conference affirmed the ethical propriety of DCD as not violating the dead donor rule. Further, drawing on developments not previously reported, the conference resolved controversy regarding the period of circulatory cessation that determines death and allows administration of pre-recovery pharmacologic agents; established conditions of DCD eligibility; presented current data on the successful transplantation of organs from DCD; proposed a new framework for reporting data on ischemic events; made specific recommendations to agencies and organizations to remove barriers to DCD; offered guidance on organ allocation and the informed consent process; and set an action plan to address media issues. When the attending physician and the patient, or the attending physician and a family member or surrogate, reach a consensual decision to withdraw life support (particularly in an intensive care unit), a routine opportunity for DCD should be available to honor the deceased donor's wishes in every donor service area (DSA) of the United States.


Subject(s)
Death, Sudden, Cardiac , Tissue and Organ Procurement/ethics , Adolescent , Adult , Child , Humans , Liver Transplantation/mortality , Liver Transplantation/statistics & numerical data , Middle Aged , Patient Selection
20.
Clin Nephrol ; 63(5): 335-45, 2005 May.
Article in English | MEDLINE | ID: mdl-15909592

ABSTRACT

BACKGROUND: Mortality in severe acute renal failure (ARF) requiring renal replacement therapy (RRT) approximates 50% and varies with clinical severity. Continuous RRT (CRRT) has theoretical advantages over intermittent hemodialysis (IHD) for critically ill patients, but a survival advantage with CRRT is yet to be clearly demonstrated. To date, no prospective controlled trial has sufficiently answered this question, and the present prospective outcome study attempts to compare survival with CRRT versus that with IHD. METHODS: Multivariable Cox proportional hazards regression was used to analyze the impact of RRT modality choice (CRRT vs. IHD) on in-hospital and 100-day mortality among ARF patients receiving RRT during 2000 and 2001 at University of Michigan, using an "intent-to-treat" analysis adjusted for multiple comorbidity and severity factors. RESULTS: Overall in-hospital mortality before adjustment was 52%. Triage to CRRT (vs IHD) was associated with higher severity of illness and a higher unadjusted relative rate (RR) of in-hospital death (RR = 1.62, p = 0.001, n = 383). Adjustment for comorbidity and severity of illness reduced the RR of death for patients triaged to CRRT and suggested a possible survival advantage (RR = 0.81, p = 0.32). Analysis restricted to patients in intensive care for more than five days who received at least 48 hours of total RRT showed the RR of in-hospital mortality with CRRT to be nearly 45% lower than with IHD (RR = 0.56, n = 222), a difference of borderline statistical significance (p = 0.069). Analysis of 100-day mortality also suggested a potential survival advantage for CRRT in all cohorts, particularly among patients in intensive care for more than five days who received at least 48 hours of RRT (RR = 0.60, p = 0.062, n = 222). CONCLUSION: Applying the present methodology to outcomes at a single tertiary medical center, CRRT may afford a survival advantage for patients with severe ARF treated in the ICU. Unless and until a prospective controlled trial is realized, the present data suggest potential survival advantages of CRRT and support broader application of CRRT among such critically ill patients.


Subject(s)
Acute Kidney Injury/diagnosis , Acute Kidney Injury/therapy , Renal Replacement Therapy/methods , APACHE , Acute Kidney Injury/mortality , Adult , Aged , Cohort Studies , Critical Care/methods , Female , Follow-Up Studies , Hemofiltration/methods , Humans , Intensive Care Units , Kidney Function Tests , Male , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Prospective Studies , Renal Dialysis/methods , Risk Assessment , Severity of Illness Index , Survival Rate , Treatment Outcome