Results 1 - 20 of 31
1.
Transplantation; 106(1): 18-19, 2022 01 01.
Article in English | MEDLINE | ID: mdl-33982914
2.
Transplant Rev (Orlando); 33(4): 191-199, 2019 10.
Article in English | MEDLINE | ID: mdl-31377099

ABSTRACT

The mammalian target of rapamycin (mTOR) inhibitor everolimus, in combination with reduced-exposure calcineurin inhibitor (CNI), has been demonstrated in clinical trials to have efficacy in low-to-moderate immunological risk kidney transplant recipients comparable to that of the standard of care, mycophenolic acid (MPA) in combination with standard-exposure CNI. Current treatment guidelines consider mTOR inhibitors a second-line therapy in the majority of cases; however, given that everolimus-based regimens are associated with a reduced rate of viral infections after transplantation, their wider use could substantially benefit kidney transplant patients. In this evidence-based practice guideline, we consider the de novo use of everolimus in kidney transplant recipients. The main conclusions from our review of the available evidence are that: (1) everolimus, in combination with reduced-exposure CNI and low-dose steroids, is a suitable regimen for the prophylaxis of kidney transplant rejection in the majority of low-to-moderate immunological risk adult patients, with individualized management; (2) induction with either basiliximab or rabbit anti-thymocyte globulin is an effective therapy for kidney transplant recipients when initiating an everolimus-based, reduced-exposure CNI regimen; and (3) an individualized approach should be adopted when managing kidney transplant recipients on everolimus-based therapy.


Subject(s)
Calcineurin Inhibitors/administration & dosage; Everolimus/administration & dosage; Immunosuppressive Agents/administration & dosage; Kidney Transplantation/methods; Practice Guidelines as Topic; Drug Therapy, Combination; Evidence-Based Practice; Female; Follow-Up Studies; Graft Rejection; Graft Survival; Humans; Immunosuppression Therapy/methods; Kidney Transplantation/adverse effects; Male; Precision Medicine/methods; Risk Assessment; Treatment Outcome
3.
Transpl Int; 31(4): 424-435, 2018 04.
Article in English | MEDLINE | ID: mdl-29265514

ABSTRACT

Development of donor-specific antibodies (DSA) after renal transplantation is known to be associated with worse graft survival, yet determining which specificities in which recipients are the most deleterious remains under investigation. This study evaluated the relationship between the complement-binding capacity of post-transplant de novo anti-human leukocyte antigen (HLA) antibodies and subsequent clinical outcome. Stored sera from 265 recipients previously identified as having de novo DSA were retested for DSA and their C3d-binding capacity using Luminex-based solid-phase assays. Most recipients had anti-HLA class II-reactive DSA (class I, 12.5%; class II, 68.7%; class I and class II, 18.9%). Recipients with C3d-binding DSA (67.5%) had a significantly higher incidence of antibody-mediated rejection and of rejection of any type. They also had significantly lower kidney survival, with the lowest survival in those with both anti-HLA class I and class II C3d-binding DSA. Comparison with concurrent biopsies revealed a 96.2% positive predictive value and a 47.4% negative predictive value for C4d peritubular capillary (PTC) deposition. In multivariate analysis, anti-HLA class I and class II C3d-binding DSA carried a twofold and a 1.5-fold increased risk of kidney loss, respectively.
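For readers reconstructing these figures: positive and negative predictive value are simple ratios from the 2x2 table of C3d-binding status against biopsy C4d deposition. A minimal Python sketch; the cell counts below are hypothetical placeholders (the abstract does not report them), chosen only so the output matches the quoted percentages:

```python
def predictive_values(tp, fp, fn, tn):
    """Positive/negative predictive value from a 2x2 table.

    tp: C3d-positive DSA with C4d deposition on biopsy
    fp: C3d-positive DSA without C4d deposition
    fn: C3d-negative DSA with C4d deposition
    tn: C3d-negative DSA without C4d deposition
    """
    ppv = tp / (tp + fp)  # P(C4d-positive biopsy | C3d-binding DSA)
    npv = tn / (tn + fn)  # P(C4d-negative biopsy | no C3d-binding DSA)
    return ppv, npv

# Hypothetical counts for illustration only -- not the study's table.
ppv, npv = predictive_values(tp=50, fp=2, fn=10, tn=9)
print(f"PPV {ppv:.1%}, NPV {npv:.1%}")  # -> PPV 96.2%, NPV 47.4%
```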


Subject(s)
Complement C3d/metabolism; HLA Antigens/metabolism; Kidney Transplantation; Transplantation Immunology; Adult; Antibody Specificity; Complement C4b/metabolism; Female; Graft Survival; HLA Antigens/analysis; Humans; Immunoglobulin G/metabolism; Male; Middle Aged; Nephritis/immunology; Peptide Fragments/metabolism; Retrospective Studies
4.
J Vasc Surg; 65(4): 1089-1103.e1, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28222990

ABSTRACT

OBJECTIVE: The Kidney Disease Outcome Quality Initiative and Fistula First Breakthrough Initiative call for the indiscriminate creation of arteriovenous fistulas (AVFs) over arteriovenous grafts (AVGs) without providing patient-specific criteria for vascular access selection. Although the U.S. AVF rate has increased dramatically, several reports have found that this singular focus on increasing AVFs has resulted in increased AVF nonmaturation/early failure and a high prevalence of catheter dependence. The objective of this study was to determine the appropriateness of vascular access procedures in clinical scenarios constructed with combinations of relevant factors potentially influencing outcomes. METHODS: The RAND/UCLA Appropriateness Method was used. Accordingly, a comprehensive literature search was performed and a synthesis of results compiled. The RAND/UCLA Appropriateness Method was applied to 2088 AVF and 1728 AVG clinical scenarios with varying patient characteristics. Eleven international vascular access experts rated the appropriateness of each scenario in two rounds. On the basis of the distribution of the panelists' scores, each scenario was determined to be appropriate, inappropriate, or indeterminate. RESULTS: Panelists achieved agreement in 2964 (77.7%) scenarios; 860 (41%) AVF and 588 (34%) AVG scenarios were scored appropriate, 686 (33%) AVF and 480 (28%) AVG scenarios were scored inappropriate, and 542 (26%) AVF and 660 (38%) AVG scenarios were indeterminate. Younger age, larger outflow vein diameter, normal or obese body mass index (vs morbidly obese), larger inflow artery diameter, and higher patient functional status were associated with appropriateness of AVF creation. Older age, dialysis dependence, and smaller vein size were associated with appropriateness of AVG creation. Gender, diabetes, and coronary artery disease were not associated with AVF or AVG appropriateness. Dialysis status was not associated with AVF appropriateness. Body mass index and functional status were not associated with AVG appropriateness. To simulate the surgeon's decision-making, scenarios were combined to create situations with the same patient characteristics and both AVF and AVG options for access. Of these 864 clinical situations, 311 (36%) were rated appropriate for AVG but inappropriate or indeterminate for AVF. CONCLUSIONS: The results of this study indicate that patient-specific situations exist wherein AVG is as appropriate as or more appropriate than AVF. These results provide patient-specific recommendations for clinicians to optimize vascular access selection criteria, to standardize care, and to inform payers and policy. Indeterminate scenarios will guide future research.
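The classification step of the RAND/UCLA method is mechanical once the panel ratings are in. Below is a sketch of the conventional decision rule, assuming the usual 1-9 scale and the common "one-third of ratings in each extreme tertile" definition of disagreement; the exact cut-offs this panel used are not stated in the abstract:

```python
from statistics import median

def classify_scenario(ratings):
    """Classify one scenario from panelists' 1-9 appropriateness ratings.

    Conventional RAND/UCLA rule: median 7-9 -> appropriate, median 1-3 ->
    inappropriate, anything else indeterminate; a scenario with panel
    disagreement (here: >= 1/3 of ratings in each extreme tertile) is
    indeterminate regardless of its median.
    """
    m = median(ratings)
    low = sum(r <= 3 for r in ratings)
    high = sum(r >= 7 for r in ratings)
    if low >= len(ratings) / 3 and high >= len(ratings) / 3:
        return "indeterminate"  # disagreement
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "indeterminate"

# Eleven hypothetical ratings, matching the panel size in the study.
print(classify_scenario([8, 9, 7, 8, 7, 9, 8, 6, 7, 8, 9]))  # appropriate
```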


Subject(s)
Arteriovenous Shunt, Surgical; Blood Vessel Prosthesis Implantation; Kidney Diseases/therapy; Patient Selection; Renal Dialysis; Upper Extremity/blood supply; Aged; Aged, 80 and over; Arteriovenous Shunt, Surgical/adverse effects; Arteriovenous Shunt, Surgical/standards; Blood Vessel Prosthesis Implantation/adverse effects; Blood Vessel Prosthesis Implantation/standards; Female; Guideline Adherence; Humans; Kidney Diseases/diagnosis; Male; Middle Aged; Practice Guidelines as Topic; Practice Patterns, Physicians'; Risk Assessment; Risk Factors; Treatment Outcome; Unnecessary Procedures
5.
Transplantation; 101(6): 1373-1380, 2017 06.
Article in English | MEDLINE | ID: mdl-27482960

ABSTRACT

BACKGROUND: Scientific Registry of Transplant Recipients (SRTR) report cards of US organ transplant center performance are publicly available and used for quality oversight. Low-performance (LP) evaluations are associated with changes in practice, including reduced transplant rates and increased waitlist removals. In 2014, the SRTR implemented new Bayesian methodology for evaluating performance, which was not adopted by the Centers for Medicare and Medicaid Services (CMS). In May 2016, CMS altered its performance criteria, reducing the likelihood of LP evaluations. METHODS: Our aims were to evaluate the incidence, survival rates, and volume of LP centers under the Bayesian, historical (old-CMS), and new-CMS criteria using 6 consecutive program-specific reports (PSRs), January 2013 to July 2015, among adult kidney transplant centers. RESULTS: The Bayesian, old-CMS, and new-CMS criteria identified 13.4%, 8.3%, and 6.1% of PSRs as LP, respectively. Over the 3-year period, 31.9% (Bayesian), 23.4% (old-CMS), and 19.8% (new-CMS) of centers had 1 or more LP evaluation. For small centers (<83 transplants/PSR), there were 4-fold more LP evaluations (52 vs 13 PSRs) for 1-year mortality with the Bayesian versus the new-CMS criteria. For large centers (>183 transplants/PSR), there were 3-fold more LP evaluations for 1-year mortality with the Bayesian versus the new-CMS criteria, with median differences between observed and expected patient survival of -1.6% and -2.2%, respectively. CONCLUSIONS: A significant proportion of kidney transplant centers are identified as low performing despite relatively small differences between observed and expected survival. The Bayesian criteria have significantly higher flagging rates, and the new-CMS criteria modestly reduce flagging. Critical appraisal of performance criteria is needed to assess whether quality oversight is meeting its intended goals and whether further modifications could reduce risk aversion, allocate resources more efficiently, and increase transplant opportunities.
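Flagging rules of this kind compare a center's observed event count against its risk-adjusted expectation. The sketch below shows the general shape of the historical CMS rule, under which all three conditions had to hold; the thresholds are illustrative assumptions here, and the Bayesian and 2016 criteria differ in detail:

```python
from scipy.stats import poisson

def flag_low_performance(observed, expected,
                         min_excess=3.0, min_ratio=1.5, alpha=0.05):
    """Flag a center whose observed deaths/graft losses exceed expected.

    Illustrative thresholds only: excess events, observed/expected ratio,
    and a one-sided p-value must all cross their cut-offs to flag.
    """
    ratio = observed / expected
    # One-sided p-value: chance of >= observed events if 'expected' is the truth.
    p_value = poisson.sf(observed - 1, expected)
    return (observed - expected > min_excess
            and ratio > min_ratio
            and p_value < alpha)

print(flag_low_performance(observed=12, expected=6.5))  # True under these cut-offs
```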


Subject(s)
Hospitals, High-Volume/standards; Hospitals, Low-Volume/standards; Kidney Transplantation/standards; Process Assessment, Health Care/standards; Quality Improvement/standards; Quality Indicators, Health Care/standards; Bayes Theorem; Centers for Medicare and Medicaid Services, U.S.; Hospitals, High-Volume/statistics & numerical data; Hospitals, Low-Volume/statistics & numerical data; Humans; Kidney Transplantation/adverse effects; Kidney Transplantation/mortality; Kidney Transplantation/statistics & numerical data; Models, Statistical; Process Assessment, Health Care/statistics & numerical data; Program Evaluation; Quality Improvement/statistics & numerical data; Quality Indicators, Health Care/statistics & numerical data; Time Factors; Treatment Outcome; United States; Waiting Lists
6.
Clin Transplant; 30(8): 940-5, 2016 08.
Article in English | MEDLINE | ID: mdl-27218658

ABSTRACT

BACKGROUND: Deceased donor (DD) kidney quality is graded by calculating the Kidney Donor Profile Index (KDPI). Optimizing outcomes with high-KDPI (≥85%) DD transplants is challenging. This retrospective study reviewed our high-KDPI DD transplant results to identify clinical practices that could improve future outcomes. METHODS: We retrospectively calculated the KDPI for 895 DD kidney recipients transplanted between 1/2002 and 11/2013. Age, race, body mass index (BMI), retransplantation, gender, diabetes (DM), dialysis time, and preexisting coronary artery disease (CAD; previous myocardial infarction, coronary artery bypass grafting, or stenting) were determined for all recipients. RESULTS: Overall, 29.7% (266/895) of transplants were from donors with a KDPI ≥85%. By Cox regression, older age, diabetes, female gender, and dialysis time >4 years correlated with shorter patient survival. Diabetic recipients with CAD who received a high-KDPI donor kidney had a significantly increased risk of death (HR 4.33, CI 1.82-10.30; P=.001) compared with low-KDPI kidney recipients. The Kaplan-Meier survival curve for diabetic recipients of high-KDPI kidneys was significantly worse if they had preexisting CAD (P<.001 by log-rank test). CONCLUSION: Patient survival using high-KDPI donor kidneys may be improved by avoiding diabetic candidates with preexisting CAD.
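An interaction effect like diabetes-with-CAD is typically estimated by adding a product term to the Cox model. A minimal sketch with the lifelines library on synthetic stand-in data (column names and values are invented for illustration, not the study's records):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in data: follow-up years, death indicator, risk factors.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "diabetes": rng.integers(0, 2, n),
    "cad": rng.integers(0, 2, n),
    "high_kdpi": rng.integers(0, 2, n),
    "years": rng.exponential(8.0, n),   # follow-up time
    "died": rng.integers(0, 2, n),      # event indicator
})
df["dm_cad"] = df["diabetes"] * df["cad"]  # interaction: diabetic with CAD

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()  # hazard ratios with confidence intervals
```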


Subject(s)
Diabetes Mellitus/mortality; Kidney Failure, Chronic/surgery; Kidney Transplantation/mortality; Registries; Tissue Donors; Tissue and Organ Procurement/methods; Transplant Recipients; Adolescent; Adult; Aged; Aged, 80 and over; Donor Selection; Female; Humans; Kidney Failure, Chronic/mortality; Male; Middle Aged; Retrospective Studies; Risk Factors; Survival Rate/trends; United States/epidemiology; Young Adult
7.
Clin Transplant; 29(12): 1119-27, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26382932

ABSTRACT

BACKGROUND: De novo donor-specific antibodies (dnDSA) post-transplant correlate with a higher risk of immunologic graft injury and loss following kidney and pancreas transplantation, and can develop within the first post-transplant year. METHODS: In this study, 817 of 1290 kidney and simultaneous kidney/pancreas recipients were tested for dnDSA post-transplant. Recipient immunosuppressive treatment at one, three, six, and 12 months post-transplant was correlated with dnDSA incidence by univariate and multivariate analyses. RESULTS: The overall incidence of dnDSA was 21.3%, detected at a median of 3.5 yr post-transplant. By univariate analysis, immunosuppressive treatment at all time points correlated with dnDSA (p < 0.01); month 6 treatment correlated best in multivariable analysis (p = 0.004). At six months, recipients receiving rapamycin/mycophenolic acid (Rapa/MPA) had the highest dnDSA incidence at five yr (25.3%) and at last follow-up (30.7%), those treated with cyclosporine/rapamycin (CNI/Rapa) had the lowest incidence at five yr (10.8%) and last follow-up (18.6%), and those on cyclosporine/mycophenolic acid (CNI/MPA) had an intermediate incidence at five yr (16.7%) and last follow-up (20.4%) (p < 0.01). By Cox proportional hazards regression modeling, six-month CNI/MPA and Rapa/MPA treatment significantly correlated with dnDSA (hazard ratios 2.36 and 1.80, respectively). CONCLUSION: The risk of post-transplant dnDSA development correlates with early immunosuppressive management.


Subject(s)
Graft Rejection/immunology; Graft Survival/immunology; Immunosuppressive Agents/therapeutic use; Isoantibodies/immunology; Kidney Transplantation/adverse effects; Pancreas Transplantation/adverse effects; Adolescent; Adult; Aged; Cohort Studies; Female; Follow-Up Studies; Graft Rejection/blood; Graft Rejection/diagnosis; Humans; Isoantibodies/blood; Male; Middle Aged; Postoperative Complications; Prognosis; Risk Factors; Tissue Donors; Young Adult
8.
World J Transplant; 5(4): 154-64, 2015 Dec 24.
Article in English | MEDLINE | ID: mdl-26722644

ABSTRACT

Organ preservation remains an important contributor to graft and patient outcomes. During donor organ procurement and transportation, cellular injury is mitigated through the use of preservation solutions in conjunction with hypothermia. Various preservation solutions and protocols exist, with widespread variability among transplant centers. In this review of abdominal organ preservation solutions, the evolution of transplantation and graft preservation is discussed first, followed by a classification of preservation solutions according to their composition of electrolytes, impermeants, buffers, antioxidants, and energy precursors. Lastly, pertinent clinical studies in hepatic, renal, pancreas, and intestinal transplantation are reviewed for patient and graft survival as well as financial considerations. In liver transplants, there may be some benefit to histidine-tryptophan-ketoglutarate (HTK) over University of Wisconsin solution in terms of biliary complications and potential cost savings. Renal grafts may experience increased initial graft dysfunction with Euro-Collins solution, which argues against its use and in favor of HTK, a choice that can also yield substantial cost savings. University of Wisconsin solution and Celsior are favored in pancreas transplants, given the concern for pancreatitis and graft thrombosis associated with HTK. No difference in graft or patient survival was observed among preservation solutions in liver, renal, and pancreas transplants. Studies involving intestinal transplants are sparse, but University of Wisconsin solution infused intraluminally in combination with an intravascular washout is a reasonable option until further evidence can be generated. The available literature can be used to reduce the extensive variation across centers while potentially minimizing graft dysfunction and improving associated costs.

9.
Transplantation; 97(6): 686-93, 2014 Mar 27.
Article in English | MEDLINE | ID: mdl-24637867

ABSTRACT

BACKGROUND: The Scientific Registry of Transplant Recipients (SRTR) and the Centers for Medicare and Medicaid Services (CMS) determine expected graft survival rates to identify potentially underperforming transplant centers. There has been recent interest in evaluating adjustments for comorbidities when performing these calculations. This study was performed to determine the influence that adjusting for pre-transplant cardiovascular disease comorbidity can have on risk-adjusted Cox models, such as those used by the SRTR and CMS. METHODS: We analyzed Cox proportional hazards models for 1-year and 3-year graft survival in kidney recipients from a single center, adding cardiovascular disease covariates to a baseline model derived from the SRTR calculated risk scores and including all standard SRTR parameters. RESULTS: The living donor and deceased donor 1-year models and the living donor 3-year model that included all seven cardiovascular covariates demonstrated 8% to 13% improved discrimination. Only the 1-year deceased donor model demonstrated significantly improved calibration (likelihood ratio test, P=0.038). Expected graft losses increased by >30% for living donor recipients at 1 and 3 years and decreased by 2% to 4% for deceased donor recipients at 1 and 3 years. CONCLUSION: SRTR and CMS adjustment for pre-transplant cardiovascular comorbidity might impact center performance evaluations.
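The calibration comparison here is a likelihood ratio test between nested Cox models: twice the gain in partial log-likelihood, referred to a chi-square distribution with one degree of freedom per added covariate. A sketch of that arithmetic, with hypothetical log-likelihood values:

```python
from scipy.stats import chi2

def likelihood_ratio_test(ll_base, ll_full, added_covariates):
    """LRT for nested Cox models (e.g. SRTR baseline vs. +cardiovascular).

    ll_base / ll_full: partial log-likelihoods of the nested and full fits.
    """
    stat = 2.0 * (ll_full - ll_base)
    p = chi2.sf(stat, df=added_covariates)
    return stat, p

# Hypothetical log-likelihood values for illustration only.
stat, p = likelihood_ratio_test(ll_base=-1520.4, ll_full=-1513.1,
                                added_covariates=7)
print(f"LR statistic {stat:.1f}, p = {p:.3f}")
```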


Subject(s)
Cardiovascular Diseases/epidemiology; Graft Survival; Kidney Transplantation/adverse effects; Adolescent; Adult; Aged; Aged, 80 and over; Cardiovascular Diseases/mortality; Centers for Medicare and Medicaid Services, U.S.; Comorbidity; Female; Humans; Kidney Transplantation/mortality; Likelihood Functions; Living Donors; Male; Middle Aged; Proportional Hazards Models; Quality Indicators, Health Care; Registries; Risk Factors; Time Factors; Tissue and Organ Procurement; Treatment Outcome; United States/epidemiology; Young Adult
10.
Transplantation; 94(4): 331-7, 2012 Aug 27.
Article in English | MEDLINE | ID: mdl-22850297

ABSTRACT

BACKGROUND: The Thymoglobulin Antibody Immunosuppression in Living Donor Recipients registry was established to assess clinical experience with rabbit antithymocyte globulin (rATG; Thymoglobulin) in living donor renal transplant recipients. METHODS: From 2003 to 2008, US transplant centers prospectively entered information on patients who received rATG induction. In addition to standard United Network for Organ Sharing (UNOS) registry data elements, information was collected on immunosuppression, viral prophylaxis, acute rejection, and adverse events. RESULTS: A total of 2322 patients from 49 transplant centers were enrolled and met inclusion criteria for analysis. Patient and graft survival were 99.3% and 99.0% at 6 months and 98.4% and 98.2% at 12 months as recorded in the registry, and 91.5% and 83.2% at 5 years by Kaplan-Meier estimates based on linked UNOS registry records. Freedom from rejection was 93.6% through 5 years. The mean cumulative rATG dose was 5.29 mg/kg. More than one-third of patients (37.6%) were steroid-free at discharge, and nearly half (48%) were steroid-free at 12 months. Before discharge, 3.2% experienced serious adverse events, with 11 events (0.5%) reported as possibly or probably related to rATG. The incidence of cytomegalovirus infection was 4.2% at 12 months, and 99.1% of patients were free of posttransplant lymphoproliferative disorder through 5 years. CONCLUSIONS: rATG induction in living donor renal transplantation is safe and associated with a low incidence of acute rejection and posttransplantation complications.
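The 5-year figures are Kaplan-Meier estimates over linked registry follow-up. A minimal sketch with lifelines on synthetic stand-in data (the follow-up times and event flags below are simulated, not registry values):

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Synthetic stand-in follow-up data for 2322 recipients.
rng = np.random.default_rng(1)
years = rng.uniform(0.1, 6.0, 2322)   # follow-up time per patient (years)
lost = rng.random(2322) < 0.15        # graft-loss event indicator

kmf = KaplanMeierFitter()
kmf.fit(durations=years, event_observed=lost)
# Survival probability at 6 months, 1 year, and 5 years.
print(kmf.survival_function_at_times([0.5, 1.0, 5.0]))
```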


Subject(s)
Antilymphocyte Serum/therapeutic use; Immunosuppressive Agents/therapeutic use; Kidney Transplantation; Living Donors; Adult; Animals; Antilymphocyte Serum/adverse effects; Antiviral Agents/therapeutic use; Creatinine/blood; Female; Graft Survival; Humans; Male; Middle Aged; Rabbits; Registries
11.
Clin Transplant; 22(1): 61-7, 2008.
Article in English | MEDLINE | ID: mdl-18217907

ABSTRACT

BACKGROUND: In the early post-transplant period, renal allograft rejection with diffuse peritubular capillary (PTC) C4d deposition predicts poor graft survival. In the late post-transplant setting, that is, one or more yr after transplantation, the implication of diffuse PTC C4d deposition remains a topic of debate. The purpose of our study was to determine whether diffuse PTC C4d deposition in late acute rejection (LAR), occurring more than one yr post-transplant, has any impact on graft survival and function. METHODS: We selected cases, among both cadaveric and living donor renal transplant recipients, in whom acute rejection with PTC C4d deposition was first detected after the first year post-transplant. Recipients with multiple acute rejection episodes during the first year post-transplant were excluded. The first biopsy diagnosed with LAR was considered the index biopsy (n = 40). We formed two groups: group 1, C4d-positive LAR (n = 20), and group 2, C4d-negative LAR (n = 20). Groups were matched for maintenance and post-rejection immunosuppressive therapy, baseline serum creatinine levels before the index biopsy, time from transplant to index biopsy, and chronic allograft damage index (CADI) score in the index biopsies. Between the two groups we compared the rate of graft loss, the function of surviving grafts at the end of the study period, and histologic parameters in the index biopsy specimens. The mean follow-up period was 20 months. RESULTS: No significant differences in the rate of graft loss or graft function were found between groups 1 and 2 at the end of the follow-up period. Histologically, PTC margination and transplant glomerulopathy were more common in the C4d-positive group, a difference that was statistically significant. There was no statistically significant difference in the degree of plasma cell infiltrates. CONCLUSIONS: Unlike in the early acute setting, the presence or absence of PTC C4d staining in renal allografts with LAR may not have predictive value for graft outcome.


Subject(s)
Capillaries/metabolism; Complement C4b/metabolism; Graft Rejection/metabolism; Kidney Transplantation/physiology; Kidney Tubules/blood supply; Peptide Fragments/metabolism; Adult; Female; Graft Rejection/immunology; Graft Survival/physiology; Humans; Immunohistochemistry; Kidney Transplantation/immunology; Kidney Tubules/metabolism; Male; Middle Aged; Retrospective Studies; Time Factors; Transplantation, Homologous; Treatment Outcome
12.
Transplantation; 84(9): 1131-7, 2007 Nov 15.
Article in English | MEDLINE | ID: mdl-17998868

ABSTRACT

BACKGROUND: Steroid-free immunosuppression is an attractive option because it avoids the many side effects of chronic corticosteroid use. It is especially attractive in pancreas recipients because it avoids the diabetogenic effects of steroids. METHODS: We evaluated the outcome of a steroid-free maintenance immunosuppressive protocol in pancreas transplant recipients. Between August 2003 and May 2006, a total of 97 pancreas transplant recipients received steroid-free maintenance immunosuppression, consisting of induction with Thymoglobulin and prednisone for the first 5 days. Patients were maintained on sirolimus adjusted to a target rapamycin trough level and reduced-dose cyclosporine adjusted to target C2 levels. All pancreas transplants (n=124) performed in the previous 3 years and maintained on a steroid-based immunosuppressive protocol with cyclosporine and mycophenolate mofetil were used for comparison. RESULTS: One-year patient and death-censored pancreas graft survival were 93.8% and 94.8% for the steroid-free group versus 95.2% and 87.9% for the comparator group, respectively. The incidence of acute rejection was 9.3% in the steroid-free group versus 28.3% in the comparator group (P<0.01). No pancreas in the steroid-free group was lost to acute rejection, whereas seven (5.6%) patients in the comparator group lost their pancreas grafts to acute rejection (P<0.05). At 1 year after transplant, mean serum glucose and creatinine levels did not differ between the two groups. CONCLUSION: We conclude that excellent graft survival, with a significantly lower incidence of acute rejection, can be achieved using a steroid-free maintenance immunosuppressive protocol consisting of sirolimus and cyclosporine.


Subject(s)
Cyclosporine/therapeutic use; Graft Rejection/prevention & control; Immunosuppression Therapy/methods; Immunosuppressive Agents/therapeutic use; Pancreas Transplantation/immunology; Sirolimus/therapeutic use; Adrenal Cortex Hormones; Graft Survival/drug effects; Humans; Immunosuppressive Agents/pharmacokinetics; Leukocyte Count; Lipids/blood; Mycophenolic Acid/analogs & derivatives; Mycophenolic Acid/therapeutic use; Treatment Outcome
13.
Semin Vasc Surg; 20(3): 164-6, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17884617

ABSTRACT

A functional vascular access is of critical importance to the hemodialysis patient, the patient's healthcare providers, and the hemodialysis treatment center. A poorly functioning or thrombosed vascular access can lead to increased morbidity, hospitalization, length of stay, and cost. This article reviews the increasing evidence supporting surveillance of arteriovenous (AV) hemodialysis access and the various strategies and techniques available for detection of a failing access.


Subject(s)
Arteriovenous Shunt, Surgical; Graft Occlusion, Vascular/diagnosis; Renal Dialysis/methods; Catheters, Indwelling; Graft Occlusion, Vascular/physiopathology; Humans; Kidney Failure, Chronic/therapy; Magnetic Resonance Angiography; Regional Blood Flow; Vascular Patency; Veins/physiopathology; Venous Pressure/physiology
14.
Clin Transplant; 20(5): 537-46, 2006.
Article in English | MEDLINE | ID: mdl-16968478

ABSTRACT

Steroid-free maintenance immunosuppression is desirable to eliminate the side effects of chronic corticosteroid use. Complete steroid avoidance or rapid post-transplant steroid withdrawal has recently been used in renal transplant recipients with encouraging results. The present study evaluated the outcome of a steroid-free maintenance immunosuppressive protocol in kidney transplant recipients with at least one yr of follow-up. Between April 2002 and October 2004, a total of 301 primary kidney transplant recipients received steroid-free maintenance immunosuppression. The regimen consisted of induction with Thymoglobulin and prednisone for the first five d. Patients were maintained on sirolimus and Neoral; the Neoral dose was adjusted to target C2 levels and the sirolimus dose to a target rapamycin trough level. All primary kidney transplants (n = 502) performed in the two yr (starting January 2000) prior to institution of the steroid-free regimen, and thus maintained on a steroid-based immunosuppressive protocol, were used for comparison. One-yr patient and death-censored graft survival were 93.1% and 98.1% for the steroid-free group vs. 95.2% and 95.2% for the comparator group (p = ns). The incidence of biopsy-proven acute rejection was 4.9% in the steroid-free group vs. 9.4% in the comparator group (p < 0.01). Two (0.7%) of 301 patients in the steroid-free group lost their grafts to acute rejection, compared with nine (1.8%) patients in the comparator group (p < 0.05). At one yr post-transplant the mean serum creatinine level was not different between the two groups. There were no significant differences between the groups in mean serum cholesterol and triglyceride levels or in the percentage of patients on lipid-lowering agents. White blood cell counts, daily Neoral doses, and weight gain were significantly lower in the steroid-free group than in the comparator group; however, more patients in the steroid-free group required erythropoietin and iron therapy for anemia (p < 0.001). We conclude that excellent graft survival, with a significantly lower incidence of acute rejection, can be achieved using a steroid-free maintenance immunosuppressive protocol consisting of Neoral and sirolimus.


Subject(s)
Immunosuppression Therapy/methods; Kidney Transplantation; Antibodies, Monoclonal/administration & dosage; Antilymphocyte Serum; Cholesterol/blood; Creatinine/blood; Cyclosporine/administration & dosage; Female; Follow-Up Studies; Graft Rejection; Graft Survival; Humans; Male; Middle Aged; Prednisone/administration & dosage; Retrospective Studies; Sirolimus/administration & dosage; Steroids/administration & dosage; Treatment Outcome; Triglycerides/blood
15.
Eur J Vasc Endovasc Surg; 32(5): 545-8, 2006 Nov.
Article in English | MEDLINE | ID: mdl-16934500

ABSTRACT

There is increasing evidence that surveillance of arteriovenous (AV) access for haemodialysis prevents access thrombosis and improves the quality of care. This article reviews that evidence and the various strategies and techniques available for detection of the failing access.


Subject(s)
Arteriovenous Shunt, Surgical; Graft Occlusion, Vascular/diagnosis; Population Surveillance; Renal Dialysis; Graft Occlusion, Vascular/physiopathology; Humans; Physical Examination; Regional Blood Flow; Vascular Patency; Veins/physiopathology; Venous Pressure
16.
Transplantation; 80(7): 925-9, 2005 Oct 15.
Article in English | MEDLINE | ID: mdl-16249740

ABSTRACT

BACKGROUND: The deceased donor score (DDS), expanded criteria donor (ECD) definition, and resistive index (RI) were developed for pretransplant evaluation of donors. DDS and ECD are determined by a calculation of risk from donor variables, while RI is determined from the flow characteristics of kidneys during machine preservation (MP). This study was designed to compare DDS, ECD status, and RI as predictors of outcome after deceased donor transplantation. We also sought to determine whether DDS or ECD could identify the kidneys most likely to benefit from MP. METHODS: We retrospectively reviewed 48,952 deceased donor renal transplants reported to UNOS from 1997-2002. DDS (0-39 pts), ECD status (+ or -), and preservation technique (MP vs cold storage [CS]) were determined in all cases. RI during MP was studied in a single-center cohort of 425 transplants. RESULTS: DDS was superior to ECD status and RI in its correlation with early and late renal function after transplantation. DDS identified a subgroup of ECD- kidneys, those with DDS ≥20 pts, that functioned significantly below expectation and similarly to ECD+ kidneys. The benefits of MP, which include improved early graft function and a trend toward longer graft survival, were greatest in kidneys with DDS ≥20 pts. CONCLUSIONS: DDS was the best predictor of outcome after deceased donor renal transplantation and may be useful in identifying the kidneys most likely to benefit from MP.
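DDS and ECD are scored from donor variables, whereas RI is read off the perfusion pump; it is commonly taken as perfusion pressure divided by flow. A sketch under that common definition (the units and the example reading are assumptions for illustration, not values from the study):

```python
def resistive_index(mean_pressure_mmhg, flow_ml_min):
    """Renal resistance during machine perfusion: pressure / flow.

    Commonly reported in mmHg per mL/min; lower values indicate freer
    flow through the graft during pumping.
    """
    return mean_pressure_mmhg / flow_ml_min

# Illustrative reading: 30 mmHg mean pressure at 100 mL/min flow.
print(f"RI = {resistive_index(30.0, 100.0):.2f} mmHg per mL/min")  # 0.30
```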


Subject(s)
Graft Survival; Kidney Transplantation; Kidney; Organ Preservation/standards; Tissue Donors; Cadaver; Humans; Perfusion; Prognosis; Quality Control; Refrigeration
17.
Can J Surg; 48(2): 123-30, 2005 Apr.
Article in English | MEDLINE | ID: mdl-15887792

ABSTRACT

UNLABELLED: Shortages of cadaveric kidneys for transplant into rising numbers of patients with end-stage renal failure have increased the demand for kidneys from live donors. The morbidity associated with traditional open donor nephrectomy (ODN) may discourage many candidates; the newer laparoscopic technique has been promoted as having less morbidity. OBJECTIVES: To evaluate outcomes of hand-assisted laparoscopic nephrectomies (HALN) and prospectively compare HALN with ODN. METHODS: After retrospectively reviewing donor and recipient outcomes in 33 HALN (December through August 2000), we prospectively compared another 47 with 30 ODN performed from September 2000 through April 2001. RESULTS: All 80 HALN were successful, with no conversions to an open procedure. Four donors experienced surgery-related complications: one each of wound infection, retroperitoneal hematoma, prolonged ileus, and early small-bowel obstruction. Two recipients had ureteral complications (1 stricture, 1 leak); 5 experienced delayed graft function, 2 requiring dialysis; and 2 kidneys were lost to infarction. The prospective comparison showed that the operative time for HALN (mean 184 min, standard deviation [SD] 39 min) was significantly longer than for ODN (143 [SD 27] min; p < 0.01), but HALN resulted in less blood loss (p < 0.05). Times to warm ischemia/early graft function, resumption of oral intake/first bowel movement, and hospital discharge were similar. The abdominal-wall laxity and loss of cutaneous sensation from the flank incision experienced by many ODN patients were uncommon in the HALN group. Three months after nephrectomy, donor complaints of incisional pain were less common after HALN (p < 0.01). CONCLUSIONS: HALN had good outcomes for donors and recipients, with quicker, more complete recoveries 3 months afterward.
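The operative-time comparison can be checked from the published summary statistics alone. A sketch using Welch's t-test on the reported means and SDs, assuming the prospective group sizes of 47 HALN and 30 ODN (the paper's own test may have differed):

```python
from scipy.stats import ttest_ind_from_stats

# Reported operative times: HALN 184 (SD 39) min vs ODN 143 (SD 27) min.
stat, p = ttest_ind_from_stats(mean1=184, std1=39, nobs1=47,
                               mean2=143, std2=27, nobs2=30,
                               equal_var=False)  # Welch's unequal-variance test
print(f"t = {stat:.2f}, p = {p:.5f}")  # consistent with the reported p < 0.01
```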


Subject(s)
Laparoscopy; Nephrectomy/methods; Adult; Humans; Length of Stay; Living Donors; Male; Nephrectomy/adverse effects; Prospective Studies; Recovery of Function
18.
Clin Transpl; 111-7, 2005.
Article in English | MEDLINE | ID: mdl-17424729

ABSTRACT

The goals and outcomes of immunosuppression in renal transplantation have changed significantly over the last 30 years. When graft survival rates were relatively low and acute rejection was frequent in the early era of transplantation, the goal of immunosuppression was to improve survival and reduce the rate of acute rejection. Today, with excellent graft survival rates and a low incidence of acute rejection, the goal has shifted toward not only eliminating acute rejection but also reducing the side effects of medications and maintaining long-term graft function by decreasing chronic nephropathy. Between September 1982 and December 2004, 3,211 primary kidney transplant procedures were performed at The Ohio State University. We excluded from analysis all combined transplants as well as patients who were involved in clinical research protocols. Our immunosuppressive protocol changed substantially over this 24-year period, which can be divided into 5 eras, each defined by a distinct immunosuppressive protocol that produced an incremental improvement in patient and graft survival rates. In the present study, the incidence of acute rejection episodes and graft survival in each era are compared in kidney-only transplant recipients, demonstrating the substantial improvement in results achieved over the past 24 years, and future directions are discussed.


Subject(s)
Graft Survival/immunology; Immunosuppression Therapy/methods; Kidney Transplantation/immunology; Acute Disease; Cadaver; Graft Rejection/epidemiology; Hospitals, University; Humans; Immunosuppression Therapy/trends; Kidney Transplantation/mortality; Length of Stay; Living Donors; Ohio; Survival Analysis; Time Factors; Tissue Donors
20.
J Comput Assist Tomogr; 28(5): 613-6, 2004.
Article in English | MEDLINE | ID: mdl-15480033

ABSTRACT

Liver transplant patients who present with abdominal pain after removal of the T-tube can be initially evaluated by contrast-enhanced magnetic resonance cholangiography (CEMRC) instead of abdominal computed tomography and hepatobiliary scintigraphy. In this article, we describe 3 liver transplant patients who were evaluated by CEMRC after removal of the T-tube. CEMRC successfully identified the presence, location, and extent of bile duct leaks, and can be performed as a diagnostic study in patients with suspected bile duct leaks.


Subject(s)
Bile Duct Diseases/diagnostic imaging; Cholangiopancreatography, Magnetic Resonance/methods; Device Removal; Edetic Acid/analogs & derivatives; Liver Transplantation; Pyridoxal Phosphate/analogs & derivatives; Adult; Bile Duct Diseases/etiology; Contrast Media; Drainage; Humans; Liver Transplantation/adverse effects; Male; Middle Aged; Ohio; Radiographic Image Interpretation, Computer-Assisted; Tomography, X-Ray Computed