Results 1 - 20 of 55
1.
Clin Transplant; 38(5): e15319, 2024 May.
Article in English | MEDLINE | ID: mdl-38683684

ABSTRACT

OBJECTIVE: Longer end-stage renal disease time has been associated with inferior kidney transplant outcomes. However, the contribution of transplant evaluation is uncertain. We explored the relationship between time from evaluation to listing (ELT) and transplant outcomes. METHODS: This retrospective study included 2535 adult kidney transplants from 2000 to 2015. Kaplan-Meier survival curves, log-rank tests, and Cox regression models were used to compare transplant outcomes. RESULTS: Patient survival for both deceased donor (DD) recipients (p < .001) and living donor (LD) recipients (p < .0001) was significantly higher when ELT was less than 3 months. The risks of ELT appeared to be mediated by other risks in DD recipients, as adjusted models showed no associated risk of graft loss or death in DD recipients. For LD recipients, ELT remained a risk factor for patient death after covariate adjustment. Each month of ELT was associated with an increased risk of death (HR = 1.021, p = .04) but not graft loss in LD recipients in adjusted models. CONCLUSIONS: Kidney transplant recipients with longer ELT times had higher rates of death after transplant, and ELT was independently associated with an increased risk of death for LD recipients. Investigations on the impact of pretransplant evaluation on post-transplant outcomes can inform transplant policy and practice.


Subject(s)
Graft Survival, Kidney Failure, Chronic, Kidney Transplantation, Waiting Lists, Humans, Kidney Transplantation/mortality, Kidney Transplantation/adverse effects, Female, Male, Retrospective Studies, Middle Aged, Kidney Failure, Chronic/surgery, Follow-Up Studies, Risk Factors, Waiting Lists/mortality, Prognosis, Survival Rate, Adult, Graft Rejection/etiology, Graft Rejection/mortality, Tissue Donors/supply & distribution, Glomerular Filtration Rate, Kidney Function Tests, Living Donors/supply & distribution, Tissue and Organ Procurement, Time Factors, Postoperative Complications
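
For readers who want to reproduce this style of adjusted analysis, here is a minimal sketch of a Cox model with evaluation-to-listing time as a covariate, using the lifelines library on synthetic data; all column names and covariates (elt_months, age, time_to_death, died) are illustrative stand-ins, not the study's actual variables.

```python
# Sketch of an adjusted Cox model for evaluation-to-listing (ELT) time,
# in the spirit of the analysis above. Synthetic data; hypothetical columns.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "elt_months": rng.exponential(4.0, n),          # months from evaluation to listing
    "age": rng.normal(50, 13, n),                   # one illustrative confounder
    "time_to_death": rng.exponential(10.0, n),      # follow-up time (years)
    "died": (rng.random(n) < 0.3).astype(int),      # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_death", event_col="died")
cph.print_summary()  # exp(coef) for elt_months ~ per-month HR (1.021 in the study)
```
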
2.
J Surg Res; 248: 69-81, 2020 04.
Article in English | MEDLINE | ID: mdl-31865161

ABSTRACT

BACKGROUND: Kidneys from acute renal failure (ARF), expanded criteria donors (ECD), and donation after cardiac death (DCD) donors are often discarded due to concerns for delayed graft function (DGF) and graft failure. Induction immunosuppression may be used to minimize these risks, but practices vary widely. Furthermore, little is known regarding national outcomes of transplant recipients receiving induction immunosuppression for receipt of high-risk kidneys. MATERIALS AND METHODS: In a center-level retrospective study, 115,485 deceased donor transplants from the Scientific Registry of Transplant Recipients from January 2003 to June 2016 were evaluated. Patients who received induction immunosuppression, including lymphocyte immune globulin, muromonab CD-3, IL-1 receptor antagonist, anti-thymocyte globulin, daclizumab, basiliximab, alemtuzumab, and rituximab, were included. Associations of center-level induction use with acute rejection in the first post-transplant year, graft failure, and patient mortality were evaluated using multivariable Cox and logistic regression. RESULTS: Among all kidneys, an increasing percentage of center-level induction was associated with lower risk of graft failure, acute rejection, and patient mortality. In recipients of ARF kidneys, the beneficial association of induction with graft failure and acute rejection was greater than in those who received non-ARF kidneys. A marginally greater benefit of induction was seen for acute rejection in ECD compared to standard criteria donor (SCD) recipients and for graft failure in DCD compared to donation after brain death (DBD) recipients. No benefit of induction was detected for patient and graft survival in ECD recipients, acute rejection in DCD recipients, or patient survival in DGF recipients. No difference in the benefit of induction was detected in any other comparison. CONCLUSIONS: While seemingly beneficial for recipients of all kidneys, induction has more robust associations with lower graft failure and acute rejection probability for recipients of ARF kidneys. Given the lack of observed benefit for ECD recipients, induction policies should be carefully considered in these patients.


Subject(s)
Death, Immunosuppression Therapy, Kidney Transplantation, Transplantation Immunology, Adult, Allografts, Female, Humans, Male, Middle Aged, Retrospective Studies, Young Adult
3.
Lifetime Data Anal; 26(3): 451-470, 2020 07.
Article in English | MEDLINE | ID: mdl-31576491

ABSTRACT

In evaluating the benefit of a treatment on survival, it is often of interest to compare post-treatment survival with the survival function that would have been observed in the absence of treatment. In many practical settings, treatment is time-dependent in the sense that subjects typically begin follow-up untreated, with some going on to receive treatment at some later time point. In observational studies, treatment is not assigned at random and, therefore, may depend on various patient characteristics. We have developed semi-parametric matching methods to estimate the average treatment effect on the treated (ATT) with respect to survival probability and restricted mean survival time. Matching is based on a prognostic score which reflects each patient's death hazard in the absence of treatment. Specifically, each treated patient is matched with multiple as-yet-untreated patients with similar prognostic scores. The matched sets do not need to be of equal size, since each matched control is weighted in order to preserve risk score balancing across treated and untreated groups. After matching, we estimate the ATT non-parametrically by contrasting pre- and post-treatment weighted Nelson-Aalen survival curves. A closed-form variance is proposed and shown to work well in simulation studies. The proposed methods are applied to national organ transplant registry data.


Subject(s)
Survival Analysis, Treatment Outcome, Computer Simulation, Humans, Prognosis, Statistics, Nonparametric
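
The estimator described above contrasts weighted Nelson-Aalen curves between treated patients and their matched, as-yet-untreated controls. Below is a hand-rolled sketch of the weighted Nelson-Aalen building block on synthetic inputs; the paper's prognostic-score matching and closed-form variance are not reproduced.

```python
# Weighted Nelson-Aalen cumulative hazard: the building block of the
# matched ATT contrast described above. Inputs are hypothetical:
# t = event/censoring times, e = event indicators, w = matching weights.
import numpy as np

def weighted_nelson_aalen(t, e, w):
    """Return (event times, cumulative hazard) using weights w."""
    order = np.argsort(t)
    t, e, w = t[order], e[order], w[order]
    times = np.unique(t[e == 1])
    H, h = [], 0.0
    for s in times:
        at_risk = w[t >= s].sum()              # weighted risk set Y(s)
        events = w[(t == s) & (e == 1)].sum()  # weighted events d(s)
        h += events / at_risk
        H.append(h)
    return times, np.array(H)

rng = np.random.default_rng(0)
t = rng.exponential(5.0, 300)
e = rng.random(300) < 0.7
w = np.ones(300)  # matched-control weights would go here
times, H = weighted_nelson_aalen(t, e, w)
surv = np.exp(-H)  # pre- vs. post-treatment curves are contrasted for the ATT
```
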
4.
Clin Transplant; 33(6): e13542, 2019 06.
Article in English | MEDLINE | ID: mdl-30887610

ABSTRACT

BACKGROUND: Intraoperative fluid management during laparoscopic donor nephrectomy (LDN) may have a significant effect on donor and recipient outcomes. We sought to quantify variability in fluid management and investigate its impact on donor and recipient outcomes. METHODS: A retrospective review of patients who underwent LDN from July 2011 to January 2016 with paired kidney recipients at a single center was performed. Patients were divided into tertiles of intraoperative fluid management (standard, high, and aggressive). Donor and recipient demographics, intraoperative data, and postoperative outcomes were analyzed. RESULTS: Overall, 413 paired kidney donors and recipients were identified. Intraoperative fluid management (mL/h) was highly variable with no correlation to donor weight (kg) (R = 0.017). The aggressive fluid management group had significantly lower recipient creatinine levels on postoperative day 1. However, no significant differences were noted in creatinine levels out to 6 months between groups. No significant differences were noted in recipient postoperative complications, graft loss, and death. There was a significant increase (P < 0.01) in the number of total donor complications in the aggressive fluid management group. CONCLUSIONS: Aggressive fluid management during LDN does not improve recipient outcomes and may worsen donor outcomes compared to standard fluid management.


Subject(s)
Fluid Therapy/mortality, Intraoperative Care/mortality, Kidney Failure, Chronic/surgery, Kidney Transplantation/mortality, Laparoscopy/mortality, Nephrectomy/mortality, Postoperative Complications/mortality, Adult, Female, Follow-Up Studies, Glomerular Filtration Rate, Humans, Kidney Function Tests, Living Donors, Male, Middle Aged, Prognosis, Retrospective Studies, Risk Factors, Survival Rate, Tissue and Organ Harvesting, Transplant Recipients
5.
Clin Transplant; 32(3): e13189, 2018 03.
Article in English | MEDLINE | ID: mdl-29292535

ABSTRACT

OBJECTIVE: Peritoneal dialysis (PD) patients have equivalent or slightly better kidney transplant outcomes when compared to hemodialysis (HD) patients. However, given the risk for postoperative infection, we sought to determine the risk factors for PD catheter-associated infections in patients who do not have the PD catheter removed at the time of engraftment. METHODS: Demographic and outcomes data were collected from 313 sequential PD patients who underwent kidney transplant from 2000 to 2015. Risk factors for postoperative peritonitis were analyzed using logistic regression. RESULTS: Of 329 patients with PD catheters at transplant, 16 had the catheter removed at engraftment. Of the remaining 313 patients, 8.9% suffered post-transplant peritonitis. On univariate analysis, patients with peritonitis were significantly more likely to have used the PD catheter or HD within 6 weeks after transplant. Multivariate analysis had similar findings, with increased risk for those using the PD catheter after transplant and a trend for those who underwent HD only within 6 weeks of transplant. CONCLUSION: These results suggest that delayed graft function requiring any type of dialysis is associated with increased post-transplant peritonitis risk.


Subject(s)
Catheters, Indwelling/adverse effects, Kidney Failure, Chronic/surgery, Kidney Transplantation/adverse effects, Peritoneal Dialysis/adverse effects, Peritonitis/etiology, Postoperative Complications, Adult, Female, Follow-Up Studies, Glomerular Filtration Rate, Humans, Kidney Function Tests, Male, Middle Aged, Prognosis, Retrospective Studies, Risk Factors
6.
J Hepatol; 67(3): 517-525, 2017 09.
Article in English | MEDLINE | ID: mdl-28483678

ABSTRACT

BACKGROUND & AIM: The goal of organ allocation is to distribute a scarce resource equitably to the sickest patients. In the United States, the Model for End-stage Liver Disease (MELD) is used to allocate livers for transplantation. Patients with greater MELD scores are at greater risk of death on the waitlist and are prioritized for liver transplant (LT). The MELD is capped at 40, however, and patients with calculated MELD scores >40 are not further prioritized despite increased mortality. We aimed to evaluate waitlist and post-transplant survival stratified by MELD to determine outcomes in patients with MELD >40. METHODS: Using United Network for Organ Sharing data, we identified patients listed for LT from February 2002 through December 2012. Waitlist candidates with MELD ≥40 were followed for 30 days or until the earliest occurrence of death or transplant. RESULTS: Of 65,776 waitlisted patients, 3.3% had MELD ≥40 at registration, and an additional 7.3% had MELD scores increase to ≥40 after waitlist registration. A total of 30,369 (46.2%) underwent LT, of whom 2,615 (8.6%) had MELD ≥40 at transplant. Compared to MELD 40, the hazard ratio of death within 30 days of registration was 1.4 (95% CI 1.2-1.6) for patients with MELD 41-44, 2.6 (95% CI 2.1-3.1) for MELD 45-49, and 5.0 (95% CI 4.1-6.1) for MELD ≥50. There was no difference in 1- and 3-year survival between patients transplanted with MELD >40 and those with MELD = 40. A survival benefit associated with LT was seen as MELD increased above 40. CONCLUSIONS: Patients with MELD >40 have significantly greater waitlist mortality but comparable post-transplant outcomes to patients with MELD = 40 and, therefore, should be given priority for LT. Uncapping the MELD will allow more equitable organ distribution aligned with the principle of prioritizing patients most in need. Lay summary: In the United States (US), organs for liver transplantation are allocated by an objective scoring system called the Model for End-stage Liver Disease (MELD), which aims to prioritize the sickest patients for transplant. The greater the MELD score, the greater the mortality without liver transplant. The MELD score, however, is artificially capped at 40 and thus actually disadvantages the sickest patients with end-stage liver disease. Analysis of the data advocates uncapping the MELD score to appropriately prioritize the patients most in need of a liver transplant.


Subject(s)
End Stage Liver Disease/surgery, Liver Transplantation, Tissue and Organ Procurement, Adolescent, Adult, Aged, Aged, 80 and over, Female, Humans, Liver Transplantation/mortality, Male, Middle Aged, Waiting Lists, Young Adult
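
To make the cap concrete, here is a sketch of the commonly cited UNOS MELD formula with and without the cap at 40. The coefficients and lab bounds follow the standard published equation as best recalled here; this is an illustration of the capping behavior, not the allocation system's implementation.

```python
# Illustration of the MELD cap discussed above. Lab values are floored at
# 1.0 and creatinine is capped at 4.0, per the commonly cited formula.
import math

def meld(bilirubin, inr, creatinine, cap=True):
    bilirubin = max(bilirubin, 1.0)
    inr = max(inr, 1.0)
    creatinine = min(max(creatinine, 1.0), 4.0)
    score = 10 * (0.378 * math.log(bilirubin)
                  + 1.120 * math.log(inr)
                  + 0.957 * math.log(creatinine)
                  + 0.643)
    score = round(score)
    return min(score, 40) if cap else score

# A very sick patient: calculated MELD well above 40, allocated as 40.
print(meld(30, 4.5, 4.2, cap=False), meld(30, 4.5, 4.2))  # 49 40
```
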
7.
Liver Transpl; 22(1): 71-9, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26069168

ABSTRACT

The survival benefit of simultaneous liver-kidney transplantation (SLKT) over liver transplantation alone (LTA) is unclear from the current literature. Additionally, the role of donor kidney quality, measured by the kidney donor risk index (KDRI), in the survival benefit of SLKT has not been studied. We compared survival benefit after SLKT and LTA among recipients with similar pretransplant renal dysfunction using novel methodology, specifically with respect to survival probability and area under the survival curve, by dialysis status and KDRI. Data were obtained from the Scientific Registry of Transplant Recipients. The study cohort included patients with pre-liver transplantation (LT) renal dysfunction who were wait-listed and received either a SLKT (n = 1326) or a LTA (n = 4283) between March 1, 2002 and December 31, 2009. Inverse probability of treatment weighting (IPTW)-adjusted SLKT and LTA survival curves, along with the 5-year area under the survival curve, were computed by dialysis status at transplant. The difference in the area under the curve represents the average additional survival time gained via SLKT over LTA. For patients not on dialysis, SLKT resulted in a significant 3.7-month gain in 5-year mean posttransplant survival time. The decrease in mortality rate differed significantly by KDRI, and an estimated 76% of SLKT recipients received a kidney with a KDRI sufficiently low to confer a survival benefit. The mortality decrease for SLKT was concentrated in the first year after transplant. The difference between SLKT and LTA in 5-year mean posttransplant survival time was 1.4 months and was nonsignificant for patients on dialysis. In conclusion, the propensity score-adjusted survival among SLKT and LTA recipients was similar for those who were on dialysis at LT. Although statistically significant, the survival advantage of SLKT over LTA was of marginal clinical significance among patients not on dialysis and occurred only if the donor kidney was of sufficient quality. These results should be considered in the ongoing debate regarding the allocation of kidneys to extra-renal transplant candidates.


Subject(s)
Kidney Transplantation/mortality, Liver Failure/complications, Liver Transplantation/mortality, Renal Insufficiency/complications, Female, Humans, Liver Failure/surgery, Male, Propensity Score, Proportional Hazards Models, Renal Insufficiency/surgery, Tissue Donors, United States/epidemiology
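
The "area under the survival curve" comparison above is a restricted mean survival time (RMST) contrast. Below is a minimal sketch on synthetic data, with Kaplan-Meier fits standing in for the paper's IPTW-adjusted curves; the weights argument is where the IPTW weights would enter.

```python
# RMST gain of SLKT over LTA at a 5-year horizon, sketched with lifelines.
# Synthetic data; in the paper the fits would use IPTW weights.
import numpy as np
from lifelines import KaplanMeierFitter

def rmst(kmf, horizon):
    """Area under the fitted step survival curve up to `horizon`."""
    sf = kmf.survival_function_
    t = sf.index.to_numpy(dtype=float)
    s = sf.iloc[:, 0].to_numpy()
    keep = t < horizon
    t = np.append(t[keep], horizon)
    return float(np.sum(np.diff(t) * s[keep]))

rng = np.random.default_rng(0)
t_slkt, t_lta = rng.exponential(6.0, 500), rng.exponential(5.0, 500)
e_slkt = (rng.random(500) < 0.8).astype(int)
e_lta = (rng.random(500) < 0.8).astype(int)
w = np.ones(500)  # IPTW weights would go here

kmf_slkt = KaplanMeierFitter().fit(t_slkt, e_slkt, weights=w)
kmf_lta = KaplanMeierFitter().fit(t_lta, e_lta, weights=w)
gain_months = 12 * (rmst(kmf_slkt, 5) - rmst(kmf_lta, 5))
print(gain_months)
```
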
8.
Transfusion; 56(12): 3073-3080, 2016 12.
Article in English | MEDLINE | ID: mdl-27601087

ABSTRACT

BACKGROUND: Therapeutic plasma exchange (TPE) is increasingly used for treatment of antibody-mediated rejection (AMR) after solid organ transplants. There is concern that TPE may increase the risk of bleeding, although data are limited. After TPE, clot-based coagulation tests may not accurately represent the levels of coagulation factors due to the effect of citrate. We investigated protein levels of fibrinogen using an antigen detection method (FibAg) and correlated results with a clot-based fibrinogen activity test (Fib). STUDY DESIGN AND METHODS: Nine kidney transplant recipients who received TPE for AMR were investigated. Fib, FibAg, prothrombin time/international normalized ratio (PT/INR), partial thromboplastin time (PTT), coagulation factor X chromogenic activity (CFX), and ionized calcium (iCa) were measured pre- and post-TPE and 1, 3, 6, 9, 24, and 48 hours after the first TPE. RESULTS: The mean Fib/FibAg ratio before TPE was 1.08; therefore, all Fib values were normalized (n) by dividing by 1.08. Overall, the mean normalized Fib (nFib)/FibAg ratio at post-TPE was 0.89 and returned to close to 1.0 at 6 hours after the first TPE. Decreases in nFib, FibAg, and CFX and increases in PT/INR and PTT post-TPE were observed. The lowest Fib, FibAg, CFX, platelet, and iCa levels were still at levels that would be considered sufficient for hemostasis at all time points. CONCLUSION: The mean nFib/FibAg ratio after TPE was 0.89 and normalized within 6 hours, which demonstrates a persistent effect of citrate for up to 6 hours. Therefore, clot-based tests such as PT/INR and PTT may likewise be falsely elevated for up to 6 hours after TPE due to the citrate effect.


Subject(s)
Blood Coagulation/drug effects, Citric Acid/pharmacology, Kidney Transplantation/adverse effects, Plasma Exchange/adverse effects, Blood Coagulation Tests/standards, Fibrinogen/analysis, Hemostasis/drug effects, Humans, Time Factors
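
A worked example of the normalization step above: the clot-based fibrinogen value is divided by the pre-TPE mean Fib/FibAg ratio (1.08), so that a post-TPE nFib/FibAg ratio below 1.0 isolates the citrate effect. Values are illustrative, not patient data.

```python
# Normalized fibrinogen ratio as described above; numbers are made up.
PRE_TPE_RATIO = 1.08  # mean Fib/FibAg before TPE, from the study

def nfib_ratio(fib_mg_dl, fibag_mg_dl):
    """nFib/FibAg: clot-based Fib normalized by the pre-TPE ratio."""
    return (fib_mg_dl / PRE_TPE_RATIO) / fibag_mg_dl

print(round(nfib_ratio(150, 156), 2))  # ~0.89: clot-based assay reads low post-TPE
```
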
9.
Transfusion; 55(4): 727-35; quiz 726, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25385678

ABSTRACT

BACKGROUND: Donor-specific antibodies (DSAs) to HLA antigens can cause acute antibody-mediated rejection (AMR) after kidney transplantation (Txp). Therapeutic plasma exchange (TPE) has been used for AMR treatment; however, DSA reduction rates are inconsistent. We investigated DSA reduction rates by HLA specificity and clinical outcome. STUDY DESIGN AND METHODS: Sixty-four courses of TPE for 56 kidney Txp recipients with high DSA were investigated. Dates of TPE procedures and Txp; patients' age, sex, race, and creatinine (Cr); and mean fluorescent intensity (MFI) of DSA were retrieved. MFI reduction rates after one to three and after four to six TPE procedures were calculated by HLA DSA specificity in each patient, and the mean reduction rates were compared. The relationship between TPE treatment, MFI or Cr improvement rate, and graft age was also investigated. RESULTS: Patients received a mean of 6.0 TPE procedures. Most received intravenous immunoglobulin after TPE and immunosuppressives. Forty-two cases (65.6%) had DSA to HLA Class I and 54 cases (84.4%) to Class II, including 32 cases (50.0%) to both. Mean MFI reduction rates after one to three and after four to six TPE procedures were 25.7 and 37.1% in HLA Class I, 25.1 and 34.2% in Class II, and 14.3 and 19.9% in DR51-53. The mean Cr improvements at the end of TPE and at 3 and 6 months after TPE were 3.41, -0.37, and -0.72%, respectively. CONCLUSION: Six TPE procedures decreased DSA more than three TPE procedures, but the reduction rate was lower for the second three procedures than for the first three. Although the mean Cr improvement was minimal, the treatment has good potential to stop further deterioration of kidney function. A better Cr improvement rate correlated with graft age.


Subject(s)
Graft Rejection/therapy, HLA Antigens/immunology, Isoantibodies/immunology, Kidney Transplantation, Plasma Exchange, Antibody Specificity, Antilymphocyte Serum/therapeutic use, Combined Modality Therapy, Creatinine/blood, Graft Rejection/blood, Graft Rejection/drug therapy, Graft Rejection/immunology, Humans, Immunoglobulins, Intravenous/therapeutic use, Immunosuppressive Agents/therapeutic use, Isoantibodies/blood, Plasmapheresis, Retrospective Studies, T-Lymphocytes/immunology, Treatment Outcome
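
The MFI reduction rate reported above is a plain percent change from the pre-TPE mean fluorescent intensity; a one-line illustration with made-up numbers:

```python
# Percent MFI reduction, as used in the study above; values are illustrative.
def mfi_reduction(mfi_before, mfi_after):
    return 100 * (mfi_before - mfi_after) / mfi_before

print(mfi_reduction(10000, 7430))  # 25.7, matching the Class I mean after 1-3 TPE
```
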
12.
Kidney Int; 84(2): 390-6, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23615503

ABSTRACT

Chronic opioid usage (COU) for analgesia is common among patients with end-stage renal disease. To test whether a prior history of COU negatively affects post-kidney transplant outcomes, we retrospectively examined clinical outcomes in adult kidney transplant patients. Among 1064 adult kidney transplant patients, 452 (42.5%) reported the presence of various body pains and 108 (10.2%) reported a prior history of COU. While overall death or kidney graft loss was not statistically different between patients with and without a history of COU, the cumulative mortality rate at 1, 3, and 5 years after transplantation, and during the entire study period, appeared significantly higher for patients with a history of COU than for those without (6.5, 18.5, and 20.4% vs. 3.2, 7.5, and 12.7%, respectively). Multivariate Cox regression analysis adjusted for potential confounding factors in the entire cohort and Cox regression analysis in 1:3 propensity-score matched cohorts suggest that a positive history of COU was significantly associated with nearly a 1.6- to 2-fold increase in the risk of death (hazard ratio 1.65, 95% confidence interval 1.04-2.60, and hazard ratio 1.92, 95% confidence interval 1.08-3.42, respectively). Thus, a history of chronic opioid usage prior to transplantation appears to be associated with increased mortality risk. Additional studies are warranted to confirm the observed association and to understand the mechanisms.


Subject(s)
Analgesics, Opioid/adverse effects, Chronic Pain/drug therapy, Kidney Failure, Chronic/surgery, Kidney Transplantation/mortality, Adult, Analgesics, Opioid/administration & dosage, Chi-Square Distribution, Drug Administration Schedule, Female, Graft Survival, Humans, Kidney Transplantation/adverse effects, Logistic Models, Male, Middle Aged, Multivariate Analysis, Propensity Score, Proportional Hazards Models, Retrospective Studies, Risk Factors, Time Factors, Treatment Outcome
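
Here is a sketch of the 1:3 propensity-score matching described above, using scikit-learn on synthetic data. The covariates are hypothetical stand-ins for the study's confounders, and matching with replacement is used for brevity; the matched cohort would then feed the Cox model.

```python
# 1:3 nearest-neighbor propensity-score matching (with replacement),
# sketched on synthetic data with hypothetical covariates.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(50, 12, n),
    "diabetes": rng.integers(0, 2, n),
    "cou_history": rng.random(n) < 0.1,  # ~10% exposed, as in the study
})

X = df[["age", "diabetes"]]
ps_model = LogisticRegression(max_iter=1000).fit(X, df["cou_history"])
df["pscore"] = ps_model.predict_proba(X)[:, 1]

treated = df[df["cou_history"]]
controls = df[~df["cou_history"]]
nn = NearestNeighbors(n_neighbors=3).fit(controls[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = pd.concat([treated, controls.iloc[idx.ravel()]])  # 1:3 matched cohort
```
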
13.
HPB (Oxford); 15(4): 286-93, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23458449

ABSTRACT

OBJECTIVES: Steroids are a mainstay of treatment in orthotopic liver transplantation (OLT) and are associated with significant morbidity. This trial was conducted to assess the efficacy of steroid avoidance. METHODS: Patients undergoing OLT between June 2002 and April 2005 were entered into a prospective, randomized trial of complete steroid avoidance and followed until November 2011. Recipients received either standard therapy (n = 50) or complete steroid avoidance (n = 50). Analyses were performed on an intention-to-treat basis. The mean follow-up of all recipients was 2095 ± 117 days. Sixteen (32%) recipients randomized to the steroid avoidance group ultimately received steroids for clinical indications. RESULTS: Incidences of diabetes and hypertension prior to and after OLT were similar in both groups, as was the incidence of rejection. Patient and graft survival rates at 1, 3, and 5 years were lower in the steroid avoidance group than in the standard therapy group (patient survival: 1-year, 80% versus 86%; 3-year, 68% versus 76%; 5-year, 60% versus 72%; graft survival: 1-year, 76% versus 76%; 3-year, 64% versus 74%; 5-year, 56% versus 72%), but the differences were not statistically significant. CONCLUSIONS: Complete steroid avoidance provides liver transplant recipients with minimal benefit and appears to result in a concerning trend towards decreased graft and recipient survival. The present data support the use of at least a short course of steroids after liver transplantation.


Subject(s)
Graft Rejection/prevention & control, Graft Survival, Immunosuppressive Agents/therapeutic use, Liver Transplantation/methods, Adult, Diabetes Mellitus, Type 1/complications, Female, Follow-Up Studies, Humans, Hypertension/complications, Male, Middle Aged, Prospective Studies, Risk Factors, Survival Analysis, Treatment Outcome
14.
J Surg Res; 174(1): 166-75, 2012 May 01.
Article in English | MEDLINE | ID: mdl-21276984

ABSTRACT

BACKGROUND: CC chemokine receptor 5 (CCR5) plays an important role in mediating inflammation. We examined the effect of CCR5 on the immune response to adenovirus vectors and graft function in an islet transplant model. MATERIALS AND METHODS: Syngeneic wild-type (WT) or CCR5-deficient (KO) mouse islets transduced with adenovirus encoding β-gal were transplanted under the renal capsule. After transplant, blood glucose, glucose tolerance, graft cellular infiltration, transgene and chemokine/receptor expression, and the systemic anti-adenoviral/anti-β-gal immune response were evaluated. RESULTS: Diabetes was reversed within 1 d in both WT and KO recipients of untransduced islets, while islets transduced with adenovirus failed to reverse diabetes until 10 d post-transplant in WT recipients or even longer (>15 d) in KO recipients (P < 0.05). A profound infiltration of CD4(+) and CD8(+) cells and macrophages was observed in both WT and KO transduced grafts at 25 d. Though transgene expression was significantly reduced, insulin and β-gal expression persisted over 3 mo. Glucose tolerance was impaired in all grafts in KO recipients compared with untransduced grafts in WT recipients at 25 d post-transplant, but was equivalent at 3 mo. Early expression of CCR2 mRNA was increased in transduced grafts in both WT and KO recipients. No systemic antivector immunity was demonstrated in any recipient group. CONCLUSIONS: Transduction of islets with adenovirus causes significant local inflammation in islet grafts and impairs early graft function in CCR5-deficient recipients, but long-term graft function is preserved. Thus, the absence of CCR5 does not prevent the local immune response to adenovirus transduction, and vector-associated graft dysfunction is not mediated by CCR5.


Subject(s)
Adenoviridae/immunology, Islets of Langerhans Transplantation, Receptors, CCR5/physiology, Animals, Chemokine CCL5/physiology, Female, Genetic Vectors/immunology, Interferon-gamma/biosynthesis, Mice, Mice, Inbred C57BL, Mice, Knockout, Receptors, CCR5/deficiency, Transduction, Genetic
15.
Clin Transplant; 26(5): E536-43, 2012.
Article in English | MEDLINE | ID: mdl-23061763

ABSTRACT

Delayed graft function (DGF) is a common complication of deceased donor kidney transplantation with a negative impact on clinical outcomes. In a single-center retrospective analysis, we compared patient and kidney survival, early renal function, and the incidence of acute rejection during the first year among all adult deceased donor kidney transplant patients (January 1, 2000, to December 31, 2008) without DGF, with DGF requiring one-time dialysis, and with DGF requiring more than one-time dialysis. Of 831 adult kidney transplant patients, 74 (8.9%) required one-time and 134 (16.1%) more than one-time dialysis treatment post-transplantation, respectively. While DGF patients with one-time dialysis treatment had clinical outcomes comparable to those of patients without DGF, patients with DGF requiring more than one-time dialysis treatment had a 45% increased risk for death (HR 1.45, 95% CI 1.02, 2.05, p = 0.04) after adjustment for differences in demographic and baseline characteristics. Furthermore, DGF patients with a more than one-time dialysis requirement displayed significantly lower renal function after recovery (OR 0.32, 95% CI 0.21, 0.49, p < 0.001, for eGFR ≥ 60 mL/min) and a higher incidence of acute rejection during the first year (OR 1.66, 95% CI 1.11, 2.49, p = 0.015). Additional studies of therapeutic approaches to manage patients with prolonged DGF are needed.


Subject(s)
Delayed Graft Function, Graft Rejection/diagnosis, Kidney Transplantation/mortality, Renal Dialysis/mortality, Tissue Donors, Adult, Female, Follow-Up Studies, Graft Rejection/etiology, Graft Survival, Humans, Male, Middle Aged, Prognosis, Retrospective Studies, Survival Rate
16.
Transplant Direct; 8(7): e1343, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35747522

ABSTRACT

Recent events of racial injustice prompted us to study the potential impact of removing race from the kidney donor risk index (KDRI) calculator. Methods: We used Scientific Registry of Transplant Recipients data to analyze outcomes of 66 987 deceased-donor kidney transplants performed in the United States between 2010 and 2016. Graft failure (GF) was defined as death, return to dialysis, or requiring repeat transplant. We compared the original KDRI and a race-free KDRI (Black donor coefficient zeroed out in the KDRI formula) with respect to recategorization of perceived GF risk (based on KDPI categories: ≤20, 21-34, 35-85, ≥86), risk discrimination (using the C statistic), predictive accuracy (using the Brier score), and GF risk prediction (using Cox regression on time to GF). We used logistic regression to study the impact of donor race on discard probability. Results: There were 10 949 GFs (16.3% of recipients), and 1893 (17% of GFs) were among recipients of kidneys from Black donors. The use of the race-free KDRI resulted in reclassification of 49% of kidneys from Black donors into lower GF risk categories. The impact on GF risk discrimination was minimal, with a relative decrease in C statistic of 0.16% and a change in GF predictive accuracy of 0.07%. For a given recipient/donor combination, transplants from Black (compared with non-Black) donors are estimated to decrease predicted graft survival by 0.3%-3% at 1 y and by 1%-6% at 5 y. Kidneys from Black donors are significantly more likely to be discarded (odds ratio adjusted for KDRI except race = 1.24). We estimate that an equal discard probability for Black and non-Black donors would yield 70 additional kidney transplants annually from Black donors. Conclusions: Use of the race-free KDRI did not impact GF risk discrimination or predictive accuracy and may lower the discard of kidneys from Black donors. We recommend use of the race-free KDRI calculator, acknowledging the possibility of miscalculating GF risk in a small proportion of kidneys from Black donors.
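
Schematically, the race-free KDRI simply zeroes the Black-donor term in the KDRI linear predictor before exponentiating. A toy sketch follows; the 0.179 coefficient is the Black-donor term from the published KDRI equation as best recalled here, and the rest of the multi-factor linear predictor is collapsed into a single illustrative number.

```python
# Toy illustration of zeroing the Black-donor term in a KDRI-style index.
# Not the full KDRI formula: other donor factors are lumped into one value.
import math

BLACK_DONOR_BETA = 0.179  # published Black-donor coefficient (assumed here)

def kdri(xbeta_other_factors, black_donor, race_free=False):
    beta = 0.0 if race_free else (BLACK_DONOR_BETA if black_donor else 0.0)
    return math.exp(xbeta_other_factors + beta)

# Same donor profile, Black donor: original vs. race-free KDRI
print(kdri(0.20, True), kdri(0.20, True, race_free=True))
```
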

17.
Clin Transplant; 25(6): E592-8, 2011.
Article in English | MEDLINE | ID: mdl-21906173

ABSTRACT

With an increasing number of individuals with end-stage organ disease and the increasing success of organ transplantation, the demand for transplants has steadily increased. This growth has led to a greater need to utilize organs from as many donors as possible. As selection criteria have become less stringent to accommodate increasing demand, transplant outcomes are more strongly influenced by recipient and donor factors; thus, finding the right organ for the right recipient is more important than ever. The Ninth Annual American Society of Transplant Surgeons (ASTS) State-of-the-Art Winter Symposium, entitled "The Right Organ for the Right Recipient," addressed the matching of donor organs to appropriate recipients. Representative dilemmas in the matching of donor organs with recipients were discussed. These included the following: matching by donor and recipient risk characteristics; use of organs with risk for disease transmission; biologic incompatibility; use of organs from donors after cardiac death; the justification for combined organ transplants like liver-kidney and kidney-pancreas; and the role of allocation in facilitating the matching of donors and recipients. Regardless of the particular issue, decisions about donor-recipient matching should be evidence-based, practical, and made with the goal of maximizing organ utilization while still protecting individual patient interests.


Subject(s)
Graft Rejection/prevention & control, Organ Transplantation, Patient Selection, Tissue and Organ Procurement, Humans, Societies, Medical
18.
Transplantation; 105(12): 2596-2605, 2021 12 01.
Article in English | MEDLINE | ID: mdl-33950636

ABSTRACT

BACKGROUND: The 125I-iothalamate clearance and 99mTc diethylenetriamine-pentaacetic acid (99mTc-DTPA) split scan nuclear medicine studies are used among living kidney donor candidates to determine measured glomerular filtration rate (mGFR) and split scan ratio (SSR). The computerized tomography-derived cortical volume ratio (CVR) is a novel measurement of split kidney function and can be combined with predonation estimated GFR (eGFR) or mGFR to predict postdonation kidney function. Whether predonation SSR predicts postdonation kidney function better than predonation CVR and whether predonation mGFR provides additional information beyond predonation eGFR are unknown. METHODS: We performed a single-center retrospective analysis of 204 patients who underwent kidney donation between June 2015 and March 2019. The primary outcome was 1-y postdonation eGFR. Model bases were created from a measure of predonation kidney function (mGFR or eGFR) multiplied by the proportion that each nondonated kidney contributed to predonation kidney function (SSR or CVR). Multivariable elastic net regression with 1000 repetitions was used to determine the mean and 95% confidence interval of R2, root mean square error (RMSE), and proportion overprediction ≥15 mL/min/1.73 m2 between models. RESULTS: In validation cohorts, eGFR-CVR models performed best (R2, 0.547; RMSE, 9.2 mL/min/1.73 m2, proportion overprediction 3.1%), whereas mGFR-SSR models performed worst (R2, 0.360; RMSE, 10.9 mL/min/1.73 m2, proportion overprediction 7.2%) (P < 0.001 for all comparisons). CONCLUSIONS: These findings suggest that predonation CVR may serve as an acceptable alternative to SSR during donor evaluation and furthermore, that a model based on CVR and predonation eGFR may be superior to other methods.


Subject(s)
Kidney Transplantation, Nuclear Medicine, Glomerular Filtration Rate, Humans, Iodine Radioisotopes, Kidney/diagnostic imaging, Kidney Transplantation/adverse effects, Kidney Transplantation/methods, Living Donors, Retrospective Studies, Tomography, X-Ray Computed
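
A sketch of the model family evaluated above: a base term (predonation eGFR times the retained kidney's CVR share) plus covariates, fit with elastic net regression. Synthetic data and hypothetical column names; the paper's 1000-repetition evaluation is reduced to a single train/test split.

```python
# eGFR x CVR model base fit with elastic net, reporting R2 and RMSE,
# in the spirit of the study above. Synthetic data; hypothetical columns.
import numpy as np
import pandas as pd
from sklearn.linear_model import ElasticNetCV
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 204  # cohort size from the study
df = pd.DataFrame({
    "pre_egfr": rng.normal(95, 15, n),          # predonation eGFR
    "cvr_retained": rng.normal(0.5, 0.03, n),   # retained kidney's CVR share
    "age": rng.normal(45, 11, n),
})
df["base"] = df["pre_egfr"] * df["cvr_retained"]
df["egfr_1yr"] = 1.3 * df["base"] - 0.1 * df["age"] + rng.normal(0, 9, n)

X, y = df[["base", "age"]], df["egfr_1yr"]
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = ElasticNetCV(cv=5).fit(Xtr, ytr)
pred = model.predict(Xte)
print(r2_score(yte, pred), mean_squared_error(yte, pred) ** 0.5)  # R2, RMSE
```
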
19.
Clin Transplant; 24(1): 23-8, 2010.
Article in English | MEDLINE | ID: mdl-19919609

ABSTRACT

The evolution of organ transplantation has produced results so successful that many transplant programs commonly see recipients with medical risks, which in the past, would have prohibited transplantation. The Eighth Annual American Society of Transplant Surgeons State-of-the-Art Winter Symposium focused on the high-risk recipient. The assessment of risk has evolved over time, as transplantation has matured. The acceptance of risk associated with a given candidate today is often made in consideration of the relative value of the organ to other candidates, the regulatory environment, and philosophical notions of utility, equity, and fairness. In addition, transplant programs must balance outcomes, transplant volume, and the costs of organ transplantation, which are impacted by high-risk recipients. Discussion focused on various types of high-risk recipients, such as those with coronary artery disease, morbid obesity, and hepatitis C; strategies to reduce risk, such as down-staging of hepatocellular carcinoma and treatment of pulmonary hypertension; the development of alternatives to transplantation; and the degree to which risk can or should be used to define candidate selection. These approaches can modify the impact of recipient risk on transplant outcomes and permit transplantation to be applied successfully to a greater variety of patients.


Subject(s)
Organ Transplantation, Donor Selection, Humans, Living Donors, Organ Transplantation/adverse effects, Organ Transplantation/economics, Organ Transplantation/methods, Patient Selection, Risk Assessment, Risk Factors
20.
Ann Transplant; 25: e922178, 2020 Sep 15.
Article in English | MEDLINE | ID: mdl-32929057

ABSTRACT

BACKGROUND Peripheral vascular disease and iliac arterial calcification are prevalent in kidney transplant candidates and jeopardize graft outcomes. We report our experience with computed tomography (CT) screening for iliac arterial calcification. MATERIAL AND METHODS We retrospectively reviewed electronic medical records of 493 renal transplant candidates from protocol initiation in 2014. Non-contrast CT was performed or retrospectively reviewed if any of the following criteria were present: diabetes, ESRD >6 years, a 25 pack-year smoking history or current smoking, diagnosis of peripheral vascular disease, parathyroidectomy, or coronary artery disease intervention. Differences in evaluation and transplant outcomes between groups were compared with chi-squared analysis. Multivariate logistic regression identified predictive criteria for the presence of iliac arterial calcification. RESULTS Of 493 candidates evaluated, CTs were reviewed in 346 (70.2%). Iliac arterial calcification was identified in 119 screened candidates (34.4%). Of candidates with iliac arterial calcification identified on CT, 16 (13.4%) were excluded for CT findings, and 9 (7.6%) had their surgical management plan changed. Overall, 91 (76.5%) candidates with iliac arterial calcification on CT were approved, compared to 203 (89.4%) without calcification (P<0.001). The percentage of screened patients with iliac arterial calcification on CT increased with increasing age (P<0.0005). Age and diabetes mellitus were predictive of calcification. CONCLUSIONS Many kidney transplant candidates are at risk for iliac arterial calcification, although such calcification does not prevent transplantation for most candidates who have it. Algorithmic pre-operative screening has clinical value in determining transplant candidacy and potentially improving postoperative outcomes in patients requiring kidney transplantation.


Subject(s)
Iliac Artery/diagnostic imaging, Kidney Transplantation, Vascular Calcification/diagnostic imaging, Female, Humans, Male, Middle Aged, Retrospective Studies, Tomography, X-Ray Computed