3.
Transplant Direct; 7(3): e670, 2021 Mar.
Article in English | MEDLINE | ID: mdl-34104709

ABSTRACT

Explantation of the native viscera in multivisceral transplant candidates, particularly those with extensive portomesenteric thrombosis (PMT), carries considerable morbidity because of extensive vascularized adhesions. Preemptive visceral angioembolization has previously been described as a technique to minimize excessive blood loss during mobilization of the native viscera but is not well characterized specifically in patients with extensive PMT. METHODS: In a series of 5 patients who underwent multivisceral transplant for PMT from June 2015 to November 2018, we performed preoperative superior mesenteric, splenic, and hepatic artery embolization to reduce blood loss during explantation and evaluated blood loss and blood product utilization, as well as 30-day rates of infectious complications. RESULTS: Following preemptive embolization, median total blood loss was 6000 mL (range 800-7000 mL). Median transfusion requirements were as follows: 16 units of packed red blood cells (range 2-47), 14 units of fresh frozen plasma (range 0-29), 2 units of cryoprecipitate (range 1-14), 4 units of platelets (range 2-10), and 500 mL of cell saver autotransfusion (range 0-1817 mL). In the first 30 postoperative days, 2 of 5 patients developed positive blood cultures and 3 of 5 developed complex intra-abdominal infections. Two patients developed severe graft pancreatitis resulting in mycotic aneurysm of the aortic conduit; bleeding from the aneurysm led to the death of 1 patient. CONCLUSIONS: Preoperative embolization is an effective modality to mitigate exsanguinating blood loss during multivisceral transplant in patients with portomesenteric thrombosis; however, it remains unclear whether the resultant native organ ischemia during explantation carries clinically relevant consequences.

4.
Liver Transpl; 27(3): 425-433, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33188659

ABSTRACT

Liver grafts from pediatric donors represent a small fraction of grafts transplanted into adult recipients, and their use in adults requires special consideration of donor size to prevent perioperative complications. In the past, graft weight or volume ratios have been adopted from the living donor liver transplant literature to guide clinicians; however, these metrics are not routinely available to surgeons accepting deceased donor organs. In this study, we evaluated all pediatric-to-adult liver transplants in the United Network for Organ Sharing Standard Transplant Analysis and Research database from 1987 to 2019, stratified by donor age and donor-recipient height mismatch ratio (HMR; defined as donor height/recipient height). On multivariable regression controlling for cold ischemia time, age, and transplantation era, the use of donors aged 0 to 4 and 5 to 9 years was associated with increased risk of graft failure (hazard ratio [HR], 1.81 [P < 0.01] and HR, 1.16 [P < 0.01], respectively) compared with donors aged 15 to 17 years. On Kaplan-Meier survival analysis, an HMR < 0.8 was associated with inferior graft survival (mean, 11.8 versus 14.6 years; log-rank P < 0.001) and inferior patient survival (mean, 13.5 versus 14.9 years; log-rank P < 0.01) when compared with pairs of similar height (HMR, 0.95-1.05; ie, donors within 5% of recipient height). This study demonstrates that both young donor age and low HMR confer additional risk in adult recipients of pediatric liver grafts.
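Because the abstract states the HMR formula and thresholds explicitly (donor height divided by recipient height, with <0.8 flagged as high risk and 0.95-1.05 as height-matched), a small worked example may help. This is an illustrative sketch only, not the study's code; the function names and the example heights are ours.

```python
def height_mismatch_ratio(donor_height_cm: float, recipient_height_cm: float) -> float:
    """Donor-recipient height mismatch ratio (HMR) = donor height / recipient height."""
    return donor_height_cm / recipient_height_cm

def hmr_category(hmr: float) -> str:
    """Bucket an HMR using the thresholds reported in the abstract."""
    if hmr < 0.8:
        return "undersized donor (<0.8): associated with inferior graft and patient survival"
    if 0.95 <= hmr <= 1.05:
        return "height-matched (0.95-1.05): reference group"
    return "intermediate / oversized"

# Example: a 120 cm pediatric donor and a 170 cm adult recipient
hmr = height_mismatch_ratio(120, 170)
print(round(hmr, 2), "->", hmr_category(hmr))  # ~0.71, below the 0.8 threshold
```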


Subjects
Liver Transplantation, Tissue and Organ Procurement, Adolescent, Adult, Child, Graft Survival, Humans, Kaplan-Meier Estimate, Liver Transplantation/adverse effects, Living Donors, Retrospective Studies, Tissue Donors, Transplant Recipients, Treatment Outcome
5.
Am J Transplant; 20(3): 752-760, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31553125

ABSTRACT

This study aimed to understand the relationship of preoperative measurements and risk factors to operative time and outcomes of laparoscopic donor nephrectomy. Two hundred forty-two kidney donors between 2010 and 2017 were identified. Patients' demographic, anthropometric, and operative characteristics were abstracted from the electronic medical record. Glomerular filtration rates (GFR) were documented before surgery, within 24 hours after surgery, and at 6, 12, and 24 months after surgery. Standard radiological measures, kidney volumes, and subcutaneous and perinephric fat thicknesses were assessed by three radiologists. Data were analyzed using standard statistical measures. Cranio-caudal and latero-lateral diameters correlated significantly with kidney volume (P < .0001). The left kidney was transplanted in 92.6% of cases and the larger kidney in 69.2%. Kidney choice (smaller vs. larger) had no statistically significant impact on the rate of change of donor kidney function over time after adjusting for age, sex, and race (P = .61). Perinephric fat thickness (+4.08 minutes) and surgery after 2011 were significantly associated with longer operative time (P ≤ .01). In conclusion, cranio-caudal diameter can be used as a surrogate measure for volume in the majority of donors. Size may not be a decisive factor for long-term donor kidney function. Perinephric fat around the donor kidney should be reported to facilitate operative planning.
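The central measurement claim here is a correlation between linear kidney diameter and kidney volume. A minimal sketch of that kind of analysis is below; it is not the authors' code, and the measurement arrays are made-up placeholders, not study data.

```python
# Pearson correlation between cranio-caudal diameter and kidney volume (illustrative only).
import numpy as np
from scipy import stats

cranio_caudal_cm = np.array([10.2, 11.0, 9.8, 12.1, 10.7])    # hypothetical measurements
kidney_volume_ml = np.array([148.0, 171.0, 139.0, 198.0, 160.0])

r, p = stats.pearsonr(cranio_caudal_cm, kidney_volume_ml)
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```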


Subjects
Kidney Transplantation, Laparoscopy, Glomerular Filtration Rate, Humans, Kidney/diagnostic imaging, Kidney/surgery, Living Donors, Nephrectomy, Retrospective Studies, Tissue and Organ Harvesting
6.
Front Pediatr; 7: 102, 2019.
Article in English | MEDLINE | ID: mdl-30972314

ABSTRACT

Background: Currently, there is no standardized approach for determining psychosocial readiness in pediatric transplantation. We examined the utility of the Psychosocial Assessment of Candidates for Transplantation (PACT) to identify pediatric kidney transplant recipients at risk for adverse clinical outcomes. Methods: Kidney transplant patients younger than 21 years transplanted at Duke University Medical Center between 2005 and 2015 underwent psychosocial assessment by a social worker with either PACT or an unstructured interview, which was used to determine transplant candidacy. PACT rated candidates on a scale of 0 (poor candidate) to 4 (excellent candidate) in the areas of social support, psychological health, lifestyle factors, and understanding. Demographics and clinical outcomes were analyzed by presence or absence of PACT and further characterized by high (≥3) and low (≤2) scores. Results: Of 54 pediatric patients, 25 (46.3%) underwent pre-transplant evaluation with PACT, while 29 (53.7%) were not evaluated with PACT. Patients assessed with PACT had a significantly lower rate of acute rejection (16.0% vs. 55.2%, p = 0.007). After adjusting for HLA mismatch, the presence of a pre-transplant PACT score remained associated with lower odds of acute rejection (odds ratio 0.119, 95% confidence interval 0.027-0.52, p = 0.005). In the PACT subsection analysis, lack of family availability (OR 0.08, 95% CI 0.01-0.97, p = 0.047) and risk for psychopathology (OR 0.34, 95% CI 0.13-0.87, p = 0.025) were associated with a low PACT score and post-transplant non-adherence. Conclusions: Our study highlights the importance of standardized psychosocial assessments and the potential use of PACT in risk stratifying pre-transplant candidates.
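The adjusted result above is an odds ratio for acute rejection controlling for HLA mismatch. A minimal sketch of that kind of logistic regression follows; it is an assumed workflow rather than the authors' code, and the DataFrame holds synthetic placeholder data with illustrative column names.

```python
# Logistic regression of acute rejection on PACT assessment, adjusting for HLA mismatch.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "acute_rejection": rng.integers(0, 2, 54),  # 1 = rejection episode
    "pact_assessed": rng.integers(0, 2, 54),    # 1 = evaluated with PACT pre-transplant
    "hla_mismatch": rng.integers(0, 7, 54),     # 0-6 antigen mismatches
})

model = smf.logit("acute_rejection ~ pact_assessed + hla_mismatch", data=df).fit(disp=False)
print(np.exp(model.params))  # exponentiated coefficients = odds ratios for each term
```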

7.
Am J Transplant; 19(3): 781-789, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30171800

ABSTRACT

Delayed graft function (DGF) is a risk factor for acute rejection (AR) in renal transplant recipients, and KDIGO guidelines suggest the use of lymphocyte-depleting induction when DGF is anticipated. We analyzed the United Network for Organ Sharing/Organ Procurement and Transplantation Network (UNOS/OPTN) database to assess the impact of induction immunosuppression on the risk of AR in deceased donor kidney recipients based on the pretransplant risk of DGF using a validated model. Recipients were categorized into 4 groups based on the induction immunosuppression: (1) rabbit anti-thymocyte globulin (rATG); (2) alemtuzumab (C1H); (3) IL2-receptor antagonists (IL2-RA; basiliximab or daclizumab); and (4) no antibody induction. The primary endpoint was a composite of treated AR or graft failure by 1 year posttransplantation. Compared with no antibody induction, rATG and C1H had consistently lower adjusted odds of the composite endpoint across all DGF risk strata, whereas IL2-RA was associated with increased adjusted odds of the composite endpoint with increasing DGF risk. When the induction agents were compared with one another, rATG and C1H were associated with decreasing adjusted odds of the composite endpoint with increasing risk of DGF, especially at the higher end of the DGF risk spectrum. Consideration should be given to lymphocyte-depleting induction when the anticipated risk of DGF is increased.


Subjects
Delayed Graft Function/etiology, Graft Rejection/etiology, Immunosuppression Therapy, Kidney Failure, Chronic/surgery, Kidney Transplantation/adverse effects, Lymphocyte Depletion/adverse effects, Postoperative Complications, Adolescent, Adult, Aged, Delayed Graft Function/pathology, Female, Follow-Up Studies, Glomerular Filtration Rate, Graft Rejection/pathology, Graft Survival, Humans, Kidney Failure, Chronic/immunology, Kidney Function Tests, Male, Middle Aged, Prognosis, Risk Factors, Transplant Recipients, Young Adult
9.
Transplant Direct; 4(5): e344, 2018 May.
Article in English | MEDLINE | ID: mdl-29796415

ABSTRACT

An accessory gallbladder in a donor liver allograft is an uncommon anatomical finding that can complicate liver transplantation if unrecognized. This case describes a patient who underwent liver transplantation with a donor graft containing an accessory gallbladder that was obscured during transplantation; as a result, the patient experienced a prolonged postoperative course complicated by multiple readmissions for suspected biloma and intra-abdominal infection. The diagnosis of accessory gallbladder was not made until operative exploration several months after the initial transplant. Removal of the accessory gallbladder led to resolution of the clinical problems.

11.
Ann Surg; 267(6): 1169-1172, 2018 Jun.
Article in English | MEDLINE | ID: mdl-28650358

ABSTRACT

OBJECTIVE: The aim of this study was to investigate the volume-outcome relationship in kidney transplantation by examining graft and patient outcomes using standardized risk adjustment (observed-to-expected outcomes). A secondary objective was to examine the geographic proximity of low-, medium-, and high-volume kidney transplant centers in the United States. SUMMARY OF BACKGROUND DATA: The significant survival benefit of kidney transplantation in the context of a severe shortage of donor organs mandates strategies to optimize outcomes. Unlike for other solid organ transplants, the relationship between surgical volume and kidney transplant outcomes has not been clearly established. METHODS: The Scientific Registry of Transplant Recipients was used to examine national outcomes for adults undergoing deceased donor kidney transplantation from January 1, 1999 to December 31, 2013 (15-year study period). Observed-to-expected rates of graft loss and patient death were compared for low-, medium-, and high-volume centers. The geographic proximity of low-volume centers to higher volume centers was determined to assess the impact of regionalization on patient travel burden. RESULTS: A total of 206,179 procedures were analyzed. Compared with low-volume centers, high-volume centers had significantly lower observed-to-expected rates of 1-month graft loss (0.93 vs 1.18, P<0.001), 1-year graft loss (0.97 vs 1.12, P<0.001), 1-month patient death (0.90 vs 1.29, P=0.005), and 1-year patient death (0.95 vs 1.15, P=0.001). Low-volume centers were frequently in close proximity to higher volume centers, with a median distance of 7 miles (interquartile range: 2 to 75). CONCLUSIONS: A robust volume-outcome relationship was observed for deceased donor kidney transplantation, and low-volume centers are frequently in close proximity to higher volume centers. Increased regionalization could improve outcomes, but should be considered carefully in light of the potential negative impact on transplant volume and access to care.
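The risk-adjusted metric reported above is an observed-to-expected (O/E) ratio. A minimal sketch of that ratio is shown below; in the study the expected count comes from a registry risk-adjustment model, whereas here it is simply a placeholder input, and the example numbers are illustrative.

```python
def observed_to_expected(observed_events: int, expected_events: float) -> float:
    """O/E ratio: >1 means more events (e.g., graft losses) than predicted; <1 means fewer."""
    return observed_events / expected_events

# Example: a center with 14 observed 1-year graft losses against 15.1 expected
print(round(observed_to_expected(14, 15.1), 2))  # 0.93
```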


Subjects
Kidney Failure, Chronic/surgery, Kidney Transplantation/methods, Kidney Transplantation/statistics & numerical data, Patient Outcome Assessment, Tissue Donors, Death, Graft Survival, Health Services Accessibility, Hospital Planning, Humans, Kidney Failure, Chronic/mortality, Kidney Transplantation/mortality, Tissue Donors/supply & distribution, United States/epidemiology
12.
Cureus; 8(11): e887, 2016 Nov 22.
Article in English | MEDLINE | ID: mdl-28018757

ABSTRACT

There has been increasing concern in the kidney transplant community about the declining use of expanded criteria donors (ECD) despite improvements in survival and quality of life. The recent introduction of the Kidney Donor Profile Index (KDPI), which provides a more granular characterization of donor quality, was expected to increase the utilization of marginal kidneys and decrease discard rates. However, national trends and practice patterns of ECD kidney utilization by donor organ quality as measured by KDPI are not well known. We therefore performed a trend analysis of all ECD recipients in the United Network for Organ Sharing (UNOS) registry between 2002 and 2012, after calculating the corresponding KDPI, to characterize trends in usage and outcomes based on the KDPI characterization. High-risk recipient characteristics (diabetes, body mass index ≥30 kg/m2, hypertension, and age ≥60 years) increased over the study period (trend test p<0.001 for all). The proportion of ECD transplants increased from 18% in 2003 to a peak of 20.4% in 2008 and declined thereafter to 17.3% in 2012. Using the KDPI >85% definition, the proportion increased from 9.4% in 2003 to a peak of 12.1% in 2008 and declined to 9.7% in 2012. Overall, although utilization of kidneys with KDPI >85% changed significantly over time (p<0.001), recent years have seen a decline in usage, probably related to regulations imposed by the Centers for Medicare & Medicaid Services (CMS). When comparing the hazards of graft failure by KDPI, ECD kidneys with KDPI >85% had a slightly lower risk of graft failure than standard criteria donor (SCD) kidneys with KDPI >85% (hazard ratio [HR] 0.95, confidence interval [CI] 0.94-0.96, p<0.001). This indicates that some SCD kidneys may actually be of lower estimated quality, with a higher Kidney Donor Risk Index (KDRI), than some ECD kidneys. The incidence of delayed graft function (DGF) in ECD recipients decreased significantly over time, from 35.2% in 2003 to 29.6% in 2011 (p=0.007), probably reflecting better understanding of the donor risk profile along with increased use of hypothermic machine perfusion and pretransplant biopsy to aid optimal allograft selection. The recent decline in transplantation of KDPI >85% kidneys probably reflects risk-averse transplant center behavior. Whether discard of discordant SCD kidneys with KDPI >85% has contributed to this decline remains to be studied.
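The graft-failure comparison above is expressed as a hazard ratio within the KDPI >85% stratum. Below is a minimal sketch of how such an HR could be estimated with a Cox proportional hazards model; it is not the study's code, it assumes the lifelines library is available, and the DataFrame is synthetic placeholder data.

```python
# Cox model for graft failure, ECD vs. SCD within a KDPI >85% cohort (illustrative only).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "years_to_event": rng.exponential(8.0, n),  # follow-up time until failure or censoring
    "graft_failed": rng.integers(0, 2, n),      # 1 = graft failure observed
    "ecd": rng.integers(0, 2, n),               # 1 = expanded criteria donor
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="graft_failed")
print(cph.hazard_ratios_)  # HR for 'ecd'; an HR near 0.95 would mirror the reported result
```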

13.
Cureus; 8(11): e889, 2016 Nov 22.
Article in English | MEDLINE | ID: mdl-28018759

ABSTRACT

INTRODUCTION AND BACKGROUND: Gastrointestinal (GI) recovery after major abdominal surgery can be delayed by an ongoing need for narcotic analgesia, thereby prolonging hospitalization. Enhanced recovery after surgery (ERAS) is a multimodal perioperative care pathway designed to facilitate early recovery after major surgery by maintaining preoperative body composition and physiological organ function and by modifying the stress response induced by surgical exposure. Enhanced recovery programs (ERPs) in colorectal surgery have decreased the duration of postoperative ileus and the hospital stay while showing equivalent morbidity, mortality, and readmission rates compared with the traditional standard of care. This study is a pilot trial to evaluate the benefits of ERAS protocols in living kidney donors undergoing laparoscopic nephrectomy. METHODS: This is a single-center, non-randomized, retrospective analysis comparing the outcomes of the first 40 live kidney donors who underwent laparoscopic nephrectomy under the ERAS protocol with those of 40 donors operated on before ERAS under the traditional standard of care. Our ERAS protocol includes a reduced duration of fasting with preoperative carbohydrate loading, intraoperative fluid restriction to 3 ml/kg/hr, a target urine output of 0.5 ml/kg/hr, subfascial injection of Exparel (bupivacaine liposome suspension), and a postoperative narcotic-free pain regimen with acetaminophen, ketorolac, or tramadol. Short-term patient outcomes were compared using Pearson's chi-squared test for categorical variables and the Kruskal-Wallis test for continuous variables. Additionally, a multivariate analysis was conducted to evaluate factors influencing length of stay and likelihood of readmission. RESULTS: The ERAS protocol reduced the postoperative median length of stay from 2.0 to 1.0 days (p=0.001). Overall pain scores were significantly lower in the ERAS group (peak pain score 6.0 vs. 8.0, p<0.001; morning-after-surgery pain score 3.0 vs. 7.0, p=0.001; lowest pain score 0.0 vs. 2.0, p=0.016) despite the absence of postoperative narcotics. The average duration of surgery was shorter in the ERAS group (248 vs. 304 minutes, p<0.001). The average amount of intraoperative fluid was significantly lower in the ERAS group (2500 ml vs. 3525 ml, p<0.001) without affecting donor renal function. The incidence of delayed graft function was similar in the two groups (p=0.541). A trend toward lower readmission was noted with the ERAS protocol (12.8% vs. 27.5%, p=0.105). GI dysfunction was the most common reason for readmission. CONCLUSION: Application of an ERAS protocol to laparoscopic living donor nephrectomy was associated with a reduced length of hospitalization and improved pain scores, likely related to intraoperative use of subfascial Exparel and a shorter duration of ileus. Restricted use of intraoperative fluids prevents excessive third spacing and bowel edema, enhancing gut recovery without adversely affecting recipient graft function. This study suggests that ERAS has the potential to enhance the advantages of laparoscopic surgery for live kidney donation by optimizing donor outcomes and perioperative patient satisfaction.
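The methods name two specific tests: Pearson's chi-squared for categorical outcomes and Kruskal-Wallis for continuous ones. The sketch below shows how such comparisons could be run between an ERAS and a control group; the counts and length-of-stay values are made-up placeholders, not study data.

```python
from scipy import stats

# Readmission counts per group: [readmitted, not readmitted] (hypothetical numbers)
chi2, p_readmit, _, _ = stats.chi2_contingency([[5, 34], [11, 29]])

# Postoperative length of stay in days per group (hypothetical numbers)
eras_los = [1, 1, 2, 1, 1, 2]
control_los = [2, 2, 3, 2, 1, 3]
h, p_los = stats.kruskal(eras_los, control_los)

print(f"readmission: p={p_readmit:.3f}; length of stay: p={p_los:.3f}")
```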

14.
J Comput Assist Tomogr; 39(4): 506-9, 2015.
Article in English | MEDLINE | ID: mdl-25853775

ABSTRACT

Torsion of an allograft kidney is an extremely rare and potentially reversible complication. Imaging diagnosis plays a crucial role because of the absence of specific clinical features. We report 2 cases in which kidney torsion after simultaneous kidney-pancreas transplant was diagnosed by ferumoxytol-enhanced magnetic resonance imaging/angiography and present a review of the relevant literature. Radiologists and clinicians should be aware of this entity because graft salvage depends on rapid diagnosis and surgical detorsion.


Subjects
Kidney Diseases/diagnosis, Kidney Diseases/etiology, Kidney Transplantation/adverse effects, Magnetic Resonance Imaging, Pancreas Transplantation/adverse effects, Torsion Abnormality/diagnosis, Torsion Abnormality/etiology, Adult, Humans, Kidney/pathology, Kidney/surgery, Kidney Diseases/surgery, Male, Torsion Abnormality/surgery
15.
Transplantation; 99(2): 309-15, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25594554

ABSTRACT

BACKGROUND: Previous studies demonstrate that graft survival from older living kidney donors (LD; age >60 years) is worse than that from younger LDs but similar to that from deceased standard criteria donors (SCD). Limited sample size has precluded more detailed analyses of transplants from older LDs. METHODS: Using the United Network for Organ Sharing database from 1994 to 2012, recipients were categorized by donor status: SCD, expanded criteria donor (ECD), or LD (by donor age: <60, 60-64, 65-69, ≥70 years). Adjusted models controlling for donor and recipient risk factors evaluated graft and recipient survival. RESULTS: Of 250,827 kidney transplants during the study period, 92,646 used LD kidneys, and 4.5% of these recipients (n=4,186) received older LD kidneys. The use of LDs aged 60 years or older increased significantly, from 3.6% in 1994 to 7.4% in 2011. Recipients of older LD kidneys had significantly lower graft and overall survival than recipients of younger LD kidneys. Compared with SCD recipients, graft survival was decreased in recipients of kidneys from LDs aged 70 years or older, but overall survival was similar. Older LD kidney recipients had better graft and overall survival than ECD recipients. CONCLUSIONS: Overall survival among recipients of kidneys from older living donors was similar to or better than that of SCD recipients, better than that of ECD recipients, but worse than that of younger LD recipients. With increasing kidney donation from older adults to alleviate the profound organ shortage, the use of older kidney donors appears to be an equivalent or beneficial alternative to awaiting deceased donor kidneys.
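The survival comparisons above are the kind typically made with Kaplan-Meier curves and a log-rank test between donor categories. A minimal sketch follows; it assumes the lifelines library is available, uses synthetic placeholder data rather than registry data, and the group labels are illustrative.

```python
# Kaplan-Meier and log-rank comparison of graft survival, older vs. younger LD kidneys.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_old = rng.exponential(10.0, 100)    # follow-up years, older LD group (placeholder)
t_young = rng.exponential(13.0, 100)  # follow-up years, younger LD group (placeholder)
e_old = rng.integers(0, 2, 100)       # 1 = graft loss observed, 0 = censored
e_young = rng.integers(0, 2, 100)

km = KaplanMeierFitter()
km.fit(t_old, event_observed=e_old, label="older LD")
print(km.median_survival_time_)

result = logrank_test(t_old, t_young, event_observed_A=e_old, event_observed_B=e_young)
print(result.p_value)
```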


Subjects
Donor Selection, Kidney Transplantation/methods, Living Donors/supply & distribution, Transplant Recipients, Adult, Age Factors, Aged, Databases, Factual, Female, Graft Survival, Humans, Kidney Transplantation/adverse effects, Kidney Transplantation/mortality, Male, Middle Aged, Risk Factors, Time Factors, Treatment Outcome, United States, Young Adult