1.
Cell Metab ; 36(6): 1302-1319.e12, 2024 Jun 04.
Article En | MEDLINE | ID: mdl-38838642

Glucagon-like peptide-1 receptor (GLP-1R) is a key regulator of glucose metabolism known to be expressed by pancreatic β cells. We herein investigated the role of GLP-1R on T lymphocytes during immune response. Our data showed that a subset of T lymphocytes expresses GLP-1R, which is upregulated during alloimmune response, similarly to PD-1. When mice received islet or cardiac allotransplantation, an expansion of GLP-1Rpos T cells occurred in the spleen and was found to infiltrate the graft. Additional single-cell RNA sequencing (scRNA-seq) analysis conducted on GLP-1Rpos and GLP-1Rneg CD3+ T cells unveiled molecular and functional dissimilarities between the two subpopulations, as the GLP-1Rpos subset is mainly composed of exhausted CD8 T cells. GLP-1R acts as a T cell-negative costimulatory molecule, and GLP-1R signaling prolongs allograft survival, mitigates alloimmune response, and reduces T lymphocyte graft infiltration. Notably, GLP-1R antagonism triggered anti-tumor immunity when tested in a preclinical mouse model of colorectal cancer.


Glucagon-Like Peptide-1 Receptor; Islets of Langerhans Transplantation; Mice, Inbred C57BL; Animals; Glucagon-Like Peptide-1 Receptor/metabolism; Mice; T-Lymphocytes/immunology; T-Lymphocytes/metabolism; Male; Heart Transplantation; Mice, Inbred BALB C; CD8-Positive T-Lymphocytes/metabolism; CD8-Positive T-Lymphocytes/immunology; Graft Survival/immunology
2.
Microsurgery ; 44(5): e31200, 2024 Jul.
Article En | MEDLINE | ID: mdl-38828556

BACKGROUND: Vascularized free tissue transfer has been established as an effective method in the reconstruction of mandibular defects. However, its efficacy in pediatric patients remains poorly understood because such cases present infrequently. The aim of this study is to systematically consolidate the survival and infection rates of free flaps in pediatric mandibular reconstruction. METHODS: A systematic literature search was conducted on Ovid Medline, Embase, and Cochrane Library for studies published up to January 2024. We included peer-reviewed studies reporting on survival and infection outcomes associated with free flap mandibular reconstruction in pediatric patients (<18 years). We performed a random-effects meta-analysis with the inverse-variance weighted approach to estimate survival and infection rates. Heterogeneity was assessed by I², and publication bias was examined using Egger's test. RESULTS: A total of 26 studies, reporting on 463 free flaps and 439 pediatric patients with a mean age of 10.7 years, were included in our study. Most free flaps originated from the fibula (n = 392/463, 84.7%) and benign tumors were the most common cause for mandibular reconstruction (n = 179/463, 38.7%). The pooled estimate for survival of flaps was 96% (95% CI: 93-97, I² = 0%), and recipient-site infections were estimated to occur in 9% (95% CI: 6-13, I² = 0%) of cases. The most common reported complications within the study timeframe were early malocclusion (n = 28/123, 21.4%) and bite abnormalities (n = 18/131, 13.7%). CONCLUSION: Free tissue transfer for mandibular reconstruction in pediatric patients is effective and safe. Further research is required to explore functionality following mandibular reconstruction in diverse pediatric populations.
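
The inverse-variance random-effects pooling described in this abstract can be sketched in a few lines of Python using the DerSimonian-Laird estimator; the inputs below are hypothetical effect sizes (e.g. logit-transformed survival proportions), not data from the review:

```python
import math

def random_effects_pool(estimates, variances):
    """DerSimonian-Laird random-effects pooling with inverse-variance weights."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sw
    # Cochran's Q measures observed between-study dispersion
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # heterogeneity, %
    # re-weight by total (within + between) variance and pool
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2, i2
```

With I² = 0%, as reported for both pooled outcomes here, tau² estimates to zero and the random-effects result coincides with the fixed-effect inverse-variance pool.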


Free Tissue Flaps; Mandibular Reconstruction; Humans; Free Tissue Flaps/transplantation; Mandibular Reconstruction/methods; Child; Graft Survival; Surgical Wound Infection/epidemiology; Surgical Wound Infection/etiology
3.
Front Immunol ; 15: 1371554, 2024.
Article En | MEDLINE | ID: mdl-38846942

Allograft rejection is a critical issue following solid organ transplantation (SOT). Immunosuppressive therapies are crucial in reducing the risk of rejection yet are accompanied by several significant side effects, including infection, malignancy, cardiovascular diseases, and nephrotoxicity. Effective strategies to minimize these side effects remain an unmet medical need. Extracorporeal photopheresis (ECP) has shown potential as an immunosuppression (IS)-modifying technique in several SOT types, with improvements seen in acute and recurrent rejection, allograft survival, and associated side effects, and could fulfil this unmet need. Through a review of the available literature detailing key areas in which ECP may benefit patients, this review highlights the IS-modifying potential of ECP in the four most common SOT procedures (heart, lung, kidney, and liver transplantation) and identifies existing gaps in the data. Current evidence supports the use of ECP for IS modification following SOT; however, there is a need for further high-quality research, in particular randomized controlled trials, in this area.


Graft Rejection; Immunosuppression Therapy; Organ Transplantation; Photopheresis; Photopheresis/methods; Humans; Organ Transplantation/adverse effects; Organ Transplantation/methods; Graft Rejection/prevention & control; Graft Rejection/immunology; Immunosuppression Therapy/methods; Immunosuppressive Agents/therapeutic use; Immunosuppressive Agents/adverse effects; Graft Survival
4.
PLoS One ; 19(6): e0301425, 2024.
Article En | MEDLINE | ID: mdl-38843258

BACKGROUND: The influence of center volume on kidney transplant outcomes is a topic of ongoing debate. In this study, we employed competing risk analyses to accurately estimate the marginal probability of graft failure in the presence of competing events, such as mortality from other causes, over long-term follow-up. The incorporation of immunosuppression protocols and extended follow-up offers additional insights. Our emphasis on long-term follow-up aligns with biological considerations where competing risks play a significant role. METHODS: We examined data from 219,878 adult kidney-only transplantations across 256 U.S. transplant centers (January 2001-December 2015) sourced from the Organ Procurement and Transplantation Network registry. Centers were classified into quartiles by annual volume: low (Q1 = 28), medium (Q2 = 75), medium-high (Q3 = 121), and high (Q4 = 195). Our study investigated the relationship between center volume and 5-year outcomes, focusing on graft failure and mortality. Sub-population analyses included deceased donors, living donors, diabetic recipients, those with kidney donor profile index >85%, and re-transplants from deceased donors. RESULTS: Adjusted cause-specific hazard ratios (aCHR) for five-year graft failure and patient death were examined by center volume, with low-volume centers as the reference standard (aCHR: 1.0). In deceased donors, medium-high and high-volume centers showed significantly lower cause-specific hazard ratios for graft failure (medium-high aCHR = 0.892, p<0.001; high aCHR = 0.953, p = 0.149) and patient death (medium-high aCHR = 0.828, p<0.001; high aCHR = 0.898, p = 0.003). Among living donors, no significant differences were found for graft failure, while a trend towards lower cause-specific hazard ratios for patient death was observed in medium-high (aCHR = 0.895, p = 0.107) and high-volume centers (aCHR = 0.88, p = 0.061).
CONCLUSION: Higher center volume is associated with significantly lower cause-specific hazard ratios for graft failure and patient death in deceased donors, while a trend towards reduced cause-specific hazard ratios for patient death is observed in living donors.
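
The competing-risk machinery described above estimates the marginal probability of graft failure while treating death from other causes as a competing event rather than censoring it. A minimal Aalen-Johansen cumulative incidence sketch in Python (event code 0 = censored, other integer codes = distinct event causes; the data are hypothetical):

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence for one cause in the presence
    of competing events (event code 0 = censored)."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0   # overall event-free survival just before each time point
    cif = 0.0
    out = []
    i = 0
    at_risk = n
    while i < n:
        t = data[i][0]
        d_cause = d_all = removed = 0
        while i < n and data[i][0] == t:   # group tied times
            if data[i][1] == cause:
                d_cause += 1
            if data[i][1] != 0:
                d_all += 1
            removed += 1
            i += 1
        cif += surv * d_cause / at_risk    # mass assigned to this cause at t
        surv *= 1.0 - d_all / at_risk      # update overall event-free survival
        at_risk -= removed
        out.append((t, cif))
    return out
```

Unlike one-minus-Kaplan-Meier computed per cause, the cause-specific incidences obtained this way never sum to more than 1.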


Kidney Transplantation; Transplant Recipients; Humans; Kidney Transplantation/mortality; Male; Female; Adult; Middle Aged; Transplant Recipients/statistics & numerical data; Graft Survival; Registries; Treatment Outcome; Graft Rejection; United States; Aged
5.
Pediatr Transplant ; 28(5): e14801, 2024 Aug.
Article En | MEDLINE | ID: mdl-38845603

BACKGROUND: Approximately 2500 pediatric patients are awaiting kidney transplantation in the United States, with <5% weighing ≤15 kg. Transplant in this cohort is frequently delayed by center-based growth parameters, often necessitating transplantation after the initiation of dialysis. Furthermore, prognostication remains somewhat ambiguous. In this report, we scrutinize the Organ Procurement and Transplantation Network (OPTN) data from 2001 to 2021 to help better understand specific variables impacting graft and patient outcomes in these children. METHODS: The OPTN kidney transplant dataset from 2001 to 2021 was analyzed. Inclusion criteria included age <18 years, weight ≤15 kg, and recipient of primary living donor kidney transplantation (LDKT) or deceased donor kidney transplantation (DDKT). Patient and graft survival probabilities were calculated using the Kaplan-Meier method. The Cox proportional hazards model was used to calculate hazard ratios (HR) and identify variables significantly associated with patient and graft survival. RESULTS: A total of 2168 pediatric transplant recipients met the inclusion criteria. Patient survival at 1 and 3 years was 98% and 97%, respectively. Graft survival at 1 and 3 years was 95% and 92%, respectively. Dialysis was the sole significant variable impacting both patient and graft survival. Graft survival was further impacted by transplant era, recipient gender and ethnicity, and donor type. Infants transplanted at age 1 had better graft survival compared with older children, and nephrotic syndrome was likewise associated with a better prognosis. CONCLUSION: Pediatric kidney transplantation is highly successful. The balance between preemptive transplantation, medical optimization, and satisfactory technical parameters seems to suggest a "Goldilocks zone" for many children, favoring transplantation between 1 and 2 years of age.
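
The Kaplan-Meier survival probabilities and the Cox model referenced in this abstract both start from the same (time, event) data, and the product-limit estimator itself is a short computation. A sketch in Python with hypothetical follow-up times (observed=False marks censoring):

```python
def kaplan_meier(times, observed):
    """Product-limit (Kaplan-Meier) survival estimate.

    times: follow-up times; observed: True for an event, False for censoring.
    Returns [(event_time, survival_probability), ...].
    """
    data = sorted(zip(times, observed))
    n = len(data)
    s = 1.0
    steps = []
    i = 0
    at_risk = n
    while i < n:
        t = data[i][0]
        deaths = removed = 0
        while i < n and data[i][0] == t:   # group tied times
            deaths += data[i][1]           # True counts as 1
            removed += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk    # multiply in this time point's factor
            steps.append((t, s))
        at_risk -= removed                 # censored and failed subjects leave
    return steps
```

Censored subjects reduce the risk set without contributing a drop in the curve, which is what distinguishes this estimate from a naive proportion surviving.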


Databases, Factual; Graft Survival; Kidney Transplantation; Tissue and Organ Procurement; Humans; Child; Female; Male; Tissue and Organ Procurement/methods; Child, Preschool; Adolescent; Prognosis; Infant; United States/epidemiology; Kidney Failure, Chronic/surgery; Body Weight; Kaplan-Meier Estimate; Treatment Outcome; Retrospective Studies; Proportional Hazards Models; Infant, Newborn
6.
BJS Open ; 8(3)2024 May 08.
Article En | MEDLINE | ID: mdl-38837956

BACKGROUND: For individuals with advanced liver disease, equipoise in outcomes between live donor liver transplant (LDLT) and deceased donor liver transplant (DDLT) is uncertain. METHODS: A retrospective cohort study was performed using data extracted from the Scientific Registry of Transplant Recipients. Adults who underwent first-time DDLT or LDLT in the United States between 2002 and 2020 were paired using 1:10 propensity-score matching without replacement. Patient and graft survival were compared using the model for end-stage liver disease (MELD) score for stratification. RESULTS: After propensity-score matching, 31 522 DDLT and 3854 LDLT recipients were included. For recipients with MELD scores ≤15, LDLT was associated with superior patient survival (HR = 0.92; 95% c.i. 0.76 to 0.96; P = 0.013). No significant differences in patient survival were observed for MELD scores between 16 and 30. Conversely, for patients with MELD scores >30, LDLT was associated with higher mortality (HR 2.57; 95% c.i. 1.35 to 4.62; P = 0.003). Graft survival was comparable between the two groups for MELD ≤15 and for MELD between 21 and 30. However, for MELD between 16 and 20 (HR = 1.15; 95% c.i. 1.00 to 1.33; P = 0.04) and MELD > 30 (HR = 2.85; 95% c.i. 1.65 to 4.91; P = 0.001), graft survival was considerably shorter after LDLT. Regardless of MELD scores, the re-transplantation rate within the first year was significantly higher after LDLT. CONCLUSIONS: In this large propensity score-matched study using national data, comparable patient survival was found between LDLT and DDLT in recipients with MELD scores between 16 and 30. Conversely, for patients with MELD > 30, LDLT was associated with worse outcomes. These findings underscore the importance of transplant selection for patients with high MELD scores.
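
Propensity-score matching without replacement, as used in this study, is often implemented as greedy nearest-neighbour matching: each treated subject takes the closest unused controls up to the desired ratio. A simplified sketch (the 0.05 caliper is an illustrative choice, not a detail from the paper):

```python
def greedy_match(treated, controls, ratio, caliper=0.05):
    """Greedy 1:ratio nearest-neighbour matching on propensity scores,
    without replacement: each control is used at most once."""
    used = set()
    matches = {}
    for i in range(len(treated)):
        picked = []
        # candidate controls, closest first on the propensity score
        for j in sorted(range(len(controls)), key=lambda j: abs(controls[j] - treated[i])):
            if j in used:
                continue
            if abs(controls[j] - treated[i]) > caliper:
                break  # remaining candidates are even farther away
            picked.append(j)
            used.add(j)
            if len(picked) == ratio:
                break
        matches[i] = picked
    return matches
```

Because matching is without replacement, controls consumed by earlier treated subjects are unavailable to later ones, which is why match order can matter in practice.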


Graft Survival; Liver Transplantation; Living Donors; Propensity Score; Humans; Liver Transplantation/mortality; Male; Female; Retrospective Studies; Middle Aged; Adult; End Stage Liver Disease/surgery; End Stage Liver Disease/mortality; United States/epidemiology; Registries
7.
Nat Commun ; 15(1): 4309, 2024 Jun 03.
Article En | MEDLINE | ID: mdl-38830846

The efficacy of costimulation blockade with CTLA4-Ig (belatacept) in transplantation is limited due to T cell-mediated rejection, which also persists after induction with anti-thymocyte globulin (ATG). Here, we investigate why ATG fails to prevent costimulation blockade-resistant rejection and how this barrier can be overcome. ATG did not prevent graft rejection in a murine heart transplant model of CTLA4-Ig therapy and induced a pro-inflammatory cytokine environment. While ATG improved the balance between regulatory T cells (Treg) and effector T cells in the spleen, it had no such effect within cardiac allografts. Neutralizing IL-6 alleviated graft inflammation, increased intragraft Treg frequencies, and enhanced intragraft IL-10 and Th2-cytokine expression. IL-6 blockade together with ATG allowed CTLA4-Ig therapy to achieve long-term, rejection-free heart allograft survival. This beneficial effect was abolished upon Treg depletion. Combining ATG with IL-6 blockade prevents costimulation blockade-resistant rejection, thereby eliminating a major impediment to clinical use of costimulation blockers in transplantation.


Abatacept; Antilymphocyte Serum; Graft Rejection; Graft Survival; Heart Transplantation; Interleukin-6; Mice, Inbred C57BL; T-Lymphocytes, Regulatory; Animals; Graft Rejection/immunology; Graft Rejection/prevention & control; Interleukin-6/metabolism; Heart Transplantation/adverse effects; Mice; T-Lymphocytes, Regulatory/immunology; T-Lymphocytes, Regulatory/drug effects; Abatacept/pharmacology; Abatacept/therapeutic use; Antilymphocyte Serum/pharmacology; Antilymphocyte Serum/therapeutic use; Graft Survival/drug effects; Graft Survival/immunology; Mice, Inbred BALB C; Allografts/immunology; Male; Immunosuppressive Agents/pharmacology; Lymphocyte Depletion; Interleukin-10/metabolism; Interleukin-10/immunology
8.
Transpl Int ; 37: 12864, 2024.
Article En | MEDLINE | ID: mdl-38832357

Simultaneous pancreas-kidney (SPK) transplantation improves quality of life and limits progression of diabetic complications. There is reluctance to accept pancreata from donors with abnormal blood tests, due to concern of inferior outcomes. We investigated whether donor amylase and liver blood tests (markers of visceral ischaemic injury) predict pancreas graft outcome using the UK Transplant Registry (2016-2021). In total, 857 SPK recipients were included (619 following brainstem death, 238 following circulatory death). Peak donor amylase ranged from 8 to 3300 U/L (median = 70), and this had no impact on pancreas graft survival when adjusting for multiple confounders (aHR = 0.944, 95% CI = 0.754-1.81). Peak alanine transaminase levels also did not influence pancreas graft survival in multivariable models (aHR = 0.967, 95% CI = 0.848-1.102). Restricted cubic splines were used to assess associations between donor blood tests and pancreas graft survival without assuming linear relationships; these confirmed that neither amylase nor transaminase levels significantly impact pancreas transplant outcome. This is the largest, most statistically robust study evaluating donor blood tests and transplant outcome. Provided other factors are acceptable, pancreata from donors with mildly or moderately raised amylase and transaminases can be accepted with confidence. The use of pancreas grafts from such donors is therefore a safe, immediate, and simple approach to expand the donor pool to meet increasing demand.
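
The restricted cubic splines used here relax the linearity assumption while forcing the fitted curve to be linear beyond the outer knots. A sketch of the truncated-power basis in Harrell's parameterization (the knot placement below is illustrative, not taken from the paper):

```python
def rcs_basis(x, knots):
    """Restricted cubic spline basis (Harrell's truncated-power form).

    Returns [x, s_1(x), ..., s_{k-2}(x)] for k knots; each s_j is built so
    the cubic terms cancel outside the outer knots, leaving linear tails.
    """
    k = len(knots)
    tk, tk1 = knots[-1], knots[-2]

    def pos3(u):
        return max(u, 0.0) ** 3   # truncated cube (u)_+^3

    basis = [x]
    for j in range(k - 2):
        tj = knots[j]
        term = (pos3(x - tj)
                - pos3(x - tk1) * (tk - tj) / (tk - tk1)
                + pos3(x - tk) * (tk1 - tj) / (tk - tk1))
        basis.append(term)
    return basis
```

Regressing the outcome on [x, s_1(x), ..., s_{k-2}(x)] then yields a smooth, tail-linear curve for a predictor such as donor amylase, without presupposing a linear dose-response.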


Amylases; Graft Survival; Kidney Transplantation; Pancreas Transplantation; Tissue Donors; Humans; Female; Male; Middle Aged; Adult; Amylases/blood; Cohort Studies; Alanine Transaminase/blood; United Kingdom; Hematologic Tests; Registries
10.
Exp Clin Transplant ; 22(3): 189-199, 2024 Mar.
Article En | MEDLINE | ID: mdl-38695588

OBJECTIVES: Kidney transplant survival can be improved with better graft surveillance postoperatively. In the quest to explore new technologies, we explored the feasibility of an implantable Doppler probe as a blood flow monitoring device in kidney transplant patients. This qualitative study was embedded in a feasibility trial and aimed to test the device's clinical acceptability and obtain suggestions for the development of the intervention. Objectives included exploring the experiences of feasibility study participants and identifying barriers to the implementation of implantable Doppler probes in clinical practice. MATERIALS AND METHODS: We conducted semi-structured interviews containing open-ended questions with 12 feasibility study participants recruited by purposive sampling. All interviews were audio-recorded with verbatim transcription. Thematic data analysis was performed at the latent level by using an inductive approach with a previously published 6-phase guide. RESULTS: Three key themes emerged: (1) perceived value of the intervention in clinical practice, (2) challenges and barriers to implementation of the intervention, and (3) suggestions for the development of the intervention. Owing to the device's functional limitations and the lack of supporting research, medical professional participants expressed clinical equipoise regarding the utility of implantable Doppler probes. However, the device was well received by patient participants. Challenges included device training needs for medical professionals and educational sessions for patients. Innovative ideas for development included the insertion of a display screen, adopting disposable units to reduce overall cost, online access allowing remote monitoring, decreasing external monitoring unit size, and integrating a wireless connection with the probe to reduce signal errors and increase patient safety. CONCLUSIONS: The clinical need for blood flow sensing technology in kidney transplants has been widely acknowledged.
Implantable Doppler probes may be a beneficial adjunct in the early postoperative surveillance of kidney transplant patients. However, the device's technical limitations are the main challenges to its acceptance in clinical practice.


Feasibility Studies; Interviews as Topic; Kidney Transplantation; Predictive Value of Tests; Qualitative Research; Ultrasonography, Doppler; Humans; Kidney Transplantation/adverse effects; Female; Male; Ultrasonography, Doppler/instrumentation; Middle Aged; Adult; Treatment Outcome; Equipment Design; Renal Circulation; Aged; Health Knowledge, Attitudes, Practice; Graft Survival; Blood Flow Velocity
11.
Exp Clin Transplant ; 22(3): 185-188, 2024 Mar.
Article En | MEDLINE | ID: mdl-38695587

OBJECTIVES: Before the advent of direct-acting antiviral therapy for hepatitis C virus, a large proportion of kidneys from donors with hepatitis C viremia were discarded. Hepatitis C virus is now amenable to effective treatment with excellent seronegativity rates. In this study, we review the outcomes of hepatitis C viremic kidneys transplanted into hepatitis C-naive recipients. MATERIALS AND METHODS: In this retrospective observational study, we examined 6 deceased donor kidneys with hepatitis C viremia that were transplanted into hepatitis C-naive recipients between March 2020 and April 2021 at a single center. Because of health insurance constraints, patients were treated for hepatitis C virus with glecaprevir/pibrentasvir for 8 weeks following seroconversion posttransplant. The primary outcome measured was viral seroconversion; secondary outcomes included graft function, posttransplant complications, and all-cause mortality. RESULTS: On average, patients seroconverted 6 days (range, 4-10 d) after transplant and began treatment 26 days (range, 15-37 d) after seroconversion. An 8-week course of antiviral treatment was successful in preventing acute hepatitis C virus infection in all patients. Posttransplant median creatinine was 1.96 mg/dL (range, 1-4.55 mg/dL), whereas median estimated glomerular filtration rate was 41.33 mL/min/1.73 m2 (range, 17-85 mL/min/1.73 m2). Patient survival rate was 66.7%, and death-censored graft survival rate was 100%. Two patients died from unrelated reasons: 1 from acute respiratory failure secondary to SARS-CoV-2 infection and 1 from posttransplant lymphoproliferative disorder. Two patients developed allograft rejection posttransplant (1 developed antibody-mediated rejection, 1 developed borderline T-cell-mediated rejection). Other major complications included neutropenia, fungal rash, SARS-CoV-2 infection, cytomegalovirus, BK virus, and Epstein-Barr virus reactivation.
CONCLUSIONS: Use of hepatitis C-viremic donor kidneys for transplant is a safe option and has great potential to increase the kidney donor pool, as long as a high index of suspicion is maintained for allograft rejection and opportunistic infections.


Antiviral Agents; Benzimidazoles; Donor Selection; Hepatitis C; Kidney Transplantation; Pyrrolidines; Quinoxalines; Viremia; Humans; Kidney Transplantation/adverse effects; Kidney Transplantation/mortality; Retrospective Studies; Male; Female; Middle Aged; Antiviral Agents/therapeutic use; Hepatitis C/diagnosis; Hepatitis C/drug therapy; Treatment Outcome; Viremia/diagnosis; Viremia/virology; Adult; Time Factors; Risk Factors; Tissue Donors; Drug Combinations; Graft Survival; Aged; Rural Health Services; Seroconversion
12.
Exp Clin Transplant ; 22(3): 207-213, 2024 Mar.
Article En | MEDLINE | ID: mdl-38695589

OBJECTIVES: Modern immunosuppressive regimens have reduced rejection episodes in renal allograft recipients but have increased the risk of opportunistic infections. Infections are considered to be the second leading cause of death after cardiovascular complications in renal allograft recipients. Data on opportunistic infections affecting the allograft itself are scarce. The present study describes the spectrum of renal opportunistic infections and their outcomes diagnosed on renal allograft biopsies and nephrectomy specimens. MATERIALS AND METHODS: Our retrospective observational study was conducted from December 2011 to December 2021. We analyzed infectious episodes diagnosed on renal allograft biopsies or graft nephrectomy specimens. We obtained clinical, epidemiological, and laboratory details for analyses from hospital records. RESULTS: BK polyomavirus nephropathy was the most common opportunistic infection affecting the allograft, accounting for 47% of cases, followed by bacterial graft pyelonephritis (25%). Mucormycosis was the most common fungal infection. Time from transplant to diagnosis of infection ranged from 14 days to 39 months. Follow-up periods ranged from 1 to 10 years. Mortality was highest among patients with opportunistic fungal infection (62%), followed by viral infections, and graft failure rate was highest in patients with graft pyelonephritis (50%). Among patients with BK polyomavirus nephropathy, 45% had stable graft function compared with just 33% of patients with bacterial graft pyelonephritis. CONCLUSIONS: BK polyomavirus infection was the most common infection affecting the renal allograft in our study. Although fungal infections caused the highest mortality among our patients, bacterial graft pyelonephritis was responsible for the most graft failures. Correctly identifying infections on histology is important so that graft and patient survival can be prolonged.


Kidney Transplantation; Nephrectomy; Opportunistic Infections; Humans; Kidney Transplantation/adverse effects; Kidney Transplantation/mortality; Retrospective Studies; Male; Female; Nephrectomy/adverse effects; Middle Aged; Adult; Biopsy; Treatment Outcome; Time Factors; Risk Factors; Opportunistic Infections/immunology; Opportunistic Infections/mortality; Opportunistic Infections/diagnosis; Opportunistic Infections/microbiology; Opportunistic Infections/virology; Opportunistic Infections/epidemiology; Allografts; Living Donors; Graft Survival; Turkey/epidemiology; Aged; Pyelonephritis/microbiology; Pyelonephritis/diagnosis; Pyelonephritis/mortality; Polyomavirus Infections/diagnosis; Polyomavirus Infections/mortality; Polyomavirus Infections/virology; Polyomavirus Infections/epidemiology; Polyomavirus Infections/immunology
13.
Transpl Int ; 37: 12591, 2024.
Article En | MEDLINE | ID: mdl-38694489

Tacrolimus is pivotal in pancreas transplants but poses challenges in maintaining optimal levels due to recipient differences. This study aimed to explore the utility of time spent below the therapeutic range and intrapatient variability in predicting rejection and de novo donor-specific antibody (dnDSA) development in pancreas graft recipients. This retrospective single-center study included adult pancreas transplant recipients between January 2006 and July 2020. Recorded variables included demographics, immunosuppression details, HLA matching, biopsy results, dnDSA development, and clinical parameters. Statistical analysis included ROC curves, sensitivity, specificity, and predictive values. A total of 131 patients were included. Those with biopsy-proven acute rejection (BPAR, 12.2%) had more time (39.9% ± 24% vs. 25.72% ± 21.57%, p = 0.016) and tests (41.95% ± 13.57% vs. 29.96% ± 17.33%, p = 0.009) below the therapeutic range. Specific cutoffs of 31.5% for time and 34% for tests below the therapeutic range showed a high negative predictive value for BPAR (93.98% and 93.1%, respectively). Similarly, patients with more than 34% of tests below the therapeutic range were associated with dnDSA appearance (38.9% vs. 9.4%, p = 0.012; OR 6.135, 1.346-27.78). In pancreas transplantation, maintaining optimal tacrolimus levels is crucial. The percentage of tests below the therapeutic range proves valuable in identifying patients at risk of acute graft rejection.
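
The cutoff analysis described above (sensitivity, specificity, and predictive values at a threshold such as 34% of tests below range) reduces to a 2x2 table. A sketch in Python with hypothetical values:

```python
def cutoff_metrics(values, outcomes, cutoff):
    """Dichotomize values at >= cutoff and compare against binary outcomes,
    returning (sensitivity, specificity, PPV, NPV)."""
    tp = fp = tn = fn = 0
    for v, y in zip(values, outcomes):
        positive = v >= cutoff          # "test positive" at this cutoff
        if positive and y:
            tp += 1
        elif positive:
            fp += 1
        elif y:
            fn += 1
        else:
            tn += 1
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv
```

A high NPV at the reported 31.5% and 34% cutoffs means that recipients who stay mostly within the therapeutic range rarely go on to biopsy-proven rejection.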


Graft Rejection; Immunosuppressive Agents; Pancreas Transplantation; Tacrolimus; Humans; Graft Rejection/immunology; Tacrolimus/therapeutic use; Male; Retrospective Studies; Female; Adult; Immunosuppressive Agents/therapeutic use; Middle Aged; Isoantibodies/blood; Isoantibodies/immunology; Tissue Donors; Time Factors; Biopsy; Graft Survival
14.
Pediatr Transplant ; 28(4): e14742, 2024 Jun.
Article En | MEDLINE | ID: mdl-38702926

BACKGROUND: As more pediatric patients become candidates for heart transplantation (HT), understanding pathological predictors of outcome and the accuracy of the pretransplantation evaluation are important to optimize utilization of scarce donor organs and improve outcomes. We aimed to investigate explanted heart specimens to identify pathologic predictors that may affect cardiac allograft survival after HT. METHODS: Explanted pediatric hearts obtained over an 11-year period were analyzed to understand the patient demographics, indications for transplant, and the clinical-pathological factors. RESULTS: In total, 149 explanted hearts (46% with congenital heart defects, CHD) were analyzed. CHD patients were younger, and mean pulmonary artery pressure and resistance were significantly lower than in cardiomyopathy patients. Twenty-one patients (14.1%) died or underwent retransplantation. Survival was significantly higher in the cardiomyopathy group at all follow-up intervals. There were more deaths, and the 1-, 5-, and 7-year survival was lower, in patients ≤10 years of age at HT. Early rejection was significantly higher in CHD patients exposed to homograft tissue, but not late rejection. Mortality/retransplantation rate was significantly higher and allograft survival lower in CHD hearts with excessive fibrosis of one or both ventricles. Anatomic diagnosis at pathologic examination differed from the clinical diagnosis in eight cases. CONCLUSIONS: Survival was better for the cardiomyopathy group and patients >10 years at HT. Prior homograft use was associated with a higher prevalence of early rejection. Ventricular fibrosis (of explant) was a strong predictor of outcome in the CHD group. We present several pathologic findings in explanted pediatric hearts.


Graft Rejection; Graft Survival; Heart Defects, Congenital; Heart Transplantation; Humans; Child; Male; Female; Child, Preschool; Infant; Adolescent; Heart Defects, Congenital/surgery; Heart Defects, Congenital/pathology; Graft Rejection/pathology; Graft Rejection/epidemiology; Retrospective Studies; Treatment Outcome; Follow-Up Studies; Cardiomyopathies/surgery; Cardiomyopathies/pathology; Reoperation; Infant, Newborn; Survival Analysis
15.
Pediatr Transplant ; 28(4): e14771, 2024 Jun.
Article En | MEDLINE | ID: mdl-38702924

BACKGROUND: We examined the combined effects of donor age and graft type on pediatric liver transplantation outcomes with the aim of offering insights into the strategic utilization of these donor and graft options. METHODS: A retrospective analysis was conducted using a national database on 0-2-year-old (N = 2714) and 3-17-year-old (N = 2263) pediatric recipients. These recipients were categorized based on donor age (≥40 vs <40 years) and graft type. Survival outcomes were analyzed using the Kaplan-Meier and Cox proportional hazards models, followed by an intention-to-treat (ITT) analysis to examine overall patient survival. RESULTS: Living and younger donors generally resulted in better outcomes compared to deceased and older donors, respectively. This difference was more pronounced among younger recipients (0-2 years) than among older recipients (3-17 years). Despite this finding, ITT survival analysis showed that donor age and graft type did not impact survival, with the exception of 0-2-year-old recipients, who had improved survival with a younger living donor graft. CONCLUSIONS: Timely transplantation has the largest impact on survival in pediatric recipients. Reducing waitlist mortality requires uniform surgical expertise at many transplant centers to provide technical variant graft (TVG) options and shed the conservative mindset of seeking only the "best" graft for pediatric recipients.


Graft Survival; Kaplan-Meier Estimate; Liver Transplantation; Tissue Donors; Humans; Child, Preschool; Retrospective Studies; Child; Adolescent; Male; Female; Infant; Age Factors; Infant, Newborn; Proportional Hazards Models; Adult; Treatment Outcome; Living Donors
16.
Microsurgery ; 44(4): e31186, 2024 May.
Article En | MEDLINE | ID: mdl-38716649

INTRODUCTION: Free flap transfer for head and neck defects has gained worldwide acceptance. Because flap failure is a devastating outcome, studies have attempted to identify risk factors, including renal failure. We sought to determine whether end-stage renal disease (ESRD) patients undergoing dialysis are at increased risk of flap failure following microsurgical head and neck reconstruction. PATIENTS AND METHODS: The study's participants were patients who underwent free flap reconstruction in the head and neck region at Hualien Tzu Chi Hospital between January 2010 and December 2019. We used the National Health Insurance "Specific Diagnosis and Treatment Code" to identify patients undergoing dialysis; these patients comprised the dialysis group, whose members were matched to a non-dialysis group for age and gender. The dependent variables were flap survival rate, take-back rate, and flap failure risk between the dialysis and non-dialysis groups. RESULTS: We included 154 patients in the dialysis (n = 14) and non-dialysis (n = 140) groups. The groups were similar in terms of age and most comorbidities, except diabetes mellitus, hypertension, and coronary artery disease, which were more prevalent in the dialysis group. The dialysis and non-dialysis groups had similar flap survival rates (100% vs. 92.9%; p = .600). Twenty-three patients underwent take-back surgery, most of them in the non-dialysis group (take-back rates: 14.3% vs. 15.0%; p = 1.000). Patients in the dialysis group were more likely to have prolonged intensive care unit stays; however, dialysis alone did not predict flap failure (OR: 0.83; p = .864). CONCLUSION: This study found no significant differences in free flap survival and take-back rates between patients with and without dialysis. Dialysis did not increase the risk of flap failure following microsurgical head and neck reconstruction in this study; however, prospective, randomized controlled trials are needed.


Free Tissue Flaps , Head and Neck Neoplasms , Kidney Failure, Chronic , Microsurgery , Plastic Surgery Procedures , Renal Dialysis , Humans , Male , Female , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/complications , Middle Aged , Free Tissue Flaps/transplantation , Plastic Surgery Procedures/methods , Microsurgery/methods , Head and Neck Neoplasms/surgery , Head and Neck Neoplasms/complications , Aged , Retrospective Studies , Graft Survival , Risk Factors , Adult
17.
Clin Transplant ; 38(5): e15325, 2024 May.
Article En | MEDLINE | ID: mdl-38716770

BACKGROUND/AIMS: Direct-acting antiviral (DAA) therapy has revolutionized solid organ transplantation by providing an opportunity to utilize organs from HCV-viremic donors. Though transplantation of HCV-viremic donor organs into aviremic recipients is safe in the short term, midterm data on survival and post-transplant complications are lacking. We provide a midterm assessment of complications of lung transplantation (LT) up to 2 years post-transplant, including patient and graft survival between HCV-viremic transplantation (D+) and HCV-aviremic transplantation (D-). METHODS: This is a retrospective cohort study including 500 patients from 2018 to 2022 who underwent LT at our quaternary care institution. Outcomes of patients receiving D+ grafts were compared to those receiving D- grafts. Recipients of HCV antibody+ but PCR- grafts were treated as D- recipients. RESULTS: We identified 470 D- and 30 D+ patients meeting inclusion criteria. Crude mortality did not differ between groups (p = .43). Patient survival at years 1 and 2 did not differ between D+ and D- patients (p = .89, p = .87, respectively), and graft survival at years 1 and 2 did not differ between the two groups (p = .90, p = .88, respectively). No extrahepatic manifestations or fibrosing cholestatic hepatitis (FCH) occurred among D+ recipients. D+ and D- patients had similar rates of post-transplant chronic lung allograft dysfunction (CLAD) (6.7% vs. 12.8%, p = .3), acute cellular rejection (60.0% vs. 58.0%, p = .8), and antibody-mediated rejection (16.7% vs. 14.2%, p = .7). CONCLUSION: There is no difference in midterm patient or graft survival between D+ and D- LT. No extrahepatic manifestations of HCV occurred. No differences in any type of rejection, including CLAD, were observed, though follow-up for CLAD was limited. These results provide additional support for the use of HCV-viremic organs in selected recipients in LT.


Graft Rejection , Graft Survival , Hepacivirus , Hepatitis C , Lung Transplantation , Postoperative Complications , Viremia , Humans , Lung Transplantation/adverse effects , Female , Male , Retrospective Studies , Middle Aged , Follow-Up Studies , Prognosis , Hepatitis C/surgery , Hepatitis C/virology , Hepacivirus/isolation & purification , Viremia/virology , Viremia/etiology , Survival Rate , Graft Rejection/etiology , Risk Factors , Tissue Donors/supply & distribution , Adult , Antiviral Agents/therapeutic use , Transplant Recipients
18.
Transpl Int ; 37: 12338, 2024.
Article En | MEDLINE | ID: mdl-38813393

The current gold standard for preserving vascularized composite allografts (VCA) is 4°C static cold storage (SCS), although muscle vulnerability to ischemia is evident after as little as 2 h of SCS. Alternatively, machine perfusion (MP) is an increasingly used approach in organ preservation. Herein, we investigated the outcomes of oxygenated acellular subnormothermic machine perfusion (SNMP) for 24-h VCA preservation before allotransplantation in a swine model. Six partial hindlimbs were procured from adult pigs and preserved ex vivo for 24 h with either SNMP (n = 3) or SCS (n = 3) before heterotopic allotransplantation. Recipient animals received immunosuppression and were followed up for 14 days. Clinical monitoring was carried out twice daily, and graft biopsies and blood samples were regularly collected. Two blinded pathologists assessed skin and muscle samples. Overall survival was higher in the SNMP group. Early euthanasia of 2 animals in the SCS group was linked to significant graft degeneration. Analyses of the grafts showed massive muscle degeneration in the SCS group and a normal appearance in the SNMP group 2 weeks after allotransplantation. Therefore, this 24-h SNMP protocol using a modified Steen solution generated better clinical and histological outcomes in allotransplantation when compared to time-matched SCS.


Graft Survival , Organ Preservation , Perfusion , Vascularized Composite Allotransplantation , Animals , Organ Preservation/methods , Perfusion/methods , Swine , Vascularized Composite Allotransplantation/methods , Hindlimb , Composite Tissue Allografts , Models, Animal , Transplantation, Homologous , Allografts
19.
BMC Surg ; 24(1): 165, 2024 May 27.
Article En | MEDLINE | ID: mdl-38802757

BACKGROUND: Kidney transplantation (KT) improves clinical outcomes of patients with end-stage renal disease. Little has been reported on the impact of early post-operative surgical complications (SC) on long-term clinical outcomes following KT. We sought to determine the impact of vascular complications, urological complications, surgical site complications, and peri-graft collections within 30 days of transplantation on patient survival, graft function, and hospital readmissions. METHODS: We conducted a single-centre, observational cohort study examining adult patients (≥ 18 years) who received a kidney transplant from living and deceased donors between January 1st, 2005 and December 31st, 2015 with follow-up until December 31st, 2016 (n = 1,334). Univariable and multivariable analyses were performed with Cox proportional hazards models to analyze the outcomes of SC in the early post-operative period after KT. RESULTS: The cumulative probability of SC within 30 days of transplant was 25%, the most common SC being peri-graft collections (66.8%). Multivariable analyses showed significant relationships between Clavien Grade 1 SC and death with graft function (HR 1.78 [95% CI: 1.11, 2.86]), and between Clavien Grades 3 to 4 and hospital readmissions (HR 1.95 [95% CI: 1.37, 2.77]). CONCLUSIONS: Early SC following KT are common and have a significant influence on long-term patient outcomes.
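The hazard ratios above come from Cox proportional hazards models, where the HR is the exponential of a log-hazard coefficient and the 95% CI is a Wald interval computed on the log scale. A minimal sketch of that relationship (not the authors' code; the standard error here is back-calculated from the reported interval for Clavien Grade 1 SC, HR 1.78 [1.11, 2.86]):

```python
import math

def hr_wald_ci(beta, se, z=1.96):
    """Hazard ratio and 95% Wald CI from a Cox model log-hazard
    coefficient beta and its standard error se."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# A symmetric Wald interval on the log scale implies
# se = (log(upper) - log(lower)) / (2 * z)
se = (math.log(2.86) - math.log(1.11)) / (2 * 1.96)

hr, lo, hi = hr_wald_ci(math.log(1.78), se)
# Reproduces approximately (1.78, 1.11, 2.86)
```

That the reconstructed bounds match the reported ones confirms the interval is symmetric on the log scale, which is why the CI looks asymmetric around 1.78 on the original scale.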


Kidney Failure, Chronic , Kidney Transplantation , Postoperative Complications , Humans , Kidney Transplantation/adverse effects , Male , Female , Middle Aged , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Adult , Kidney Failure, Chronic/surgery , Graft Survival , Patient Readmission/statistics & numerical data , Retrospective Studies , Treatment Outcome , Aged , Time Factors
20.
Exp Clin Transplant ; 22(4): 270-276, 2024 Apr.
Article En | MEDLINE | ID: mdl-38742317

OBJECTIVES: Induction treatment in renal transplant is associated with better graft survival. However, intensified immunosuppression is known to cause unwanted side effects such as infection and malignancy. Furthermore, the effects of the routine use of immunosuppressants in low-risk kidney transplant recipients are still not clear. In this study, we assessed the first-year safety and efficacy of induction treatment. MATERIALS AND METHODS: We examined first living donor kidney transplant patients who were on tacrolimus-based immunosuppression therapy. We formed 3 groups according to the induction status: antithymocyte globulin induction, basiliximab induction, and no induction. We collected outcome data on delayed graft function, graft loss, creatinine levels, estimated glomerular filtration rates, acute rejection episodes, hospitalization episodes, and infection episodes, including cytomegalovirus infection and bacterial infections. RESULTS: We examined a total of 126 patients (age 35 ± 12 years; 65% male). Of them, 25 received antithymocyte globulin, 52 received basiliximab, and 49 did not receive any induction treatment. We did not observe any statistically significant difference among the 3 groups in terms of acute rejection episodes, delayed graft function, and first-year graft loss. The estimated glomerular filtration rates were similar among the groups. Overall bacterial infectious complications and cytomegalovirus infection showed similar prevalence among all groups. Hospitalization was less common in the induction-free group. CONCLUSIONS: In low-risk patients, induction-free regimens could be associated with a better safety profile without compromising graft survival. Therefore, induction treatment may be disregarded in first living donor transplant patients who receive tacrolimus-based triple immunosuppression treatment.


Antilymphocyte Serum , Basiliximab , Graft Rejection , Graft Survival , Immunosuppressive Agents , Kidney Transplantation , Living Donors , Tacrolimus , Humans , Kidney Transplantation/adverse effects , Basiliximab/adverse effects , Basiliximab/therapeutic use , Immunosuppressive Agents/adverse effects , Immunosuppressive Agents/therapeutic use , Female , Male , Tacrolimus/adverse effects , Tacrolimus/therapeutic use , Adult , Antilymphocyte Serum/adverse effects , Antilymphocyte Serum/therapeutic use , Middle Aged , Treatment Outcome , Time Factors , Graft Rejection/immunology , Graft Rejection/prevention & control , Graft Survival/drug effects , Risk Factors , Retrospective Studies , Delayed Graft Function/immunology , Young Adult , Recombinant Fusion Proteins/adverse effects , Recombinant Fusion Proteins/therapeutic use , Calcineurin Inhibitors/adverse effects , Calcineurin Inhibitors/administration & dosage , Drug Therapy, Combination