ABSTRACT
Germinal centers (GCs) are the primary sites of clonal B cell expansion and affinity maturation, directing the production of high-affinity antibodies. This response is a central driver of pathogenesis in autoimmune diseases, such as systemic lupus erythematosus (SLE), but the natural history of autoreactive GCs remains unclear. Here, we present a novel mouse model where the presence of a single autoreactive B cell clone drives the TLR7-dependent activation, expansion, and differentiation of other autoreactive B cells in spontaneous GCs. Once tolerance was broken for one self-antigen, autoreactive GCs generated B cells targeting other self-antigens. GCs became independent of the initial clone and evolved toward dominance of individual clonal lineages, indicating affinity maturation. This process produced serum autoantibodies to a breadth of self-antigens, leading to antibody deposition in the kidneys. Our data provide insight into the maturation of the self-reactive B cell response, contextualizing the epitope spreading observed in autoimmune disease.
Subjects
B Lymphocytes/immunology, Clonal Evolution, Germinal Center/cytology, Germinal Center/immunology, Immune Tolerance, Animals, Autoantibodies/immunology, Autoantigens/immunology, Autoimmune Diseases/immunology, B Lymphocytes/cytology, Chimera/immunology, Epitopes/immunology, Kidney/immunology, Mice, Inbred C57BL Mice
ABSTRACT
Recent human decedent model studies[1,2] and compassionate xenograft use[3] have explored the promise of porcine organs for human transplantation. To proceed to human studies, a clinically ready porcine donor must be engineered and its xenograft successfully tested in nonhuman primates. Here we describe the design, creation and long-term life-supporting function of kidney grafts from a genetically engineered porcine donor transplanted into a cynomolgus monkey model. The porcine donor was engineered to carry 69 genomic edits, eliminating glycan antigens, overexpressing human transgenes and inactivating porcine endogenous retroviruses. In vitro functional analyses showed that the edited kidney endothelial cells modulated inflammation to an extent that was indistinguishable from that of human endothelial cells, suggesting that these edited cells acquired a high level of human immune compatibility. When transplanted into cynomolgus monkeys, the kidneys with three glycan antigen knockouts alone experienced poor graft survival, whereas those with glycan antigen knockouts and human transgene expression demonstrated significantly longer survival time, suggesting the benefit of human transgene expression in vivo. These results show that preclinical studies of renal xenotransplantation could be successfully conducted in nonhuman primates and bring us closer to clinical trials of genetically engineered porcine renal grafts.
Subjects
Graft Rejection, Kidney Transplantation, Macaca fascicularis, Swine, Heterologous Transplantation, Animals, Humans, Genetically Modified Animals, Endothelial Cells/immunology, Endothelial Cells/metabolism, Graft Rejection/immunology, Graft Rejection/prevention & control, Kidney Transplantation/methods, Polysaccharides/deficiency, Swine/genetics, Heterologous Transplantation/methods, Transgenes/genetics
ABSTRACT
BACKGROUND & AIMS: Continuous risk stratification of candidates and urgency-based prioritization have been utilized for liver transplantation (LT) in candidates without hepatocellular carcinoma (HCC) in the United States. For patients with HCC, by contrast, a dichotomous criterion with exception points is still used. This study evaluated the utility of the hazard associated with LT for HCC (HALT-HCC), an oncological continuous risk score, to stratify waitlist dropout and post-LT outcomes. METHODS: A competing risk model was developed and validated using the UNOS database (2012-2021), which spans multiple policy changes. The primary outcome was the ability of the score to discriminate waitlist dropout and post-LT outcomes. The HALT-HCC score was compared with other HCC risk scores. RESULTS: Among 23,858 candidates, 14,646 (59.9%) underwent LT and 5196 (21.8%) dropped off the waitlist. Higher HALT-HCC scores correlated with increased dropout incidence and lower predicted 5-year overall survival after LT. HALT-HCC demonstrated the highest area under the curve (AUC) values for predicting dropout at various intervals post-listing (0.68 at 6 months, 0.66 at 1 year), with excellent calibration (R2 = 0.95 at 6 months, 0.88 at 1 year). Its accuracy remained stable across policy periods and locoregional therapy applications. CONCLUSIONS: This study highlights the capability of the continuous oncological risk score to forecast waitlist dropout and post-LT outcomes in patients with HCC, independent of policy changes. The study advocates integrating continuous scoring systems like HALT-HCC into liver allocation decisions, balancing urgency, organ utility, and survival benefit.
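Illustrative sketch (not the study's code; the column names halt_hcc, months_on_list, and dropout are hypothetical and the data are simulated): one way a fixed-horizon dropout AUC and a decile-based calibration R² for a continuous waitlist risk score could be computed.

```python
# Minimal sketch: fixed-horizon discrimination and calibration for a continuous
# waitlist risk score. Toy data only; not the authors' pipeline.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "halt_hcc": rng.gamma(shape=2.0, scale=5.0, size=n),      # continuous risk score
    "months_on_list": rng.exponential(scale=12.0, size=n),    # waitlist follow-up
})
# Simulate dropout with probability increasing in the score (toy relationship).
df["dropout"] = rng.random(n) < 1 / (1 + np.exp(-(df["halt_hcc"] - 12) / 4))

def horizon_auc(df, horizon_months):
    """AUC for dropout by a fixed horizon, restricted to patients with known status."""
    known = df[(df["dropout"]) | (df["months_on_list"] >= horizon_months)].copy()
    label = known["dropout"] & (known["months_on_list"] < horizon_months)
    return roc_auc_score(label, known["halt_hcc"])

def calibration_r2(df, horizon_months, bins=10):
    """R^2 between mean score and observed dropout rate across score deciles."""
    known = df[(df["dropout"]) | (df["months_on_list"] >= horizon_months)].copy()
    known["label"] = known["dropout"] & (known["months_on_list"] < horizon_months)
    known["decile"] = pd.qcut(known["halt_hcc"], bins, labels=False)
    grouped = known.groupby("decile").agg(pred=("halt_hcc", "mean"),
                                          obs=("label", "mean"))
    return np.corrcoef(grouped["pred"], grouped["obs"])[0, 1] ** 2

print("AUC at 6 months:", round(horizon_auc(df, 6), 2))
print("Calibration R^2 at 6 months:", round(calibration_r2(df, 6), 2))
```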
Subjects
Hepatocellular Carcinoma, Liver Neoplasms, Liver Transplantation, Waiting Lists, Humans, Hepatocellular Carcinoma/surgery, Liver Neoplasms/surgery, Male, Female, Middle Aged, Risk Assessment/methods, United States/epidemiology, Aged, Adult
ABSTRACT
Post-liver transplant (LT) immunosuppression is necessary to prevent rejection; however, a major consequence is an increased risk of tumor recurrence. Although recurrence is a concern after LT for patients with HCC, the oncologically optimal tacrolimus (FK) regimen is still unknown. This retrospective study included 1406 patients with HCC who underwent LT (2002-2019) at 4 US institutions using variable post-LT immunosuppression regimens. Receiver operating characteristic analyses were performed to investigate the influence of the post-LT time-weighted average FK (TWA-FK) level on HCC recurrence. A competing risk analysis was employed to evaluate the prognostic influence of TWA-FK while adjusting for patient and tumor characteristics. The AUC for TWA-FK was greatest at 2 weeks (0.68), followed by 1 week (0.64) after LT. Importantly, this was consistently observed across the institutions despite immunosuppression regimen variability. In addition, the TWA-FK at 2 weeks was not associated with rejection within 6 months of LT. A competing risk regression analysis showed that TWA-FK at 2 weeks after LT is significantly associated with recurrence (HR: 1.31, 95% CI: 1.21-1.41, p < 0.001). The TWA-FK effect on recurrence varied depending on the exposure level and the individual's risk of recurrence, including vascular invasion and tumor morphology. Although previous studies have explored the influence of FK levels at 1-3 months after LT on HCC recurrence, the current study suggests that earlier time points and exposure levels must be evaluated. Each patient's oncological risk must also be considered in developing an individualized immunosuppression regimen.
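Illustrative sketch (assumed data layout, not the study's analysis): a time-weighted average tacrolimus trough level over an early post-LT window can be computed with the trapezoid rule; the trough days and levels below are hypothetical.

```python
# Sketch: time-weighted average (TWA) tacrolimus level over a post-transplant window.
import numpy as np
import pandas as pd

# Hypothetical trough measurements for one recipient: post-LT day and FK level (ng/mL).
troughs = pd.DataFrame({
    "post_lt_day": [1, 3, 5, 7, 10, 14],
    "fk_level":    [4.2, 6.8, 8.1, 9.0, 7.5, 6.9],
})

def twa_fk(troughs, window_days):
    """Time-weighted average FK level up to `window_days`, by the trapezoid rule."""
    d = troughs[troughs["post_lt_day"] <= window_days].sort_values("post_lt_day")
    t = d["post_lt_day"].to_numpy(dtype=float)
    y = d["fk_level"].to_numpy(dtype=float)
    area = np.sum((y[1:] + y[:-1]) / 2 * np.diff(t))   # ng/mL * days
    return area / (t[-1] - t[0])

print("TWA-FK over the first 2 weeks:", round(twa_fk(troughs, 14), 2), "ng/mL")
```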
ABSTRACT
Facile gene editing has accelerated progress in pig-to-nonhuman-primate (NHP) renal xenotransplantation; however, outcomes are considered inferior to those of NHP allotransplantation. This systematic review and outcomes analysis of life-sustaining NHP renal transplantation aimed to benchmark "preclinical success" and aggregated 1051 NHP-to-NHP or pig-to-NHP transplants across 88 articles. Although protocols varied, NHP-allotransplantation survival (67.5%, 37.1%, and 13.2% at 1, 3, and 12 months) was significantly greater than NHP-xenotransplantation survival (38.8%, 14.0%, and 4.4% at 1, 3, and 12 months; p < .001), a difference partially mitigated by gene-edited donors carrying at least a knockout of alpha-1,3-galactosyltransferase (47.1%, 24.2%, and 7.6% at 1, 3, and 12 months; p < .001). Pathological analysis demonstrated more cellular rejection in allotransplantation (62.8% vs. 3.1%, p < .001) and more antibody-mediated rejection in xenotransplantation (6.8% vs. 45.5%, p < .001). Nonrejection causes of graft loss also differed between allotransplants and xenotransplants: infection (1.7% vs. 11.2%) and animal welfare (3.9% vs. 17.0%), p < .001 for both. Importantly, even among a subgroup of unsensitized rhesus macaques under long-term immunosuppression, NHP-allotransplant survival was significantly inferior to clinical allotransplantation (36.1% vs. 94.0% at 6 months; p < .001), which suggests that clinical outcomes with renal xenografts may be better than predicted by current preclinical data.
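Illustrative sketch only: the group sizes below are hypothetical (the per-arm split is not given in the abstract) and the rates are taken from the reported 12-month survival figures; it shows one simple way pooled proportions could be compared with a chi-square test, not the review's actual analysis.

```python
# Sketch: comparing pooled 12-month graft survival between allotransplant and
# xenotransplant groups with a chi-square test on a 2x2 table (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

allo_n, allo_rate = 600, 0.132     # hypothetical arm size, reported 12-month rate
xeno_n, xeno_rate = 451, 0.044
table = np.array([
    [round(allo_n * allo_rate), allo_n - round(allo_n * allo_rate)],   # surviving, failed
    [round(xeno_n * xeno_rate), xeno_n - round(xeno_n * xeno_rate)],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4g}")
```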
Subjects
Kidney Transplantation, Transplants, Animals, Graft Rejection/etiology, Graft Survival, Heterografts, Humans, Kidney Transplantation/adverse effects, Kidney Transplantation/methods, Macaca mulatta, Swine, Heterologous Transplantation/adverse effects, Heterologous Transplantation/methods
ABSTRACT
Prognosticating outcomes in liver transplant (LT) for hepatocellular carcinoma (HCC) continues to challenge the field. Although the Milan Criteria (MC) generalized the practice of LT for HCC and improved outcomes, their predictive character has degraded with increasing candidate and oncological heterogeneity. We sought to validate and recalibrate a previously developed, preoperatively calculated, continuous risk score, the Hazard Associated with Liver Transplantation for Hepatocellular Carcinoma (HALTHCC), in an international cohort. From 2002 to 2014, 4,089 patients (both within and beyond MC [25.2% beyond]) across 16 centers in North America, Europe, and Asia were included. A continuous risk score using pre-LT levels of alpha-fetoprotein, Model for End-Stage Liver Disease Sodium score, and tumor burden score was recalibrated in a randomly selected cohort (n = 1,021) and validated in the remainder (n = 3,068). This study demonstrated significant heterogeneity by site and year, reflecting practice trends over the last decade. On explant pathology, both vascular invasion (VI) and a poorly differentiated component (PDC) increased with increasing HALTHCC score. The lowest-risk patients (HALTHCC 0-5) had lower rates of VI and PDC than the highest-risk patients (HALTHCC > 35) (VI: 7.7% [1.2-14.2] vs. 70.6% [48.3-92.9]; PDC: 4.6% [0.1-9.8] vs. 47.1% [22.6-71.5]; P < 0.0001 for both). This trend was robust to MC status. This international study was used to adjust the coefficients in the HALTHCC score. Before recalibration, HALTHCC had the greatest discriminatory ability for overall survival (OS; C-index = 0.61) compared to all previously reported scores. Following recalibration, the prognostic utility increased for both recurrence (C-index = 0.71) and OS (C-index = 0.63). Conclusion: This large international study validated and refined the role of the continuous risk metric, HALTHCC, in establishing pre-LT risk among candidates with HCC worldwide. Prospective trials introducing HALTHCC into clinical practice are warranted.
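Illustrative sketch (simulated data; the variable names log_afp, meld_na, and tumor_burden are assumptions): recalibrating score coefficients in a training split with a Cox model and checking discrimination with Harrell's C-index in the validation split, broadly analogous to the approach described.

```python
# Sketch: recalibrate a continuous risk score in a training cohort and validate
# discrimination with Harrell's C-index. Toy data; not the study's code.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "log_afp": rng.normal(3.0, 1.5, size=n),                 # log(alpha-fetoprotein)
    "meld_na": rng.integers(6, 40, size=n).astype(float),
    "tumor_burden": rng.gamma(2.0, 2.0, size=n),
})
risk = 0.4 * df["log_afp"] + 0.05 * df["meld_na"] + 0.2 * df["tumor_burden"]
df["months"] = rng.exponential(scale=60.0 * np.exp(-(risk - risk.mean())), size=n)
df["event"] = (rng.random(n) < 0.7).astype(int)              # death/recurrence indicator

train = df.sample(frac=0.25, random_state=0)                 # recalibration cohort
valid = df.drop(train.index)                                 # validation cohort

cph = CoxPHFitter()
cph.fit(train, duration_col="months", event_col="event")
print(cph.summary[["coef", "p"]])                            # recalibrated coefficients

# Higher predicted hazard should mean shorter survival, hence the minus sign.
score = cph.predict_partial_hazard(valid)
c_index = concordance_index(valid["months"], -score, valid["event"])
print("Validation C-index:", round(c_index, 3))
```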
Subjects
Hepatocellular Carcinoma/surgery, Liver Neoplasms/surgery, Liver Transplantation, Risk Assessment, Female, Humans, International Cooperation, Male, Middle Aged, Prognosis, Retrospective Studies
ABSTRACT
The use of livers from donation after circulatory death (DCD) is historically characterized by increased rates of biliary complications and inferior short-term graft survival (GS) compared to donation after brain death (DBD) allografts. This study aimed to evaluate the dynamic prognostic impact of DCD livers to reveal whether they remain an adverse factor even after patients survive a certain period following liver transplant (LT). The study population comprised 74,961 LT patients, including 4065 DCD LTs, from the Scientific Registry of Transplant Recipients (2002-2017). The actual, 1-year, and 3-year conditional hazard ratios (HRs) of 1-year GS in DCD LT were calculated using a conditional version of the Cox regression model. The actual 1-, 3-, and 5-year GS of DCD LT recipients were 83.3%, 73.3%, and 66.3%, which were significantly worse than those of DBD recipients (all P < 0.01). The actual, 1-year, and 3-year conditional HRs of 1-year GS for DCD compared with DBD livers were 1.87, 1.49, and 1.39, respectively. Graft loss analyses showed that the proportion of grafts lost to biliary-related complications remained significantly higher in the DCD group even 3 years after LT. National registry data demonstrate the protracted higher risks inherent to DCD liver grafts in comparison to their DBD counterparts, despite survival through the early period after LT. These findings underscore the importance of judicious DCD graft selection at the individual center level to minimize the risk of long-term biliary complications.
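Illustrative sketch (toy data, assumed column names): the conditional-hazard-ratio idea, i.e., restricting to grafts still functioning at a landmark time and fitting a Cox model for the following 12 months; this is a simplified stand-in for the registry analysis.

```python
# Sketch: conditional 1-year hazard ratio for DCD vs. DBD grafts at several landmarks.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 20000
dcd = (rng.random(n) < 0.05).astype(int)
# Toy graft-survival times: the excess hazard for DCD grafts is concentrated in year 1.
base = rng.exponential(scale=120.0, size=n)
early_loss = (dcd == 1) & (rng.random(n) < 0.15)
months = np.where(early_loss, rng.uniform(1.0, 12.0, size=n), base)
failed = (months < 180).astype(int)                           # administrative censoring at 180 mo
df = pd.DataFrame({"dcd": dcd, "months": np.minimum(months, 180.0), "failed": failed})

def conditional_hr(df, landmark_months, window_months=12):
    """HR for graft loss in the next `window_months`, given survival to `landmark_months`."""
    at_risk = df[df["months"] > landmark_months].copy()
    at_risk["t"] = (at_risk["months"] - landmark_months).clip(upper=window_months)
    at_risk["e"] = ((at_risk["failed"] == 1) &
                    (at_risk["months"] <= landmark_months + window_months)).astype(int)
    cph = CoxPHFitter().fit(at_risk[["dcd", "t", "e"]], duration_col="t", event_col="e")
    return float(np.exp(cph.params_["dcd"]))

for landmark in (0, 12, 36):
    print(f"Conditional 1-year HR at {landmark} months:",
          round(conditional_hr(df, landmark), 2))
```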
Subjects
Liver Transplantation, Tissue and Organ Procurement, Brain Death, Death, Graft Survival, Humans, Proportional Hazards Models, Retrospective Studies, Tissue Donors
ABSTRACT
This study estimated the utility of technical variant grafts (TVGs), such as split/reduced liver transplantation (SRLT) and living donor liver transplantation (LDLT), in pediatric acute liver failure (PALF). PALF is a devastating condition portending a poor prognosis without liver transplantation (LT). Pediatric candidates have fewer suitable organs available through deceased donor liver transplantation (DDLT), and the efficacy of TVGs in this setting remains incompletely investigated. PALF patients from 1995 to 2015 (age <18 years) were identified using the Scientific Registry of Transplant Recipients (n = 2419). Cox proportional hazards models and Kaplan-Meier curves were used to assess outcomes. Although wait-list mortality decreased (19.1% to 9.7%) and successful transplantations increased (53.7% to 62.2%), patients <1 year of age had persistently higher wait-list mortality rates (>20%) compared with other age groups (P < 0.001). TVGs accounted for only 25.7% of LT for PALF. In the adjusted model for wait-list mortality, increased age (subhazard ratio [SHR], 0.97 per year; P = 0.020) and access to TVGs (SHR, 0.37; P < 0.0001) were among the factors associated with decreased risk. LDLT recipients had shorter median waiting times compared with DDLT (LDLT versus DDLT versus SRLT, 3 versus 4 versus 5 days, respectively; P = 0.017). In the adjusted model for post-LT survival, LDLT was superior to DDLT using whole grafts (SHR, 0.41; P = 0.004). However, patient survival after SRLT was not statistically different from DDLT (SHR, 0.75; P = 0.165). In conclusion, despite clear advantages in reducing wait-list mortality, TVGs have been underutilized in PALF. Early access to TVGs, especially from LDLT, should be sought to further improve outcomes.
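Illustrative sketch (simulated data, assumed event coding): the competing-risks framing used for waitlist outcomes, estimating the cumulative incidence of waitlist death with transplantation treated as a competing event rather than as censoring; the authors' adjusted subhazard model is not reproduced here.

```python
# Sketch: nonparametric (Aalen-Johansen) cumulative incidence of waitlist death
# with transplantation as a competing event. Toy data only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
n = 2000
days = rng.exponential(scale=10.0, size=n).round().clip(min=1)
# Event codes: 0 = still waiting/censored, 1 = died on the waitlist, 2 = transplanted.
cause = rng.choice([0, 1, 2], size=n, p=[0.25, 0.15, 0.60])

def cumulative_incidence(durations, cause, cause_of_interest):
    """Aalen-Johansen estimate of the cumulative incidence of one competing event."""
    d = pd.DataFrame({"t": durations, "c": cause}).sort_values("t")
    at_risk, surv, cif, rows = len(d), 1.0, 0.0, []
    for t, grp in d.groupby("t", sort=True):
        cif += surv * (grp["c"] == cause_of_interest).sum() / at_risk
        surv *= 1 - (grp["c"] != 0).sum() / at_risk        # all-cause event-free survival
        at_risk -= len(grp)
        rows.append((t, cif))
    return pd.DataFrame(rows, columns=["time", "cuminc"])

ci_death = cumulative_incidence(days, cause, cause_of_interest=1)
print("Cumulative incidence of waitlist death at 30 days:",
      round(ci_death.loc[ci_death["time"] <= 30, "cuminc"].iloc[-1], 3))
```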
Subjects
Acute Liver Failure/surgery, Liver Transplantation/methods, Living Donors, Waiting Lists/mortality, Adolescent, Age Factors, Allografts/statistics & numerical data, Allografts/supply & distribution, Child, Preschool Child, Female, Follow-Up Studies, Graft Survival, Humans, Infant, Kaplan-Meier Estimate, Acute Liver Failure/diagnosis, Acute Liver Failure/mortality, Liver Transplantation/statistics & numerical data, Liver Transplantation/trends, Male, Prognosis, Registries/statistics & numerical data, Retrospective Studies, Risk Factors, Severity of Illness Index, Time Factors, Time to Treatment, Treatment Outcome
ABSTRACT
A recent study using US national registry data reported, using Cox proportional hazards (PH) models, that split-liver transplantation (SLT) has improved over time and is no more hazardous than whole-liver transplantation (WLT). However, the study methods violated the PH assumption, which is the fundamental assumption of Cox modeling. As a result, the reported hazard ratios (HRs) are biased and unreliable. This study aimed to investigate whether graft survival (GS) after SLT has truly improved over time, while ensuring attention to the PH assumption. The study included 80,998 adult deceased donor liver transplantations (LTs) (1998-2015) from the Scientific Registry of Transplant Recipients. The study period was divided into 3 time periods: era 1 (January 1998 to February 2002), era 2 (March 2002 to December 2008), and era 3 (January 2009 to December 2015). The PH assumption was tested using Schoenfeld's test, and where the HR of SLT violated the assumption, changes in the risk of SLT over time from transplant were assessed. SLT was performed in 1098 (1.4%) patients, whereas WLT was used in 79,900 patients. In the Cox PH analysis, the P values of Schoenfeld's global tests were <0.05 in all eras, which is consistent with deviation from proportionality. To assess the HR of SLT as a time-varying effect, multiple Cox models were fitted over successive post-LT intervals. The HR curves plotted against time from transplant were highest in the early period, decreased at approximately 1 year, and continued to decrease in all eras. For 1-year GS, the HRs of SLT were 1.92 in era 1, 1.52 in era 2, and 1.47 in era 3 (all P < 0.05). In conclusion, the risk of SLT has a time-varying effect and is highest in the early post-LT period. The risk of SLT is underestimated if it is evaluated by overall GS. SLT was still hazardous when the PH assumption was taken into account, although it became safer over time.
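Illustrative sketch (toy data; the split fraction and effect sizes are invented): testing the proportional-hazards assumption with a Schoenfeld-residual-based test and, when it fails, estimating an interval-specific HR by administrative censoring at 1 year, the same general strategy the abstract describes.

```python
# Sketch: PH assumption check and an interval-specific HR for split-liver grafts.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(3)
n = 30000
split = (rng.random(n) < 0.05).astype(int)
# Toy graft-survival times: the excess hazard for split grafts is concentrated early.
early_fail = (split == 1) & (rng.random(n) < 0.20)
months = np.where(early_fail, rng.uniform(1.0, 12.0, size=n),
                  rng.exponential(scale=96.0, size=n))
months = np.minimum(months, 120.0)
df = pd.DataFrame({"split": split, "months": months,
                   "event": (months < 120.0).astype(int)})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="event")
ph_test = proportional_hazard_test(cph, df, time_transform="rank")
ph_test.print_summary()        # a small p-value for `split` indicates a PH violation

# Interval-specific HR for the first post-transplant year (censor at 12 months).
df1 = df.assign(event=((df["months"] <= 12) & (df["event"] == 1)).astype(int),
                months=df["months"].clip(upper=12))
cph1 = CoxPHFitter().fit(df1, duration_col="months", event_col="event")
print("1-year HR for split grafts:", round(float(np.exp(cph1.params_["split"])), 2))
```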
Subjects
Graft Rejection/epidemiology, Graft Survival, Liver Transplantation/adverse effects, Tissue Donors/statistics & numerical data, Transplant Recipients/statistics & numerical data, Adolescent, Adult, Allografts/supply & distribution, Allografts/surgery, Statistical Data Interpretation, Female, Follow-Up Studies, Graft Rejection/etiology, Humans, Liver/surgery, Liver Transplantation/methods, Male, Middle Aged, Registries/statistics & numerical data, Time Factors, Treatment Outcome, United States/epidemiology, Young Adult
ABSTRACT
Patients with hepatocellular carcinoma (HCC) are screened at presentation for appropriateness of liver transplantation (LT) using morphometric criteria, which poorly specify risk. Morphology is the crux of measuring tumor response to locoregional therapy (LRT) using the modified Response Evaluation Criteria in Solid Tumors (mRECIST). This study investigated the utility of following a continuous risk score (hazard associated with liver transplantation in hepatocellular carcinoma; HALTHCC) to longitudinally assess risk. This multicenter, retrospective study from 2002 to 2014 enrolled 419 patients listed for LT for HCC. One cohort received LRT while waiting (n = 351) and was compared with a control group (n = 68) without LRT. Imaging studies (n = 2,085) were collated with laboratory data to calculate HALTHCC, MORAL, Metroticket 2.0, and alpha-fetoprotein (AFP) scores longitudinally. Cox proportional hazards models evaluated associations of HALTHCC and peri-LRT changes with intention-to-treat (ITT) survival (considering dropout or post-LT mortality), and utility was assessed with Harrell's C-index. HALTHCC better predicted ITT outcome (LT = 309; dropout = 110) when assessed closer to delisting (P < 0.0001), maximally just before delisting (C-index, 0.742 [0.643-0.790]). Delta-HALTHCC post-LRT was more sensitive to changes in risk than mRECIST. HALTHCC score and peri-LRT percentage change were independently associated with ITT mortality (hazard ratio = 1.105 [1.045-1.169] per point and 1.014 [1.004-1.024] per percent, respectively). CONCLUSIONS: HALTHCC is superior in assessing tumor risk in candidates awaiting LT, and its utility increases over time. Peri-LRT relative change in HALTHCC outperforms mRECIST in stratifying risk of dropout, mortality, and recurrence post-LT. With improving estimates of post-LT outcomes, it is reasonable to consider allocation using HALTHCC and not just waiting time. Furthermore, this study supports a shift in perspective, from listing to allocation, to better utilize precious donor organs. (Hepatology 2018).
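Illustrative sketch (hypothetical longitudinal records): computing the peri-LRT percentage change in a continuous risk score per patient, i.e., the last value before locoregional therapy versus the first value after, which is the quantity related to ITT outcomes in the abstract.

```python
# Sketch: per-patient peri-LRT percent change in a longitudinally measured risk score.
import pandas as pd

# Hypothetical longitudinal scores; `lrt_given` is True once LRT has occurred.
scores = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2],
    "days_from_listing": [0, 30, 90, 0, 45, 120],
    "lrt_given": [False, True, True, False, False, True],
    "halt_hcc": [14.0, 11.5, 10.8, 22.0, 23.5, 27.0],
})

changes = {}
for pid, grp in scores.sort_values("days_from_listing").groupby("patient_id"):
    pre = grp.loc[~grp["lrt_given"], "halt_hcc"].iloc[-1]    # last score before LRT
    post = grp.loc[grp["lrt_given"], "halt_hcc"].iloc[0]     # first score after LRT
    changes[pid] = 100.0 * (post - pre) / pre

# Positive values mean risk increased despite LRT; negative values mean it fell.
print(pd.Series(changes, name="peri_lrt_pct_change"))
```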
Subjects
Hepatocellular Carcinoma/pathology, Hepatocellular Carcinoma/surgery, Liver Neoplasms/pathology, Liver Neoplasms/surgery, Liver Transplantation/methods, Waiting Lists, Adult, Tumor Biomarkers/analysis, Needle Biopsy, Hepatocellular Carcinoma/mortality, Case-Control Studies, Female, Follow-Up Studies, Graft Survival, Humans, Immunohistochemistry, Kaplan-Meier Estimate, Liver Neoplasms/mortality, Liver Transplantation/adverse effects, Longitudinal Studies, Male, Middle Aged, Patient Selection, Proportional Hazards Models, Retrospective Studies, Risk Assessment, Survival Rate, Treatment Outcome, United States, alpha-Fetoproteins/metabolism
ABSTRACT
OBJECTIVE: The objective of this retrospective study was to characterize the neutrophil to lymphocyte ratio (NLR) on the waitlist and determine its prognostic utility in liver transplantation (LT) for hepatocellular carcinoma (HCC) with special focus on longitudinal data. Biomarkers such as pre-operative NLR have been suggested to predict poor oncological outcomes for patients with HCC seeking LT. NLR's utility is thought to be related to tumor biology. However, recent studies have demonstrated that a high NLR conveys worse outcomes in non-HCC cirrhotics. This study investigated the relationship between NLR, liver function, tumor factors and patient prognosis. METHODS: Patients with HCC undergoing LT were identified between 2002 and 2014 (n = 422). Variables of interest were collected longitudinally from time of listing until LT. The prognostic utility of NLR was assessed using Kaplan-Meier and Cox Proportional Hazard regression. Associations between NLR and MELD-Na, AFP, and tumor morphology were also assessed. RESULTS: NLR demonstrated a positive correlation with MELD-Na at LT (R2 = 0.125, P < 0.001) and had parallel trends over time. The lowest NLR quartile had a median MELD-Na of 9 while the highest had a median MELD-Na of 19. There were minimal differences in AFP, tumor morphology, and rates of vascular invasion between quartiles. NLR was a statistically significant predictor of OS (HR = 1.64, P = 0.017) and recurrence (HR = 1.59, P = 0.016) even after controlling for important tumor factors. However, NLR lost its statistical significance when MELD-Na was added to the Cox regression model (OS: HR = 1.46, P = 0.098) (recurrence: HR = 1.40, P = 0.115). CONCLUSIONS: NLR is a highly volatile marker on the waitlist that demonstrates a significant correlation and collinearity with MELD-Na temporally and at the time of LT. These characteristics of NLR bring into question its utility as a predictive marker in HCC patients.
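Illustrative sketch (simulated values, assumed column names): computing NLR, splitting it into quartiles, and checking its correlation with MELD-Na, mirroring the descriptive part of the analysis.

```python
# Sketch: NLR calculation, quartile grouping, and correlation with MELD-Na. Toy data.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "neutrophils": rng.gamma(4.0, 1.0, size=n),     # 10^3 cells/uL
    "lymphocytes": rng.gamma(2.0, 0.6, size=n),
    "meld_na": rng.integers(6, 40, size=n).astype(float),
})
df["nlr"] = df["neutrophils"] / df["lymphocytes"]
df["nlr_quartile"] = pd.qcut(df["nlr"], 4, labels=[1, 2, 3, 4])

r, p = pearsonr(df["nlr"], df["meld_na"])
print(f"Pearson r = {r:.2f} (R^2 = {r**2:.3f}), p = {p:.3g}")
print(df.groupby("nlr_quartile", observed=True)["meld_na"].median())   # median MELD-Na per quartile
```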
Subjects
Hepatocellular Carcinoma/mortality, Liver Neoplasms/mortality, Liver Transplantation/mortality, Lymphocytes/pathology, Neutrophils/pathology, Waiting Lists/mortality, Aged, Hepatocellular Carcinoma/pathology, Hepatocellular Carcinoma/surgery, Female, Follow-Up Studies, Humans, Liver Neoplasms/pathology, Liver Neoplasms/surgery, Male, Middle Aged, Prognosis, Retrospective Studies, Risk Factors, Survival Rate
ABSTRACT
OBJECTIVE: Portal vein thrombosis (PVT) does not preclude liver transplantation (LT), but poor portal vein (PV) flow after LT remains a predictor of poor outcomes. Given the physiologic tendency of the hepatic artery (HA) to compensate for low PV flow via vasodilation, we investigated whether adequate HA flow would have a favorable prognostic impact among patients with low PV flow following LT. METHODS: This study included 163 patients with PVT who underwent LT between 2004 and 2015. PV and HA flow were categorized into quartiles, and their association with 1-year graft survival (GS) and biliary complication rates was assessed. For both the HA and the PV, patients in the lowest two quartiles were categorized as having low flow and the remainder as having high flow. RESULTS: The median MELD score was 22 and 1-year GS was 87.3%. As expected, GS paralleled PV flow, with patients in the lowest flow quartile faring worst. When PV and HA flows were considered in combination, high HA flow was associated with improved 1-year GS among patients with low PV flow (P = .03). Similar findings were observed with respect to biliary complication rates. CONCLUSIONS: Sufficient HA flow may compensate for poor PV flow. Consequently, meticulous HA reconstruction may be central to achieving optimal outcomes in PVT cases.
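Illustrative sketch (simulated flows, assumed column names): dichotomizing PV and HA flow at the median (lowest two quartiles = low flow) and comparing survival between high- and low-HA-flow grafts within the low-PV-flow group with a log-rank test; a simplified stand-in for the study's comparison of 1-year GS and biliary complications.

```python
# Sketch: flow dichotomization and a within-subgroup log-rank comparison. Toy data.
import numpy as np
import pandas as pd
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
n = 163
df = pd.DataFrame({
    "pv_flow": rng.normal(1200, 400, size=n).clip(min=100),   # mL/min
    "ha_flow": rng.normal(300, 120, size=n).clip(min=30),
    "months": rng.exponential(scale=60.0, size=n).clip(max=12.0),
})
df["event"] = df["months"] < 12.0                             # graft loss within 1 year
df["low_pv"] = df["pv_flow"] <= df["pv_flow"].median()        # lowest two quartiles
df["high_ha"] = df["ha_flow"] > df["ha_flow"].median()

low_pv = df[df["low_pv"]]
a = low_pv[low_pv["high_ha"]]
b = low_pv[~low_pv["high_ha"]]
res = logrank_test(a["months"], b["months"],
                   event_observed_A=a["event"], event_observed_B=b["event"])
print("Log-rank p within the low-PV-flow group:", round(res.p_value, 3))
```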
Subjects
Hepatic Artery/physiopathology, Liver Diseases/mortality, Liver Transplantation/mortality, Liver/blood supply, Portal Vein/pathology, Venous Thrombosis/mortality, Adult, Aged, Female, Follow-Up Studies, Graft Survival, Humans, Liver Circulation, Liver Diseases/surgery, Male, Middle Aged, Prognosis, Survival Rate, Venous Thrombosis/physiopathology
ABSTRACT
INTRODUCTION: Investigation into right- and left-sided primary colon liver metastasis (CLM) has revealed differences in tumor biology and prognosis. This indicates that preoperative and operative factors may affect outcomes of right-sided primary CLM differently than left-sided CLM. This retrospective analysis investigated the effect of resection margin, stratified by left- and right-sided primary CLM, on overall survival (OS) for patients undergoing hepatectomy. METHODS: A total of 732 patients undergoing hepatic resection for CLM at the Cleveland Clinic and Johns Hopkins were identified between 2002 and 2016. Clinically significant variables were analyzed using Cox proportional hazards regression. The cohort was then divided into patients with right- and left-sided CLM and analyzed separately using Kaplan-Meier analysis and Cox proportional hazards regression. RESULTS: Cox proportional hazards regression showed that left-sided CLM with an R0 margin was a statistically significant predictor of OS even after controlling for other important factors (HR = 0.629, P = 0.024), but right-sided CLM with an R0 margin was not (HR = 0.788, P = 0.245). Kaplan-Meier analysis demonstrated that patients with a left-sided CLM and R0 margin had the best prognosis (P = 0.037). CONCLUSION: Surgical margin is an important prognostic factor for left-sided primary CLM, but tumor biology may override surgical technique for right-sided CLM.
Subjects
Colonic Neoplasms/pathology, Hepatectomy/methods, Liver Neoplasms/surgery, Margins of Excision, Neoplasm Staging, Aged, Colonic Neoplasms/mortality, Colonic Neoplasms/surgery, Female, Follow-Up Studies, Humans, Liver Neoplasms/diagnosis, Liver Neoplasms/secondary, Male, Middle Aged, Prognosis, Retrospective Studies, Survival Rate/trends, United States/epidemiology
Subjects
Heterologous Transplantation, Transplants, Animals, Kidney, Primates, Swine, Systematic Reviews as Topic
ABSTRACT
BACKGROUND: Reports of liver transplantation (LT) in patients with mixed hepatocellular carcinoma/cholangiocarcinoma (HCC/CC) and intrahepatic cholangiocarcinoma (ICC) are limited and have been mostly retrospective, based on pathological categorization in the setting of presumed HCC. Some studies suggest that patients undergoing LT with small and unifocal ICC or mixed HCC/CC can achieve about 40%-60% 5-year post-transplant survival. This study aimed to report our experience with patients undergoing LT whose explant pathology revealed HCC/CC or ICC. METHODS: We performed a cohort analysis from a prospectively maintained database and identified 13 patients who underwent LT with explant pathology revealing HCC/CC or ICC. RESULTS: The observed recurrence rate post-LT was 31% (4/13), and overall survival was 85%, 51%, and 51% at 1, 3, and 5 years, respectively. Disease-free survival was 68%, 51%, and 41% at 1, 3, and 5 years, respectively. In our cohort, four patients would have qualified for exception points based on updated HCC Organ Procurement and Transplantation Network imaging guidelines. CONCLUSIONS: Lesions that lack complete imaging characteristics of HCC may warrant pre-LT biopsy to fully elucidate their pathology. Patients identified with early HCC/CC or ICC may benefit from LT if the tumor is unresectable. Additionally, incorporating adjunctive perioperative therapies, as is done for patients undergoing LT for hilar cholangiocarcinoma, may improve outcomes, but this warrants further investigation.
Subjects
Bile Duct Neoplasms/surgery, Hepatocellular Carcinoma/surgery, Cholangiocarcinoma/surgery, Liver Neoplasms/surgery, Liver Transplantation, Complex and Mixed Neoplasms/surgery, Aged, Bile Duct Neoplasms/diagnostic imaging, Bile Duct Neoplasms/mortality, Bile Duct Neoplasms/pathology, Biopsy, Hepatocellular Carcinoma/diagnostic imaging, Hepatocellular Carcinoma/mortality, Hepatocellular Carcinoma/pathology, Cholangiocarcinoma/diagnostic imaging, Cholangiocarcinoma/mortality, Cholangiocarcinoma/pathology, Factual Databases, Disease Progression, Disease-Free Survival, Early Detection of Cancer, Female, Humans, Kaplan-Meier Estimate, Liver Neoplasms/diagnostic imaging, Liver Neoplasms/mortality, Liver Neoplasms/pathology, Liver Transplantation/adverse effects, Liver Transplantation/mortality, Male, Middle Aged, Local Neoplasm Recurrence, Complex and Mixed Neoplasms/diagnostic imaging, Complex and Mixed Neoplasms/mortality, Complex and Mixed Neoplasms/pathology, Ohio, Risk Factors, Time Factors, Treatment Outcome
ABSTRACT
Donation after circulatory death (DCD) donors show heterogeneous hemodynamic trajectories following withdrawal of life support. The impact of donor hemodynamics on outcomes in DCD liver transplantation is unclear, and objective measures of graft viability would ease transplant surgeon decision making and inform safe expansion of the donor organ pool. This retrospective study tested whether hemodynamic trajectories were associated with transplant outcomes in DCD liver transplantation (n = 87). Using longitudinal clustering statistical techniques, we phenotyped DCD donors based on hemodynamic trajectory for both mean arterial pressure (MAP) and peripheral oxygen saturation (SpO2) following withdrawal of life support. Donors were categorized into 3 clusters: those who declined gradually after withdrawal of life support (cluster 1), those who maintained stable hemodynamics followed by rapid decline (cluster 2), and those who declined rapidly (cluster 3). Clustering outputs were used to compare characteristics and transplant outcomes. Cox proportional hazards modeling revealed that hepatocellular carcinoma (hazard ratio [HR] = 2.53; P = 0.047), cold ischemia time (HR = 1.50 per hour; P = 0.027), and MAP cluster 1 (HR = 3.13; P = 0.021) were associated with increased risk of graft loss, but SpO2 cluster (P = 0.172) and donor warm ischemia time (DWIT; P = 0.154) were not. Despite longer DWIT, MAP and SpO2 cluster 2 showed graft survival similar to that of MAP and SpO2 cluster 3, respectively. In conclusion, despite heterogeneity in hemodynamic trajectories, DCD donors can be categorized into 3 clinically meaningful subgroups that help predict graft prognosis. Further studies should confirm the utility of liver grafts from cluster 2. Liver Transplantation 22:1469-1481, 2016 AASLD.
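Illustrative sketch (toy trajectories): the trajectory-clustering idea, here with plain k-means on resampled MAP curves after withdrawal of life support; the study used dedicated longitudinal-clustering methods, so this is a simplified stand-in.

```python
# Sketch: clustering donor MAP trajectories into three hemodynamic phenotypes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
minutes = np.arange(0, 60, 5)                       # MAP sampled every 5 min for 60 min

def make_trajectory(kind):
    if kind == "gradual":      # slow decline
        curve = 80 - 0.8 * minutes
    elif kind == "plateau":    # stable, then rapid decline
        curve = np.where(minutes < 30, 78.0, 78 - 3.0 * (minutes - 30))
    else:                      # rapid decline
        curve = 75 - 2.5 * minutes
    return np.clip(curve + rng.normal(0, 4, size=minutes.size), 0, None)

X = np.array([make_trajectory(k)
              for k in rng.choice(["gradual", "plateau", "rapid"], size=87)])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for c in range(3):
    print(f"cluster {c}: n = {(labels == c).sum()}, mean MAP at 30 min = "
          f"{X[labels == c, minutes.tolist().index(30)].mean():.0f} mmHg")
```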
Subjects
End-Stage Liver Disease/surgery, Graft Survival, Hemodynamics/physiology, Liver Transplantation/adverse effects, Liver/physiology, Adult, Allografts/physiology, Arterial Pressure, Cold Ischemia, Female, Humans, Male, Middle Aged, Phenotype, Prognosis, Proportional Hazards Models, Prospective Studies, Retrospective Studies, Risk Factors, Tissue Donors/classification, Tissue and Organ Procurement, Warm Ischemia
ABSTRACT
Preterm birth impacts brain development and leads to chronic deficits including cognitive delay, behavioral problems, and epilepsy. Premature loss of the subplate, a transient subcortical layer that guides development of the cerebral cortex and axonal refinement, has been implicated in these neurological disorders. Subplate neurons influence postnatal upregulation of the potassium chloride co-transporter KCC2 and maturation of γ-aminobutyric acid type A receptor (GABAAR) subunits. We hypothesized that prenatal transient systemic hypoxia-ischemia (TSHI) in Sprague-Dawley rats, which mimics brain injury from extreme prematurity in humans, would cause premature subplate loss and affect cortical layer IV development. Further, we predicted that the neuroprotective agent erythropoietin (EPO) could attenuate the injury. Prenatal TSHI induced subplate neuronal loss via apoptosis. TSHI impaired postnatal upregulation of KCC2 and GABAAR subunits in cortical layer IV, and postnatal EPO treatment mitigated the loss (n ≥ 8). To specifically address how subplate loss affects cortical development, we used in vitro mechanical subplate ablation in slice cultures (n ≥ 3) and found that EPO treatment attenuates KCC2 loss. Together, these results show that subplate loss contributes to impaired cerebral development, and EPO treatment diminishes the damage. Limiting premature subplate loss and the resultant impairment of cortical development may minimize the cerebral deficits suffered by extremely preterm infants.
Subjects
Brain Injuries/drug therapy, Brain Injuries/etiology, Cerebral Cortex, Erythropoietin/therapeutic use, Developmental Gene Expression Regulation/drug effects, Brain Hypoxia-Ischemia/pathology, Age Factors, Animals, Newborn Animals, Cell Death/drug effects, Cerebral Cortex/drug effects, Cerebral Cortex/growth & development, Cerebral Cortex/pathology, Animal Disease Models, Mammalian Embryo, Fetal Diseases/drug therapy, Fetal Diseases/physiopathology, Brain Hypoxia-Ischemia/complications, In Vitro Techniques, Motor Activity/drug effects, Motor Activity/physiology, Nuclear Receptor Subfamily 4 Group A Member 2/metabolism, Rats, Sprague-Dawley Rats, GABA-A Receptors/metabolism, Symporters/metabolism, K-Cl Cotransporters
ABSTRACT
The use of liver grafts from donation after circulatory death (DCD) donors remains controversial, particularly with donors of advanced age. This retrospective study investigated the impact of donor age in DCD liver transplantation. We examined 92 recipients who received DCD grafts and 92 matched recipients who received donation after brain death (DBD) grafts at Cleveland Clinic from January 2005 to June 2014. DCD grafts met stringent criteria to minimize risk factors in both donors and recipients. The 1-, 3-, and 5-year graft survival in DCD recipients was significantly inferior to that in DBD recipients (82%, 71%, 66% versus 92%, 87%, 85%, respectively; P = 0.03). Six DCD recipients (7%), but no DBD recipients, experienced ischemic-type biliary stricture (P = 0.01). However, the incidence of biliary stricture was not associated with donor age (P = 0.57). Interestingly, recipients receiving DCD grafts from donors who were <45 years of age (n = 55) showed graft survival rates similar to those receiving DCD grafts from donors who were ≥45 years of age (n = 37; 80%, 69%, 66% versus 83%, 72%, 66%, respectively; P = 0.67). Cox proportional hazards modeling in the overall study population (n = 184) revealed advanced donor age (P = 0.05) and the use of a DCD graft (P = 0.03) as unfavorable factors for graft survival. Logistic regression analysis showed that the risk of DBD graft failure increased with increasing donor age, but the risk of DCD graft failure did not (P = 0.13). In conclusion, these data suggest that stringent donor and recipient selection may ameliorate the negative impact of donor age in DCD liver transplantation. DCD grafts should not be discarded because of donor age, per se, and could help expand the donor pool for liver transplantation.
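Illustrative sketch (simulated data, assumed column names): a logistic regression of 1-year graft failure on donor age, graft type, and their interaction, one way to ask whether the age effect differs between DBD and DCD grafts as in the abstract's analysis.

```python
# Sketch: does the donor-age effect on graft failure differ by graft type? Toy data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 184
df = pd.DataFrame({
    "dcd": np.repeat([1, 0], n // 2),                 # 92 DCD, 92 matched DBD
    "donor_age": rng.integers(18, 75, size=n),
})
# Toy outcome: higher baseline risk for DCD; an age effect present only in DBD grafts.
logit_p = -2.0 + 0.9 * df["dcd"] + 0.03 * df["donor_age"] * (1 - df["dcd"])
df["failed"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("failed ~ donor_age * dcd", data=df).fit(disp=False)
# The donor_age:dcd coefficient captures how the age effect differs between graft types.
print(model.summary().tables[1])
```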