ABSTRACT
RATIONALE: Chronic lung allograft dysfunction (CLAD) hinders lung transplant success. A 2019 consensus refined CLAD diagnosis, introducing probable or definite CLAD based on persistence of lung function decline. Outcomes and risks for probable CLAD remain uncertain. OBJECTIVES: Determine the prognosis and clinical risks for probable CLAD in a prospective multicenter cohort. METHODS: Clinical Trials in Organ Transplantation-20 included 745 CLAD-eligible adult lung recipients at 5 centers and applied rigorous methods to prospectively adjudicate probable CLAD. The impact of probable CLAD on graft loss was determined using a Cox model that considered CLAD as a time-dependent covariate. Regularized Cox modeling with LASSO penalty was used to evaluate donor or recipient characteristics and the occurrence and timing of posttransplant events as probable CLAD risks. Similar analyses were performed for definite CLAD. MEASUREMENTS AND MAIN RESULTS: Probable CLAD occurred in 29.7% of patients at 3 years posttransplant and conferred a marked increase in risk for graft loss (unadjusted HR 4.38, p<0.001). Most patients (80%) with probable CLAD progressed to definite CLAD. Cytomegalovirus infection and specifically late presence (>90 days posttransplant) of donor-specific alloantibodies, acute rejection, acute lung injury, or organizing pneumonia contributed the greatest independent information about probable CLAD risk. Definite CLAD risks were similar. CONCLUSIONS: Probable CLAD identifies patients at high risk for graft loss, supporting prospective identification of this condition for early initiation of CLAD-directed interventions. More effective strategies to prevent posttransplant cytomegalovirus, inhibit allospecific immunity, and reduce tissue injury are needed to reduce probable CLAD and improve lung recipient survival.
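To illustrate the modeling approach described above, a minimal Python sketch using lifelines: probable CLAD enters a Cox model as a time-dependent covariate for graft loss, and an L1-penalized (LASSO-style) Cox fit screens candidate risk factors. File and column names are assumptions, not the CTOT-20 dataset.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter, CoxPHFitter

# Long-format data: one row per patient interval; probable_clad switches 0 -> 1 at the
# adjudicated CLAD date, and graft_loss marks the event on the final interval.
intervals = pd.read_csv("ctot20_intervals.csv")  # hypothetical file

ctv = CoxTimeVaryingFitter()
ctv.fit(intervals, id_col="pid", start_col="start", stop_col="stop", event_col="graft_loss")
ctv.print_summary()  # hazard ratio for the time-dependent probable_clad indicator

# LASSO-penalized Cox over candidate donor/recipient/posttransplant covariates
baseline = pd.read_csv("ctot20_baseline.csv")    # hypothetical one-row-per-patient table
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)   # pure L1 penalty; tune penalizer by cross-validation
cph.fit(baseline, duration_col="years_to_clad", event_col="probable_clad")
print(cph.summary[["coef", "exp(coef)"]])
```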
ABSTRACT
Rationale: It remains unclear how gastroesophageal reflux disease (GERD) affects allograft microbial community composition in lung transplant recipients, or how any such compositional changes influence lung allograft inflammation and function. Objectives: Our objective was to compare the allograft microbiota in lung transplant recipients with or without clinically diagnosed GERD in the first year after transplant and assess associations between GERD, allograft microbiota, inflammation, and acute and chronic lung allograft dysfunction (ALAD and CLAD). Methods: A total of 268 BAL samples were collected from 75 lung transplant recipients at a single transplant center every 3 months after transplant for 1 year. Ten transplant recipients from a separate transplant center provided samples before and after antireflux Nissen fundoplication surgery. Microbial community composition and density were measured using 16S ribosomal RNA gene sequencing and quantitative polymerase chain reaction, respectively, and inflammatory markers and bile acids were quantified. Measurements and Main Results: We observed a range of allograft community compositions with three discernible types (labeled community state types [CSTs] 1-3). Transplant recipients with GERD were more likely to have CST1, characterized by high bacterial density and high relative abundance of the oropharyngeal colonizing genera Prevotella and Veillonella. GERD was associated with more frequent transitions to CST1. CST1 was associated with lower inflammatory cytokine concentrations than pathogen-dominated CST3 across the range of microbial densities observed. Cox proportional hazards models revealed associations between CST3 and the development of ALAD/CLAD. Nissen fundoplication decreased bacterial load and proinflammatory cytokines. Conclusions: GERD was associated with the high bacterial density, Prevotella- and Veillonella-dominated CST1. CST3, but not CST1 or GERD, was associated with inflammation and early development of ALAD and CLAD. Nissen fundoplication was associated with a reduction in microbial density in BAL fluid samples, especially the CST1-specific genus Prevotella.
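One common way to derive community state types like CST1-3 is unsupervised clustering of genus-level relative abundances; the sketch below uses Bray-Curtis distances with hierarchical clustering and is an assumed workflow, not the study's pipeline (file and column names are illustrative).

```python
import pandas as pd
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# rows = BAL samples, columns = genus-level relative abundances summing to 1 per row
abund = pd.read_csv("bal_genus_relabund.csv", index_col=0)  # hypothetical file

dist = pdist(abund.values, metric="braycurtis")
tree = linkage(dist, method="average")
cst = fcluster(tree, t=3, criterion="maxclust")   # force three clusters (CST1-3)

summary = abund.groupby(cst).mean()
print(summary[["Prevotella", "Veillonella"]])     # inspect which cluster is oropharyngeal-like
```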
Subject(s)
Gastroesophageal Reflux, Lung Transplantation, Microbiota, Humans, Retrospective Studies, Gastroesophageal Reflux/complications, Lung, Inflammation, Allografts
ABSTRACT
BACKGROUND: Clonal hematopoiesis of indeterminate potential (CHIP), the age-related acquisition of somatic mutations that leads to an expanded blood cell clone, has been associated with development of a pro-inflammatory state. An enhanced or dysregulated inflammatory response may contribute to rejection after lung transplantation; however, the prevalence of CHIP in lung recipients and the influence of CHIP on allograft outcomes are unknown. METHODS: We analyzed whole-exome sequencing data in 279 lung recipients to detect CHIP, defined by pre-specified somatic mutations in 74 genes known to promote clonal expansion of hematopoietic stem cells. We compared the burden of acute rejection (AR) over the first post-transplant year in lung recipients with vs. without CHIP using multivariable ordinal regression. Multivariable Cox proportional hazards models were used to assess the association between CHIP and chronic lung allograft dysfunction (CLAD)-free survival. An exploratory analysis evaluated the association between the number of CHIP-associated variants and CLAD-free survival. RESULTS: We detected 64 CHIP-associated mutations in 45 individuals (15.7%), most commonly in TET2 (10.8%), DNMT3A (9.2%), and U2AF1 (9.2%). Patients with CHIP tended to be older but did not significantly differ from patients without CHIP in terms of race or native lung disease. Patients with CHIP did not have a higher incidence of AR over the first post-transplant year (p = 0.45) or a significantly increased risk of death or CLAD (adjusted HR 1.25, 95% CI 0.88-1.78). We did observe a significant association between the number of CHIP variants and CLAD-free survival: patients with 2 or more CHIP-associated variants had an increased risk for death or CLAD (adjusted HR 3.79, 95% CI 1.98-7.27). CONCLUSIONS: Lung recipients have a higher prevalence of CHIP and a larger variety of genes with CHIP-associated mutations compared with previous reports for the general population. CHIP did not increase the risk of AR, CLAD, or death in lung recipients.
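The acute rejection burden comparison described above can be framed as a proportional-odds (ordinal logistic) regression; a hedged sketch with assumed variable names:

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("chip_cohort.csv")  # hypothetical: ar_burden is an ordered 0/1/2/3 score

df["ar_burden"] = pd.Categorical(df["ar_burden"], categories=[0, 1, 2, 3], ordered=True)
model = OrderedModel(df["ar_burden"],
                     df[["chip", "age", "bilateral_tx"]],   # CHIP indicator plus adjusters
                     distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())   # exp(coef of chip) is the cumulative odds ratio for higher AR burden
```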
Subject(s)
Clonal Hematopoiesis, Lung Transplantation, Humans, Transplant Recipients, Prevalence, Lung, Lung Transplantation/adverse effects
ABSTRACT
We determined prognostic implications of acute lung injury (ALI) and organizing pneumonia (OP), including timing relative to transplantation, in a multicenter lung recipient cohort. We sought to understand clinical risks that contribute to development of ALI/OP. We analyzed prospective, histologic diagnoses of ALI and OP in 4786 lung biopsies from 803 adult lung recipients. Univariable Cox regression was used to evaluate the impact of early (≤90 days) or late (>90 days) posttransplant ALI or OP on risk for chronic lung allograft dysfunction (CLAD) or death/retransplantation. These analyses demonstrated late ALI/OP conferred a two- to threefold increase in the hazards of CLAD or death/retransplantation; there was no association between early ALI/OP and these outcomes. To determine risk factors for late ALI/OP, we used univariable Cox models considering donor/recipient characteristics and posttransplant events as candidate risks. Grade 3 primary graft dysfunction, higher degree of donor/recipient human leukocyte antigen mismatch, bacterial or viral respiratory infection, and an early ALI/OP event were significantly associated with increased late ALI/OP risk. These data from a contemporary, multicenter cohort underscore the prognostic implications of ALI/OP on lung recipient outcomes, clarify the importance of the timing of these events, and identify clinical risks to target for ALI/OP prevention.
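A sketch of the univariable Cox screening described above, fitting each candidate risk factor alone against time to a late ALI/OP event; variable names are assumptions:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ali_op_cohort.csv")   # hypothetical one-row-per-recipient table
candidates = ["pgd_grade3", "hla_mismatch", "bacterial_infection",
              "viral_infection", "early_ali_op"]

for var in candidates:
    cph = CoxPHFitter()
    cph.fit(df[["time_to_late_aliop", "late_aliop", var]],
            duration_col="time_to_late_aliop", event_col="late_aliop")
    row = cph.summary.loc[var]
    print(f"{var}: HR={row['exp(coef)']:.2f}, p={row['p']:.3g}")
```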
Subject(s)
Acute Lung Injury, Lung Transplantation, Pneumonia, Adult, Humans, Prospective Studies, Prognosis, Retrospective Studies, Lung Transplantation/adverse effects, Acute Lung Injury/etiology, Acute Lung Injury/pathology, Lung, Pneumonia/epidemiology, Pneumonia/etiology, Pneumonia/pathology, Risk Factors, Cohort Studies
ABSTRACT
Histopathologic lung allograft injuries are putative harbingers for chronic lung allograft dysfunction (CLAD). However, the mechanisms responsible are not well understood. CXCL9 and CXCL10 are potent chemoattractants of mononuclear cells and potential propagators of allograft injury. We hypothesized that these chemokines would be quantifiable in plasma, and would associate with subsequent CLAD development. In this prospective multicenter study, we evaluated 721 plasma samples for CXCL9/CXCL10 levels from 184 participants at the time of transbronchial biopsies during their first-year post-transplantation. We determined the association between plasma chemokines, histopathologic injury, and CLAD risk using Cox proportional hazards models. We also evaluated CXCL9/CXCL10 levels in bronchoalveolar lavage (BAL) fluid and compared plasma to BAL with respect to CLAD risk. Plasma CXCL9/CXCL10 levels were elevated during the injury patterns associated with CLAD, acute rejection, and acute lung injury, with a dose-response relationship between chemokine levels and CLAD risk. Importantly, there were strong interactions between injury and plasma CXCL9/CXCL10, where histopathologic injury associated with CLAD only in the presence of elevated plasma chemokines. We observed similar associations and interactions with BAL CXCL9/CXCL10 levels. Elevated plasma CXCL9/CXCL10 during allograft injury may contribute to CLAD pathogenesis and has potential as a minimally invasive immune monitoring biomarker.
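The key analytic idea above is an interaction between histopathologic injury and plasma chemokine level in a Cox model for CLAD; a minimal sketch with assumed column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("plasma_chemokines.csv")   # hypothetical per-biopsy landmark dataset
cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_clad", event_col="clad",
        formula="injury + log_cxcl9 + injury:log_cxcl9")   # interaction term
print(cph.summary[["exp(coef)", "p"]])
# A significant injury:log_cxcl9 term means the injury-CLAD association depends on chemokine level.
```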
Subject(s)
Graft vs Host Disease, Lung Transplantation, Allografts, Biomarkers, Chemokine CXCL10, Chemokine CXCL9, Graft Rejection/diagnosis, Graft Rejection/etiology, Humans, Lung, Lung Transplantation/adverse effects, Prospective Studies
ABSTRACT
The histopathologic diagnosis of acute allograft injury is prognostically important in lung transplantation, with evidence demonstrating a strong and consistent association between acute rejection (AR), acute lung injury (ALI), and the subsequent development of chronic lung allograft dysfunction (CLAD). The pathogenesis of these allograft injuries, however, remains poorly understood. CXCL9 and CXCL10 are CXC chemokines induced by interferon-γ and act as potent chemoattractants of mononuclear cells. We hypothesized that these chemokines are involved in the mononuclear cell recruitment associated with AR and ALI. We further hypothesized that the increased activity of these chemokines could be quantified as increased levels in the bronchoalveolar lavage fluid. In this prospective multicenter study, we evaluated the incidence of histopathologic allograft injury during the first year post-transplant and measured bronchoalveolar CXCL9 and CXCL10 levels at the time of biopsy. In multivariable models, CXCL9 levels were 1.7-fold and 2.1-fold higher during AR and ALI compared with "normal" biopsies without histopathology. Similarly, CXCL10 levels were 1.6-fold and 2.2-fold higher during these histopathologies, respectively. These findings support the association of CXCL9 and CXCL10 with episodes of AR and ALI and provide potential insight into the pathogenesis of these deleterious events.
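Fold-differences such as "1.7-fold higher during AR" can be obtained by regressing log-transformed chemokine levels on biopsy histology and exponentiating the coefficients; the sketch below uses a patient-clustered GEE and assumed variable names, not the study's code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("bal_chemokines.csv")   # hypothetical: one row per biopsy
df["log_cxcl9"] = np.log(df["cxcl9_pg_ml"])

gee = smf.gee("log_cxcl9 ~ C(histology, Treatment('normal'))",
              groups="patient_id", data=df,
              cov_struct=sm.cov_struct.Exchangeable())
res = gee.fit()
print(np.exp(res.params))   # fold-change in CXCL9 for AR or ALI relative to normal biopsies
```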
Subject(s)
Chemokine CXCL10, Graft Rejection, Allografts, Chemokine CXCL9, Graft Rejection/etiology, Lung, Prospective Studies
ABSTRACT
Epstein-Barr virus (EBV)-driven posttransplant lymphoproliferative disorder (PTLD) is a serious complication following lung transplant. The extent to which the presence of EBV in PTLD tissue is associated with survival is uncertain. Moreover, whether the heterogeneity in expression of EBV latency programs is related to the timing of PTLD onset remains unexplored. We retrospectively performed a comprehensive histological evaluation of EBV markers at the tissue level in 34 adult lung transplant recipients with early- and late-onset PTLD. Early-onset PTLD, occurring within the first 12 months posttransplant, had higher odds of expressing EBV markers. The presence of EBV in PTLD was not associated with a difference in survival relative to EBV-negative tumors. However, we found evidence of heterogeneous expression of EBV latency programs, including types III, IIb, IIa, and 0/I. Our study suggests that the heterogeneous expression of EBV latency programs may represent a mechanism for immune evasion in patients with PTLD after lung transplant. The recognition of multiple EBV latency programs can be used in personalized medicine for patients who are nonresponsive to traditional types of chemotherapy and can potentially be evaluated in other types of solid organ transplants.
Subject(s)
Epstein-Barr Virus Infections/virology, Human Herpesvirus 4/genetics, Lung/virology, Lymphoproliferative Disorders/virology, Organ Transplantation/adverse effects, Adult, Epstein-Barr Virus Infections/etiology, Epstein-Barr Virus Infections/mortality, Female, Gene Expression, Humans, Lung/metabolism, Lung/surgery, Lymphoproliferative Disorders/etiology, Lymphoproliferative Disorders/mortality, Male, Middle Aged, Organ Transplantation/mortality, Retrospective Studies, Transplant Recipients, Viral Proteins/genetics, Viral Proteins/metabolism, Virus Latency/genetics
ABSTRACT
Rationale: Acute rejection, manifesting as lymphocytic inflammation in a perivascular (acute perivascular rejection [AR]) or peribronchiolar (lymphocytic bronchiolitis [LB]) distribution, is common in lung transplant recipients and increases the risk for chronic graft dysfunction. Objectives: To evaluate clinical factors associated with biopsy-proven acute rejection during the first post-transplant year in a present-day, five-center lung transplant cohort. Methods: We analyzed prospective diagnoses of AR and LB from over 2,000 lung biopsies in 400 newly transplanted adult lung recipients. Because LB without simultaneous AR was rare, our analyses focused on risk factors for AR. Multivariable Cox proportional hazards models were used to assess donor and recipient factors associated with the time to the first AR occurrence. Measurements and Main Results: During the first post-transplant year, 53.3% of patients experienced at least one AR episode. Multivariable proportional hazards analyses accounting for enrolling center effects identified four or more HLA mismatches (hazard ratio [HR], 2.06; P ≤ 0.01) as associated with increased AR hazards, whereas bilateral transplantation (HR, 0.57; P ≤ 0.01) was associated with protection from AR. In addition, Wilcoxon rank-sum analyses demonstrated that bilateral (vs. single) lung recipients and those with fewer than four (vs. more than four) HLA mismatches had reduced AR frequency and/or severity during the first post-transplant year. Conclusions: We found a high incidence of AR in a contemporary multicenter lung transplant cohort undergoing consistent biopsy sampling. Although not previously recognized, the finding of reduced AR in bilateral lung recipients is intriguing, warranting replication and mechanistic exploration.
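A minimal sketch of a Cox model for time to first AR that handles enrolling-center effects by stratification, with assumed column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ar_first_year.csv")   # hypothetical per-recipient table
cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_first_ar", event_col="ar_event",
        formula="hla_mismatch_ge4 + bilateral_tx + age + cmv_mismatch",
        strata=["center"])              # separate baseline hazard per enrolling center
print(cph.summary[["exp(coef)", "p"]])
```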
Subject(s)
Bronchiolitis/epidemiology, Graft Rejection/epidemiology, Lung Transplantation, Postoperative Complications/epidemiology, Acute Disease, Aged, Cohort Studies, Female, Humans, Male, Middle Aged, Risk Factors, Time Factors
ABSTRACT
Long-term survival after lung transplant lags behind that of other commonly transplanted organs, reflecting the current incomplete understanding of the mechanisms involved in the development of posttransplant lung injury, rejection, infection, and chronic allograft dysfunction. To address this unmet need, 2 ongoing National Institute of Allergy and Infectious Diseases-funded studies conducted through the Clinical Trials in Organ Transplantation (CTOT) Consortium, CTOT-20 and CTOT-22, were dedicated to understanding the clinical factors and biological mechanisms that drive chronic lung allograft dysfunction and those that maintain cytomegalovirus polyfunctional protective immunity. The CTOT-20 and CTOT-22 studies enrolled 800 lung transplant recipients at 5 North American centers over 3 years. Given the number and complexity of subjects included, CTOT-20 and CTOT-22 utilized innovative data transfers and capitalized on patient-entered data collection to minimize manual data entry at the sites. The data were coupled with an extensive biosample collection strategy that included DNA, RNA, plasma, serum, bronchoalveolar lavage fluid, and bronchoalveolar lavage cell pellets. This Special Article describes the CTOT-20 and CTOT-22 protocols, the data and biosample strategy, initial results, and lessons learned through study execution.
Subject(s)
Lung Transplantation, Organ Transplantation, Bronchoalveolar Lavage Fluid, Cytomegalovirus, Graft Rejection/etiology, Humans, Lung Transplantation/adverse effects, Organ Transplantation/adverse effects, Transplant Recipients
ABSTRACT
BACKGROUND: Physical inactivity and depressive symptoms following cardiothoracic transplantation are recognized as potentially modifiable psychosocial factors to improve clinical outcomes. However, few studies have prospectively assessed these in ambulatory, outpatient transplant recipients. METHODS: We conducted a prospective, single-center study examining actigraphy-assessed physical activity (PA) levels over a 1-week period in heart or lung transplant recipients recruited at 6 months (range 4-9) post-transplant. Depressive symptoms (Center for Epidemiologic Studies Depression scale [CESD]), quality of life (QoL), and clinical events (transplant-related hospitalization and death) were collected. Clustered Cox proportional hazards models were used to examine the associations between PA, psychological measures, and clinical events. RESULTS: Among 105 potentially eligible participants, 66 (63%) met inclusion criteria and were enrolled between July 2016 and May 2017, including 42 lung and 24 heart transplant recipients. The mean age of the population was 53 years; 41% were women, and 18% were black. Participants tended to be sedentary, with the majority of activity spent within the "sedentary" level (61%) and an average daily step count of 7188 (SD = 2595). In addition, participants tended to exhibit subclinical depressive symptoms (mean CESD = 9.4 [SD = 8]), with only a subset (22%) exhibiting levels suggestive of clinical depression. Over a median follow-up of 1.4 years (1.14, 1.62), 21 participants (32%) experienced at least one transplant-related hospitalization, including two deaths. In adjusted survival models, greater intensity of PA (HR = 0.45 [0.24, 0.84] per 0.2 METs, P = .012) was associated with a lower risk of clinical events, whereas greater depressive symptoms (HR = 2.11 [1.58, 2.82] per 9 CESD points, P < .001) at 6 months were associated with a higher likelihood of subsequent transplant-related hospitalization and/or death. CONCLUSIONS: Physical inactivity and depressive symptoms at 6 months post-transplant were predictive of subsequent adverse clinical events among ambulatory cardiothoracic transplant recipients. Future studies should examine whether improving these potentially modifiable post-transplant risk factors improves clinical outcomes.
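Two analytic details above, clustering of events within patients and hazard ratios reported per clinically meaningful unit (0.2 METs, 9 CESD points), can be sketched as follows; names are assumptions:

```python
import pandas as pd
from lifelines import CoxPHFitter

events = pd.read_csv("transplant_events.csv")   # hypothetical: one row per follow-up interval
events["pa_per_0p2_met"] = events["mean_mets"] / 0.2   # 1-unit change = 0.2 METs
events["cesd_per_9"] = events["cesd"] / 9.0            # 1-unit change = 9 CESD points

cph = CoxPHFitter()
cph.fit(events, duration_col="time", event_col="hosp_or_death",
        formula="pa_per_0p2_met + cesd_per_9",
        cluster_col="patient_id")               # robust (sandwich) variance by patient
print(cph.summary[["exp(coef)", "p"]])          # HRs per 0.2 METs and per 9 CESD points
```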
Subject(s)
Depressive Disorder/mortality, Exercise, Heart Transplantation/mortality, Lung Transplantation/mortality, Postoperative Complications/mortality, Quality of Life, Depressive Disorder/epidemiology, Female, Follow-Up Studies, Heart Transplantation/adverse effects, Humans, Lung Transplantation/adverse effects, Male, Middle Aged, Pilot Projects, Prognosis, Prospective Studies, Risk Factors, Surveys and Questionnaires, Survival Rate, United States/epidemiology
ABSTRACT
RATIONALE: Idiopathic pulmonary fibrosis (IPF) is an increasingly recognized, often fatal lung disease of unknown etiology. OBJECTIVES: The aim of this study was to use whole-exome sequencing to improve understanding of the genetic architecture of pulmonary fibrosis. METHODS: We performed a case-control exome-wide collapsing analysis including 262 unrelated individuals with pulmonary fibrosis clinically classified as IPF according to American Thoracic Society/European Respiratory Society/Japanese Respiratory Society/Latin American Thoracic Association guidelines (81.3%), usual interstitial pneumonia secondary to autoimmune conditions (11.5%), or fibrosing nonspecific interstitial pneumonia (7.2%). The majority (87%) of case subjects reported no family history of pulmonary fibrosis. MEASUREMENTS AND MAIN RESULTS: We searched 18,668 protein-coding genes for an excess of rare deleterious genetic variation using whole-exome sequence data from 262 case subjects with pulmonary fibrosis and 4,141 control subjects drawn from among a set of individuals of European ancestry. Comparing genetic variation across 18,668 protein-coding genes, we found a study-wide significant (P < 4.5 × 10⁻⁷) case enrichment of qualifying variants in TERT, RTEL1, and PARN. A model qualifying ultrarare, deleterious, nonsynonymous variants implicated TERT and RTEL1, and a model specifically qualifying loss-of-function variants implicated RTEL1 and PARN. A subanalysis of 186 case subjects with sporadic IPF confirmed TERT, RTEL1, and PARN as study-wide significant contributors to sporadic IPF. Collectively, 11.3% of case subjects with sporadic IPF carried a qualifying variant in one of these three genes compared with the 0.3% carrier rate observed among control subjects (odds ratio, 47.7; 95% confidence interval, 21.5-111.6; P = 5.5 × 10⁻²²). CONCLUSIONS: We identified TERT, RTEL1, and PARN, three telomere-related genes previously implicated in familial pulmonary fibrosis, as significant contributors to sporadic IPF. These results support the idea that telomere dysfunction is involved in IPF pathogenesis.
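An illustrative gene-based collapsing test in the spirit of the analysis above: for each gene, compare the number of cases versus controls carrying at least one qualifying variant against a multiplicity-corrected threshold. This is a simplified sketch, not the study pipeline, and the input format is assumed:

```python
import pandas as pd
from scipy.stats import fisher_exact

n_cases, n_controls = 262, 4141
carriers = pd.read_csv("qualifying_carriers.csv")   # hypothetical: gene, case_carriers, control_carriers

alpha_study_wide = 0.05 / len(carriers)             # one simple choice: Bonferroni over genes tested
for _, g in carriers.iterrows():
    table = [[g.case_carriers, n_cases - g.case_carriers],
             [g.control_carriers, n_controls - g.control_carriers]]
    odds, p = fisher_exact(table, alternative="greater")   # enrichment of carriers among cases
    if p < alpha_study_wide:
        print(f"{g.gene}: OR={odds:.1f}, p={p:.1e} (study-wide significant)")
```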
Subject(s)
Exome/genetics, Genetic Predisposition to Disease/genetics, Idiopathic Pulmonary Fibrosis/genetics, Female, Genetic Variation/genetics, Humans, Male, Middle Aged
ABSTRACT
Background: Pulmonary embolism (PE) is a rare yet serious postoperative complication for lung transplant recipients (LTRs). The association between the timing and severity of PE and the development of chronic lung allograft dysfunction (CLAD) has not been described. Methods: A single-center, retrospective cohort analysis of first LTRs included bilateral or single lung transplants and excluded multiorgan transplants and retransplants. PEs were confirmed by computed tomography angiography or ventilation/perfusion (VQ) scans. Infarctions were confirmed on computed tomography angiography by a trained physician. PE severity was defined by the Pulmonary Embolism Severity Index (PESI) score, a 30-day post-PE mortality risk calculator, and stratified as low (classes I and II, 0-85), intermediate (classes III and IV, 85-125), and high (class V, >125). PE and PESI were analyzed in relation to the outcomes of overall survival, graft failure, and CLAD. Results: We identified 57 of 928 patients (6.14%) who had at least 1 PE in the LTR cohort, with a median follow-up of 1623 days. In the subset with PE, the median PESI score was 85 (75.8-96.5). Most of the PESI scores (32/56 available) were in the low-risk category. In the CLAD analysis, 49 LTRs had a PE, of whom 16 (33%) had infarction. When treating PE as time-dependent and adjusting for covariates, PE was significantly associated with death (hazard ratio [HR] 1.8; 95% confidence interval [CI], 1.3-2.5), as well as increased risk of graft failure, defined as retransplant, CLAD, or death (HR 1.8; 95% CI, 1.3-2.5), and CLAD (HR 1.7; 95% CI, 1.2-2.4). Infarction was not associated with CLAD or death. The PESI risk category was not a significant predictor of death or CLAD. Conclusions: PE is associated with decreased survival and an increased hazard of developing CLAD. The PESI score was not a reliable predictor of CLAD or death in this lung transplant cohort.
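A small sketch of the PESI risk stratification used above, applying the abstract's cutpoints; column names are assumptions:

```python
import pandas as pd

pe = pd.read_csv("pe_events.csv")    # hypothetical: one row per PE with a computed PESI score
pe["pesi_risk"] = pd.cut(pe["pesi_score"],
                         bins=[0, 85, 125, float("inf")],
                         labels=["low (I-II)", "intermediate (III-IV)", "high (V)"],
                         include_lowest=True)
print(pe["pesi_risk"].value_counts())
```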
ABSTRACT
BACKGROUND: Poor agreement among lung transplant pathologists has been reported in the assessment of rejection. In addition to acute rejection (AR) and lymphocytic bronchiolitis (LB), acute lung injury (ALI) and organizing pneumonia (OP) were recently identified as histopathologic risk factors for chronic lung allograft dysfunction (CLAD). Therefore, maximizing inter-rater reliability (IRR) for identifying these histopathologic risk factors is important to guide individual patient care and to support incorporating them in inclusion criteria for clinical trials in lung transplantation. METHODS: Nine pathologists across eight North American lung transplant centers were surveyed for practices in the assessment of lung transplant transbronchial biopsies. We conducted seven diagnostic alignment sessions with pathologists discussing histomorphologic features of CLAD high-risk histopathology. Then, each pathologist blindly scored 75 digitized slides. Fleiss' kappa, accounting for agreement across numerous observers, was used to determine IRR across all raters for the presence of any high-risk finding and for each individual entity. RESULTS: IRR (95% confidence intervals) and % agreement for any high-risk finding (AR, LB, ALI, and/or OP) and for each individual finding are as follows: any finding, κ = 0.578 (0.487, 0.668), 78.9%; AR, κ = 0.582 (0.481, 0.651), 79.1%; LB, κ = 0.683 (0.585, 0.764), 83.5%; ALI, κ = 0.418 (0.312, 0.494), 70.9%; OP, κ = 0.621 (0.560, 0.714), 81.0%. CONCLUSIONS: After pre-study diagnostic alignment sessions, a multi-center group of lung transplant pathologists seeking to identify CLAD high-risk histopathology achieved good IRR.
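A minimal sketch of the Fleiss' kappa computation for nine raters scoring the same 75 slides; the input matrix format is an assumption:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# hypothetical 75 x 9 matrix: rows = slides, columns = raters, values = 0/1 presence of a finding
scores = np.loadtxt("slide_scores.csv", delimiter=",", dtype=int)

table, _ = aggregate_raters(scores)        # per-slide counts of raters choosing each category
kappa = fleiss_kappa(table, method="fleiss")
modal_agree = (np.max(table, axis=1) / table.sum(axis=1)).mean()  # crude modal-agreement summary
print(f"Fleiss' kappa = {kappa:.3f}, mean modal agreement = {modal_agree:.1%}")
```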
ABSTRACT
BACKGROUND: Few tools exist for early identification of patients at risk for chronic lung allograft dysfunction (CLAD). We previously showed hyaluronan (HA), a matrix molecule that regulates lung inflammation and fibrosis, accumulates in bronchoalveolar lavage fluid (BALF) and blood in CLAD. We aimed to determine if early posttransplant HA elevations inform CLAD risk. METHODS: HA was quantified in 3080 BALF and 1323 blood samples collected over the first posttransplant year in 743 adult lung recipients at 5 centers. The relationship between BALF or blood HA and CLAD was assessed using Cox models with a time-dependent binary covariate for "elevated" HA. Potential thresholds for elevated HA were examined using a grid search between the 50th and 85th percentile. The optimal threshold was identified using fit statistics, and the association between the selected threshold and CLAD was internally validated through iterative resampling. A multivariable Cox model using the selected threshold was performed to evaluate the association of elevated HA with CLAD considering other factors that may influence CLAD risk. RESULTS: BALF HA levels >19.1ng/mL (65th percentile), had the largest hazard ratio for CLAD (HR 1.70, 95% CI 1.25-1.31; p<0.001), optimized fit statistics, and demonstrated robust reproducibility. In a multivariable model, the occurrence of BALF HA >19.1 ng/mL in the first posttransplant year conferred a 66% increase in the hazards for CLAD (adjusted HR 1.66, 95% CI 1.19-2.32; p=0.003). Blood HA was not significantly associated with CLAD. CONCLUSIONS: We identified and validated a precise threshold for BALF HA in the first posttransplant year that distinguishes patients at increased CLAD risk.
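The threshold search described above can be sketched as a scan over percentile-based cutoffs, each dichotomizing BALF HA and fitting a Cox model, with the cutoff chosen by fit statistics (here partial AIC). The real analysis used a time-dependent indicator and resampling validation; the data layout below is assumed:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("balf_ha_first_year.csv")   # hypothetical: peak first-year BALF HA per patient

results = []
for pct in range(50, 86, 5):
    cutoff = np.percentile(df["balf_ha_ng_ml"], pct)
    df["ha_elevated"] = (df["balf_ha_ng_ml"] > cutoff).astype(int)
    cph = CoxPHFitter().fit(df[["time_to_clad", "clad", "ha_elevated"]],
                            duration_col="time_to_clad", event_col="clad")
    results.append((pct, cutoff, cph.AIC_partial_,
                    cph.summary.loc["ha_elevated", "exp(coef)"]))

best = min(results, key=lambda r: r[2])      # smallest partial AIC
print(f"Best cutoff ~{best[1]:.1f} ng/mL ({best[0]}th percentile), HR={best[3]:.2f}")
```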
ABSTRACT
BACKGROUND: Sensitized lung transplant recipients are at increased risk of developing donor-specific antibodies, which have been associated with acute and chronic rejection. Perioperative intravenous immune globulin has been used in sensitized individuals to down-regulate antibody production. METHODS: We compared patients with a pre-transplant calculated panel reactive antibody ≥25% who did not receive preemptive immune globulin therapy to a historical control group that received preemptive immune globulin therapy. Our cohort included 59 patients: 17 patients did not receive immune globulin therapy and 42 patients received therapy. RESULTS: Donor-specific antibody development was numerically higher in the non-immune globulin group compared to the immune globulin group (58.8% vs 33.3%, respectively; odds ratio 2.80, 95% confidence interval [0.77, 10.79], p = 0.13). Median time to antibody development was 9 days (Q1, Q3: 7, 19) and 28 days (Q1, Q3: 7, 58) in the non-immune globulin and immune globulin groups, respectively. There was no significant difference between groups in the incidence of primary graft dysfunction at 72 h post-transplant or in acute cellular rejection, antibody-mediated rejection, and chronic lung allograft dysfunction at 12 months. CONCLUSION: These findings are hypothesis generating and emphasize the need for larger, randomized studies to determine the association of immune globulin therapy with clinical outcomes.
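The 2x2 comparison reported above can be reproduced approximately from the percentages given (58.8% of 17 ≈ 10 of 17; 33.3% of 42 = 14 of 42); the resulting odds ratio is close to, but need not exactly match, the reported 2.80:

```python
import statsmodels.api as sm

# rows: no preemptive immune globulin vs immune globulin; columns: developed DSA vs did not
table = sm.stats.Table2x2([[10, 7],
                           [14, 28]])
lo, hi = table.oddsratio_confint()
print(f"OR = {table.oddsratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```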
Subject(s)
Intravenous Immunoglobulins, Humans, Antibodies, Graft Rejection/prevention & control, Intravenous Immunoglobulins/therapeutic use, Lung, Transplant Recipients
ABSTRACT
BACKGROUND: Frailty, measured as a single construct, is associated variably with poor outcomes before and after lung transplantation. The usefulness of a comprehensive frailty assessment before transplantation is unknown. RESEARCH QUESTION: How are multiple frailty constructs, including phenotypic and cumulative deficit models, muscle mass, exercise tolerance, and social vulnerabilities, measured before transplantation, associated with short-term outcomes after lung transplantation? STUDY DESIGN AND METHODS: We conducted a retrospective cohort study of 515 lung recipients who underwent frailty assessments before transplantation, including the short physical performance battery (SPPB), transplant-specific frailty index (FI), 6-min walk distance (6MWD), thoracic sarcopenia, and social vulnerability indexes. We tested the association between frailty measures before transplantation and outcomes after transplantation using logistic regression to model 1-year survival and zero-inflated negative binomial regression to model hospital-free days (HFDs) in the first 90 days after transplantation. Adjustment covariates included age, sex, native lung disease, transplantation type, lung allocation score, BMI, and primary graft dysfunction. RESULTS: Before transplantation, 51.3% of patients were frail by FI (FI ≥ 0.25) and no patients were frail by SPPB. In multivariate adjusted models that also included FI, SPPB, and 6MWD, greater frailty by FI, but not SPPB, was associated with fewer HFDs (-0.006 per 0.01 unit worsening; 95% CI, -0.01 to -0.002 per 0.01 unit worsening) among discharged patients. Greater SPPB deficits were associated with decreased odds of 1-year survival (OR, 0.51 per 1 unit worsening; 95% CI, 0.28-0.93 per 1 unit worsening). Correlation among frailty measurements overall was poor. No association was found between thoracic sarcopenia, 6MWD, or social vulnerability assessments and short-term outcomes after lung transplantation. INTERPRETATION: Both phenotypic and cumulative deficit models measured before transplantation are associated with short-term outcomes after lung transplantation. Cumulative deficit measures of frailty may be more relevant in the first 90 days after transplantation, whereas phenotypic frailty may have a stronger association with 1-year survival.
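A hedged sketch of the hospital-free-days model named above, a zero-inflated negative binomial regression of first-90-day HFDs on a frailty measure plus adjusters; variable names are assumptions:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

df = pd.read_csv("frailty_cohort.csv")   # hypothetical per-recipient table
X = sm.add_constant(df[["frailty_index", "age", "lung_allocation_score"]])

zinb = ZeroInflatedNegativeBinomialP(df["hospital_free_days"], X,
                                     exog_infl=sm.add_constant(df[["frailty_index"]]),
                                     inflation="logit")
res = zinb.fit(method="bfgs", maxiter=500, disp=False)
print(res.summary())   # count-model coefficients give the change in expected HFDs per unit of FI
```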
Subject(s)
Frailty, Lung Transplantation, Sarcopenia, Humans, Frailty/complications, Retrospective Studies, Sarcopenia/epidemiology, Sarcopenia/complications, Lung
ABSTRACT
BACKGROUND: Chronic lung allograft dysfunction (CLAD) is the leading cause of death among lung transplant recipients. Eosinophils, effector cells of type 2 immunity, are implicated in the pathobiology of many lung diseases, and prior studies suggest their presence associates with acute rejection or CLAD after lung transplantation. RESEARCH QUESTION: Does histologic allograft injury or respiratory microbiology correlate with the presence of eosinophils in BAL fluid (BALF)? Does early posttransplant BALF eosinophilia associate with future CLAD development, including after adjustment for other known risk factors? STUDY DESIGN AND METHODS: We analyzed BALF cell count, microbiology, and biopsy data from a multicenter cohort of 531 lung recipients with 2,592 bronchoscopies over the first posttransplant year. Generalized estimating equation models were used to examine the correlation of allograft histology or BALF microbiology with the presence of BALF eosinophils. Multivariable Cox regression was used to determine the association between ≥ 1% BALF eosinophils in the first posttransplant year and definite CLAD. Expression of eosinophil-relevant genes was quantified in CLAD and transplant control tissues. RESULTS: The odds of BALF eosinophils being present were significantly higher at the time of acute rejection and nonrejection lung injury histologies and during pulmonary fungal detection. Early posttransplant ≥ 1% BALF eosinophils significantly and independently increased the risk for definite CLAD development (adjusted hazard ratio, 2.04; P = .009). Tissue expression of eotaxins, IL-13-related genes, and the epithelial-derived cytokines IL-33 and thymic stromal lymphopoietin was significantly increased in CLAD. INTERPRETATION: BALF eosinophilia was an independent predictor of future CLAD risk across a multicenter lung recipient cohort. Additionally, type 2 inflammatory signals were induced in established CLAD. These data underscore the need for mechanistic and clinical studies to clarify the role of type 2 pathway-specific interventions in CLAD prevention or treatment.
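The repeated-measures analysis above can be sketched as a GEE logistic model for eosinophil presence at each bronchoscopy, clustered by patient; the presence definition and variable names below are assumptions:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

bal = pd.read_csv("bal_cell_counts.csv")   # hypothetical: one row per bronchoscopy
bal["eos_present"] = (bal["eos_pct"] > 0).astype(int)   # presence definition is an assumption

gee = smf.gee("eos_present ~ acute_rejection + lung_injury + fungal_detected",
              groups="patient_id", data=bal,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable())
res = gee.fit()
print(res.summary())   # exponentiated coefficients approximate odds ratios
```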
Subject(s)
Eosinophilia, Lung Transplantation, Humans, Bronchoalveolar Lavage Fluid, Lung, Homologous Transplantation, Lung Transplantation/adverse effects, Allografts, Eosinophilia/etiology, Retrospective Studies, Graft Rejection
ABSTRACT
BACKGROUND: Chronic lung allograft dysfunction (CLAD) increases morbidity and mortality for lung transplant recipients. Club cell secretory protein (CCSP), produced by airway club cells, is reduced in the bronchoalveolar lavage fluid (BALF) of lung recipients with CLAD. We sought to understand the relationship between BALF CCSP and early posttransplant allograft injury and to determine if early posttransplant BALF CCSP reductions indicate later CLAD risk. METHODS: We quantified CCSP and total protein in 1606 BALF samples collected over the first posttransplant year from 392 adult lung recipients at 5 centers. Generalized estimating equation models were used to examine the correlation of allograft histology or infection events with protein-normalized BALF CCSP. We performed multivariable Cox regression to determine the association between a time-dependent binary indicator of normalized BALF CCSP level below the median in the first posttransplant year and development of probable CLAD. RESULTS: Normalized BALF CCSP concentrations were 19% to 48% lower among samples corresponding to histological allograft injury as compared with healthy samples. Patients who experienced any occurrence of a normalized BALF CCSP level below the median over the first posttransplant year had a significant increase in probable CLAD risk independent of other factors previously linked to CLAD (adjusted hazard ratio 1.95; p = 0.035). CONCLUSIONS: We discovered a threshold for reduced BALF CCSP that discriminates future CLAD risk, supporting the utility of BALF CCSP as a tool for early posttransplant risk stratification. Additionally, our finding that low CCSP associates with future CLAD underscores a role for club cell injury in CLAD pathobiology.
Subject(s)
Lung Transplantation, Adult, Humans, Lung Transplantation/adverse effects, Biomarkers/metabolism, Lung, Bronchoalveolar Lavage Fluid, Allografts, Retrospective Studies
ABSTRACT
BACKGROUND: Acute rejection is a risk factor for the development of chronic lung allograft dysfunction, the leading cause of morbidity and mortality in lung transplant recipients. Calcineurin inhibitors are the cornerstone of immunosuppression regimens after lung transplantation. METHODS: We retrospectively evaluated the association of tacrolimus level variability with total acute rejection score at 12 months post-transplant. Secondary outcomes included the development of chronic lung allograft dysfunction and antibody-mediated rejection at 24 months post-transplant. A total of 229 lung transplant recipients were included. RESULTS: The mean (standard deviation) total rejection score of the cohort was 1.6 (1.7). Patients with high tacrolimus variability at 0 to 3, 3 to 6, and 6 to 12 months on average scored 0.18 (mean 1.6 vs 1.5; 95% CI: -0.3 to 0.66, P = .46), 0.14 (mean 1.7 vs 1.5; 95% CI: -0.32 to 0.6, P = .55), and 0.12 (mean 1.6 vs 1.5; 95% CI: -0.34 to 0.58, P = .62) points higher in 12-month total acute rejection scores, respectively; however, these differences were not statistically significant. The incidences of chronic lung allograft dysfunction and antibody-mediated rejection were numerically greater in the high-variability group throughout certain periods; however, this was not consistent throughout all study timeframes and statistical significance was not evaluated. CONCLUSIONS: High tacrolimus variability was not associated with an increased 12-month total acute rejection score. Further studies are needed to assess long-term outcomes with tacrolimus level variability.
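The abstract does not specify how tacrolimus level variability was quantified; one common metric is the within-window coefficient of variation of trough levels with a median split into high versus low variability, sketched below with assumed column names:

```python
import pandas as pd

troughs = pd.read_csv("tacrolimus_troughs.csv")   # hypothetical: patient_id, day, level_ng_ml
troughs["window"] = pd.cut(troughs["day"], bins=[0, 90, 180, 365],
                           labels=["0-3 mo", "3-6 mo", "6-12 mo"])

cv = (troughs.groupby(["patient_id", "window"])["level_ng_ml"]
             .agg(lambda x: x.std() / x.mean())
             .rename("cv"))
high_var = cv > cv.groupby("window").transform("median")   # per-window median split
print(high_var.groupby("window").mean())                   # fraction classified as high variability
```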
Subject(s)
Lung Transplantation, Tacrolimus, Humans, Tacrolimus/adverse effects, Graft Rejection/epidemiology, Immunosuppressive Agents/adverse effects, Retrospective Studies, Lung Transplantation/adverse effects
ABSTRACT
BACKGROUND: Lung transplantation is increasingly performed in recipients aged ≥65 years. However, the risk factors for mortality specific to this population have not been well studied. In lung transplant recipients aged ≥65 years, we sought to determine post-transplant survival and clinical factors associated with post-transplant mortality. METHODS: We investigated 5,815 adult lung transplants recipients aged ≥65 years in the Scientific Registry of Transplant Recipients. Mortality was defined as a composite of recipient death or retransplantation. The Kaplan-Meier method was used to estimate the median time to mortality. Univariable and multivariable Cox proportional hazards regression models were used to examine the association between time to mortality and 23 donor, recipient, or center characteristics. RESULTS: Median survival in lung transplant recipients aged ≥65 years was 4.41 years (95% CI: 4.21-4.60 years) and significantly worsened by increasing age strata. In the multivariable model, increasing recipient age strata, creatinine level, bilirubin level, hospitalization at the time of transplantation, single lung transplant operation, steroid use at the time of transplantation, donor diabetes, and cytomegalovirus mismatch were independently associated with increased mortality. CONCLUSIONS: Among the 8 risk factors we identified, 5 factors are readily available, which can be used to optimize post-transplant survival by informing risk during candidate selection of patients aged ≥65 years. Furthermore, bilateral lung transplantation may confer improved survival in comparison with single lung transplantation. Our results support that after careful consideration of risk factors, lung transplantation can provide life-extending benefits in individuals aged ≥65 years.
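A minimal sketch of the Kaplan-Meier estimates described above (median survival with confidence limits, overall and by age stratum), using an assumed registry extract:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.utils import median_survival_times

srtr = pd.read_csv("srtr_age65_cohort.csv")   # hypothetical: years, died_or_retx, age_stratum

kmf = KaplanMeierFitter()
kmf.fit(srtr["years"], event_observed=srtr["died_or_retx"])
print(kmf.median_survival_time_, median_survival_times(kmf.confidence_interval_))

for stratum, grp in srtr.groupby("age_stratum"):
    kmf.fit(grp["years"], event_observed=grp["died_or_retx"], label=str(stratum))
    print(stratum, kmf.median_survival_time_)
```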