ABSTRACT
BACKGROUND: Bronchoalveolar lavage (BAL) is a key tool in respiratory medicine for sampling the distal airways. BAL bile acids are putative biomarkers of pulmonary microaspiration, which is associated with poor outcomes after lung transplantation. Compared to BAL, large airway bronchial wash (LABW) samples the tracheobronchial space, where bile acids may be measurable at more clinically relevant levels. We assessed whether LABW bile acids, compared to BAL bile acids, are more strongly associated with poor clinical outcomes in lung transplant recipients. METHODS: Concurrently obtained BAL and LABW samples at 3 months post-transplant from a retrospective cohort of 61 lung transplant recipients were analyzed for taurocholic acid (TCA), glycocholic acid (GCA), and cholic acid by mass spectrometry and for 10 inflammatory proteins by multiplex immunoassay. Associations of bile acids with inflammatory proteins and with acute lung allograft dysfunction were assessed using Spearman correlation and logistic regression, respectively. Time to chronic lung allograft dysfunction and time to death were evaluated using multivariable Cox proportional hazards and Kaplan-Meier methods. RESULTS: Most bile acids and inflammatory proteins were higher in LABW than in BAL. LABW bile acids correlated with inflammatory proteins within and between sample types. LABW TCA and GCA were associated with acute lung allograft dysfunction (OR = 1.368; 95% CI = 1.036-1.806; P = 0.027 and OR = 1.064; 95% CI = 1.009-1.122; P = 0.022, respectively). No bile acids were associated with chronic lung allograft dysfunction. Adjusted for risk factors, LABW TCA and GCA predicted death (HR = 1.513; 95% CI = 1.014-2.256; P = 0.042 and HR = 1.597; 95% CI = 1.078-2.366; P = 0.020, respectively). Patients with LABW TCA in the highest tertile had worse survival compared to all others.
CONCLUSIONS: LABW bile acids are more strongly associated than BAL bile acids with inflammation, acute lung allograft dysfunction, and death in lung transplant recipients. Collection of LABW may be useful in the evaluation of microaspiration in lung transplantation and other respiratory diseases.
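The odds ratios and hazard ratios quoted in these abstracts are exponentiated regression coefficients with Wald confidence limits. As a minimal sketch of that transformation (the coefficient and standard error below are invented for illustration, not taken from the study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds per unit of
    the predictor) and its standard error into an odds ratio with a
    Wald 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient and standard error, for illustration only
or_, lo, hi = odds_ratio_ci(beta=0.31, se=0.14)
print(f"OR = {or_:.3f}; 95% CI = {lo:.3f}-{hi:.3f}")
```

The same exponentiation applies to the Cox model: a hazard ratio is exp(coefficient), so a confidence interval that excludes 1 corresponds to a coefficient interval that excludes 0.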
Subject(s)
Lung Transplantation , Transplant Recipients , Bile Acids and Salts , Biomarkers , Bronchoalveolar Lavage , Bronchoalveolar Lavage Fluid , Cohort Studies , Humans , Lung , Retrospective Studies
ABSTRACT
Human leukocyte antigen (HLA)-G is a non-classical HLA that inhibits immune responses. Its expression is modified by single nucleotide polymorphisms (SNPs), which are associated with transplant outcomes. Our aim was to investigate the association of donor and recipient HLA-G SNPs with chronic lung allograft dysfunction (CLAD) and mortality after lung transplantation. In this single-centre study, we examined 11 HLA-G SNPs in 345 consecutive recipients and 297 donors of a first bilateral lung transplant. A multivariable Cox proportional hazards model assessed associations of SNPs with death and CLAD. Transbronchial biopsies (TBBx) and bronchoalveolar lavage (BAL) samples were examined using quantitative PCR, ELISA and immunofluorescence. Over a median of 4.75 years, 142 patients (41%) developed CLAD; 170 (49%) died. Multivariable analysis revealed that donor SNP +3142 (GG+CG versus CC) was associated with increased mortality (hazard ratio 1.78, 95% CI 1.12-2.84; p=0.015). In contrast, five donor SNPs, -201(CC), -716(TT), -56(CC), G*01:03(AA) and the 14 bp INDEL, conferred reduced mortality risk. Specific donor-recipient SNP pairings reduced CLAD risk. Predominantly epithelial HLA-G expression was observed on TBBx without rejection. Soluble HLA-G was present in higher concentrations in the BAL samples of patients who later developed CLAD. Specific donor SNPs were associated with mortality risk after lung transplantation, while certain donor-recipient SNP pairings modulated CLAD risk. TBBx demonstrated predominantly epithelial, and therefore presumably donor-derived, HLA-G expression, in keeping with these observations. This study is the first to demonstrate an effect of donor HLA-G SNPs on lung transplantation outcome.
Subject(s)
HLA-G Antigens/genetics , Lung Transplantation/mortality , Polymorphism, Single Nucleotide , Tissue Donors , Adult , Aged , Alleles , Biopsy , DNA/genetics , Female , Genotype , Graft Survival , Humans , Kaplan-Meier Estimate , Leukocytes/cytology , Lung/pathology , Male , Middle Aged , Multivariate Analysis , Proportional Hazards Models , Retrospective Studies , Risk
ABSTRACT
BACKGROUND: Bronchoalveolar lavage (BAL) has proven to be very useful to monitor the lung allograft after transplantation. In addition to allowing detection of infections, multiple BAL analytes have been proposed as potential biomarkers of lung allograft rejection or dysfunction. However, BAL collection is not well standardized and differences in BAL collection represent an important source of variation. We hypothesized that there are systematic differences between sequential BALs that are relevant to BAL analysis. METHODS: As part of 126 consecutive bronchoscopies in lung transplant recipients, two sequential BALs (BAL1 and BAL2) were performed in one location during each bronchoscopy by instilling and suctioning 50 ml of normal saline twice into separate containers. Cell concentration, viability and differentials, Surfactant Protein-D (SP-D), Club Cell Secretory Protein (CCSP), and levels of CXCL10, IL-10, CCL2, CCL5, VEGF-C, RAGE, CXCL9, CXCL1, IL-17A, IL-21, PDGF, and GCSF were compared between BAL1 and BAL2. RESULTS: Total cell concentration did not differ between BAL1 and BAL2; however, compared to BAL2, BAL1 had more dead cells, epithelial cells, neutrophils, and higher concentrations of airway epithelium-derived CCSP and inflammatory markers. BAL2 had a higher concentration of SP-D compared to BAL1. CONCLUSION: In this study performed in lung transplant recipients, we show that sequential BALs represent different lung compartments and have distinct compositions. BAL1 represents the airway compartment with more epithelial cells, neutrophils, and epithelium-derived CCSP. Conversely, BAL2 samples preferentially the distal bronchoalveolar space with greater cell viability and higher SP-D. Our findings illustrate how the method of BAL collection can influence analyte concentrations and further emphasize the need for a standardized approach in translational research involving BAL samples.
Subject(s)
Biomedical Research/trends , Bronchoalveolar Lavage Fluid/cytology , Bronchoalveolar Lavage/trends , Lung Transplantation/trends , Lung/pathology , Adult , Aged , Bronchoscopy/trends , Cohort Studies , Female , Humans , Lung/surgery , Male , Middle Aged , Retrospective Studies , Time Factors
ABSTRACT
Chronic lung allograft dysfunction (CLAD) is a major cause of mortality in lung transplant recipients. CLAD can be sub-divided into at least 2 subtypes with distinct mortality risk characteristics: restrictive allograft syndrome (RAS), which demonstrates increased overall computed tomography (CT) lung density, in contrast with bronchiolitis obliterans syndrome (BOS), which demonstrates reduced overall CT lung density. This study aimed to evaluate whether a reader-independent quantitative density metric (QDM) derived from CT lung density histograms is associated with survival after CLAD onset. A retrospective study evaluated CT scans corresponding to CLAD onset, defined by pulmonary function tests, in 74 patients (23 RAS, 51 BOS). Two different QDM values (QDM1 and QDM2) were calculated from CT lung density histograms: QDM1 includes the extreme edges of the histogram, whereas QDM2 includes the central region. Kaplan-Meier analysis and Cox regression analysis were used for CLAD prognosis. Higher QDM values were significantly associated with decreased survival. In a univariate model, the hazard ratio for death was 3.2 times higher at the 75th percentile of QDM1 than at the 25th percentile. QDM may be associated with prognosis in patients with CLAD.
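A Cox coefficient can be read out as a hazard ratio between any two covariate values, which is how a "3.2 times higher at the 75th percentile than at the 25th" statement arises. A sketch under invented numbers (the per-unit coefficient and QDM percentile values below are illustrative assumptions, not the study's data):

```python
import math

def hr_between(beta, x_hi, x_lo):
    """Hazard ratio implied by a Cox log-hazard coefficient `beta`
    when the covariate moves from the value x_lo to the value x_hi."""
    return math.exp(beta * (x_hi - x_lo))

# Illustrative only: a per-unit log-hazard of 0.08 and an
# interquartile QDM spread of 14.5 units yield roughly a
# 3.2-fold hazard difference.
print(f"HR (75th vs 25th percentile) = {hr_between(beta=0.08, x_hi=20.0, x_lo=5.5):.1f}")
```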
Subject(s)
Bronchiolitis Obliterans/mortality , Graft Rejection/mortality , Lung Diseases/mortality , Lung Transplantation/mortality , Postoperative Complications , Primary Graft Dysfunction/mortality , Tomography, X-Ray Computed/methods , Adult , Allografts , Bronchiolitis Obliterans/classification , Bronchiolitis Obliterans/diagnostic imaging , Bronchiolitis Obliterans/etiology , Chronic Disease , Female , Follow-Up Studies , Graft Rejection/diagnostic imaging , Graft Rejection/etiology , Graft Survival , Humans , Lung Diseases/surgery , Lung Transplantation/adverse effects , Male , Middle Aged , Primary Graft Dysfunction/classification , Primary Graft Dysfunction/diagnostic imaging , Primary Graft Dysfunction/etiology , Prognosis , Radiography, Thoracic , Respiratory Function Tests , Retrospective Studies , Risk Factors
ABSTRACT
RATIONALE: Immediate graft performance after lung transplantation is associated with short- and long-term clinical outcomes. However, the biologic mechanism that determines outcomes is not fully understood. OBJECTIVES: To investigate the impact of cell death signals at 24 and 48 hours after lung transplantation on short- and long-term clinical outcomes. METHODS: Plasma samples were collected pretransplantation and at 24 and 48 hours after transplant from 60 bilateral lung transplant recipients. Ten patients had primary graft dysfunction (PGD) grade 3 (PaO2/FiO2 ratio <200 or on extracorporeal membrane oxygenation support) at 72 hours after transplant (PGD group). The remaining 50 patients were defined as the control group. Levels of plasma M30 (signifying epithelial apoptosis), M65 (signifying epithelial apoptosis plus necrosis), and high-mobility group box 1 protein (HMGB-1; signifying necrosis of all cell types) were measured by ELISA and correlated with clinical outcomes. Survival analyses were performed using Kaplan-Meier curves and Cox proportional hazards regression. Prediction accuracy of the markers was assessed by the area under the receiver operating characteristic curve. MEASUREMENTS AND MAIN RESULTS: The PGD group had significantly higher M30 and M65 levels at 24 and 48 hours after transplant compared with the control group. There was no significant difference in HMGB-1. The area under the curve for 1-year survival was 0.86, 0.93, and 0.51 for M30, M65, and HMGB-1 at 48 hours, respectively. Survival analysis showed that higher M30 and M65 levels at 24 and 48 hours were significantly associated with worse survival. M65 at 48 hours remained significant even after adjustment for PGD. HMGB-1 was not significantly associated with survival. CONCLUSIONS: Recipient plasma concentrations of epithelial cell death markers (M30, M65) after lung transplantation are negatively associated with early graft performance and long-term survival.
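The area under the ROC curve reported for M30, M65, and HMGB-1 has a direct probabilistic reading: it equals the chance that a randomly chosen poor-outcome patient has a higher marker level than a randomly chosen control. A stdlib-only sketch with made-up marker concentrations (not the study's data):

```python
def roc_auc(cases, controls):
    """Empirical AUC as the Mann-Whitney statistic: the fraction of
    (case, control) pairs in which the case has the higher value,
    counting ties as one half."""
    wins = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Hypothetical M65 levels (arbitrary units), invented for illustration
pgd_group = [410, 520, 640, 700]
controls = [120, 200, 250, 300, 430]
print(f"AUC = {roc_auc(pgd_group, controls):.2f}")
```

An AUC of 0.5 corresponds to a marker with no discriminative value, which is why the HMGB-1 value of 0.51 above indicates essentially no predictive signal.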
Subject(s)
Cell Death , Lung Transplantation , Postoperative Complications/blood , Primary Graft Dysfunction/blood , Adult , Biomarkers/blood , Enzyme-Linked Immunosorbent Assay , Extracorporeal Membrane Oxygenation/statistics & numerical data , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Predictive Value of Tests , Proportional Hazards Models , Reproducibility of Results , Retrospective Studies , Survival Analysis
ABSTRACT
RATIONALE: Despite increasing evidence about the role of donor-specific human leukocyte antigen (HLA) antibodies in transplant outcomes, the incidence and impact of de novo donor-specific antibodies (dnDSA) after lung transplantation remains unclear. OBJECTIVES: To describe the incidence, characteristics, and impact of dnDSA after lung transplantation. METHODS: We investigated a single-center cohort of 340 lung transplant recipients undergoing transplant during 2008 to 2011. All patients underwent HLA-antibody testing quarterly pretransplant and at regular intervals over the first 24 months after transplant. The patients received modified immunosuppression depending on their pretransplant sensitization status. Risk factors for dnDSA development, as well as the associations of dnDSA with patient survival and chronic lung allograft dysfunction (CLAD), were determined using multivariable analysis. MEASUREMENTS AND MAIN RESULTS: The cumulative incidence of dnDSA was 47% at a median of 86 days (range, 44-185 d) after lung transplantation. Seventy-six percent of recipients with dnDSA had DQ-DSA. Male sex and the use of ex vivo lung perfusion were associated with an increased risk of dnDSA, whereas increased HLA-DQB1 matching was protective. DQ-dnDSA preceded or coincided with the diagnosis of CLAD in all cases. Developing dnDSA (vs. no dnDSA) was associated with a twofold increased risk of CLAD (hazard ratio, 2.04; 95% confidence interval, 1.13-3.69). This association appeared to be driven by the development of DQ-dnDSA. CONCLUSIONS: dnDSA are common after lung transplantation, with the majority being DQ DSA. DQ-dnDSA are associated with an increased risk of CLAD. Strategies to prevent or treat DQ-dnDSA may improve outcomes for lung transplant recipients.
Subject(s)
Allografts/immunology , Bronchiolitis Obliterans/immunology , Graft Rejection/immunology , HLA Antigens/immunology , Lung Transplantation/adverse effects , Lung/immunology , Tissue Donors , Allografts/statistics & numerical data , Bronchiolitis Obliterans/epidemiology , Bronchiolitis Obliterans/etiology , Female , Graft Rejection/complications , Graft Rejection/epidemiology , Humans , Immunosuppressive Agents/administration & dosage , Kaplan-Meier Estimate , Lung Transplantation/statistics & numerical data , Male , Middle Aged , Ontario/epidemiology , Proportional Hazards Models , Retrospective Studies , Sex Distribution
ABSTRACT
PURPOSE: In the clinical setting, there is no reliable tool for diagnosing gastric aspiration. A potential way of diagnosing gastric fluid aspiration entails bronchoalveolar lavage (BAL) with subsequent examination of the BAL fluid for gastric fluid components that are exogenous to the lungs. The objective of this study was to determine the longevity of the gastric fluid components bile and trypsin in the lung, in order to provide an estimate of the time frame in which assessment of these components in the BAL might effectively be used as a measure of aspiration. MATERIALS AND METHODS: Human gastric fluid (0.5 mg/kg) was infused into the right lung of intubated male Fischer 344 rats (n = 30). Animals were sacrificed at specified times following the experimentally induced aspiration, and bronchoalveolar lavage fluid (BALF) was collected. Bile concentrations were analyzed by an enzyme-linked chromatogenic method, and the concentration of trypsin was quantified using an ELISA. Data were analyzed using non-linear regression and a one-phase decay equation. RESULTS: In this experimental model, the half-life of bile was 9.3 hours (r² = 0.81), and the half-life of trypsin was 9.0 hours (r² = 0.68). CONCLUSIONS: The half-lives of bile and trypsin in the rodent aspiration model suggest that the ability to detect aspiration may be limited to a few days post-aspiration. If studies using rats are any indication, it may be most effective to collect BAL samples within the first 24 hours of suspected aspiration events in order to detect aspiration.
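The half-lives above come from fitting a one-phase decay C(t) = C0·e^(−kt), for which the half-life is t½ = ln 2 / k. A minimal sketch that recovers the rate constant by least-squares regression on log-transformed concentrations; the time points and concentrations below are simulated for illustration, not the rat data:

```python
import math

def half_life(times, concs):
    """Fit ln C = ln C0 - k*t by ordinary least squares and
    return the implied half-life ln(2)/k."""
    n = len(times)
    logs = [math.log(c) for c in concs]
    t_bar = sum(times) / n
    y_bar = sum(logs) / n
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(times, logs))
             / sum((t - t_bar) ** 2 for t in times))
    return math.log(2) / -slope

# Simulated noiseless decay with a 9.3-hour half-life
k = math.log(2) / 9.3
ts = [0, 4, 8, 12, 24]
cs = [100 * math.exp(-k * t) for t in ts]
print(f"estimated half-life = {half_life(ts, cs):.1f} h")
```

On real measurements a nonlinear fit of C(t) directly (as the study describes) is preferable, since log-transforming inflates the weight of noisy low concentrations; the log-linear version here is only the simplest self-contained illustration.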
Subject(s)
Bile/metabolism , Body Fluids/metabolism , Trypsin/metabolism , Animals , Bronchoalveolar Lavage/methods , Bronchoalveolar Lavage Fluid , Humans , Lung , Male , Paracentesis/methods , Rats , Rats, Inbred F344
ABSTRACT
OBJECTIVES: To study the impact of ex vivo lung perfusion (EVLP) on cytokines, chemokines, and growth factors and their correlation with graft performance either during perfusion or after transplantation. BACKGROUND: EVLP is a modern technique that preserves lungs at normothermia in a metabolically active state. The identification of biomarkers during clinical EVLP can contribute to the safe expansion of the donor pool. METHODS: High-risk brain death donors and donors after cardiac death underwent 4 to 6 hours of EVLP. Using a multiplex magnetic bead array assay, we evaluated analytes in perfusate samples collected at 1 hour and 4 hours of EVLP. Donor lungs were divided into 3 groups: (I) Control: bilateral transplantation with good early outcome [absence of primary graft dysfunction (PGD) grade 3]; (II) PGD3: bilateral transplantation with PGD grade 3 anytime within 72 hours; (III) Declined: lungs unsuitable for transplantation after EVLP. RESULTS: Of 50 cases included in this study, 27 were in the Control group, 7 in PGD3, and 16 in Declined. Of a total of 51 analytes, 34 were measurable in perfusates. The best marker to differentiate declined lungs from control lungs was stem cell growth factor-β [P < 0.001, AUC (area under the curve) = 0.86] at 1 hour. The best markers to differentiate PGD3 cases from controls were interleukin-8 (P < 0.001, AUC = 0.93) and growth-regulated oncogene-α (P = 0.001, AUC = 0.89) at 4 hours of EVLP. CONCLUSIONS: Perfusate protein expression during EVLP can differentiate lungs with good outcomes from lungs that develop PGD3 after transplantation. These perfusate biomarkers can potentially be used for more precise donor lung selection, improving the outcomes of transplantation.
Subject(s)
Cytokines/metabolism , Lung Transplantation , Lung/blood supply , Perfusion/methods , Tissue Donors , Biomarkers/metabolism , Brain Death , Chemokines/metabolism , Heart Diseases/mortality , Humans , In Vitro Techniques , Intercellular Signaling Peptides and Proteins/metabolism , Ontario , Predictive Value of Tests , Tissue and Organ Procurement/methods
ABSTRACT
BACKGROUND: More than 80% of donor lungs are potentially injured and therefore not considered suitable for transplantation. With the use of normothermic ex vivo lung perfusion (EVLP), the retrieved donor lung can be perfused in an ex vivo circuit, providing an opportunity to reassess its function before transplantation. In this study, we examined the feasibility of transplanting high-risk donor lungs that have undergone EVLP. METHODS: In this prospective, nonrandomized clinical trial, we subjected lungs considered to be high risk for transplantation to 4 hours of EVLP. High-risk donor lungs were defined by specific criteria, including pulmonary edema and a ratio of the partial pressure of arterial oxygen to the fraction of inspired oxygen (PO2:FIO2) less than 300 mm Hg. Lungs with acceptable function were subsequently transplanted. Lungs that were transplanted without EVLP during the same period were used as controls. The primary end point was primary graft dysfunction 72 hours after transplantation. Secondary end points were 30-day mortality, bronchial complications, duration of mechanical ventilation, and length of stay in the intensive care unit and hospital. RESULTS: During the study period, 136 lungs were transplanted. Lungs from 23 donors met the inclusion criteria for EVLP; in 20 of these lungs, physiological function remained stable during EVLP and the median PO2:FIO2 ratio increased from 335 mm Hg in the donor lung to 414 and 443 mm Hg at 1 hour and 4 hours of perfusion, respectively (P<0.001). These 20 lungs were transplanted; the other 116 lungs constituted the control group. The incidence of primary graft dysfunction 72 hours after transplantation was 15% in the EVLP group and 30% in the control group (P=0.11). No significant differences were observed for any secondary end points, and no severe adverse events were directly attributable to EVLP.
CONCLUSIONS: Transplantation of high-risk donor lungs that were physiologically stable during 4 hours of ex vivo perfusion led to results similar to those obtained with conventionally selected lungs. (Funded by Vitrolife; ClinicalTrials.gov number, NCT01190059.).
Subject(s)
Lung Transplantation , Lung/physiology , Perfusion/methods , Adolescent , Adult , Aged , Feasibility Studies , Graft Survival , Humans , Middle Aged , Organ Preservation/methods , Prospective Studies , Pulmonary Gas Exchange , Respiratory Mechanics , Tissue Donors , Tissue and Organ Harvesting , Vascular Resistance , Young Adult
ABSTRACT
BACKGROUND: Gastroesophageal reflux disease (GERD) is a risk factor for chronic lung allograft dysfunction. Bile acids, putative markers of gastric microaspiration, and inflammatory proteins in the bronchoalveolar lavage (BAL) have been associated with chronic lung allograft dysfunction, but their relationship with GERD remains unclear. Although GERD is thought to drive chronic microaspiration, the selection of patients for anti-reflux surgery lacks precision. This multicenter study aimed to test the association of BAL bile acids with GERD, lung inflammation, allograft function, and anti-reflux surgery. METHODS: We analyzed BAL obtained during the first post-transplant year from a retrospective cohort of patients with and without GERD, as well as BAL obtained before and after Nissen fundoplication anti-reflux surgery from a separate cohort. Levels of taurocholic acid (TCA), glycocholic acid, and cholic acid were measured using mass spectrometry. Protein markers of inflammation and injury were measured using multiplex assay and enzyme-linked immunosorbent assay. RESULTS: At 3 months after transplantation, TCA, IL-1β, IL-12p70, and CCL5 were higher in the BAL of patients with GERD than in that of no-GERD controls. Elevated TCA and glycocholic acid were associated with concurrent acute lung allograft dysfunction and inflammatory proteins. The BAL obtained after anti-reflux surgery contained reduced TCA and inflammatory proteins compared with that obtained before anti-reflux surgery. CONCLUSIONS: Targeted monitoring of TCA and selected inflammatory proteins may be useful in lung transplant recipients with suspected reflux and microaspiration to support diagnosis and guide therapy. Patients with elevated biomarker levels may benefit most from anti-reflux surgery to reduce microaspiration and allograft inflammation.
Subject(s)
Bile Acids and Salts/metabolism , Bronchiolitis Obliterans/surgery , Bronchoalveolar Lavage Fluid/chemistry , Gastroesophageal Reflux/complications , Graft Rejection/metabolism , Lung Transplantation , Transplant Recipients , Adult , Aged , Biomarkers/metabolism , Bronchiolitis Obliterans/complications , Female , Follow-Up Studies , Gastroesophageal Reflux/metabolism , Graft Rejection/etiology , Humans , Male , Middle Aged , Retrospective Studies , Young Adult
ABSTRACT
OBJECTIVES: Extracorporeal life support (ECLS) is increasingly used to bridge deteriorating patients awaiting lung transplantation (LTx); however, few systematic descriptions of this practice exist. We therefore aimed to review our institutional experience over the past 10 years. METHODS: In this case series, we included all adults who received ECLS with the intent to bridge to LTx. Data were retrieved from patient charts and our institutional ECLS and transplant databases. RESULTS: Between January 2006 and September 2016, 1111 LTx were performed in our institution. ECLS was used in 71 adults with the intention to bridge to LTx; of these, 11 (16%) were bridged to retransplantation. The median duration of ECLS before LTx was 10 days (range, 0-95). We used a single dual-lumen venous cannula in 23 patients (32%). Nine of 13 patients (69%) with pulmonary hypertension were bridged by a central pulmonary artery to left atrium Novalung. Twenty-five patients (35%) were extubated while on ECLS and 26 patients (37%) were mobilized. Sixty-three patients (89%) survived to LTx. Survival by intention to treat was 66% (1 year), 58% (3 years), and 48% (5 years). Survival was significantly shorter in patients undergoing ECLS bridge to retransplantation compared with first LTx (median survival, 15 months [95% CI, 0-31] versus 60 months [95% CI, 37-83]; P = .041). CONCLUSIONS: In our center's experience, ECLS bridge to first lung transplant leads to good short-term and long-term outcomes in carefully selected patients. In contrast, our data suggest that ECLS as a bridge to retransplantation should be used with caution.
Subject(s)
Extracorporeal Membrane Oxygenation , Hospitals, High-Volume , Lung Diseases/surgery , Lung Transplantation , Adolescent , Adult , Aged , Clinical Decision-Making , Databases, Factual , Extracorporeal Membrane Oxygenation/adverse effects , Extracorporeal Membrane Oxygenation/mortality , Female , Humans , Lung Diseases/diagnosis , Lung Diseases/mortality , Lung Diseases/physiopathology , Male , Middle Aged , Ontario , Patient Selection , Recovery of Function , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , Waiting Lists , Young Adult
ABSTRACT
BACKGROUND: Ex vivo lung perfusion (EVLP) provides opportunities to treat injured donor lungs before transplantation. We investigated whether lung lavage, to eliminate inflammatory inhibitory components, followed by exogenous surfactant replacement, could aid lung recovery and improve post-transplant lung function after gastric aspiration injury. METHODS: Gastric acid aspiration was induced in donor pigs, which were ventilated for 6 hours to develop lung injury. After retrieval and 10 hours of cold preservation, EVLP was performed for 6 hours. The lungs were randomly divided into 4 groups (n = 5 each): (1) no treatment (control), (2) lung lavage, (3) surfactant administration, and (4) lung lavage followed by surfactant administration. After another 2-hour period of cold preservation, the left lung was transplanted and reperfused for 4 hours. RESULTS: Physiologic lung function significantly improved after surfactant administration during EVLP. The EVLP perfusate from the lavage + surfactant group showed significantly lower levels of interleukin (IL)-1β, IL-6, IL-8, and secretory phospholipase A2. Total phosphatidylcholine was increased, and minimum surface tension recovered to normal levels (≤5 mN/m) in the bronchoalveolar fluid after surfactant administration. Lysophosphatidylcholine in bronchoalveolar fluid was significantly lower in the lavage + surfactant group than in the surfactant group. Post-transplant lung function was significantly better in the lavage + surfactant group compared with all other groups. CONCLUSIONS: Lung lavage followed by surfactant replacement during EVLP reduced inflammatory mediators and prevented hydrolysis of phosphatidylcholine, which contributed to the superior post-transplant function in donor lungs with aspiration injury.
Subject(s)
Bronchoalveolar Lavage/methods , Lung Injury/surgery , Lung Transplantation/methods , Organ Preservation/methods , Pulmonary Surfactants/pharmacology , Reperfusion Injury/prevention & control , Analysis of Variance , Animals , Disease Models, Animal , Extracorporeal Circulation/methods , Gastric Acid , Lung Injury/physiopathology , Lung Transplantation/adverse effects , Male , Preoperative Care/methods , Random Allocation , Respiratory Function Tests , Statistics, Nonparametric , Sus scrofa , Swine , Tissue Donors
ABSTRACT
BACKGROUND: The long-term success of lung transplantation is challenged by the development of chronic lung allograft dysfunction (CLAD) and its distinct subtypes of bronchiolitis obliterans syndrome (BOS) and restrictive allograft syndrome (RAS). However, the current diagnostic criteria for CLAD subtypes rely on total lung capacity (TLC), which is not always measured during routine post-transplant assessment. Our aim was to investigate the utility of low-dose 3-dimensional computed tomography (CT) lung volumetry for differentiating RAS from BOS. METHODS: This study was a retrospective evaluation of 63 patients who had developed CLAD after bilateral lung or heart-lung transplantation between 2006 and 2011, including 44 BOS and 19 RAS cases. Median post-transplant follow-up was 65 months in BOS and 27 months in RAS. The median interval between baseline and the disease-onset time-point for CT volumetry was 11 months in both BOS and RAS. Chronologic changes and diagnostic accuracy of CT lung volume (measured as percent of baseline) were investigated. RESULTS: RAS showed a significant decrease in CT lung volume at disease onset compared with baseline (mean 3,916 ml vs 3,055 ml when excluding opacities, p < 0.0001), whereas BOS showed no significant post-transplant change (mean 4,318 ml vs 4,396 ml, p = 0.214). The area under the receiver operating characteristic curve of CT lung volume for differentiating RAS from BOS was 0.959 (95% confidence interval 0.912 to 1.01, p < 0.0001) and the calculated accuracy was 0.938 at a threshold of 85%. CONCLUSION: In bilateral lung or heart-lung transplant patients with CLAD, low-dose CT volumetry is a useful tool to differentiate patients who develop RAS from those who develop BOS.
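The 85%-of-baseline cutoff above is a simple one-dimensional decision rule, and the reported accuracy is the fraction of patients whose rule-based label matches the clinical subtype. A sketch under a small invented cohort (the volumes and labels below are hypothetical, not the study's data):

```python
def classify_ras(volume_pct_baseline, threshold=85.0):
    """Label a CLAD patient as RAS when CT lung volume has fallen
    below `threshold` percent of the post-transplant baseline."""
    return volume_pct_baseline < threshold

def accuracy(volumes_pct, labels, threshold=85.0):
    """Fraction of patients whose rule-based label matches the
    true subtype (True = RAS, False = BOS)."""
    correct = sum(classify_ras(v, threshold) == lab
                  for v, lab in zip(volumes_pct, labels))
    return correct / len(labels)

# Hypothetical cohort: CT volume at CLAD onset as % of baseline
vols = [72, 80, 88, 98, 101, 95, 90, 104]
truth = [True, True, True, False, False, False, False, False]
print(f"accuracy = {accuracy(vols, truth):.3f}")
```

Sweeping the threshold and plotting sensitivity against 1 − specificity at each value is exactly how the ROC curve (AUC 0.959 above) is traced out.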
Subject(s)
Bronchiolitis Obliterans/surgery , Lung Transplantation/adverse effects , Lung/diagnostic imaging , Primary Graft Dysfunction/diagnostic imaging , Total Lung Capacity/physiology , Adult , Allografts , Female , Follow-Up Studies , Graft Rejection , Humans , Lung/physiopathology , Lung/surgery , Male , Middle Aged , Organ Size , Primary Graft Dysfunction/physiopathology , ROC Curve , Radiation Dosage , Retrospective Studies , Risk Factors , Time Factors , Tomography, X-Ray Computed
ABSTRACT
BACKGROUND: Invasive pulmonary aspergillosis (IPA) is a significant complication after lung transplantation. However, the risk factors for IPA in patients colonized with Aspergillus species, and the effectiveness of culture-directed preemptive treatment, are not well known. METHODS: We studied 328 lung transplant recipients, from January 2006 to July 2009, with 1-year follow-up. Risk factors and the effectiveness of culture-directed preemptive treatment were evaluated via a Cox proportional hazards model. RESULTS: Seventy-one recipients (21.6%) developed invasive fungal infections, including 29 patients (8.8%) with IPA. Only 48.3% (14/29) of patients with IPA had pretransplantation or posttransplantation airway colonization with Aspergillus spp. In the Cox proportional hazards model, treatment with rabbit antithymocyte globulin was significantly associated with posttransplant IPA in patients with Aspergillus colonization (hazard ratio, 4.25; 95% confidence interval, 1.09-16.6). Preemptive antifungal treatment for 3 months was significantly associated with a lower rate of IPA (0% [0/36] vs 18% [14/77]; P = 0.003; odds ratio, 0.8; 95% confidence interval, 0.7-0.9) but did not impact mortality. CONCLUSIONS: Our data suggest that almost half the cases of IPA occurred in patients without pretransplantation or posttransplantation airway colonization with Aspergillus spp. Among patients with Aspergillus colonization, use of rabbit antithymocyte globulin was associated with a 4-fold increased risk of subsequent development of IPA. Invasive pulmonary aspergillosis was an independent risk factor for 1-year mortality. Use of preemptive antifungal treatment for 3 months may be associated with a significant reduction of IPA without influencing mortality.
Subject(s)
Antifungal Agents/administration & dosage , Aspergillus/drug effects , Invasive Pulmonary Aspergillosis/prevention & control , Lung Transplantation/adverse effects , Respiratory System/microbiology , Adult , Antilymphocyte Serum/adverse effects , Aspergillus/classification , Aspergillus/immunology , Aspergillus/isolation & purification , Chi-Square Distribution , Drug Administration Schedule , Female , Humans , Immunocompromised Host , Immunosuppressive Agents/adverse effects , Invasive Pulmonary Aspergillosis/diagnosis , Invasive Pulmonary Aspergillosis/immunology , Invasive Pulmonary Aspergillosis/microbiology , Invasive Pulmonary Aspergillosis/mortality , Logistic Models , Lung Transplantation/mortality , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Proportional Hazards Models , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome
ABSTRACT
The standard of care for chronic gastro-esophageal reflux disease (GERD), which affects up to 40% of the population, is the use of drugs such as proton pump inhibitors (PPIs) that block the production of stomach acid. Despite widespread use, the effects of PPIs on gastric fluid remain poorly characterized. In this study, gastric fluid was collected from patients undergoing cardiac surgery who were not (n = 40) or were (n = 25) actively taking PPIs. Various enzymatic and immunoassays as well as mass spectrometry were utilized to analyze the concentrations of bile, gastricsin, trypsin, and pepsin in the gastric fluid. Proteomic analyses by mass spectrometry suggested that degradation of trypsin at low pH might account, at least in part, for the observation that patients taking PPIs have a greater likelihood of having high concentrations of trypsin in their gastric fluid. In general, the concentrations of all analytes evaluated varied over several orders of magnitude, covering a minimum of a 2000-fold range (gastricsin) and a maximum of a 1 × 10⁶-fold range (trypsin). Furthermore, the concentrations of various analytes were poorly correlated with one another in the samples. For example, trypsin and bile concentrations showed a significant (P < 0.0001) but not strong correlation (r = 0.54). Finally, direct assessment of bacterial concentrations by flow cytometry revealed that PPIs did not cause a profound increase in microbial load in the gastric fluid. These results further delineate the profound effects that PPI usage has on the physiology of the stomach.
ABSTRACT
OBJECTIVE: The study objective was to compare the outcomes of intraoperative extracorporeal membrane oxygenation versus cardiopulmonary bypass support in lung transplantation. METHODS: We performed a retrospective cohort study from a prospective database of adult lung transplantations performed at the University of Toronto from 2007 to 2013. Among 673 lung transplantations performed in the study period, 267 (39.7%) required cardiopulmonary support. There were 39 cases of extracorporeal membrane oxygenation (2012-2013) and 228 cases of cardiopulmonary bypass (2007-2013). Patients who were bridged with extracorporeal life support, underwent a concomitant cardiac procedure, received a combined liver or heart transplant, were colonized with Burkholderia cenocepacia, or required emergency cannulation for cardiopulmonary support were excluded. Finally, 33 extracorporeal membrane oxygenation cases were matched with 66 cases of cardiopulmonary bypass according to age (±10 years), lung transplantation indication, and procedure type (bilateral vs single lung transplantation). RESULTS: Recipient factors such as body mass index and gender were not different between extracorporeal membrane oxygenation and cardiopulmonary bypass groups. Furthermore, donor variables were similar, including age, body mass index, last PaO2/FiO2 ratio, smoking history, positive airway cultures, and donor type (brain death and donation after cardiac death). Early outcomes, such as mechanical ventilation requirement, length of intensive care unit stay, and length of hospital stay, significantly favored extracorporeal membrane oxygenation (median 3 vs 7.5 days, P = .005; 5 vs 9.5 days, P = .026; 19 vs 27 days, P = .029, respectively). Perioperative blood product transfusion requirement was lower in the extracorporeal membrane oxygenation group. The 90-day mortality for the extracorporeal membrane oxygenation group was 6% versus 15% for cardiopulmonary bypass (P = .32). 
CONCLUSIONS: Extracorporeal membrane oxygenation may be considered the first choice of intraoperative cardiorespiratory support for lung transplantation.
Subject(s)
Cardiopulmonary Bypass , Extracorporeal Membrane Oxygenation , Idiopathic Pulmonary Fibrosis/surgery , Lung Transplantation/methods , Pulmonary Disease, Chronic Obstructive/surgery , Adult , Aged , Cardiopulmonary Bypass/adverse effects , Cardiopulmonary Bypass/mortality , Databases, Factual , Extracorporeal Membrane Oxygenation/adverse effects , Extracorporeal Membrane Oxygenation/mortality , Female , Humans , Idiopathic Pulmonary Fibrosis/diagnosis , Idiopathic Pulmonary Fibrosis/mortality , Intensive Care Units , Kaplan-Meier Estimate , Length of Stay , Lung Transplantation/adverse effects , Lung Transplantation/mortality , Male , Middle Aged , Ontario , Postoperative Complications/mortality , Postoperative Complications/therapy , Pulmonary Disease, Chronic Obstructive/diagnosis , Pulmonary Disease, Chronic Obstructive/mortality , Respiration, Artificial , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: Ex vivo lung perfusion (EVLP) is an effective method to assess and improve the function of otherwise unacceptable lungs, alleviating the shortage of donor lungs. The early results with EVLP have been encouraging, but longer-term results, including functional and patient-reported outcomes, are not well characterized. METHODS: This retrospective single-center study included all lung transplants performed between September 2008 and December 2012. We investigated whether survival or rate of chronic lung allograft dysfunction (CLAD) differed in recipients of EVLP-treated lungs compared with contemporaneous recipients of conventional donor lungs. We also studied functional (highest forced expiratory volume in 1 second predicted, change in 6-minute walk distance, number of acute rejection episodes) and quality of life outcomes. RESULTS: Of 403 lung transplants that were performed, 63 patients (15.6%) received EVLP-treated allografts. Allograft survival for EVLP and conventional donor lung recipients was 79% vs 85%, 71% vs 73%, and 58% vs 57% at 1, 3, and 5 years after transplant, respectively (log-rank p = not significant). Freedom from CLAD was also similar (log-rank p = 0.53). There were no significant differences in functional outcomes such as highest forced expiratory volume in 1 second predicted (76.5% ± 23.8% vs 75.8% ± 22.8%, p = 0.85), change in 6-minute walk distance (194 ± 108 meters vs 183 ± 126 meters, p = 0.57), or the number of acute rejection episodes (1.5 ± 1.4 vs 1.3 ± 1.3, p = 0.36). The EVLP and conventional donor groups both reported a significantly improved quality of life after transplantation, but there was no intergroup difference. CONCLUSION: EVLP is a safe and effective method of assessing and using high-risk donor lungs before transplantation and leads to acceptable long-term survival, graft function, and improvements of quality of life that are comparable with conventionally selected donor lungs.
Subject(s)
Lung Transplantation/methods , Quality of Life , Adolescent , Adult , Aged , Female , Humans , Male , Middle Aged , Perfusion/methods , Preoperative Care , Recovery of Function , Retrospective Studies , Treatment Outcome , Young Adult
ABSTRACT
A thin-film solid-phase microextraction (SPME) method coupled to liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis was developed for high-throughput determination of bronchoalveolar lavage bile acids. The proposed method was validated according to the bioanalytical method validation guidelines. LOQ and LLOQ were 0.007 and 0.02 µmol/L, respectively. Accuracy and precision were <7% and <4%, respectively. The performance of the proposed method was also compared with an optimized enzymatic cycling assay. Results showed a weak correlation between the total bile acid (BA) concentration obtained with the enzymatic assay and the cumulative concentration of specific BAs determined with SPME-LC-MS/MS. This discrepancy was probably due to the presence of other BAs in bronchoalveolar lavage fluid (BALF) samples. Metabolite profiling of BALF sample extracts using a high-resolution mass spectrometer (HRMS) revealed the presence of additional BAs, which were not included in the proposed method. After considering these additional BAs, a strong correlation was found between the LC-MS method and the enzymatic assay. Unsupervised statistical analysis conducted on HRMS data also showed clear separation within BALF samples, depending on the presence of BAs and other lipids. SPME-LC-MS-based metabolite profiling may provide additional information for diagnosing the occurrence and severity of gastric reflux/aspiration in lung transplant patients.
Subject(s)
Bile Acids and Salts/analysis , Chromatography, High Pressure Liquid/methods , Enzyme Assays/methods , Solid Phase Microextraction/methods , Tandem Mass Spectrometry/methods , Bile Acids and Salts/metabolism , Bronchoalveolar Lavage , Humans , Reproducibility of Results
ABSTRACT
BACKGROUND: Invasive aspergillosis (IA) is an important cause of morbidity and mortality among patients undergoing lung transplant. Cystic fibrosis-lung transplant recipients (CF-LTRs) may be at greater risk of IA following lung transplantation because of the presence of Aspergillus in their airways before transplantation. This study evaluated the impact of pretransplant Aspergillus colonization on the risk for IA among CF-LTRs. METHODS: A single-center retrospective cohort study of CF-LTRs was conducted between 2006 and 2010. Respiratory tract cultures before transplantation were reviewed to identify patients with pretransplant Aspergillus colonization. Patients with positive Aspergillus sputum culture or positive bronchoalveolar lavage (BAL) galactomannan after transplantation were classified as having colonization or disease according to the International Society of Heart and Lung Transplantation criteria. RESULTS: A total of 93 CF patients underwent lung transplantation. Seventy percent (65/93) of CF-LTRs had pretransplant Aspergillus colonization. Thirty-six patients had positive intraoperative Aspergillus culture from the native lung BAL. Overall, 22.5% (20/93) of CF-LTRs developed IA. Median time to IA was 42 days following transplantation. Positive intraoperative Aspergillus culture (OR 4.36, 95% CI 1.35-14.05, P=0.01) and treatment for acute cellular rejection within 90 days after transplantation (OR 3.53, 95% CI 1.03-12.15, P=0.05) were independent risk factors for IA. Antifungal prophylaxis was administered to 61% (57/93) of CF-LTRs. One-year mortality rate was 16% (15/93). IA was not associated with increased risk of death (OR 2.10, 95% CI 0.62-7.06, P=0.23). CONCLUSION: Pretransplant Aspergillus colonization is frequent among CF-LTRs, and a positive intraoperative Aspergillus culture was associated with a fourfold higher risk of developing IA.
Subject(s)
Aspergillus/pathogenicity , Cystic Fibrosis/complications , Cystic Fibrosis/microbiology , Invasive Pulmonary Aspergillosis/complications , Lung Transplantation/methods , Adult , Antifungal Agents/therapeutic use , Bronchoalveolar Lavage Fluid/microbiology , Female , Humans , Immunosuppressive Agents/adverse effects , Incidence , Invasive Pulmonary Aspergillosis/epidemiology , Male , Postoperative Complications , Retrospective Studies , Risk Factors , Sputum/microbiology , Time Factors
ABSTRACT
INTRODUCTION: To determine long-term outcomes and risk factors for recurrence after thymectomy. METHODS: Patients who underwent thymectomy (n = 262) for a thymic tumor (1986-2010) were identified from a prospective database. Patients were classified according to World Health Organization (WHO) histologic classification, Masaoka staging system, and completeness of resection. Risk factors for recurrence (WHO histology, tumor size, Masaoka stage, and completeness of resection) were analyzed. RESULTS: Of 262 patients, 51% were female, median age was 55 years, and 39% had myasthenia gravis. Median follow-up was 7.5 years, median tumor size was 5.4 cm, and Masaoka stage distribution was: I (25%), II (47%), III (17%), IV (4%), and 7% not classified. Of 200 patients classified under the WHO system, 7% were type A, 22% type AB, and 71% type B; 83% had complete resection. One hundred sixty-nine patients received adjuvant radiotherapy, eight received adjuvant chemoradiotherapy, and 14 received neoadjuvant chemoradiotherapy. Overall survival was 95% at 5 years, 91% at 10 years, and 91% at 15 years. Recurrence occurred in 12 patients and disease-related death in four patients. Five patients underwent re-resection for recurrence, with survival of 2-15 years. Only Masaoka stage and tumor size were associated with a statistically significant risk of recurrence on multivariate analysis. CONCLUSION: Resectable thymoma is associated with excellent prognosis. Aggressive resection of recurrent disease yielded excellent long-term results. Higher Masaoka stage is associated with a greater chance of incomplete resection. Higher Masaoka stage and increasing tumor size are independent factors associated with recurrence.