ABSTRACT
BACKGROUND: We previously reported beneficial effects of prone positioning during ex vivo lung perfusion (EVLP) using porcine lungs. In this study, we sought to determine whether prone positioning during EVLP was beneficial in human donor lungs rejected for clinical use. METHODS: Human double lung blocs were randomized to prone EVLP (n = 5) or supine EVLP (n = 5). Lungs underwent 16 h of cold storage at 4°C followed by 2 h of cellular EVLP in either the prone or supine position. Lung function, compliance, and weight were evaluated, and transplant suitability was determined after 2 h of EVLP. RESULTS: Human lungs treated with prone EVLP had a significantly higher partial pressure of oxygen/fraction of inspired oxygen (P/F) ratio [348 (291-402) vs. 199 (191-257) mm Hg, p = 0.022] and significantly lower lung weight [926 (864-1078) vs. 1277 (1029-1483) g, p = 0.037] after EVLP. 3/5 cases in the prone group were judged suitable for transplant after EVLP, while 0/5 cases in the supine group were suitable. When function of upper vs. lower lobes was evaluated, prone EVLP lungs showed similar P/F ratios and inflammatory cytokine levels in lower vs. upper lobes. In contrast, supine EVLP lungs showed significantly lower P/F ratios [68 (59-150) vs. 467 (407-515) mm Hg, p = 0.012] and higher tissue tumor necrosis factor alpha levels [100.5 (46.9-108.3) vs. 39.9 (17.0-61.0) ng/ml, p = 0.036] in lower vs. upper lobes. CONCLUSIONS: Prone lung positioning during EVLP may optimize the outcome of EVLP in human donor lungs, possibly by improving lower lobe function.
Subject(s)
Lung Transplantation , Reperfusion Injury , Animals , Humans , Lung , Lung Transplantation/adverse effects , Oxygen , Perfusion , Reperfusion Injury/etiology , Reperfusion Injury/prevention & control , Reperfusion Injury/pathology , Swine
ABSTRACT
BACKGROUND: Currently, pulmonary edema in donor lungs is evaluated via surgical inspection and palpation, and there is no quantitative standard diagnostic tool for evaluating pulmonary edema during donor procurement and ex vivo lung perfusion (EVLP). The purpose of this study was to investigate the significance of lung weight at the donor hospital and lung weight during EVLP as complementary parameters of transplant suitability in EVLP. MATERIALS AND METHODS: Twenty-one rejected human lungs were perfused in cellular EVLP. Transplant suitability was evaluated at 2 h as per the standard criteria of Lund-protocol EVLP. RESULTS: Lung weight at the donor hospital was significantly correlated with the PaO2/FiO2 (P/F) ratio in EVLP (r = -0.44). There was a significant difference in lung weight at the donor hospital between suitable cases (n = 13) and nonsuitable cases (n = 8). In the light lung group (lung weight at donor hospital < 1280 g; n = 17), 76% of lungs were suitable for transplant, whereas none of the heavy lung group (lung weight at donor hospital ≥ 1280 g; n = 4) was suitable (P < 0.05). Lung weight at 2 h and lung weight change during EVLP were significantly associated with the P/F ratio at 2 h and transplant suitability (P < 0.05, each). CONCLUSIONS: Our findings demonstrate that lung weight at the donor hospital, lung weight change, and lung weight at 2 h of EVLP might be predictors of P/F ratio and transplant suitability in cellular EVLP.
Subject(s)
Lung Transplantation , Lung/pathology , Organ Preservation , Perfusion , Pulmonary Edema/diagnosis , Tissue and Organ Procurement/methods , Adult , Aged , Female , Humans , Male , Middle Aged , Organ Size , Pulmonary Edema/pathology
ABSTRACT
BACKGROUND: Donor lungs with a smoking history are perfused in ex vivo lung perfusion (EVLP) to expand the donor lung pool. However, the impact of hyperinflation of perfused lungs in EVLP remains unknown. The aim of this study was to investigate the significance of hyperinflation, using an ex vivo measurement, delta VT, during EVLP in smokers' lungs. MATERIALS AND METHODS: Seventeen rejected donor lungs with a median smoking history of 10 pack-years were perfused for 2 h in cellular EVLP. Hyperinflation was evaluated by measuring delta VT (the inspiratory minus expiratory tidal volume [VT] difference) at 1 h. All lungs were divided into two groups: negative delta VT (n = 11, no air-trapping pattern) and positive delta VT (n = 6, air-trapping pattern). Transplant suitability was judged at 2 h. Using lung tissue, linear intercept analysis was performed to evaluate the degree of hyperinflation. RESULTS: The positive delta VT group had significantly lower transplant suitability than the negative delta VT group (16 versus 81%, P = 0.035). The positive delta VT group was significantly associated with a lower partial pressure of oxygen/fraction of inspired oxygen ratio (278 versus 356 mm Hg, P = 0.049), higher static compliance (119 versus 98 mL/cm H2O, P = 0.050), higher lung weight ratio (1.10 versus 0.96, P = 0.014), and higher linear intercept ratio (1.52 versus 0.93, P = 0.005) than the negative delta VT group. CONCLUSIONS: Positive delta VT appears to be an ex vivo marker of ventilator-associated lung hyperinflation in smokers' lungs during EVLP.
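The delta VT measure described above is a simple difference of the ventilator's inspiratory and expiratory tidal volumes, with a positive value read as an air-trapping pattern. A minimal illustrative sketch of the calculation (function names and example volumes are hypothetical, not data from the study):

```python
def delta_vt(inspiratory_vt_ml: float, expiratory_vt_ml: float) -> float:
    """delta VT = inspiratory tidal volume minus expiratory tidal volume (mL)."""
    return inspiratory_vt_ml - expiratory_vt_ml

def air_trapping_pattern(d_vt_ml: float) -> bool:
    """A positive delta VT is interpreted as an air-trapping (hyperinflation) pattern."""
    return d_vt_ml > 0

# Hypothetical 1-h EVLP measurements (mL):
d = delta_vt(480.0, 455.0)
print(d, air_trapping_pattern(d))  # positive delta VT -> air-trapping pattern
```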
Subject(s)
Allografts/physiopathology , Lung Transplantation/standards , Lung/physiopathology , Smoking/physiopathology , Tissue and Organ Procurement/standards , Adult , Aged , Exhalation/physiology , Female , Humans , Male , Middle Aged , Organ Preservation , Perfusion , Smoking/adverse effects , Tidal Volume/physiology , Tissue Donors , Tissue and Organ Procurement/methods
ABSTRACT
BACKGROUND: Ex vivo lung perfusion (EVLP) permits extended evaluation of donor lungs for transplant. However, the optimal duration of Lund-protocol EVLP is unclear. Using human lungs rejected for clinical transplant, we sought to compare the results of 1 versus 2 h of EVLP using the Lund protocol. METHODS: Twenty-five pairs of human lungs rejected for clinical transplant were perfused with the Lund EVLP protocol. Blood gas analysis, lung compliance, bronchoscopy assessment, and perfusate cytokine analysis were performed at both 1 and 2 h. Recruitment was performed at both time points. Donor lung transplant suitability was determined at both time points. RESULTS: All cases were divided into four groups based on the transplant suitability assessment at 1 and 2 h of EVLP. In group A (n = 10), lungs were judged suitable for transplant at both 1 and 2 h of EVLP. In group B (n = 6), lungs were suitable at 1 h but nonsuitable at 2 h. In group C (n = 2), lungs were nonsuitable at 1 h but suitable at 2 h. Finally, in group D (n = 7), lungs were nonsuitable for transplant at both time points. In groups B and C combined (n = 8), the transplant suitability assessment changed between 1 and 2 h of EVLP. CONCLUSIONS: In human lungs rejected for transplant, transplant suitability differed at 1 versus 2 h of EVLP in 32% of lungs studied. Evaluation of lungs with Lund-protocol EVLP beyond 1 h may improve donor organ assessment.
Subject(s)
Donor Selection/methods , Lung Transplantation/standards , Lung/physiology , Perfusion , Transplants/physiology , Adult , Bronchoscopy , Donor Selection/standards , Female , Humans , Lung/diagnostic imaging , Male , Middle Aged , Pulmonary Gas Exchange/physiology , Time Factors , Transplants/diagnostic imaging
ABSTRACT
For more accurate lung evaluation in ex vivo lung perfusion (EVLP), we have devised a new parameter, the PaO2/FiO2 ratio difference (PFD): PFD1-0.4 = P/F ratio at FiO2 1.0 - P/F ratio at FiO2 0.4. The aim of this study was to compare PFD with transplant suitability and with the physiological parameters utilized in cellular EVLP. Thirty-nine human donor lungs were perfused. At 2 h of EVLP, PFD1-0.4 was compared with transplant suitability and physiological parameters. In a second study, 10 pig lungs were perfused in the same fashion. PFD1-0.4 was calculated from blood drawn from upper and lower lobe pulmonary veins and compared with lobe wet/dry ratios and pathological findings. In the human model, receiver operating characteristic curve analysis showed that PFD1-0.4 had the highest area under the curve (0.90) and sensitivity (0.96) for detecting nonsuitable lungs, as well as a significant negative correlation with lung weight ratio (R2 = 0.26, P < 0.001). In the pig model, PFD1-0.4 in the lower and upper lobe pulmonary veins was significantly associated with the corresponding lobe wet/dry ratios (R2 = 0.51, P = 0.019; R2 = 0.37, P = 0.060, respectively). PFD1-0.4 in EVLP demonstrated a significant correlation with lung weight ratio and allowed more precise assessment of individual lobes in detecting lung edema. Moreover, it might support decision-making in evaluation with current EVLP criteria.
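The PFD1-0.4 definition above is a difference of two P/F ratios taken at different FiO2 settings. A minimal sketch of the arithmetic (illustrative only; function names and the example gas values are hypothetical, not measurements from the study):

```python
def pf_ratio(pao2_mmhg: float, fio2: float) -> float:
    """P/F ratio: arterial oxygen partial pressure (mm Hg) divided by FiO2."""
    return pao2_mmhg / fio2

def pfd_1_04(pao2_at_fio2_1_0: float, pao2_at_fio2_0_4: float) -> float:
    """PFD1-0.4 = P/F ratio at FiO2 1.0 minus P/F ratio at FiO2 0.4."""
    return pf_ratio(pao2_at_fio2_1_0, 1.0) - pf_ratio(pao2_at_fio2_0_4, 0.4)

# Hypothetical perfusate PaO2 values (mm Hg) at the two FiO2 settings:
print(pfd_1_04(400.0, 140.0))  # ≈ 400/1.0 - 140/0.4 = 50 mm Hg
```

A small PFD1-0.4 (the two ratios converging) is what the study links to edema: the negative correlation with lung weight ratio reported above.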
Subject(s)
Lung Transplantation , Lung/pathology , Lung/physiology , Respiratory Function Tests/standards , Adult , Animals , Death , Extracorporeal Circulation , Female , Humans , Male , Middle Aged , Organ Size , Oxygen , Perfusion , Pulmonary Veins/physiology , ROC Curve , Sensitivity and Specificity , Swine , Tissue Donors , Tissue and Organ Procurement , Warm Ischemia
ABSTRACT
BACKGROUND: Real-time lung weight (LW) measurement is a simple and noninvasive technique for detecting extravascular lung water during ex vivo lung perfusion (EVLP). We investigated the feasibility of real-time LW measurement in clinical EVLP as a predictor of transplant suitability and post-transplant outcomes. METHODS: In our clinical acellular EVLP protocol, real-time LW was measured in 117 EVLP cases from June 2019 to June 2022. The estimated LW gain at each time point was calculated using a scale placed under the organ chamber. The lungs were classified into 4 categories based on LW adjusted for height and compared between suitable and unsuitable cases. The relationship between estimated LW gain and primary graft dysfunction was also investigated. RESULTS: The estimated LW gain during EVLP significantly correlated with the LW gain (post-EVLP LW minus pre-EVLP LW) measured on the back table (R2 = 0.61, p < 0.01). In adjusted LW categories 2 to 4, the estimated LW gain at 0-1 hour of EVLP was significantly higher in unsuitable cases than in suitable cases. The area under the curve for the estimated LW gain was ≥0.80. Primary graft dysfunction grades 0 to 1 had a significantly lower estimated LW gain at 60 minutes than grades 2 to 3 (-43 vs 1 g, p < 0.01). CONCLUSIONS: Real-time LW measurement can predict transplant suitability and post-transplant outcomes through the early detection of extravascular lung water during the initial 1 hour of EVLP.
ABSTRACT
BACKGROUND: Ex vivo lung perfusion expands the lung transplant donor pool and extends preservation time beyond cold static preservation. We hypothesized that repeated regular ex vivo lung perfusion would better maintain lung grafts. METHODS: Ten pig lungs were randomized into 2 groups. The control group underwent 16 h of cold ischemic time and 2 h of cellular ex vivo lung perfusion. The intermittent ex vivo lung perfusion group underwent cold ischemic time for 4 h, a first ex vivo lung perfusion for 2 h, cold ischemic time for 10 h, and a second ex vivo lung perfusion for 2 h. Lungs were assessed, and transplant suitability was determined after 2 h of ex vivo lung perfusion. RESULTS: The second ex vivo lung perfusion was significantly associated with better oxygenation, limited extravascular water, higher adenosine triphosphate, reduced intraalveolar edema, and well-preserved mitochondria compared with the control, despite proinflammatory cytokine elevation. No significant difference was observed between the first and second perfusions regarding oxygenation and adenosine triphosphate, whereas the second was associated with lower dynamic compliance and higher extravascular lung water than the first. Transplant suitability was 100% for the first and 60% for the second ex vivo lung perfusion, and 0% for the control. CONCLUSIONS: The second ex vivo lung perfusion showed a slight deterioration in graft function compared with the first. Intermittent ex vivo lung perfusion created a better condition for lung grafts than cold static preservation, despite cytokine elevation. These results suggest that intermittent ex vivo lung perfusion may help prolong lung preservation.
Subject(s)
Lung Transplantation , Organ Preservation , Swine , Animals , Organ Preservation/methods , Lung , Perfusion/adverse effects , Perfusion/methods , Lung Transplantation/adverse effects , Lung Transplantation/methods , Cytokines , Adenosine Triphosphate
ABSTRACT
Objective: Patients with thoracic aortic disease commonly present with concomitant multisegment pathology. We describe the patient population, analyze outcomes, and define the patient selection strategy for valve-preserving aortic root reimplantation (VPARR) combined with an arch procedure. Methods: From 2008 to 2018, 98 patients underwent VPARR combined with an aortic arch procedure (hemi-arch, 50% [n = 49, limited repair]; total arch, 50% [n = 49, complete repair], including 39 with elephant trunk). Indications for surgery were aneurysmal disease (61%) and aortic dissection (39%). The median follow-up was 17 months (IQR, 8 to 60 months). Results: There were no operative deaths or cases of paraplegia, and 5 patients underwent re-exploration for bleeding. During follow-up, 2 patients required aortic valve replacement for severe aortic insufficiency at 1 and 5 years, and 4 patients died. In the limited repair group, 1 patient underwent reintervention for aortic arch replacement, whereas 4 patients underwent planned intervention (1 endovascular and 3 open thoracoabdominal aortic repairs). In the complete repair group, 23 patients underwent planned intervention (15 endovascular and 8 open thoracoabdominal repairs). Conclusions: Single-stage complete proximal aortic repair, including VPARR combined with total aortic arch replacement, is as safe and feasible to perform as limited arch repair and facilitates further intervention in carefully selected patients with diffuse aortic pathology at centers of expertise.
Subject(s)
Aortic Aneurysm, Thoracic , Blood Vessel Prosthesis Implantation , Aorta, Thoracic/surgery , Aortic Aneurysm, Thoracic/etiology , Aortic Aneurysm, Thoracic/surgery , Blood Vessel Prosthesis Implantation/methods , Humans , Patient Selection , Replantation , Retrospective Studies , Treatment Outcome
ABSTRACT
BACKGROUND: Elevated donor lung weight may adversely affect donor lung transplant suitability and post-transplant outcomes. The objective of this study was to investigate the impact of lung weight after procurement and ex vivo lung perfusion (EVLP) on transplant suitability, post-transplant graft dysfunction, and clinical outcomes, and to define the donor lung weight range most relevant to clinical outcomes. METHODS: From February 2016 to August 2020, 365 human lung donors to a single transplant center were retrospectively reviewed. Of these, 239 were transplanted without EVLP, 74 were treated with EVLP (50 went on to transplant), and 52 were declined for transplant without EVLP consideration. Donor lung weights were measured immediately after procurement and, when performed, after EVLP. Lung weights were adjusted by donor height and divided into 4 quartiles. RESULTS: Donor lungs in the highest weight quartile at the donor hospital had a significantly lower transplant suitability rate after EVLP, higher rates of primary graft dysfunction grade 3 at 72 hours, and longer intensive care unit/hospital stays. For lungs treated with lung perfusion, the highest lung weight quartile at the end of lung perfusion was associated with a significantly lower transplant suitability rate, a higher incidence of primary graft dysfunction grade 3 at 72 hours, and a longer intensive care unit/hospital stay, compared with the other categories. CONCLUSIONS: Donor lung weight stratified by quartile can assist decision-making regarding the need for EVLP at the donor hospital as well as during EVLP evaluation. Caution should be used when considering donor lungs in the highest weight quartile for transplantation.
Subject(s)
Lung Transplantation , Primary Graft Dysfunction , Humans , Lung , Perfusion , Primary Graft Dysfunction/epidemiology , Retrospective Studies , Tissue Donors
ABSTRACT
OBJECTIVES: Survival is poor following an orthotopic heart transplant with gender-mismatched donors and recipients. Patients bridged to an orthotopic heart transplant with a ventricular assist device (VAD) frequently become sensitized. We hypothesized that the combination of VAD bridging and gender-mismatch may result in greater rejection and poorer survival. METHODS: Data were obtained from the United Network of Organ Sharing database. Patients were divided into 4 groups: (i) VAD recipients who received a heart from a gender-matched donor (VAD-M); (ii) VAD recipients who received a heart from a gender-mismatched donor (VAD-MM); (iii) noVAD recipients who received a heart from a gender-matched donor (noVAD-M); and (iv) noVAD recipients who received a heart from a gender-mismatched donor (noVAD-MM). Rejection episodes within 1 year post-transplant and transplant survival were compared in VAD-M versus VAD-MM and noVAD-M versus noVAD-MM groups, respectively. RESULTS: Between January 2000 and June 2017, of 33,401 adult patients who underwent heart transplants, 8648, 2441, 12,761, and 4992 patients were identified as VAD-M, VAD-MM, noVAD-M, and noVAD-MM, respectively. Rejection within 1 year post-transplant occurred in 23.3% and 27.3% of the VAD-M and VAD-MM groups, respectively (P < 0.01), and in 21.8% and 23.6% of the noVAD-M and noVAD-MM groups, respectively (P = 0.02). In an adjusted survival analysis, the VAD-MM group showed significantly worse survival than the VAD-M group (P < 0.01), whereas there was no significant difference between the noVAD-M and noVAD-MM groups (P = 0.21). CONCLUSIONS: Our results indicate that the combination of VAD bridging and gender-mismatch was associated with greater rejection and worse survival following transplant. Further study is necessary to confirm comparable post-transplant survival of gender-matched or -mismatched recipients without VAD bridging.
Subject(s)
Heart Transplantation , Heart-Assist Devices , Adult , Graft Rejection/epidemiology , Graft Survival , Humans , Retrospective Studies , Tissue Donors , Treatment Outcome
ABSTRACT
BACKGROUND: Blood transfusion can have detrimental effects on the pulmonary system, leading to lung injury and respiratory decompensation with subsequent increased morbidity and mortality in surgical and critically ill patients. How much of this effect is carried from a lung donor to the transplant recipient is not fully understood, raising questions regarding the transplant suitability of lungs from transfused donors. METHODS: United Network for Organ Sharing data were reviewed. Lung transplants from adult donors with known donor transfusion status were included; multiorgan transplants and retransplants were excluded. Recipient mortality was evaluated based on donor and recipient characteristics using a Kaplan-Meier survival estimate, Cox proportional hazards, and logistic regression models. We further assessed whether recipient mortality risk modified the donor transfusion effect. RESULTS: From March 1996 to June 2017, 20,294 transplants were identified. Outcome analysis based on transfusion status showed no significant difference in 1-year mortality (P = .214). Ninety-day recipient mortality was significantly higher with transfusion of >10 units (U) than with 1-10 U or no transfusion (8.5%, 6.1%, and 6.0%, respectively; P = .005). Multivariable analysis showed increased 90-day mortality with transfusion of >10 U compared with no transfusion (odds ratio 1.62, P < .001), whereas 1-10 U showed no difference (odds ratio 1.07, P = .390). When stratified by recipient transplant risk, transfusion of >10 U was associated with increased mortality even in the lowest-risk recipients, while transfusion of 1-10 U showed no mortality increase even in the highest-risk recipients. CONCLUSIONS: Donor transfusion of >10 U of blood was associated with increased 90-day recipient mortality even in low-risk transplants. This risk should be considered when evaluating donor lungs.
Subject(s)
Blood Transfusion , Lung Transplantation/mortality , Tissue Donors , Transplant Recipients/statistics & numerical data , Adult , Cause of Death , Female , Graft Survival , Humans , Kaplan-Meier Estimate , Male , Models, Statistical , Risk Factors
ABSTRACT
BACKGROUND: Lung transplantation (LTx) is a definitive treatment for end-stage lung disease. Herein, we reviewed our center's experience over 3 decades to examine the evolution of recipient characteristics and contemporary predictors of survival for LTx. METHODS: We retrospectively reviewed the data of LTx procedures performed at our institution from January 1990 to January 2019 (n = 1819). The cohort was divided into 3 eras; I: 1990-1998 (n = 152), II: 1999-2008 (n = 521), and III: 2009-2018 (n = 1146). Univariate and multivariate analyses of survival in era III were performed. RESULTS: Pulmonary fibrosis has become the leading indication for LTx (13% in era I, 57% in era III). Median recipient age increased (era I: 46 y; era III: 61 y), as did the use of intraoperative mechanical circulatory support (era I: 0%; era III: 6%). A higher lung allocation score was associated with primary graft dysfunction (P < 0.0001), postoperative extracorporeal mechanical support (P < 0.0001), and in-hospital mortality (P = 0.002). In era III, hypoalbuminemia, thrombocytopenia, and high primary graft dysfunction grade were multivariate predictors of early mortality. The 5-y survival in eras II (55%) and III (55%) was superior to that in era I (40%, P < 0.001). Risk factors for late mortality in era III included recipient age, chronic allograft dysfunction, renal dysfunction, high model for end-stage liver disease score, and single LTx. CONCLUSIONS: In this longitudinal single-center study, recipient characteristics have evolved to include sicker patients with greater complexity of procedures and risk for postoperative complications, but without significant impact on hospital mortality or long-term survival. With advancing surgical techniques and perioperative management, there is room for further progress in the field.
Subject(s)
End Stage Liver Disease , Lung Transplantation , End Stage Liver Disease/etiology , Humans , Lung Transplantation/adverse effects , Postoperative Complications/etiology , Retrospective Studies , Severity of Illness Index
ABSTRACT
BACKGROUND: Severe gastrointestinal (GI) complications (GICs) after cardiac surgery are associated with poor outcomes. Herein, we characterize the severe forms of GICs and the associated risk factors for mortality. METHODS: We retrospectively analyzed the clinically significant postoperative GICs after cardiac surgical procedures performed at our institution from January 2010 to April 2017. Multivariable analysis was used to identify predictors of in-hospital mortality. RESULTS: Of 29,909 cardiac surgical procedures, GICs occurred in 1037 patients (3.5% incidence), with overall in-hospital mortality of 14% compared with 1.6% in those without GICs. GICs were encountered in older patients with multiple comorbidities who underwent complex prolonged procedures. The most lethal GICs were mesenteric ischemia (n = 104), hepatopancreatobiliary (HPB) dysfunction (n = 139), and GI bleeding (n = 259), with mortality rates of 45%, 27%, and 17%, respectively. In the mesenteric ischemia subset, coronary artery disease (odds ratio [OR], 4.57; P = .002), coronary bypass grafting (OR, 6.50; P = .005), reoperation for bleeding/tamponade (OR, 12.07; P = .01), and vasopressin use (OR, 11.27; P < .001) were predictors of in-hospital mortality. In the HPB complications subset, hepatic complications occurred in 101 patients (73%), pancreatitis in 38 (27%), and biliary disease in 31 (22%). GI bleeding occurred in 20 patients (31%) with HPB dysfunction. In the GI bleeding subset, HPB disease (OR, 10.99; P < .001) and bivalirudin therapy (OR, 12.84; P = .01) were predictors of in-hospital mortality. CONCLUSIONS: Although relatively uncommon, severe forms of GICs are associated with high mortality. Early recognition and aggressive treatment are mandatory to improve outcomes.
Subject(s)
Cardiac Surgical Procedures/adverse effects , Gastrointestinal Diseases/etiology , Gastrointestinal Diseases/mortality , Postoperative Complications/etiology , Postoperative Complications/mortality , Aged , Female , Hospital Mortality , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Severity of Illness Index
ABSTRACT
INTRODUCTION: Lung transplantation outcomes are influenced by the intraoperative mechanical cardiopulmonary support strategy used. This surgery was historically done either on cardiopulmonary bypass (CPB) or off pump. Recently, there has been increased interest in intraoperative support with veno-arterial (VA) or veno-venous (VV) extracorporeal membrane oxygenation (ECMO). However, there is a lack of consensus on the relative risks, benefits, and indications for each intraoperative support strategy. AREAS COVERED: This review includes information from cohort studies, case-control studies, and case series that compare the morbidity and/or mortality of two or more intraoperative cardiopulmonary support strategies during lung transplantation. EXPERT OPINION: The optimal strategy for intraoperative cardiopulmonary support during lung transplantation remains an area of debate. Current data suggest that the off-pump approach is associated with better outcomes and could be considered whenever feasible. ECMO is generally associated with preferable outcomes to CPB, but the data supporting this association are not robust. Interestingly, whether CPB is unplanned or prolonged might influence outcomes more than the use of CPB itself. These observations can help guide surgical teams in their approach to intraoperative mechanical support strategy during lung transplantation and should serve as the basis for further investigations.
Subject(s)
Cardiopulmonary Bypass , Extracorporeal Membrane Oxygenation , Lung Transplantation , Humans , Treatment Outcome
ABSTRACT
Use of prone positioning during ex vivo lung perfusion (EVLP) with the Toronto protocol reduced pulmonary edema in marginal human donor lungs. This report describes 2 cases in which prone positioning during EVLP significantly reduced lung weight. One of the 2 cases resulted in successful double-lung transplantation.
Subject(s)
Lung Transplantation/methods , Lung/physiology , Lung/surgery , Perfusion/methods , Preoperative Care/methods , Humans , Male , Middle Aged , Prone Position , Treatment Outcome , Young Adult
ABSTRACT
BACKGROUND: The direCt Lung Ultrasound Evaluation (CLUE) technique was proven to be an accurate method for monitoring extravascular lung water in donor lungs during ex vivo lung perfusion (EVLP) in an experimental model. The aim of this study was to examine the application of CLUE in the clinical setting. METHODS: Lungs were evaluated using an acellular EVLP protocol. Ultrasound images were obtained directly from the lung surface. Images were graded according to the percentage of B-lines seen on ultrasound. CLUE scores were calculated at the beginning and end of EVLP for the whole lung, each side, and each lobe, based on the number of images in each grade and the total number of images taken, and were evaluated retrospectively. RESULTS: A total of 23 EVLP cases were performed, resulting in 13 lung transplants (LTxs) with no hospital mortality. Primary graft dysfunction (PGD) occurred in only 1 recipient (PGD3, no PGD2). Significant differences were found between suitable and non-suitable lungs in CLUE scores (1.03 vs 1.85, p < 0.001), unlike the partial pressure of oxygen/fraction of inspired oxygen ratio. CLUE had the highest area under the receiver operating characteristic curve (0.98) compared with other evaluation parameters. The initial CLUE score of standard donor lungs was significantly better than that of marginal lungs. The final CLUE score in prone-positioned lungs showed improvement over the initial CLUE score, especially in the upper lobes. CONCLUSIONS: The CLUE technique shows the highest accuracy in evaluating donor lungs for LTx suitability compared with other parameters used in EVLP. CLUE can optimize the outcomes of LTx by guiding decision-making throughout the whole process of clinical EVLP.
Subject(s)
Extracorporeal Circulation/methods , Lung Transplantation , Perfusion/methods , Primary Graft Dysfunction/prevention & control , Tissue Donors , Adult , Female , Humans , Male , Middle Aged , Primary Graft Dysfunction/diagnosis , ROC Curve , Retrospective Studies , Ultrasonography
ABSTRACT
BACKGROUND: Lung ischemia-reperfusion injury after transplantation is associated with worse clinical outcomes. MicroRNAs (miRs) are critical regulators of gene expression that could provide potential targets for novel gene therapy. Herein, we aimed to examine the feasibility of using the ex vivo lung perfusion (EVLP) platform to examine changes in miR expression in human lungs in response to cold ischemia and ex vivo reperfusion (CI/EVR). METHODS: Twenty-four human lungs were perfused in a cellular EVLP system for 2 h, and tissue samples were obtained before and after EVLP as well as from control donors. MicroRNA expression profiling of the lung tissue was performed using next-generation sequencing, and downstream predicted target genes were examined. In situ hybridization assays of the validated miRs were used to identify the expressing cell type. RESULTS: After 2 h of EVLP, cytokine production was significantly increased (IL-1β, IL-6, IL-8, IL-10, and TNF-α). MicroRNA sequencing identified a significant change in the expression of a total of 21 miRs after CI and 47 miRs after EVR. Validation using quantitative polymerase chain reaction showed significant upregulation of miR-17 and miR-548b after CI/EVR. Downstream analysis identified abundant inflammatory and immunologic targets of miR-17 and miR-548b that are known mediators of lung injury. In situ hybridization assays detected positive signals of the expression of the 2 miRs in alveolar epithelial cells. CONCLUSIONS: This study demonstrates the feasibility of using the EVLP platform to study miR signatures in human lungs in response to CI/EVR. We found that miR-17 and miR-548b were upregulated in alveolar epithelial cells after CI/EVR, which merits further exploration.
Subject(s)
Cold Ischemia , Lung Transplantation , Lung/metabolism , MicroRNAs/physiology , Reperfusion , Cytokines/biosynthesis , Humans , Reperfusion Injury/etiology , Sequence Analysis, RNA , Up-Regulation
ABSTRACT
OBJECTIVE: Prone positioning has been shown to improve oxygenation in patients with lung injury. We hypothesized that prone positioning of lungs during ex vivo lung perfusion (EVLP) can not only improve oxygenation but also diminish ischemia-reperfusion injury (IRI). The aim of our study was to evaluate the potential benefits of prone positioning of lungs during EVLP compared with the standard supine position. METHODS: Ten pigs were kept in the supine position at room temperature for 2 hours after circulatory death, after which lungs were procured and subjected to 5 hours of cold storage. Lungs then underwent 2 hours of cellular EVLP with either supine positioning (Control group, n = 5) or prone positioning (Prone group, n = 5). Lung function was evaluated by assessment of physiological parameters, tissue histology, and cytokines. RESULTS: IRI in the Prone group was significantly less than in the Control group. Lungs in the Prone group had a significantly greater partial pressure of oxygen/fraction of inspired oxygen ratio, median (minimum-maximum) (301 mm Hg [272-414 mm Hg] vs 166 mm Hg [109-295 mm Hg], P = .03), better static compliance (38.9 mL/cmH2O [31.1-44.3 mL/cmH2O] vs 21.5 mL/cmH2O [12.2-33.3 mL/cmH2O], P = .03), lower lung weight ratio (1.26 [1.24-1.41] vs 1.48 [1.36-2.34], P = .02), and lower interleukin-1β levels (1.6 ng/mL [0.9-5.3 ng/mL] vs 7.5 ng/mL [5.0-16.1 ng/mL], P = .04) compared with lungs in the Control group. CONCLUSIONS: These data suggest that prone positioning of lungs during EVLP may diminish IRI and improve lung function.
Subject(s)
Lung/blood supply , Reperfusion Injury/prevention & control , Animals , Disease Models, Animal , Female , Lung/pathology , Lung/physiology , Lung/surgery , Perfusion , Prone Position , Reperfusion Injury/pathology , Supine Position , Swine
ABSTRACT
OBJECTIVES: Typically, single-lung ex vivo lung perfusion (SL-EVLP) is preferred when there is concern about contamination from the opposite lung. However, a comprehensive assessment of SL-EVLP has not been completed. The purpose of this study was to compare the physiological parameters of SL-EVLP and double-lung EVLP (DL-EVLP) in the assessment of transplant suitability. METHODS: Seven pairs of rejected donor lungs were perfused in cellular EVLP, with a tidal volume of 6 ml/kg ideal body weight and a perfusion flow of 70 ml/kg/min. The transplant suitability of each side was judged in DL-EVLP. Subsequently, the tidal volume and flow were reduced by half. Right SL-EVLP was maintained for 10 min by clamping the left main pulmonary artery and bronchus; left SL-EVLP was performed similarly. The physiological parameters were compared between SL-EVLP and DL-EVLP. RESULTS: The PO2/FiO2 ratio was significantly lower in SL-EVLP than in DL-EVLP [182.5 (127.5-309.5) vs 311.5 (257.5-377.0) mmHg, P < 0.001]. SL-EVLP was significantly associated with a higher shunt fraction and a higher pulmonary vein PCO2 than DL-EVLP. There was no difference in peak inspiratory and plateau pressures between SL-EVLP and DL-EVLP. Suitable lungs (n = 6) were associated with better PO2/FiO2 ratios and lower airway pressures than non-suitable lungs (n = 8). CONCLUSIONS: In SL-EVLP, peak inspiratory and plateau pressures have clinical utility in the assessment of transplant suitability. Importantly, the PO2/FiO2 ratio in SL-EVLP is appreciably lower than that in DL-EVLP; this discrepancy should be considered when evaluating transplant suitability in SL-EVLP.
Subject(s)
Lung Transplantation , Lung/blood supply , Perfusion/methods , Tissue Donors , Tissue and Organ Procurement/methods , Adult , Female , Humans , Male , Middle Aged , Pressure , Pulmonary Artery
ABSTRACT
BACKGROUND: Although the safety and feasibility of combined coronary artery bypass grafting (CABG) and bone marrow stem cell (BMSC) transplantation have been proven, the efficacy of this approach remains controversial. Therefore, we conducted an updated meta-analysis of randomized controlled trials to evaluate the efficacy of this procedure. METHODS: Electronic databases were systematically searched for randomized trials comparing 4-month to 6-month follow-up outcomes in patients who underwent isolated CABG (CABG group) and patients who received BMSC transplantation with CABG (BMSC group). A random-effects meta-analysis was conducted across eligible studies. Meta-regression and subgroup analyses were used to identify sources of data heterogeneity. RESULTS: Thirteen trials were eligible, with a total of 292 patients in the BMSC group and 247 patients in the CABG group. Compared with the CABG group, the BMSC group showed significant improvement in follow-up left ventricular ejection fraction (n = 539; 4.8%; 95% confidence interval [CI], 2.3%-7.3%; P = .001). The analyzed data showed significant heterogeneity (I² = 74.2%, P < .001). The reduction in scar size (n = 120; -2.2 mL; 95% CI, -18.2 mL to 13.7 mL; P = .44) and the improvement in the 6-minute walk test (n = 212; 41 m; 95% CI, -13 m to 95 m; P = .10) did not reach statistical significance. No significant correlation was found between the number of injected BMSCs or the method of injection and the change in ejection fraction. CONCLUSIONS: The present evidence suggests that combined CABG and BMSC transplantation is associated with improvement of left ventricular ejection fraction. However, the heterogeneity in the data suggests variations in patient response to this therapy. Further studies are required to understand these variations.
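The random-effects pooling and I² heterogeneity statistic reported above can be illustrated with a minimal sketch of the DerSimonian-Laird method, one common random-effects estimator (the abstract does not specify which estimator was used). The effect estimates and variances below are invented for illustration and are not data from the included trials.

```python
import math

def random_effects_pool(effects, variances, z=1.96):
    """DerSimonian-Laird random-effects meta-analysis.

    effects: per-study effect estimates (e.g., change in LVEF, %)
    variances: per-study sampling variances
    Returns (pooled_effect, ci_low, ci_high, i_squared).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]  # fixed-effect (inverse-variance) weights
    y_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q measures observed between-study dispersion
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0  # I² (%)
    return pooled, pooled - z * se, pooled + z * se, i2

# Illustrative (hypothetical) data: LVEF change (%) and sampling variances
pooled, lo, hi, i2 = random_effects_pool([4.0, 5.0, 6.0], [1.0, 1.0, 1.0])
```

When τ² estimates to zero, as in this toy example, the random-effects weights reduce to the fixed-effect weights; a large I² (such as the 74.2% reported above) would inflate τ², widening the pooled confidence interval.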