ABSTRACT
PURPOSE: We report outcomes associated with ex vivo lung perfusion (EVLP) lungs in high-risk lung transplant recipients using a national database. METHODS: We performed a retrospective analysis of the UNOS database (1/1/2018-3/31/2024). High-risk status was defined as mean pulmonary arterial pressure > 35 mmHg, lung retransplantation, or bridge to transplant. In addition to univariable analysis, a propensity score-matched analysis was performed using donor and recipient characteristics as matching covariates. RESULTS: Risk of dying on the waitlist was significantly higher for high-risk candidates (HR: 1.69 [1.51-1.89], p < 0.001). Following matching, 203 EVLP cases were matched to 609 standard procurement recipients. The EVLP group was associated with higher rates of postoperative acute kidney injury (AKI) requiring renal replacement therapy (27% vs 16%, p < 0.001), higher mortality during the index admission (13% vs 8%, p = 0.04), and longer length of stay (29 vs 25 days, p = 0.006). EVLP modality was associated with survival time (p < 0.001), with portable EVLP having significantly shorter survival (2.7 years) relative to standard cases (4.7 years, p < 0.02). A subgroup analysis found that this survival effect was limited to bridge and retransplant recipients. CONCLUSIONS: EVLP lungs were associated with higher rates of postoperative AKI, and portable EVLP was associated with shorter survival in high-risk lung transplant recipients. However, given the high waitlist mortality in this candidate population, EVLP lungs should still be considered an alternative.
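The roughly 1:3 propensity-matched design described above (203 EVLP cases to 609 standard-procurement controls) can be sketched as follows. This is an illustrative sketch only: the covariates, the logistic propensity model, and the greedy nearest-neighbour matcher are assumptions, not the study's actual UNOS variables or matching algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical donor/recipient covariates; `treated` marks EVLP cases.
# Column meanings (age, BMI, ischemic time, allocation score) are invented.
n = 1000
X = rng.normal(size=(n, 4))
treated = rng.random(n) < 0.2

# 1) Fit a propensity model: P(EVLP | covariates).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2) Greedy 1:3 nearest-neighbour matching on the propensity score,
#    without replacement, mirroring the ~1:3 matched cohort sizes.
available = set(np.flatnonzero(~treated).tolist())
matches = {}
for t in np.flatnonzero(treated):
    pool = np.array(sorted(available))
    if len(pool) < 3:
        break
    nearest = pool[np.argsort(np.abs(ps[pool] - ps[t]))[:3]]
    matches[int(t)] = nearest.tolist()
    available -= set(nearest.tolist())

print(len(matches), "treated cases matched 1:3")
```

After matching, balance would normally be checked (e.g. standardized mean differences of each covariate between matched groups) before comparing outcomes.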
ABSTRACT
This special article is the 17th in an annual series for the Journal of Cardiothoracic and Vascular Anesthesia. The authors thank the editor-in-chief, Dr Kaplan, and the editorial board for the opportunity to continue this series, namely, the research highlights of the past year in the specialty of cardiothoracic and vascular anesthesiology.1 The major themes selected for 2024 are outlined in this introduction, and each highlight is reviewed in detail in the main article. The literature highlights in the specialty for 2024 begin with an update on perioperative rehabilitation and enhanced recovery in cardiothoracic surgery, with a focus on novel methods to best assess patients in the preoperative period and the impact of implementing enhanced recovery care models on outcomes. The second major theme is cardiac surgery, with the authors discussing new insights into anemia, transfusions, and coronary artery bypass grafting outcomes, with a focus on gender disparities. The third theme is cardiothoracic transplantation, with discussions focusing on techniques related to lung transplantation, including mechanical circulatory support. The fourth theme is mechanical circulatory support, with discussions exploring advancements in left ventricular assist devices that highlight the evolving landscape of mechanical circulatory support, as well as anticoagulation practices. The fifth and final theme is an update on medical cardiology, with a focus on the outcomes of transcatheter management of regurgitant pathology, device management in heart failure, and new techniques in catheter ablation. The themes selected for this article are only a few of the diverse advances in the specialty during 2024. These highlights will inform the reader of key updates on a variety of topics, leading to improvement in perioperative outcomes for patients with cardiothoracic and vascular disease.
ABSTRACT
BACKGROUND: In lung transplantation (LuTx), various ischemic phases exist, yet the rewarming ischemia time (RIT) during implantation has often been overlooked. During RIT, lungs are deflated and exposed to body temperature in the recipient's chest cavity. Our prior clinical findings demonstrated that prolonged RIT increases the risk of primary graft dysfunction. However, the molecular mechanisms of rewarming ischemic injury in this context remain unexplored. We aimed to characterize the rewarming ischemia phase during LuTx by measuring organ temperature and comparing transcriptome and metabolome profiles in tissue obtained at the end versus the start of implantation. METHODS: In a clinical observational study, 34 double-LuTx procedures with ice preservation were analyzed. Lung core and surface temperatures (n=65 and 55 lungs, respectively) were measured during implantation. Wedge biopsies (n=59 lungs) were taken from the right middle lobe and left lingula at the start and end of implantation. Tissue transcriptomic and metabolomic profiling was performed. RESULTS: Temperature increased rapidly during implantation, reaching core/surface temperatures of 21.5°C/25.4°C within 30 minutes. Transcriptomics showed increased pro-inflammatory signaling and oxidative stress at the end of implantation. Upregulation of NLRP3 and NFKB1 correlated with RIT. Metabolomics indicated elevated levels of amino acids, hypoxanthine, uric acid, and cysteine-glutathione disulfide, alongside decreased levels of glucose and carnitines. Arginine, tyrosine, and 1-carboxyethylleucine correlated with incremental RIT. CONCLUSIONS: The final rewarming ischemia phase in LuTx involves rapid organ rewarming, accompanied by transcriptomic and metabolomic changes indicating pro-inflammatory signaling and disturbed cell metabolism. Limiting implantation time and lung cooling represent potential interventions to alleviate rewarming ischemic injury.
ABSTRACT
This review evaluates the effectiveness of veno-pulmonary support with an oxygenator using extracorporeal membrane oxygenation (VP ECMO) as a bridge-to-lung-transplantation strategy in patients undergoing veno-venous ECMO while awaiting lung transplantation. Examining indications, contraindications, and clinical outcomes, the study highlights potential benefits, drawing insights from successful cases in South Korea and the United States. Despite limited sample sizes, VP ECMO emerges as a promising approach for further investigation in lung transplantation support. The review emphasizes its role in improving hemodynamic status, preventing complications during extended waiting periods, and presenting a cost-effective alternative to traditional methods, especially in developing countries. With in-hospital mortality rates ranging from 0% to 10%, comparable to other approaches, cautious optimism surrounds VP ECMO, and expanded research is needed to solidify its standing in enhancing patient outcomes, reducing costs, and promoting transplant success.
ABSTRACT
We aimed to evaluate the image quality and diagnostic performance for chronic lung allograft dysfunction (CLAD) of lung ventilation single-photon emission computed tomography (SPECT) images acquired over a short time and enhanced with a convolutional neural network (CNN) in patients after lung transplantation, and to explore the feasibility of short acquisition times. We retrospectively identified 93 consecutive lung transplant recipients who underwent ventilation SPECT/computed tomography (CT). We employed a CNN to predict full-acquisition-time images from those acquired in a short time. Image quality was evaluated using the structural similarity index (SSIM) loss and the normalized mean square error (NMSE). The correlation between the functional volume/morphological volume (F/M) ratios of full-time SPECT images and predicted SPECT images was evaluated. Differences in the F/M ratio were evaluated using Bland-Altman plots, and diagnostic performance was compared using the area under the curve (AUC). The learning curve, obtained using the MSE, converged within 100 epochs. The NMSE was significantly lower (P < 0.001) and the SSIM significantly higher (P < 0.001) for the CNN-predicted SPECT images compared with the short-time SPECT images. The F/M ratios of full-time SPECT images and predicted SPECT images showed a significant correlation (r = 0.955, P < 0.0001). The Bland-Altman plot revealed a bias of -7.90% in the F/M ratio. The AUC values were 0.942 for full-time SPECT images, 0.934 for predicted SPECT images, and 0.872 for short-time SPECT images. Our findings suggest that a deep-learning-based approach can significantly curtail the acquisition time of ventilation SPECT while preserving image quality and diagnostic accuracy for CLAD.
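The two image-quality metrics named above, NMSE and SSIM, can be computed directly. The numpy-only sketch below uses synthetic images and a simplified single-window SSIM (the standard metric averages SSIM over local windows), so it illustrates the comparison rather than reproducing the study's pipeline.

```python
import numpy as np

def nmse(pred, ref):
    """Normalized mean square error between a predicted and reference image."""
    return np.sum((pred - ref) ** 2) / np.sum(ref ** 2)

def global_ssim(x, y, data_range=1.0, k1=0.01, k2=0.03):
    """Single-window SSIM over the whole image (simplified sketch)."""
    c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx**2 + my**2 + c1) * (vx + vy + c2))

# Toy example: a denoised prediction should sit closer to the full-time
# reference than the noisy short-time image does.
rng = np.random.default_rng(1)
ref = rng.random((64, 64))                    # stands in for full-time SPECT
short = ref + rng.normal(0, 0.3, ref.shape)   # noisy short-time scan
pred = ref + rng.normal(0, 0.05, ref.shape)   # CNN-denoised estimate

assert nmse(pred, ref) < nmse(short, ref)
assert global_ssim(pred, ref) > global_ssim(short, ref)
```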
ABSTRACT
Lung transplantation is the only definitive therapy for end-stage pulmonary disease. Less than 20% of offered lungs are successfully transplanted due to a limited ischemic time window and poor donor lung quality manifested by pulmonary edema, hypoxia, or trauma. Poor donor organ recovery and utilization are therefore significant barriers to wider implementation of this life-saving therapy. While ischemia reperfusion injury (IRI) is often identified as the underlying molecular insult leading to immediate poor lung function in the post-operative period, this injury encompasses several pathways of cellular injury in addition to the recruitment of the innate immune system to the site of injury, which propagates the inflammatory cascade. Pyroptosis, a form of programmed cell death often associated with IRI, is a central molecular inflammatory pathway and the most significant contributor to injury in this early post-operative phase. Mitigating pyroptosis in the early post-operative period following lung transplantation is a potential novel way to prevent poor allograft function and improve outcomes for all recipients. Here we detail the pyroptotic pathway, its importance in lung transplantation, and several therapeutic modalities that can mitigate this harmful inflammatory pathway.
ABSTRACT
Background: Lung transplantation (LTx) is a well-established option for patients with end-stage lung disease that is not responsive to other treatments. Although the survival rate after LTx has increased significantly, exercise tolerance remains limited and poses a major obstacle to recovery after LTx. Pulmonary rehabilitation (PR) is a comprehensive intervention with many benefits for patients with chronic respiratory disease. However, the evidence for the effectiveness of PR in adult LTx recipients is inconclusive. We performed this meta-analysis to assess the efficacy of PR in adult LTx recipients. Methods: Eligible randomized controlled trials (RCTs) and quasi-experimental studies published until March 25, 2024 were searched in MEDLINE, Embase, Web of Science, and CINAHL. Additionally, reference lists and published systematic reviews were scanned by manual searching. Study selection, data extraction, and risk-of-bias assessment were conducted independently. Stata software (version 17.0) was used. Results: Twenty-one studies (9 RCTs and 12 quasi-experimental studies) were identified. Pooled analysis showed that PR had a positive effect on 6-minute walking distance (6MWD) [standardized mean difference (SMD) = 1.28, 95% confidence interval (CI): 1.05-1.50, P<0.001], maximum oxygen consumption (VO2max) (SMD = 0.42, 95% CI: 0.15-0.68, P=0.002), handgrip force (HGF) (SMD = 0.49, 95% CI: 0.26-0.73, P<0.001), and quadriceps force (QF) (SMD = 0.63, 95% CI: 0.45-0.82, P<0.001). There was no significant publication bias in these outcomes. Conclusions: PR shows evidence of being an effective adjunctive strategy for improving exercise capacity after LTx, but multi-center trials in larger populations are required to confirm its clinical benefits in the real-world setting.
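Pooled effect sizes like the SMDs above are typically produced by inverse-variance weighting. The sketch below implements DerSimonian-Laird random-effects pooling on hypothetical per-study SMDs and standard errors; the study itself used Stata, and these numbers are purely illustrative.

```python
import numpy as np

def pool_smd(smd, se):
    """DerSimonian-Laird random-effects pooling of standardized mean
    differences. Returns the pooled estimate and a 95% CI."""
    smd, se = np.asarray(smd, float), np.asarray(se, float)
    w = 1.0 / se**2                        # fixed-effect weights
    fixed = np.sum(w * smd) / np.sum(w)
    q = np.sum(w * (smd - fixed) ** 2)     # Cochran's Q heterogeneity
    df = len(smd) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_re = 1.0 / (se**2 + tau2)            # random-effects weights
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Illustrative 6MWD effect sizes from three hypothetical studies.
est, ci = pool_smd([1.1, 1.4, 1.2], [0.20, 0.25, 0.15])
```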
ABSTRACT
BACKGROUND: Few tools exist for early identification of patients at risk for chronic lung allograft dysfunction (CLAD). We previously showed that hyaluronan (HA), a matrix molecule that regulates lung inflammation and fibrosis, accumulates in bronchoalveolar lavage fluid (BALF) and blood in CLAD. We aimed to determine whether early posttransplant HA elevations inform CLAD risk. METHODS: HA was quantified in 3080 BALF and 1323 blood samples collected over the first posttransplant year in 743 adult lung recipients at 5 centers. The relationship between BALF or blood HA and CLAD was assessed using Cox models with a time-dependent binary covariate for "elevated" HA. Potential thresholds for elevated HA were examined using a grid search between the 50th and 85th percentiles. The optimal threshold was identified using fit statistics, and the association between the selected threshold and CLAD was internally validated through iterative resampling. A multivariable Cox model using the selected threshold was performed to evaluate the association of elevated HA with CLAD while accounting for other factors that may influence CLAD risk. RESULTS: BALF HA levels >19.1 ng/mL (65th percentile) had the largest hazard ratio for CLAD (HR 1.70, 95% CI 1.25-1.31; p<0.001), optimized fit statistics, and demonstrated robust reproducibility. In a multivariable model, the occurrence of BALF HA >19.1 ng/mL in the first posttransplant year conferred a 66% increase in the hazard for CLAD (adjusted HR 1.66, 95% CI 1.19-2.32; p=0.003). Blood HA was not significantly associated with CLAD. CONCLUSIONS: We identified and validated a precise threshold for BALF HA in the first posttransplant year that distinguishes patients at increased CLAD risk.
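The percentile grid search described above can be sketched in miniature. Note the simplifications: the study fit Cox models with a time-dependent covariate, whereas this sketch dichotomizes a baseline biomarker and scores each candidate threshold with a plain two-group log-rank statistic on synthetic data; all values are invented for illustration.

```python
import numpy as np

def logrank_chi2(time, event, group):
    """Two-group log-rank chi-square statistic (numpy-only sketch)."""
    time = np.asarray(time, float)
    event = np.asarray(event, bool)
    group = np.asarray(group, bool)
    obs1 = exp1 = var = 0.0
    for t in np.unique(time[event]):
        at_risk = time >= t
        d = np.sum((time == t) & event)            # events at time t
        d1 = np.sum((time == t) & event & group)   # events in group 1
        n, n1 = at_risk.sum(), (at_risk & group).sum()
        obs1 += d1
        exp1 += d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return (obs1 - exp1) ** 2 / var

# Synthetic cohort: CLAD-free time is shorter when the baseline biomarker
# exceeds its 65th percentile (values are illustrative, not study data).
rng = np.random.default_rng(2)
n = 300
ha = rng.lognormal(3.0, 0.5, n)                 # e.g. BALF HA, ng/mL
high = ha > np.percentile(ha, 65)
time = rng.exponential(np.where(high, 2.0, 5.0))
event = rng.random(n) < 0.8                     # ~20% censored

# Grid-search candidate thresholds between the 50th and 85th percentiles.
grid = range(50, 86, 5)
best = max(grid, key=lambda p: logrank_chi2(time, event, ha > np.percentile(ha, p)))
```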
ABSTRACT
BACKGROUND: Lung transplantation is a vital option for patients with end-stage lung disease. However, it faces a significant challenge due to the shortage of compatible donors, which particularly affects individuals with small chest cavities and pediatric patients. The novel approach of cadaveric lobar lung transplantation is a promising solution to alleviate the donor shortage crisis. Both the mid-term and long-term outcomes of lobar lung transplantation are comparable to those of standard lung transplantation. However, patients undergoing lobar lung transplantation have a significantly higher reported rate of primary graft dysfunction than patients undergoing standard lung transplantation. Therefore, careful donor selection is critical to improve outcomes after lobar transplantation. However, no established method exists to evaluate each lung lobar graft of deceased donors. This report describes a case of cadaveric lobar lung transplantation performed to overcome size mismatch and donor shortage, with particular emphasis on lobar graft evaluation. CASE PRESENTATION: A 39-year-old woman with scleroderma-related respiratory failure was listed for deceased-donor lung transplantation due to rapidly progressing disease. Faced with a long waiting list and impending mortality, she underwent bilateral living-donor lobar lung transplantation with lobes donated by her relatives. Post-transplant complications included progressive pulmonary vein obstruction and pleural effusion, which ultimately required retransplantation. An oversized donor with pneumonia in the bilateral lower lobes was allocated. Lung ultrasound was used to evaluate each lung lobar graft during procurement. The right upper and middle lobes and left upper lobe were confirmed to be transplantable, and lobar lung redo transplantation was performed. The patient's post-transplant course was uneventful, and she was discharged home and returned to her daily activities.
CONCLUSIONS: This case highlights the clinical impact of cadaveric lobar lung transplantation as a feasible and effective strategy to overcome the shortage of donor lungs, especially in patients with small thoracic cavities. By establishing donor lung evaluation techniques and overcoming anatomical and logistical challenges, cadaveric lobar lung transplantation can significantly expand the donor pool and offer hope to those previously considered ineligible for transplantation.
ABSTRACT
BACKGROUND/OBJECTIVES: Neutropenia is a frequent complication among solid organ transplant (SOT) recipients receiving immunosuppressive therapy and antimicrobial prophylaxis. However, there are limited studies analysing the frequency and impact of neutropenia in lung transplant recipients (LTRs). Our aim was to analyse the frequency of neutropenia and the need for granulocyte colony-stimulating factor (GCSF) treatment within the first 18 months post-transplant, and their association with acute rejection, chronic lung allograft dysfunction (CLAD), overall survival and the development of infections. METHODS: This retrospective observational study recruited 305 patients who underwent lung transplantation between 2009 and 2019, with quarterly outpatient follow-up during the first 18 months post-surgery. RESULTS: During this period, 51.8% of patients experienced at least one episode of neutropenia. Neutropenia was classified as mild in 50.57% of cases, moderate in 36.88% and severe in 12.54%. GCSF treatment was indicated in 23.28% of patients, with a mean dose of 3.53 units. No statistically significant association was observed between neutropenia or its severity and the development of acute rejection, CLAD or overall survival. However, patients who received GCSF treatment had a higher mortality rate than those who did not. Sixteen patients (5.25%) developed infections during neutropenia, with bacterial infections being the most common. CONCLUSIONS: Neutropenia is common in the first 18 months after lung transplantation, and most episodes are mild. We did not find an association between neutropenia and acute rejection, CLAD, or mortality. However, the use of GCSF was associated with worse post-transplant survival.
ABSTRACT
Mastering and monitoring immunosuppressant concentrations is central to the care of lung transplant patients and involves multiple stakeholders. Our objective was to conduct a risk analysis evaluating the impact of the corrective actions taken. The lung transplantation team was convened to carry out a failure mode and effects analysis (FMEA). The process was divided into stages in which different risks were identified. The risk priority number (RPN) (severity, frequency, detectability) and criticality level (frequency, severity) were established before implementation of the actions (before 2009) and after (in 2022) to classify risks into four levels of criticality. The implemented actions included the establishment of a quality assurance process, computerization of monitoring, and double analysis by a physician/pharmacist pair. Thirty-two risks were identified across the four stages of the process: biological sampling (n=13), reception (n=3), analysis and treatment of levels (n=5), and transmission of information/prescriptions to the patient (n=11). The total raw RPN (before 2009) was 839, with 12 major risks. The current total RPN (in 2022) was 452 (a decrease of 46.1%), with 7 major risks identified. The analysis enabled objective evaluation of the effectiveness of the actions taken. The most secure stage of the process is the reception of trough level results. Efforts should focus on empowering and involving patients, as well as engaging local stakeholders in collaboration with the specialized transplantation team.
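The FMEA scores used above are simple products of ordinal ratings: RPN = severity × frequency × detectability, and criticality = severity × frequency. A minimal sketch, assuming conventional 1-10 scales; the risk names and scores below are invented examples, not the team's actual items:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One identified risk, scored on illustrative 1-10 FMEA scales."""
    name: str
    severity: int       # impact if the failure occurs
    frequency: int      # how often it occurs
    detectability: int  # 10 = hard to detect before reaching the patient

    @property
    def rpn(self) -> int:
        # Risk priority number: severity x frequency x detectability.
        return self.severity * self.frequency * self.detectability

    @property
    def criticality(self) -> int:
        # Criticality level: severity x frequency.
        return self.severity * self.frequency

risks = [
    FailureMode("sample drawn at wrong time relative to dose", 7, 5, 6),
    FailureMode("result transcription error", 8, 3, 4),
    FailureMode("prescription not transmitted to patient", 9, 2, 5),
]
total_rpn = sum(r.rpn for r in risks)  # compared before/after actions
```

Summing RPNs before and after the corrective actions, as the study did (839 vs 452), gives a single number to track the programme's overall risk reduction.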
ABSTRACT
Reducing the risk of high-grade primary graft dysfunction (PGD) is vital to achieve acceptable short- and long-term outcomes for recipients following lung transplantation. However, the utilization of injured lung allografts, which may confer a higher risk of PGD, must be considered due to the disparity between the increasing number of patients requiring lung transplantation and the limited donor pool. We describe a case in which highly marginal lung allografts were utilized with a good post-transplant outcome. Donor-recipient PGD risk compatibility was taken into consideration. Normothermic ex vivo lung perfusion (EVLP) was utilized to functionally assess the allografts. A second cold ischemia time following EVLP was avoided by converting the EVLP mode to a hypothermic oxygenated perfusion setup from which the lungs were transplanted directly. We attempted to mitigate lung ischemia-reperfusion injury in the recipient by employing cytokine adsorption both during the EVLP and intraoperatively during the implant procedure. In this case report, we describe our hypothermic oxygenated perfusion setup on EVLP for the first time. Furthermore, we describe the utilization of cytokine adsorption in two phases of the same transplant process.
ABSTRACT
PURPOSE: Letermovir is used primarily for cytomegalovirus (CMV) prophylaxis in select hematopoietic cell or solid organ transplant recipients. The manufacturer has provided no guidance on whether letermovir can be crushed and administered via enteral tube. This study aimed to assess whether letermovir tablets could be manipulated (eg, through crushing) for enteral tube administration. METHODS: This was a retrospective, single-center review of patients who received crushed letermovir tablets administered via enteral tube for at least 7 days between April 2018 and August 2023. Data collection focused on demographics, transplant history, treatment characteristics associated with letermovir, and diagnosis of CMV viremia or disease. RESULTS: Fourteen patients met the inclusion criteria and received crushed letermovir for a median of 19 days (range, 7 to 42 days). All patients were on letermovir as CMV prophylaxis, and the majority were lung transplant recipients. On the basis of CMV serostatus at the time of transplantation, 50% of patients were classified as high risk and the other 50% as intermediate risk for CMV disease. One patient developed low-level viremia with a CMV viral load of 254 IU/mL. No patients developed CMV infection or disease while receiving crushed letermovir. CONCLUSION: On the basis of this case series, manipulation of letermovir immediate-release tablets appeared to be safe and effective.
Crushing letermovir for administration via enteral tube should be considered as an option for patients who cannot tolerate administration via the oral route.
ABSTRACT
Standard immunosuppressive therapy for lung transplant recipients combines a calcineurin inhibitor, an antimetabolite, and corticosteroids. In an observational, retrospective, monocentric study, we sought to compare the development of chronic lung allograft dysfunction (CLAD) between 37 patients who received this standard therapy (triple-therapy group) and 59 patients who received the mammalian target of rapamycin (mTOR) inhibitor everolimus in addition to the standard therapy (quadruple-therapy group). In the quadruple-therapy group, the time elapsed from transplantation to everolimus introduction (median [25th-75th percentile]) was 12 [7-25] months. In 46/59 cases, the indication for everolimus introduction was renal function sparing. Median follow-up durations were 36 [20-62] months and 84 [52-123] months in the triple-therapy and quadruple-therapy groups, respectively (p = 0.004). The incidence of CLAD was lower in patients receiving everolimus than in those who did not, with an adjusted odds ratio of 0.303 [0.118-0.775]. In addition, the median time from transplantation to CLAD was longer in patients receiving quadruple therapy comprising everolimus than in those who did not (63 [30-92] vs. 29 [12-44] months; p = 0.025). This suggests that the addition of everolimus to standard triple therapy could result in a lower incidence of CLAD in lung transplant recipients.
ABSTRACT
Long-term survival after lung transplantation remains limited by chronic lung allograft dysfunction (CLAD), which has two main phenotypes: bronchiolitis obliterans syndrome (BOS) and restrictive allograft syndrome (RAS). We aimed to assess CLAD lung allografts using imaging mass cytometry (IMC), a high-dimensional tissue imaging system allowing multiparametric in situ exploration at the single-cell level. Four BOS, four RAS, and four control lung samples were stained with 35 heavy-metal-tagged antibodies selected to assess structural and immune proteins of interest. We identified 50 immune and non-immune cell clusters. CLAD lungs had significantly reduced club cells. A Ki67-high basal cell population was mostly present in RAS and in proximity to memory T cells. Memory CD8+ T cells were more frequent in CLAD lungs, and regulatory T cells were more prominent in RAS. IMC is a powerful technology for detailed cellular analysis within intact organ structures that may shed further light on CLAD mechanisms.
ABSTRACT
BACKGROUND: Recent clinical series on uncontrolled donation after circulatory death (uDCD) reported successful transplantation of lungs preserved by pulmonary inflation up to 3 hours postmortem. This study investigates the additive effects of in situ lowering of intrathoracic temperature and sevoflurane preconditioning on lung grafts in a porcine uDCD model. METHODS: After uDCD induction, donor pigs were allocated to one of the following groups: control (static lung inflation only, SLI); TC (SLI plus continuous intrapleural topical cooling); or TC+Sevo (SLI plus TC plus sevoflurane). Lungs were retrieved 6 hours post-asystole and evaluated via ex vivo lung perfusion (EVLP) for 6 hours. A left single-lung transplant was performed using lungs from the best-performing group, followed by 4 hours of graft evaluation. RESULTS: Animals that received TC achieved intrathoracic temperatures <15°C within 1 hour of filling the chest with coolant. Only lungs from donors that received TC and TC+Sevo completed the planned 6-hour postpreservation EVLP assessment. Despite similar early performance of the two groups on EVLP, the TC+Sevo group was superior, with overall lower airway pressures, higher pulmonary compliance, less edema development, and less inflammation. Transplantation was performed using lungs from the TC+Sevo group, and excellent graft function was observed post-reperfusion. CONCLUSIONS: Preservation of uDCD lungs with a combination of static lung inflation, topical cooling, and sevoflurane treatment maintains good pulmonary function up to 6 hours postmortem, with excellent early post-transplant lung function. These interventions may significantly expand the clinical utilization of uDCD donor lungs.
ABSTRACT
Background: Unilateral or bilateral anterolateral thoracotomy may lead to severe acute pain in lung transplantation (LTx). Although the serratus anterior plane block (SAPB) is apparently effective for pain control after open thoracic surgery, there remains a lack of evidence for the application of SAPB for postoperative analgesia after LTx. Objective: In this case-series pilot study, we describe the feasibility of continuous SAPB after lung transplantation and provide a preliminary investigation of its safety and efficacy. Methods: After chest incision closure was complete, all patients underwent ultrasound-guided SAPB with catheter insertion. Numerical rating scale (NRS) scores, additional opioid consumption, time to endotracheal tube removal, ICU length of stay, and catheter-related adverse events were followed up and recorded for each patient within 1 week after the procedure. Results: A total of 14 patients who received LTx at this center from August 2023 to November 2023 were included. All patients received anterolateral approaches, and 10 (71.4%) underwent bilateral LTx. The duration of catheter placement was 2 (2-3) days, and the resting NRS during catheter placement was 4 or less. A total of 11 patients (78.6%) were supported by extracorporeal membrane oxygenation (ECMO) during LTx, and 8 patients (57.1%) had the tracheal tube removed on the first day after LTx. Intensive care unit (ICU) stay was 5 (3-6) days, with tracheal intubation retained for 1 (1-2) days, and only one patient was reintubated. The morphine equivalent dose (MED) in the first week after LTx was 11.95 mg, and no catheter-related adverse events were detected. Limitations: We did not assess the plane of sensory loss due to the retrospective design. In addition, differences in catheter placement time may lead to bias in pain assessment.
Conclusion: Although continuous SAPB may be a safe and effective fascial plane block technique for relieving acute pain after LTx, this should be confirmed by high-quality clinical studies.
ABSTRACT
OBJECTIVE: Interstitial lung diseases (ILDs) are diverse pulmonary disorders marked by diffuse lung inflammation and fibrosis. The variability in characteristics and treatment approaches complicates diagnosis and management. In advanced cases requiring transplantation, determining indications and selecting suitable candidates present additional challenges. METHODS: All patients with non-IPF ILD between December 2016 and December 2022 were analyzed retrospectively. Patients were categorized into two groups: transplanted patients and patients who died on the waiting list. Clinical data and survival outcomes were compared between groups. RESULTS: Of the 43 patients, 20 underwent lung transplantation while 23 died awaiting transplantation. Waiting-list mortality was 53.4%, with similar median waiting times between groups (3 months for transplanted patients and 6 months for those who died on the waiting list). There were no significant differences between groups in age, gender, height, BMI, 6-minute walk test (6MWT), or forced vital capacity (FVC). The prevalence of pulmonary hypertension (PH) on right heart catheterization was 76.7%, similar in both groups. One single and 19 bilateral lung transplants were performed. Overall, 13 of the 20 patients survived to hospital discharge. One-year mortality was 7/20 (35%). The median follow-up was 34 months, with a 1-year conditional survival of 90.9% at 3 years and 70.7% at 5 years. CONCLUSIONS: This study underscores the importance of further research into non-IPF ILDs. Lung transplantation remains a viable option that can significantly enhance both the quality and longevity of life for patients with advanced ILD.
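Survival figures like the conditional survival rates above are conventionally derived from Kaplan-Meier estimates. A numpy-only sketch of the estimator on a toy cohort follows; the times and censoring flags are invented for illustration, not the study's patient data.

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival estimator.

    Returns a list of (event_time, survival_probability) pairs;
    event=False marks a censored observation."""
    time = np.asarray(time, float)
    event = np.asarray(event, bool)
    curve, s = [], 1.0
    for t in np.unique(time[event]):
        n = np.sum(time >= t)              # number at risk just before t
        d = np.sum((time == t) & event)    # deaths at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve

# Toy cohort: follow-up in months, 1 = death observed, 0 = censored.
time = [1, 3, 3, 6, 8, 12, 12, 15, 20, 24]
event = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
curve = kaplan_meier(time, event)
```

Reading survival at a landmark (e.g. 12 months) from such a curve, restricted to patients who already survived 1 year, is how conditional survival is obtained.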
ABSTRACT
Cytomegalovirus (CMV) infection and reactivation in solid organ transplant (SOT) recipients increase the risk of viremia, graft failure, and death. Clinical studies of CMV serostatus indicate that donor-positive/recipient-negative (D+/R-) patients have a greater viremia risk than D-/R- patients. The majority of patients are R+, carrying intermediate serologic risk. To characterize the long-term impact of CMV infection and assess viremia risk, we sought to measure the effects of CMV on the recipient immune epigenome. Specifically, we profiled DNA methylation in 156 individuals before lung or kidney transplant. We found that the methylome of CMV-positive SOT recipients is hyper-methylated at loci associated with neural development and Polycomb group (PcG) protein binding, and hypo-methylated at regions critical for the maturation of lymphocytes. In addition, we developed a machine learning-based model to predict recipient CMV serostatus after correcting for cell type composition and ancestry. This CMV episcore, measured at baseline in R+ individuals, accurately stratifies viremia risk in the lung transplant cohort, and together with serostatus the episcore could be a potential biomarker for identifying R+ patients at high viremia risk.
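An "episcore"-style classifier of the kind described above can be sketched as penalized logistic regression on methylation beta values, scored out-of-fold. Everything here is an assumption for illustration: the data are synthetic, the model choice stands in for whatever the authors used, and the corrections for cell type composition and ancestry are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)

# Synthetic methylation beta values (samples x CpG sites); a subset of
# sites shifts with CMV serostatus. Purely illustrative data.
n, p, informative = 156, 200, 20
serostatus = rng.random(n) < 0.6                  # True = R+
beta = rng.beta(2, 2, size=(n, p))
beta[:, :informative] += 0.15 * serostatus[:, None]
beta = beta.clip(0, 1)

# Penalized logistic regression predicting serostatus from methylation,
# with out-of-fold probabilities so the score is not overfit in-sample.
model = LogisticRegression(penalty="l2", C=0.1, max_iter=1000)
episcore = cross_val_predict(model, beta, serostatus, cv=5,
                             method="predict_proba")[:, 1]

# Crude separation check: mean score in R+ vs R- samples.
separation = np.mean(episcore[serostatus]) - np.mean(episcore[~serostatus])
```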