ABSTRACT
INTRODUCTION: Lung transplantation is increasingly used for end-stage lung disease (ESLD) caused by Coronavirus disease 2019 (COVID-19). While several single-center and UNOS database studies of lung transplants (LTs) for ESLD from COVID-19 have been published, multi-center and international data are lacking. METHODS: This is a multicenter analysis from 11 high-volume lung transplant centers in the United States and Europe. Data were collected through the Multi-Institutional ECLS Registry and stratified by ESLD due to COVID-19 versus other etiologies. Demographics and clinical variables were compared using the Chi-square and Fisher's exact tests. Survival was assessed by Kaplan-Meier curves and compared by log-rank test with propensity score matching. RESULTS: Of 1606 lung transplant recipients, 46 (2.9%) were transplanted for ESLD from COVID-19 compared to 1560 (97.1%) without a history of COVID-19. Among COVID-19 patients, 30 (65.2%) had COVID-19-associated ARDS and 16 (34.8%) had post-COVID-19 fibrosis. COVID-19 patients had higher lung allocation scores (78.0 vs. 44.4, p < 0.0001), more often had severely limited functional status (37.0% vs. 2.9%, p < 0.0001), had higher preoperative ECMO usage (65.2% vs. 5.4%, p < 0.0001), and spent less time on the waitlist (32 vs. 137 days, p < 0.0001). Thirty-day survival was comparable between COVID-19 and non-COVID-19 patients before (100% vs. 98.7%, p = 0.39) and after propensity matching (p = 0.15). CONCLUSIONS: Patients who received LTs due to COVID-19 had short-term survival comparable to that of patients without COVID-19. Our findings support the idea that lung transplantation should be considered for select patients with ESLD due to COVID-19.
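Survival in the abstract above is estimated with Kaplan-Meier curves. As an illustrative sketch only (not the registry's actual analysis code), the product-limit estimator behind those curves can be written in a few lines:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.

    times  : follow-up time for each patient
    events : 1 if death was observed at that time, 0 if censored
    Returns (time, estimated survival) pairs at each death time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at this time
        ties = sum(1 for tt, _ in data if tt == t)     # all subjects at this time
        if deaths:
            survival *= 1 - deaths / at_risk
            steps.append((t, survival))
        at_risk -= ties
        i += ties
    return steps
```

Group curves would then be compared with a log-rank test; real analyses use vetted implementations (e.g. the lifelines or R survival packages) rather than hand-rolled code.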
Subjects
COVID-19 , Lung Transplantation , Registries , SARS-CoV-2 , Humans , COVID-19/mortality , COVID-19/epidemiology , Lung Transplantation/mortality , Male , Female , Middle Aged , United States/epidemiology , Survival Rate , Adult , Europe/epidemiology , Retrospective Studies , Aged , Treatment Outcome
ABSTRACT
OBJECTIVE: To demonstrate the value of a viscoelastic-based intraoperative transfusion algorithm in reducing non-RBC product administration in adult cardiac surgical patients. DESIGN: A prospective observational study. SETTING: A quaternary academic teaching hospital. PARTICIPANTS: Cardiac surgical patients. INTERVENTIONS: A viscoelastic-based intraoperative transfusion algorithm. MEASUREMENTS AND MAIN RESULTS: The study authors compared intraoperative blood product transfusion rates in 184 cardiac surgical patients with those of 236 historic controls after implementing a viscoelastic-based algorithm. The authors found a nonsignificant reduction in transfusion of 23.8% for fresh frozen plasma (FFP) units (0.84 ± 1.4 v 0.64 ± 1.38; p = ns), 33.4% for platelet units (0.90 ± 1.39 v 0.60 ± 1.31; p = ns), and 15.8% for cryoprecipitate units (0.19 ± 0.54 v 0.16 ± 0.50; p = ns). They found a 43.9% reduction in red blood cell (RBC) units transfused (1.98 ± 2.24 v 0.55 ± 1.36; p = 0.008). There were no statistically significant differences in time to extubation (8.0 hours [4.0-21.0] v 8.0 [4.0-22.3]), reoperation for bleeding (15 [12.3%] v 10 [10.6%]), intensive care unit length of stay (ICU LOS) (51.0 hours [28.0-100.5] v 53.5 [33.3-99.0]), or hospital LOS (9.0 days [6.0-15.0] v 10.0 [7.0-17.0]). Deviation from algorithm adherence was 32.7% (48/147). Packed RBC, FFP, platelets, cryoprecipitate, and cell saver were significantly reduced in the Algorithm Compliant Cohort compared with historic controls, whereas times to extubation, ICU LOS, and hospital LOS did not reach significance. CONCLUSIONS: After the implementation of a viscoelastic-based algorithm, patients received fewer packed RBC, FFP, platelets, cryoprecipitate, and cell saver. Algorithm-compliant patients received fewer transfusions; however, reductions in times to extubation, ICU LOS, and hospital LOS were not statistically significant compared with historic controls.
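The relative reductions quoted above follow directly from the before/after mean unit counts; as a quick arithmetic check (illustrative only, rounding per the abstract):

```python
def percent_reduction(before, after):
    """Relative reduction between two mean transfusion counts, in percent."""
    return 100 * (before - after) / before

# Reproducing the non-RBC reductions quoted in the abstract:
ffp = percent_reduction(0.84, 0.64)        # ~23.8% (FFP units)
platelets = percent_reduction(0.90, 0.60)  # ~33.3% (reported as 33.4%)
cryo = percent_reduction(0.19, 0.16)       # ~15.8% (cryoprecipitate units)
```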
Subjects
Blood Transfusion , Cardiac Surgical Procedures , Adult , Humans , Coronary Artery Bypass , Hemorrhage , Algorithms , Retrospective Studies
ABSTRACT
Mitral valve surgery (MVS), with repair preferred to replacement, is a common procedure for the treatment of severe primary mitral regurgitation related to leaflet prolapse. Structural complications after MVS include left ventricular outflow obstruction, paravalvular leak and atrial septal defect. Intraoperative transoesophageal echocardiography and predischarge transthoracic echocardiography (TTE) specifically screen for these complications. Ventricular septal defect (VSD), a known complication after aortic valve surgery, is rarely reported after MVS. Recently, unsuccessful valvuloplasty prior to replacement was suggested as a risk factor. We present such a case and explore mechanisms with advanced cardiac imaging. In this case, the patient was found to have an elongated membranous septum that likely predisposed her to septal injury. Finally, we provide guidance on specific transoesophageal/transthoracic echocardiography views to avoid a missed diagnosis.
Subjects
Heart Septal Defects, Ventricular , Ventricular Outflow Obstruction , Humans , Female , Mitral Valve/diagnostic imaging , Mitral Valve/surgery , Ventricular Outflow Obstruction/surgery , Heart Septal Defects, Ventricular/diagnostic imaging , Heart Septal Defects, Ventricular/surgery , Heart Septal Defects, Ventricular/complications , Echocardiography , Echocardiography, Transesophageal
ABSTRACT
BACKGROUND: The Minnesota Pectoralis Risk Score (MPRS) utilizes computed tomography-quantified thoracic muscle and clinical variables to predict survival after left ventricular assist device (LVAD) implantation. The model has not been prospectively tested in HeartMate 3 recipients. METHODS: A single-center HeartMate 3 cohort from July 2016 to July 2021 (n = 108) was utilized for this analysis. Cohort subjects with complete covariates for MPRS calculation (pectoralis muscle measures, Black race, creatinine, total bilirubin, body mass index, bridge to transplant status, and presence/absence of contrast) implanted after MPRS development were included. MPRS were calculated on each subject. Receiver operating characteristic curves were generated to test model discrimination for 30-day, 90-day, and 1-year mortality post-LVAD. Next, the performance of the 1-year post-LVAD outcome was compared to the HeartMate 3 survival risk score (HM3RS). RESULTS: The mean age was 58 ± 15 years, 80% (86/108) were male, and 26% (28/108) were destination therapy. The area under the curve (AUC) for the MPRS model to predict post-LVAD mortality was 0.73 at 30 days, 0.78 at 90 days, and 0.81 at 1 year. The AUC for the HM3RS for the 1-year outcome was 0.693. Each 1-point increase in the MPRS was associated with a significant increase in the hazard rate of death after LVAD (hazard ratio 2.1, 95% confidence interval 1.5-3.0, p < 0.0001). CONCLUSIONS: The MPRS had high performance in this prospective validation, particularly with respect to 90-day and 1-year post-LVAD mortality. Such a tool can provide additional information regarding risk stratification to aid informed decision-making.
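The AUCs above summarize discrimination. As a hedged illustration of what they measure (not the study's code), ROC AUC equals the Mann-Whitney probability that a patient who died was assigned a higher risk score than a survivor:

```python
def roc_auc(scores_deceased, scores_survived):
    """ROC AUC via its Mann-Whitney interpretation: the probability that a
    randomly chosen patient who died received a higher risk score than a
    randomly chosen survivor, counting ties as one half."""
    pairs = 0
    wins = 0.0
    for d in scores_deceased:
        for s in scores_survived:
            pairs += 1
            if d > s:
                wins += 1.0
            elif d == s:
                wins += 0.5
    return wins / pairs
```

An AUC of 0.81, as reported at 1 year, means roughly four of five such random pairs are ranked correctly by the score.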
Subjects
Heart Failure , Heart-Assist Devices , Humans , Male , Middle Aged , Female , Heart Failure/surgery , Minnesota , Risk Factors , Proportional Hazards Models , Retrospective Studies , Treatment Outcome
ABSTRACT
Left ventricular assist device (LVAD) is an option for bridge-to-transplant or destination therapy for patients with end-stage heart failure. Right heart failure (RHF) remains a complication after LVAD implantation that portends high morbidity and mortality, despite advances in LVAD technology. Definitions of RHF vary, but generally include the need for inotropic or pulmonary vasodilator support, or potential right ventricular (RV) mechanical circulatory support. This review covers the complex pathophysiology of RHF related to underlying myocardial dysfunction, interventricular dependence, and RV afterload, as well as treatment strategies to curtail this challenging problem.
ABSTRACT
BACKGROUND: Data showing the efficacy and safety of the transplantation of hearts obtained from donors after circulatory death as compared with hearts obtained from donors after brain death are limited. METHODS: We conducted a randomized, noninferiority trial in which adult candidates for heart transplantation were assigned in a 3:1 ratio to receive a heart after the circulatory death of the donor or a heart from a donor after brain death if that heart was available first (circulatory-death group) or to receive only a heart that had been preserved with the use of traditional cold storage after the brain death of the donor (brain-death group). The primary end point was the risk-adjusted survival at 6 months in the as-treated circulatory-death group as compared with the brain-death group. The primary safety end point was serious adverse events associated with the heart graft at 30 days after transplantation. RESULTS: A total of 180 patients underwent transplantation; 90 (assigned to the circulatory-death group) received a heart donated after circulatory death and 90 (regardless of group assignment) received a heart donated after brain death. A total of 166 transplant recipients were included in the as-treated primary analysis (80 who received a heart from a circulatory-death donor and 86 who received a heart from a brain-death donor). The risk-adjusted 6-month survival in the as-treated population was 94% (95% confidence interval [CI], 88 to 99) among recipients of a heart from a circulatory-death donor, as compared with 90% (95% CI, 84 to 97) among recipients of a heart from a brain-death donor (least-squares mean difference, -3 percentage points; 90% CI, -10 to 3; P<0.001 for noninferiority [margin, 20 percentage points]). There were no substantial between-group differences in the mean per-patient number of serious adverse events associated with the heart graft at 30 days after transplantation. 
CONCLUSIONS: In this trial, risk-adjusted survival at 6 months after transplantation with a donor heart that had been reanimated and assessed with the use of extracorporeal nonischemic perfusion after circulatory death was not inferior to that after standard-care transplantation with a donor heart that had been preserved with the use of cold storage after brain death. (Funded by TransMedics; ClinicalTrials.gov number, NCT03831048.).
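The noninferiority logic above can be sketched in one comparison. The sign convention (experimental-minus-control survival difference, lower CI bound tested against the negated margin) is an assumption for illustration, not the trial's statistical analysis plan:

```python
def noninferior(ci_lower, margin):
    """Noninferiority on a survival difference in percentage points
    (experimental minus control; sign convention assumed here):
    declared when the lower CI bound stays above -margin."""
    return ci_lower > -margin

# Figures from the trial above: margin of 20 percentage points and a
# 90% CI of -10 to 3 for the survival difference
dcd_noninferior = noninferior(-10, 20)
```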
Subjects
Brain Death , Heart Transplantation , Tissue and Organ Procurement , Adult , Humans , Graft Survival , Organ Preservation , Tissue Donors , Death , Patient Safety
ABSTRACT
A single-center continuous-flow left ventricular assist device (LVAD) cohort (n = 503) was reviewed for patients with information on cardiac rehabilitation (CR) participation (n = 273) over a 13-year period. The analysis was then limited to LVAD recipients who fit into three main CR categories: those who graduated CR (n = 138), those who were able to but declined participation (n = 61), and those who were too sick to complete or start CR (n = 28). To assess the association between CR categories and mortality and hospitalizations on LVAD support, multivariable Cox regression and negative binomial regression analyses were performed, respectively. Among those who started CR and had the opportunity to finish (enough follow-up time, insurance coverage), 79% graduated. Those who graduated CR had a 96% survival at 1 year (95% confidence interval [CI], 91-98). Compared with the graduated group, those in the too sick group had an increased hazard rate of mortality (hazard ratio, 2.85; 95% CI, 1.49-5.44; p < 0.01) and an increase in the incidence rate of hospitalizations (incidence rate ratio, 1.74; 95% CI, 1.14-2.66, p = 0.01). This study is the largest to date to report outcomes of LVAD recipients referred for CR. The lower readmission rates and high survival in the group that graduated CR provide further evidence for the safety of CR in LVAD recipients.
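The negative binomial model above yields an incidence rate ratio for hospitalizations; in its simplest unadjusted form it is just a ratio of event rates. The counts below are hypothetical, not study data:

```python
def incidence_rate_ratio(events_a, time_a, events_b, time_b):
    """Ratio of event rates (events per unit of person-time) between groups."""
    return (events_a / time_a) / (events_b / time_b)

# Hypothetical example: 30 admissions over 20 patient-years versus
# 15 admissions over 20 patient-years
irr = incidence_rate_ratio(30, 20, 15, 20)
```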
Subjects
Cardiac Rehabilitation , Heart Failure , Heart-Assist Devices , Humans , Heart-Assist Devices/adverse effects , Heart Failure/surgery , Heart Failure/epidemiology , Retrospective Studies , Proportional Hazards Models , Treatment Outcome
ABSTRACT
AIMS: The long-term outcomes of patients treated with extracorporeal cardiopulmonary resuscitation (ECPR) for refractory ventricular tachycardia/ventricular fibrillation (VT/VF) out-of-hospital cardiac arrest (OHCA) remain poorly defined. The purpose of this study was to describe the hospital length of stay and long-term survival of patients who were successfully rescued with ECPR after refractory VT/VF OHCA. METHODS AND RESULTS: In this retrospective cohort study, the length of index admission and long-term survival of patients treated with ECPR after OHCA at a single centre were evaluated. In a sensitivity analysis, survival of patients managed with left ventricular assist device (LVAD) implantation or heart transplantation during the same period was also evaluated. Between 1 January 2016 and 12 January 2020, 193 patients were transferred for ECPR considerations and 160 underwent peripheral veno-arterial extracorporeal membrane oxygenation cannulation. Of these, 54 (33.7%) survived the index admission. These survivors required a median 16 days of intensive care and 24 days total hospital stay. The median follow-up time of the survivors was 1216 (683, 1461) days. Overall, 79.6% and 72.2% were alive at 1 and 4 years, respectively. Most deaths within the first year occurred among the patients requiring discharge to a long-term acute care facility. Overall survival rates at 4 years were similar in the ECPR and LVAD cohorts (P = 0.30) but were significantly higher for transplant recipients (P < 0.001). CONCLUSION: These data suggest that the lengthy index hospitalization required to manage OHCA patients with ECPR is rewarded by excellent long-term clinical outcomes in an expert ECPR programme.
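Follow-up above is reported as a median with interquartile range; an illustrative helper using Python's statistics module (quartile conventions vary, so exact IQR bounds depend on the method chosen):

```python
import statistics

def median_iqr(values):
    """Median and interquartile range, the summary used above for
    follow-up time and hospital stay."""
    q1, _, q3 = statistics.quantiles(values, n=4, method="inclusive")
    return statistics.median(values), (q1, q3)
```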
Subjects
Cardiopulmonary Resuscitation , Out-of-Hospital Cardiac Arrest , Humans , Out-of-Hospital Cardiac Arrest/therapy , Retrospective Studies , Length of Stay , Cardiopulmonary Resuscitation/methods , Hospitals
ABSTRACT
BACKGROUND: Acute kidney injury (AKI) after repair of type A acute aortic dissection (TAAAD) has been shown to affect both short- and long-term outcomes. This study aimed to validate the impact of postoperative AKI on in-hospital and long-term outcomes in a large population of dissection patients presenting to multinational aortic centers. Additionally, we assessed risk factors for AKI including surgical details. METHODS: Patients undergoing surgical repair for TAAAD enrolled in the International Registry of Acute Aortic Dissection database were evaluated to determine the incidence and risk factors for the development of AKI. RESULTS: A total of 3307 patients were identified. There were 761 (23%) patients with postoperative AKI (AKI group) vs 2546 patients without (77%, non-AKI group). The AKI group had a higher rate of in-hospital mortality (n = 193, 25.4% vs n = 122, 4.8% in the non-AKI group, P < .001). Additional postoperative complications were also more common in the AKI group including postoperative cerebrovascular accident, reexploration for bleeding, and prolonged ventilation. Independent baseline characteristics associated with AKI included a history of hypertension, diabetes, chronic kidney disease, evidence of malperfusion on presentation, distal extent of dissection to abdominal aorta, and longer cardiopulmonary bypass time. Kaplan-Meier survival curves revealed decreased 5-year survival among the AKI group (P < .001). CONCLUSIONS: AKI occurs commonly after TAAAD repair and is associated with a significantly increased risk of operative and long-term mortality. In this large study using the International Registry of Acute Aortic Dissection database, several factors were elucidated that may affect risk of AKI.
Subjects
Acute Kidney Injury , Aortic Dissection , Humans , Retrospective Studies , Aortic Dissection/surgery , Risk Factors , Acute Kidney Injury/epidemiology , Acute Kidney Injury/etiology , Aorta , Postoperative Complications/etiology
ABSTRACT
Warfarin is the only approved anticoagulant after mechanical valve replacement, but it is a well-described risk factor for calciphylaxis among patients with end-stage kidney disease. Our patient with end-stage kidney disease rapidly developed calciphylaxis after dual mechanical valve replacement in association with warfarin initiation, posing significant challenges in clinical management and culminating in a fatal outcome. (Level of Difficulty: Intermediate.)
ABSTRACT
AIMS: Cardiopulmonary stress test (CPX) is routinely performed when evaluating patient candidacy for left ventricular assist device (LVAD) implantation. The predictive value of hypotensive systolic blood pressure (SBP) response during CPX on clinical outcomes is unknown. This study aims to determine the effect of hypotensive SBP response during CPX on clinical outcomes among patients who underwent LVAD implantation. METHODS AND RESULTS: This was a retrospective single-center study enrolling consecutive patients implanted with a continuous flow LVAD between 2011 and 2022. Hypotensive SBP response was defined as peak exercise SBP below the resting value. Multivariable Cox regression analysis was performed to evaluate the relationship between hypotensive SBP response and all-cause mortality within 30 and 90 days of LVAD implantation. A subgroup analysis was performed for patients implanted with a HeartMate III (HM III) device. Four hundred thirty-two patients underwent LVAD implantation during the pre-defined period and 156 with INTERMACS profiles 3-6 met our inclusion criteria. The median age was 63 years (IQR 54-69), and 52% had ischaemic cardiomyopathy. Hypotensive SBP response was present in 35% of patients and was associated with increased 90 day all-cause mortality (unadjusted HR 9.16, 95% CI 1.98-42; P = 0.0046). The hazard ratio remained significant after adjusting for age, INTERMACS profile, serum creatinine, and total bilirubin. Findings were similar in the HM III subgroup. CONCLUSIONS: Hypotensive SBP response on pre-LVAD CPX is associated with increased perioperative and 90 day mortality after LVAD implantation. Additional studies are needed to determine the mechanism of the increased mortality observed.
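The exposure definition above is simple to encode; this is a direct restatement of the study definition (peak exercise SBP below the resting value), not the authors' code:

```python
def hypotensive_sbp_response(resting_sbp, peak_exercise_sbp):
    """Study definition: peak exercise SBP (mmHg) below the resting value."""
    return peak_exercise_sbp < resting_sbp
```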
Subjects
Heart Failure , Heart-Assist Devices , Hypotension , Humans , Middle Aged , Exercise Test , Retrospective Studies , Heart-Assist Devices/adverse effects , Hypotension/complications
ABSTRACT
Background and Aims: Transcatheter aortic valve replacement (TAVR) is the mainstay of treatment for severe aortic stenosis in inoperable and high-risk patients and is noninferior to surgical aortic valve replacement (SAVR) for low-risk and intermediate-risk patients as well. We aim to compare the valve size, area, and transaortic mean gradients in SAVR patients before and after the implementation of TAVR since its approval by the Food and Drug Administration in 2011. Methods: Patients who underwent a bioprosthetic SAVR placement were divided into two groups based on the date of procedure: the early pre-TAVR implementation group (years 2011-2012) and the contemporary post-TAVR group (years 2019-2020). The primary endpoint was the mean gradient across the aortic valve within 16 months of surgery. The secondary endpoints included the difference in valve size and various aortic valve echocardiographic variables. Results: One hundred and thirty patients had their valves replaced in the years 2011-2012 and 134 in the years 2019-2020. The early group had a significantly higher mean gradient (median of 13 mmHg [interquartile range, IQR: 9.3-18] vs. 10 mmHg [IQR: 7.5-13.1], p = 0.001) and a smaller median effective orifice area index (0.8 cm2/m2 [IQR: 0.6-1] vs. 1.1 cm2/m2 [IQR: 0.8-1.3], p < 0.001). The median valve size was significantly smaller in the early group (median of 21 mm [IQR: 21-23] vs. 23 mm [IQR: 22.5-25], p < 0.001). Conclusion: In the contemporary era, surgical patients receive larger valves, which translates into lower mean gradients, larger valve areas, and lower rates of patient-prosthesis mismatch than in previous years before the routine introduction of TAVR.
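The effective orifice area index compared above is orifice area normalized to body surface area. The mismatch thresholds in the sketch below (severe ≤ 0.65, moderate ≤ 0.85 cm²/m² for aortic prostheses) are commonly cited literature values assumed for illustration; they are not stated in the abstract:

```python
def eoa_index(eoa_cm2, bsa_m2):
    """Effective orifice area indexed to body surface area (cm^2/m^2)."""
    return eoa_cm2 / bsa_m2

def ppm_grade(eoai):
    """Patient-prosthesis mismatch grade for an aortic prosthesis.
    Thresholds assumed from the general literature, not this abstract."""
    if eoai <= 0.65:
        return "severe"
    if eoai <= 0.85:
        return "moderate"
    return "none"
```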
ABSTRACT
OBJECTIVES: Does point-of-care viscoelastic testing in patients undergoing left ventricular assist device implantation or orthotopic heart transplantation reduce non-red blood cell transfusion or improve postoperative outcomes? DESIGN: A retrospective observational study. SETTING: A single-center tertiary university hospital. PARTICIPANTS: Patients undergoing left ventricular assist device placement or heart transplantation. INTERVENTIONS: The authors implemented a TEG-based transfusion algorithm to reduce non-red cell transfusion rates compared with historical controls. MEASUREMENTS AND MAIN RESULTS: From May 15, 2019, through March 20, 2020, 68 patients underwent left ventricular assist device placement or heart transplantation. Algorithm adherence was 49.2%. After adjusting for relevant variables, platelet (odds ratio [OR] 0.58 [0.39-0.84]; p = 0.004) and cryoprecipitate (OR 0.37 [0.19-0.72]; p = 0.004) transfusion rates and time to extubation (OR -14.1 [-25.8 to -2.3]; p = 0.020) were significantly reduced compared with historical controls. After adjusting for relevant clinical variables, there was a statistically significant reduction in plasma (median [interquartile range] 0.16 [0.07-0.36], p < 0.001), platelet (0.06 [0.02-0.21], p < 0.001), and cryoprecipitate (0.06 [0.01-0.47], p = 0.007) transfusion rates and time to extubation (-16.95 [-27.20 to -6.71], p = 0.002) compared with historical controls. CONCLUSIONS: The authors report a statistically significant reduction in transfusion of platelets and cryoprecipitate and time to extubation after adjusting for relevant clinical variables compared with historical controls, and a significant reduction in the transfusion of plasma, platelets, and cryoprecipitate and time to extubation in those patients for whom the transfusion algorithm was followed.
Their results suggest the importance of implementing transfusion algorithms for patients undergoing heart transplantation and left ventricular assist device placement and of accounting for adherence.
Subjects
Heart Transplantation , Heart-Assist Devices , Algorithms , Blood Transfusion , Humans , Retrospective Studies
ABSTRACT
Data on survival and adverse events of cardiogenic shock (CS) patients supported with axillary or subclavian artery 5.0 Impella are presently unavailable. We performed a systematic search of studies reporting the outcomes of axillary or subclavian access 5.0 Impella for refractory CS in PubMed, EMBASE, and the Cochrane Library. The primary outcome was 30-day survival. Secondary outcomes included survival to next therapy and adverse events on support. Proportional meta-analysis was used to pool across studies. Of the 795 potential studies identified, 13 studies were included in the meta-analysis (n = 256 patients). The average age of patients across studies was 56 ± 5 years. Thirty-day survival for the overall cohort was 66% (95% CI: 59-73). Survival to the next therapy was 68% (95% CI: 60-76). The occurrence of adverse events over an average of 13 (95% CI: 12-14) days of support was as follows: stroke 5.9%, hemolysis 27%, pump thrombosis 4.4%, limb ischemia 0.1%, major bleeding 5.4%, device malfunction 10.6%, exchange 6.6%, and infection 14%. In this systematic review and meta-analysis, we report survival and adverse event rates of axillary or subclavian access 5.0 Impella for CS. Such summary data can inform clinician decision-making.
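Proportional meta-analysis pools per-study proportions into one summary estimate. A minimal fixed-effect, inverse-variance sketch is shown below; the published analysis may well use logit or arcsine transforms and random effects instead, so this is illustrative only:

```python
def pooled_proportion(studies):
    """Fixed-effect inverse-variance pooled proportion.

    studies: list of (events, n) pairs; assumes 0 < events < n so each
    study's variance p(1-p)/n is nonzero (transformed scales handle
    boundary proportions better in practice).
    """
    total_weight = 0.0
    weighted_sum = 0.0
    for events, n in studies:
        p = events / n
        weight = n / (p * (1 - p))  # 1 / var(p), with var = p(1-p)/n
        total_weight += weight
        weighted_sum += weight * p
    return weighted_sum / total_weight
```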
Subjects
Heart-Assist Devices , Shock, Cardiogenic , Heart-Assist Devices/adverse effects , Hemorrhage , Humans , Middle Aged , Retrospective Studies , Shock, Cardiogenic/therapy , Treatment Outcome
Subjects
COVID-19/complications , Heart Failure/complications , Heart Ventricles/physiopathology , Heart-Assist Devices/adverse effects , Thrombosis/complications , Heart Failure/diagnosis , Heart Failure/physiopathology , Humans , Male , Middle Aged , SARS-CoV-2/pathogenicity , Thrombosis/diagnosis
ABSTRACT
BACKGROUND: Previous studies have noted significant variation in serum urate (sUA) levels, and it is unknown how this influences the accuracy of hyperuricemia classification based on single data points. Despite this known variability, hyperuricemic patients are often used as a control group in gout studies. Our objective was to determine the accuracy of hyperuricemia classifications based on single data points versus multiple data points given the degree of variability observed with serial measurements of sUA. METHODS: Data were analyzed from a cross-over clinical trial of urate-lowering therapy in young adults without a gout diagnosis. In the control phase, sUA levels used for this analysis were collected at 2-4 week intervals. Mean coefficient of variation for sUA was determined, as were rates of conversion between normouricemia (sUA ≤6.8 mg/dL) and hyperuricemia (sUA > 6.8 mg/dL). RESULTS: Mean participant age (n = 85) was 27.8 ± 7.0 years; 39% of participants were female and 41% were African-American. Mean sUA coefficient of variation was 8.5% ± 4.9% (range, 1%-23%). There was no significant difference in variation between men and women, or between participants initially normouricemic and those who were initially hyperuricemic. Among those initially normouricemic (n = 72), 21% converted to hyperuricemia during at least one subsequent measurement. The subgroup with initial sUA < 6.0 (n = 54) was much less likely to have future values in the range of hyperuricemia compared to the group with screening sUA values between 6.0 and 6.8 (n = 18) (7% vs 39%, p = 0.0037). Of the participants initially hyperuricemic (n = 13), 46% were later normouricemic during at least one measurement. CONCLUSION: Single sUA measurements were unreliable in hyperuricemia classification due to spontaneous variation.
If a single measurement must nonetheless be used for classification, participants with an sUA < 6.0 mg/dL were less likely to demonstrate future hyperuricemic measurements, so this could be considered a safer threshold for ruling out intermittent hyperuricemia from a single measurement point. TRIAL REGISTRATION: Data from parent study, ClinicalTrials.gov Identifier: NCT02038179.
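The variability metric above is the within-person coefficient of variation; together with the study's 6.8 mg/dL cutoff, the two quantities driving the classification problem can be sketched as:

```python
import statistics

URATE_CUTOFF = 6.8  # mg/dL, the hyperuricemia threshold used in the study

def coefficient_of_variation(sua_values):
    """Within-person coefficient of variation of serial sUA values, percent
    (sample standard deviation divided by the mean)."""
    return 100 * statistics.stdev(sua_values) / statistics.mean(sua_values)

def hyperuricemic(sua):
    """Single-measurement classification per the study: sUA > 6.8 mg/dL."""
    return sua > URATE_CUTOFF
```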
ABSTRACT
BACKGROUND: The impact of pulmonary hypertension (PH) on outcomes after surgical tricuspid valve replacement (TVR) and repair (TVr) is unclear. We sought to characterize PH in patients undergoing TVR/TVr based on invasive hemodynamics, and to evaluate the effect of PH on mortality. METHODS: We identified 86 consecutive patients who underwent TVR/TVr with invasive hemodynamic measurements within 3 months before surgery. We used Kaplan-Meier survival and restricted mean survival time (RMST) analyses to quantify the effects of PH on survival. RESULTS: The mean age was 63 ± 13 years, 59% were female, 45% had TVR, 55% had TVr, 39.5% had isolated TVR/TVr, and 60.5% had TVR/TVr concomitant with other cardiac surgeries. Eighty-six percent of these patients had PH with a mean pulmonary artery pressure of 30 ± 10 mm Hg, pulmonary vascular resistance (PVR) of 2.5 (interquartile range: 1.5-3.9) Wood units (WU), pulmonary arterial compliance of 2.3 (1.6-3.6) mL/mm Hg, and pulmonary arterial elastance of 0.8 (0.6-1.2) mm Hg/mL. Cardiac output was mildly reduced at 4.0 ± 1.4 L/min, with elevated right-atrial pressure (14 ± 12 mm Hg) and pulmonary capillary wedge pressure (19 ± 7 mm Hg). Over a median follow-up of 6.3 years, 22% of patients died. Patients with PVR ≥ 2.5 WU had lower RMST over 5 years compared with patients with PVR < 2.5 WU. CONCLUSION: PH is common in patients undergoing TVR/TVr, with combined pre- and postcapillary PH being the most common type. PVR ≥ 2.5 WU is associated with lower survival at 5-year follow-up.
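The RMST analysis above compares average survival over a fixed 5-year window; conceptually it is the area under the Kaplan-Meier curve up to that horizon, which can be sketched as (illustrative only, not the study's code):

```python
def rmst(km_steps, tau):
    """Restricted mean survival time: the area under a Kaplan-Meier step
    function S(t) from 0 to tau.

    km_steps : (time, survival) pairs at each drop, in time order,
               with S(t) = 1.0 assumed before the first event.
    """
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in km_steps:
        if t >= tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return area
```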