ABSTRACT
OBJECTIVES: Cardiogenic shock (CS) is associated with high mortality. Patients treated for CS usually require heparin therapy, which may be associated with complications such as heparin-induced thrombocytopenia (HIT). HIT is a serious condition marked by a decline in platelet count and increased hypercoagulability, and it remains a poorly researched field in intensive care medicine. The primary purposes of this study were to: 1) determine the prevalence of HIT in CS, 2) assess the performance of common diagnostic tests in the workup of HIT, and 3) compare outcomes in CS patients with excluded versus confirmed HIT. DESIGN: Retrospective dual-center study including adult patients (≥18 years) with diagnosed CS and suspected HIT from January 2010 to November 2022. SETTING: Cardiac ICUs of the Ludwig-Maximilians-University (LMU) Hospital Munich and the University Hospital Bonn. PATIENTS AND INTERVENTIONS: Adult patients with diagnosed CS and suspected HIT were included. Differences in baseline characteristics, mortality, and neurologic and safety outcomes between patients with excluded and confirmed HIT were evaluated. MEASUREMENTS AND MAIN RESULTS: Among cases of suspected HIT, positive screening antibodies were detected in 159 of 2808 patients (5.7%). HIT was confirmed via a positive functional assay in 57 of 2808 patients, corresponding to a prevalence of 2.0%. The positive predictive value of anti-platelet factor 4/heparin screening antibodies was 35.8%. Total in-hospital mortality (58.8% vs. 57.9%; p > 0.999), 1-month mortality (47.1% vs. 43.9%; p = 0.781), and 12-month mortality (58.8% vs. 59.6%; p > 0.999) were similar between patients with excluded and confirmed HIT, respectively. Furthermore, no significant difference in neurologic outcome among survivors was found between groups (Cerebral Performance Category [CPC] score 1: 8.8% vs. 8.8%; p > 0.999; CPC 2: 7.8% vs. 12.3%; p = 0.485).
CONCLUSIONS: HIT was a rare complication in CS patients treated with unfractionated heparin and was not associated with increased mortality. HIT confirmation was also not associated with worse neurologic outcome in survivors. Future studies should aim to develop more precise, standardized, and cost-effective strategies to diagnose HIT and prevent complications.
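The screening rate, prevalence, and positive predictive value reported above follow directly from the stated counts; a minimal check (counts taken from the abstract):

```python
# Counts reported in the abstract
cohort = 2808             # patients with suspected HIT
screened_positive = 159   # positive anti-PF4/heparin screening antibodies
confirmed_hit = 57        # positive functional assay

screening_rate = screened_positive / cohort   # fraction screening-positive
prevalence = confirmed_hit / cohort           # confirmed HIT prevalence
ppv = confirmed_hit / screened_positive       # positive predictive value

print(f"screening-positive rate: {screening_rate:.1%}")  # 5.7%
print(f"confirmed HIT prevalence: {prevalence:.1%}")     # 2.0%
print(f"PPV of screening antibodies: {ppv:.1%}")         # 35.8%
```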
Subjects
Anticoagulants; Heparin; Shock, Cardiogenic; Thrombocytopenia; Humans; Heparin/adverse effects; Thrombocytopenia/chemically induced; Thrombocytopenia/epidemiology; Thrombocytopenia/diagnosis; Thrombocytopenia/mortality; Retrospective Studies; Shock, Cardiogenic/chemically induced; Shock, Cardiogenic/epidemiology; Shock, Cardiogenic/mortality; Female; Male; Aged; Middle Aged; Anticoagulants/adverse effects; Prevalence; Germany/epidemiology
ABSTRACT
BACKGROUND: Baseline lung allograft dysfunction (BLAD) is characterized by failure to achieve normal baseline lung function after double lung transplantation (DLTX) and is associated with a high risk of mortality. In single lung transplant (SLTX) recipients, however, cutoff values and associated factors have not been explored. Here, we aimed to define BLAD in SLTX recipients, investigate its impact on allograft survival, and identify potential risk factors for BLAD in SLTX recipients. METHODS: We performed a retrospective, single-center analysis of the LTX cohort of LMU Munich between 2010 and 2018. In accordance with DLTX cutoffs, BLAD in SLTX recipients was defined as failure to achieve a percent-predicted forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC) of >60% on 2 consecutive tests >3 wk apart. Survival analysis and regression analysis for potential predictors of BLAD were performed. RESULTS: In a cohort of 141 SLTX recipients, 43% of patients met BLAD criteria. SLTX recipients with BLAD demonstrated impaired survival. Native lung hyperinflation was associated with BLAD in obstructive disease, whereas donor/recipient lung size mismatch was associated with BLAD in both obstructive and restrictive underlying diseases. Pulmonary function testing at 3 mo after lung transplantation predicted normal baseline lung function in SLTX recipients with obstructive lung disease. CONCLUSIONS: BLAD in SLTX recipients is as relevant as in DLTX recipients and should generally be considered in the follow-up of LTX recipients. Risk factors for BLAD differed between underlying obstructive and restrictive lung disease. A better understanding of associated factors may help in the development of preventive strategies.
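The BLAD definition above (failure to reach FEV1 and FVC >60% predicted on 2 consecutive tests >3 wk apart) can be encoded as a simple rule. The following is an illustrative sketch, not code from the study; the function names and the 21-day encoding of ">3 wk" are assumptions:

```python
def achieves_normal_baseline(tests, min_gap_days=21):
    """tests: chronologically ordered (day, fev1_pct_pred, fvc_pct_pred) tuples.

    Normal baseline requires two consecutive tests, more than min_gap_days
    apart, on which both FEV1 and FVC exceed 60% predicted.
    """
    for (d1, fev1_a, fvc_a), (d2, fev1_b, fvc_b) in zip(tests, tests[1:]):
        if d2 - d1 > min_gap_days and min(fev1_a, fvc_a, fev1_b, fvc_b) > 60:
            return True
    return False

def has_blad(tests):
    """BLAD = failure to ever achieve normal baseline lung function."""
    return not achieves_normal_baseline(tests)
```

For example, tests on days 90 and 120 with FEV1/FVC of 70%/72% and 75%/74% predicted would rule out BLAD, whereas a series that never clears the cutoff on a qualifying pair of tests would meet the criterion.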
ABSTRACT
BACKGROUND: Magnetic resonance imaging (MRI) is the diagnostic imaging modality of choice in several conditions. With an increasing number of patients requiring MRI for diagnostic purposes, the issue of safety in patients with cardiac implantable electronic devices (CIED) undergoing this imaging modality will play an ever more important role. The purpose of this study was to assess safety and device function following MRI in an unrestricted real-world cohort of patients with a wide array of cardiac devices. METHODS: We conducted a retrospective single-center study including 1010 MRI studies conducted in adult patients (≥18 years) with an implanted CIED treated in the University Hospital of Munich (LMU) between July 2012 and March 2024. Patients with non-MR-conditional leads, abandoned or epicardial leads, and lead fragments were included in the analysis. RESULTS: Across a total of 1010 MRIs (920 total MR-conditional device generators) performed in patients with an implanted CIED, there were no deaths, reports of discomfort, palpitations, heating, or ventricular arrhythmias in the 24 h following MRI. Only 2/1010 MRIs were followed by a reported atrial arrhythmia within 24 h, both in patients with an MR-conditional pacemaker (PM) device without an abandoned lead. No significant changes in device function from baseline were observed following MRI across all included CIEDs. Lastly, no instances of severe malfunction, such as generator failure, loss of capture, electrical reset, or inappropriate inhibition of pacing, were found in post-MRI interrogation reports across all MRI studies. CONCLUSIONS: Based on this analysis of 1010 MRIs in patients with CIEDs, MRI can be performed safely and without adverse events or changes in device function, provided that standardized device interrogation, manufacturer-advised device programming, monitoring of vital functions, and manufacturer-advised reprogramming are followed.
ABSTRACT
BACKGROUND AND AIMS: Candidate selection for lung transplantation (LuTx) is pivotal to ensure individual patient benefit as well as optimal donor organ allocation. The impact of coronary artery disease (CAD) on post-transplant outcomes remains controversial. We provide comprehensive data on the relevance of CAD for short- and long-term outcomes following LuTx and identify risk factors for mortality. METHODS: We retrospectively analyzed all adult patients (≥18 years) undergoing primary and isolated LuTx between January 2000 and August 2021 at the LMU University Hospital transplant center. Using 1:1 propensity score matching, 98 corresponding pairs of LuTx patients with and without relevant CAD were identified. RESULTS: Among 1,003 patients having undergone LuTx, 104 (10.4%) had relevant CAD at baseline. There were no significant differences in in-hospital mortality (8.2% vs. 8.2%, p > 0.999) or overall survival (HR 0.90, 95%CI [0.61, 1.32], p = 0.800) between matched CAD and non-CAD patients. Similarly, cardiovascular events such as myocardial infarction (7.1% CAD vs. 2.0% non-CAD, p = 0.170), revascularization by percutaneous coronary intervention (5.1% vs. 1.0%, p = 0.212), and stroke (2.0% vs. 6.1%, p = 0.279) did not differ significantly between the matched groups. Death from cardiovascular causes occurred in 7.1% of the CAD group and 2.0% of the non-CAD group (p = 0.078). Cox regression analysis identified age at transplantation (HR 1.02, 95%CI [1.01, 1.04], p < 0.001), elevated bilirubin (HR 1.33, 95%CI [1.15, 1.54], p < 0.001), obstructive lung disease (HR 1.43, 95%CI [1.01, 2.02], p = 0.041), decreased forced vital capacity (HR 0.99, 95%CI [0.99, 1.00], p = 0.042), necessity of reoperation (HR 3.51, 95%CI [2.97, 4.14], p < 0.001), and early transplantation time (HR 0.97, 95%CI [0.95, 0.99], p = 0.001) as risk factors for all-cause mortality, but not relevant CAD (HR 0.96, 95%CI [0.71, 1.29], p = 0.788).
Double lung transplantation was associated with lower all-cause mortality (HR 0.65, 95%CI [0.52, 0.80], p < 0.001) but higher in-hospital mortality (OR 2.04, 95%CI [1.04, 4.01], p = 0.039). CONCLUSION: In this cohort, relevant CAD was not associated with worse outcomes and should therefore not be considered a contraindication for LuTx. Nonetheless, the cardiovascular events observed in CAD patients highlight the need for control of cardiovascular risk factors and structured cardiac follow-up.
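The 1:1 propensity score matching described in the methods pairs each CAD patient with the non-CAD patient whose propensity score is closest. A greedy nearest-neighbour sketch under assumed inputs (the function, the 0.2 caliper, and the precomputed scores are illustrative, not taken from the study; in practice the scores come from a regression of CAD status on baseline covariates):

```python
def greedy_match(treated, control, caliper=0.2):
    """1:1 greedy nearest-neighbour matching on propensity scores.

    treated, control: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs; each control is
    used at most once, and matches outside the caliper are discarded.
    """
    pairs = []
    available = dict(control)
    # Matching treated patients from highest to lowest score is a common
    # heuristic to avoid spending well-matched controls on easy matches.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs
```

For instance, matching scores {"t1": 0.8, "t2": 0.3} against controls {"c1": 0.75, "c2": 0.35, "c3": 0.9} yields the pairs (t1, c1) and (t2, c2); unmatched or out-of-caliper patients are simply dropped, which is why 104 CAD patients yielded 98 matched pairs.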
ABSTRACT
Critical care cardiology (CCC) in the modern era is shaped by a multitude of innovative treatment options and an increasingly complex, ageing patient population. Generating high-quality evidence for novel interventions and devices in an intensive care setting is exceptionally challenging. As a result, formulating the best possible therapeutic approach continues to rely predominantly on expert opinion and local standard operating procedures. Fostering the full potential of CCC and the maturation of the next generation of decision-makers in this field calls for an updated training concept that encompasses the extensive knowledge and skills required to care for critically ill cardiac patients while remaining adaptable to the trainee's individual career planning and existing educational programs. In the present manuscript, we suggest a standardized training phase in preparation for the first ICU rotation, propose a modular CCC core curriculum, and outline how training components could be conceptualized within three sub-specialization tracks for aspiring cardiac intensivists.