ABSTRACT
BACKGROUND: This study evaluates the clinical trends, risk factors, and impact of waitlist blood transfusion on outcomes following isolated heart transplantation. METHODS: The UNOS registry was queried to identify adult recipients from January 1, 2014, to June 30, 2022. Recipients were stratified into two groups according to whether they received a blood transfusion while on the waitlist. The incidence of waitlist transfusion was compared before and after the 2018 allocation policy change. The primary outcome was survival. Propensity score matching was performed. Multivariable logistic regression was performed to identify predictors of waitlist transfusion. A sub-analysis evaluated the impact of waitlist time on waitlist transfusion. RESULTS: Of the 21,926 recipients analyzed in this study, 4201 (19.2%) received a waitlist transfusion. The incidence of waitlist transfusion was lower following the allocation policy change (14.3% vs. 23.7%, p < 0.001). In an unmatched comparison, recipients with waitlist transfusion had significantly reduced 1-year posttransplant survival compared to recipients without (88.8% vs. 91.9%, p < 0.001). However, in a propensity score-matched comparison, the two groups had similar 1-year survival (90.0% vs. 90.4%, p = 0.656). Multivariable analysis identified ECMO, Impella, and pretransplant dialysis as strong predictors of waitlist transfusion. In a sub-analysis, the odds of waitlist transfusion increased nonlinearly with longer waitlist time. CONCLUSION: The incidence of waitlist transfusion among transplant recipients is lower under the 2018 allocation system. Waitlist transfusion is not an independent predictor of adverse posttransplant outcomes but rather a marker of the patient's clinical condition. ECMO, Impella, and pretransplant dialysis are strong predictors of waitlist transfusion.
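A minimal sketch of the propensity score-matched comparison described above is shown below, assuming 1:1 nearest-neighbor matching on the logit of the propensity score with a 0.2-SD caliper (a common design; the abstract does not specify the matching algorithm). All column names (`waitlist_transfusion`, `ecmo`, etc.) are hypothetical, and the code is illustrative rather than the authors' implementation.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_1to1(df: pd.DataFrame, treatment: str, covariates: list) -> pd.DataFrame:
    """1:1 nearest-neighbor propensity matching (with replacement) on logit(PS)."""
    # Estimate each recipient's probability of waitlist transfusion.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    ps = model.predict_proba(df[covariates])[:, 1]
    df = df.assign(logit_ps=np.log(ps / (1 - ps)))

    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0]

    # Caliper of 0.2 SD of logit(PS); pairs farther apart are discarded.
    caliper = 0.2 * df["logit_ps"].std()
    nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
    dist, idx = nn.kneighbors(treated[["logit_ps"]])
    keep = dist.ravel() <= caliper
    return pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])

# Hypothetical usage:
# matched = match_1to1(unos, "waitlist_transfusion",
#                      ["age", "ecmo", "impella", "dialysis"])
```

The matched frame can then feed a standard Kaplan-Meier or log-rank comparison of 1-year survival between the two groups.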
Subject(s)
Blood Transfusion, Heart Transplantation, Registries, Waiting Lists, Humans, Male, Waiting Lists/mortality, Female, Heart Transplantation/adverse effects, Heart Transplantation/mortality, Middle Aged, Follow-Up Studies, Risk Factors, Prognosis, Survival Rate, Blood Transfusion/statistics & numerical data, Graft Survival, Adult, Retrospective Studies
ABSTRACT
BACKGROUND: This study evaluated the outcomes of patients with cardiogenic shock (CS) supported with Impella 5.0 or 5.5 and identified risk factors for in-hospital mortality. METHODS: Adults with CS who were supported with Impella 5.0 or 5.5 at a single institution were included. Patients were stratified into three groups according to CS etiology: (1) acute myocardial infarction (AMI), (2) acute decompensated heart failure (ADHF), and (3) postcardiotomy (PC). The primary outcome was survival, and secondary outcomes included adverse events during Impella support and length of stay. Multivariable logistic regression was performed to identify risk factors for in-hospital mortality. RESULTS: One hundred thirty-seven patients with CS secondary to AMI (n = 47), ADHF (n = 86), and PC (n = 4) were included. The ADHF group had the highest survival rates at all time points. Acute kidney injury (AKI) was the most common complication during Impella support in all three groups. Increased rates of AKI and de novo renal replacement therapy were observed in the PC group, and the AMI group experienced a higher incidence of bleeding requiring transfusion. Multivariable analysis demonstrated that diabetes mellitus, elevated pre-insertion serum lactate, and elevated pre-insertion serum creatinine were independent predictors of in-hospital mortality, whereas the etiology of CS did not impact mortality. CONCLUSIONS: This study demonstrates that the Impella 5.0 and 5.5 provide effective mechanical support with favorable outcomes for patients with CS, with nearly two-thirds of patients alive at 180 days. Diabetes, elevated pre-insertion serum lactate, and elevated pre-insertion serum creatinine are strong risk factors for in-hospital mortality.
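The multivariable step described above can be sketched with statsmodels. The cohort below is synthetic and the column names, coefficients, and group proportions are stand-ins for illustration, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 137
# Synthetic stand-in for the Impella cohort; all names/values are hypothetical.
df = pd.DataFrame({
    "diabetes": rng.integers(0, 2, n),
    "pre_lactate": rng.gamma(2.0, 1.5, n),     # mmol/L
    "pre_creatinine": rng.gamma(2.0, 0.8, n),  # mg/dL
    "etiology": rng.choice(["AMI", "ADHF", "PC"], n, p=[0.3, 0.6, 0.1]),
})
logit_p = -3 + 0.8 * df.diabetes + 0.4 * df.pre_lactate + 0.5 * df.pre_creatinine
df["died_in_hospital"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariable logistic regression with etiology entered as a factor.
model = smf.logit(
    "died_in_hospital ~ diabetes + pre_lactate + pre_creatinine + C(etiology)",
    data=df,
).fit(disp=0)

# Report adjusted odds ratios with 95% confidence intervals.
ci = np.exp(model.conf_int())
print(pd.DataFrame({"OR": np.exp(model.params), "2.5%": ci[0],
                    "97.5%": ci[1], "p": model.pvalues}).round(3))
```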
Subject(s)
Heart-Assist Devices, Hospital Mortality, Cardiogenic Shock, Humans, Cardiogenic Shock/therapy, Cardiogenic Shock/mortality, Cardiogenic Shock/etiology, Male, Heart-Assist Devices/adverse effects, Female, Aged, Middle Aged, Risk Factors, Treatment Outcome, Retrospective Studies, Acute Kidney Injury/therapy, Acute Kidney Injury/etiology, Acute Kidney Injury/mortality, Myocardial Infarction/complications, Myocardial Infarction/mortality, Heart Failure/mortality, Heart Failure/complications
ABSTRACT
PURPOSE OF REVIEW: Although lung transplantation stands as the gold-standard curative therapy for end-stage lung disease, the scarcity of available organs poses a significant challenge in meeting the escalating demand. This review provides an overview of recent advancements in ambulatory respiratory assist systems, selective anticoagulation therapies that target the intrinsic pathway, and innovative surface coatings to enable permanent respiratory support as a viable alternative to lung transplantation. RECENT FINDINGS: Several emerging ambulatory respiratory assist systems have shown promise in both preclinical and clinical trials. These systems aim to create more biocompatible, compact, and portable forms of extracorporeal membrane oxygenation that can provide long-term respiratory support. Additionally, innovative selective anticoagulation strategies, currently in various stages of preclinical or clinical development, present a promising alternative to the nonselective anticoagulants in current use. Moreover, novel surface coatings hold the potential to locally prevent artificial surface-induced thrombosis and minimize bleeding risks. SUMMARY: This review of recent advancements toward permanent respiratory support summarizes the development of ambulatory respiratory assist systems, selective anticoagulation therapies, and novel surface coatings. The integration of these evolving device technologies with targeted anticoagulation strategies may allow a safe and effective mode of permanent respiratory support for patients with chronic lung disease.
Subject(s)
Anticoagulants, Extracorporeal Membrane Oxygenation, Humans, Anticoagulants/therapeutic use, Extracorporeal Membrane Oxygenation/instrumentation, Extracorporeal Membrane Oxygenation/adverse effects, Equipment Design, Lung Diseases/therapy, Animals, Treatment Outcome, Blood Coagulation/drug effects, Coated Biocompatible Materials, Thrombosis/prevention & control, Thrombosis/etiology, Lung Transplantation
ABSTRACT
BACKGROUND: The aim of this study is to evaluate the sources of infectious complications following contemporary left ventricular assist device (LVAD) implantation and to determine the impact of infections on patient outcomes. METHODS: All patients who underwent centrifugal LVAD implantation between 2014 and 2020 at a single center were retrospectively reviewed. Postimplant infections were categorized as VAD-specific, VAD-related, or non-VAD according to previously published definitions. Postoperative survival and freedom from readmission were assessed using Kaplan-Meier analysis. Univariable and multivariable analyses were performed to determine risk factors for postoperative infectious complications. RESULTS: A total of 212 patients underwent centrifugal LVAD implantation (70 HeartMate 3, 142 HeartWare HVAD) during the study period. One hundred two patients (48.1%) developed an infection, including 34 VAD-specific, 11 VAD-related, and 57 non-VAD. Staphylococcus species were the most common source of postoperative infection (n = 57, 33.7%). In multivariable analysis, diabetes significantly impacted the overall postoperative infection rate. Kaplan-Meier survival at 12 and 24 months, respectively, was 81.1% and 61.6% in the infection group and 83.4% and 78.1% in the noninfection group (p = 0.006). Within the total cohort, 12- and 24-month freedom from infection was 46.2% and 31.9%, respectively. Patients with infectious complications had a significantly lower rate of transplantation (16.4% vs. 43.6%; p < 0.001), increased overall mortality (46.3% vs. 17.3%, p < 0.001), and increased rates of noncardiac readmission (58.2% vs. 37.3%, p = 0.007). CONCLUSIONS: Infections are common following contemporary LVAD implantation and are most often non-VAD infections. Patients with postoperative infectious complications have significantly reduced rates of transplantation, survival, and freedom from noncardiac readmission.
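A minimal sketch of the Kaplan-Meier comparison, using the lifelines package on synthetic stand-in data; the variable names, follow-up distribution, and event rates below are illustrative assumptions, not the study data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 212
# Synthetic stand-in cohort: infection status, follow-up, and death indicator.
infection = rng.integers(0, 2, n)
time_months = rng.exponential(30 - 10 * infection)  # shorter survival if infected
died = rng.binomial(1, 0.5, n)                      # 0 = censored

# Kaplan-Meier survival by infection status, read off at 12 and 24 months.
kmf = KaplanMeierFitter()
for grp, label in [(1, "infection"), (0, "no infection")]:
    mask = infection == grp
    kmf.fit(time_months[mask], died[mask], label=label)
    print(label, kmf.predict([12, 24]).round(3).to_dict())

# Log-rank test mirrors the p = 0.006 group comparison reported above.
res = logrank_test(time_months[infection == 1], time_months[infection == 0],
                   died[infection == 1], died[infection == 0])
print("log-rank p =", round(res.p_value, 4))
```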
Subject(s)
Heart Failure, Heart Transplantation, Heart-Assist Devices, Heart Failure/surgery, Heart-Assist Devices/adverse effects, Humans, Postoperative Complications/epidemiology, Postoperative Complications/etiology, Retrospective Studies, Risk Factors, Treatment Outcome
ABSTRACT
PURPOSE: In this study, we evaluated the impact of ad libitum feeding on outcomes following laparoscopic pyloromyotomy in patients with infantile hypertrophic pyloric stenosis. METHODS: Pediatric patients with infantile hypertrophic pyloric stenosis who underwent laparoscopic pyloromyotomy were included. Patients were stratified into ad libitum and structured feeding groups. Primary outcomes were times from surgery completion to goal feeding and discharge. RESULTS: A total of 336 patients were included in the study, with 63 patients (18.8%) in the ad libitum feeding group. The ad libitum feeding group experienced significantly shorter times from surgery completion to both goal feeding (10.7 h vs 18.7 h; p < 0.001) and hospital discharge (21.6 h vs 23.1 h; p = 0.008) compared to the structured protocol group. Postoperative emesis (47% vs 30.8%; p = 0.011) was higher in the ad libitum cohort, but rates of return to an emergency department and/or readmission (4.8% vs 2.2%; p = 0.26) were similar. CONCLUSION: Ad libitum feeding after pyloromyotomy decreases time to goal feeding and hospital discharge. While it may contribute to a higher incidence of emesis, it does not appear to significantly increase hospital readmission. Ad libitum feeding appears to be a safe and beneficial alternative to structured feeding protocols following pyloromyotomy. LEVEL OF EVIDENCE: III.
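The two headline comparisons reduce to a rank-based test for the skewed time-to-goal-feeding data and a chi-square test for the emesis proportions. The sketch below uses synthetic numbers chosen only to echo the reported group sizes and medians; the counts and distributions are illustrative assumptions, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic hours from surgery to goal feeding for each group (hypothetical).
ad_lib = rng.gamma(4.0, 2.7, 63)       # ad libitum group (n = 63)
structured = rng.gamma(4.0, 4.7, 273)  # structured group (n = 273)

# Times-to-event are right-skewed, so a rank-based test is a reasonable choice.
stat, p = stats.mannwhitneyu(ad_lib, structured, alternative="two-sided")
print(f"median ad lib {np.median(ad_lib):.1f} h vs "
      f"structured {np.median(structured):.1f} h; p = {p:.4f}")

# Categorical outcome (postoperative emesis) compared with chi-square.
table = np.array([[30, 33],    # ad libitum: emesis yes / no (illustrative counts)
                  [84, 189]])  # structured: emesis yes / no
chi2, p_cat, dof, _ = stats.chi2_contingency(table)
print(f"emesis chi-square p = {p_cat:.4f}")
```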
Subject(s)
Laparoscopy, Hypertrophic Pyloric Stenosis, Pyloromyotomy, Child, Humans, Infant, Laparoscopy/methods, Length of Stay, Hypertrophic Pyloric Stenosis/surgery, Pylorus/surgery, Retrospective Studies
ABSTRACT
BACKGROUND: With septuagenarians undergoing orthotopic heart transplantation (OHT) more frequently, we aimed to develop a risk score for 1-year mortality in this population. METHODS: Septuagenarian OHT recipients were identified from the UNOS registry between 1987 and 2018. The primary outcome was 1-year post-OHT mortality. Patients were randomly divided into derivation and validation cohorts. Associated covariates were entered into a multivariable logistic regression model. A risk score was created using the magnitudes of the odds ratios from the derivation cohort, and its ability to predict 1-year post-OHT mortality was tested in the validation cohort. RESULTS: A total of 1156 septuagenarians were included and randomly divided into derivation (66.7%, n = 771) and validation (33.3%, n = 385) cohorts. An 11-point risk score incorporating 4 variables was created: mechanical ventilation, serum bilirubin, serum creatinine, and donor age. The predicted 1-year mortality ranged from 4.2% (0 points) to 48.1% (11 points) (p < .001). After cross-validation, the c-index was 0.67 with a Brier score of 0.10. Risk scores above 3 points portended a survival disadvantage at 1-year follow-up (p < .001). CONCLUSIONS: This 11-point risk score for septuagenarians is predictive of mortality within 1 year of OHT and has potential utility in improving the evaluation and selection of elderly transplant candidates.
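The scoring approach (integer points proportional to adjusted odds ratios, then discrimination and calibration checks) can be sketched as follows. The point weights, variable definitions, and simulated data below are hypothetical stand-ins; the abstract reports only the score range, c-index, and Brier score.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, brier_score_loss

# Hypothetical integer points, proportional to each adjusted odds ratio's
# magnitude; the abstract does not publish the exact weights (max = 11).
points = {"mech_ventilation": 4, "bilirubin_high": 3,
          "creatinine_high": 2, "donor_age_high": 2}

rng = np.random.default_rng(3)
n = 385  # validation-cohort size reported in the abstract
val = pd.DataFrame({v: rng.integers(0, 2, n) for v in points})
val["risk_score"] = val.mul(pd.Series(points)).sum(axis=1)
# Synthetic 1-year mortality that rises with the score, for illustration only
# (roughly spanning the reported 4.2%-48.1% range).
pred = 0.04 + 0.04 * val["risk_score"]
val["died_1yr"] = rng.binomial(1, pred)

# Discrimination: for a binary outcome, the c-index equals the ROC AUC.
print("c-index:", round(roc_auc_score(val["died_1yr"], val["risk_score"]), 2))
# Calibration: Brier score of the score-implied predicted probabilities.
print("Brier score:", round(brier_score_loss(val["died_1yr"], pred), 2))
```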
Subject(s)
Heart Transplantation, Aged, Cohort Studies, Humans, Logistic Models, Retrospective Studies, Risk Factors
ABSTRACT
BACKGROUND: This single-center, retrospective study evaluates the impact of hepatic steatosis on outcomes after continuous-flow left ventricular assist device (LVAD) implantation. METHODS: Adults undergoing LVAD implantation between 2004 and 2018 with a preoperative noncontrast-enhanced chest and abdominal computed tomography scan were included in the study. Patients were stratified by the presence or absence of radiographic signs of hepatic steatosis. The primary outcome was survival, and secondary outcomes included rates of postimplant adverse events. RESULTS: A total of 203 patients were included in the study, of whom 27.6% (n = 56) had radiographic signs of hepatic steatosis. The hepatic steatosis group had a higher body mass index (30.1 vs. 27.0, p < .01), Model for End-Stage Liver Disease excluding international normalized ratio score (16.8 vs. 15.1, p = .05), and incidence of diabetes (53.6% vs. 35.4%, p = .02). The rates of postimplant adverse events, including bleeding, infection, reoperation, renal failure, hepatic dysfunction, stroke, and right ventricular failure, were similar between the groups (all, p > .05). Unadjusted survival was comparable between the groups at 30 days, 90 days, 1 year, and 2 years following LVAD implantation (all, p > .05). In addition, hepatic steatosis did not impact risk-adjusted overall mortality when modeled as a categorical variable (odds ratio [OR]: 0.72, 95% confidence interval [CI]: 0.46-1.13; p = .15). CONCLUSIONS: This study demonstrates that the presence of preoperative hepatic steatosis on imaging is not predictive of increased morbidity or mortality following LVAD implantation. Despite its association with obesity, metabolic disease, and heart failure, hepatic steatosis on imaging appears to have a limited role in patient selection or prognostication for LVAD patients.
Subject(s)
End Stage Liver Disease, Heart Failure, Heart-Assist Devices, Right Ventricular Dysfunction, Adult, Heart-Assist Devices/adverse effects, Humans, Retrospective Studies, Risk Factors, Severity of Illness Index, Treatment Outcome
ABSTRACT
BACKGROUND: This study evaluates the impact of early massive transfusion and blood component ratios on outcomes following left ventricular assist device (LVAD) implantation. METHODS: Adults undergoing LVAD implantation between 2009 and 2018 at a single institution were included. Transfusions were analyzed during the intraoperative and initial 24-h postoperative periods. Patients were stratified into massive and nonmassive transfusion groups. The primary outcome was survival, and secondary outcomes included postoperative complications. Sub-analyses were performed to evaluate the impact of balanced transfusion. RESULTS: A total of 278 patients were included, of whom 45.3% (n = 126) required massive transfusion. The massive transfusion group experienced significantly higher rates of postimplant adverse events, including reoperation, renal failure, and hepatic dysfunction (all, p ≤ .05). Furthermore, the massive transfusion group had significantly lower 30-day, 90-day, 1-year, 2-year, and overall survival rates following LVAD implantation (all, p < .05). In multivariable analysis, massive transfusion significantly impacted the overall risk-adjusted mortality rate (hazard ratio: 2.402, 95% confidence interval: 1.677-3.442, p < .001). In the sub-analyses evaluating the impact of balanced massive transfusion, a balanced fresh frozen plasma to packed red blood cell (pRBC) ratio did not provide any survival benefit (all, p > .05). However, a balanced platelet to pRBC ratio did improve 2-year and overall survival in the massive transfusion cohort (both, p ≤ .05). CONCLUSIONS: This study demonstrates a significant association between early massive transfusion and adverse outcomes following LVAD implantation. Balancing platelet to pRBC transfusion in the early postoperative period may help mitigate some of the detrimental effects of massive transfusion on subsequent survival.
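The risk-adjusted mortality estimate corresponds to a Cox proportional hazards model. Below is a minimal sketch with the lifelines package on synthetic stand-in data; the covariates, effect sizes, and follow-up distribution are hypothetical, not the study dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 278
# Synthetic stand-in for the LVAD cohort; all column names are hypothetical.
df = pd.DataFrame({
    "massive_transfusion": rng.integers(0, 2, n),
    "age": rng.normal(57, 12, n),
    "inotropes": rng.integers(0, 2, n),
})
baseline = rng.exponential(36, n)  # months of follow-up
# Shorter survival times simulated for massive-transfusion patients.
df["time"] = baseline * np.exp(-0.8 * df["massive_transfusion"])
df["died"] = rng.binomial(1, 0.6, n)

# Cox model: every non-duration/event column is treated as a covariate,
# mirroring the risk-adjusted HR of 2.402 reported above.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                   "exp(coef) upper 95%", "p"]].round(3))
```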
Subject(s)
Heart-Assist Devices, Blood Component Transfusion, Blood Transfusion, Humans, Retrospective Studies, Treatment Outcome
ABSTRACT
BACKGROUND: This study evaluates the impact of secondary functional tricuspid regurgitation (TR) and concomitant tricuspid valve repair (TVr) at the time of left-sided valve operations. METHODS: Adults undergoing left-sided valve operations between 2010 and 2019 at a multihospital academic institution were included. Patients were stratified into three groups: less-than-moderate TR without TVr (Group 1), moderate-or-greater TR without TVr (Group 2), and moderate-or-greater TR with TVr (Group 3). Primary outcomes included survival and hospital readmissions. Secondary outcomes included major postoperative morbidities. Multivariable logistic regression evaluated risk-adjusted mortality and readmission. RESULTS: A total of 3444 patients were included in the analysis and were stratified into Group 1 (n = 2612, 75.8%), Group 2 (n = 563, 16.3%), and Group 3 (n = 269, 7.8%). Patients with moderate-or-greater TR (Groups 2 and 3) had higher rates of mortality, hospital readmission, and major postoperative complications, including reoperation, renal failure requiring dialysis, blood transfusion, and prolonged ventilation (all, p < .05). When assessed individually, Group 3 had substantially higher rates of renal failure requiring dialysis, prolonged ventilation, and reoperation, whereas Group 2 had higher 30-day mortality (all, p < .05). These findings persisted in risk-adjusted analysis, with the highest hazards for mortality (hazard ratio [HR] 1.9, 95% confidence interval [CI] 1.7-2.2) and readmission (HR 1.3, 95% CI 1.2-1.5) observed in Group 2. CONCLUSIONS: In this analysis of 3444 patients, those with moderate-or-greater TR who did not undergo TVr at the time of their left-sided valve operation had substantially higher risks of mortality and hospital readmission compared with those who did undergo tricuspid valve surgery.
Subject(s)
Heart Valve Prosthesis Implantation, Tricuspid Valve Insufficiency, Adult, Humans, Renal Dialysis, Retrospective Studies, Treatment Outcome, Tricuspid Valve/surgery, Tricuspid Valve Insufficiency/surgery
ABSTRACT
BACKGROUND: The aim of this study is to evaluate the predictive utility of preoperative right ventricular (RV) global longitudinal strain (GLS) and free wall strain (FWS) for outcomes following left ventricular assist device (LVAD) implantation. METHODS: Preoperative transthoracic echocardiograms were retrospectively reviewed in adults undergoing continuous-flow LVAD implantation between 2004 and 2018 at a single center. Patients undergoing pump exchange were excluded. RV GLS and FWS were calculated using commercially available software with the apical four-chamber view. The primary outcome was RV failure, as defined by the Interagency Registry for Mechanically Assisted Circulatory Support, within 1 year post-LVAD insertion. RESULTS: A total of 333 patients underwent continuous-flow LVAD implantation during the study period, and 137 had adequate preoperative studies for RV strain evaluation. RV FWS was a significant predictor of postoperative RV failure in univariable analysis (odds ratio [OR] = 1.12, p = .03), a finding that persisted after risk adjustment in multivariable analysis (OR = 1.14, p = .04). Using the optimal cutoff value of -5.64%, the c-index of FWS in predicting RV failure was 0.65. RV GLS was not associated with post-LVAD RV failure (OR = 1.07, p = .29). Pulmonary capillary wedge pressure (PCWP) was the only additional significant predictor of RV failure in multivariable analysis (OR = 0.90, p = .02). CONCLUSION: Pre-implant RV FWS is predictive of RV failure in the first postoperative year after LVAD implantation.
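One common way to derive an optimal strain cutoff such as the -5.64% above is to maximize Youden's J along the ROC curve; the abstract does not state which criterion the authors used, so the sketch below is an assumption, run on synthetic stand-in data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(5)
n = 137
# Synthetic RV free wall strain (%): less negative values mean worse function.
fws = rng.normal(-12, 6, n)
# Synthetic RV failure risk increasing as strain worsens (illustrative only).
rv_failure = rng.binomial(1, 1 / (1 + np.exp(-(fws + 8) * 0.4)))

# Youden's J = sensitivity + specificity - 1; pick the threshold maximizing it.
fpr, tpr, thresholds = roc_curve(rv_failure, fws)
best = np.argmax(tpr - fpr)
print(f"optimal FWS cutoff: {thresholds[best]:.2f}%")
# For a binary outcome, the c-index equals the area under the ROC curve.
print(f"c-index: {roc_auc_score(rv_failure, fws):.2f}")
```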
Subject(s)
Heart Failure, Heart-Assist Devices, Right Ventricular Dysfunction, Adult, Echocardiography, Heart Ventricles/diagnostic imaging, Humans, Retrospective Studies, Right Ventricular Dysfunction/diagnostic imaging, Right Ventricular Dysfunction/etiology
ABSTRACT
BACKGROUND: The purpose of this study was to investigate the incidence, predictors, and long-term impact of gastrointestinal (GI) complications following adult cardiac surgery. METHODS: Index Society of Thoracic Surgeons (STS) adult cardiac operations performed between January 2010 and February 2018 at a single institution were included. Patients were stratified by the occurrence of postoperative GI complications. Outcomes included early and late survival as well as other associated major postoperative complications. A subanalysis of propensity score-matched patients was also performed. RESULTS: A total of 10,285 patients were included, and the overall rate of GI complications was 2.4% (n = 246). Predictors of GI complications included dialysis dependency, intra-aortic balloon pump, congestive heart failure, chronic obstructive pulmonary disease, and longer aortic cross-clamp times. Thirty-day (2.6% vs. 24.8%), 1-year (6.3% vs. 41.9%), and 3-year (11.1% vs. 48.4%) mortality were substantially higher in patients who experienced GI complications (all p < .001). GI complications were associated with a threefold increased hazard for mortality (hazard ratio = 3.1, 95% confidence interval = 2.6-3.7) after risk adjustment, and the occurrence of GI complications was associated with increased rates of renal failure (39.4% vs. 2.5%), new dialysis dependency (31.3% vs. 1.5%), multisystem organ failure (21.5% vs. 1.0%), and deep sternal wound infection (2.6% vs. 0.2%; all p < .001). These results persisted in propensity-matched analysis. CONCLUSION: GI complications are infrequent but have a profound impact on early and late survival and often occur in association with other major complications. Risk factor modification, heightened awareness, and early detection and management of GI complications appear warranted.
Subject(s)
Cardiac Surgical Procedures, Gastrointestinal Diseases, Thoracic Surgery, Adult, Cardiac Surgical Procedures/adverse effects, Gastrointestinal Diseases/epidemiology, Gastrointestinal Diseases/etiology, Humans, Incidence, Postoperative Complications/epidemiology, Renal Dialysis, Retrospective Studies, Risk Factors
ABSTRACT
BACKGROUND: This single-center, retrospective study evaluates the impact of preoperative serum prealbumin levels on outcomes after left ventricular assist device (LVAD) implantation. METHODS: Adults undergoing LVAD implantation between 2004 and 2018 with a recorded preoperative prealbumin level were included. The primary outcome was 1-year survival, and secondary outcomes included rates of postimplant adverse events. Threshold regression and restricted cubic splines were utilized to identify a cut-point to dichotomize prealbumin level. Prealbumin was also evaluated as a continuous variable. Multivariable logistic regression was used for risk adjustment. RESULTS: A total of 333 patients were included. Patients were dichotomized according to an optimal prealbumin threshold of 15 mg/dL: 47.4% (n = 158) had levels below and 52.6% (n = 175) had levels above this threshold. The rates of postimplant adverse events, including bleeding, infection, stroke, renal failure, and right heart failure, were similar between the groups (all P > .05). The rates of cardiac transplantation and device explantation were also similar (all P > .05). Unadjusted survival was comparable between the groups at 30 days, 90 days, and 1 year following LVAD implantation (all P > .05). In addition, lower prealbumin did not impact risk-adjusted 1-year mortality when modeled either as a categorical (OR, 1.08; 95% CI, 0.48-2.12; P = .82) or a continuous variable (OR, 1.99; 95% CI, 0.73-2.34; P = .96). CONCLUSIONS: This study demonstrates that lower prealbumin levels were not predictive of increased post-LVAD morbidity or mortality. Although an established marker of nutritional and inflammatory status, prealbumin appears to have a limited role in patient selection or prognostication for LVAD patients.
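The restricted cubic spline step can be sketched with patsy's natural cubic spline basis inside a logistic model. The data, degrees of freedom, and flat simulated risk below are synthetic stand-ins (the abstract does not report the spline specification); the flat risk simply echoes the study's null finding.

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix, build_design_matrices

rng = np.random.default_rng(6)
n = 333
# Synthetic preoperative prealbumin (mg/dL) and 1-year mortality.
prealbumin = rng.normal(16, 5, n).clip(3, 35)
died_1yr = rng.binomial(1, 0.15, n)

# Natural (restricted) cubic spline basis with 4 df; patsy adds the intercept.
basis = dmatrix("cr(x, df=4)", {"x": prealbumin}, return_type="dataframe")
fit = sm.Logit(died_1yr, basis).fit(disp=0)

# Predict risk over a grid (reusing the original knots) and inspect the
# fitted curve for a candidate cut-point at which to dichotomize the marker.
grid = np.linspace(5, 30, 6)
grid_basis = build_design_matrices([basis.design_info], {"x": grid},
                                   return_type="dataframe")[0]
print(dict(zip(grid.round(0), fit.predict(grid_basis).round(3))))
```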
Subject(s)
Heart Failure/therapy, Heart-Assist Devices/adverse effects, Negative Results, Prealbumin, Adult, Aged, Biomarkers/blood, Female, Humans, Inflammation, Male, Middle Aged, Nutritional Status, Patient Selection, Predictive Value of Tests, Preoperative Period, Prognosis, Retrospective Studies
ABSTRACT
BACKGROUND: This study evaluates the impact of a history of malignancy on outcomes of left ventricular assist device (LVAD) implantation. METHODS: Adult patients undergoing LVAD implantation between 2006 and 2018 were included and stratified by preimplant history of malignancy. The primary outcome was post-LVAD survival. RESULTS: A total of 250 patients underwent LVAD implantation during the study period, including 37 (14.8%) with a history of malignancy. Of these 37 patients, five (13.5%) had active malignancy at the time of LVAD implantation, and seven had more than one type of cancer. The median disease-free duration before LVAD was 3.5 years (interquartile range [IQR] 1.0-7.75 years). The most common types of malignancy were urologic (n = 20; 45.5%), skin (n = 7; 15.9%), and leukemia or lymphoma (n = 6; 13.6%). Median follow-up was 244 (IQR, 126-571) days and 313 (IQR, 127-738) days for those with and without a history of malignancy, respectively (P = .49). Unadjusted post-LVAD survival was reduced in those with a malignancy history (2-year survival 53.4% vs 66.9%; P = .01), a finding that persisted after risk adjustment (hazard ratio 1.89; 95% confidence interval, 1.13-3.14; P = .01). Only one (2.7%) patient died post-LVAD from their cancer. CONCLUSIONS: Although a history of malignancy is associated with reduced survival after LVAD implantation, more than half of these patients are alive at 2 years. This, combined with the fact that most do not die from causes directly related to their cancer, suggests that LVAD implantation is reasonable in carefully selected patients with a history of malignancy.
Subject(s)
Heart Failure, Heart-Assist Devices, Neoplasms, Thoracic Surgical Procedures, Adult, Humans, Proportional Hazards Models, Retrospective Studies, Treatment Outcome
ABSTRACT
INTRODUCTION: This study evaluated surgical outcomes of infective endocarditis (IE), with particular attention to the impact of intravenous drug use (IVDU). METHODS: Adult patients undergoing surgery for IE between 2011 and 2018 at a single center were included and stratified by IVDU status. The primary outcome was overall survival. Secondary outcomes included postoperative complications and hospital readmissions. Kaplan-Meier and multivariable Cox regression analyses were utilized for unadjusted and risk-adjusted survival analyses, respectively. Cumulative incidence function curves were compared for hospital readmissions. RESULTS: A total of 831 patients (mean age 55 years, 34.4% female) were operated on for IE, including 318 (38.3%) with IVDU. Cultures were most commonly positive for streptococcus (25.2%), methicillin-sensitive Staphylococcus aureus (17.7%), enterococcus (14.3%), or methicillin-resistant Staphylococcus aureus (8.4%). The most common procedures were isolated aortic valve repair/replacement (18.8%), aortic root replacement (15.9%), mitral valve repair/replacement (26.7%), combined aortic and mitral valve replacement (8.4%), and tricuspid valve repair/replacement (7.6%). Mean follow-up was 3.4 ± 2.4 years. Overall 5-year survival was 64% and was similar between IVDU and non-IVDU patients. Multivariable analysis demonstrated that IVDU was not associated with mortality risk. IVDU patients displayed higher rates of all-cause readmission (61.6% vs 53.9%; P = .03), drug-use readmission (15.4% vs 1.4%; P < .001), and recurrent endocarditis readmission (33.0% vs 13.0%; P < .001). CONCLUSIONS: The majority of patients undergoing surgical treatment of IE are alive at 5 years, although readmission rates are high. IVDU is not a risk factor for longitudinal mortality, although patients with IVDU are at higher overall readmission risk, driven largely by greater readmissions for drug use and recurrent endocarditis.
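Readmission over long follow-up is typically summarized with a cumulative incidence function that treats death as a competing risk. Below is a minimal sketch using the Aalen-Johansen estimator from lifelines on synthetic stand-in data; the event coding, rates, and follow-up distribution are illustrative assumptions, not the study dataset.

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(7)
n = 831
# Synthetic time-to-first-event data with competing risks (illustrative):
# event 1 = hospital readmission, event 2 = death before readmission,
# event 0 = censored at end of follow-up.
time_years = rng.exponential(3.0, n).clip(0.01, 8)
event = rng.choice([0, 1, 2], n, p=[0.3, 0.55, 0.15])
ivdu = rng.integers(0, 2, n)

# Cumulative incidence of readmission, with death as a competing risk,
# estimated separately for the IVDU and non-IVDU cohorts.
for grp, label in [(1, "IVDU"), (0, "non-IVDU")]:
    m = ivdu == grp
    ajf = AalenJohansenFitter()
    ajf.fit(time_years[m], event[m], event_of_interest=1)
    print(label, "cumulative readmission incidence:",
          round(ajf.cumulative_density_.iloc[-1, 0], 3))
```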
Subject(s)
Endocarditis/surgery, Adult, Aged, Aorta/surgery, Blood Vessel Prosthesis Implantation, Cardiac Valve Annuloplasty, Endocarditis/microbiology, Endocarditis/mortality, Female, Heart Valve Prosthesis Implantation, Heart Valves/surgery, Humans, Male, Middle Aged, Patient Readmission/statistics & numerical data, Intravenous Substance Abuse, Survival Rate, Treatment Outcome
ABSTRACT
OBJECTIVE: To quantify the impact of donation after circulatory death (DCD) heart donor utilization on both waitlist and post-transplant outcomes in the United States. METHODS: The United Network for Organ Sharing database was queried to identify all adult waitlisted and transplanted candidates between October 18, 2018, and December 31, 2022. Waitlisted candidates were stratified according to whether they had been approved for donation after brain death (DBD) offers only or also approved for DCD offers. The cumulative incidence of transplantation was compared between the 2 cohorts. In a post-transplant analysis, 1-year post-transplant survival was compared between unmatched and propensity score-matched cohorts of DBD and DCD recipients. RESULTS: A total of 14,803 candidates were waitlisted, including 12,287 approved for DBD donors only and 2516 approved for DCD donors. Overall, DCD approval was associated with an increased sub-hazard ratio (sub-HR) for transplantation and a lower sub-HR for delisting owing to death or deterioration after risk adjustment. In a subgroup analysis, candidates with blood type B and status 4 designation received the greatest benefit from DCD approval. A total of 12,238 recipients underwent transplantation, 11,636 with DBD hearts and 602 with DCD hearts. Median waitlist times were significantly shorter for status 3 and status 4 recipients receiving DCD hearts. One-year post-transplant survival was comparable between unmatched and propensity score-matched cohorts of DBD and DCD recipients. CONCLUSIONS: The use of DCD hearts confers a higher probability of transplantation and a lower incidence of death or deterioration on the waitlist, particularly among certain subpopulations such as status 4 candidates. Importantly, the use of DCD donors results in post-transplant survival similar to that of DBD donors.
Subject(s)
Heart Transplantation, Tissue and Organ Procurement, Adult, Humans, Brain Death, Tissue Donors, Heart Transplantation/adverse effects, Probability, Brain, Retrospective Studies, Graft Survival
ABSTRACT
BACKGROUND: Acute incisional hernia incarceration is associated with high morbidity and mortality, yet there is little evidence to guide which patients will benefit most from prophylactic repair. We explored baseline computed tomography (CT) characteristics associated with incarceration. METHODS: A case-control study design was utilized to examine adults (≥18 years) diagnosed with an incisional hernia between 2010 and 2017 at a single institution with a minimum 1-year follow-up. CT imaging at the time of initial hernia diagnosis was examined. Following propensity score matching for baseline characteristics, multivariable logistic regression was performed to identify independent predictors associated with acute incarceration. RESULTS: A total of 532 patients (27.26% male, mean age 61.55 years) were examined, of whom 238 experienced an acute incarceration. Between two well-matched cohorts with and without incarceration, the presence of small bowel in the hernia sac (odds ratio [OR], 7.50; 95% confidence interval [CI], 3.35-16.38), increasing sac height (OR, 1.34; 95% CI, 1.10-1.64), more acute hernia angle (OR, 0.98 per degree; 95% CI, 0.97-0.99), decreased fascial defect width (OR, 0.68; 95% CI, 0.58-0.81), and greater outer abdominal fat (OR, 1.28; 95% CI, 1.02-1.60) were associated with acute incarceration. Using threshold analysis, a hernia angle of <91 degrees and a sac height of >3.25 cm were associated with increased incarceration risk. CONCLUSION: CT features present at the time of hernia diagnosis provide insight into later acute incarceration risk. An improved understanding of acute incisional hernia incarceration can guide selection for prophylactic repair and thereby may mitigate the excess morbidity associated with incarceration. LEVEL OF EVIDENCE: Prognostic and Epidemiological; Level III.
Subject(s)
Ventral Hernia, Incisional Hernia, Adult, Humans, Male, Female, Incisional Hernia/diagnostic imaging, Incisional Hernia/surgery, Case-Control Studies, Hernia, X-Ray Computed Tomography/methods, Ventral Hernia/surgery, Herniorrhaphy
ABSTRACT
OBJECTIVE: This study evaluates the impact of donor age on outcomes following donation after circulatory death (DCD) heart transplantation. METHODS: The United Network for Organ Sharing registry was queried to analyze adult recipients who underwent isolated DCD heart transplantation from January 1, 2019, to September 30, 2023. The cohort was stratified into 2 groups according to donor age, with advanced donor age defined as 40 years or more. Outcomes were 90-day and 1-year post-transplant survival. Propensity score matching was performed. Subgroup analysis was performed to evaluate the effects of recipient age on 90-day survival among recipients with advanced-age donors. RESULTS: A total of 994 recipients were included in the study period, and 161 patients (17.1%) received allografts from advanced-age donors. During the study period, the annual incidence of DCD heart transplantation with advanced-age donors increased substantially. Recipients with advanced-age donors had 90-day and 1-year post-transplant survival similar to that of recipients with younger donors. The comparable 90-day survival persisted in a propensity score-matched comparison. In the subgroup analysis among recipients with advanced-age donors, recipients aged 60 years or more had significantly reduced 90-day survival compared with recipients aged less than 60 years. CONCLUSIONS: The use of appropriately selected DCD donors aged 40 years or more is associated with survival similar to that of younger donors. With careful candidate risk stratification and selection, the use of DCD donors aged 40 years or more may further ameliorate the ongoing organ shortage with comparable early post-transplant outcomes.
ABSTRACT
Prior studies assessing the effects of Impella 5.5 support duration on posttransplant outcomes have been limited to single-center case reports and series. This study evaluates the impact of Impella 5.5 support duration on outcomes following heart transplantation using the United Network for Organ Sharing database. Adult heart transplant recipients who were directly bridged to primary isolated heart transplantation with the Impella 5.5 were included. The cohort was stratified into two groups based on the duration of Impella support: 14 days or less and greater than 14 days. The primary outcome was 90-day posttransplant survival. Propensity score matching was performed. A sub-analysis was conducted to evaluate the impact of greater than 30 days of Impella support on 90-day survival. Three hundred thirty-two recipients were analyzed. Of these, 212 recipients (63.9%) were bridged to heart transplantation with an Impella support duration of greater than 14 days. The two groups had comparable 90-day posttransplant survival and complication rates, and the comparable posttransplant survival persisted in a propensity score-matched comparison. In the sub-analysis, Impella support duration of greater than 30 days did not adversely impact 90-day survival. This study demonstrates that an extended duration of support with the Impella 5.5 as a bridge to transplantation does not adversely impact posttransplant outcomes. The Impella 5.5 is a safe and effective bridging modality to heart transplantation.
ABSTRACT
BACKGROUND: This study evaluates the interaction of donor and recipient age with outcomes following heart transplantation under the 2018 heart allocation system. METHODS: The United Network for Organ Sharing registry was queried to analyze adult primary isolated orthotopic heart transplant recipients and associated donors from August 18, 2018, to June 30, 2021. Both recipient and donor cohorts were grouped according to age: <65 and ≥65 y for recipients and <50 and ≥50 y for donors. The primary outcome was survival. Subanalyses were performed to evaluate the impact of donor age. RESULTS: A total of 7601 recipients and 7601 donors were analyzed. Of these, 1584 recipients (20.8%) were ≥65 y old and 560 donors (7.4%) were ≥50 y old. Compared with recipients <65, recipients ≥65 had decreased 1-y (88.8% versus 92.3%) and 2-y (85.1% versus 88.5%) survival rates (P < 0.001). The association of recipient age ≥65 with lower survival persisted after adjusting for potential confounders (hazard ratio, 1.38; 95% confidence interval, 1.18-1.61; P < 0.001). Recipients <65 with donors ≥50 had 1-y and 2-y survival rates comparable to those of recipients <65 with donors <50 (P = 0.997). Conversely, transplantation of older allografts was associated with lower 1-y (84.2% versus 89.4%) and 2-y (79.5% versus 85.8%) survival rates in recipients ≥65 (P = 0.025). CONCLUSIONS: Recipient age ≥65 continues to be associated with worse survival following heart transplantation under the 2018 heart allocation system compared with younger recipients. Donors ≥50 may be acceptable for recipients <65, with comparable outcomes. However, careful donor age selection should be considered for recipients ≥65, as the use of younger donor allografts appears to improve posttransplantation survival.
ABSTRACT
OBJECTIVE: This study aimed to investigate the clinical trends and the impact of the 2018 heart allocation policy change on both waitlist and post-transplant outcomes in simultaneous heart-kidney transplantation in the United States. METHODS: The United Network for Organ Sharing registry was queried to compare adult patients before and after the allocation policy change. This study included 2 separate analyses evaluating waitlist and post-transplant outcomes. Multivariable analyses were performed to determine the 2018 allocation system's risk-adjusted hazards for 1-year waitlist and post-transplant mortality. RESULTS: The initial analysis of waitlist outcomes included 1779 patients listed for simultaneous heart-kidney transplantation, of whom 1075 (60.4%) were listed after the 2018 allocation policy change. After the policy change, waitlist outcomes improved significantly, with a shorter waitlist time, a lower likelihood of delisting, and a higher likelihood of transplantation. The subsequent analysis of post-transplant outcomes included 1130 simultaneous heart-kidney transplant recipients, of whom 738 (65.3%) underwent transplantation after the policy change. The 90-day, 6-month, and 1-year post-transplant survival and complication rates were comparable before and after the policy change. Multivariable analyses demonstrated that the 2018 allocation system was associated with reduced risk-adjusted 1-year waitlist mortality (sub-hazard ratio, 0.66; 95% CI, 0.51-0.85; P < .001) but did not significantly impact risk-adjusted 1-year post-transplant mortality (hazard ratio, 1.03; 95% CI, 0.72-1.47; P = .876). CONCLUSIONS: This study demonstrates increased rates of simultaneous heart-kidney transplantation with shorter waitlist times after the 2018 allocation policy change. Furthermore, waitlist outcomes improved and early post-transplant survival remained comparable under the 2018 allocation system.