ABSTRACT
OBJECTIVES: To examine the effectiveness of ε-aminocaproic acid in reducing transfusion requirements in cardiac surgery overall and in less-invasive cardiac surgery, and to assess its safety. DESIGN: Retrospective cohort study. SETTING: Single-center tertiary academic medical center. PARTICIPANTS: A total of 19,111 adult patients who underwent elective surgery requiring cardiopulmonary bypass from January 1, 2008, through December 31, 2016. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Propensity matching was used to create well-balanced groups and to compare, separately, the overall cohort and the less-invasive surgery cohort with and without ε-aminocaproic acid. Supplementary zero-inflated negative binomial regression analysis was used because the outcome data were zero-inflated. Effectiveness was assessed by transfusion requirements, and safety by comparison of in-hospital outcomes. In the overall cohort, patients receiving ε-aminocaproic acid received fewer red blood cells postoperatively and fewer intra- and postoperative blood products. In the less-invasive cohort, there was no significant difference in red blood cell transfusion either intra- or postoperatively, but the ε-aminocaproic acid group received fewer intra- and postoperative platelets, less intraoperative cryoprecipitate, and less postoperative plasma. There were no significant differences in in-hospital outcomes in either the less-invasive or the overall cohort. CONCLUSIONS: The reduction in postoperative red blood cell requirement observed in the overall cohort did not translate to less-invasive cardiac surgery in the authors' patient population; however, both the overall and less-invasive cohorts had lower requirements for other blood components with ε-aminocaproic acid. There was no association with major Society of Thoracic Surgeons (STS)-defined morbidity or mortality in either group.
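A minimal sketch of the analytic approach described above: a propensity model for receipt of ε-aminocaproic acid, greedy 1:1 nearest-neighbor matching, and a zero-inflated negative binomial model for transfusion counts. All file and column names (eaca, rbc_units, and the covariate list) are hypothetical, and the matching shown is a simplified stand-in for the authors' exact method.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

# Hypothetical patient-level table: treatment flag, baseline covariates, counts.
df = pd.read_csv("bypass_cohort.csv")          # assumed file
covariates = ["age", "bmi", "ejection_fraction", "creatinine", "reoperation"]

# 1. Propensity score: probability of receiving ε-aminocaproic acid.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["eaca"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Greedy 1:1 nearest-neighbor matching on the propensity score
#    (with replacement, for simplicity of illustration).
treated, control = df[df["eaca"] == 1], df[df["eaca"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3. Zero-inflated negative binomial model for postoperative RBC units,
#    because most patients receive zero transfusions.
X = sm.add_constant(matched[["eaca"]])
zinb = ZeroInflatedNegativeBinomialP(matched["rbc_units"], X,
                                     exog_infl=X, inflation="logit")
print(zinb.fit(method="bfgs", maxiter=500).summary())
```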
Subjects
Antifibrinolytic Agents; Cardiac Surgical Procedures; Adult; Aminocaproic Acid/adverse effects; Antifibrinolytic Agents/adverse effects; Blood Loss, Surgical/prevention & control; Cardiac Surgical Procedures/adverse effects; Cardiopulmonary Bypass/adverse effects; Humans; Retrospective Studies
ABSTRACT
The prelisting variables essential for creating an accurate heart transplant allocation score based on survival are unknown. To identify these, we studied mortality of adults on the active heart transplant waiting list in the Scientific Registry of Transplant Recipients database from January 1, 2004 to August 31, 2015. There were 33,069 candidates awaiting heart transplantation: 7,681 UNOS Status 1A, 13,027 Status 1B, and 12,361 Status 2. During a median waitlist follow-up of 4.3 months, 5,514 candidates died. Variables of importance for waitlist mortality were identified by machine learning using random survival forests. Strong correlates predicting survival were estimated glomerular filtration rate (eGFR), serum albumin, extracorporeal membrane oxygenation, ventricular assist device, mechanical ventilation, peak oxygen capacity, hemodynamics, inotrope support, and type of heart disease; less predictive variables included antiarrhythmic agents, history of stroke, vascular disease, prior malignancy, and prior tobacco use. Complex interactions were identified, such as an additive mortality risk based on renal function and serum albumin, and sex differences in mortality when eGFR >40 mL/min/1.73 m². Most variables predictive of waitlist mortality are captured in the current tiered allocation system, except for eGFR and serum albumin, which carry an additive risk and complex interactions.
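The study used random survival forests to rank waitlist predictors. A hedged sketch of that idea in Python, using scikit-survival's RandomSurvivalForest with permutation importance as an analogue of the variable-importance screen (the study's implementation and dataset differ); all column names are hypothetical.

```python
import pandas as pd
from sklearn.inspection import permutation_importance
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

# Hypothetical waitlist dataset: one row per candidate, with a death indicator,
# follow-up time in months, and candidate-level predictors.
df = pd.read_csv("waitlist.csv")                      # assumed file
predictors = ["egfr", "albumin", "ecmo", "vad", "ventilator",
              "peak_vo2", "inotropes", "pcwp", "age"]
X = df[predictors]
y = Surv.from_arrays(event=df["died"].astype(bool), time=df["months_followed"])

rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=15,
                           n_jobs=-1, random_state=0)
rsf.fit(X, y)

# Permutation importance (drop in concordance when a predictor is shuffled)
# ranks candidate variables, analogous to the variable-importance screen above.
imp = permutation_importance(rsf, X, y, n_repeats=10, random_state=0)
print(pd.Series(imp.importances_mean, index=predictors)
        .sort_values(ascending=False))
```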
Subjects
Databases, Factual; Heart Failure/mortality; Heart Transplantation/mortality; Registries/statistics & numerical data; Tissue and Organ Procurement/methods; Transplant Recipients/statistics & numerical data; Waiting Lists/mortality; Female; Follow-Up Studies; Heart Failure/surgery; Humans; Machine Learning; Male; Middle Aged; Prognosis; Resource Allocation/methods; Risk Factors; Survival Rate; Time Factors
ABSTRACT
BACKGROUND: We observed an apparent increase in the rate of device thrombosis among patients who received the HeartMate II left ventricular assist device, as compared with preapproval clinical-trial results and initial experience. We investigated the occurrence of pump thrombosis and elevated lactate dehydrogenase (LDH) levels, LDH levels presaging thrombosis (and associated hemolysis), and outcomes of different management strategies in a multi-institutional study. METHODS: We obtained data from 837 patients at three institutions, where 895 devices were implanted from 2004 through mid-2013; the mean (±SD) age of the patients was 55±14 years. The primary end point was confirmed pump thrombosis. Secondary end points were confirmed and suspected thrombosis, longitudinal LDH levels, and outcomes after pump thrombosis. RESULTS: A total of 72 pump thromboses were confirmed in 66 patients; an additional 36 thromboses in unique devices were suspected. Starting in approximately March 2011, the occurrence of confirmed pump thrombosis at 3 months after implantation increased from 2.2% (95% confidence interval [CI], 1.5 to 3.4) to 8.4% (95% CI, 5.0 to 13.9) by January 1, 2013. Before March 1, 2011, the median time from implantation to thrombosis was 18.6 months (95% CI, 0.5 to 52.7), and from March 2011 onward, it was 2.7 months (95% CI, 0.0 to 18.6). The occurrence of elevated LDH levels within 3 months after implantation mirrored that of thrombosis. Thrombosis was presaged by LDH levels that more than doubled, from 540 IU per liter to 1490 IU per liter, within the weeks before diagnosis. Thrombosis was managed by heart transplantation in 11 patients (1 patient died 31 days after transplantation) and by pump replacement in 21, with mortality equivalent to that among patients without thrombosis; among 40 thromboses in 40 patients who did not undergo transplantation or pump replacement, actuarial mortality was 48.2% (95% CI, 31.6 to 65.2) in the ensuing 6 months after pump thrombosis. CONCLUSIONS: The rate of pump thrombosis related to the use of the HeartMate II has been increasing at our centers and is associated with substantial morbidity and mortality.
Subjects
Heart-Assist Devices/adverse effects; L-Lactate Dehydrogenase/blood; Thrombosis/etiology; Biomarkers/blood; Follow-Up Studies; Heart Transplantation; Humans; Incidence; Kaplan-Meier Estimate; Medical Audit; Prosthesis Design; Prosthesis Failure; Risk; Statistics, Nonparametric; Thrombosis/epidemiology; Thrombosis/mortality; Thrombosis/therapy
ABSTRACT
BACKGROUND: Recent data suggest that the Berlin Heart EXCOR Pediatric ventricular assist device is superior to extracorporeal membrane oxygenation for bridge to heart transplantation. Published data are limited to 1 in 4 children who received the device as part of the US clinical trial. We analyzed outcomes for all US children who received the EXCOR to characterize device outcomes in an unselected cohort and to identify risk factors for mortality to facilitate patient selection. METHODS AND RESULTS: This multicenter, prospective cohort study involved all children implanted with the Berlin Heart EXCOR Pediatric ventricular assist device at 47 centers from May 2007 through December 2010. Multiphase nonproportional hazards modeling was used to identify risk factors for early (<2 months) and late mortality. Of 204 children supported with the EXCOR, the median duration of support was 40 days (range, 1-435 days). Survival at 12 months was 75%, including 64% who reached transplantation, 6% who recovered, and 5% who were alive on the device. Multivariable analysis identified lower weight, biventricular assist device support, and elevated bilirubin as risk factors for early mortality and bilirubin extremes and renal dysfunction as risk factors for late mortality. Neurological dysfunction occurred in 29% and was the leading cause of death. CONCLUSIONS: Use of the Berlin Heart EXCOR has risen dramatically over the past decade. The EXCOR has emerged as a new treatment standard in the United States for pediatric bridge to transplantation. Three-quarters of children survived to transplantation or recovery; an important fraction experienced neurological dysfunction. Smaller patient size, renal dysfunction, hepatic dysfunction, and biventricular assist device use were associated with mortality, whereas extracorporeal membrane oxygenation before implantation and congenital heart disease were not.
Subjects
Heart Transplantation; Heart-Assist Devices; Body Size; Cause of Death; Child; Child, Preschool; Comorbidity; Compassionate Use Trials; Equipment Design; Extracorporeal Membrane Oxygenation/statistics & numerical data; Female; Heart Defects, Congenital/blood; Heart Defects, Congenital/surgery; Heart Diseases/blood; Heart Diseases/surgery; Heart Transplantation/statistics & numerical data; Hemorrhage/epidemiology; Humans; Hyperbilirubinemia/epidemiology; Infant; Kidney Diseases/epidemiology; Liver Diseases/epidemiology; Male; Mortality; Multiple Organ Failure/epidemiology; Proportional Hazards Models; Risk; Stroke/epidemiology; Survival Rate; Treatment Outcome; Waiting Lists
ABSTRACT
OBJECTIVES: To identify preoperative predictors of postcardiotomy cardiogenic shock in patients with ischemic and nonischemic cardiomyopathy and evaluate trajectory of postoperative ventricular function. METHODS: From January 2017 to January 2020, 238 patients with ejection fraction <30% (206/238) or 30% to 34% with at least moderately severe mitral regurgitation (32/238) underwent conventional cardiac surgery at Cleveland Clinic, 125 with ischemic and 113 with nonischemic cardiomyopathy. Preoperative ejection fraction was 25 ± 4.5%. The primary outcome was postcardiotomy cardiogenic shock, defined as need for microaxial temporary left ventricular assist device, extracorporeal membrane oxygenation, or vasoactive-inotropic score >25. RandomForestSRC was used to identify its predictors. RESULTS: Postcardiotomy cardiogenic shock occurred in 27% (65/238). Pulmonary artery pulsatility index <3.5 and pulmonary capillary wedge pressure >19 mm Hg were the most important factors predictive of postcardiotomy cardiogenic shock in ischemic cardiomyopathy. Cardiac index <2.2 L·min⁻¹·m⁻² and pulmonary capillary wedge pressure >21 mm Hg were the most important predictive factors in nonischemic cardiomyopathy. Operative mortality was 1.7%. Ejection fraction at 12 months after surgery increased to 39% (confidence interval, 35-40%) in the ischemic group and 37% (confidence interval, 35-38%) in the nonischemic cardiomyopathy group. CONCLUSIONS: Predictors of postcardiotomy cardiogenic shock were different in ischemic and nonischemic cardiomyopathy. Right heart dysfunction, indicated by low pulmonary artery pulsatility index, was the most important predictor in ischemic cardiomyopathy, whereas greater degree of cardiac decompensation was the most important in nonischemic cardiomyopathy. Therefore, preoperative right heart catheterization will help identify patients with low ejection fraction who are at greater risk of postcardiotomy cardiogenic shock.
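The abstract describes a random-forest search for preoperative predictors of a binary outcome (postcardiotomy shock). A rough analogue in Python, using a scikit-learn random forest classifier rather than the randomForestSRC implementation named above; the file and column names (papi, pcwp, postcardiotomy_shock, etc.) are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical preoperative table: one row per patient, binary shock outcome.
df = pd.read_csv("low_ef_cohort.csv")                 # assumed file
features = ["papi", "pcwp", "cardiac_index", "ejection_fraction",
            "creatinine", "ischemic"]
X, y = df[features], df["postcardiotomy_shock"]

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y,
                                                    random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Rank predictors by permutation importance on held-out data; in the study,
# pulsatility index and wedge pressure dominated in ischemic cardiomyopathy.
imp = permutation_importance(rf, X_test, y_test, n_repeats=20, random_state=0)
print(pd.Series(imp.importances_mean, index=features).sort_values(ascending=False))
```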
ABSTRACT
OBJECTIVE: The study objective was to determine the effects of donor smoking and substance use on primary graft dysfunction, allograft function, and survival after lung transplant. METHODS: From January 2007 to February 2020, 1366 lung transplants from 1291 donors were performed in 1352 recipients at Cleveland Clinic. Donor smoking and substance use history were extracted from the Uniform Donor Risk Assessment Interview and medical records. End points were post-transplant primary graft dysfunction, longitudinal forced expiratory volume in 1 second (% of predicted), and survival. RESULTS: Among lung transplant recipients, 670 (49%) received an organ from a donor who smoked, 163 (25%) received an organ from a donor with a history of 20 pack-years or more (median, 8 pack-years), and 702 (51%) received an organ from a donor with substance use. There was no association of donor smoking, pack-years, or substance use with primary graft dysfunction (P > .2). Post-transplant forced expiratory volume in 1 second at 1 year was 74% in donor nonsmoker recipients and 70% in donor smoker recipients (P = .0002), a difference confined to double-lung transplant, where forced expiratory volume in 1 second was 77% in donor nonsmoker recipients and 73% in donor smoker recipients. Donor substance use was not associated with allograft function. Donor smoking was associated with lower non-risk-adjusted 5-year survival (54% vs 59%; P = .09), and greater pack-years with slightly worse risk-adjusted long-term survival (P = .01). Donor substance use was not associated with any outcome (P ≥ .8). CONCLUSIONS: Among well-selected organs, lungs from smokers were associated with worse allograft outcomes that were not clinically important, without an inflection point for donor smoking pack-years. Substance use was not associated with worse allograft function. Given the paucity of organs, donor smoking or substance use alone should not preclude assessment for lung donation or transplant.
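The longitudinal allograft-function end point above compares repeated FEV1 measurements between donor-smoking groups. A simplified sketch of such a comparison with a linear mixed-effects model in statsmodels (a stand-in for the study's longitudinal modeling); the long-format file and columns (fev1_pct, years_post_tx, donor_smoker, recipient_id) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: repeated %-predicted FEV1 measurements per
# recipient, with years since transplant and a donor-smoking indicator.
long_df = pd.read_csv("fev1_long.csv")                # assumed file

# Random intercept per recipient; fixed effects for time, donor smoking,
# and their interaction (group difference in FEV1 trajectory).
model = smf.mixedlm("fev1_pct ~ years_post_tx * donor_smoker",
                    data=long_df, groups=long_df["recipient_id"])
result = model.fit()
print(result.summary())
```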
Subjects
Lung Transplantation; Primary Graft Dysfunction; Humans; Retrospective Studies; Smoking/adverse effects; Tissue Donors; Lung Transplantation/adverse effects; Graft Survival
ABSTRACT
OBJECTIVE: To characterize patients with right heart failure undergoing isolated tricuspid valve surgery, focusing on right heart morphology and function. PATIENTS AND METHODS: From January 2007 to January 2014, 62 patients underwent isolated tricuspid valve surgery. Forty-five patients (73%) had undergone previous heart operations. Right heart morphology and function variables were measured de novo from stored echocardiographic images, and clinical and hemodynamic data were extracted from patient registries and records. Cluster analysis was performed and outcomes assessed. RESULTS: On average, the right ventricle was dilated (diastolic area 32 cm²), but its function was preserved (free-wall strain -17% ± 5.8%), and right heart failure manifestations were moderate, with 40 patients (65%) having congested neck veins, 35 (56%) dependent edema, and 15 (24%) ascites. The average Model for End-Stage Liver Disease with sodium (MELD-Na) score was 11 ± 4.4, but individual values varied widely. Tricuspid valve variables split patients into 2 equal clusters: those with functional tricuspid regurgitation (TR) and those with structural TR. These groups had similar right ventricular function, but the functional TR group had worse right ventricular morphology and more severe manifestations of right heart failure, including higher MELD-Na scores (12 ± 4.4 vs 9.1 ± 3.9; P = .008). Both groups survived operation with low morbidity, but patients with functional TR had worse long-term survival: 48% versus 73% at 10 years after surgery. CONCLUSIONS: Cluster analysis of patients with right heart failure undergoing isolated tricuspid valve surgery separated functional from structural tricuspid valve disease. Good early outcomes suggest expanding criteria for tricuspid valve surgery and earlier intervention for functional TR with right heart failure.
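The two-cluster split described above can be illustrated with a standardized k-means clustering of tricuspid and right-heart variables. This is only a sketch: the abstract does not specify the clustering algorithm or feature set, and the column names (tr_severity, rv_free_wall_strain, meld_na, etc.) are hypothetical.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical echocardiographic/hemodynamic table, one row per patient.
echo = pd.read_csv("tricuspid_echo.csv")              # assumed file
features = ["tr_severity", "annulus_diameter", "leaflet_tethering",
            "rv_diastolic_area", "rv_free_wall_strain"]

Z = StandardScaler().fit_transform(echo[features])
echo["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)

# Compare right-heart morphology and MELD-Na between the two clusters,
# mirroring the functional-vs-structural TR split described above.
print(echo.groupby("cluster")[["rv_diastolic_area", "meld_na"]].mean())
```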
Subjects
End Stage Liver Disease; Heart Failure; Heart Valve Prosthesis Implantation; Tricuspid Valve Insufficiency; Humans; Tricuspid Valve/diagnostic imaging; Tricuspid Valve/surgery; End Stage Liver Disease/surgery; Patient Selection; Treatment Outcome; Severity of Illness Index; Tricuspid Valve Insufficiency/diagnostic imaging; Tricuspid Valve Insufficiency/surgery; Heart Failure/diagnostic imaging; Heart Failure/etiology; Heart Failure/surgery; Sodium; Retrospective Studies
ABSTRACT
Objective: Post-Norwood mortality remains high and unpredictable. Current models for mortality do not incorporate interstage events. We sought to determine the association of time-related interstage events, along with (pre)operative characteristics, with death post-Norwood and subsequently predict individual mortality. Methods: From the Congenital Heart Surgeons' Society Critical Left Heart Obstruction cohort, 360 neonates underwent Norwood operations from 2005 to 2016. Risk of death post-Norwood was modeled using a novel application of parametric hazard analysis, in which baseline and operative characteristics and time-related adverse events, procedures, and repeated weight and arterial oxygen saturation measurements were considered. Individual predicted mortality trajectories that dynamically update (increase or decrease) over time were derived and plotted. Results: After the Norwood, 282 patients (78%) progressed to stage 2 palliation, 60 patients (17%) died, 5 patients (1%) underwent heart transplantation, and 13 patients (4%) were alive without transitioning to another end point. In total, 3052 postoperative events occurred and 963 measures of weight and oxygen saturation were obtained. Risk factors for death included resuscitated cardiac arrest, moderate or greater atrioventricular valve regurgitation, intracranial hemorrhage/stroke, sepsis, lower longitudinal oxygen saturation, readmission, smaller baseline aortic diameter, smaller baseline mitral valve z-score, and lower longitudinal weight. Each patient's predicted mortality trajectory varied as risk factors occurred over time. Groups with qualitatively similar mortality trajectories were noted. Conclusions: Risk of death post-Norwood is dynamic and most frequently associated with time-related postoperative events and measures, rather than baseline characteristics. Dynamic predicted mortality trajectories for individuals and their visualization represent a paradigm shift from population-derived insights to precision medicine at the patient level.
ABSTRACT
OBJECTIVE: Left ventricular assist device (LVAD) candidates require a psychosocial assessment to determine candidacy, despite limited data correlating it with outcome. Our objective was to determine whether the Stanford Integrated Psychosocial Assessment for Transplant (SIPAT), a tool validated for transplant and widely used by LVAD programs, predicts hospital readmissions and death after LVAD implantation. METHODS: We performed a retrospective analysis of adults at the Cleveland Clinic with SIPAT scores before primary LVAD implantation from April 1, 2013, to December 31, 2018. The primary outcome was unplanned hospital readmissions, censored at death, transplantation, and transfer of care. The secondary outcome was death. RESULTS: There were 263 LVAD patients with a median (Q1, Q3) SIPAT score of 16 (8, 28). During a median follow-up of 1.2 years, 56 died, 65 underwent transplantation, and 21 transferred care. There were 640 unplanned hospital readmissions among 250 patients with at least 1 outpatient visit at our center. In multivariable analysis, individual SIPAT components, but not the total SIPAT score, were associated with readmissions. Psychopathology (SIPAT C-IX) was associated with hemocompatibility-related (coefficient 0.21 ± standard error 0.11, P = .040) and cardiac (0.15 ± 0.065, P = .02) readmissions. Patient readiness was associated with noncardiac (SIPAT A-III, 0.24 ± 0.099, P = .016) and cardiac (SIPAT A-low total, 0.037 ± 0.014, P = .007) readmissions. Poor living environment (SIPAT B-VIII) was associated with device-related readmissions (0.83 ± 0.34, P = .014). Death was associated with organic psychopathology or neurocognitive impairment (SIPAT C-X, 0.59 ± 0.21, P = .006). CONCLUSIONS: The total SIPAT score was not associated with readmission or mortality after LVAD implantation. However, certain SIPAT components were associated with outcome and could be used to create an LVAD-specific psychosocial tool.
Subjects
Heart Failure; Heart Transplantation; Heart-Assist Devices; Adult; Humans; Heart Failure/diagnosis; Heart Failure/surgery; Retrospective Studies; Treatment Outcome
ABSTRACT
OBJECTIVES: Gastroparesis is a debilitating and difficult-to-manage problem that has been reported in 20% to 90% of lung and heart-lung transplant recipients. The primary objective was to evaluate the safety and clinical effectiveness of per-oral endoscopic pyloromyotomy in relieving gastroparesis after lung transplant. Secondary objectives evaluated the effect of per-oral endoscopic pyloromyotomy on gastroesophageal reflux and allograft function. METHODS: Fifty-two lung transplant recipients underwent per-oral endoscopic pyloromyotomy for refractory gastroparesis. Gastroparesis was assessed by radionuclide gastric emptying testing and the Gastroparesis Cardinal Symptom Index before and after per-oral endoscopic pyloromyotomy. Secondary outcomes included 90-day complications, gastroesophageal reflux as measured by pH testing, and longitudinal spirometry measurements. RESULTS: Median time from lung transplant to per-oral endoscopic pyloromyotomy was 10.5 months. Twenty-eight patients had prior pyloric botulinum injection with either no improvement or relapse of symptoms. Postprocedure gastric emptying tests were available for 32 patients and showed a decrease in median gastric retention at 4 hours from 63.5% before per-oral endoscopic pyloromyotomy to 5.5% after (P < .0001). Complete normalization of gastric emptying time was noted in 19 patients. The Gastroparesis Cardinal Symptom Index score improved significantly after per-oral endoscopic pyloromyotomy (median, 23 to 3.5; P < .0001). pH testing after per-oral endoscopic pyloromyotomy showed an improved or stable DeMeester score in all patients except 1. Graft function (forced expiratory volume in 1 second) remained stable 1 year after per-oral endoscopic pyloromyotomy. CONCLUSIONS: The improvements in symptom score and radionuclide imaging observed in this uncontrolled study suggest that per-oral endoscopic pyloromyotomy is an effective strategy in the lung transplant population and can be performed with minimal morbidity.
Subjects
Gastroesophageal Reflux; Gastroparesis; Lung Transplantation; Pyloromyotomy; Gastroesophageal Reflux/complications; Gastroparesis/diagnostic imaging; Gastroparesis/etiology; Gastroparesis/surgery; Humans; Lung Transplantation/adverse effects; Neoplasm Recurrence, Local; Pyloromyotomy/adverse effects; Treatment Outcome
ABSTRACT
Introduction: Patients with end-stage liver disease (ESLD) have a hyperdynamic state due to decreased systemic vascular resistance and increased cardiac output. Preoperative evaluation with dobutamine stress echocardiography (DSE) is used to risk-stratify patients prior to liver transplant. We sought to identify the impact of inducible left ventricular outflow tract obstruction (LVOTO) on DSE on post-transplant outcomes. Methods: Patients with ESLD who underwent liver transplant at Cleveland Clinic between January 2007 and August 2016 were identified. Pre-operative DSE data and post-operative intensive care unit (ICU) data were extracted. Patients with inducible LVOTO were compared with those without LVOTO. Results: Of the 515 patients identified who underwent DSE prior to liver transplant, 165 (30%) were female and 95 (18%) had inducible LVOTO. There were no major differences in baseline characteristics between the two groups. In the LVOTO group, resting gradients were 10.8 ± 3 mm Hg and peak gradients were 90 ± 48.2 mm Hg. No significant differences in ICU length of stay or duration of mechanical ventilation were noted between the groups. There were 21 deaths at 30 days: 2 (2.1%) in the LVOTO group versus 19 (4.5%) in the non-LVOTO group (p = 0.28). Higher Model for End-Stage Liver Disease (MELD) scores predicted longer duration of mechanical ventilation and ICU length of stay. Conclusion: Inducible LVOTO on DSE does not adversely affect short-term outcomes after liver transplant. The presence of inducible LVOTO should not be the sole reason to deny liver transplant in patients with ESLD.
ABSTRACT
BACKGROUND: Lung transplantation (LTx) is a definitive treatment for end-stage lung disease. Herein, we reviewed our center's experience over 3 decades to examine the evolution of recipient characteristics and contemporary predictors of survival for LTx. METHODS: We retrospectively reviewed the data of LTx procedures performed at our institution from January 1990 to January 2019 (n = 1819). The cohort was divided into 3 eras; I: 1990-1998 (n = 152), II: 1999-2008 (n = 521), and III: 2009-2018 (n = 1146). Univariate and multivariate analyses of survival in era III were performed. RESULTS: Pulmonary fibrosis has become the leading indication for LTx (13% in era I, 57% in era III). Median recipient age increased (era I: 46 years; era III: 61 years), as did use of intraoperative mechanical circulatory support (era I: 0%; era III: 6%). Higher lung allocation score was associated with primary graft dysfunction (P < 0.0001), postoperative extracorporeal mechanical support (P < 0.0001), and in-hospital mortality (P = 0.002). In era III, hypoalbuminemia, thrombocytopenia, and high primary graft dysfunction grade were multivariate predictors of early mortality. The 5-year survival in eras II (55%) and III (55%) was superior to that in era I (40%, P < 0.001). Risk factors for late mortality in era III included recipient age, chronic allograft dysfunction, renal dysfunction, high model for end-stage liver disease score, and single LTx. CONCLUSIONS: In this longitudinal single-center study, recipient characteristics have evolved to include sicker patients with greater complexity of procedures and risk for postoperative complications, but without significant impact on hospital mortality or long-term survival. With advancing surgical techniques and perioperative management, there is room for further progress in the field.
Subjects
End Stage Liver Disease; Lung Transplantation; End Stage Liver Disease/etiology; Humans; Lung Transplantation/adverse effects; Postoperative Complications/etiology; Retrospective Studies; Severity of Illness Index
ABSTRACT
BACKGROUND: Despite advances in lung transplantation, 5-year survival remains at 56%. Although the focus has been on chronic lung allograft dysfunction and infection, pleural complications in some may contribute to adverse outcomes. Therefore, we determined (1) the prevalence of, and risk factors for, pleural complications after lung transplantation and (2) their association with allograft function and mortality. METHODS: From 2006 to 2017, 1039 adults underwent primary lung transplantation at Cleveland Clinic in Cleveland, Ohio. Multivariable analyses were performed in the multiphase mixed longitudinal and hazard function domains to identify risk factors associated with allograft function and survival. RESULTS: A total of 468 patients (45%) had pleural complications, including pleural effusion in 271 (26%), pneumothorax in 152 (15%), hemothorax in 128 (12%), empyema in 47 (5%), and chylothorax in 9 (1%). Risk factors for pleural complications within the first 3 months included higher recipient-to-donor weight ratio, lower recipient albumin, and recipient-to-donor race mismatch; risk factors extending beyond 3 months included older age, hypertension, smoking history, lower lung allocation score, and donor death from anoxia. Cardiopulmonary bypass and previous thoracic interventions were not risk factors in patients with pleural effusions who were treated with thoracentesis only, and forced expiratory volume in 1 second improved after drainage; however, repeat percutaneous or surgical interventions did not impart a similar benefit. Pleural complications were associated with worse survival. CONCLUSIONS: Pleural complications are common after lung transplantation and are associated with worse allograft function and survival. These complications are likely secondary to other underlying clinical problems. Malnourishment and size mismatch are modifiable risk factors.
Subjects
Lung Transplantation/adverse effects; Pleural Diseases/etiology; Postoperative Complications; Female; Follow-Up Studies; Humans; Incidence; Male; Middle Aged; Ohio/epidemiology; Pleural Diseases/epidemiology; Pleural Diseases/surgery; Retrospective Studies; Survival Rate/trends; Thoracentesis/methods
ABSTRACT
BACKGROUND: Pleural complications after lung transplant may restrict allograft expansion, requiring decortication. However, its extent, indications, risk factors, and effect on allograft function and survival are unclear. METHODS: From January 2006 to January 2017, 1,039 patients underwent primary lung transplant and 468 had pleural complications, 77 (16%) of whom underwent 84 surgical decortications for pleural space management. Multivariable time-related analysis was performed to identify risk factors for decortication. Mixed-effect longitudinal modeling was used to assess allograft function before and after decortication. RESULTS: Cumulative number of decortications per 100 transplants was 1.8, 7.8, and 8.8 at 1 month, 1 year, and 3 years after transplant, respectively. Indications for the 84 decortications were complex effusion in 47 (56%), fibrothorax in 17 (20%), empyema in 11 (13%), and hemothorax in 9 (11%). Thoracoscopic operations were performed in 52 (62%) and full lung re-expansion was achieved in 76 (90%). Complications occurred after 30 (36%) decortications, with 15 pulmonary complications (18%), including 2 patients requiring extracorporeal support due to worsening function. Ten reinterventions occurred via thoracentesis (2), tube thoracostomy (1), and reoperation (7). In-hospital and 30-day mortality was 5.2% (n = 4/77). Forced expiratory volume in 1 second increased from 50% to 60% within the first year after decortication, followed by a slow decline to 55% at 5 years. Postdecortication survival was 87%, 68%, and 48% at 1, 3, and 5 years, respectively. CONCLUSIONS: Despite high risk of reoperative surgery, decortication after lung transplant allows salvage of pleural space and graft function with a reasonable morbidity profile.
Subjects
Lung Transplantation/adverse effects; Pleura/surgery; Pleural Diseases/epidemiology; Postoperative Complications/epidemiology; Thoracotomy/methods; Female; Follow-Up Studies; Humans; Incidence; Male; Middle Aged; Ohio/epidemiology; Pleural Diseases/surgery; Postoperative Complications/surgery; Reoperation; Retrospective Studies; Risk Factors; Time Factors
ABSTRACT
BACKGROUND: Age has been implicated as a factor in the plateau of long-term survival after lung transplant. METHODS: We used data from the Scientific Registry of Transplant Recipients to identify all lung transplant recipients aged ≥18 years between January 1, 2006, and February 19, 2015. A total of 14,253 patients were included in the analysis. Survival was estimated using a nonproportional hazard model, and random survival forest methodology was used to examine risk factors for death. Final selection of model variables was performed using bootstrap aggregation. Age was analyzed as both a continuous and a categorical variable (age <30, 30-55, and >55 years). Risk factors for death were obtained for the entire cohort, and additional age-specific risk factors were identified for each age category. RESULTS: The median age at transplant was 59 years. There were 1,098 (7.7%) recipients <30 years, 4,201 (29.5%) 30 to 55 years, and 8,954 (62.8%) >55 years of age. Age was the most significant risk factor for death at all time points following transplant, and its impact became more prominent as time from transplant increased. Risk factors for death for all patients included extremes of age, higher creatinine, single lung transplant, hospitalization before transplant, and increased bilirubin. Risk factors for death differed by age, with social determinants of health disproportionately affecting survival for those in the youngest age category. CONCLUSIONS: The youngest and oldest adult recipients experienced the lowest posttransplant survival through divergent pathways that may present opportunities for intervention to improve survival after lung transplant.
Subjects
Age Factors; Lung Diseases; Lung Transplantation; Transplant Recipients/statistics & numerical data; Adult; Female; Humans; Lung Diseases/epidemiology; Lung Diseases/surgery; Lung Transplantation/methods; Lung Transplantation/mortality; Lung Transplantation/statistics & numerical data; Male; Middle Aged; Proportional Hazards Models; Registries/statistics & numerical data; Risk Assessment/methods; Risk Assessment/statistics & numerical data; Risk Factors; Social Determinants of Health/statistics & numerical data; Survival Rate; Tissue and Organ Procurement/methods; Tissue and Organ Procurement/statistics & numerical data; United States/epidemiology
ABSTRACT
OBJECTIVES: To (1) determine outcomes after urgent listing compared with elective listing for lung transplant and (2) compare in-hospital morbidity and mortality, survival, and allograft function in these 2 groups. METHODS: From January 2006 to September 2017, 201 patients were urgently and 1423 electively listed. Among urgently listed patients, 130 subsequently underwent primary lung transplant, as did 995 electively listed patients. Competing-risks analysis for death and transplant after listing and weighted balancing score matching (76 pairs) were used to compare in-hospital morbidity and survival. Mixed-effect longitudinal modeling was used to compare allograft function to 8 years post-transplant. RESULTS: At 1 month after listing, mortality among urgently listed patients was 26% and 58% had been transplanted. Risk factors for death included older age, higher bilirubin, and transfer from an outside hospital. At transplantation, urgently listed patients were younger (53 ± 13 vs 55 ± 12 years), more often on ventilator and extracorporeal membrane oxygenation support (32 [25%] vs 20 [2.0%]), more often had restrictive lung disease (95 [73%] vs 509 [51%]), and had a higher lung allocation score (82 ± 13 vs 47 ± 17). In-hospital morbidity and mortality, time-related survival, and longitudinal allograft function were similar between matched groups. CONCLUSIONS: Urgent listing more often than not leads to transplantation. Although urgently listed patients are sicker overall, after transplant their perioperative morbidity and mortality, overall survival, and allograft function are similar to those of electively listed patients. Appropriate patient selection and aggressive supportive care allow urgently listed lung transplant patients to achieve these similar post-transplant outcomes.
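The competing-risks analysis of death versus transplant after listing can be sketched with an Aalen-Johansen cumulative incidence estimator, used here as a simplified analogue of the study's method; the file, event coding, and column names are hypothetical.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Hypothetical listing table: months from listing and an event code
# (0 = censored, 1 = died on the list, 2 = transplanted), plus an urgency flag.
listing = pd.read_csv("listing.csv")                  # assumed file
urgent = listing[listing["urgent"] == 1]

# Cumulative incidence of death, treating transplant as a competing event.
ajf_death = AalenJohansenFitter()
ajf_death.fit(urgent["months"], urgent["event"], event_of_interest=1)

# Cumulative incidence of transplant, treating death as a competing event.
ajf_tx = AalenJohansenFitter()
ajf_tx.fit(urgent["months"], urgent["event"], event_of_interest=2)

# Early values correspond to the 1-month death and transplant proportions
# reported above for urgently listed patients.
print(ajf_death.cumulative_density_.head())
print(ajf_tx.cumulative_density_.head())
```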
ABSTRACT
OBJECTIVES: This study aims to understand the complex factors affecting heart transplant survival and to determine the importance of possible sex-specific risk factors. BACKGROUND: Heart transplant allocation is primarily focused on preventing waitlist mortality. To prevent organ wastage, future allocation must balance risk of waitlist mortality with post-transplantation mortality. However, more information regarding risk factors after heart transplantation is needed. METHODS: We included all adults (30,606) in the Scientific Registry of Transplant Recipients database who underwent isolated heart transplantation from January 1, 2004, to July 1, 2018. Mortality (8,278 deaths) was verified with the complete Social Security Death Index, with a median follow-up of 3.9 years. Temporal decomposition was used to identify phases of survival and phase-specific risk factors. The random survival forests method was used to determine the importance of mortality risk factors and their interactions. RESULTS: We identified 3 phases of mortality risk: early post-transplantation, constant, and late. Sex was not a significant risk factor. There were several interactions predicting early mortality, such as pretransplantation mechanical ventilation combined with markers of end-organ dysfunction (bilirubin, renal function), and interactions predicting later mortality, such as diabetes and older age (donor and recipient). More complex interactions predicting early, mid, and late mortality existed and were identified with machine learning (i.e., elevated bilirubin, mechanical ventilation, and dialysis). CONCLUSIONS: Post-heart transplant mortality risk is complex and dynamic, changing with time and events. Sex is not an important mortality risk factor. To prevent organ wastage, end-organ dysfunction should be resolved before transplantation as much as possible.
Subjects
Heart Failure/surgery; Heart Transplantation/mortality; Registries; Tissue Donors; Tissue and Organ Procurement/methods; Adult; Age Factors; Female; Follow-Up Studies; Graft Survival; Heart Failure/mortality; Humans; Male; Middle Aged; Retrospective Studies; Risk Factors; Sex Factors; Survival Rate/trends; Time Factors; United States/epidemiology; Waiting Lists/mortality
ABSTRACT
BACKGROUND: The aim of this study was to examine the associations between 16 specific single nucleotide polymorphisms (SNPs) in 8 obesity-related genes and overall and cause-specific mortality. We also examined the associations between the SNPs and body mass index (BMI) and change in BMI over time. METHODS: Data were analyzed from 9,919 individuals who participated in two large community-based cohort studies conducted in Washington County, Maryland, in 1974 (CLUE I) and 1989 (CLUE II). DNA from blood collected in 1989 was genotyped for 16 SNPs in 8 obesity-related genes: monoamine oxidase A (MAOA), lipoprotein lipase (LPL), paraoxonase 1 and 2 (PON1 and PON2), leptin receptor (LEPR), tumor necrosis factor-alpha (TNFalpha), and peroxisome proliferator-activated receptor-gamma and -delta (PPARG and PPARD). Data on height and weight in 1989 (CLUE II baseline) and at age 21 were collected from participants at the time of blood collection. All participants were followed from 1989 to the date of death or the end of follow-up in 2005. Cox proportional hazards regression was used to obtain relative risk (RR) estimates and 95% confidence intervals (CI) for each SNP and the mortality outcomes. RESULTS: The results showed no consistent patterns of association between the selected SNPs and all-cause or cause-specific mortality, although statistically significant associations (p < 0.05) were observed between PPARG rs4684847 and all-cause mortality (CC: reference; CT: RR 0.99, 95% CI 0.89, 1.11; TT: RR 0.60, 95% CI 0.39, 0.93) and cancer-related mortality (CC: reference; CT: RR 1.01, 95% CI 0.82, 1.25; TT: RR 0.22, 95% CI 0.06, 0.90), and between TNFalpha rs1799964 and cancer-related mortality (TT: reference; CT: RR 1.23, 95% CI 1.03, 1.47; CC: RR 0.83, 95% CI 0.54, 1.28). Additional analyses showed significant associations of SNPs in LEPR with BMI (rs1137101) and with change in BMI over time (rs1045895 and rs1137101). CONCLUSION: Findings from this cohort study suggest that the selected SNPs are not associated with overall or cause-specific death, although several LEPR SNPs may be related to BMI and BMI change over time.
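A minimal sketch of the Cox proportional hazards analysis described above for one SNP, using lifelines; the genotype indicator coding (CC as reference), the adjustment covariates, and all file and column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis table: follow-up years, death indicator, genotype at
# PPARG rs4684847 coded as indicator columns (CC = reference), plus age and sex.
snp_df = pd.read_csv("clue_cohort.csv")               # assumed file
cols = ["years_followup", "died", "rs4684847_CT", "rs4684847_TT", "age", "sex"]

cph = CoxPHFitter()
cph.fit(snp_df[cols], duration_col="years_followup", event_col="died")
cph.print_summary()   # hazard ratios play the role of the RRs reported above
```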
Subjects
Genetic Predisposition to Disease; Obesity/genetics; Obesity/mortality; Polymorphism, Single Nucleotide; Adult; Aged; Body Mass Index; Female; Genetic Markers; Humans; Male; Middle Aged; Prospective Studies
ABSTRACT
AIMS: The risk of HeartMate II (HMII) left ventricular assist device (LVAD) thrombosis has been reported, and serum lactate dehydrogenase (LDH), a biomarker of haemolysis, increases secondary to LVAD thrombosis. This study evaluated longitudinal measurements of LDH after LVAD implantation, hypothesizing that LDH trends could provide timely prediction of future LVAD thrombosis. METHODS AND RESULTS: From October 2004 to October 2014, 350 HMIIs were implanted in 323 patients at Cleveland Clinic. Of these, 339 HMIIs had at least one post-implant LDH value (7996 total measurements). A two-step joint model combining longitudinal biomarker data and pump thrombosis events was generated to assess the effect of changing LDH on thrombosis risk. Device-specific LDH trends were first smoothed using multivariate boosted trees, and then used as a time-varying covariate function in a multiphase hazard model to analyse time to thrombosis. Pre-implant variables associated with time-varying post-implant LDH values were also investigated using boostmtree. Standardized variable importance for each variable was estimated as the difference between the model-based prediction error of LDH when the variable was randomly permuted and the prediction error without permuting the values; the larger this difference, the more important a variable is for predicting the trajectory of post-implant LDH. Thirty-five HMIIs (10%) had either confirmed (18) or suspected (17) thrombosis, with 15 (43%) occurring within 3 months of implant. LDH was associated with thrombosis occurring both early and late after implant (P < 0.0001 for both hazard phases). The model demonstrated an increased probability of HMII thrombosis as LDH trended upward, with steep changes in LDH trajectory paralleling trajectories in the probability of pump thrombosis. The most important baseline variables predictive of the longitudinal pattern of LDH were higher bilirubin, higher pre-implant LDH, and older age. The effect of some pre-implant variables, such as sodium, on the post-implant LDH longitudinal pattern differed across time. CONCLUSIONS: Longitudinal trends in surveillance LDH for patients on HMII support are useful for dynamic prediction of pump thrombosis, both early and late after implant. Incorporating upward and downward trends in LDH that dynamically update a model of LVAD thrombosis risk provides a useful tool for clinical management and decisions.
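The abstract describes a two-step joint model (boosted-tree smoothing of LDH, then a multiphase hazard model with LDH as a time-varying covariate). The sketch below illustrates only the second step with a standard time-varying Cox model in lifelines, as a simplified stand-in for the multiphase hazard approach; the long-format file and column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format table: one row per device per LDH-measurement
# interval, with start/stop days since implant, the smoothed LDH value over
# that interval, and an event flag for pump thrombosis at interval end.
# Columns: device_id, start, stop, ldh_smoothed, thrombosis
ldh_long = pd.read_csv("ldh_intervals.csv")           # assumed file

ctv = CoxTimeVaryingFitter()
ctv.fit(ldh_long, id_col="device_id", event_col="thrombosis",
        start_col="start", stop_col="stop")
ctv.print_summary()   # rising smoothed LDH should carry a hazard ratio > 1
```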
Subjects
Heart Valve Diseases/surgery; Heart-Assist Devices/adverse effects; L-Lactate Dehydrogenase/blood; Myocardial Ischemia/surgery; Thrombosis/etiology; Adult; Aged; Bilirubin/blood; Biomarkers/blood; Case-Control Studies; Female; Heart Valve Diseases/ethnology; Heart-Assist Devices/statistics & numerical data; Hemolysis/physiology; Humans; Iatrogenic Disease/epidemiology; Iatrogenic Disease/prevention & control; Intention to Treat Analysis/trends; Male; Middle Aged; Myocardial Ischemia/ethnology; Predictive Value of Tests; Retrospective Studies
ABSTRACT
BACKGROUND: Few studies of reintervention after Heller myotomy for achalasia set patients' expectations, assist therapeutic decision making, and direct follow-up. Therefore, we investigated the frequency and type of symptoms and reinterventions after myotomy based on achalasia type. METHODS: From January 2006 to March 2013, 248 patients who had preoperative high-resolution manometry and a timed barium esophagram (TBE) underwent Heller myotomy: 62 (25%) for type I, 162 (65%) for type II, and 24 (10%) for type III achalasia. Postoperative surveillance, including TBE, was performed at 8 weeks and then annually. Median follow-up was 36 months. End points were all symptom types and modes of reintervention, endoscopic or surgical. Reintervention was based on both symptoms and objective TBE measurements. RESULTS: Eventually, most patients (169 of 218; 69%) experienced at least one symptom after myotomy. Fifty patients underwent 85 reinterventions: 41 endoscopic only, 4 surgical only, and 5 both. Five-year freedom from reintervention was 62% for type I, 74% for type II, and 87% for type III, with most reinterventions occurring within 6 months, although later in type III. At 5 years, the number of reinterventions per 100 patients was 72 for type I, 51 for type II, and 13 for type III. After each reintervention, there was approximately a 50% chance of another within 2 years. CONCLUSIONS: Patients undergoing Heller myotomy for achalasia must expect that symptoms will only be palliated, and patients with worse esophageal function (achalasia type I) may require one or more postoperative reinterventions. Thus, we recommend that patients with achalasia have lifelong annual surveillance after Heller myotomy that includes TBE.
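The freedom-from-reintervention estimates by achalasia type above can be illustrated with a simple Kaplan-Meier analysis stratified by manometric type; this is only a sketch, and the follow-up file and column names (months, reintervened, achalasia_type) are hypothetical.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up table: months from myotomy to first reintervention
# or censoring, a reintervention flag, and the achalasia type (I, II, III).
fu = pd.read_csv("heller_followup.csv")               # assumed file

kmf = KaplanMeierFitter()
for ach_type, grp in fu.groupby("achalasia_type"):
    kmf.fit(grp["months"], event_observed=grp["reintervened"],
            label=f"type {ach_type}")
    # "Survival" here is freedom from reintervention; compare with the
    # 62%/74%/87% 5-year figures reported above.
    print(ach_type, kmf.predict(60))
```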