1.
Article in English | MEDLINE | ID: mdl-38797580

ABSTRACT

BACKGROUND: Outflow graft obstruction (OGO) is an uncommon yet life-threatening complication in patients with left ventricular assist devices (LVADs). In this retrospective, single-center case series, we describe the baseline demographics and presenting features of patients who developed LVAD OGO, along with the procedural details and outcomes of percutaneous endovascular intervention (PEI). METHODS: We conducted a retrospective review of patients with LVADs at our institution between January 2010 and February 2023 who developed OGO and were treated with PEI. Details of the PEI, including procedure time, fluoroscopy time, contrast use, stent size, number of stents, change in gradient, and change in flow after intervention, were collected. RESULTS: A total of 12 patients with 14 cases of OGO were identified from January 2010 to February 2023. The average age at presentation was 64.78 years. Nine of the 14 cases occurred in male patients. Eleven of the 14 cases occurred with HeartWare devices (including 2 recurrences), 2 with HeartMate II devices, and 1 with a HeartMate 3. Notable procedural details include a mean procedure time of 90.86 min and mean contrast use of 162.5 mL. The initial gradient across the OGO was reduced by an average of 72%, to a mean post-PEI gradient of 11.57 mmHg. The average number of stents used to achieve this reduction was 2.08, and the most common stent diameter was 10 mm. Thirty-day mortality after PEI was 7% (1/14) in this high-risk patient population. CONCLUSION: In our single-center experience, PEI can be a safe and effective treatment for LVAD OGO.
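The 72% figure above is a per-case percent reduction in the outflow graft gradient, averaged across cases. A minimal sketch of that calculation, using hypothetical pre- and post-PEI gradients rather than the series data:

```python
# Hypothetical per-case outflow graft gradients (mmHg) before and after PEI;
# values are illustrative only, not the published series.
cases = [
    {"pre": 45.0, "post": 12.0},
    {"pre": 38.0, "post": 10.0},
    {"pre": 52.0, "post": 13.0},
]

# Percent reduction per case, then averaged across cases.
reductions = [(c["pre"] - c["post"]) / c["pre"] * 100 for c in cases]
mean_reduction = sum(reductions) / len(reductions)
mean_post_gradient = sum(c["post"] for c in cases) / len(cases)

print(f"Mean gradient reduction: {mean_reduction:.0f}%")
print(f"Mean post-PEI gradient: {mean_post_gradient:.2f} mmHg")
```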

3.
Ann Palliat Med ; 13(3): 598-606, 2024 May.
Article in English | MEDLINE | ID: mdl-38462933

ABSTRACT

BACKGROUND AND OBJECTIVE: Left ventricular assist devices (LVADs) have revolutionized the care of patients with advanced heart failure (HF). Compared to guideline-directed medical and device therapies, LVAD technology improves quality of life and reduces mortality. Palliative care specialists have an important role to play in the pre-LVAD evaluation phase, in the post-operative longitudinal care phase, and at the end of life in patients with LVADs. The objective of this narrative review is to describe the evidence regarding the role of palliative care for patients with LVADs across the care continuum: pre-implantation, post-implantation, and at the end of life. METHODS: Clinical trials relevant to the care of patients with HF, LVADs, and the role of palliative care were analyzed for this narrative review. KEY CONTENT AND FINDINGS: Palliative care involvement in 'preparedness planning' has been described in the literature, though no standardized protocol for preparedness planning exists to date. In the longitudinal care phase after LVAD implantation, the role of palliative care is less well defined; depending on institutional culture and the availability of palliative care, patients may be referred based on symptom-management needs or for advance care planning (ACP). At the end of life, whether due to an acute event or a gradually worsening condition, palliative care is often engaged to participate in discussions regarding treatment preferences and to consider transitions from disease-directed treatments to comfort-focused treatments. Given the medical complexity of dying with an LVAD, most patients with an LVAD die in the hospital with support from palliative care teams for the physical, existential, and psychosocial distress that accompanies the end of life and LVAD deactivation. CONCLUSIONS: In this narrative review, we describe the integral role of palliative care throughout the care continuum for patients living with LVADs and suggest opportunities for further research.


Subject(s)
Heart Failure , Heart-Assist Devices , Palliative Care , Humans , Palliative Care/methods , Heart Failure/therapy , Quality of Life , Terminal Care , Advance Care Planning
5.
J Card Fail ; 2023 Oct 29.
Article in English | MEDLINE | ID: mdl-37907147

ABSTRACT

BACKGROUND: Transplantation of hearts from hepatitis C virus (HCV)-positive donors has increased substantially in recent years following development of highly effective direct-acting antiviral therapies for treatment and cure of HCV. Although historical data from the pre-direct-acting antiviral era demonstrated an association between HCV-positive donors and accelerated cardiac allograft vasculopathy (CAV) in recipients, the relationship between the use of HCV nucleic acid test-positive (NAT+) donors and the development of CAV in the direct-acting antiviral era remains unclear. METHODS AND RESULTS: We performed a retrospective, single-center observational study comparing coronary angiographic CAV outcomes during the first year after transplant in 84 heart transplant recipients of HCV NAT+ donors and 231 recipients of HCV NAT- donors. Additionally, in a subsample of 149 patients (including 55 in the NAT+ cohort and 94 in the NAT- cohort) who had serial adjunctive intravascular ultrasound examinations performed, we compared development of rapidly progressive CAV, defined as an increase in maximal intimal thickening of ≥0.5 mm in matched vessel segments during the first year post-transplant. In an unadjusted analysis, recipients of HCV NAT+ hearts had reduced survival free of CAV grade 1 or greater over the first year after heart transplant compared with recipients of HCV NAT- hearts. After adjustment for known CAV risk factors, however, there was no significant difference between cohorts in the likelihood of the primary outcome, nor was there a difference in development of rapidly progressive CAV. CONCLUSIONS: These findings support larger, longer-term follow-up studies to better elucidate CAV outcomes in recipients of HCV NAT+ hearts and to inform post-transplant management strategies.
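The rapidly progressive CAV endpoint is defined directly from paired IVUS measurements in matched vessel segments. A small sketch of how that per-patient flag could be derived, with hypothetical measurements and assumed column names (not the study dataset):

```python
import pandas as pd

# Hypothetical paired IVUS measurements (mm) in matched vessel segments at
# baseline and 1 year post-transplant; all column names and values are assumptions.
ivus = pd.DataFrame({
    "patient_id":      [1, 1, 2, 2, 3],
    "segment":         ["LAD-prox", "LAD-mid", "LAD-prox", "LCx-prox", "LAD-prox"],
    "mit_baseline_mm": [0.30, 0.25, 0.40, 0.35, 0.20],
    "mit_year1_mm":    [0.45, 0.90, 0.55, 0.95, 0.30],
})

# Rapidly progressive CAV: >=0.5 mm increase in maximal intimal thickening
# in any matched segment during the first year.
ivus["delta_mm"] = ivus["mit_year1_mm"] - ivus["mit_baseline_mm"]
rapid_cav = ivus.groupby("patient_id")["delta_mm"].max() >= 0.5
print(rapid_cav)  # patient_id -> True/False
```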

6.
ASAIO J ; 69(8): 766-773, 2023 08 01.
Article in English | MEDLINE | ID: mdl-37145800

ABSTRACT

Refractory right ventricular failure carries significant morbidity and mortality. Extracorporeal membrane oxygenation is indicated when medical interventions are deemed ineffective, but it remains unclear whether one cannulation configuration is superior to another. We conducted a retrospective analysis of our institutional experience comparing the peripheral veno-pulmonary artery (V-PA) configuration with the dual-lumen cannula with its tip in the pulmonary artery (C-PA), in a cohort of 24 patients (12 in each group). There was no difference in survival after hospital discharge (58.3% in the C-PA group compared with 41.7% in the V-PA group, p = 0.4). Patients in the C-PA group had a significantly shorter ICU length of stay (23.5 days [interquartile range {IQR} = 19-38.5] vs. 43 days [IQR = 30-50], p = 0.043) and a shorter duration of mechanical ventilation (7.5 days [IQR = 4.5-9.5] vs. 16.5 days [IQR = 9.5-22.5], p = 0.006) than those in the V-PA group. The C-PA group also had lower incidences of bleeding (33.33% vs. 83.33%, p = 0.036) and of combined ischemic events (0 vs. 41.67%, p = 0.037). In our single-center experience, the C-PA configuration may be associated with better outcomes than the V-PA configuration. Further studies are needed to confirm our findings.
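The group comparisons here are non-parametric (medians with IQRs) and categorical (event proportions in 12 patients per arm). A sketch of analogous tests with scipy, using illustrative length-of-stay values; the bleeding counts (4/12 vs. 10/12) are back-calculated from the reported percentages:

```python
from scipy.stats import mannwhitneyu, fisher_exact

# Hypothetical ICU length-of-stay values (days) for the two cannulation strategies;
# illustrative only, not the study data.
icu_los_cpa = [19, 20, 23, 25, 30, 38]
icu_los_vpa = [30, 35, 43, 45, 50, 55]
u_stat, p_los = mannwhitneyu(icu_los_cpa, icu_los_vpa, alternative="two-sided")

# Bleeding events: 4/12 in C-PA vs. 10/12 in V-PA, as a 2x2 contingency table.
table = [[4, 8], [10, 2]]
odds_ratio, p_bleed = fisher_exact(table)

print(f"ICU LOS Mann-Whitney p = {p_los:.3f}")
print(f"Bleeding Fisher exact p = {p_bleed:.3f}")
```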


Subject(s)
Extracorporeal Membrane Oxygenation , Heart Failure , Humans , Cannula , Pulmonary Artery , Extracorporeal Membrane Oxygenation/adverse effects , Retrospective Studies , Catheterization , Heart Failure/surgery
7.
ASAIO J ; 69(8): 782-788, 2023 08 01.
Article in English | MEDLINE | ID: mdl-37084328

ABSTRACT

Infection remains a common cause of morbidity and mortality in patients with both left ventricular assist devices (LVADs) and cardiac implanted electronic devices (CIEDs) with limited data describing outcomes in patients who have both devices implanted. We performed a single-center, retrospective, observational cohort study of patients with both a transvenous CIED and LVAD who developed bacteremia. Ninety-one patients were evaluated. Eighty-one patients (89.0%) were treated medically and nine patients (9.9%) underwent surgical management. A multivariable logistic regression showed that blood culture positivity for >72 hours was associated with inpatient death, when controlled for age and management strategy (odds ratio [OR] = 3.73 [95% confidence interval {CI} = 1.34-10.4], p = 0.012). In patients who survived the initial hospitalization, the use of long-term suppressive antibiotics was not associated with the composite outcome of death or infection recurrence within 1 year, when controlled for age and management strategy (OR = 2.31 [95% CI = 0.88-2.62], p = 0.09). A Cox proportional hazards model showed that blood culture positivity for >72 hours was associated with a trend toward increased mortality in the first year, when controlled for age, management strategy, and staphylococcal infection (hazard ratio = 1.72 [95% CI = 0.88-3.37], p = 0.11). Surgical management was associated with a trend toward decreased mortality (hazard ratio = 0.23 [95% CI = 0.05-1.00], p = 0.05).
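The adjusted odds ratio for prolonged blood-culture positivity comes from a multivariable logistic regression controlling for age and management strategy. A minimal sketch of that model structure with statsmodels, fitted on simulated data with assumed variable names:

```python
import numpy as np
import statsmodels.api as sm

# Simulated patient-level data; variable names and effect sizes are assumptions,
# not the study's dataset.
rng = np.random.default_rng(0)
n = 91
age = rng.normal(60, 12, n)
prolonged_bacteremia = rng.integers(0, 2, n)   # blood cultures positive >72 h
surgical_mgmt = rng.integers(0, 2, n)          # 1 = surgical management
logit = -3 + 1.3 * prolonged_bacteremia - 0.8 * surgical_mgmt + 0.02 * (age - 60)
inpatient_death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Multivariable logistic regression: death ~ prolonged bacteremia + age + management.
X = sm.add_constant(np.column_stack([prolonged_bacteremia, age, surgical_mgmt]))
model = sm.Logit(inpatient_death, X).fit(disp=0)
print(np.exp(model.params[1]))  # adjusted odds ratio for prolonged bacteremia
```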


Subject(s)
Bacteremia , Defibrillators, Implantable , Heart Failure , Heart-Assist Devices , Humans , Retrospective Studies , Heart Failure/complications , Heart Failure/surgery , Heart-Assist Devices/adverse effects , Cohort Studies , Bacteremia/etiology , Treatment Outcome
8.
Ann Thorac Surg ; 116(3): 492-498, 2023 09.
Article in English | MEDLINE | ID: mdl-35108502

ABSTRACT

BACKGROUND: Hospitalizations for drug-use associated infective endocarditis (DUA-IE) have led to increasing surgical consultation for valve replacement. Cardiothoracic surgeons' perspectives about the process of decision making around operation for people with DUA-IE are largely unknown. METHODS: This multisite semiqualitative study sought to gather the perspectives of cardiothoracic surgeons on initial and repeat valve surgery for people with DUA-IE through purposeful sampling of surgeons at 7 hospitals: University of Alabama, Tufts Medical Center, Boston Medical Center, Massachusetts General Hospital, University of North Carolina-Chapel Hill, Vanderbilt University Medical Center, and Rhode Island Hospital-Brown University. RESULTS: Nineteen cardiothoracic surgeons (53% acceptance) were interviewed. Perceptions of the drivers of addiction varied as well as approaches to repeat valve operations. There were mixed views on multidisciplinary meetings, although many surgeons expressed an interest in more efficient meetings and more intensive postoperative and posthospitalization multidisciplinary care. CONCLUSIONS: Cardiothoracic surgeons are emotionally and professionally impacted by making decisions about whether to perform valve operation for people with DUA-IE. The use of efficient, agenda-based multidisciplinary care teams is an actionable solution to improve cross-disciplinary partnerships and outcomes for people with DUA-IE.


Subject(s)
Endocarditis, Bacterial , Endocarditis , Heart Valve Prosthesis Implantation , Substance-Related Disorders , Surgeons , Humans , Endocarditis, Bacterial/surgery , Endocarditis, Bacterial/complications , Endocarditis/surgery , Endocarditis/complications , Substance-Related Disorders/complications
9.
JACC Heart Fail ; 10(6): 397-403, 2022 06.
Article in English | MEDLINE | ID: mdl-35654524

ABSTRACT

BACKGROUND: As utilization of veno-arterial extracorporeal life support (VA-ECLS) in the treatment of cardiogenic shock (CS) continues to expand, clinical variables that guide clinicians in early recognition of myocardial recovery, and therefore improved survival, after VA-ECLS are critical. There remains a paucity of literature on early postinitiation blood pressure measurements that predict improved outcomes. OBJECTIVES: The objective of this study was to identify early blood pressure variables associated with improved outcomes in VA-ECLS. METHODS: The authors queried the ELSO (Extracorporeal Life Support Organization) registry for cardiogenic shock patients treated with VA-ECLS or venovenous arterial ECLS between 2009 and 2020. Inclusion criteria were treatment with VA-ECLS or venovenous arterial ECLS; absence of pre-existing durable right, left, or biventricular assist devices; no pre-ECLS cardiac arrest; and no surgical or percutaneously placed left ventricular venting devices during the ECLS run. The primary outcome of interest was survival to discharge during the index hospitalization. RESULTS: A total of 2,400 CS patients met the inclusion criteria and had complete documentation of blood pressures. Actual mortality during the index hospitalization was 49.5%; survivors were younger, more likely to be Caucasian, more likely to have been intubated for >30 hours before ECLS initiation, and had a more favorable baseline SAVE (Survival After Veno-arterial ECMO) score (P < 0.05 for all). Multivariable regression analyses adjusting for SAVE score, age, ECLS flow at 4 hours, and race showed that every 10-mm Hg increase from baseline in systolic blood pressure (HR: 0.92 [95% CI: 0.89-0.95]; P < 0.001) and in pulse pressure (HR: 0.88 [95% CI: 0.84-0.91]; P < 0.001) at 24 hours was associated with a statistically significant reduction in mortality. CONCLUSIONS: Early (within 24 hours) improvements in pulse pressure and systolic blood pressure from baseline are associated with improved survival to discharge among CS patients treated with VA-ECLS.
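The hazard ratios here are expressed per 10-mm Hg increase, which is a matter of rescaling the predictor before fitting the model. A sketch of that scaling in a Cox model using lifelines, on simulated data with assumed column names; the registry analysis itself may have used a different model specification:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated registry-style data; column names, distributions, and effect sizes
# are assumptions, not ELSO data.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "pulse_pressure_24h": rng.normal(40, 15, n),   # mmHg at 24 h on ECLS
    "age": rng.normal(55, 12, n),
    "days_to_event": rng.exponential(30, n),
    "died": rng.integers(0, 2, n),
})

# Divide by 10 so the fitted hazard ratio is interpreted per 10-mmHg increase.
df["pulse_pressure_per10"] = df["pulse_pressure_24h"] / 10

cph = CoxPHFitter()
cph.fit(df[["pulse_pressure_per10", "age", "days_to_event", "died"]],
        duration_col="days_to_event", event_col="died")
print(cph.hazard_ratios_["pulse_pressure_per10"])  # HR per 10-mmHg increase
```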


Subject(s)
Extracorporeal Membrane Oxygenation , Heart Failure , Blood Pressure , Heart Failure/etiology , Humans , Registries , Shock, Cardiogenic
11.
J Card Surg ; 37(4): 1076-1079, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35092068

ABSTRACT

Normothermic machine perfusion of organs is growing in popularity and has been used for both abdominal and thoracic organ preservation before transplantation. The use of normothermic machine perfusion for organs donated after cardiac death can reduce cold ischemia time and help prevent ischemia-related complications. We present a successful case of donation-after-cardiac-death procurement in which both the liver and heart allografts were preserved by normothermic machine perfusion. Both allografts were perfused without complications and transplanted successfully. As the technology becomes more prevalent, the situation described here will become more commonplace, and we offer a view of the future of transplantation.


Subject(s)
Tissue and Organ Procurement , Humans , Liver , Organ Preservation , Perfusion , Tissue Donors
13.
J Heart Lung Transplant ; 40(11): 1408-1418, 2021 11.
Article in English | MEDLINE | ID: mdl-34334301

ABSTRACT

BACKGROUND: Given the shortage of suitable donor hearts for cardiac transplantation and the growing interest in donation after circulatory death (DCD), our institution recently began procuring cardiac allografts from DCD donors. METHODS: Between October 2020 and March 2021, 15 patients with heart failure underwent cardiac transplantation using DCD allografts. Allografts were procured using a modified extracorporeal membrane oxygenation circuit for thoracoabdominal normothermic regional perfusion (TA-NRP) and were subsequently transported using cold static storage. Data collection and analysis were performed with institutional review board approval. RESULTS: The mean age of the DCD donors was 23 ± 7 years, and the average time on TA-NRP was 56 ± 8 minutes. Total ischemic time was 183 ± 31 minutes, and the distance from the transplant center was 373 ± 203 nautical miles. Recipient age was 55 ± 14 years, with 8 (53.3%) recipients on durable left ventricular assist device support. Post-transplant, 6 (40%) recipients experienced mild left ventricular primary graft dysfunction (PGD-LV), 3 (20%) experienced moderate PGD-LV, and none experienced severe PGD-LV. Postoperative transthoracic echocardiography demonstrated a left ventricular ejection fraction >55% in all recipients. One recipient (6.7%) developed International Society for Heart and Lung Transplantation grade 2R acute cellular rejection on the first biopsy. At last follow-up, all 15 recipients were alive past 30 days. CONCLUSIONS: Cardiac DCD provides an opportunity to increase the availability of donor hearts for transplantation. Utilizing TA-NRP with cold static storage, we have extended the cold ischemic time of DCD allografts to almost 3 hours, allowing for inter-hospital organ transport.


Subject(s)
Cold Ischemia/methods , Graft Rejection/prevention & control , Heart Failure/surgery , Heart Transplantation/methods , Organ Preservation/methods , Perfusion/methods , Tissue and Organ Procurement/methods , Adolescent , Adult , Child , Female , Follow-Up Studies , Graft Survival , Humans , Male , Retrospective Studies , Time Factors , Young Adult
14.
J Card Surg ; 36(10): 3619-3628, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34235763

ABSTRACT

BACKGROUND: On October 18, 2018, several changes to the donor heart allocation system were enacted. We hypothesized that patients undergoing orthotopic heart transplantation (OHT) under the new allocation system would experience increases in ischemic times, rates of primary graft dysfunction, and 1-year mortality as a result of these changes. METHODS: In this single-center retrospective study, we reviewed the charts of all OHT patients from October 2017 through October 2019. Pre- and post-allocation-change recipient demographics were compared. Survival analysis was performed using the Kaplan-Meier method. RESULTS: A total of 184 patients underwent OHT. Recipient demographics were similar between cohorts. The average distance from the donor increased by more than 150 km (p = .006). Patients in the post-allocation-change cohort demonstrated a significant increase in the rate of severe left ventricular primary graft dysfunction, from 5.4% to 18.7% (p = .005). There were no statistically significant differences in 30-day mortality or 1-year survival. Time on the waitlist was reduced from 203.8 to 103.7 days (p = .006). CONCLUSIONS: The changes in heart allocation resulted in shorter waitlist times at the expense of longer donor distances and ischemic times, with an associated negative impact on early post-transplantation outcomes. No significant differences in 30-day or 1-year mortality were observed.
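The survival comparison uses the Kaplan-Meier method on the pre- and post-allocation-change cohorts. A sketch of such an analysis with lifelines; the durations, event flags, and cohort sizes below are illustrative assumptions, not the study data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up data for the two allocation-era cohorts; durations in
# days (followed up to 365) and event indicators are illustrative only.
rng = np.random.default_rng(2)
pre_time,  pre_event  = rng.uniform(30, 365, 90), rng.binomial(1, 0.10, 90)
post_time, post_event = rng.uniform(30, 365, 94), rng.binomial(1, 0.12, 94)

# Kaplan-Meier estimate of 1-year survival in the pre-change cohort.
kmf = KaplanMeierFitter()
kmf.fit(pre_time, event_observed=pre_event, label="pre-change")
print(kmf.survival_function_at_times(365))

# Log-rank test comparing the two survival curves.
result = logrank_test(pre_time, post_time,
                      event_observed_A=pre_event, event_observed_B=post_event)
print(f"log-rank p = {result.p_value:.2f}")
```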


Subject(s)
Heart Transplantation , Adult , Humans , Retrospective Studies , Survival Analysis , Tissue Donors , Waiting Lists
15.
J Card Surg ; 36(9): 3217-3221, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34137079

ABSTRACT

BACKGROUND: Coronavirus disease 2019 (COVID-19) has significantly impacted the healthcare landscape in the United States in a variety of ways, including a nationwide reduction in operative volume. The impact of COVID-19 on the availability of donor organs and on solid organ transplantation remains unclear. We examined the impact of COVID-19 on a single, large-volume heart transplant program. METHODS: A retrospective chart review was performed examining all adult heart transplants performed at a single institution between March 2020 and June 2020. This was compared to the same time frame in 2019. We examined the incidence of primary graft dysfunction, continuous renal replacement therapy (CRRT), and 30-day survival. RESULTS: From March to June 2020, 43 orthotopic heart transplants were performed, compared with 31 during the same period in 2019. Donor and recipient demographics demonstrated no differences. There was no difference in 30-day survival. There was a statistically significant difference in the incidence of postoperative CRRT (9/31 in 2019 vs. 3/43 in 2020; p = .01) and a statistically significant difference in recipient race (23 W/8B/1AA vs. 30 W/13B; p = .029). CONCLUSION: We demonstrate that a single, large-volume transplant program was able to grow its volume with little difference in donor variables and clinical outcomes following transplant. While multiple explanations are possible, most likely the reduction of volume at other programs allowed us to utilize organs to which we would not previously have had access. More significantly, our growth in volume was coupled with no instances of COVID-19 infection or transmission among patients or staff, owing to an aggressive testing and surveillance program.


Subject(s)
COVID-19 , Heart Transplantation , Tissue and Organ Procurement , Adult , Humans , Pandemics , Retrospective Studies , SARS-CoV-2 , Tissue Donors , United States/epidemiology
16.
Br J Anaesth ; 126(3): 599-607, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33549321

ABSTRACT

BACKGROUND: Increased intravascular volume has been associated with protection from acute kidney injury (AKI), but in patients with congestive heart failure, venous congestion is associated with increased AKI. We tested the hypothesis that intraoperative venous congestion is associated with AKI after cardiac surgery. METHODS: In patients enrolled in the Statin AKI Cardiac Surgery trial, venous congestion was quantified as the area under the curve (AUC) of central venous pressure (CVP) >12, 16, or 20 mm Hg during surgery (mm Hg min). AKI was defined using Kidney Disease Improving Global Outcomes (KDIGO) criteria and urine concentrations of tissue inhibitor of metalloproteinase-2 and insulin-like growth factor binding protein 7 ([TIMP-2]⋅[IGFBP7]), a marker of renal stress. We measured associations between venous congestion, AKI and [TIMP-2]⋅[IGFBP7], adjusted for potential confounders. Values are reported as median (25th-75th percentile). RESULTS: Based on KDIGO criteria, 104 of 425 (24.5%) patients developed AKI. The venous congestion AUCs were 273 mm Hg min (81-567) for CVP >12 mm Hg, 66 mm Hg min (12-221) for CVP >16 mm Hg, and 11 mm Hg min (1-54) for CVP >20 mm Hg. A 60 mm Hg min increase above the median venous congestion AUC above each threshold was independently associated with increased AKI (odds ratio=1.06; 95% confidence interval [CI], 1.02-1.10; P=0.008; odds ratio=1.12; 95% CI, 1.02-1.23; P=0.013; and odds ratio=1.30; 95% CI, 1.06-1.59; P=0.012 for CVP>12, >16, and >20 mm Hg, respectively). Venous congestion before cardiopulmonary bypass was also associated with increased [TIMP-2]⋅[IGFBP7] measured during cardiopulmonary bypass and after surgery, but neither venous congestion after cardiopulmonary bypass nor venous congestion throughout surgery was associated with postoperative [TIMP-2]⋅[IGFBP7]. CONCLUSION: Intraoperative venous congestion was independently associated with increased AKI after cardiac surgery.
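The venous congestion exposure is an area under the CVP curve above a fixed threshold, expressed in mm Hg·min. A small sketch of that computation by the trapezoidal rule, using hypothetical per-minute CVP readings (not study data):

```python
def congestion_auc(times_min, cvp_mmhg, threshold):
    """Area (mmHg*min) of the CVP curve above a threshold, trapezoidal rule."""
    excess = [max(c - threshold, 0.0) for c in cvp_mmhg]
    auc = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        auc += 0.5 * (excess[i] + excess[i - 1]) * dt
    return auc

# Hypothetical CVP readings sampled once per minute (mmHg); illustrative only.
times = list(range(10))
cvp = [10, 13, 15, 18, 17, 14, 12, 11, 13, 16]

for thr in (12, 16, 20):
    print(f"AUC above {thr} mmHg: {congestion_auc(times, cvp, thr):.1f} mmHg*min")
```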


Subject(s)
Acute Kidney Injury/etiology , Cardiac Surgical Procedures/adverse effects , Central Venous Pressure , Hyperemia/etiology , Acute Kidney Injury/epidemiology , Aged , Cohort Studies , Female , Humans , Hyperemia/epidemiology , Intraoperative Period , Male , Prospective Studies , Risk Factors , Treatment Outcome
17.
J Card Surg ; 36(2): 457-465, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33283358

ABSTRACT

BACKGROUND: Data on out-of-ice implantation ischemia in heart transplantation are scarce. We examined the impact of implantation time on allograft dysfunction. METHODS: We conducted a single-site retrospective review of all primary adult heart transplants from June 2012 to August 2019, recording implantation warm ischemic time (WIT), defined as the time from the first atrial stitch to aortic cross-clamp removal. Univariate regression was used to assess the relationship of perioperative variables to primary graft dysfunction (PGD) and to the pulmonary artery pulsatility index (PAPi) at postoperative hour 24. A threshold of p < .10 was set for inclusion of covariates in multivariate regression. Secondary analyses evaluated consistency across alternative criteria for allograft dysfunction. A post hoc subgroup analysis examined the effect of WIT in cases with prolonged total ischemia of 240 min or longer. RESULTS: Complete data were available for 201 patients. Baseline characteristics were similar between patients who did and did not have WIT documented. In univariate analysis, female gender, longer total ischemic time (TIT), longer bypass time, greater blood transfusion requirements, and pretransplant intensive care unit (ICU) care were associated with PGD, whereas longer bypass time was associated with worse PAPi and pretransplant ICU care was associated with better PAPi. In multivariate analysis, longer bypass time predicted PGD and worse PAPi, and preoperative ICU admission predicted PGD and better PAPi. Results did not differ in secondary or subgroup analyses. CONCLUSIONS: This study is one of few examining the functional impact of cardiac implantation ischemia. The results suggest that allograft implantation time alone may not affect postoperative graft function, which instead was driven by intraoperative bypass duration and preoperative ICU care.
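The two-step modeling approach (univariate screening at p < .10, then a multivariable model with the retained covariates) can be sketched with statsmodels as follows; the dataset and variable names are simulated assumptions, not the study data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated perioperative dataset; column names and values are assumptions.
rng = np.random.default_rng(4)
n = 201
df = pd.DataFrame({
    "pgd":            rng.integers(0, 2, n),
    "warm_ischemia":  rng.normal(60, 15, n),
    "total_ischemia": rng.normal(200, 40, n),
    "bypass_time":    rng.normal(120, 30, n),
    "pre_icu":        rng.integers(0, 2, n),
})

# Step 1: univariate screening, retaining covariates with p < 0.10.
candidates = []
for col in ["warm_ischemia", "total_ischemia", "bypass_time", "pre_icu"]:
    uni = sm.Logit(df["pgd"], sm.add_constant(df[col])).fit(disp=0)
    if uni.pvalues[col] < 0.10:
        candidates.append(col)

# Step 2: multivariable logistic model with the retained covariates, if any.
if candidates:
    multi = sm.Logit(df["pgd"], sm.add_constant(df[candidates])).fit(disp=0)
    print(multi.summary())
```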


Subject(s)
Heart Transplantation , Lung Transplantation , Primary Graft Dysfunction , Adult , Female , Humans , Pulmonary Artery , Retrospective Studies , Risk Factors
18.
Ann Thorac Surg ; 111(4): 1258-1263, 2021 04.
Article in English | MEDLINE | ID: mdl-32896546

ABSTRACT

BACKGROUND: Bundled payments for coronary artery bypass grafting (CABG) provide a single reimbursement for care provided from admission through 90 days post-discharge. We aimed to explore the impact of complications on total institutional costs, as well as the drivers of high costs for the index hospitalization. METHODS: We linked clinical and internal cost data for patients undergoing CABG from 2014 to 2017 at a single institution. We compared unadjusted average variable direct costs, reporting excess cost relative to an uncomplicated baseline. We stratified by The Society of Thoracic Surgeons preoperative risk and quality outcome measures as well as by value-based outcomes (readmission, post-acute care utilization). We performed multivariable linear regression to evaluate drivers of high costs, adjusting for preoperative and intraoperative characteristics and postoperative complications. RESULTS: We reviewed 1789 patients undergoing CABG, with an average of 2.7 vessels (SD 0.89). A significant proportion of patients were diabetic (51.2%) or obese (mean body mass index 30.6, SD 6.1). Factors associated with increased adjusted costs were preoperative renal failure (P = .001), diabetes (P = .001), and body mass index (P = .05), as well as postoperative stroke (P < .001), prolonged ventilation (P < .001), rebleeding requiring reoperation (P < .001), and renal failure (P < .001), with varying magnitude. Preoperative ejection fraction and insurance status were not associated with increased adjusted costs. CONCLUSIONS: Preoperative characteristics had less impact on post-CABG costs than postoperative complications. Postoperative complications vary in their impact on internal costs, with reoperation, stroke, and renal failure having the greatest impact. In preparation for bundled payments, hospitals should focus on understanding and preventing the drivers of high cost.
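The cost analysis is a multivariable linear regression of direct variable cost on preoperative and postoperative factors, so each coefficient reads as an adjusted excess cost for that factor. A sketch with statsmodels on simulated data; all variable names and values are assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cost dataset; variable names, costs, and prevalences are assumptions.
rng = np.random.default_rng(5)
n = 1789
df = pd.DataFrame({
    "variable_cost":  rng.normal(30000, 8000, n),
    "renal_failure":  rng.binomial(1, 0.05, n),
    "stroke":         rng.binomial(1, 0.02, n),
    "prolonged_vent": rng.binomial(1, 0.10, n),
    "diabetes":       rng.binomial(1, 0.51, n),
    "bmi":            rng.normal(30.6, 6.1, n),
})

# Multivariable linear model of direct variable cost on pre- and postoperative factors.
model = smf.ols(
    "variable_cost ~ renal_failure + stroke + prolonged_vent + diabetes + bmi",
    data=df).fit()
print(model.params)  # each coefficient is the adjusted excess cost for that factor
```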


Subject(s)
Coronary Artery Bypass/economics , Coronary Artery Disease/surgery , Hospital Costs , Postoperative Complications/economics , Coronary Artery Disease/economics , Cost-Benefit Analysis , Female , Health Resources/economics , Humans , Male , Middle Aged , Reoperation , Retrospective Studies , Risk Factors
19.
Semin Thorac Cardiovasc Surg ; 32(1): 47-56, 2020.
Article in English | MEDLINE | ID: mdl-31557512

ABSTRACT

Minimally invasive mitral valve surgery (mini-MVS) with hypothermic fibrillatory arrest has been associated with an increased risk of stroke. We aimed to investigate the incidence, predictors, and outcomes of stroke in a large cohort of patients who underwent clampless mini-MVS. Between January 2008 and June 2017, we performed 1247 mini-MVSs. Clinical, operative, and postoperative outcomes were analyzed. Univariable and multivariable regression analyses were used to identify predictors of postoperative stroke. The median follow-up was 5.2 years (interquartile range 2.6-7.5). The etiology of mitral valve (MV) disease was degenerative (60.4%, n = 753), functional (12.8%, n = 160), rheumatic (8.7%, n = 109), endocarditis (3.1%, n = 39), and reoperative MV surgery (14.9%, n = 186). The overall incidence of postoperative neurologic events was 2.5% (n = 31/1247). Univariable predictors of stroke were a higher Society of Thoracic Surgeons mortality risk (6.0 ± 11.8% vs 3.3 ± 5.2%, P < 0.001), advanced age (69.6 ± 12.1 years vs 63.0 ± 13.6 years, P = 0.002), female gender (71.0% vs 46.3%, P = 0.007), and a history of a cerebrovascular accident (22.6% vs 8.7%, P = 0.008). Stroke patients had higher 30-day mortality (22.6% vs 1.6%, P < 0.001) and a higher risk of long-term mortality (hazard ratio = 5.56, 95% confidence interval [CI] 3.2-9.6, P < 0.001). Advanced age (odds ratio [OR] 2.1; 95% CI 1.1-4.0; P = 0.02), female gender (OR 2.3; 95% CI 0.9-5.2; P = 0.05), and a history of cerebrovascular accident (OR 3.1; 95% CI 0.98-10.1; P = 0.05) remained independent predictors of stroke in the multivariable analysis. Our decade-long experience indicates that clampless mini-MVS is associated with a low incidence of postoperative stroke and that the predictors of stroke are not specific to this approach.


Subject(s)
Heart Valve Diseases/surgery , Heart Valve Prosthesis Implantation/adverse effects , Mitral Valve Annuloplasty/adverse effects , Mitral Valve/surgery , Stroke/etiology , Thoracotomy/adverse effects , Adult , Aged , Female , Heart Valve Diseases/diagnostic imaging , Heart Valve Diseases/mortality , Heart Valve Diseases/physiopathology , Heart Valve Prosthesis Implantation/mortality , Hemodynamics , Humans , Male , Middle Aged , Mitral Valve/diagnostic imaging , Mitral Valve/physiopathology , Mitral Valve Annuloplasty/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Stroke/diagnosis , Thoracotomy/mortality , Time Factors , Treatment Outcome
20.
ASAIO J ; 66(5): 553-558, 2020 05.
Article in English | MEDLINE | ID: mdl-31425256

ABSTRACT

Donor-derived hepatitis C (dd-HCV) infection may increase the risk of renal impairment (RI) among heart transplantation (HT) recipients. Sofosbuvir, an integral component of HCV direct-acting antivirals (DAAs), has also been linked to RI. To date, no study has examined trends in renal function for HT recipients with dd-HCV infection or assessed the safety and efficacy of Sofosbuvir-based DAAs in this population. Between September 2016 and June 2018, 46 HCV-naive patients and one patient with a history of HCV treated pretransplant underwent HT from HCV-positive donors (follow-up available through October 10, 2018). Patients were treated with Ledipasvir-Sofosbuvir (genotype 1) or Sofosbuvir-Velpatasvir (genotype 3) for 12 or 24 weeks; no dose adjustments were made for renal function. Data on renal function were available for 23 patients who achieved a sustained virologic response at 12 weeks after treatment (SVR12; cohort A) and 18 patients who completed 1 year of follow-up (cohort B). Treatment of dd-HCV infection was initiated at a median of 6 weeks post-HT. In both cohorts, a nonsignificant reduction in median estimated glomerular filtration rate (eGFR; mL/min/1.73 m²) was noted (cohort A: pretransplant eGFR 62 [interquartile range {IQR}: 1-84] to SVR12 eGFR 49 [IQR: 37-82], p = 0.43; cohort B: pretransplant eGFR 65 [IQR: 54-84] to 1 year post-HT eGFR 56 [IQR: 39-75], p = 0.29). Pretreatment renal function had no significant impact on changes in renal function during treatment. All patients tolerated DAAs well, with a 100% completion rate for the assigned therapy and duration and a 100% rate of achieving SVR12. In this first and largest reported case series to date of HT recipients with dd-HCV infection, we observed that neither the dd-HCV infection nor its treatment with Sofosbuvir-based DAAs increased the risk of RI. Sofosbuvir-based DAAs appear safe, tolerable, and effective for HCV treatment even in the presence of severe RI.
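The renal-function comparison is a paired assessment of eGFR at two time points in the same patients; the abstract does not name the test, but a paired non-parametric test such as the Wilcoxon signed-rank test would fit the reported medians and IQRs. A minimal sketch on hypothetical paired values:

```python
from scipy.stats import wilcoxon

# Hypothetical paired eGFR values (mL/min/1.73 m^2) per patient, pretransplant
# and at SVR12; illustrative only, not the study data.
egfr_pre   = [62, 70, 55, 84, 61, 48, 77, 66, 59, 73]
egfr_svr12 = [49, 65, 50, 82, 58, 45, 70, 60, 52, 68]

# Paired, non-parametric comparison of pretransplant vs. SVR12 eGFR.
stat, p = wilcoxon(egfr_pre, egfr_svr12)
print(f"Wilcoxon signed-rank p = {p:.2f}")
```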


Subject(s)
Antiviral Agents/therapeutic use , Heart Transplantation , Hepatitis C/drug therapy , Hepatitis C/etiology , Kidney Diseases/epidemiology , Adult , Benzimidazoles/therapeutic use , Carbamates/therapeutic use , Drug Therapy, Combination/methods , Female , Fluorenes/therapeutic use , Heterocyclic Compounds, 4 or More Rings/therapeutic use , Humans , Kidney Diseases/etiology , Male , Middle Aged , Sofosbuvir/therapeutic use , Sustained Virologic Response , Tissue Donors , Transplant Recipients , Uridine Monophosphate/analogs & derivatives , Uridine Monophosphate/therapeutic use