Results 1 - 20 of 45
1.
Article in English | MEDLINE | ID: mdl-38759766

ABSTRACT

BACKGROUND: Molecular testing with gene expression profiling (GEP) and donor-derived cell-free DNA (dd-cfDNA) is increasingly used in surveillance for acute cellular rejection (ACR) after heart transplant. However, the performance of dual testing over each test individually has not been established, and the impact of dual non-invasive surveillance on clinical decision-making has not been widely investigated. METHODS: We evaluated 2077 subjects from the SHORE registry who were enrolled between 2018 and 2021 and had verified biopsy data; subjects were categorized as dual negative, GEP positive/dd-cfDNA negative, GEP negative/dd-cfDNA positive, or dual positive. The incidence of ACR and follow-up testing rates for each group were evaluated, positive likelihood ratios (LR+) were calculated, and biopsy rates over time were analyzed. RESULTS: The incidence of ACR was 1.5% for dual negative, 1.9% for GEP positive/dd-cfDNA negative, 4.3% for GEP negative/dd-cfDNA positive, and 9.2% for dual positive groups. Follow-up biopsies were performed after 8.8% of dual negative, 14.2% of GEP positive/dd-cfDNA negative, 22.8% of GEP negative/dd-cfDNA positive, and 35.4% of dual positive results. The LR+ for ACR was 1.37, 2.91, and 3.90 for GEP positive, dd-cfDNA positive, and dual positive testing, respectively. From 2018 to 2021, first-year biopsy rates declined from 5.9 to 5.3 biopsies/patient, and second-year rates from 1.5 to 0.9 biopsies/patient. At two years, survival was 94.9% and only 2.7% of subjects had graft dysfunction. CONCLUSIONS: Dual molecular testing demonstrated improved performance for ACR surveillance compared with single molecular testing. Use of dual non-invasive testing was associated with lower biopsy rates over time, excellent survival, and a low incidence of graft dysfunction.
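The positive likelihood ratios reported above come from a standard 2x2 contingency table of test result against biopsy-proven rejection. A minimal Python sketch, using hypothetical counts for illustration (not the SHORE registry data):

```python
def positive_likelihood_ratio(tp, fn, fp, tn):
    """LR+ = sensitivity / (1 - specificity), computed from a 2x2 table of
    test result (positive/negative) against biopsy-proven ACR (present/absent)."""
    sensitivity = tp / (tp + fn)        # P(test positive | ACR)
    false_pos_rate = fp / (fp + tn)     # P(test positive | no ACR) = 1 - specificity
    return sensitivity / false_pos_rate

# Hypothetical counts for illustration only:
lr_plus = positive_likelihood_ratio(tp=18, fn=22, fp=150, tn=810)
print(round(lr_plus, 2))  # prints 2.88
```

An LR+ near 1 (as with GEP alone) barely shifts the post-test probability of rejection, which is why combining markers with a higher joint LR+ changes downstream biopsy decisions.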

2.
Circ Heart Fail ; 17(5): e010826, 2024 May.
Article in English | MEDLINE | ID: mdl-38708598

ABSTRACT

BACKGROUND: While tricuspid annular plane systolic excursion (TAPSE) captures the predominant longitudinal motion of the right ventricle (RV), it does not account for ventricular morphology and radial motion changes in various forms of pulmonary hypertension. This study aims to account for both longitudinal and radial motion by dividing TAPSE by RV area and to assess the clinical significance of this metric. METHODS: We performed a retrospective analysis of 71 subjects with New York Heart Association class II to III dyspnea who underwent echocardiogram and invasive cardiopulmonary exercise testing (which defined 4 hemodynamic groups: control, isolated postcapillary pulmonary hypertension, combined postcapillary pulmonary hypertension, and pulmonary arterial hypertension). On the echocardiogram, TAPSE was divided by RV area in diastole (TAPSE/RVA-D) and systole (TAPSE/RVA-S). Analyses included correlations (Pearson and linear regression), receiver operating characteristic analysis, and survival curves. RESULTS: On linear regression analysis, TAPSE/RVA metrics (versus TAPSE) had a stronger correlation with pulmonary artery compliance (r=0.48-0.54 versus 0.38) and peak VO2 percentage predicted (0.23-0.30 versus 0.18). On receiver operating characteristic analysis, pulmonary artery compliance ≥3 mL/mm Hg was identified by TAPSE/RVA-D with an area under the curve (AUC) of 0.79 (optimal cutoff ≥1.1) and by TAPSE/RVA-S with an AUC of 0.83 (optimal cutoff ≥1.5), but by TAPSE with an AUC of only 0.67. Similarly, to identify peak VO2 <50% predicted, the AUC was 0.66 for TAPSE/RVA-D and 0.65 for TAPSE/RVA-S. Death or cardiovascular hospitalization at 12 months was associated with TAPSE/RVA-D ≥1.1 (HR, 0.38 [95% CI, 0.11-0.56]) and TAPSE/RVA-S ≥1.5 (HR, 0.44 [95% CI, 0.16-0.78]), while TAPSE was not associated with adverse outcomes (HR, 0.99 [95% CI, 0.53-1.94]).
Among 31 subjects with available cardiac magnetic resonance imaging, RV ejection fraction was better correlated with novel metrics (TAPSE/RVA-D r=0.378 and TAPSE/RVA-S r=0.328) than TAPSE (r=0.082). CONCLUSIONS: In a broad cohort with suspected pulmonary hypertension, TAPSE divided by RV area was superior to TAPSE alone in correlations with pulmonary compliance and exercise capacity. As a prognostic marker of right heart function, TAPSE/RVA-D <1.1 and TAPSE/RVA-S <1.5 predicted adverse cardiovascular outcomes.
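The proposed metric is a simple ratio of two routine echo measurements. A minimal sketch, assuming TAPSE in mm and RV area in cm² as described above, with hypothetical measurements rather than study data:

```python
def tapse_over_rva(tapse_mm, rv_area_cm2):
    # TAPSE divided by RV area, combining longitudinal motion (TAPSE)
    # with ventricular size/morphology (RV area)
    return tapse_mm / rv_area_cm2

def below_diastolic_cutoff(value, cutoff=1.1):
    # Values below the diastolic cutoff (>=1.1 identified as the optimal
    # threshold in the ROC analysis above) are flagged as abnormal
    return value < cutoff

# Hypothetical patient: TAPSE 20 mm, diastolic RV area 25 cm^2
ratio_d = tapse_over_rva(tapse_mm=20.0, rv_area_cm2=25.0)
print(ratio_d)  # prints 0.8 -> below the 1.1 cutoff
```

Because a dilated RV enlarges the denominator, the ratio falls even when TAPSE itself remains in the normal range, which is the behavior the study exploits.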


Subject(s)
Exercise Test , Exercise Tolerance , Pulmonary Artery , Ventricular Function, Right , Humans , Male , Female , Retrospective Studies , Middle Aged , Exercise Tolerance/physiology , Ventricular Function, Right/physiology , Pulmonary Artery/physiopathology , Pulmonary Artery/diagnostic imaging , Aged , Heart Ventricles/physiopathology , Heart Ventricles/diagnostic imaging , Hypertension, Pulmonary/physiopathology , Tricuspid Valve/diagnostic imaging , Tricuspid Valve/physiopathology , Echocardiography , Predictive Value of Tests , Prognosis
3.
J Pharm Pract ; : 8971900241237057, 2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38395741

ABSTRACT

Invasive aspergillosis (IA) is a rare and often fatal complication of immunosuppression following orthotopic heart transplant. Prophylaxis plays a crucial role in preventing the emergence of this opportunistic infection, and the azole class of medications is the mainstay in this patient population. Unfortunately, given their impact on the cytochrome P450 enzyme system, significant fluctuations in serum tacrolimus concentrations occur when initiating and stopping azole therapy, increasing the risk of prolonged periods of sub-optimal immunosuppression. While there are recommended dosing adjustments for these transition periods based on small data sets, primarily with fluconazole, there is no published literature on recommended dosing adjustments for posaconazole. Given that our institution utilizes posaconazole as the primary agent for aspergillosis prophylaxis, we aimed to explore and report our local data to better guide dosing decisions during these transition periods.

4.
J Thorac Imaging ; 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37732694

ABSTRACT

PURPOSE: Intravenous contrast poses challenges to computed tomography (CT) muscle density analysis. We developed and tested corrections for contrast-enhanced CT muscle density to improve muscle analysis and the utility of CT scans for the assessment of myosteatosis. MATERIALS AND METHODS: Using retrospective images from 240 adults who received routine abdominal CT imaging from March to November 2020 with weight-based iodine contrast, we obtained paraspinal muscle density measurements from noncontrast (NC), arterial, and venous-phase images. We used a calibration sample to develop 9 different mean and regression-based corrections for the effect of contrast. We applied the corrections in a validation sample and conducted equivalence testing. RESULTS: We evaluated 140 patients (mean age 52.0 y [SD: 18.3]; 60% female) in the calibration sample and 100 patients (mean age 54.8 y [SD: 18.9]; 60% female) in the validation sample. Contrast-enhanced muscle density was higher than NC by 8.6 HU (SD: 6.2) for the arterial phase (female, 10.4 HU [SD: 5.7]; male, 6.0 HU [SD: 6.0]) and by 6.4 HU (SD: 8.1) for the venous phase (female, 8.0 HU [SD: 8.6]; male, 4.0 HU [SD: 6.6]). Corrected contrast-enhanced and NC muscle density were equivalent within 3 HU for all corrections. The -7.5 HU correction, independent of sex and phase, performed well for arterial (95% CI: -0.18, 1.80 HU) and venous-phase data (95% CI: -0.88, 1.41 HU). CONCLUSIONS: Our validated correction factor of -7.5 HU renders contrast-enhanced muscle density statistically similar to NC density and is a feasible rule-of-thumb for clinicians to implement.
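The rule-of-thumb amounts to subtracting 7.5 HU from any contrast-enhanced muscle density measurement, with equivalence judged against a 3 HU margin. A sketch with hypothetical HU values (the correction and margin are from the study; the sample measurements are not):

```python
CONTRAST_CORRECTION_HU = -7.5  # sex- and phase-independent correction from the study

def correct_muscle_density(contrast_hu):
    """Apply the flat -7.5 HU correction to a contrast-enhanced muscle density."""
    return contrast_hu + CONTRAST_CORRECTION_HU

def equivalent_within(a_hu, b_hu, margin_hu=3.0):
    # Equivalence margin of 3 HU, matching the study's equivalence testing
    return abs(a_hu - b_hu) <= margin_hu

# Hypothetical arterial-phase measurement of 45.0 HU, noncontrast 36.8 HU:
corrected = correct_muscle_density(45.0)
print(corrected)  # prints 37.5
print(equivalent_within(corrected, 36.8))  # prints True
```

A single additive constant is attractive clinically precisely because it needs no knowledge of contrast phase or patient sex at the point of measurement.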

5.
JAMA Netw Open ; 6(6): e2319191, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37351886

ABSTRACT

Importance: Pretransplant obesity and higher pulmonary vascular resistance (PVR) are risk factors for death after heart transplant. However, it remains unclear whether appropriate donor-to-recipient size matching using predicted heart mass (PHM) is associated with lower risk. Objective: To investigate the association of size matching using PHM with risk of death posttransplant among patients with obesity and/or higher PVR. Design, Setting, and Participants: All adult patients (>18 years) who underwent heart transplant between 2003 and 2022 with available information using the United Network for Organ Sharing cohort database. Multivariable Cox models and multivariable-adjusted spline curves were used to examine the risk of death posttransplant with PHM matching. Data were analyzed from October 2022 to March 2023. Exposure: Recipient's body mass index (BMI) in categories (<18.0 [underweight], 18.1-24.9 [normal weight, reference], 25.0-29.9 [overweight], 30.0-34.9 [obese 1], 35-39.9 [obese 2], and ≥40.0 [obese 3]) and recipient's pretransplant PVR in categories of less than 4 (29 061 participants), 4 to 6 (2842 participants), and more than 6 Wood units (968 participants); and less than 3 (24 950 participants), 3 to 5 (6115 participants), and 5 or more (1806 participants) Wood units. Main Outcome: All-cause death posttransplant on follow-up. Results: The mean (SD) age of the cohort of 37 712 was 52.8 (12.8) years, 27 976 (74%) were male, 25 342 were non-Hispanic White (68.0%), 7664 were Black (20.4%), and 3139 were Hispanic or Latino (8.5%). A total of 12 413 recipients (32.9%) had a normal BMI, 13 849 (36.7%) had overweight, and 10 814 (28.7%) had obesity. On follow-up (median [IQR] 5.05 [0-19.4] years), 12 785 recipients (3046 female) died. 
For patients with normal weight, overweight, or obese 2, receiving a PHM-undermatched heart was associated with an increased risk of death (normal weight hazard ratio [HR], 1.20; 95% CI, 1.07-1.34; overweight HR, 1.12; 95% CI, 1.02-1.23; and obese 2 HR, 1.07; 95% CI, 1.01-1.14). Moreover, patients with higher pretransplant PVR who received an undermatched heart had a higher risk of death posttransplant in multivariable-adjusted spline curves in a graded fashion until appropriately matched. In contrast, risk of death among patients receiving a PHM-overmatched heart did not differ from the appropriately matched group, including in recipients with an elevated pretransplant PVR. Conclusions and Relevance: In this cohort study, undermatching donor-to-recipient size according to PHM was associated with higher posttransplant mortality, specifically in patients with normal weight, overweight, or class II obesity and in patients with elevated pretransplant PVR. Overmatching donor-to-recipient size was not associated with posttransplant survival.
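Size matching by PHM reduces to classifying the percent difference between donor and recipient predicted heart mass. A sketch under stated assumptions: the ±15% bands below are illustrative only (the abstract does not report its matching thresholds), and the PHM inputs would come from published sex-specific prediction equations not reproduced here:

```python
def phm_match_category(donor_phm_g, recipient_phm_g,
                       under_pct=-15.0, over_pct=15.0):
    """Classify a donor heart as under-, over-, or appropriately matched by
    percent difference in predicted heart mass (PHM). The +/-15% bands are
    illustrative assumptions, not thresholds reported in the abstract."""
    diff_pct = 100.0 * (donor_phm_g - recipient_phm_g) / recipient_phm_g
    if diff_pct < under_pct:
        return "undermatched"
    if diff_pct > over_pct:
        return "overmatched"
    return "matched"

# Hypothetical PHM values in grams:
print(phm_match_category(150.0, 200.0))  # prints "undermatched" (-25%)
print(phm_match_category(205.0, 200.0))  # prints "matched" (+2.5%)
```

The study's finding maps onto this asymmetrically: only the "undermatched" category carried excess mortality, and only in the recipient subgroups named above.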


Subject(s)
Heart Transplantation , Overweight , Adult , Humans , Male , Female , Middle Aged , Overweight/complications , Cohort Studies , Obesity/complications , Obesity/epidemiology , Risk Factors , Vascular Resistance
6.
J Card Fail ; 29(3): 407-413, 2023 03.
Article in English | MEDLINE | ID: mdl-36243340

ABSTRACT

BACKGROUND: Cardiopulmonary exercise testing (CPET) can identify mechanisms of exercise intolerance in heart failure with preserved ejection fraction (HFpEF), but exercise modalities with differing body positions (eg, recumbent ergometer, treadmill) are broadly used. In this study, we aimed to determine whether body position affects CPET parameters in patients with HFpEF. METHODS: Subjects with stable HFpEF (n = 23) underwent noninvasive treadmill CPET, followed by an invasive recumbent-cycle ergometer CPET within 3 months. A comparison group undergoing similar studies included healthy subjects (n = 5) and subjects with pulmonary arterial hypertension (n = 6). RESULTS: Peak oxygen consumption (VO2peak) and peak heart rate were significantly lower in the recumbent than in the upright position (10.1 vs 13.1 mL/kg/min [Δ-3 mL/kg/min]; P < 0.001; and 95 vs 113 bpm [Δ-18 bpm]; P < 0.001, respectively). No significant differences were found in the minute ventilation to carbon dioxide production ratio, end-tidal pressure of carbon dioxide, or respiratory exchange ratio. A similar pattern was observed in the comparison groups. CONCLUSIONS: Compared with recumbent ergometer CPET, treadmill CPET revealed a higher VO2peak and peak heart rate response. Body position should be taken into account when assessing chronotropic incompetence to adjust beta-blocker administration in HFpEF.


Subject(s)
Exercise Test , Heart Failure , Humans , Heart Failure/diagnosis , Stroke Volume/physiology , Carbon Dioxide , Exercise Tolerance/physiology , Oxygen Consumption/physiology
7.
J Heart Lung Transplant ; 42(1): 33-39, 2023 01.
Article in English | MEDLINE | ID: mdl-36347767

ABSTRACT

BACKGROUND: Continuous-flow left ventricular assist devices have improved outcomes in patients with end-stage heart failure who require mechanical circulatory support. Current devices have an adverse event profile that has hindered widespread application. The EVAHEART®2 left ventricular assist device (EVA2) has design features, such as large blood gaps, lower pump speeds, and an inflow cannula that does not protrude into the left ventricle, that may mitigate the adverse events currently seen with other continuous-flow devices. METHODS: A prospective, multi-center, randomized non-inferiority study, the COMPETENCE Trial, is underway to assess non-inferiority of the EVA2 to the HeartMate 3 (HM3) LVAS when used for the treatment of refractory advanced heart failure. The primary end-point is a composite of the individual primary outcomes: survival to cardiac transplant or device explant for recovery; freedom from disabling stroke; and freedom from severe right heart failure after implantation of the original device. Randomization is in a 2:1 (EVA2:HM3) ratio. RESULTS: The first patient was enrolled into the COMPETENCE Trial in December 2020, and 25 subjects (16 EVA2 and 9 HM3) are currently enrolled. Enrollment of a safety cohort is projected to be completed by the third quarter of 2022, at which time an interim analysis will be performed. The short-term cohort (92 EVA2 subjects) and the long-term cohort are expected to be completed by the end of 2023 and 2024, respectively. CONCLUSIONS: The design features of the EVA2, such as a novel inflow cannula and large blood gaps, may improve clinical outcomes but require further study. The ongoing COMPETENCE trial is designed to determine whether the EVA2 is non-inferior to the HM3.


Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Humans , Heart-Assist Devices/adverse effects , Prospective Studies , Heart Failure/surgery , Heart Ventricles , Treatment Outcome
8.
Egypt Heart J ; 74(1): 37, 2022 May 08.
Article in English | MEDLINE | ID: mdl-35527310

ABSTRACT

BACKGROUND: Right ventricular (RV) dilation has been used to predict adverse outcomes in acute pulmonary conditions and to categorize the severity of novel coronavirus disease (COVID-19) infection. Our study aimed to use chest CT angiography (CTA) to assess whether increased RV dilation, quantified as an increased RV:LV (left ventricle) ratio, is associated with adverse outcomes in COVID-19 infection, and whether it occurs out of proportion to lung parenchymal disease. RESULTS: We reviewed clinical, laboratory, and chest CTA findings in COVID-19 patients (n = 100) and two control groups: normal subjects (n = 10) and subjects with organizing pneumonia (n = 10). On chest CTA, we measured basal dimensions of the RV and LV in a focused 4-chamber view and dimensions of the pulmonary artery (PA) and aorta (AO) at the level of the PA bifurcation. Among the COVID-19 cohort, a higher RV:LV ratio was correlated with adverse outcomes, defined as ICU admission, intubation, or death. In patients with adverse outcomes, the RV:LV ratio was 1.06 ± 0.10, versus 0.95 ± 0.15 in patients without adverse outcomes. In the adverse outcomes group, compared with the control subjects with organizing pneumonia, lung parenchymal damage was lower (22.6 ± 9.0 vs. 32.7 ± 6.6), yet the RV:LV ratio was higher (1.06 ± 0.14 vs. 0.89 ± 0.07). In ROC analysis, the RV:LV ratio had an AUC of 0.707, with an optimal cutoff of RV:LV ≥ 1.1 as a predictor of adverse outcomes. In a validation cohort (n = 25), an RV:LV ≥ 1.1 cutoff predicted adverse outcomes with an odds ratio of 76:1. CONCLUSIONS: In COVID-19 patients, an RV:LV ratio ≥ 1.1 on chest CTA is correlated with adverse outcomes. RV dilation in COVID-19 is out of proportion to parenchymal lung damage, pointing toward a vascular and/or thrombotic injury in the lungs.
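The screening rule above reduces to one ratio and one cutoff. A minimal sketch, with hypothetical basal dimensions (mm) from a focused 4-chamber CTA view:

```python
def rv_lv_ratio(rv_basal_mm, lv_basal_mm):
    # Ratio of basal RV to basal LV dimension, as measured on CTA
    return rv_basal_mm / lv_basal_mm

def predicts_adverse_outcome(ratio, cutoff=1.1):
    # Cutoff from the ROC analysis above (RV:LV >= 1.1)
    return ratio >= cutoff

# Hypothetical measurements:
print(predicts_adverse_outcome(rv_lv_ratio(46.0, 40.0)))  # prints True  (1.15)
print(predicts_adverse_outcome(rv_lv_ratio(42.0, 40.0)))  # prints False (1.05)
```

Using a dimensionless ratio rather than absolute RV size makes the threshold portable across patients of different body sizes, which is why RV:LV (rather than RV diameter alone) is the study's predictor.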

9.
Clin Cardiol ; 45(7): 742-751, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35419844

ABSTRACT

BACKGROUND: Among subjects with exercise intolerance and suspected early-stage pulmonary hypertension (PH), early identification of pulmonary vascular disease (PVD) with noninvasive methods is essential for prompt PH management. HYPOTHESIS: Rest gas exchange parameters (minute ventilation to carbon dioxide production ratio: VE/VCO2; and end-tidal carbon dioxide: ETCO2) can identify PVD in early-stage PH. METHODS: We conducted a retrospective review of 55 subjects with early-stage PH (per echocardiogram) undergoing invasive exercise hemodynamics with cardiopulmonary exercise testing to distinguish mechanisms of exercise intolerance. Based on the rest and exercise hemodynamics, three distinct phenotypes were defined: (1) PVD, (2) pulmonary venous hypertension, and (3) noncardiac dyspnea (no rest or exercise PH). For all tests, *p < .05 was considered statistically significant. RESULTS: The mean age was 63.3 ± 13.4 years (53% female). In the overall cohort, higher rest VE/VCO2 and lower rest ETCO2 (mm Hg) correlated with high rest and exercise pulmonary vascular resistance (PVR) (r ~ 0.5-0.6*). On receiver-operating characteristic analysis to predict PVD (vs. non-PVD) with noninvasive metrics, the area under the curve was 0.53 for pulmonary artery systolic pressure (echocardiogram), 0.70* for rest VE/VCO2, and 0.73* for rest ETCO2. Based on this, optimal thresholds of rest VE/VCO2 > 40 and rest ETCO2 < 30 mm Hg were applied to the overall cohort. Subjects with both abnormal gas exchange parameters (n = 12, vs. both normal parameters, n = 19) had an exercise PVR of 5.2 ± 2.6* (vs. 1.9 ± 1.2) and an mPAP/CO slope with exercise of 10.2 ± 6.0* (vs. 2.9 ± 2.0), and this group included no subjects from the noncardiac dyspnea group. CONCLUSIONS: In a broad cohort of subjects with suspected early-stage PH referred for invasive exercise testing to distinguish mechanisms of exercise intolerance, rest gas exchange parameters (VE/VCO2 > 40 and ETCO2 < 30 mm Hg) identify PVD.
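The two rest thresholds above combine into a both-abnormal screening rule. A minimal sketch (thresholds from the study; the sample values are hypothetical):

```python
def suggests_pvd(rest_ve_vco2, rest_etco2_mmhg):
    """Both-abnormal rest gas exchange screen for pulmonary vascular disease:
    VE/VCO2 > 40 (dimensionless) AND ETCO2 < 30 mm Hg."""
    return rest_ve_vco2 > 40.0 and rest_etco2_mmhg < 30.0

# Hypothetical rest measurements:
print(suggests_pvd(44.0, 27.0))  # prints True  (both abnormal)
print(suggests_pvd(35.0, 33.0))  # prints False (both normal)
```

Requiring both parameters to be abnormal trades sensitivity for specificity, consistent with the study's observation that the both-abnormal group contained no noncardiac dyspnea subjects.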


Subject(s)
Hypertension, Pulmonary , Carbon Dioxide , Dyspnea/diagnosis , Dyspnea/etiology , Exercise Test/methods , Female , Hemodynamics , Humans , Hypertension, Pulmonary/diagnosis , Male , Oxygen Consumption
10.
Transplantation ; 106(6): 1143-1158, 2022 06 01.
Article in English | MEDLINE | ID: mdl-34856598

ABSTRACT

Cardiovascular events, including ischemic heart disease, heart failure, and arrhythmia, are common complications after kidney transplantation and continue to be leading causes of graft loss. Kidney transplant recipients have both traditional and transplant-specific risk factors for cardiovascular disease. In the general population, modification of cardiovascular risk factors is the best strategy to reduce cardiovascular events; however, studies evaluating the impact of risk modification strategies on cardiovascular outcomes among kidney transplant recipients are limited. Furthermore, there is only minimal guidance on appropriate cardiovascular screening and monitoring in this unique patient population. This review focuses on the limited scientific evidence that addresses cardiovascular events in kidney transplant recipients. Additionally, we focus on clinical management of specific cardiovascular entities that are more prevalent among kidney transplant recipients (ie, pulmonary hypertension, valvular diseases, diastolic dysfunction) and the use of newer evolving drug classes for treatment of heart failure within this cohort of patients. We note that there are no consensus documents describing optimal diagnostic, monitoring, or management strategies to reduce cardiovascular events after kidney transplantation; however, we outline quality initiatives and research recommendations for the assessment and management of cardiovascular-specific risk factors that could improve outcomes.


Subject(s)
Cardiovascular Diseases , Heart Failure , Kidney Transplantation , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/etiology , Cohort Studies , Heart Failure/etiology , Humans , Kidney Transplantation/adverse effects , Risk Factors , Transplant Recipients
11.
J Am Heart Assoc ; 10(15): e019655, 2021 08 03.
Article in English | MEDLINE | ID: mdl-34315285

ABSTRACT

Background It is unclear whether the recent increase in the number of heart transplants performed annually in the United States is only because of higher availability of donors and if it affected recipients' survival. Methods and Results We examined characteristics of donors and recipients from 2008 to 2012 (n=11 654) and 2013 to 2017 (n=14 556) and compared them with 2003 to 2007 (n=10 869). Cox models examined 30-day and 1-year risk of recipients' death post transplant. From 2013 to 2017, there was an increase in the number of transplanted hearts and number of donor offers but an overall decline in the ratio of hearts transplanted to available donors. Donors between 2013 and 2017 were older, heavier, more hypertensive, diabetic, and likely to have abused illicit drugs compared with previous years. Drug overdose and hepatitis C positive donors were additional contributors to donor risk in recent years. In Cox models, risk of death post transplant between 2013 and 2017 was 15% lower at 30 days (hazard ratio [HR] 0.85; 95% CI, 0.74-0.98) and 21% lower at 1 year (HR, 0.79; 95% CI, 0.73-0.87) and between 2008 and 2012 was 9% lower at 30 days (HR, 0.91; 95% CI, 0.79-1.05) and 14% lower at 1 year (HR, 0.86; 95% CI, 0.79-0.94) compared with 2003 to 2007. Conclusions Despite a substantial increase in heart donor offers in recent years, the ratio of transplants performed to available donors has decreased. Even though hearts from donors who are older, more hypertensive, and have diabetes mellitus are being used, overall recipient survival continues to improve. Broader acceptance of drug overdose and hepatitis C positive donors may increase the number and percentage of heart transplants further without jeopardizing short-term outcomes.


Subject(s)
Donor Selection/trends , Heart Failure/surgery , Heart Transplantation/trends , Tissue Donors/supply & distribution , Adolescent , Adult , Cause of Death/trends , Child , Child, Preschool , Databases, Factual , Drug Overdose/mortality , Female , Heart Failure/diagnosis , Heart Failure/mortality , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Hepatitis C/diagnosis , Humans , Infant , Infant, Newborn , Male , Middle Aged , Patient Safety , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome , United States , Young Adult
12.
Clin Transplant ; 35(10): e14400, 2021 10.
Article in English | MEDLINE | ID: mdl-34181771

ABSTRACT

BACKGROUND: Orthotopic heart transplant (OHT) recipients with a body mass index (BMI) ≥35 have worse survival than those with a BMI <35. Diabetes is a risk factor for mortality. We evaluated the impact of diabetes on mortality rates after OHT in patients with a BMI ≥35. METHODS: Patients >18 years of age who underwent OHT from 2008 to 2017 with a BMI ≥35 were identified in the United Network for Organ Sharing (UNOS) database. Recipient and donor characteristics were compared. A Kaplan-Meier analysis was performed. A multivariable Cox proportional hazards model examined the relationship between diabetes and survival. The equivalence of survival outcomes was examined by an unadjusted Cox proportional hazards model and the two one-sided test procedure, using a pre-specified equivalence region. RESULTS: Patients with diabetes were older, had a higher creatinine, lower bilirubin, and fewer months on the waitlist, and their donors were less likely to be on inotropes. Kaplan-Meier analysis showed no difference in patient survival. Recipient factors associated with an increased risk of death were increasing bilirubin and machine ventilation. Increasing ischemic time also increased the hazard of death. Long-term survival outcomes were equivalent. CONCLUSIONS: In OHT recipients with a BMI ≥35, there is no statistical difference in long-term survival between recipients with and without diabetes. These results encourage continued consideration for OHT in patients with a BMI ≥35 and coexisting diabetes.


Subject(s)
Diabetes Mellitus , Heart Transplantation , Body Mass Index , Humans , Kaplan-Meier Estimate , Proportional Hazards Models , Retrospective Studies , Survival Rate , Treatment Outcome , Waiting Lists
13.
J Card Surg ; 36(7): 2342-2347, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33861471

ABSTRACT

BACKGROUND: Left ventricular assist devices (LVAD) are typically implanted via full sternotomy. Nonsternotomy approaches are gaining popularity, but the potential benefits of these approaches have not been well studied. We hypothesized that LVAD implantation by bi-thoracotomy (BT) would demonstrate smaller and more consistent inflow cannula angles, leading to improved postoperative outcomes compared with sternotomy. METHODS: Charts of patients who underwent LVAD implantation between June 2018 and June 2020 at a single academic institution were retrospectively reviewed. Patient demographics, surgical approach (sternotomy vs. BT), laboratory values, and postoperative course were compared. The inflow cannula angle was measured on the first available postoperative chest radiograph. RESULTS: Of the 40 patients studied, the BT approach was used in 17 (42.5%). Mean inflow cannula angles were smaller in BT patients (23.0 vs. 37.1 degrees, p = .018), with a smaller standard deviation (13.8 vs. 20.3). Excluding patients who went on to receive a heart transplant or died during the same hospitalization, there was no difference in median length of hospital stay after surgery (16.0 vs. 17.5 days, p = .768). However, BT patients required fewer days of postoperative inotrope support (4.0 vs. 7.0 days, p = .012). CONCLUSIONS: Our data suggest inflow cannula angles are smaller and more consistent with the BT approach, which leads to a shorter duration of postoperative inotropic support. This finding may suggest improved right heart function following LVAD implantation via the BT approach. Further study is warranted to determine additional benefits of the BT approach.


Subject(s)
Heart Failure , Heart-Assist Devices , Heart Failure/surgery , Humans , Prosthesis Implantation , Retrospective Studies , Sternotomy , Thoracotomy
14.
Circ Arrhythm Electrophysiol ; 14(3): e007954, 2021 03.
Article in English | MEDLINE | ID: mdl-33685207

ABSTRACT

Orthotopic heart transplantation remains the most effective therapy for patients with end-stage heart failure, with a median survival of ≈13 years. Yet a number of complications are observed after orthotopic heart transplantation, including atrial and ventricular arrhythmias. Several factors contribute to arrhythmias, such as autonomic denervation, effects of the surgical technique, acute and chronic rejection, and transplant vasculopathy, among others. To minimize the risk of future arrhythmias, the bicaval technique and minimizing ischemic time are current surgical standards. Sinus node dysfunction is the most common indication for early (within 30 days) pacemaker implantation, whereas atrioventricular block incidence increases as time from transplant increases. Atrial fibrillation can occur in the first few weeks following transplantation but is uncommon in the long term unless secondary to a precipitant such as acute rejection. The most common atrial arrhythmias are atrial flutters, which are mainly typical, but atypical circuits can be observed, such as those involving the remnant donor atrium in regions immediately adjacent to the atrioatrial anastomosis suture line. Choosing the appropriate pharmacological therapy requires careful consideration because of potential interactions with immunosuppressive agents. Despite historical concerns, adenosine is effective and safe at reduced doses if administered under cardiac monitoring. Catheter ablation has emerged as an effective treatment strategy for symptomatic supraventricular tachycardias, including ablation of atypical flutter circuits. Cardiac allograft vasculopathy is an important risk factor for sudden cardiac death, yet the role of prophylactic implantable cardioverter-defibrillator implantation for sudden death prevention is unclear. Current indications for implantable cardioverter-defibrillator implantation are the same as in the nontransplant population. A number of questions for future research are posed.


Subject(s)
Anti-Arrhythmia Agents/therapeutic use , Arrhythmias, Cardiac/therapy , Catheter Ablation , Electric Countershock , Heart Rate/drug effects , Heart Transplantation/adverse effects , Action Potentials , Animals , Anti-Arrhythmia Agents/adverse effects , Arrhythmias, Cardiac/etiology , Arrhythmias, Cardiac/mortality , Arrhythmias, Cardiac/physiopathology , Catheter Ablation/adverse effects , Catheter Ablation/mortality , Defibrillators, Implantable , Electric Countershock/adverse effects , Electric Countershock/instrumentation , Electric Countershock/mortality , Heart Transplantation/mortality , Humans , Risk Factors , Treatment Outcome
15.
Am J Transplant ; 21(9): 3005-3013, 2021 09.
Article in English | MEDLINE | ID: mdl-33565674

ABSTRACT

There are no prior studies assessing the risk factors and outcomes for kidney delayed graft function (K-DGF) in simultaneous heart and kidney (SHK) transplant recipients. Using the OPTN/UNOS database, we sought to identify risk factors associated with the development of K-DGF in this unique population, as well as outcomes associated with K-DGF. A total of 1161 SHK recipients transplanted between 1998 and 2018 were included in the analysis, of whom 311 (27%) were in the K-DGF (+) group and 850 in the K-DGF (-) group. In the multivariable analysis, a history of pretransplant dialysis (OR: 3.95; 95% CI: 2.94 to 5.29; p < .001) was significantly associated with the development of K-DGF, as were donor death from cerebrovascular accident and longer cold ischemia time of either organ. In the multivariable analysis, SHK recipients with K-DGF had increased mortality (HR: 1.99; 95% CI: 1.52 to 2.60; p < .001) and death-censored kidney graft failure (HR: 3.51; 95% CI: 2.29 to 5.36; p < .001). Similar outcomes were obtained when limiting our study to 2008-2018. As in kidney-only recipients, K-DGF in SHK recipients is associated with worse outcomes. Careful matching of recipients and donors, as well as perioperative management, may help reduce the risk of K-DGF and its associated detrimental effects.


Subject(s)
Delayed Graft Function , Kidney Transplantation , Delayed Graft Function/etiology , Graft Rejection/etiology , Graft Survival , Humans , Kidney , Kidney Transplantation/adverse effects , Retrospective Studies , Risk Factors , Tissue Donors
16.
J Card Surg ; 36(3): 801-805, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33415793

ABSTRACT

OBJECTIVES: Patients on left ventricular assist device (LVAD) support receive extensive care and education before discharge home. We investigated the impact of patients' residential distance from the LVAD implantation center on outcomes and survival. METHODS: A total of 214 patients received an LVAD between 2006 and 2018 at our institution. Residential distance from the LVAD implantation center, LVAD complications, hospitalizations, and deaths were recorded. Patients were divided into two groups: those living 100 miles or less from the center (Group 1) and those living more than 100 miles away (Group 2). RESULTS: A total of 106 patients were assigned to Group 1 and 108 patients to Group 2. Destination therapy was intended in 20% of patients in Group 1 and 34% in Group 2 (p = .023). Mean length of stay was 13 ± 9 days for Group 1 and 21 ± 12 days for Group 2 (p < .001). Major postoperative complications were unplanned readmissions due to infections (9% and 12%), gastrointestinal bleeding (15% and 14%), cerebrovascular accidents (6% and 7.4%), and acute kidney injury (5% and 2%), respectively, for Group 1 and Group 2. There was no difference in major complications (all p > .05) or survival between the groups (p > .05). CONCLUSIONS: Distance from the implanting center had no impact on adverse outcomes after LVAD implantation. There was a significant increase in hospital stay for patients who live far from the implanting center, suggesting that distance should not be a contraindication when considering patients for LVAD therapy, but plans should be made for a prolonged hospital stay or an extended local stay near the hospital for close follow-up.


Subject(s)
Heart Failure , Heart-Assist Devices , Heart Failure/therapy , Humans , Length of Stay , Postoperative Complications/epidemiology , Retrospective Studies , Treatment Outcome
17.
J Card Surg ; 36(3): 1148-1149, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33448478

ABSTRACT

The coronavirus disease 2019 pandemic has created unprecedented challenges for society, and for the medical community in particular. As the pandemic continues to unfold, the transplant community has had to pivot to keep recipients, donors, and institutional transplant teams safe, given the unique circumstances inherent to solid organ transplantation.


Subject(s)
COVID-19/epidemiology , Heart Failure/surgery , Heart Transplantation/methods , Pandemics , Tissue and Organ Procurement/methods , Transplant Recipients , Comorbidity , Heart Failure/epidemiology , Humans , Male , Middle Aged , SARS-CoV-2 , Tissue Donors
18.
Int J Cardiol ; 326: 131-138, 2021 03 01.
Article in English | MEDLINE | ID: mdl-33091520

ABSTRACT

AIM: Heart failure following myocardial infarction (MI) is a potentially lethal problem with a staggering incidence. The CardiAMP Heart Failure trial represents the first attempt to personalize marrow-derived cell-based therapy to individuals with cell characteristics associated with beneficial responses in prior trials. Before initiation of the randomized pivotal trial, an open-label "roll-in" cohort was completed to ensure the feasibility of the protocol's procedures. METHODS: Patients with chronic post-MI heart failure (NYHA class II-III) receiving stable, guideline-directed medical therapy with a left ventricular ejection fraction between 20% and 40% were eligible. Two weeks prior to treatment, a ~5 mL bone marrow aspiration was performed to examine "cell potency." On treatment day, a 60 mL bone marrow aspiration, bone marrow mononuclear cell (BM MNC) enrichment, and transendocardial injection of 200 million BM MNCs were performed in a single, point-of-care encounter. Patients were then followed to assess clinical outcomes. RESULTS: The small-volume cell-potency bone marrow aspirate, the 60 mL bone marrow aspirate, and the transendocardial injections were well tolerated in the 10 patients enrolled. There were no serious adverse events related to bone marrow aspiration or cell delivery. Improvement in 6-min walk distance was observed at 6 months (+47.8 m, P = 0.01) and trended toward improvement at 12 months (+46.4 m, P = 0.06). Similarly, trends toward improved NYHA heart failure functional class, quality of life, left ventricular ejection fraction, and recruitment of previously akinetic left ventricular wall segments were observed. CONCLUSION: All CardiAMP HF protocol procedures were feasible and well tolerated. Favorable functional, echocardiographic, and quality-of-life trends suggest this approach may offer promise for patients with post-MI heart failure. The randomized CardiAMP Heart Failure pivotal trial is underway to confirm the efficacy of this approach.
CLINICAL TRIAL REGISTRATION: https://clinicaltrials.gov/ct2/show/NCT02438306.


Subject(s)
Heart Failure , Myocardial Ischemia , Bone Marrow , Bone Marrow Transplantation , Cell- and Tissue-Based Therapy , Feasibility Studies , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Point-of-Care Systems , Quality of Life , Stroke Volume , Treatment Outcome , Ventricular Function, Left
19.
Cardiovasc Drugs Ther ; 35(1): 33-40, 2021 02.
Article in English | MEDLINE | ID: mdl-33074524

ABSTRACT

PURPOSE: It remains unclear whether amiodarone use before cardiac transplantation affects early post-transplant survival. METHODS: Using the United Network for Organ Sharing database, we selected all patients undergoing heart transplant from 2004 to 2006 with available information (n = 4057). Multivariable Cox models compared the risk of death within 30 days post-transplant in patients who were taking amiodarone at the time of transplant listing (n = 1227) with those who were not (n = 2830). RESULTS: Mean age was 52 (± 12) years, and 23% were women. Patients who died within 30 days (n = 168) were older; had higher panel-reactive antibody levels, higher bilirubin levels, and a higher prevalence of prior cardiac surgery; were more often at status 1B; and had higher use of amiodarone at listing compared with those who survived (5.3% versus 3.6%; p = 0.02). Cause of death was unknown in 49% of cases and reported as graft failure in 43%. In multivariable Cox models, patients on amiodarone at the time of listing had a 1.56-fold higher risk of post-transplant death within 30 days (95% confidence interval 1.08-2.27) compared with patients who were not on amiodarone at listing (C-statistic 0.70). CONCLUSION: Patients who reported taking amiodarone at the time of listing for transplant had a higher risk of death within 30 days post-transplant.


Subject(s)
Amiodarone/therapeutic use , Anti-Arrhythmia Agents/therapeutic use , Heart Transplantation/mortality , Adult , Age Factors , Aged , Amiodarone/administration & dosage , Anti-Arrhythmia Agents/administration & dosage , Female , Graft Survival/physiology , Humans , Male , Middle Aged , Patient Acuity , Proportional Hazards Models , Retrospective Studies
20.
Clin Nephrol ; 94(6): 273-280, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32909547

ABSTRACT

BACKGROUND: This study aimed to determine the risk factors associated with cardiac events 1 year after transplant in kidney transplant recipients (KTRs). MATERIAL AND METHODS: We analyzed the incidence of cardiac events in all KTRs transplanted at our center between 01/2000 and 12/2016 who had non-obstructive cardiac catheterization findings at their pre-transplant evaluation. RESULTS: We identified 141 patients with non-obstructive pre-transplant cardiac catheterization. Of these, 83 patients (59%) had cardiac events 1 year after kidney transplant during a mean follow-up of 7.3 ± 5.3 years. Multivariate Cox regression analysis identified dialysis ≥ 1 year (HR = 2.27, 95% CI 1.41-3.67, p = 0.001), body mass index (BMI) ≥ 35 kg/m2 at the time of transplant (HR = 2.24, 95% CI 1.43-3.52, p = 0.0004), tacrolimus trough ≥ 7 ng/mL at 1 year post-transplant (HR = 4.24, 95% CI 1.95-9.22, p = 0.0003), and HbA1c ≥ 7% at 1 year post-transplant (HR = 1.71, 95% CI 1.09-2.70, p = 0.02) as significant predictors of cardiac events 1 year post-transplant. In unadjusted Kaplan-Meier analysis, any cardiac event post-transplant was associated with a significant risk of death or graft loss (p = 0.02). CONCLUSION: Dialysis duration, morbid obesity, diabetes control, and tacrolimus levels may represent modifiable risk factors for reducing cardiac events in kidney transplant recipients with non-obstructive cardiac catheterization findings at the time of transplant.


Subject(s)
Coronary Angiography , Coronary Disease , Graft Rejection , Kidney Transplantation , Transplant Recipients/statistics & numerical data , Coronary Disease/complications , Coronary Disease/diagnostic imaging , Coronary Disease/epidemiology , Graft Rejection/complications , Graft Rejection/epidemiology , Humans , Incidence , Kidney Transplantation/adverse effects , Kidney Transplantation/statistics & numerical data