Results 1 - 20 of 48
1.
J Heart Lung Transplant ; 43(6): 878-888, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38244649

ABSTRACT

BACKGROUND: This study evaluates the clinical trends, risk factors, and effects of post-transplant stroke and subsequent functional independence on outcomes following orthotopic heart transplantation under the 2018 heart allocation system. METHODS: The United Network for Organ Sharing registry was queried to identify adult recipients from October 18, 2018 to December 31, 2021. The cohort was stratified into 2 groups with and without post-transplant stroke. The incidence of post-transplant stroke was compared before and after the allocation policy change. Outcomes included post-transplant survival and complications. Multivariable logistic regression was performed to identify risk factors for post-transplant stroke. Sub-analysis was performed to evaluate the impact of functional independence among recipients with post-transplant stroke. RESULTS: A total of 9,039 recipients were analyzed in this study. The incidence of post-transplant stroke was higher following the policy change (3.8% vs 3.1%, p = 0.017). Thirty-day (81.4% vs 97.7%) and 1-year (66.4% vs 92.5%) survival rates were substantially lower in the stroke cohort (p < 0.001). The stroke cohort had a higher rate of post-transplant renal failure, longer hospital length of stay, and worse functional status. Multivariable analysis identified extracorporeal membrane oxygenation, durable left ventricular assist device, blood type O, and redo heart transplantation as strong predictors of post-transplant stroke. Preserved functional independence considerably improved 30-day (99.2% vs 61.2%) and 1-year (97.7% vs 47.4%) survival rates among the recipients with post-transplant stroke (p < 0.001). CONCLUSIONS: There is a higher incidence of post-transplant stroke under the 2018 allocation system, and it is associated with significantly worse post-transplant outcomes. However, post-transplant stroke recipients with preserved functional independence have improved survival, similar to those without post-transplant stroke.
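For readers who want to reproduce this kind of risk-factor analysis on a registry extract, the sketch below shows one way the multivariable logistic regression described above could be set up. It is illustrative only; the file name and column names (stroke, ecmo, durable_lvad, blood_type_o, redo_transplant, age, female) are assumptions, not actual UNOS field names.

```python
# Illustrative sketch (not the authors' code) of a multivariable logistic regression
# for post-transplant stroke. CSV and column names are assumed placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("unos_recipients.csv")  # one row per transplant recipient (assumed)

model = smf.logit(
    "stroke ~ ecmo + durable_lvad + blood_type_o + redo_transplant + age + female",
    data=df,
).fit()

# Exponentiate coefficients to report adjusted odds ratios with 95% CIs
ci = model.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(ci[0]),
    "97.5%": np.exp(ci[1]),
}).drop(index="Intercept")
print(odds_ratios)
```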


Subject(s)
Heart Transplantation , Postoperative Complications , Stroke , Humans , Male , Female , United States/epidemiology , Middle Aged , Stroke/epidemiology , Postoperative Complications/epidemiology , Risk Factors , Retrospective Studies , Tissue and Organ Procurement , Incidence , Registries , Survival Rate/trends , Adult , Aged , Follow-Up Studies
2.
J Thorac Cardiovasc Surg ; 167(3): 1064-1076.e2, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37480982

ABSTRACT

OBJECTIVE: This study aimed to investigate the clinical trends and the impact of the 2018 heart allocation policy change on both waitlist and post-transplant outcomes in simultaneous heart-kidney transplantation in the United States. METHODS: The United Network for Organ Sharing registry was queried to compare adult patients before and after the allocation policy change. This study included 2 separate analyses evaluating the waitlist and post-transplant outcomes. Multivariable analyses were performed to determine the 2018 allocation system's risk-adjusted hazards for 1-year waitlist and post-transplant mortality. RESULTS: The initial analysis investigating the waitlist outcomes included 1779 patients listed for simultaneous heart-kidney transplantation. Of these, 1075 patients (60.4%) were listed after the 2018 allocation policy change. After the policy change, the waitlist outcomes significantly improved with a shorter waitlist time, lower likelihood of de-listing, and higher likelihood of transplantation. In the subsequent analysis investigating the post-transplant outcomes, 1130 simultaneous heart-kidney transplant recipients were included, where 738 patients (65.3%) underwent simultaneous heart-kidney transplantation after the policy change. The 90-day, 6-month, and 1-year post-transplant survival and complication rates were comparable before and after the policy change. Multivariable analyses demonstrated that the 2018 allocation system positively impacted risk-adjusted 1-year waitlist mortality (sub-hazard ratio, 0.66, 95% CI, 0.51-0.85, P < .001), but it did not significantly impact risk-adjusted 1-year post-transplant mortality (hazard ratio, 1.03; 95% CI, 0.72-1.47, P = .876). CONCLUSIONS: This study demonstrates increased rates of simultaneous heart-kidney transplantation with a shorter waitlist time after the 2018 allocation policy change. Furthermore, there were improved waitlist outcomes and comparable early post-transplant survival after simultaneous heart-kidney transplantation under the 2018 allocation system.
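The waitlist analysis above treats transplantation and death/deterioration as competing events. The sketch below is a simplified, assumption-laden illustration of that framing: the Fine-Gray sub-hazard model the authors report is not part of lifelines, so this instead estimates the Aalen-Johansen cumulative incidence of transplantation with death/deterioration as the competing event. The file and column names (days_on_list, event coded 0 = censored, 1 = transplanted, 2 = died/deteriorated, era) are assumed.

```python
# Illustrative competing-risks sketch, not the study's Fine-Gray regression.
import pandas as pd
from lifelines import AalenJohansenFitter

wl = pd.read_csv("shk_waitlist.csv")  # assumed waitlist extract

for era, grp in wl.groupby("era"):  # e.g., "pre-2018" vs "post-2018"
    ajf = AalenJohansenFitter()
    ajf.fit(grp["days_on_list"], grp["event"], event_of_interest=1)
    # last row of the cumulative incidence function = overall incidence of transplant
    print(era, ajf.cumulative_density_.iloc[-1, 0])
```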


Subject(s)
Heart Transplantation , Kidney Transplantation , Adult , Humans , United States , Kidney Transplantation/adverse effects , Heart Transplantation/adverse effects , Proportional Hazards Models , Waiting Lists , Retrospective Studies
3.
J Thorac Cardiovasc Surg ; 167(5): 1845-1860.e12, 2024 May.
Article in English | MEDLINE | ID: mdl-37714368

ABSTRACT

OBJECTIVE: To quantitate the impact of heart donation after circulatory death (DCD) donor utilization on both waitlist and post-transplant outcomes in the United States. METHODS: The United Network for Organ Sharing database was queried to identify all adult waitlisted and transplanted candidates between October 18, 2018, and December 31, 2022. Waitlisted candidates were stratified according to whether they had been approved for donation after brain death (DBD) offers only or also approved for DCD offers. The cumulative incidence of transplantation was compared between the 2 cohorts. In a post-transplant analysis, 1-year post-transplant survival was compared between unmatched and propensity-score-matched cohorts of DBD and DCD recipients. RESULTS: A total of 14,803 candidates were waitlisted, including 12,287 approved for DBD donors only and 2516 approved for DCD donors. Overall, DCD approval was associated with an increased sub-hazard ratio (HR) for transplantation and a lower sub-HR for delisting owing to death/deterioration after risk adjustment. In a subgroup analysis, candidates with blood type B and status 4 designation received the greatest benefit from DCD approval. A total of 12,238 recipients underwent transplantation, 11,636 with DBD hearts and 602 with DCD hearts. Median waitlist times were significantly shorter for status 3 and status 4 recipients receiving DCD hearts. One-year post-transplant survival was comparable between unmatched and propensity score-matched cohorts of DBD and DCD recipients. CONCLUSIONS: The use of DCD hearts confers a higher probability of transplantation and a lower incidence of death/deterioration while on the waitlist, particularly among certain subpopulations such as status 4 candidates. Importantly, the use of DCD donors results in similar post-transplant survival as DBD donors.
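A propensity-score-matched comparison like the one above can be sketched as follows. This is a simplified illustration, not the study's specification: covariate and column names are assumptions, and a real analysis would add a caliper and match without replacement.

```python
# Illustrative 1:1 nearest-neighbor propensity-score matching of DCD to DBD recipients.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("recipients.csv")                       # assumed extract
covars = ["age", "creatinine", "bilirubin", "ischemic_time", "status"]

# Propensity score: probability of receiving a DCD heart given the covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["dcd"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

dcd, dbd = df[df["dcd"] == 1], df[df["dcd"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(dbd[["ps"]])
_, idx = nn.kneighbors(dcd[["ps"]])                      # nearest DBD for each DCD
matched = pd.concat([dcd, dbd.iloc[idx.ravel()]])
# 1-year survival would then be compared within `matched` (e.g., Kaplan-Meier).
```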


Subject(s)
Heart Transplantation , Tissue and Organ Procurement , Adult , Humans , Brain Death , Tissue Donors , Heart Transplantation/adverse effects , Probability , Brain , Retrospective Studies , Graft Survival
4.
Heart Rhythm O2 ; 4(11): 708-714, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38034894

ABSTRACT

Background: Implantable cardioverter-defibrillator (ICD) shocks after left ventricular assist device (LVAD) therapy are associated with adverse clinical outcomes. Little is known about the association of pre-LVAD ICD shocks with post-LVAD clinical outcomes and whether LVAD therapy affects the prevalence of ICD shocks. Objectives: The purpose of this study was to determine whether pre-LVAD ICD shocks are associated with adverse clinical outcomes post-LVAD and to compare the prevalence of ICD shocks before and after LVAD therapy. Methods: Patients 18 years or older with continuous-flow LVADs and ICDs were retrospectively identified within the University of Pittsburgh Medical Center system from 2006 to 2020. We analyzed the association between appropriate ICD shocks within 1 year pre-LVAD and a primary composite outcome of death, stroke, and pump thrombosis, with secondary outcomes of post-LVAD ICD shocks and ICD shock hospitalizations. Results: Among 309 individuals, the average age was 57 ± 12 years, 87% were male, 80% had ischemic cardiomyopathy, and 42% were bridge to transplantation. Seventy-one patients (23%) experienced pre-LVAD shocks, and 69 (22%) experienced post-LVAD shocks. The overall prevalence of shocks pre-LVAD and post-LVAD was not different. Pre-LVAD ICD shocks were not associated with the composite outcome. Pre-LVAD ICD shocks were found to predict post-LVAD shocks (hazard ratio [HR] 5.7; 95% confidence interval [CI] 3.42-9.48; P <.0001) and hospitalizations related to ICD shocks from ventricular arrhythmia (HR 10.34; 95% CI 4.1-25.7; P <.0001). Conclusion: Pre-LVAD ICD shocks predicted post-LVAD ICD shocks and hospitalizations but were not associated with the composite outcome of death, pump thrombosis, or stroke at 1 year. The prevalence of appropriate ICD shocks was similar before and after LVAD implantation in the entire cohort.

5.
Clin Transplant ; 37(12): e15132, 2023 12.
Article in English | MEDLINE | ID: mdl-37705362

ABSTRACT

In this project, we describe proteasome inhibitor (PI) treatment of antibody-mediated rejection (AMR) in heart transplantation (HTX). From January 2018 to September 2021, 10 patients were treated with PI for AMR: carfilzomib (CFZ), n = 8; bortezomib (BTZ), n = 2. Patients received 1-3 cycles of PI. All patients had ≥1 strong donor-specific antibody (DSA) (mean fluorescence intensity [MFI] > 8000) in undiluted serum. Most DSAs (20/21) had HLA class II specificity. The MFI of strong DSAs had a median reduction of 56% (IQR = 13%-89%) in undiluted serum and 92% (IQR = 53%-95%) at 1:16 dilution. Seventeen DSAs in seven patients were reduced by > 50% at 1:16 dilution after treatment. Four DSAs from three patients did not respond. DSAs with MFI > 8000 at 1:16 dilution were less responsive to treatment. Six of 10 patients (60%) presented with graft dysfunction; 4 of 6 recovered an ejection fraction > 40% after treatment. Pathologic AMR resolved in 5 of 7 patients (71.4%) within 1 year after treatment. Nine of 10 patients (90%) survived to 1 year after AMR diagnosis. Using PI in AMR resulted in significant DSA reduction with some resolution of graft dysfunction. Larger studies are needed to evaluate PI for AMR.


Subject(s)
Heart Transplantation , Kidney Transplantation , Humans , Proteasome Inhibitors/therapeutic use , Isoantibodies , Kidney Transplantation/adverse effects , HLA Antigens , Tissue Donors , Graft Rejection/drug therapy , Graft Rejection/etiology , Retrospective Studies
6.
J Heart Lung Transplant ; 42(7): 925-935, 2023 07.
Article in English | MEDLINE | ID: mdl-36973093

ABSTRACT

BACKGROUND: This study compared outcomes of patients waitlisted for orthotopic heart transplantation with durable left ventricular assist devices (LVAD) before and after the October 18, 2018 heart allocation policy change. METHODS: The United Network for Organ Sharing database was queried to identify 2 cohorts of adult candidates with durable LVAD listed within seasonally matched, equal-length periods before (old policy era [OPE]) and after the policy change (new policy era [NPE]). The primary outcomes were 2-year survival from the time of initial waitlisting, as well as 2-year post-transplant survival. Secondary outcomes included incidence of transplantation from the waitlist and de-listing due to either death or clinical deterioration. RESULTS: A total of 2,512 candidates were waitlisted, 1,253 within the OPE and 1,259 within the NPE. Candidates under both policies had similar 2-year survival after waitlisting, as well as a similar cumulative incidence of transplantation and de-listing due to death and/or clinical deterioration. A total of 2,560 patients were transplanted within the study period, 1,418 within the OPE and 1,142 within the NPE. Two-year post-transplant survival was similar between policy eras; however, the NPE was associated with a higher incidence of post-transplant stroke, renal failure requiring dialysis, and a longer hospital length of stay. CONCLUSIONS: The 2018 heart allocation policy has conferred no significant impact on overall survival from the time of initial waitlisting among durable LVAD-supported candidates. Similarly, the cumulative incidence of transplantation and waitlist mortality have also been largely unchanged. For those undergoing transplantation, a higher degree of post-transplant morbidity was observed, though survival was not impacted.


Subject(s)
Clinical Deterioration , Heart Failure , Heart Transplantation , Heart-Assist Devices , Adult , Humans , Heart Failure/surgery , Heart Failure/epidemiology , Heart-Assist Devices/adverse effects , Heart Transplantation/adverse effects , Registries
7.
Clin Transplant ; 37(5): e14937, 2023 05.
Article in English | MEDLINE | ID: mdl-36793206

ABSTRACT

BACKGROUND: Induction immunosuppression in heart transplant recipients varies greatly by center. Basiliximab (BAS) is the most commonly used induction immunosuppressant but has not been shown to reduce rejection or improve survival. The objective of this retrospective study was to compare rejection, infection, and mortality within the first 12 months following heart transplant in patients who received BAS or no induction. METHODS: This was a retrospective cohort study of adult heart transplant recipients given BAS or no induction from January 1, 2017 to May 31, 2021. The primary endpoint was the incidence of treated acute cellular rejection (ACR) at 12 months post-transplant. Secondary endpoints included ACR at 90 days post-transplant, incidence of antibody-mediated rejection (AMR) at 90 days and 1 year, incidence of infection, and all-cause mortality at 1 year. RESULTS: A total of 108 patients received BAS, and 26 patients received no induction within the specified timeframe. There was a lower incidence of ACR within the first year in the BAS group compared to the no-induction group (27.7% vs. 68.2%, p < .002). BAS was independently associated with a lower probability of having a rejection event during the first 12 months post-transplant (hazard ratio [HR] .285, 95% confidence interval [CI] .142-.571, p < .001). There was no difference in the rate of infection or in mortality after hospital discharge at 1 year post-transplant (6% vs. 0%, p = .20). CONCLUSION: BAS appears to be associated with greater freedom from rejection without an increase in infections. BAS may be preferable to a no-induction strategy in patients undergoing heart transplantation.


Subject(s)
Antibodies, Monoclonal , Heart Transplantation , Humans , Adult , Basiliximab , Antibodies, Monoclonal/therapeutic use , Retrospective Studies , Immunosuppressive Agents/therapeutic use , Immunosuppressive Agents/pharmacology , Graft Rejection/etiology , Recombinant Fusion Proteins/therapeutic use
8.
J Heart Lung Transplant ; 42(6): 795-806, 2023 06.
Article in English | MEDLINE | ID: mdl-36797078

ABSTRACT

BACKGROUND: This study evaluated the current clinical trends, risk factors, and temporal effects of post-transplant dialysis on outcomes following orthotopic heart transplantation after the 2018 United States adult heart allocation policy change. METHODS: The United Network for Organ Sharing (UNOS) registry was queried to analyze adult orthotopic heart transplant recipients after the October 18, 2018 heart allocation policy change. The cohort was stratified according to the need for post-transplant de novo dialysis. The primary outcome was survival. Propensity score-matching was performed to compare the outcomes between 2 similar cohorts with and without post-transplant de novo dialysis. The impact of post-transplant dialysis chronicity was evaluated. Multivariable logistic regression was performed to identify risk factors for post-transplant dialysis. RESULTS: A total of 7,223 patients were included in this study. Of these, 968 patients (13.4%) developed post-transplant renal failure requiring de novo dialysis. Both 1-year (73.2% vs 94.8%) and 2-year (66.3% vs 90.6%) survival rates were lower in the dialysis cohort (p < 0.001), and the lower survival rates persisted in a propensity-matched comparison. Recipients requiring only temporary post-transplant dialysis had significantly improved 1-year (92.5% vs 71.6%) and 2-year (86.6% vs 52.2%) survival rates compared to the chronic post-transplant dialysis group (p < 0.001). Multivariable analysis demonstrated that a low pretransplant estimated glomerular filtration rate (eGFR) and bridging with extracorporeal membrane oxygenation (ECMO) were strong predictors of post-transplant dialysis. CONCLUSIONS: This study demonstrates that post-transplant dialysis is associated with significantly increased morbidity and mortality in the new allocation system. Post-transplant survival is affected by the chronicity of post-transplant dialysis. Low pretransplant eGFR and ECMO are strong risk factors for post-transplant dialysis.


Subject(s)
Heart Failure , Heart Transplantation , Kidney Transplantation , Renal Insufficiency , Adult , Humans , United States/epidemiology , Renal Dialysis , Heart Transplantation/adverse effects , Risk Factors , Retrospective Studies , Treatment Outcome
9.
J Heart Lung Transplant ; 42(1): 76-86, 2023 01.
Article in English | MEDLINE | ID: mdl-36182653

ABSTRACT

BACKGROUND: Since the revision of the United States heart allocation system, increasing use of mechanical circulatory support has been observed as a means to support acutely ill patients. We sought to compare outcomes between patients bridged to orthotopic heart transplantation (OHT) with either temporary (t-LVAD) or durable (d-LVAD) left ventricular assist devices under the revised system. METHODS: The United Network for Organ Sharing database was queried to identify all adult OHT recipients who were bridged to transplant with either an isolated t-LVAD or d-LVAD from 10/18/2018 to 9/30/2020. The primary outcome was 1-year post-transplant survival. Predictors of mortality were also modeled, and national trends of LVAD bridging were examined across the study period. RESULTS: A total of 1,734 OHT recipients were analyzed, 1,580 (91.1%) bridged with d-LVAD and 154 (8.9%) bridged with t-LVAD. At transplant, the t-LVAD cohort had higher total bilirubin levels and a greater prevalence of pre-transplant intravenous inotrope usage and mechanical ventilation. Median waitlist time was also shorter for t-LVAD. At 1 year, there was a non-significant trend toward increased survival in the t-LVAD cohort (94.8% vs 90.1%; p = 0.06). After risk adjustment, d-LVAD was associated with a 4-fold hazard of 1-year mortality (hazard ratio 3.96, 95% confidence interval 1.42-11.03; p = 0.009). From 2018 to 2021, t-LVAD bridging increased, though d-LVAD remained the more common bridging strategy. CONCLUSIONS: Since the 2018 allocation change, there has been a steady increase in t-LVAD usage as a bridge to OHT. Overall, patients bridged with these devices appear to have at least equivalent 1-year survival compared to those bridged with d-LVAD.
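The risk-adjusted comparison above rests on a Cox proportional hazards model. The sketch below shows one plausible setup with lifelines; the file and column names are assumptions rather than UNOS fields, and follow-up is presumed truncated at one year in the assumed extract.

```python
# Illustrative Cox proportional hazards sketch for adjusted 1-year mortality,
# comparing d-LVAD (durable_lvad = 1) with t-LVAD (durable_lvad = 0) bridging.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("bridged_recipients.csv")
cols = ["days_followup", "died_1yr", "durable_lvad",
        "age", "bilirubin", "ventilator", "inotropes"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="days_followup", event_col="died_1yr")
cph.print_summary()  # exp(coef) on durable_lvad is the adjusted hazard ratio
```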


Subject(s)
Heart Failure , Heart Transplantation , Heart-Assist Devices , Adult , Humans , Heart Failure/surgery , Heart Failure/etiology , Heart-Assist Devices/adverse effects , Treatment Outcome , Retrospective Studies , Heart Transplantation/adverse effects
10.
J Cardiothorac Surg ; 17(1): 291, 2022 Nov 18.
Article in English | MEDLINE | ID: mdl-36401286

ABSTRACT

BACKGROUND: Anomalous coronary arteries arise in a small subset of the population, with each configuration conveying a varying degree of long-term risk. The utilization of cardiac grafts with these anomalies has not been well described. CASE PRESENTATION: An anomalous single coronary artery, with the left main coronary artery arising from the right coronary ostium, was discovered in a 40-year-old male evaluated for cardiac donation. After evaluation, this heart was successfully procured and utilized for orthotopic heart transplantation. CONCLUSION: In this report, we demonstrate that in select cases, a cardiac graft with single coronary artery anatomy can be successfully procured and transplanted with excellent outcomes.


Subject(s)
Coronary Artery Disease , Coronary Vessel Anomalies , Heart Transplantation , Humans , Male , Adult , Coronary Vessel Anomalies/surgery , Tissue Donors , Coronary Artery Disease/surgery
11.
Clin Infect Dis ; 75(1): e630-e644, 2022 08 24.
Article in English | MEDLINE | ID: mdl-35179197

ABSTRACT

BACKGROUND: We studied humoral responses after coronavirus disease 2019 (COVID-19) vaccination across varying causes of immunodeficiency. METHODS: Prospective study of fully vaccinated immunocompromised adults (solid organ transplant [SOT], hematologic malignancy, solid cancers, autoimmune conditions, human immunodeficiency virus [HIV]) versus nonimmunocompromised healthcare workers (HCWs). The primary outcome was the proportion with a reactive test (seropositive) for immunoglobulin G to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) receptor-binding domain. Secondary outcomes were comparisons of antibody levels and their correlation with pseudovirus neutralization titers. Stepwise logistic regression was used to identify factors associated with seropositivity. RESULTS: A total of 1271 participants enrolled: 1099 immunocompromised and 172 HCW. Compared with HCW (92.4% seropositive), seropositivity was lower among participants with SOT (30.7%), hematological malignancies (50.0%), autoimmune conditions (79.1%), solid tumors (78.7%), and HIV (79.8%) (P < .01). Factors associated with poor seropositivity included age, greater immunosuppression, time since vaccination, anti-CD20 monoclonal antibodies, and vaccination with BNT162b2 (Pfizer) or adenovirus vector vaccines versus messenger RNA (mRNA)-1273 (Moderna). mRNA-1273 was associated with higher antibody levels than BNT162b2 or adenovirus vector vaccines after adjusting for time since vaccination, age, and underlying condition. Antibody levels were strongly correlated with pseudovirus neutralization titers (Spearman r = 0.89, P < .0001), but in seropositive participants with intermediate antibody levels, neutralization titers were significantly lower in immunocompromised individuals versus HCW. CONCLUSIONS: Antibody responses to COVID-19 vaccines were lowest among SOT and anti-CD20 monoclonal recipients, and recipients of vaccines other than mRNA-1273. Among those with intermediate antibody levels, pseudovirus neutralization titers were lower in immunocompromised patients than HCWs. Additional SARS-CoV-2 preventive approaches are needed for immunocompromised persons, which may need to be tailored to the cause of immunodeficiency.


Subject(s)
COVID-19 , HIV Infections , Adult , Antibodies, Viral , BNT162 Vaccine , COVID-19/prevention & control , COVID-19 Vaccines , HIV Infections/complications , Humans , Immunocompromised Host , Prospective Studies , SARS-CoV-2 , Vaccination
12.
ASAIO J ; 68(3): 394-401, 2022 03 01.
Article in English | MEDLINE | ID: mdl-34593684

ABSTRACT

Before the 33rd Annual International Society for Heart and Lung Transplantation conference, there was significant intercenter variability in definitions of primary graft dysfunction (PGD). The incidence, risk factors, and outcomes of consensus-defined PGD warrant further investigation. We retrospectively examined 448 adult cardiac transplant recipients at our institution from 2005 to 2017. Patient and procedural characteristics were compared between PGD cases and controls. Multivariable logistic regression was used to model PGD and immediate postoperative high-inotrope requirement as functions of hypothesized risk factors. Patients were followed for a mean of 5.3 years to determine longitudinal mortality. The incidence of PGD was 16.5%. No significant differences were found with respect to age, sex, race, body mass index, predicted heart mass mismatch, pretransplant amiodarone therapy, or pretransplant mechanical circulatory support (MCS) between recipients with PGD versus no PGD. Each 10-minute increase in ischemic time was associated with 5% greater odds of PGD (OR = 1.05 [95% CI, 1.00-1.10]; p = 0.049). Pretransplant MCS, predicted heart mass mismatch ≥30%, and pretransplant amiodarone therapy were associated with a high immediate postoperative inotropic requirement. The 30-day, 1-year, and 5-year mortality rates for patients with PGD were 28.4%, 38.0%, and 45.8%, respectively, compared with 1.9%, 7.1%, and 21.5% for those without PGD (log-rank, p < 0.0001). PGD heralded high 30-day, 1-year, and 5-year mortality. Pretransplant MCS, predicted heart mass mismatch, and amiodarone exposure were associated with a high inotrope requirement, while prolonged ischemic time and multiple perioperative transfusions were associated with consensus-defined PGD, which may have important clinical implications under the revised United Network for Organ Sharing allocation system.
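A worked example may clarify how the per-10-minute ischemic-time odds ratio above scales: on the logit scale, the odds ratio over k minutes is exp(k * beta). The coefficient below is back-calculated from the published OR of 1.05 per 10 minutes, not taken from the study data.

```python
# Converting a per-minute logistic coefficient into odds ratios over longer intervals.
import numpy as np

beta_per_min = np.log(1.05) / 10          # ~0.0049 per minute of ischemic time
print(np.exp(10 * beta_per_min))          # 1.05  -> OR per additional 10 minutes
print(np.exp(60 * beta_per_min))          # ~1.34 -> implied OR per additional hour
```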


Subject(s)
Heart Transplantation , Lung Transplantation , Primary Graft Dysfunction , Adult , Heart Transplantation/adverse effects , Humans , Lung Transplantation/adverse effects , Primary Graft Dysfunction/epidemiology , Primary Graft Dysfunction/etiology , Retrospective Studies , Risk Factors , Transplant Recipients
13.
J Thorac Dis ; 13(9): 5458-5466, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34659812

ABSTRACT

BACKGROUND: Left ventricular dimension has the potential to impact clinical outcomes following implantation of left ventricular assist devices (LVAD). We investigated the effect of pre-implant left ventricular end-diastolic diameter (LVEDD) on outcomes following LVAD implantation. METHODS: Patients implanted with a continuous-flow LVAD between 2004 and 2018 at a single institution were included. The primary outcome was death while on LVAD support. Secondary outcomes included adverse event rates such as renal failure requiring dialysis, device thrombosis, and right ventricular failure. The LVEDD measurements were dichotomized using restricted cubic splines and threshold regression. Survival was determined using Kaplan-Meier estimates. Multivariable logistic regression was used to determine risk-adjusted mortality based on LVEDD. RESULTS: A total of 344 patients underwent implantation of a continuous flow LVAD during the study period. The optimal cut point for LVEDD was 65 mm, with 126 (36.6%) subjects in the <65 mm group and 165 (48.0%) in the >65 mm group. The LVEDD <65 mm group was older, had more females, higher incidence of diabetes, more pre-implant mechanical ventilation, and more admissions for acute myocardial infarctions (all, P<0.05). Importantly, post-implant adverse events were similar between the groups (all, P>0.05). Risk-adjusted survival at 1-year (OR 1.3, 95% CI: 0.6-2.5, P=0.53) was also comparable between the groups. Furthermore, incremental increases in LVEDD when modeled as a continuous variable did not impact overall mortality (OR 0.98, 95% CI: 0.9-1.0, P=0.09). CONCLUSIONS: Preoperative LVEDD was not associated with rates of major morbidities or mortality following LVAD implantation.
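The LVEDD cut-point approach above (restricted cubic splines followed by threshold regression) can be illustrated roughly as follows. The likelihood scan below is a simplification of formal threshold regression, and the file and column names (lvad_cohort.csv, lvedd_mm, died_on_support) are assumed for illustration.

```python
# Illustrative sketch: natural cubic spline of risk across LVEDD, then a threshold scan.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lvad_cohort.csv")

# Smooth look at risk across LVEDD using patsy's natural cubic spline basis
spline_fit = smf.logit("died_on_support ~ cr(lvedd_mm, df=4)", data=df).fit(disp=0)
print(spline_fit.summary())

# Threshold scan: choose the dichotomization point with the best-fitting model
def loglik_at(cut):
    d = df.assign(above=(df["lvedd_mm"] > cut).astype(int))
    return smf.logit("died_on_support ~ above", data=d).fit(disp=0).llf

cuts = np.arange(55, 76)                       # candidate cut points in mm
best_cut = cuts[np.argmax([loglik_at(c) for c in cuts])]
print("best cut point (mm):", best_cut)
```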

14.
J Heart Lung Transplant ; 40(7): 595-603, 2021 07.
Article in English | MEDLINE | ID: mdl-33785250

ABSTRACT

BACKGROUND: Allosensitization in heart transplant candidates is associated with longer transplant wait times and post-transplant complications. We summarize our experience with desensitization using carfilzomib, an irreversible proteasome inhibitor that causes plasma cell apoptosis. METHODS: One cycle of desensitization consisted of plasmapheresis and carfilzomib 20 mg/m2 on days 1, 2, 8, 9, 15, and 16 with intravenous immune globulin 2 g/kg after carfilzomib on day 16. Patients underwent repeat cycles as indicated. We compare calculated panel-reactive antibody (cPRA) for neat combined Class I and II IgG and C1q pre- and post-treatment using a cutoff for cPRA entry of ≥ 4000 and 500 MFI, respectively. RESULTS: From June 2013 to October 2019, 9 patients underwent 20 cycles of carfilzomib-based desensitization. Each cycle resulted in an average cPRA decrease of 24% (95% CI: 6-42) for IgG and 36% (95% CI: 17-55) for C1q. From treatment start to finish, mean cPRA fell from 76% to 40% (p = 0.01) for IgG and 56% to 4% (p = 0.017) for C1q. Six of 9 patients have been transplanted with 5 of the transplanted hearts crossing preoperative donor-specific antibodies. During a median follow-up of 35.1 months, all transplanted patients have survived with only 1 occurrence of treated rejection. Side effects of desensitization included acute kidney injury (67%) and thrombocytopenia (33%) with all episodes self-resolving. CONCLUSIONS: A carfilzomib-based desensitization strategy among heart transplant candidates reduces the level of HLA antibodies and complement binding, facilitates successful transplantation, and is associated with excellent outcomes at 3 years.


Subject(s)
Desensitization, Immunologic/methods , Graft Rejection/prevention & control , Heart Transplantation , Oligopeptides/pharmacology , Plasma Cells/immunology , Tissue Donors , Adult , Aged , Female , Follow-Up Studies , Graft Rejection/immunology , Humans , Male , Middle Aged , Retrospective Studies
15.
J Card Surg ; 36(1): 105-110, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33124124

ABSTRACT

BACKGROUND: The predictive value of preoperative pulmonary function testing (PFT) in left ventricular assist device (LVAD) patients remains unknown. This study evaluates the relationship between abnormal PFTs and postimplant outcomes in LVAD patients. METHODS: LVAD implants from January 2004 to December 2018 at a single institution were included. Patients were stratified based on the presence of abnormal preoperative PFTs, and the primary outcome was respiratory adverse events (AE). Secondary outcomes included 1-year overall postimplant survival and complications including bleeding, renal failure, thromboembolism, and device malfunction. RESULTS: A total of 333 patients underwent LVAD implant, 46.5% (n = 155) with normal PFTs and 53.5% (n = 178) with abnormal PFTs. Patients with abnormal PFTs were noted to have higher rates of respiratory AEs (25.9% vs. 15.1%, p = .049). In multivariable analysis, the impact of PFTs was most significant when the ratio of forced expiratory volume in 1 second to forced vital capacity (FEV1/FVC) was less than 0.5 (hazard ratio [HR] 16.32, 95% confidence interval [CI], 1.70-156.78). The rates of other AEs including bleeding, renal failure, right heart failure, and device malfunction were similar. One-year overall postimplant survival was comparable between the groups (56.8% vs. 68.8%, p = .3183), though patients in the lowest strata of FEV1 (<60% predicted) and FEV1/FVC (<0.5) had elevated risk-adjusted hazards for mortality (HR 2.63, 95% CI, 1.51-4.60 and HR 18.92, 95% CI, 2.10-170.40, respectively). CONCLUSIONS: The presence of abnormal preoperative PFTs does not preclude LVAD implantation, although it can be used for risk stratification for respiratory AEs and mortality, particularly in patients with severely reduced metrics. The importance of careful patient selection should be underscored in this higher-risk patient subset.


Subject(s)
Heart Failure , Heart-Assist Devices , Renal Insufficiency , Heart Failure/therapy , Heart-Assist Devices/adverse effects , Humans , Respiratory Function Tests , Retrospective Studies , Treatment Outcome
16.
Transplantation ; 105(3): 608-619, 2021 03 01.
Article in English | MEDLINE | ID: mdl-32345866

ABSTRACT

BACKGROUND: Psychosocial evaluations are required for long-term mechanical circulatory support (MCS) candidates, no matter whether MCS will be destination therapy (DT) or a bridge to heart transplantation. Although guidelines specify psychosocial contraindications to MCS, there is no comprehensive examination of which psychosocial evaluation domains are most prognostic for clinical outcomes. We evaluated whether overall psychosocial risk, determined across all psychosocial domains, predicted outcomes, and which specific domains appeared responsible for any effects. METHODS: A single-site retrospective analysis was performed for adults receiving MCS between April 2004 and December 2017. Using an established rating system, we coded psychosocial evaluations to identify patients at low, moderate, or high overall risk. We similarly determined risk within each of 10 individual psychosocial domains. Multivariable analyses evaluated whether psychosocial risk predicted clinical decisions about MCS use (DT versus bridge), and postimplantation mortality, transplantation, rehospitalization, MCS pump exchange, and standardly defined adverse medical events (AEs). RESULTS: In 241 MCS recipients, greater overall psychosocial risk increased the likelihood of a DT decision (odds ratio, 1.76; P = 0.017); and postimplantation pump exchange and occurrence of AEs (hazard ratios [HRs] ≥ 1.25; P ≤ 0.042). The individual AEs most strongly predicted were cardiac arrhythmias and device malfunctions (HRs ≥ 1.39; P ≤ 0.032). The specific psychosocial domains predicting at least 1 study outcome were mental health problem severity, poorer medical adherence, and substance use (odds ratios and HRs ≥ 1.32; P ≤ 0.010). CONCLUSIONS: The psychosocial evaluation predicts not only clinical decisions about MCS use (DT versus bridge) but important postimplantation outcomes. Strategies to address psychosocial risk factors before or soon after implantation may help to reduce postimplantation clinical risks.


Subject(s)
Heart Failure/therapy , Heart Transplantation/psychology , Heart-Assist Devices , Female , Follow-Up Studies , Heart Failure/psychology , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Time Factors
17.
J Card Surg ; 36(2): 643-650, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33295043

ABSTRACT

BACKGROUND: This study evaluated 20-year survival after adult orthotopic heart transplantation (OHT). METHODS: The United Network for Organ Sharing registry database was queried to study adult OHT recipients between 1987 and 1998 with over 20 years of posttransplant follow-up. The primary and secondary outcomes were 20-year survival and cause of death after OHT, respectively. Multivariable logistic regression was used to identify significant independent predictors of long-term survival, and long-term survival was compared among cohorts stratified by number of predictors using Kaplan-Meier survival analysis. RESULTS: A total of 20,658 patients undergoing OHT were included, with a median follow-up of 9.0 (IQR, 3.2-15.4) years. Kaplan-Meier estimates of 10-, 15-, and 20-year survival were 50.2%, 30.1%, and 17.2%, respectively. Median survival was 10.1 (IQR, 3.9-16.9) years. Increasing recipient age (>65 years), increasing donor age (>40 years), increasing recipient body mass index (>30), Black race, ischemic cardiomyopathy, and longer cold ischemic time (>4 h) were adversely associated with 20-year survival. Of these 6 negative predictors, the presence of none was associated with the greatest 10-year (59.7%) and 20-year (26.2%) survival, with decreasing survival with each additional negative predictor. The most common causes of death in 20-year survivors were renal, liver, and/or multisystem organ failure, whereas graft failure had a greater impact on earlier mortality. CONCLUSIONS: This study identifies six negative preoperative predictors of 20-year survival, with 20-year survival rates exceeding 25% in the absence of these factors. These data highlight the potential for very long-term survival after OHT in patients with end-stage heart failure and may be useful for patient selection and prognostication.
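The stratified survival comparison above can be sketched with Kaplan-Meier estimates grouped by the count of negative predictors and a multivariate log-rank test. The risk-factor flags and file name below are assumptions used only for illustration.

```python
# Illustrative Kaplan-Meier stratification by number of negative predictors.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("oht_1987_1998.csv")
flags = ["recip_over_65", "donor_over_40", "bmi_over_30",
         "black_race", "ischemic_cm", "cold_ischemia_over_4h"]
df["n_risk"] = df[flags].sum(axis=1)           # 0-6 negative predictors per patient

for n, grp in df.groupby("n_risk"):
    kmf = KaplanMeierFitter().fit(grp["years_followup"], grp["died"])
    print(n, kmf.survival_function_at_times([10, 15, 20]).round(3).tolist())

print(multivariate_logrank_test(df["years_followup"], df["n_risk"], df["died"]).p_value)
```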


Subject(s)
Heart Failure , Heart Transplantation , Adult , Aged , Graft Survival , Humans , Kaplan-Meier Estimate , Retrospective Studies , Survival Rate , Tissue Donors , United States/epidemiology
18.
JAMA Cardiol ; 6(2): 159-167, 2021 02 01.
Article in English | MEDLINE | ID: mdl-33112391

ABSTRACT

Importance: The US heart allocation policy was changed on October 18, 2018. The association of this change with recipient and donor selection and outcomes remains to be elucidated. Objective: To evaluate changes in patient characteristics, wait list outcomes, and posttransplant outcomes after the recent allocation policy change in heart transplant. Design, Setting, and Participants: In this cohort study, all 15 631 adults undergoing heart transplants, excluding multiorgan transplants, in the US as identified by the United Network for Organ Sharing multicenter, national registry were reviewed. Patients were stratified according to prepolicy change (October 1, 2015, to October 1, 2018) and postpolicy change (October 18, 2018 or after). Follow-up data were available through March 31, 2020. Exposures: Heart transplants after the policy change. Main Outcomes and Measures: Competing risk regression for wait list outcomes was performed. Posttransplant survival was compared using the Kaplan-Meier method, and risk adjustment was performed using multivariable Cox proportional hazards regression analysis. Results: In this cohort study, of the 15 631 patients undergoing transplant, 10 671 (mean [SD] age, 53.1 [12.7] years; 7823 [73.3%] male) were wait listed before and 4960 (mean [SD] age, 52.7 [13.0] years; 3610 [72.8%] male) were wait listed after the policy change. Competing risk regression demonstrated reduced likelihood of mortality or deterioration (subhazard ratio [SHR], 0.60; 95% CI, 0.52-0.69; P < .001), increased likelihood of transplant (SHR, 1.38; 95% CI, 1.32-1.45; P < .001), and reduced likelihood of recovery (SHR, 0.54; 95% CI, 0.40-0.73; P < .001) for wait listed patients after the policy change. A total of 6078 patients underwent transplant before and 2801 after the policy change. Notable changes after the policy change included higher frequency of bridging with temporary mechanical circulatory support and lower frequency of bridging with durable left ventricular assist devices. Posttransplant survival was reduced after the policy change (1-year: 92.1% vs 87.5%; log-rank P < .001), a finding that persisted after risk adjustment (HR, 1.29; 95% CI, 1.07-1.55; P = .008). Conclusions and Relevance: Substantial changes have occurred in adult heart transplant in the US after the policy change in October 2018. Wait list outcomes have improved, although posttransplant survival has decreased. These data confirm findings from earlier preliminary analyses and demonstrate that these trends have persisted to 1-year follow-up, underscoring the importance of continued reevaluation of the new heart allocation policy.


Subject(s)
Health Policy , Heart Failure/surgery , Heart Transplantation/trends , Survival Rate/trends , Tissue and Organ Procurement/organization & administration , Waiting Lists/mortality , Adult , Aged , Female , Humans , Male , Middle Aged , Patient Selection , United States
19.
J Card Surg ; 35(11): 3053-3061, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33016378

ABSTRACT

BACKGROUND: Institutional factors have been shown to impact outcomes following orthotopic heart transplantation (OHT). This study evaluated center-level variability in the utilization of induction therapy for OHT and its implications for clinical outcomes. METHODS: Adult OHT patients between 2010 and 2018 were identified from the United Network for Organ Sharing registry. Transplant centers were stratified based on their rates of induction therapy utilization. Mixed-effects logistic regression models were created with drug-treated rejection within 1 year as the primary endpoint and individual centers as a random parameter. Risk-adjusted Cox regression was used to evaluate patient-level mortality outcomes. RESULTS: In 17,524 OHTs performed at 100 centers, induction therapy was utilized in 48.6% (n = 8,411), with substantial variability between centers (interquartile range, 21.4%-79.1%). There were 36, 30, and 34 centers in the low (<29%), intermediate (29%-66%), and high (>67%) induction utilization tercile groups, respectively. Induction therapy did not account for the observed variability in the treated rejection rate at 1 year among centers after adjusting for donor and recipient factors (p = .20). No differences were observed in postoperative outcomes among the induction utilization center groups (all, p > .05). Furthermore, there was a weak correlation between the percentage of induction therapy utilization at the center level and recipients found to have moderate (r = .03) or high (r = .04) baseline risks for acute rejection at 1 year. CONCLUSIONS: This analysis demonstrates that there is substantial variability in the use of induction therapy among OHT centers. In addition, there was minimal correlation with baseline recipient risk or 1-year rejection rates, suggesting a need for better-standardized practices for induction therapy use in OHT.


Subject(s)
Drug Utilization/statistics & numerical data , Graft Rejection/prevention & control , Heart Transplantation , Immunosuppression Therapy/methods , Immunosuppression Therapy/statistics & numerical data , Induction Chemotherapy/statistics & numerical data , Adult , Aged , Antilymphocyte Serum/administration & dosage , Basiliximab/administration & dosage , Female , Graft Rejection/etiology , Heart Transplantation/adverse effects , Heart Transplantation/mortality , Humans , Male , Middle Aged , Time Factors , Treatment Outcome
20.
Ann Thorac Surg ; 110(6): 2026-2033, 2020 12.
Article in English | MEDLINE | ID: mdl-32376349

ABSTRACT

BACKGROUND: Prior studies demonstrated that female sex is associated with increased mortality after orthotopic heart transplantation (OHT). The impact of sex on OHT outcomes after bridging with newer-generation durable left ventricular assist devices (LVADs) remains unclear. METHODS: The United Network for Organ Sharing database was queried to study OHT recipients bridged with a newer-generation LVAD (ie, HeartMate III or HeartWare) between 2010 and 2018. The primary outcome was mortality at 30 days, 90 days, and 1 year. Secondary outcomes included rates of posttransplant complications. Propensity score matching and Cox multivariable analysis were used to assess comorbidity-adjusted sex differences in outcomes. RESULTS: A total of 3010 patients (76.7% male) bridged with newer-generation LVADs underwent OHT. After adjusting for relevant covariates, both age and heart failure etiology, but not sex, were independent predictors of mortality. In the matched cohorts, sex did not affect posttransplant outcomes, including renal failure, cerebrovascular events, allograft rejection, functional status, or mortality (all P > .05). Survival at 1 year after OHT was 90.5% in males and 92.8% in females (P = .058). CONCLUSIONS: Among 3010 OHT recipients, females bridged with newer-generation HeartWare or HeartMate III LVADs had posttransplant outcomes comparable to those of matched males. Furthermore, survival at 1-year follow-up was not affected by sex; instead, it was driven by well-established risk factors including increased age, worse preoperative renal function, and heart failure etiology. These data suggest that considerable progress has been made in mitigating sex differences in heart failure outcomes in the modern era.


Subject(s)
Heart Failure/mortality , Heart Failure/surgery , Heart Transplantation , Heart-Assist Devices , Postoperative Complications/epidemiology , Adult , Female , Heart Failure/etiology , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Propensity Score , Proportional Hazards Models , Retrospective Studies , Risk Factors , Sex Factors , Survival Rate , Treatment Outcome