ABSTRACT
BACKGROUND: This study evaluates the effects of pre-transplant transpulmonary gradient (TPG) and donor right ventricular mass (RVM) on outcomes following heart transplantation. METHODS: The UNOS registry was queried to analyze adult recipients who underwent primary isolated heart transplantation from 1/1/2010 to 12/31/2018. The recipients were dichotomized into two groups based on their TPG at the time of transplantation: <12 and ≥12 mmHg. The outcomes included 5-year survival and post-transplant complications. Propensity score-matching was performed. A sub-analysis was performed to evaluate the effects of donor-recipient RVM matching, where a ratio <0.85 was classified as undersized, 0.85-1.15 as size-matched, and >1.15 as oversized. RESULTS: 17,898 isolated heart transplant recipients were analyzed, and 5,129 (28.7%) had TPG ≥12 mmHg at the time of transplantation. The recipients with TPG ≥12 mmHg experienced significantly lower 5-year survival (78.4% vs 81.2%, p<0.001) than the recipients with TPG <12 mmHg, and this finding persisted in the propensity score-matched comparison. The recipients with TPG ≥12 mmHg also experienced a higher rate of post-transplant dialysis and a longer duration of hospitalization. Oversizing the donor RVM considerably improved 5-year survival among the recipients with TPG ≥12 mmHg, to a level comparable to those with TPG <12 mmHg. CONCLUSION: Elevated pre-transplant TPG negatively impacts post-transplant survival. However, oversizing the donor RVM is associated with improved survival in recipients with elevated TPG, comparable to that of recipients with normal TPG. Therefore, careful risk stratification and donor matching among recipients with elevated TPG are essential to improve outcomes in this vulnerable population.
ABSTRACT
BACKGROUND: This study evaluates the clinical trends and impact of hepatitis C virus-positive (HCV+) donors on waitlist and posttransplant outcomes after heart transplantation. METHODS: The United Network for Organ Sharing registry was queried to identify adult waitlisted and transplanted patients from January 1, 2015, to December 31, 2022. In the waitlist analysis, the candidates were stratified into 2 cohorts based on whether they were willing to accept HCV+ donor offers. Waitlist outcomes included 1-y cumulative incidences of transplantation and death/delisting. In the posttransplant analysis, the recipients were stratified into 2 cohorts with and without HCV nucleic acid test (NAT)-positive donors. Outcomes included 1- and 4-y posttransplant survival. Propensity score-matching was performed. Risk adjustment was performed using multivariable Cox regression. RESULTS: During the study period, the number of centers using HCV NAT+ donors increased from 1 to 65 centers, along with the number of transplants. In the waitlist analysis, 26 648 waitlisted candidates were analyzed, and 4535 candidates (17%) were approved to accept HCV+ donors. Approval to accept HCV+ donors was associated with a higher likelihood of transplantation and a lower likelihood of death/delisting within 1 y of waitlisting. In the posttransplant analysis, 21 131 recipients were analyzed, and 997 recipients (4.7%) received HCV NAT+ hearts. The 1- and 4-y posttransplant survival were comparable between the recipients of HCV NAT+ and NAT- donors. Furthermore, the similar 1- and 4-y posttransplant survival persisted in the propensity score-matched comparison and multivariable Cox regression analysis. CONCLUSIONS: Utilization of HCV+ donors is rising. Heart transplants using HCV+ donors are associated with improved waitlist and comparable posttransplant outcomes.
ABSTRACT
BACKGROUND: This study evaluates the interaction of donor and recipient age with outcomes following heart transplantation under the 2018 heart allocation system. METHODS: The United Network for Organ Sharing registry was queried to analyze adult primary isolated orthotopic heart transplant recipients and associated donors from August 18, 2018, to June 30, 2021. Both recipient and donor cohorts were grouped according to age: <65 and ≥65 y for recipients and <50 and ≥50 y for donors. The primary outcome was survival. Subanalyses were performed to evaluate the impact of donor age. RESULTS: A total of 7601 recipients and 7601 donors were analyzed. Of these, 1584 recipients (20.8%) were ≥65 y old and 560 donors (7.4%) were ≥50 y old. Compared with recipients <65, recipients ≥65 had decreased 1-y (88.8% versus 92.3%) and 2-y (85.1% versus 88.5%) survival rates (P < 0.001). The association of recipient age ≥65 with lower survival persisted after adjusting for potential confounders (hazard ratio, 1.38; 95% confidence interval, 1.18-1.61; P < 0.001). Recipients <65 with donors ≥50 had comparable 1-y and 2-y survival rates to recipients <65 with donors <50 (P = 0.997). Conversely, transplantation of older allografts was associated with lower 1-y (84.2% versus 89.4%) and 2-y (79.5% versus 85.8%) survival rates in recipients ≥65 (P = 0.025). CONCLUSIONS: Recipient age ≥65 continues to be associated with worse survival following heart transplantation under the 2018 heart allocation system compared with younger recipients. Donors ≥50 may be acceptable for recipients <65, with comparable outcomes. However, careful donor age selection should be considered for recipients ≥65, as the use of younger donor allografts appears to improve posttransplantation survival.
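The 1- and 2-year survival rates reported in these registry analyses are conventionally Kaplan-Meier estimates, which account for patients censored before the end of follow-up. As a hedged, self-contained sketch (toy data, not drawn from the study; a real analysis would use a dedicated survival library), the estimator can be written as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each patient
    events -- 1 if death was observed at that time, 0 if censored
    Returns a list of (time, survival probability) steps at each death time.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # sort by follow-up time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = censored = 0
        # Group all patients sharing this follow-up time.
        while i < n and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # conditional survival at time t
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Toy cohort: follow-up in years; 1 = death, 0 = censored at that time.
print(kaplan_meier([0.5, 1.0, 1.0, 2.0, 2.5, 3.0], [1, 0, 1, 1, 0, 0]))
```

Reading a "1-y survival rate" off such a curve means taking the last step at or before 1 year.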
ABSTRACT
BACKGROUND: This study evaluates the clinical trends, risk factors, and impact of waitlist blood transfusion on outcomes following isolated heart transplantation. METHODS: The UNOS registry was queried to identify adult recipients from January 1, 2014, to June 30, 2022. The recipients were stratified into two groups depending on whether they received a blood transfusion while on the waitlist. The incidence of waitlist transfusion was compared before and after the 2018 allocation policy change. The primary outcome was survival. Propensity score-matching was performed. Multivariable logistic regression was performed to identify predictors of waitlist transfusion. A sub-analysis was performed to evaluate the impact of waitlist time on waitlist transfusion. RESULTS: From the 21 926 recipients analyzed in this study, 4201 (19.2%) received waitlist transfusion. The incidence of waitlist transfusion was lower following the allocation policy change (14.3% vs. 23.7%, p < 0.001). The recipients with waitlist transfusion had significantly reduced 1-year posttransplant survival (88.8% vs. 91.9%, p < 0.001) compared to the recipients without waitlist transfusion in an unmatched comparison. However, in a propensity score-matched comparison, the two groups had similar 1-year survival (90.0% vs. 90.4%, p = 0.656). Multivariable analysis identified ECMO, Impella, and pretransplant dialysis as strong predictors of waitlist transfusion. In a sub-analysis, the odds of waitlist transfusion increased nonlinearly with longer waitlist time. CONCLUSION: There is a lower incidence of waitlist transfusion among transplant recipients under the 2018 allocation system. Waitlist transfusion is not an independent predictor of adverse posttransplant outcomes but rather a marker of the patient's clinical condition. ECMO, Impella, and pretransplant dialysis are strong predictors of waitlist transfusion.
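The propensity score-matched comparison described above pairs each transfused recipient with a clinically similar non-transfused recipient so that outcome differences are not driven by baseline imbalance. A minimal greedy 1:1 nearest-neighbor match with a caliper is one common implementation; the sketch below is illustrative only (patient IDs and scores are hypothetical, and the study does not specify its matching algorithm beyond propensity score-matching):

```python
def greedy_match(treated, control, caliper=0.1):
    """Greedy 1:1 nearest-neighbor propensity-score matching.

    treated, control -- dicts mapping patient id -> propensity score
    caliper          -- maximum allowed score difference for a valid pair
    Returns a list of (treated_id, control_id) matched pairs.
    """
    available = dict(control)  # controls not yet matched
    pairs = []
    # Match the highest-score treated patients first (hardest to match).
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        # Nearest remaining control by absolute score difference.
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # each control is used at most once
    return pairs

# Hypothetical scores, e.g. fitted probabilities of waitlist transfusion.
treated = {"T1": 0.80, "T2": 0.35}
control = {"C1": 0.78, "C2": 0.40, "C3": 0.10}
print(greedy_match(treated, control))  # [('T1', 'C1'), ('T2', 'C2')]
```

The caliper discards treated patients with no sufficiently close control, which is why matched cohorts are smaller than the unmatched ones.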
Subjects
Blood Transfusion, Heart Transplantation, Registries, Waiting Lists, Humans, Male, Waiting Lists/mortality, Female, Heart Transplantation/adverse effects, Heart Transplantation/mortality, Middle Aged, Follow-Up Studies, Risk Factors, Prognosis, Survival Rate, Blood Transfusion/statistics & numerical data, Graft Survival, Adult, Retrospective Studies
ABSTRACT
Life expectancy of patients with a durable, continuous-flow left ventricular assist device (CF-LVAD) continues to increase. Despite significant improvements in the delivery of care for patients with these devices, hemocompatibility-related adverse events (HRAEs) are still a concern and contribute to significant morbidity and mortality when they occur. As such, dissemination of current best evidence and practices is of critical importance. This ISHLT Consensus Statement is a summative assessment of the current literature on prevention and management of HRAEs through optimal management of oral anticoagulant and antiplatelet medications, parenteral anticoagulant medications, management of patients at high risk for HRAEs and those experiencing thrombotic or bleeding events, and device management outside of antithrombotic medications. This document is intended to assist clinicians caring for patients with a CF-LVAD in providing the best care possible with respect to prevention and management of these events.
Subjects
Consensus, Heart-Assist Devices, Heart-Assist Devices/adverse effects, Humans, Anticoagulants/therapeutic use, Heart Failure/therapy, Heart Failure/surgery, Thrombosis/prevention & control, Thrombosis/etiology, Hemorrhage/prevention & control, Platelet Aggregation Inhibitors/therapeutic use
ABSTRACT
BACKGROUND: This study evaluates the clinical trends, risk factors, and effects of post-transplant stroke and subsequent functional independence on outcomes following orthotopic heart transplantation under the 2018 heart allocation system. METHODS: The United Network for Organ Sharing registry was queried to identify adult recipients from October 18, 2018 to December 31, 2021. The cohort was stratified into 2 groups with and without post-transplant stroke. The incidence of post-transplant stroke was compared before and after the allocation policy change. Outcomes included post-transplant survival and complications. Multivariable logistic regression was performed to identify risk factors for post-transplant stroke. Sub-analysis was performed to evaluate the impact of functional independence among recipients with post-transplant stroke. RESULTS: A total of 9,039 recipients were analyzed in this study. The incidence of post-transplant stroke was higher following the policy change (3.8% vs 3.1%, p = 0.017). Thirty-day (81.4% vs 97.7%) and 1-year (66.4% vs 92.5%) survival rates were substantially lower in the stroke cohort (p < 0.001). The stroke cohort had a higher rate of post-transplant renal failure, longer hospital length of stay, and worse functional status. Multivariable analysis identified extracorporeal membrane oxygenation, durable left ventricular assist device, blood type O, and redo heart transplantation as strong predictors of post-transplant stroke. Preserved functional independence considerably improved 30-day (99.2% vs 61.2%) and 1-year (97.7% vs 47.4%) survival rates among the recipients with post-transplant stroke (p < 0.001). CONCLUSIONS: There is a higher incidence of post-transplant stroke under the 2018 allocation system, and it is associated with significantly worse post-transplant outcomes. However, post-transplant stroke recipients with preserved functional independence have improved survival, similar to those without post-transplant stroke.
Subjects
Heart Transplantation, Postoperative Complications, Stroke, Humans, Male, Female, United States/epidemiology, Middle Aged, Stroke/epidemiology, Postoperative Complications/epidemiology, Risk Factors, Retrospective Studies, Tissue and Organ Procurement, Incidence, Registries, Survival Rate/trends, Adult, Aged, Follow-Up Studies
ABSTRACT
OBJECTIVE: This study aimed to investigate the clinical trends and the impact of the 2018 heart allocation policy change on both waitlist and post-transplant outcomes in simultaneous heart-kidney transplantation in the United States. METHODS: The United Network for Organ Sharing registry was queried to compare adult patients before and after the allocation policy change. This study included 2 separate analyses evaluating the waitlist and post-transplant outcomes. Multivariable analyses were performed to determine the 2018 allocation system's risk-adjusted hazards for 1-year waitlist and post-transplant mortality. RESULTS: The initial analysis investigating the waitlist outcomes included 1779 patients listed for simultaneous heart-kidney transplantation. Of these, 1075 patients (60.4%) were listed after the 2018 allocation policy change. After the policy change, the waitlist outcomes significantly improved, with a shorter waitlist time, lower likelihood of de-listing, and higher likelihood of transplantation. In the subsequent analysis investigating the post-transplant outcomes, 1130 simultaneous heart-kidney transplant recipients were included, of whom 738 patients (65.3%) underwent simultaneous heart-kidney transplantation after the policy change. The 90-day, 6-month, and 1-year post-transplant survival and complication rates were comparable before and after the policy change. Multivariable analyses demonstrated that the 2018 allocation system reduced risk-adjusted 1-year waitlist mortality (sub-hazard ratio, 0.66; 95% CI, 0.51-0.85; P < .001), but it did not significantly impact risk-adjusted 1-year post-transplant mortality (hazard ratio, 1.03; 95% CI, 0.72-1.47; P = .876). CONCLUSIONS: This study demonstrates increased rates of simultaneous heart-kidney transplantation with a shorter waitlist time after the 2018 allocation policy change.
Furthermore, there were improved waitlist outcomes and comparable early post-transplant survival after simultaneous heart-kidney transplantation under the 2018 allocation system.
Subjects
Heart Transplantation, Kidney Transplantation, Adult, Humans, United States, Kidney Transplantation/adverse effects, Heart Transplantation/adverse effects, Proportional Hazards Models, Waiting Lists, Retrospective Studies
ABSTRACT
OBJECTIVE: To quantitate the impact of heart donation after circulatory death (DCD) donor utilization on both waitlist and post-transplant outcomes in the United States. METHODS: The United Network for Organ Sharing database was queried to identify all adult waitlisted and transplanted candidates between October 18, 2018, and December 31, 2022. Waitlisted candidates were stratified according to whether they had been approved for donation after brain death (DBD) offers only or also approved for DCD offers. The cumulative incidence of transplantation was compared between the 2 cohorts. In a post-transplant analysis, 1-year post-transplant survival was compared between unmatched and propensity-score-matched cohorts of DBD and DCD recipients. RESULTS: A total of 14,803 candidates were waitlisted, including 12,287 approved for DBD donors only and 2516 approved for DCD donors. Overall, DCD approval was associated with an increased sub-hazard ratio (HR) for transplantation and a lower sub-HR for delisting owing to death/deterioration after risk adjustment. In a subgroup analysis, candidates with blood type B and status 4 designation received the greatest benefit from DCD approval. A total of 12,238 recipients underwent transplantation, 11,636 with DBD hearts and 602 with DCD hearts. Median waitlist times were significantly shorter for status 3 and status 4 recipients receiving DCD hearts. One-year post-transplant survival was comparable between unmatched and propensity score-matched cohorts of DBD and DCD recipients. CONCLUSIONS: The use of DCD hearts confers a higher probability of transplantation and a lower incidence of death/deterioration while on the waitlist, particularly among certain subpopulations such as status 4 candidates. Importantly, the use of DCD donors results in similar post-transplant survival as DBD donors.
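The "cumulative incidence of transplantation" compared above is estimated in the presence of a competing risk (death/delisting), where a naive Kaplan-Meier complement would overstate the incidence. The standard tool is the Aalen-Johansen estimator; the following is a hedged sketch with toy data (the event coding is an assumption for illustration, not from the paper):

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence of one event type
    in the presence of competing risks.

    times  -- follow-up time per candidate
    events -- 0 = censored, otherwise an integer event code
    cause  -- event code whose cumulative incidence is estimated
    Returns a list of (time, cumulative incidence) steps.
    """
    data = sorted(zip(times, events))
    n_risk = len(data)
    overall_surv = 1.0  # Kaplan-Meier survival from *any* event
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = c = 0
        # Group all candidates sharing this follow-up time.
        while i < len(data) and data[i][0] == t:
            e = data[i][1]
            if e == 0:
                c += 1
            else:
                d_any += 1
                if e == cause:
                    d_cause += 1
            i += 1
        if d_cause:
            # Increment uses event-free survival just *before* t.
            cif += overall_surv * d_cause / n_risk
            out.append((t, cif))
        if d_any:
            overall_surv *= 1 - d_any / n_risk
        n_risk -= d_any + c
    return out

# Toy waitlist: 1 = transplanted, 2 = died/delisted, 0 = censored.
print(cumulative_incidence([1.0, 2.0, 3.0, 4.0], [1, 2, 1, 0], cause=1))
```

Swapping `cause=2` would instead give the cumulative incidence of death/delisting, the competing outcome in this analysis.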
Subjects
Heart Transplantation, Tissue and Organ Procurement, Adult, Humans, Brain Death, Tissue Donors, Heart Transplantation/adverse effects, Probability, Brain, Retrospective Studies, Graft Survival
ABSTRACT
Background: Implantable cardioverter-defibrillator (ICD) shocks after left ventricular assist device (LVAD) therapy are associated with adverse clinical outcomes. Little is known about the association of pre-LVAD ICD shocks with post-LVAD clinical outcomes and whether LVAD therapy affects the prevalence of ICD shocks. Objectives: The purpose of this study was to determine whether pre-LVAD ICD shocks are associated with adverse clinical outcomes post-LVAD and to compare the prevalence of ICD shocks before and after LVAD therapy. Methods: Patients 18 years or older with continuous-flow LVADs and ICDs were retrospectively identified within the University of Pittsburgh Medical Center system from 2006 to 2020. We analyzed the association of appropriate ICD shocks within 1 year pre-LVAD with a primary composite outcome of death, stroke, and pump thrombosis and secondary outcomes of post-LVAD ICD shocks and ICD shock hospitalizations. Results: Among 309 individuals, average age was 57 ± 12 years, 87% were male, 80% had ischemic cardiomyopathy, and 42% were bridge to transplantation. Seventy-one patients (23%) experienced pre-LVAD shocks, and 69 (22%) experienced post-LVAD shocks. The overall prevalence of shocks pre-LVAD and post-LVAD was not different. Pre-LVAD ICD shocks were not associated with the composite outcome. Pre-LVAD ICD shocks were found to predict post-LVAD shocks (hazard ratio [HR] 5.7; 95% confidence interval [CI] 3.42-9.48; P < .0001) and hospitalizations related to ICD shocks from ventricular arrhythmia (HR 10.34; 95% CI 4.1-25.7; P < .0001). Conclusion: Pre-LVAD ICD shocks predicted post-LVAD ICD shocks and hospitalizations but were not associated with the composite outcome of death, pump thrombosis, or stroke at 1 year. The prevalence of appropriate ICD shocks was similar before and after LVAD implantation in the entire cohort.
ABSTRACT
In this report, we describe proteasome inhibitor (PI) treatment of antibody-mediated rejection (AMR) in heart transplantation (HTX). From January 2018 to September 2021, 10 patients were treated with PI for AMR: carfilzomib (CFZ), n = 8; bortezomib (BTZ), n = 2. Patients received 1-3 cycles of PI. All patients had ≥1 strong donor-specific antibody (DSA) (mean fluorescence intensity [MFI] > 8000) in undiluted serum. Most DSAs (20/21) had HLA class II specificity. The MFI of strong DSAs had a median reduction of 56% (IQR = 13%-89%) in undiluted serum and 92% (IQR = 53%-95%) at 1:16 dilution. Seventeen DSAs in seven patients were reduced >50% at 1:16 dilution after treatment. Four DSAs from three patients did not respond. DSA with MFI > 8000 at 1:16 dilution was less responsive to treatment. Six of 10 patients (60%) presented with graft dysfunction; 4 of 6 recovered an ejection fraction >40% after treatment. Pathologic AMR was resolved in 5 of 7 patients (71.4%) within 1 year after treatment. Nine of 10 patients (90%) survived to 1 year after AMR diagnosis. PI treatment of AMR resulted in significant DSA reduction with some resolution of graft dysfunction. Larger studies are needed to evaluate PI for AMR.
Subjects
Heart Transplantation, Kidney Transplantation, Humans, Proteasome Inhibitors/therapeutic use, Isoantibodies, Kidney Transplantation/adverse effects, HLA Antigens, Tissue Donors, Graft Rejection/drug therapy, Graft Rejection/etiology, Retrospective Studies
ABSTRACT
The purpose of this study was to describe changes in plasma levels of angiogenic and inflammatory biomarkers, specifically angiopoietin-2 (Ang-2) and tumor necrosis factor-α (TNF-α), in patients receiving a HeartMate II (HMII) left ventricular assist device (LVAD) and to correlate them with nonsurgical bleeding. It has been shown that Ang-2 and TNF-α may be linked to bleeding in LVAD patients. This study utilized biobanked samples prospectively collected from the PREVENT study, a prospective, multicenter, single-arm, nonrandomized study of patients implanted with the HMII. Paired serum samples were obtained from 140 patients before implantation and at 90 days postimplantation. Baseline demographics were as follows: age 57 ± 13 years, 41% ischemic etiology, 82% male, and 75% destination therapy indication. Of the 17 patients with baseline elevation of both TNF-α and Ang-2, 10 (60%) experienced a significant bleeding event within 180 days postimplant, compared with 37 of 98 (38%) patients with Ang-2 and TNF-α below the mean (p = 0.02). The hazard ratio for a bleeding event was 2.3 (95% CI: 1.2-4.6) in patients with elevated levels of both TNF-α and Ang-2. In the PREVENT multicenter study, patients with elevated serum Ang-2 and TNF-α at baseline before LVAD implantation demonstrated increased bleeding events after LVAD implantation.
Subjects
Heart Failure, Heart-Assist Devices, Humans, Male, Adult, Middle Aged, Aged, Female, Tumor Necrosis Factor-alpha, Angiopoietin-2, Prospective Studies, Heart-Assist Devices/adverse effects, Thromboplastin, Hemorrhage/etiology, Necrosis/complications, Heart Failure/surgery, Heart Failure/complications, Retrospective Studies
ABSTRACT
BACKGROUND: We sought to quantify the impact of pre- and postoperative variables on health-related quality of life (HRQOL) after left ventricular assist device (LVAD) implantation. METHODS: Primary durable LVAD implants between 2012 and 2019 in the Interagency Registry for Mechanically Assisted Circulatory Support were identified. Multivariable modeling using general linear models assessed the impact of baseline characteristics and postimplant adverse events (AEs) on HRQOL as assessed by the EQ-5D visual analog scale (VAS) and the Kansas City Cardiomyopathy Questionnaire-12 (KCCQ) at 6 months and 3 years. RESULTS: Of 22,230 patients, 9,888 had VAS and 10,552 had KCCQ reported at 6 months, and 2,170 patients had VAS and 2,355 had KCCQ reported at 3 years postimplant. VAS improved from a mean of 38.2 ± 28.3 to 70.7 ± 22.9 at 6 months and from 40.1 ± 27.8 to 70.3 ± 23.1 at 3 years. KCCQ improved from 28.2 ± 23.9 to 64.3 ± 23.2 at 6 months and from 29.8 ± 23.7 to 63.0 ± 23.7 at 3 years. Preimplant variables, including baseline VAS, had small effect sizes on HRQOL while postimplant AEs had large negative effect sizes. Recent stroke, respiratory failure, and renal dysfunction had the largest negative effect on HRQOL at 6 months, while recent renal dysfunction, respiratory failure, and infection had the largest negative effect at 3 years. CONCLUSIONS: AEs following LVAD implantation have large negative effects on HRQOL in early and late follow-up. Understanding the impact of AEs on HRQOL may assist shared decision-making regarding LVAD eligibility. Continued efforts to reduce post-LVAD AEs are warranted to improve HRQOL in addition to survival.
Subjects
Heart Failure, Heart-Assist Devices, Kidney Diseases, Respiratory Insufficiency, Humans, Quality of Life, Heart-Assist Devices/adverse effects, Heart Failure/surgery, Registries, Treatment Outcome
ABSTRACT
BACKGROUND: This study compared outcomes of patients waitlisted for orthotopic heart transplantation with durable left ventricular assist devices (LVAD) before and after the October 18, 2018 heart allocation policy change. METHODS: The United Network for Organ Sharing database was queried to identify 2 cohorts of adult candidates with durable LVADs listed within seasonally matched, equal-length periods before (old policy era [OPE]) and after the policy change (new policy era [NPE]). The primary outcomes were 2-year survival from the time of initial waitlisting and 2-year post-transplant survival. Secondary outcomes included the incidence of transplantation from the waitlist and of de-listing due to either death or clinical deterioration. RESULTS: A total of 2,512 candidates were waitlisted, 1,253 within the OPE and 1,259 within the NPE. Candidates under both policies had similar 2-year survival after waitlisting, as well as a similar cumulative incidence of transplantation and de-listing due to death and/or clinical deterioration. A total of 2,560 patients were transplanted within the study period, 1,418 within the OPE and 1,142 within the NPE. Two-year post-transplant survival was similar between policy eras; however, the NPE was associated with a higher incidence of post-transplant stroke, renal failure requiring dialysis, and a longer hospital length of stay. CONCLUSIONS: The 2018 heart allocation policy has conferred no significant impact on overall survival from the time of initial waitlisting among durable LVAD-supported candidates. Similarly, the cumulative incidence of transplantation and waitlist mortality have remained largely unchanged. Among those undergoing transplantation, a higher degree of post-transplant morbidity was observed, though survival was not impacted.
Subjects
Clinical Deterioration, Heart Failure, Heart Transplantation, Heart-Assist Devices, Adult, Humans, Heart Failure/surgery, Heart Failure/epidemiology, Heart-Assist Devices/adverse effects, Heart Transplantation/adverse effects, Registries
ABSTRACT
BACKGROUND: This study evaluated the current clinical trends, risk factors, and temporal effects of post-transplant dialysis on outcomes following orthotopic heart transplantation after the 2018 United States adult heart allocation policy change. METHODS: The United Network for Organ Sharing (UNOS) registry was queried to analyze adult orthotopic heart transplant recipients after the October 18, 2018 heart allocation policy change. The cohort was stratified according to the need for post-transplant de novo dialysis. The primary outcome was survival. Propensity score-matching was performed to compare the outcomes between 2 similar cohorts with and without post-transplant de novo dialysis. The impact of post-transplant dialysis chronicity was evaluated. Multivariable logistic regression was performed to identify risk factors for post-transplant dialysis. RESULTS: A total of 7,223 patients were included in this study. Of these, 968 patients (13.4%) developed post-transplant renal failure requiring de novo dialysis. Both 1-year (73.2% vs 94.8%) and 2-year (66.3% vs 90.6%) survival rates were lower in the dialysis cohort (p < 0.001), and the lower survival rates persisted in a propensity-matched comparison. Recipients requiring only temporary post-transplant dialysis had significantly improved 1-year (92.5% vs 71.6%) and 2-year (86.6% vs 52.2%) survival rates compared to the chronic post-transplant dialysis group (p < 0.001). Multivariable analysis demonstrated that low pretransplant estimated glomerular filtration rate (eGFR) and bridging with extracorporeal membrane oxygenation (ECMO) were strong predictors of post-transplant dialysis. CONCLUSIONS: This study demonstrates that post-transplant dialysis is associated with significantly increased morbidity and mortality in the new allocation system. Post-transplant survival is affected by the chronicity of post-transplant dialysis. Low pretransplant eGFR and ECMO are strong risk factors for post-transplant dialysis.
Subjects
Heart Failure, Heart Transplantation, Kidney Transplantation, Renal Insufficiency, Adult, Humans, United States/epidemiology, Renal Dialysis, Heart Transplantation/adverse effects, Risk Factors, Retrospective Studies, Treatment Outcome
ABSTRACT
BACKGROUND: Induction immunosuppression in heart transplant recipients varies greatly by center. Basiliximab (BAS) is the most commonly used induction immunosuppressant but has not been shown to reduce rejection or improve survival. The objective of this retrospective study was to compare rejection, infection, and mortality within the first 12 months following heart transplant in patients who received BAS or no induction. METHODS: This was a retrospective cohort study of adult heart transplant recipients given BAS or no induction from January 1, 2017 to May 31, 2021. The primary endpoint was the incidence of treated acute cellular rejection (ACR) at 12 months post-transplant. Secondary endpoints included ACR at 90 days post-transplant, incidence of antibody-mediated rejection (AMR) at 90 days and 1 year, incidence of infection, and all-cause mortality at 1 year. RESULTS: A total of 108 patients received BAS, and 26 patients received no induction within the specified timeframe. There was a lower incidence of ACR within the first year in the BAS group compared to the no induction group (27.7% vs. 68.2%, p < .002). BAS was independently associated with a lower probability of having a rejection event during the first 12 months post-transplant (hazard ratio [HR] .285; 95% confidence interval [CI] .142-.571; p < .001). There was no difference in the rate of infection or in mortality after hospital discharge at 1 year post-transplant (6% vs. 0%, p = .20). CONCLUSION: BAS appears to be associated with greater freedom from rejection without an increase in infections. BAS may be preferable to a no-induction strategy in patients undergoing heart transplantation.
Subjects
Monoclonal Antibodies, Heart Transplantation, Humans, Adult, Basiliximab, Monoclonal Antibodies/therapeutic use, Retrospective Studies, Immunosuppressive Agents/therapeutic use, Immunosuppressive Agents/pharmacology, Graft Rejection/etiology, Recombinant Fusion Proteins/therapeutic use
ABSTRACT
Efforts to optimize guideline-directed medical therapy (GDMT) through team-based care may affect outcomes in patients with heart failure with reduced ejection fraction (HFrEF). This study evaluated the impact of an innovative medication optimization clinic (MOC) on GDMT and outcomes in patients with HFrEF. Patients with HFrEF who are not receiving optimal GDMT are referred to the MOC and managed by a team comprising a nurse practitioner or physician assistant, a clinical pharmacist, and an HF cardiologist. We retrospectively evaluated the impact of the MOC (n = 206) compared with usual care (n = 412) using a 2:1 propensity-matched control group. The primary clinical outcome was the incidence of HF hospitalizations at 3 months after the index visit. Kaplan-Meier cumulative event curves and Cox proportional hazards regression models with adjustment were conducted. A significantly higher proportion of patients in the MOC group received quadruple therapy (49% vs 4%, p <0.0001), angiotensin receptor neprilysin inhibitor (60% vs 27%, p <0.0001), mineralocorticoid receptor antagonist (59% vs 37%, p <0.0001), and sodium-glucose cotransporter-2 inhibitor (60% vs 10%, p <0.0001). The primary outcome was significantly lower in the MOC group than in the control group (log-rank, p = 0.0008). Cox regression showed that patients in the control group were more than threefold more likely to be hospitalized because of HF than those in the MOC group (p = 0.0014). In conclusion, the MOC was associated with improved GDMT and lower risk of HF hospitalizations in patients with HFrEF.
Subjects
Diabetes Mellitus Type 2, Heart Failure, Sodium-Glucose Transporter 2 Inhibitors, Humans, Heart Failure/epidemiology, Retrospective Studies, Diabetes Mellitus Type 2/drug therapy, Stroke Volume, Sodium-Glucose Transporter 2 Inhibitors/therapeutic use, Hospitalization, Angiotensin Receptor Antagonists/therapeutic use
ABSTRACT
BACKGROUND: Since the revision of the United States heart allocation system, increasing use of mechanical circulatory support has been observed as a means to support acutely ill patients. We sought to compare outcomes between patients bridged to orthotopic heart transplantation (OHT) with either temporary (t-LVAD) or durable left ventricular assist devices (d-LVAD) under the revised system. METHODS: The United States Organ Network database was queried to identify all adult OHT recipients who were bridged to transplant with either an isolated t-LVAD or d-LVAD from 10/18/2018 to 9/30/2020. The primary outcome was 1-year post-transplant survival. Predictors of mortality were also modeled, and national trends of LVAD bridging were examined across the study period. RESULTS: A total of 1,734 OHT recipients were analyzed, 1,580 (91.1%) bridged with a d-LVAD and 154 (8.9%) bridged with a t-LVAD. At transplant, the t-LVAD cohort had higher total bilirubin levels and greater prevalence of pre-transplant intravenous inotrope usage and mechanical ventilation. Median waitlist time was also shorter for t-LVAD. At 1 year, there was a non-significant trend toward increased survival in the t-LVAD cohort (94.8% vs 90.1%; p = 0.06). After risk adjustment, d-LVAD was associated with a 4-fold hazard of 1-year mortality (hazard ratio 3.96, 95% confidence interval 1.42-11.03; p = 0.009). From 2018 to 2021, t-LVAD bridging increased, though d-LVAD remained the more common bridging strategy. CONCLUSIONS: Since the 2018 allocation change, there has been a steady increase in t-LVAD usage as a bridge to OHT. Overall, patients bridged with these devices appear to have at least equivalent 1-year survival compared to those bridged with a d-LVAD.
Subjects
Heart Failure , Heart Transplantation , Heart-Assist Devices , Adult , Humans , Heart Failure/surgery , Heart Failure/etiology , Heart-Assist Devices/adverse effects , Treatment Outcome , Retrospective Studies , Heart Transplantation/adverse effects
ABSTRACT
BACKGROUND: Anomalous coronary arteries arise in a small subset of the population, with each configuration conveying a varying degree of long-term risk. The utilization of cardiac grafts with these anomalies has not been well described. CASE PRESENTATION: An anomalous single coronary artery, with the left main coronary artery arising from the right coronary ostium, was discovered in a 40-year-old male evaluated for cardiac donation. After evaluation, this heart was successfully procured and utilized for orthotopic heart transplantation. CONCLUSION: In this report, we demonstrate that in select cases, a cardiac graft with single coronary artery anatomy can be successfully procured and transplanted with excellent outcomes.
Subjects
Coronary Artery Disease , Coronary Vessel Anomalies , Heart Transplantation , Humans , Male , Adult , Coronary Vessel Anomalies/surgery , Tissue Donors , Coronary Artery Disease/surgery
ABSTRACT
BACKGROUND: We studied humoral responses after coronavirus disease 2019 (COVID-19) vaccination across varying causes of immunodeficiency. METHODS: Prospective study of fully vaccinated immunocompromised adults (solid organ transplant [SOT], hematologic malignancy, solid cancers, autoimmune conditions, human immunodeficiency virus [HIV]) versus nonimmunocompromised healthcare workers (HCWs). The primary outcome was the proportion with a reactive test (seropositive) for immunoglobulin G to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) receptor-binding domain. Secondary outcomes were comparisons of antibody levels and their correlation with pseudovirus neutralization titers. Stepwise logistic regression was used to identify factors associated with seropositivity. RESULTS: A total of 1271 participants enrolled: 1099 immunocompromised and 172 HCWs. Compared with HCWs (92.4% seropositive), seropositivity was lower among participants with SOT (30.7%), hematological malignancies (50.0%), autoimmune conditions (79.1%), solid tumors (78.7%), and HIV (79.8%) (P < .01). Factors associated with poor seropositivity included age, greater immunosuppression, time since vaccination, anti-CD20 monoclonal antibodies, and vaccination with BNT162b2 (Pfizer) or adenovirus vector vaccines versus messenger RNA (mRNA)-1273 (Moderna). mRNA-1273 was associated with higher antibody levels than BNT162b2 or adenovirus vector vaccines after adjusting for time since vaccination, age, and underlying condition. Antibody levels were strongly correlated with pseudovirus neutralization titers (Spearman r = 0.89, P < .0001), but in seropositive participants with intermediate antibody levels, neutralization titers were significantly lower in immunocompromised individuals versus HCWs. CONCLUSIONS: Antibody responses to COVID-19 vaccines were lowest among SOT and anti-CD20 monoclonal recipients, and recipients of vaccines other than mRNA-1273.
Among those with intermediate antibody levels, pseudovirus neutralization titers were lower in immunocompromised patients than HCWs. Additional SARS-CoV-2 preventive approaches are needed for immunocompromised persons, which may need to be tailored to the cause of immunodeficiency.
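The Spearman correlation reported above is simply the Pearson correlation computed on ranks, which makes it robust to the highly skewed scale of antibody titers. A minimal pure-Python sketch on hypothetical antibody/neutralization pairs (the values are illustrative only, not study data):

```python
import math

def spearman_rho(x, y):
    # Spearman's rho is the Pearson correlation of the ranks
    # (this simple version assumes no tied values).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical antibody levels and neutralization titers (illustrative only):
antibody = [12, 48, 85, 210, 640, 1500]
titer = [20, 55, 90, 300, 710, 2100]
rho = spearman_rho(antibody, titer)
print(f"rho = {rho:.3f}")  # perfectly monotone data gives rho = 1.000
```

Because only ranks enter the calculation, a monotone transformation of either variable (e.g., log-scaling titers) leaves rho unchanged.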
Subjects
COVID-19 , HIV Infections , Adult , Antibodies, Viral , BNT162 Vaccine , COVID-19/prevention & control , COVID-19 Vaccines , HIV Infections/complications , Humans , Immunocompromised Host , Prospective Studies , SARS-CoV-2 , Vaccination
ABSTRACT
Before the 33rd Annual International Society for Heart and Lung Transplantation conference, there was significant intercenter variability in definitions of primary graft dysfunction (PGD). The incidence, risk factors, and outcomes of consensus-defined PGD warrant further investigation. We retrospectively examined 448 adult cardiac transplant recipients at our institution from 2005 to 2017. Patient and procedural characteristics were compared between PGD cases and controls. Multivariable logistic regression was used to model PGD and immediate postoperative high inotrope requirement for hypothesized risk factors. Patients were followed for a mean 5.3 years to determine longitudinal mortality. The incidence of PGD was 16.5%. No significant differences were found with respect to age, sex, race, body mass index, predicted heart mass mismatch, pretransplant amiodarone therapy, or pretransplant mechanical circulatory support (MCS) between recipients with PGD versus no PGD. Each 10-minute increase in ischemic time was associated with 5% greater odds of PGD (OR = 1.05 [95% CI, 1.00-1.10]; p = 0.049). Pretransplant MCS, predicted heart mass mismatch ≥30%, and pretransplant amiodarone therapy were associated with high immediate postoperative inotropic requirement. The 30-day, 1-year, and 5-year mortality for patients with PGD were 28.4%, 38.0%, and 45.8%, respectively, compared with 1.9%, 7.1%, and 21.5% for those without PGD (log-rank, p < 0.0001). PGD heralded high 30-day, 1-year, and 5-year mortality. Pretransplant MCS, predicted heart mass mismatch, and amiodarone exposure were associated with high inotrope requirement, while prolonged ischemic time and multiple perioperative transfusions were associated with consensus-defined PGD, which may have important clinical implications under the revised United Network for Organ Sharing allocation system.
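The per-10-minute odds ratio for ischemic time can be rescaled to other increments by exponentiation, since odds ratios from a logistic model compound multiplicatively along the predictor. A small Python sketch (the one-hour rescaling is our illustration, not a figure from the study):

```python
# The abstract reports OR = 1.05 per 10-minute increase in ischemic time.
# Odds ratios from a logistic model compound multiplicatively with the
# predictor, so a one-hour increase corresponds to six 10-minute steps.
or_per_10_min = 1.05
or_per_hour = or_per_10_min ** 6
print(f"implied OR per extra hour of ischemia: {or_per_hour:.2f}")  # ≈ 1.34
```

This is why seemingly modest per-unit odds ratios can be clinically meaningful over realistic ranges of ischemic time.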