ABSTRACT
Data from the Cardiac Transplant Research Database (CTRD) were analyzed from 1999 to 2006 to examine the effects of different induction strategies at the time of cardiac transplantation. A total of 2090 primary heart transplants were categorized by induction with interleukin-2 receptor blocker (IL-2RB), antithymocyte globulin (ATG), or no induction (NI). Probabilities for rejection and infection were estimated with parametric time-related models. Using these models, hazard was calculated for two theoretical patient profiles: one at lower risk for rejection and higher risk of infection (Profile 1), and one at higher risk for rejection and lower risk of infection (Profile 2). Of the 2090 transplants, 49.8% (1095) did not receive induction, 27.3% (599) received IL-2RB, and 18.0% (396) received ATG. Profile 1 patients had a lower hazard for rejection with IL-2RB compared with ATG and NI (p < 0.01), but at the cost of an increased risk of infection (5.0 vs. 1.8 vs. 1.6, respectively, at four weeks, p < 0.01). Profile 2 patients experienced a fivefold decrease in hazard for rejection when treated with IL-2RB compared with ATG and NI (p < 0.01). In patients at high risk of infection, IL-2RB reduced the risk of rejection but at the expense of an increased hazard for infection.
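To make the "parametric time-related model evaluated for a patient profile" idea concrete, the sketch below builds a simple two-phase hazard (an early decaying phase plus a constant late phase), scales each phase by a log-linear function of covariates, and integrates it to a four-week event probability for two hypothetical profiles. The hazard shape, covariates, and coefficients are invented for illustration only; this is not the CTRD model itself.

```python
import numpy as np

def hazard(t, x, beta_early, beta_late):
    """Two-phase hazard (events per patient-year): an early phase decaying over
    roughly the first three months plus a constant late phase, each scaled by a
    log-linear function of the covariate vector x."""
    early = np.exp(x @ beta_early) * np.exp(-t / 0.25)
    late = np.exp(x @ beta_late)
    return early + late

def event_probability(t_years, x, beta_early, beta_late, n_steps=5000):
    """Cumulative incidence 1 - exp(-H(t)), with H(t) integrated numerically."""
    grid, dt = np.linspace(0.0, t_years, n_steps, retstep=True)
    cumulative_hazard = np.sum(hazard(grid, x, beta_early, beta_late)) * dt
    return 1.0 - np.exp(-cumulative_hazard)

# Hypothetical covariate vector: [intercept, IL-2RB induction, ATG induction, profile-specific risk factor]
profile_1 = np.array([1.0, 1.0, 0.0, 1.0])
profile_2 = np.array([1.0, 1.0, 0.0, 0.0])
beta_early = np.array([0.5, -0.8, -0.2, 0.4])   # invented coefficients
beta_late = np.array([-2.0, -0.3, -0.1, 0.2])   # invented coefficients

for name, x in [("Profile 1", profile_1), ("Profile 2", profile_2)]:
    p = event_probability(4 / 52, x, beta_early, beta_late)   # risk at four weeks
    print(f"{name}: modeled 4-week event probability = {p:.3f}")
```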
Subjects
Antilymphocyte Serum/therapeutic use , Coronary Artery Disease/surgery , Graft Rejection/epidemiology , Heart Transplantation , Immunosuppressive Agents/therapeutic use , Infections/epidemiology , Receptors, Interleukin-2/antagonists & inhibitors , Adult , Follow-Up Studies , Graft Survival/drug effects , Humans , Middle Aged , Prognosis , Remission Induction , Risk Factors
ABSTRACT
BACKGROUND: Racial disparities in renal transplantation outcomes have been documented, with inferior allograft survival among African Americans compared with non-African Americans. These differences have been attributed to a variety of factors, including immunologic hyperresponsiveness, socioeconomic status, compliance, HLA matching, and access to care. The purpose of this study was to examine both immunologic and nonimmunologic risk factors for allograft loss, with the goal of defining targeted strategies to improve outcomes among African Americans. STUDY DESIGN: We retrospectively analyzed all primary deceased-donor adult renal transplants (n = 2,453) at our center between May 1987 and December 2004. Analysis included the impact of recipient and donor characteristics, HLA typing, and immunosuppressive regimen on graft outcomes. Data were analyzed using standard Kaplan-Meier actuarial techniques and were explored with nonparametric and parametric methods. Multivariable analyses in the hazard-function domain were performed to identify specific risk factors associated with graft loss. RESULTS: One-year allograft survival improved substantially throughout the study period, and 3-year allograft survival also improved. Risk factor analyses are shown by type of allograft and according to specific time periods. The risk of immunologic graft loss (acute rejection) was most prominent during the early phase. During the late phase, immunologic risk persisted (chronic rejection), but recurrent disease, graft quality, and recipient comorbidities played an increasingly greater role. CONCLUSIONS: Advances in immunosuppression regimens have contributed to allograft survival in both the early and the late (constant) phases throughout all eras, but improvement in long-term outcomes for African Americans continues to lag behind that for non-African Americans. The disparity in renal allograft loss between African Americans and non-African Americans over time indicates that, beyond immunologic risk, nonimmunologic variables such as time on dialysis before transplantation, diabetes, and access to medical care can be key issues.
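As a rough illustration of the two analysis styles named in the methods (actuarial Kaplan-Meier estimates by group, then multivariable modeling of graft loss), the sketch below uses the lifelines package. The CSV file and column names are hypothetical placeholders, and a semiparametric Cox model stands in here for the parametric hazard-domain model described in the study.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("renal_transplants.csv")  # hypothetical registry extract

# Actuarial (Kaplan-Meier) graft survival by recipient race
kmf = KaplanMeierFitter()
for group, sub in df.groupby("recipient_black"):
    kmf.fit(sub["years_to_graft_loss"], event_observed=sub["graft_lost"],
            label=f"recipient_black={group}")
    print(kmf.survival_function_at_times([1, 3]))  # 1- and 3-year graft survival

# Multivariable model of immunologic and nonimmunologic risk factors for graft loss
cph = CoxPHFitter()
cph.fit(df[["years_to_graft_loss", "graft_lost", "recipient_black",
            "hla_mismatches", "dialysis_years", "diabetes"]],
        duration_col="years_to_graft_loss", event_col="graft_lost")
cph.print_summary()
```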
Subjects
Black or African American/statistics & numerical data , Graft Survival , Kidney Transplantation , Age Factors , Diabetes Mellitus, Type 2/complications , Female , Graft Survival/immunology , Health Services Accessibility , Humans , Immunosuppression Therapy/methods , Male , Renal Dialysis , Retrospective Studies , Risk Factors , Socioeconomic Factors , Survival Analysis , Time Factors
ABSTRACT
BACKGROUND: Maintenance steroid (MS) use in pediatric heart transplantation is variable. The purpose of this study was to evaluate the impact of MS use on graft outcomes. METHODS: All patients <18 years old in the Pediatric Heart Transplant Study database at the time of first heart transplant between 1993 and 2011 who survived ≥30 days post-transplant and were from centers with a protocolized approach to MS use were included (N = 2,178). Patients were grouped by MS use at 30 days post-transplant as MS+ or MS- (no MS use). Propensity score analysis was used to generate matched groups of MS+ and MS- patients based on pre-transplant and peri-transplant factors. Kaplan-Meier survival analysis was used to compare freedom from graft loss, graft loss secondary to rejection, rejection, rejection with severe hemodynamic compromise (RSHC), malignancy, and infection between groups. RESULTS: Of the 2,178 patients, 1,393 (64%) were MS+ and 785 (36%) were MS-. There were 315 MS- patients with propensity-matched MS+ controls. Kaplan-Meier estimates showed no difference in graft loss (p = 0.9) or graft loss secondary to rejection (p = 0.09). At 1 year post-transplant, there was no difference in freedom from rejection (p = 0.15) or malignancy (p = 0.07), but freedom from RSHC and infection was lower in the MS- group (p = 0.05 and p = 0.02, respectively). CONCLUSIONS: MS use at 30 days post-transplant was not associated with enhanced graft survival after pediatric heart transplant. MS- patients had a higher incidence of RSHC and infection. These risks should be taken into consideration when determining MS use for pediatric heart transplant recipients.
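The propensity-matching step described above can be sketched as a logistic model of maintenance-steroid use on pre-/peri-transplant factors followed by 1:1 nearest-neighbor matching on the estimated score. The data file and column names below are hypothetical placeholders, not the actual PHTS variables.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("phts_cohort.csv")            # hypothetical study extract
covariates = ["age_at_tx", "congenital_dx", "era", "vad_at_tx", "pra_over_10"]

# 1. Propensity score: probability of MS+ given pre-/peri-transplant factors
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["ms_plus"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. For each MS- patient, find the nearest MS+ patient on the propensity score
treated, control = df[df["ms_plus"] == 1], df[df["ms_plus"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(treated[["ps"]])
dist, idx = nn.kneighbors(control[["ps"]])
matched_pairs = pd.DataFrame({
    "ms_minus_id": control.index,
    "ms_plus_id": treated.index[idx[:, 0]],
    "ps_distance": dist[:, 0],
})
print(matched_pairs.head())
# Freedom-from-event comparisons (graft loss, rejection, RSHC, infection) would then
# be run on the matched groups with Kaplan-Meier estimates and log-rank tests.
```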
Subjects
Graft Rejection/prevention & control , Heart Failure/surgery , Heart Transplantation , Steroids/administration & dosage , Adolescent , Age Factors , Child , Child, Preschool , Databases, Factual , Drug Administration Schedule , Female , Graft Rejection/epidemiology , Graft Survival , Heart Failure/mortality , Humans , Immunosuppression Therapy , Incidence , Infant , Kaplan-Meier Estimate , Male , Propensity Score , Retrospective Studies , Risk Factors , Treatment Outcome
ABSTRACT
OBJECTIVE: Dating back to the first published report of the Fontan circulation in 1971, multiple studies have examined the long-term results of this standard procedure for palliation of single-ventricle heart disease in children. Although the technique has evolved over the last 4 decades to include a polytetrafluoroethylene (PTFE) conduit for a large percentage of patients, the long-term outcome has not yet been established. The aim of the current study was to investigate the possibility of a late increasing risk for death after 15 years among patients with a modern Fontan operation and to evaluate late morbidity. METHODS: Between January 1, 1988, and December 31, 2011, 207 patients underwent the Fontan procedure using an internal or external PTFE conduit plus a bidirectional cavopulmonary connection. Survival and late adverse events were analyzed. Risk factors for early and late mortality were examined using hazard function methodology. RESULTS: At 1, 10, and 20 years, survival for the entire cohort was 95%, 88%, and 76%, respectively, with no deaths in the last 6 years of the study. Hazard modeling showed a 1.3% risk of death per year at 24 years after the Fontan procedure, with no late increasing hazard phase. Freedom from reoperation was greater than 90% at 20 years, and freedom from thrombotic complications was 98% at 20 years (with greater than 80% of patients on aspirin alone). Survival curves were superimposable for 16- to 20-mm conduits, and freedom from any reoperation, including transplantation, was greater than 90% after 20 years. Multivariable risk factor analysis identified only earlier date of operation as a predictor of early and late mortality. By era of surgery, the predicted 10-year survival is 89% for patients undergoing surgery in 2000 and 94% for patients in 2010. CONCLUSIONS: Early and late survival after a Fontan operation with a PTFE conduit is excellent, with no late phase of increasing risk of death after 20 years. Late functional status is good, the need for late reoperation is rare, and thrombotic complications are uncommon on a standard medical regimen with aspirin as the only anticoagulation medication.
Subjects
Fontan Procedure/mortality , Heart Defects, Congenital/surgery , Anticoagulants/therapeutic use , Fontan Procedure/adverse effects , Fontan Procedure/instrumentation , Heart Defects, Congenital/diagnosis , Heart Defects, Congenital/mortality , Humans , Kaplan-Meier Estimate , Multivariate Analysis , Palliative Care , Postoperative Complications/mortality , Postoperative Complications/surgery , Proportional Hazards Models , Reoperation , Retrospective Studies , Risk Assessment , Risk Factors , Survivors , Time Factors , Treatment Outcome
ABSTRACT
BACKGROUND: The purpose of this study was to determine the incidence and survival of patients with specific malignancies with respect to age and transplant year and to compare the data with the normal non-transplant population. METHODS: Data from 6,211 primary cardiac transplants between July 31, 1993, and December 30, 2008, were collected by 35 institutions participating in the Cardiac Transplant Research Database. Data were compared with information collected by the Surveillance, Epidemiology, and End Results (SEER) Cancer Statistics Review 1975-2006. RESULTS: Multivariable analysis showed older age (relative risk [RR], 2.1; p < 0.0001) and earlier transplant year (RR, 1.8; p < 0.0001) were highly significant risk factors for malignancy. Aggregate malignancy incidence in the modern era (2001 to 2008) did not differ significantly from the normal population, which appeared to be attributable to a lower rate of malignancies other than lung cancer, lymphoma, and melanoma (actual/expected ratio, 0.71). From 2001 to 2008, rates were significantly higher for lung cancer (actual/expected ratio, 1.86; p = 0.006) and lymphoma (actual/expected ratio, 4.3; p < 0.0001) than in the normal population. The highest risk for lymphoma was in younger adults who received transplants at ages 18 to 35 years (actual/expected ratio, 27). The highest risk for lung cancer was in patients who underwent transplantation at ages 55 to 65 years (actual/expected ratio, 28). Once malignancy was diagnosed, subsequent survival at 5 years was 21% for lung cancer and 32% for lymphoma. CONCLUSIONS: The risk of malignancy has declined markedly during the 15-year period, such that the aggregate rate of malignancy approached that of the general population in the United States. However, the distribution of malignancies was not the same, with a greater prominence of lung cancer and lymphoproliferative disease.
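An "actual/expected ratio" of the kind reported above is a standardized incidence ratio: observed post-transplant cancers divided by the count expected from population (SEER) rates applied to the cohort's follow-up. The sketch below shows that arithmetic with an exact one-sided Poisson test; all numbers are placeholders, not the study data, and the study's exact testing procedure is not specified in the abstract.

```python
from scipy import stats

observed = 24                 # hypothetical observed lymphoma cases in one age stratum
person_years = 18000.0        # hypothetical follow-up in that stratum
seer_rate = 0.00031           # hypothetical SEER incidence per person-year
expected = seer_rate * person_years

sir = observed / expected     # actual/expected (standardized incidence) ratio
# One-sided exact Poisson test of the observed count against the expected count
if sir > 1:
    p_value = stats.poisson.sf(observed - 1, expected)   # P(X >= observed)
else:
    p_value = stats.poisson.cdf(observed, expected)       # P(X <= observed)
print(f"actual/expected ratio = {sir:.2f}, one-sided p = {p_value:.4g}")
```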
Subjects
Forecasting , Heart Transplantation/adverse effects , Neoplasms/epidemiology , Adolescent , Adult , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Neoplasms/etiology , Retrospective Studies , Risk Factors , SEER Program , Survival Rate/trends , United States/epidemiology , Young Adult
ABSTRACT
Coronary spasm during coronary angiography for vasculopathy in children can be prevented by the intracoronary administration of nitroglycerin. We reviewed the anesthesia and catheterization reports and charts for pediatric transplant recipients who underwent angiography from 2005 through 2010. Correlation analysis was used to study the relation of post-injection systolic blood pressure (SBP) to nitroglycerin dose. Forty-one angiographic evaluations were performed on 25 patients (13 male and 12 female). Mean age was 9.9 ± 3.2 years (range, 3.3-16.1 yr). The mean total dose of nitroglycerin was 2.93 ± 1.60 µg/kg (range, 1-8 µg/kg). There was a significant drop between the baseline SBP (mean, 106 ± 21.6 mmHg) and the lowest mean SBP before nitroglycerin administration (78 ± 13.2 mmHg; P < 0.0001, paired t test). There was no significant additional change in SBP (mean after nitroglycerin administration, 80.7 ± 13.1 mmHg; P = 0.2). There was a significant drop in lowest heart rate between baseline (109 ± 16.5 beats/min) and before nitroglycerin administration (89 ± 14.3 beats/min; P < 0.0001, paired t test). There was no significant additional change in heart rate (mean heart rate after nitroglycerin, 84 ± 17.7 beats/min; P = 0.09). There were 2 interventions for SBP before nitroglycerin and 2 after nitroglycerin. One child experienced a transient ST-T-segment change during angiography after nitroglycerin. In the highest dose range, the additional decrease in SBP was 7.2 mmHg (P = 0.03). Routine intracoronary nitroglycerin administration in this dose range produced no significant changes in SBP or heart rate in children.
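The comparisons here are paired (each catheterization supplies its own baseline), so a paired t test is the natural tool, with a simple correlation for the dose-response question. The sketch below shows that analysis with scipy; the arrays are small placeholders, not study data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-procedure values (mmHg, µg/kg); not the study measurements
sbp_baseline = np.array([110, 98, 122, 105, 94])
sbp_pre_ntg  = np.array([82, 75, 90, 79, 70])
sbp_post_ntg = np.array([80, 77, 88, 81, 69])
ntg_dose     = np.array([1, 2, 3, 4, 8])

t1, p1 = stats.ttest_rel(sbp_baseline, sbp_pre_ntg)   # drop before nitroglycerin (anesthesia-related)
t2, p2 = stats.ttest_rel(sbp_pre_ntg, sbp_post_ntg)   # additional change after nitroglycerin
r, p_r = stats.pearsonr(ntg_dose, sbp_post_ntg)       # post-injection SBP vs. dose
print(f"baseline vs pre-NTG p={p1:.4f}; pre vs post-NTG p={p2:.4f}; dose correlation r={r:.2f} (p={p_r:.2f})")
```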
Subjects
Blood Pressure/drug effects , Coronary Angiography/adverse effects , Coronary Artery Disease/diagnostic imaging , Coronary Vasospasm/prevention & control , Heart Rate/drug effects , Heart Transplantation/adverse effects , Nitroglycerin/administration & dosage , Vasodilator Agents/administration & dosage , Adolescent , Age Factors , Child , Child, Preschool , Coronary Artery Disease/etiology , Coronary Vasospasm/diagnosis , Coronary Vasospasm/etiology , Female , Humans , Male , Nitroglycerin/adverse effects , Predictive Value of Tests , Time Factors , Vasodilator Agents/adverse effects
ABSTRACT
BACKGROUND: The Cardiac Transplant Research Database (CTRD) collected data from 26 U.S. institutions from January 1, 1990, to December 31, 2008, providing the opportunity to construct a comprehensive multivariable model of risk for death after transplantation. We analyzed risk factors for death over 19 years of experience to determine how risk profiles have changed over time and how they interact with age. METHODS: A multivariable parametric hazard model for death was created for 7,015 patients entered into the CTRD. Variables collected over 19 years of experience were examined as potential risk factors and tested for interaction with date of transplantation to determine whether their relative risk (RR) changed over time. RESULTS: The hazard for death post-transplant occurred in 2 phases: an early phase of acute risk lasting <1 year, and a late phase of relatively low, gradually increasing risk (<0.1 event/year). In the early phase, predictive models showed that a ventricular assist device (VAD) at the time of transplant did not increase the RR of death for a recipient transplanted at 30 years of age, but increased the RR of death by 60% (p = 0.04) at 60 years of age. Of the late-phase variables found to be risk factors, the RRs of age, date of transplant, and pulmonary vascular resistance changed with respect to transplant year. The overall risk of death dropped importantly over the study period, but the RR of all other variables remained unchanged. The RR was 2.6 (p < 0.0001) for 25-year-old African-American (AA) versus non-AA recipients and 1.6 for 60-year-old AA recipients (p = 0.02). CONCLUSION: Over 19 years, the baseline risk of death has decreased, but the specific risk factors and the magnitudes of their RRs have remained unchanged. Therefore, despite advances in clinical management and improvement in overall survival, the risk profile for death after cardiac transplantation is similar to that in 1990.
Subjects
Databases, Factual , Heart Transplantation/mortality , Heart Transplantation/statistics & numerical data , Adult , Age Factors , Female , Humans , Longitudinal Studies , Male , Middle Aged , Multivariate Analysis , Racial Groups , Retrospective Studies , Risk Factors , Survival Rate , Time Factors
ABSTRACT
BACKGROUND: The accuracy of various risk models to predict early post-transplant mortality is limited by the type, quality, and era of the data collected. Most models incorporate a large number of recipient-derived and donor-derived variables; however, other factors related to specific institutional practices likely influence early mortality. The goal of this study was to determine whether the addition of institutional practice variables would improve the predictive accuracy of a recipient/donor risk model in a modern cohort of heart transplant recipients. METHODS: Between 1999 and 2007, 3,591 primary heart transplants were performed at the 26 institutions participating in the Cardiac Transplant Research Database. Multivariable regression analysis in the hazard domain was used to identify recipient, donor, and institutional practice variables that were predictive of 1-year mortality. The derived model was used to predict institutional outcomes and compare them with observed outcomes, first without and then with the inclusion of the institutional practice variables. RESULTS: Eleven individual plus 2 interaction recipient variables and 2 individual plus 2 interaction donor variables were predictive of increased mortality. The addition of institutional practice variables to the model identified 4 variables associated with decreased mortality: a greater number of transplant cardiologists, a thoracic surgery fellowship, a surgery or cardiology attending taking donor call, and routine surveillance for antibody-mediated rejection. Using a p-value > 0.10 as the criterion for similarity, the addition of institutional practice variables increased the number of institutions with similar predicted versus observed mortality from 18 of 26 (69%) to 26 of 26 (100%), demonstrating improved predictive accuracy of the model. CONCLUSIONS: Multiple recipient and donor variables influence early survival but do not fully explain the difference between predicted and observed outcomes at the institutional level. Variations in staffing and clinical practice contribute to risk, and the addition of these variables to our risk model improved predictive accuracy.
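The predicted-versus-observed comparison at the institution level can be sketched as follows: sum each patient's model-predicted 1-year risk within a center and test the observed death count against it. The abstract does not state which test was used, so the binomial test here is only one plausible choice; the data file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# One row per transplant: center, model-predicted 1-year death probability, observed outcome
df = pd.read_csv("center_predictions.csv")   # hypothetical extract

for center, grp in df.groupby("center"):
    n = len(grp)
    observed = int(grp["died_1yr"].sum())
    expected_rate = grp["pred_1yr_death"].mean()
    # p > 0.10 taken (as in the study) as "predicted similar to observed"
    p = stats.binomtest(observed, n, expected_rate).pvalue
    print(f"{center}: observed {observed}/{n}, expected rate {expected_rate:.1%}, p = {p:.2f}")
```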
Subjects
Algorithms , Heart Transplantation/mortality , Medical Staff, Hospital/statistics & numerical data , Models, Statistical , Practice Patterns, Physicians'/statistics & numerical data , Adult , Cohort Studies , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Regression Analysis , Retrospective Studies , Risk Factors , Survival Rate , Treatment Outcome
ABSTRACT
Ventricular assist device implantation is associated with gastrointestinal bleeding (GIB); however, outcomes in terms of initial and repeat GIB risk, severity, location of lesions, and endoscopic interventions need to be better defined. Consecutive adult patients with ventricular assist devices (VADs) implanted between January 1, 2000, and December 31, 2010, at a single center were identified from a database, reviewed, and followed retrospectively through May 31, 2011. GIB events were further classified by severity, lesion location, and lesion type. Hazard analysis models were calculated for the time to GIB events. Of 166 patients with a VAD, 38 experienced 84 GIB events. Seventeen patients experienced ≥2 GIB events. The maximal hazard for the first bleeding event was 2.23 events/patient-year at 21 days and declined to the constant hazard by 71 days postimplantation. The hazard for recurrent GIB was greatest immediately after the first GIB event. When considering all GIB events, most lesions (68%) were located in the proximal bowel. Angiodysplasia was the most common lesion type (17.5%) seen on endoscopy when all GIB events were considered, whereas ulcers were the most common type (13.8%) seen in initial GIB events. The actuarial risk of initial GIB events peaks in the first 3 months after VAD implantation, followed by a stable lower risk of bleeding. The hazard for recurrent GIB events is substantially increased immediately after the initial GIB.
Subjects
Gastrointestinal Hemorrhage/etiology , Heart Failure/therapy , Heart-Assist Devices/adverse effects , Adult , Aged , Anticoagulants/therapeutic use , Endoscopy/methods , Female , Gastrointestinal Hemorrhage/complications , Heart Failure/complications , Humans , Male , Middle Aged , Postoperative Complications , Postoperative Period , Proportional Hazards Models , Recurrence , Retrospective Studies , Risk , Time Factors , Treatment Outcome , Young Adult
ABSTRACT
Determining the optimal timing for implantation of a ventricular assist device (VAD) in end-stage heart failure remains a challenge and may be aided by a risk assessment tool. For a cohort of 80 consecutive VAD implants at a single center, observed 1-year survival post-VAD was compared with the estimated survival had these patients not received a VAD, using the Seattle Heart Failure Model (SHFM). The SHFM was adjusted with a hazard ratio of 1.17 for inotrope use (Cochrane meta-analysis of phosphodiesterase inhibitors) and a hazard ratio of 2.92 for balloon pump, ventilator, or renal replacement therapy (Comparative Outcome and Clinical Profiles in Transplantation [COCPIT] model). Values immediately before surgery were used to calculate the SHFM score. Point estimates of 1-year survival were compared using Z scores. Mean age was 53 ± 14 (± standard deviation [SD]) years, with a mean left ventricular ejection fraction of 17 ± 6%. At the time of VAD implantation, 92% were on inotropes, 53% had a balloon pump, and 15% were intubated. For the entire cohort, 1-year survival without a VAD predicted by the SHFM was 47% versus observed survival after VAD of 60% (p = 0.06). The model was most helpful in patients electively implanted with a left ventricular assist device (LVAD). In this group, predicted 1-year survival on medical management was 49% versus an observed survival of 82% after LVAD placement (p < 0.05). The model was least helpful in patients undergoing placement of biventricular assist devices (BiVAD), where the model paradoxically predicted better survival with ongoing medical management. This indicates that the model was unable to forecast outcome in patients with a higher severity of illness, for example, in cases warranting BiVAD placement. Observed 1-year survival was better with a VAD than that predicted with medical management. Tools such as the SHFM may aid in determining appropriate timing for VAD implantation by providing an estimated survival with ongoing medical management. The model is best applied to more stable patients being considered for elective VAD implantation.
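One plausible reading of the adjustment described above is that the SHFM 1-year survival estimate was discounted under a proportional-hazards assumption (S_adjusted = S ** HR) by the inotrope and mechanical-support hazard ratios, and then compared with observed post-VAD survival by a two-proportion Z test. The sketch below follows that reading; the specific survival numbers and group sizes are placeholders, and the exact formulas used in the study are not given in the abstract.

```python
import numpy as np

def adjust_survival(s_1yr, on_inotropes, on_support):
    """Discount an SHFM 1-year survival estimate by hazard ratios under
    proportional hazards: S_adjusted = S ** HR."""
    hr = 1.0
    hr *= 1.17 if on_inotropes else 1.0   # inotrope hazard ratio (per the abstract)
    hr *= 2.92 if on_support else 1.0     # balloon pump/ventilator/RRT hazard ratio
    return s_1yr ** hr

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic comparing two survival point estimates treated as proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = np.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

s_predicted = adjust_survival(0.70, on_inotropes=True, on_support=True)  # hypothetical raw SHFM estimate
z = two_proportion_z(0.82, 34, s_predicted, 34)   # observed post-LVAD survival vs. adjusted prediction
print(f"adjusted SHFM 1-year survival = {s_predicted:.2f}, Z = {z:.2f}")
```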
Subjects
Heart Failure/classification , Heart Failure/mortality , Heart Failure/surgery , Heart-Assist Devices , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Multivariate Analysis , Risk Assessment
ABSTRACT
BACKGROUND: Jugular venous pressure (JVP) is assessed to estimate volume status in patients with heart failure because right atrial pressure (RAP) reflects pulmonary capillary wedge pressure (PCWP). In a large cohort of heart failure patients spanning 14 years, we sought to further characterize the relationship between RAP and PCWP, including identifying temporal trends, to optimize estimates of PCWP by JVP. We also sought to determine whether the RAP-to-PCWP relationship impacts post-transplant mortality. METHODS: Hemodynamic data were obtained from 4,079 patients before cardiac transplantation. Elevated RAP was defined as ≥10 mm Hg and elevated PCWP as ≥22 mm Hg. Hemodynamics were "concordant" when both RAP and PCWP were elevated or when both were not elevated. The frequency of concordant hemodynamics was assessed over 3 eras (1993 to 1997, 1998 to 2002, 2003 to 2007). Baseline characteristics were compared among quartiles of the ratio (RAP+1)/PCWP. The association of (RAP+1)/PCWP with 2-year mortality after cardiac transplantation was assessed using multivariate models. RESULTS: The frequency of concordant hemodynamics over time was stable (74%, 72%, 73%; p = 0.4). An increasing (RAP+1)/PCWP ratio was associated with the following variables: female gender; cardiomyopathy etiology other than ischemic or non-ischemic; prior sternotomies; and lower creatinine clearance (p < 0.01 for all). An elevated (RAP+1)/PCWP ratio was associated with post-transplant mortality (relative risk 1.2, 95% confidence interval 1.02 to 1.37, p = 0.02). CONCLUSIONS: RAP and PCWP remain concordant in most heart failure patients, supporting the ongoing use of JVP to estimate PCWP. Easily identifiable patient characteristics were associated with an increased RAP/PCWP ratio, and their presence should alert clinicians that PCWP may be overestimated by JVP assessment. A higher RAP/PCWP ratio was an adverse risk factor for post-cardiac transplant survival.
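The classification used in the methods is easy to express directly from the stated definitions: RAP ≥10 mm Hg and PCWP ≥22 mm Hg define "elevated", concordance means both or neither is elevated, and (RAP+1)/PCWP is split into quartiles. The sketch below implements just that bookkeeping; the data file and column names are hypothetical placeholders.

```python
import pandas as pd

df = pd.read_csv("pre_transplant_hemodynamics.csv")   # hypothetical extract: rap, pcwp, tx_year, female, ...

df["rap_elevated"] = df["rap"] >= 10      # mm Hg, per the study definition
df["pcwp_elevated"] = df["pcwp"] >= 22    # mm Hg
df["concordant"] = df["rap_elevated"] == df["pcwp_elevated"]
df["rap_pcwp_ratio"] = (df["rap"] + 1) / df["pcwp"]
df["ratio_quartile"] = pd.qcut(df["rap_pcwp_ratio"], 4, labels=[1, 2, 3, 4])

# Concordance by era, and an example baseline comparison across ratio quartiles
df["era"] = pd.cut(df["tx_year"], bins=[1992, 1997, 2002, 2007],
                   labels=["1993-1997", "1998-2002", "2003-2007"])
print(df.groupby("era")["concordant"].mean())
print(df.groupby("ratio_quartile")["female"].mean())
```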
Subjects
Heart Failure/physiopathology , Heart Transplantation , Pulmonary Wedge Pressure/physiology , Ventricular Pressure/physiology , Cardiac Catheterization , Female , Follow-Up Studies , Heart Failure/mortality , Heart Failure/surgery , Humans , Male , Middle Aged , Prognosis , Retrospective Studies , Severity of Illness Index , Survival Rate/trends , Time Factors , United States/epidemiology
ABSTRACT
BACKGROUND: The purpose of this study was to estimate the relationship of age, race, and gender to rejection and infection across time with respect to age at the time of transplant, year of transplantation, and immunosuppressive era. METHODS: The study group consisted of 10,131 patients from 29 institutions in the Cardiac Transplant Research Database (n = 7,368, from 1990 to 2008) and 32 institutions in the Pediatric Heart Transplant Study (n = 2,763, from 1993 to 2008). The probabilities of rejection death and infection death were estimated with a parametric time-related model and adjusted for gender, ethnicity, date of transplant, and age. RESULTS: Actuarial survival by age at transplant showed that, compared with the majority of patients transplanted between the ages of 30 and 60 years, death due to rejection at 5 years was highest among those transplanted at 10 to 30 years of age (p < 0.0001) and lowest in those transplanted at >60 years of age. Death due to infection at 5 years was highest among patients >60 years of age. Risk factors for death from rejection included age (p < 0.0001), female gender (p = 0.0001), black race (p < 0.0001), and transplant date (p < 0.0001); for infection death, risk factors were age (p < 0.0001), date of transplant (p < 0.0001), age (p = 0.002), and black race (p = 0.01). Modeling with respect to age at the time of transplant showed an inverse relationship between infection and rejection death. Among patients transplanted at >60 years of age, there was a steep increase in infection-related deaths and a decrease in rejection deaths. Risk for rejection was elevated among young adults 10 to 30 years of age at the time of transplant, particularly black females. CONCLUSION: Death from rejection affects adolescents and young adults preferentially, especially black recipients, whereas death from infection preferentially affects patients >60 years of age. The relative risk of infection versus rejection death with respect to recipient age should be considered in therapeutic plans for recurrent rejection, particularly in adolescents and the elderly.
Subjects
Bacterial Infections/prevention & control , Graft Rejection/prevention & control , Heart Failure/surgery , Heart Transplantation/mortality , Opportunistic Infections/prevention & control , Adolescent , Adult , Age Factors , Bacterial Infections/immunology , Black People , Child , Child, Preschool , Female , Graft Rejection/immunology , Heart Failure/ethnology , Heart Transplantation/immunology , Humans , Immunosuppressive Agents/therapeutic use , Infant , Longitudinal Studies , Male , Middle Aged , Opportunistic Infections/immunology , Retrospective Studies , Treatment Outcome , White People , Young Adult
ABSTRACT
BACKGROUND: This study was conducted to determine the effect of a disease-management model termed an "intensive surveillance protocol" (ISP) on survival in ventricular assist device (VAD) patients. This intervention consisted of a formalized, protocol-driven, multi-disciplinary team approach to VAD patient follow-up initiated August 1, 2006. The goal was to attain an internal program benchmark of 70% survival at 2 years. Historically, 2-year survival after VAD implant has been sub-optimal, and no patient management algorithms have been formally tested to determine their effect on 2-year survival. METHODS: The study comprised 76 patients, of whom 26 had a VAD as destination therapy (DT) and 50 as a bridge to transplant (BTT), from July 1, 2003, to June 30, 2008. Survival before and after initiation of ISP was compared. A parametric hazard multivariable analysis, with a time-varying covariable for implementation of ISP, was used to evaluate of other factors affecting survival. RESULTS: Survival at 16 months was 100% for DT patients who received a VAD after August 1, 2006 vs 64% for the earlier era (p = 0.06). For BTT, 16- month survival was 71% vs 43% (p = 0.03). Predicted 2-year survival before and after implementation of the ISP improved from 30% to 87% for DT (p = 0.02) and from 20% to 61% for BTT patients (p = 0.01). Predictors of midterm survival by multivariable analysis included ISP (p = 0.004), younger age (p = 0.03), non-emergent implant (p < 0.0001), and isolated left ventricular VAD (p < 0.0001). After adjustment for covariables, the ISP was associated with a 70% reduction in the hazard for death for the entire cohort (p = 0.004). The effect of ISP was also significant in the patients who received the HeartMate XVE (Thoratec, Pleasanton, CA), which spanned both eras of the study. CONCLUSIONS: Survival improved for DT and BTT VAD patients after implementation of the ISP, with a dramatic decrease in hazard for death. Although the transition from pulsatile to axial flow technology occurred during the study period and likely contributed to improved outcomes, the institution of the ISP provided an important and significant contribution to improved survival through a proactive approach to patient management, allowing earlier identification of potential adverse events. For optimal outcomes, VAD patients require intensive follow-up surveillance protocols that have previously become standard in the care of heart transplant patients.
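The "time-varying covariable" idea used above means that follow-up for each patient is split at the date the ISP started, so the protocol "turns on" mid-course for patients implanted before August 2006. The sketch below shows that data layout with lifelines; a Cox time-varying model stands in for the parametric hazard model used in the study, and the files and column names are hypothetical placeholders.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter
from lifelines.utils import to_long_format, add_covariate_to_timeline

base = pd.read_csv("vad_cohort.csv")       # hypothetical: id, duration, event, age, emergent, bivad
isp = pd.read_csv("isp_start_times.csv")   # hypothetical: id, time (days from implant to 2006-08-01), isp=1

# Convert to start/stop format and switch the ISP covariate on at its start time
long_df = to_long_format(base, duration_col="duration")
long_df = add_covariate_to_timeline(long_df, isp, duration_col="time",
                                    id_col="id", event_col="event")
long_df["isp"] = long_df["isp"].fillna(0)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # the coefficient on isp approximates the protocol effect, adjusted for the other covariates
```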
Subjects
Heart Failure/mortality , Heart Failure/therapy , Heart-Assist Devices , Sentinel Surveillance , Adolescent , Adult , Aged , Algorithms , Female , Follow-Up Studies , Humans , Kaplan-Meier Estimate , Longitudinal Studies , Male , Middle Aged , Multivariate Analysis , Survival Rate , Treatment Outcome , Young Adult
ABSTRACT
BACKGROUND: Total lymphoid irradiation (TLI) has been used in transplantation for over 20 years and is currently used in a number of major heart transplant centers as a secondary therapy for recalcitrant recurrent rejection or rejection with hemodynamic compromise. The purpose of this study was to evaluate the long-term risks and efficacy of TLI in the treatment of rejection. METHODS: Between 1990 and 1996, 73 adult patients (from 211 adult transplant recipients) received TLI for recurrent rejection (71%), rejection with hemodynamic compromise (25%), or rejection with vasculitis (4%). The treatment consisted of 80 cGy twice per week for 5 weeks. Fifty-five patients received at least 80% of the full dose (≥640 cGy). Follow-up ended December 31, 2007, comprising a total 18-year experience. RESULTS: Patients treated with TLI exhibited a short-term decrease in the hazard for rejection in the first 12 months post-transplantation (relative risk, 0.36) but exhibited increased cumulative rejection over the long term. There were no differences in the rates of infection, allograft coronary disease, or malignancy, but seven patients developed myelodysplasia or acute myelogenous leukemia, four of those being the rare but uniformly fatal acute megakaryocytic leukemia (type 7). CONCLUSIONS: Patients treated with TLI seemed to experience a reduction in the early hazard for rejection, but long-term outcomes indicate that such patients continued to accumulate more rejection and rejection-death events, likely because these patients were at much higher overall risk for rejection than the other patient groups. We observed minimal long-term complications, except for the unique occurrence of myelodysplasia and acute megakaryocytic leukemia (type 7).
Subjects
Graft Rejection/radiotherapy , Heart Transplantation/mortality , Adult , Graft Rejection/prevention & control , Hemodynamics , Humans , Lymphatic Irradiation , Paraproteinemias/etiology
ABSTRACT
BACKGROUND: In 2003 the Department of Health and Human Services sponsored the Organ Donation Breakthrough Collaborative (ODBC) with the aim of increasing organ donation. After the ODBC, increases were seen in the number of all solid organs transplanted except the heart. The aim of this study was to determine whether the ODBC resulted in temporal changes in the use of hearts from high-risk donors. METHODS: We analyzed data from the Cardiac Transplant Research Database in three eras: 1990-1995, 1996-2002, and 2003-2007. We explored temporal changes in high-risk donor characteristics: age, gender, hypertension, diabetes mellitus, abnormal echocardiogram, and ischemic time. RESULTS: Between 1990 and 2007, 7,220 patients underwent transplantation in 26 centers. Donors in the first era were least likely to have the high-risk characteristics of higher age (mean, 30 years), female gender (30%), hypertension (8%), diabetes mellitus (1%), structural abnormalities on echocardiogram (7%), and prolonged graft ischemic time (mean, 163 minutes). In the second era, there was a significant increase in the use of donors with the above-mentioned high-risk characteristics: 32 years, 33%, 10%, 3%, 8%, and 181 minutes, respectively. In the third, post-ODBC era, no further increase was seen in high-risk donors, but rather a trend toward avoidance of risk: 32 years, 28%, 10%, 2%, 5%, and 186 minutes, respectively. CONCLUSION: Significant temporal changes in the characteristics of heart donors have occurred over the past 17 years. Recent temporal changes, however, cannot be directly attributed to the ODBC efforts.
Subjects
Graft Rejection/epidemiology , Heart Transplantation/trends , Tissue Donors , Age Factors , Diabetes Complications/complications , Echocardiography , Humans , Hypertension/complications , Risk Factors , Sex Factors , United States
ABSTRACT
BACKGROUND: Quantification of donor-associated risk in a specific heart transplant recipient is often difficult. Our aim was to identify donor characteristics that affect survival in the contemporary era. METHODS: Between 1990 and 2006, 7,322 patients from 32 centers in the Cardiac Transplant Research Database underwent heart transplantation. Multivariable logistic regression analysis was used to identify donor-associated risk predictors and important interactions between these donor characteristics. Recipient survival was examined using parametric regression analysis in the hazard function domain. RESULTS: Donor characteristics associated with post-transplant death included donor age, donor requirement for vasoactive therapy, positive donor cytomegalovirus serology, longer graft ischemic time, and lower donor body weight. Several interactions between individual donor characteristics affected survival. In male donors, a history of hypertension and diabetes mellitus were risk factors for death (p = 0.006 and p = 0.04, respectively), but not in female donors (p = 0.5 and p = 0.8, respectively). There was a significant interaction between donor age and recipient-donor weight difference. If the donor was younger, an increasing recipient-donor weight difference did not increase mortality; with increasing donor age, the weight difference did compromise survival (p < 0.0003). Donor and recipient gender further modified the degree of risk: risk was higher with female donors and male recipients (p < 0.0003). CONCLUSIONS: This multi-institutional analysis identified important interactions between donor characteristics that affect post-transplant survival and that explain some of the discrepancies in the results of previous studies. The results are likely to aid in efficient organ allocation.
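The interaction findings above (donor age with recipient-donor weight difference, donor gender with donor hypertension) correspond to product terms in a multivariable model. The sketch below expresses them as a logistic regression with statsmodels formulas; the data file, column names, and outcome definition are hypothetical placeholders, and the study's actual model also used parametric hazard-domain analysis rather than logistic regression alone.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("donor_recipient_pairs.csv")   # hypothetical registry extract

model = smf.logit(
    "death_1yr ~ donor_age * weight_diff_kg + donor_female * donor_htn"
    " + ischemic_time_min + donor_cmv_pos",
    data=df,
).fit()
print(model.summary())
# A significant donor_age:weight_diff_kg coefficient corresponds to the finding that a
# large recipient-donor weight difference compromises survival only with older donors.
```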
Subjects
Heart Diseases/surgery , Heart Transplantation/mortality , Heart Transplantation/physiology , Tissue Donors , Transplantation , Adult , Age Factors , Body Weight , Diabetes Mellitus , Female , Humans , Hypertension , Kaplan-Meier Estimate , Male , Middle Aged , Multivariate Analysis , Retrospective Studies , Sex Factors , Survival Rate
ABSTRACT
BACKGROUND: Patients with muscular dystrophy are at risk of developing a dilated cardiomyopathy and can progress to advanced heart failure. At present, it is not known whether such patients can safely undergo cardiac transplantation. METHODS: This was a retrospective review of the Cardiac Transplant Research Database, a multi-institutional registry of 29 transplant centers in the United States, from 1990 to 2005. The post-cardiac transplant outcomes of 29 patients with muscular dystrophy were compared with those of 275 non-muscular dystrophy patients with non-ischemic cardiomyopathy, matched for age, body mass index, gender, and race. RESULTS: Becker's muscular dystrophy was present in 52% of the patients. Survival in the muscular dystrophy patients was similar to that of controls at 1 year (89% vs 91%; p = 0.5) and at 5 years (83% vs 78%; p = 0.5). The differences in rates of cumulative infection, rejection, or allograft vasculopathy between the 2 groups were not significant (p > 0.5 for all comparisons). CONCLUSIONS: Recognizing the limitations of the present investigation (i.e., selection bias and lack of data on the functional capacity of the muscular dystrophy patients), the current study suggests that the clinical outcomes after cardiac transplantation in selected patients with muscular dystrophy are similar to those seen in age-matched patients with non-ischemic cardiomyopathy.
Subjects
Cardiomyopathy, Dilated/etiology , Cardiomyopathy, Dilated/surgery , Heart Transplantation , Muscular Dystrophy, Duchenne/complications , Adult , Cardiomyopathy, Dilated/mortality , Case-Control Studies , Disease Progression , Female , Graft Rejection , Humans , Male , Middle Aged , Retrospective Studies , Survival Rate , Treatment Outcome , United States
ABSTRACT
BACKGROUND: Donor and recipient risk factors for rejection and infection have been well characterized. The contribution of demographic factors, especially age at the time of transplantation, to morbidity and mortality due to rejection and infection is much less well understood. METHODS: Using parametric hazard analysis and multivariate risk-factor equations for infection and rejection events, we quantitatively determined the relationship of fundamental demographic variables (age, race, and gender) to infection and rejection. These analyses were conducted with respect to date of transplant and age at the time of transplantation. The patient group consisted of all primary heart transplants performed at the University of Alabama at Birmingham from 1990 to 2007 (n = 526). RESULTS: Risk factors for rejection within 12 months post-transplantation were date of transplant (p < 0.0001) and age at the time of transplantation (young adults 10 to 30 years of age, p < 0.0001). Risk factors for infection were date of transplant (p < 0.0001) and age at the time of transplantation (young children and older adults, p < 0.0001). Three immunosuppressive eras spanned 1990 to 2007. Notably, although the proportion of patients experiencing rejection and infection events decreased during each successive immunosuppressive era, the relative relationship of infection to rejection, as well as its relationship to age at the time of transplantation, remained similar into the most recent era. The maximal frequency of rejection events and rejection death occurred among patients transplanted at ages 10 to 30 years. Conversely, the frequency of infection events was minimal within the same group. In the oldest and youngest patients receiving transplants, infection was the predominant cause of death and rates of rejection events decreased. CONCLUSIONS: These data show that evolving immunosuppressive strategies have successfully reduced rejection and infection frequencies, and that patients transplanted at 30 to 60 years of age have the lowest frequency of rejection/infection events. However, individuals transplanted at younger or older ages, especially non-white recipients in the 10- to 30-year age group, experience significantly more infection or rejection. Therefore, programs should increase the level of surveillance in these patients and consider modification of immunosuppressive regimens to lower the frequency of infection and rejection events.
Subjects
Graft Rejection/epidemiology , Heart Transplantation/adverse effects , Infections/epidemiology , Adolescent , Adult , Aged , Child , Demography , Female , Heart Transplantation/immunology , Humans , Immunosuppression Therapy/methods , Immunosuppressive Agents/adverse effects , Male , Middle Aged , Patient Selection , Proportional Hazards Models , Racial Groups , Retrospective Studies , Risk Factors , Young Adult
ABSTRACT
OBJECTIVE: In 1990, Fontan, Kirklin, and colleagues published equations for survival after the so-called "Perfect Fontan" operation. After 1988, we evolved a protocol using an internal or external polytetrafluoroethylene tube of 16 to 19 millimetres in diameter placed from the inferior caval vein to either the right or left pulmonary artery, along with a bidirectional cavopulmonary connection. The objective of this study was to test the hypothesis that a "perfect" outcome is routinely achievable in the current era when using a standardized surgical procedure. METHODS: Between 1 January, 1988, and 12 December, 2005, 112 patients underwent the Fontan procedure using an internal or external polytetrafluoroethylene tube plus a bidirectional cavopulmonary connection, the latter usually having been constructed as a previous procedure. This constituted 45% of our overall experience in constructing the Fontan circulation between 1988 and 1996, and 96% of the experience between 1996 and 2005. Among all surviving patients, the median follow-up was 7.3 years. We calculated the expected survival for an optimal candidate, given by the initial equations, and compared this with our entire experience in constructing the Fontan circulation. RESULTS: An internal tube was used in 61 patients, 97% of whom were operated on prior to 1998, and an external tube in 51 patients, the latter accounting for 95% of all operations since 1999. At 1, 5, 10, and 15 years, survival of the entire cohort receiving polytetrafluoroethylene tubes is superimposable on the curve calculated for a "perfect" outcome. Freedom from replacement or revision of the tube was 97% at 10 years. CONCLUSION: Using a standardized operative procedure, combining a bidirectional cavopulmonary connection with a polytetrafluoroethylene tube placed from the inferior caval vein to the pulmonary arteries for nearly all patients with functionally univentricular hearts, early and late survival within the "perfect" outcome as predicted by the initial equations of Fontan and Kirklin is routinely achievable in the current era. The need for late revision or replacement of the tube is rare.
Subjects
Fontan Procedure , Fontan Procedure/instrumentation , Fontan Procedure/methods , Fontan Procedure/mortality , Heart Defects, Congenital/mortality , Heart Defects, Congenital/surgery , Humans , Multivariate Analysis , Polytetrafluoroethylene , Reoperation , Treatment Outcome
ABSTRACT
BACKGROUND: Outcomes of patients with a prior diagnosis of peripartum cardiomyopathy (PPCM) undergoing heart transplantation are not well described but may be worse than for women who undergo transplantation for other etiologies. METHODS: Between 1999 and 2005, 69 women younger than 40 years of age underwent transplantation for PPCM at 29 institutions participating in the Cardiac Transplant Research Database. Patients with PPCM were compared with 90 female recipients of similar age with idiopathic dilated cardiomyopathy (IDC) and a history of pregnancy (P+), 53 with no prior pregnancy (P-), and 459 men of similar age with IDC. Rejection, infection, cardiac allograft vasculopathy, and survival were compared. RESULTS: Recipients with PPCM accounted for 1% of all transplants and 5% of transplants in women. Comparisons among the 4 patient groups were made. The risk of cumulative rejection was higher in the PPCM group compared with the P- group (p < 0.04) and the men (p < 0.0001). The cumulative risk of infection was lowest in the PPCM group. Freedom from cardiac allograft vasculopathy was similar or higher in the PPCM group compared with the other groups. Finally, the long-term survival of PPCM patients was comparable with that of the men (p = 0.9), and there was a trend toward improved survival compared with the P+ group (p = 0.07) and improved survival compared with the P- group (p = 0.05). CONCLUSIONS: Heart transplantation for PPCM remains relatively infrequent. Survival and freedom from cardiac allograft vasculopathy in patients who receive a transplant for PPCM are no worse than in women who require a transplant for other indications, regardless of parity.