ABSTRACT
BACKGROUND: The formation of transmural lesions is necessary for successful ablation of persistent atrial fibrillation (prAF). Ablation index (AI) and generator impedance drop (ID) predict lesion size, but their association with long-term outcomes in prAF is unknown. Furthermore, we proposed a new parameter, the efficacy ratio (ER), calculated as ID/AI, to gain indirect insight into the role of factors that affect ID but are not considered by AI. METHODS: We included ablations performed during the DECAAF II trial if they had uploaded lesion-by-lesion summary data and were performed with radiofrequency catheters on the CARTO system. Average patient-level parameters were calculated from all generated Vizitags. RESULTS: A total of 427 ablations met inclusion criteria, of which 166 utilized AI. Analyzed as continuous variables, ID and ER, but not AI, predicted long-term arrhythmia-free survival. The optimal cut-off for ID was ≥ 10.4 ohms, with a C-index of 0.55; it predicted a reduced risk of arrhythmia: hazard ratio (HR) 0.56 [0.36-0.88], p = .013 (arrhythmia-free survival of 67% vs. 52%). Similarly, an ER cut-off of 1.7 ohms/100 AI had a C-index of 0.58 and predicted reduced arrhythmia recurrence: HR 0.39 [0.22-0.69], p = .001. An ER < 1.7 ohms/100 AI was associated with arrhythmia-free survival of just 32%. ER improved prognostication compared with ID alone and identified a subset of low-ID patients with even worse outcomes. CONCLUSION: Average ID was predictive of improved outcomes following ablation of prAF. The ratio ID/AI (ER) was postulated as a measure summarizing the overall impact of factors not considered in the AI formula and provided improved prognostication.
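As an illustration of how these patient-level metrics might be derived, here is a minimal Python sketch operating on a hypothetical lesion-by-lesion Vizitag table. The column names (`ablation_index`, `impedance_drop`) and the per-100-AI scaling implied by the reported units are assumptions for illustration, not the study's actual pipeline.

```python
import pandas as pd

# Hypothetical lesion-by-lesion Vizitag export; column names are assumed.
lesions = pd.DataFrame({
    "patient_id":     [1, 1, 1, 2, 2],
    "ablation_index": [450, 520, 480, 400, 410],
    "impedance_drop": [8.0, 11.5, 9.0, 5.0, 6.5],  # ohms
})

# Patient-level averages across all generated Vizitags.
per_patient = lesions.groupby("patient_id").mean()

# Efficacy ratio: impedance drop per 100 units of ablation index
# (scaling assumed from the reported units of ohms/100 AI).
per_patient["efficacy_ratio"] = (
    100 * per_patient["impedance_drop"] / per_patient["ablation_index"]
)

# Dichotomize at the cut-offs reported in the abstract.
per_patient["high_id"] = per_patient["impedance_drop"] >= 10.4  # ohms
per_patient["high_er"] = per_patient["efficacy_ratio"] >= 1.7   # ohms/100 AI

print(per_patient)
```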
ABSTRACT
INTRODUCTION: Obesity is implicated in adverse atrial remodeling and worse outcomes in patients with atrial fibrillation. The objective of this study was to assess the effect of body mass index (BMI) on ablation-induced scar formation on late gadolinium enhancement cardiac magnetic resonance imaging (LGE-CMR). METHODS: We conducted an analysis of DECAAF II participants who underwent LGE-CMR scans to measure scar formation 3 months after catheter ablation. Ablation parameters and lesion delivery were not dependent on BMI. The effect of BMI on ablation success was explored. RESULTS: Our analyses included 811 patients. Comorbidities were more prevalent in obese patients. Baseline left atrial volume was higher in obese individuals: 118, 126, 135, 140, and 143 mL for normal weight, overweight, and class 1, 2, and 3 obesity, respectively (p < .001). BMI was inversely associated with scar formation (R = -0.135, p < .001), with patients with class 3 obesity having the lowest percentage of ablation-induced scar: 11.1%, 10.3%, 9.5%, 8.8%, and 6.8% by ascending BMI group. There was an inverse correlation between BMI and the amount of fibrosis covered by ablation scar: 24%, 23%, 21%, and 18% by ascending BMI group (p = .001). In the fibrosis-guided ablation group, BMI was positively associated with residual fibrosis (R = 0.056, p = .005). CONCLUSION: Obese patients have less ablation-induced scar formation, less fibrosis coverage, and more residual fibrosis postablation compared with nonobese patients, regardless of ablation parameters, including impedance drop.
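For readers who want to reproduce this style of analysis, the sketch below computes a Pearson correlation between BMI and scar percentage and summarizes by BMI class; all data, coefficients, and column names are synthetic stand-ins, not DECAAF II data.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic example: in the study, scar burden came from LGE-CMR at 3 months.
df = pd.DataFrame({"bmi": rng.uniform(20, 45, 200)})
df["scar_pct"] = 14 - 0.15 * df["bmi"] + rng.normal(0, 2, 200)

# WHO-style BMI categories used for the group-wise comparisons.
bins = [0, 25, 30, 35, 40, np.inf]
labels = ["normal", "overweight", "obese_1", "obese_2", "obese_3"]
df["bmi_class"] = pd.cut(df["bmi"], bins=bins, labels=labels)

r, p = pearsonr(df["bmi"], df["scar_pct"])
print(f"Pearson R = {r:.3f}, p = {p:.3g}")
print(df.groupby("bmi_class", observed=True)["scar_pct"].mean())
```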
ABSTRACT
Ectopic variceal bleeding is a potentially underrecognized source of gastrointestinal (GI) hemorrhage. While vascular complications following pancreas transplantation are relatively common, the development of symptomatic ectopic venous varices has rarely been reported. We report two patients with a remote history of simultaneous pancreas-kidney (SPK) transplantation presenting two decades after transplant with an occult GI bleed. In both cases, a lengthy diagnostic course was required. The varices were treated with coil embolization via a transhepatic approach. Our findings add to the limited literature on this topic and aid in the recognition, diagnosis, and management of this unusual presentation.
Subject(s)
Embolization, Therapeutic; Esophageal and Gastric Varices; Pancreas Transplantation; Varicose Veins; Humans; Esophageal and Gastric Varices/diagnosis; Esophageal and Gastric Varices/etiology; Esophageal and Gastric Varices/therapy; Gastrointestinal Hemorrhage/diagnosis; Gastrointestinal Hemorrhage/etiology; Gastrointestinal Hemorrhage/therapy; Varicose Veins/complications; Varicose Veins/therapy; Pancreas Transplantation/adverse effects
ABSTRACT
INTRODUCTION: The use of therapeutic apheresis (TA), either as stand-alone or adjunctive treatment, in kidney transplantation has increased over the years, making it a leading indication for TA. This study describes recent trends in indications for TA related to kidney transplantation, adverse events, and patient outcomes in this cohort. METHODS: This is a retrospective cohort review of adults who received TA for kidney transplant-related indications from January 1, 2017, to December 31, 2022, at the University of Virginia Medical Center, Charlottesville, VA, USA. Data extracted included basic demographics, indication for apheresis, number of procedures, procedure characteristics, procedure-related adverse events (complications), and serum ionized calcium and serum creatinine. Data were analyzed using the Statistical Package for the Social Sciences (SPSS 2022, IBM Inc.). RESULTS: Data from a total of 131 patients who received 860 TA procedures were analyzed. Indications for TA were antibody-mediated rejection (65.5%), recurrent focal segmental glomerulosclerosis (15%), thrombotic microangiopathy (5%), desensitization for ABO incompatibility (4.5%) and for HLA incompatibility (4.5%), and recurrent IgA nephropathy (1%). Adverse events were encountered in 16.7% of procedures and included hypocalcemia (7%), vascular access malfunction (0.7%), hypotension (1.2%), arrhythmia (0.6%), and depletion coagulopathy (0.6%). The overall case mortality rate was 8.4% over the 6-year period. One death was recorded on machine during TA, yielding a procedure-related mortality rate of 0.12%. CONCLUSION: Antibody-mediated rejection was the most common indication for TA related to kidney transplantation. Adverse events were minor, and patient survival over time was within expected limits.
Subject(s)
Blood Component Removal; Kidney Transplantation; Humans; Retrospective Studies; Blood Component Removal/methods; Adult; Middle Aged; Female; Male; Graft Rejection
ABSTRACT
INTRODUCTION: Pulmonary hypertension (PH) is prevalent in those with end-stage kidney disease (ESKD) and poses a barrier to kidney transplant because of its association with poor outcomes. Studies examining these adverse outcomes are limited and often use echocardiographic measurements of pulmonary artery systolic pressure (PASP) instead of the gold-standard right heart catheterization (RHC). We hypothesized that in ESKD patients deemed ineligible for kidney transplant because of an echocardiographic diagnosis of PH, the predominant cause of PH is hypervolemia and is potentially reversible. METHODS: We conducted a prospective study of 16 patients with ESKD who were denied transplant candidacy. Prior echocardiograms and RHCs were reviewed for confirmation of PH. Patients were admitted for daily sessions of ultrafiltration for volume removal, and repeat RHCs were performed following the intervention. RHC parameters and body weight were compared before and after the intervention. Statistical analysis was performed using GraphPad Prism software; a p-value <.05 was considered statistically significant. RESULTS: Following the intervention, the mean pulmonary artery pressure (mPAP) and pulmonary arterial wedge pressure decreased from 45.0 ± 3.06 to 29.1 ± 7.77 mmHg (p < .0001) and from 22.2 ± 5.06 to 13.1 ± 7.25 mmHg (p = .003), respectively. Pulmonary vascular resistance decreased from 4.73 ± 1.99 to 4.28 ± 2.07 WU (p = .30). Eleven patients from the initial cohort underwent successful kidney transplantation post-intervention, with 100% survival at 1 year. CONCLUSIONS: In ESKD patients, diagnoses of PH made by echocardiography may be largely due to hypervolemia and may be optimized using an intensive ultrafiltration strategy to restore transplant candidacy.
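The before/after comparison reported here is a paired design. A minimal SciPy sketch of such a test is shown below; the study itself used GraphPad Prism, and these mPAP values are invented for illustration.

```python
import numpy as np
from scipy.stats import ttest_rel

# Invented mPAP values (mmHg) before and after intensive ultrafiltration;
# each position is the same patient measured at both time points.
mpap_pre  = np.array([44, 47, 42, 49, 45, 43, 46, 48])
mpap_post = np.array([30, 28, 33, 35, 27, 29, 31, 26])

t, p = ttest_rel(mpap_pre, mpap_post)  # paired t-test
delta = (mpap_post - mpap_pre).mean()
print(f"mean change = {delta:.1f} mmHg, t = {t:.2f}, p = {p:.4f}")
```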
Subject(s)
Heart Failure; Hypertension, Pulmonary; Kidney Transplantation; Humans; Hypertension, Pulmonary/diagnosis; Prospective Studies; Echocardiography; Vascular Resistance; Cardiac Catheterization; Retrospective Studies
ABSTRACT
Cutaneous squamous cell carcinoma (CSCC) is a major cause of morbidity and mortality after organ transplant. Many patients develop multiple CSCCs following a first CSCC, and their risk of metastasis and death is significantly increased compared with the general population. Post-transplant CSCC represents a disease at the interface of dermatology and transplant medicine. Both systemic chemoprevention and modulation of immunosuppression are frequently employed in patients with multiple CSCCs, yet there is little consensus on their use after a first CSCC to reduce the risk of subsequent tumors. While relatively few controlled trials have been undertaken, extrapolation of observational data suggests the most effective interventions may be those made at the time of the first CSCC. We review the need for intervention after a first post-transplant CSCC and the evidence for various approaches to secondary prevention, before discussing barriers to engagement with this approach and highlighting areas for future research. Close collaboration between specialties to ensure prompt deployment of these interventions after a first CSCC may improve patient outcomes.
Subject(s)
Carcinoma, Squamous Cell; Skin Neoplasms; Humans; Carcinoma, Squamous Cell/etiology; Skin Neoplasms/etiology; Skin Neoplasms/prevention & control
ABSTRACT
Organ transplant recipients (OTRs) are at increased risk of cutaneous malignancy. Skin disorders in OTRs of color (OTRoC) have rarely been systematically assessed. We aimed to ascertain the burden of skin disease encountered in OTRoC by prospectively collecting data from OTRs attending two posttransplant skin surveillance clinics: one in London, UK, and one in Philadelphia, USA. Retrospective review of all dermatological diagnoses was performed. Data from 1766 OTRs were analyzed: 1024 (58%) white, 376 (21%) black, 261 (15%) Asian, 57 (3%) Middle Eastern/Mediterranean (ME/M), and 48 (2.7%) Hispanic; 1128 (64%) were male. Viral infections affected 45.1% of OTRs and were more common in white and ME/M patients (P < .001). Fungal infections affected 28.1% and were more common in ME/M patients (P < .001). Inflammatory skin disease affected 24.5% and was most common in black patients (P < .001). In addition, 26.4% of patients developed skin cancer. There was an increased risk of skin cancer in white vs nonwhite OTRs (HR 4.4, 95% CI 3.5-5.7, P < .001): keratinocyte cancers were more common in white OTRs (P < .001) and Kaposi sarcoma was more common in black OTRs (P < .001). These data support the need for programs that promote targeted dermatology surveillance for all OTRs, regardless of race/ethnicity or country of origin.
Subject(s)
Organ Transplantation; Skin Diseases; Skin Neoplasms; Humans; Male; Organ Transplantation/adverse effects; Philadelphia; Retrospective Studies; Skin Diseases/epidemiology; Skin Diseases/etiology; Skin Neoplasms/epidemiology; Skin Neoplasms/etiology; Transplant Recipients
ABSTRACT
BACKGROUND: Given a substantial decline in pancreas transplantation (PT) across the United States over the past 15 years, we sought to understand the perceptions and practices of US PT programs. METHODS: Surveys were sent to members of the American Society of Transplant Surgeons and the American Society of Transplantation by email and professional society postings between August 2019 and November 2019. Program characteristics were obtained from the Scientific Registry of Transplant Recipients. RESULTS: One hundred twenty-three responses were recorded from 56 unique programs. Respondents were transplant surgeons (71%), transplant nephrologists (17%), trainees (9%), and allied professionals (3%). Programs were defined according to annual volume as low (<5 PT/year), intermediate (6-20), or high (>20). High-volume programs reported that the following factors were most important for increasing PT volume: expansion of recipient selection, more aggressive donor utilization, and hiring of PT program-specific personnel. At both the program and national levels, the vast majority (82% and 79%, respectively) felt that the number of PTs currently performed is not in balance with patients' needs. CONCLUSIONS: Overall, programs reported that the option of PT is not offered adequately to diabetic patients and that strategies to maintain higher PT volume are most evident at intermediate-volume and, especially, high-volume programs.
Subject(s)
Kidney Transplantation; Pancreas Transplantation; Humans; Surveys and Questionnaires; Tissue Donors; Transplant Recipients; United States
ABSTRACT
Kidney transplantation (KT) is the optimal therapy for end-stage kidney disease (ESKD), resulting in significant improvements in survival and quality of life compared with maintenance dialysis. The burden of cardiovascular disease (CVD) in ESKD is reduced after KT; however, CVD remains the leading cause of premature patient and allograft loss, as well as a source of significant morbidity and healthcare costs. All major phenotypes of CVD, including coronary artery disease, heart failure, valvular heart disease, arrhythmias, and pulmonary hypertension, are represented in the KT recipient population. Pre-existing risk factors for CVD in the KT recipient are amplified by superimposed cardio-metabolic derangements after transplantation, such as the metabolic effects of immunosuppressive regimens, obesity, posttransplant diabetes, hypertension, dyslipidemia, and allograft dysfunction. This review summarizes the major risk factors for CVD in KT recipients and describes the individual phenotypes of overt CVD in this population. It highlights gaps in the existing literature to emphasize the need for future studies in those areas and to optimize cardiovascular outcomes after KT. Finally, it outlines the need for a joint 'cardio-nephrology' clinical care model to ensure continuity, multidisciplinary collaboration, and implementation of best clinical practices toward reducing CVD after KT.
Subject(s)
Cardiovascular Diseases; Disease Management; Kidney Transplantation/adverse effects; Transplant Recipients; Cardiovascular Diseases/epidemiology; Cardiovascular Diseases/etiology; Cardiovascular Diseases/therapy; Global Health; Humans; Incidence; Kidney Failure, Chronic/surgery; Survival Rate/trends
ABSTRACT
Chronic inflammation is increased in patients with chronic kidney disease (CKD) and contributes to cardiovascular morbidity and mortality. The specific immune mechanisms and pathways that drive and maintain chronic inflammation in CKD are not well described. The TAM ligands (Gas6 and protein S) and receptors (Axl and Mer) have recently been recognized as playing a prominent role in immune regulation. The receptors exist in both soluble and cell-bound forms; the soluble receptors (sAxl and sMer) are believed to compete with the bound receptors and thus inhibit their function. In this study, we determined the expression of cell-bound and soluble TAM proteins in patients with CKD. CKD patients had significantly lower expression of Mer in monocytes, yet increased levels of the soluble TAM receptors sAxl and sMer in plasma, compared with controls. The metalloproteinase ADAM17, responsible for cleavage of Mer to its soluble form, was increased in patient monocytes. Elevated levels of soluble TAM receptors were more evident in patients with progressive renal failure. These observations suggest that functional deficiency of TAM receptor-mediated regulation of inflammation may contribute to chronic inflammation in patients with CKD.
Subject(s)
Gene Expression Regulation/physiology; Intercellular Signaling Peptides and Proteins/metabolism; Protein S/metabolism; Proto-Oncogene Proteins/metabolism; Receptor Protein-Tyrosine Kinases/metabolism; Renal Insufficiency, Chronic/metabolism; Humans; Inflammation; Intercellular Signaling Peptides and Proteins/genetics; Monocytes/metabolism; Protein S/genetics; Proto-Oncogene Proteins/genetics; Receptor Protein-Tyrosine Kinases/genetics; Renal Insufficiency, Chronic/immunology; c-Mer Tyrosine Kinase; Axl Receptor Tyrosine Kinase
ABSTRACT
Importance: Current left bundle branch block (LBBB) criteria are based on animal experiments or mathematical models of cardiac tissue conduction and may misclassify patients. Improved criteria would impact referral decisions and device type for cardiac resynchronization therapy. Objective: To develop a simple new criterion for LBBB based on electrophysiological studies of human patients, and then to validate this criterion in an independent population. Design, Setting, and Participants: In this diagnostic study, the derivation cohort was from a single-center, prospective study of patients undergoing electrophysiological study from March 2016 through November 2019. The validation cohort was assembled by retrospectively reviewing medical records for patients from the same center who underwent transcatheter aortic valve replacement (TAVR) from October 2015 through May 2022. Exposures: Patients were classified as having LBBB or intraventricular conduction delay (IVCD) as assessed by intracardiac recording. Main Outcomes and Measures: Sensitivity and specificity of the electrocardiography (ECG) criteria assessed in patients with LBBB or IVCD. Results: A total of 75 patients (median [IQR] age, 63 [53-70.5] years; 21 [28.0%] female) with baseline LBBB on 12-lead ECG underwent intracardiac recording of the left ventricular septum: 48 demonstrated complete conduction block (CCB) and 27 demonstrated intact Purkinje activation (IPA). Analysis of surface ECGs revealed that late notches in the QRS complexes of lateral leads were associated with CCB (40 of 48 patients [83.3%] with CCB vs 13 of 27 patients [48.1%] with IPA had a notch or slur in lead I; P = .003). Receiver operating characteristic curves for all septal and lateral leads were constructed, and lead I displayed the best performance, with a time to notch longer than 75 milliseconds. Used in conjunction with the criteria for LBBB from the American College of Cardiology/American Heart Association/Heart Rhythm Society, this criterion had a sensitivity of 71% (95% CI, 56%-83%) and specificity of 74% (95% CI, 54%-89%) in the derivation population, contrasting with a sensitivity of 96% (95% CI, 86%-99%) and specificity of 33% (95% CI, 17%-54%) for the Strauss criteria. In an independent validation cohort of 46 patients (median [IQR] age, 78.5 [70-84] years; 21 [45.7%] female) undergoing TAVR with interval development of new LBBB, the time-to-notch criterion demonstrated a sensitivity of 87% (95% CI, 74%-95%). In the subset of 10 patients with preprocedural IVCD, the criterion correctly distinguished IVCD from LBBB in all cases. Application of the Strauss criteria performed similarly in the validation cohort. Conclusions and Relevance: The findings suggest that time to notch longer than 75 milliseconds in lead I is a simple ECG criterion that, when used in conjunction with standard LBBB criteria, may improve specificity for identifying patients whose LBBB pattern represents true conduction block. This may help inform patient selection for cardiac resynchronization therapy or conduction system pacing.
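To make the proposed criterion concrete, the following Python sketch applies the time-to-notch threshold and computes sensitivity and specificity against an intracardiac reference. The measurements and labels are invented, and extracting notch timing from raw ECG signals is outside its scope.

```python
import numpy as np

def meets_notch_criterion(time_to_notch_ms: float) -> bool:
    """Adjunct criterion: notch/slur in lead I later than 75 ms after QRS
    onset, applied alongside standard LBBB criteria."""
    return time_to_notch_ms > 75

# Invented cohort: time to notch in lead I (ms) and the intracardiac label
# (True = complete conduction block, False = intact Purkinje activation).
time_to_notch = np.array([82, 90, 70, 110, 65, 95, 88, 60, 105, 72])
has_ccb = np.array([1, 1, 0, 1, 0, 1, 0, 0, 1, 1], dtype=bool)

pred = np.array([meets_notch_criterion(t) for t in time_to_notch])
tp, fn = np.sum(pred & has_ccb), np.sum(~pred & has_ccb)
tn, fp = np.sum(~pred & ~has_ccb), np.sum(pred & ~has_ccb)

print(f"sensitivity = {tp / (tp + fn):.2f}")
print(f"specificity = {tn / (tn + fp):.2f}")
```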
Subject(s)
Bundle-Branch Block; Electrocardiography; Humans; Bundle-Branch Block/physiopathology; Bundle-Branch Block/diagnosis; Bundle-Branch Block/therapy; Female; Male; Aged; Middle Aged; Prospective Studies; Retrospective Studies
ABSTRACT
Background: The benefit of pulmonary vein isolation (PVI) in patients with atrial fibrillation (AF) and heart failure with reduced ejection fraction (HFrEF) is well established; its efficacy in patients with heart failure with preserved ejection fraction (HFpEF) is less clear. Objective: The objective of the study was to compare AF and heart failure (HF) rehospitalizations after PVI in patients with HFpEF vs HFrEF. Methods: The IBM MarketScan Database was used to identify patients undergoing PVI for AF. Patients were categorized by HF status: absence of HF, presence of HFrEF, or presence of HFpEF. Primary outcomes were HF and arrhythmia hospitalizations after PVI. Results: A total of 32,524 patients were analyzed: 27,900 with no HF (86%), 2948 with HFrEF (9%), and 1676 with HFpEF (5%). Compared with those with no HF, patients with either HFrEF or HFpEF were more likely to be hospitalized for HF (hazard ratio [HR] 7.27; P < .01 for HFrEF and HR 9.46; P < .01 for HFpEF) and for AF (HR 1.17; P < .01 for HFrEF and HR 1.74; P < .01 for HFpEF) after PVI. In matched analysis, 23% of patients with HFrEF and 24% of patients with HFpEF demonstrated a reduction in HF hospitalizations (P = .31), and approximately one-third demonstrated decreased arrhythmia rehospitalizations (P = .57) in the 6 months after PVI. In longer-term follow-up (>1 year), patients with HFpEF were more likely than those with HFrEF to have HF (HR 1.30; P < .01) and arrhythmia (HR 1.19; P < .01) rehospitalizations. Conclusion: Reductions in HF and arrhythmia hospitalizations are observed early after PVI across all patients with HF, but patients with HFpEF demonstrate higher HF rehospitalization and arrhythmia recurrence in longer-term follow-up than do patients with HFrEF.
ABSTRACT
BACKGROUND: Catheter ablation has received a class 1 indication for young, otherwise healthy patients with symptomatic paroxysmal atrial fibrillation (AF). Anti-arrhythmic drugs (AADs) remain first-line therapy before ablating persistent AF (PersAF). We sought to evaluate the efficacy of a direct-to-catheter ablation approach against catheter ablation after AADs in PersAF. METHODS: In this DECAAF II subanalysis, patients were stratified into two subgroups: a 'direct-to-catheter' group, comprising patients who had not received AADs prior to ablation, and a 'second-line ablation' group, comprising patients who had been on any AAD therapy at any time before ablation. Patients were followed over 18 months. The primary outcome was AF recurrence. Secondary outcomes included AF burden, quality of life (QoL), assessed by the AFSS and SF-36 scores, and changes in the left atrial volume index (LAVI), assessed by LGE-MRI scans. RESULTS: The analysis included 815 patients: 279 in the direct-to-catheter group and 536 in the second-line ablation group. The primary outcome was similar between the groups (44.8% vs 44.4%, p > 0.05), as was AF burden (20% vs 16%, p > 0.05). Early remodeling, reflected by LAVI reduction, was similar between the groups (9.1 [1.6-18.0] in the second-line ablation group vs 9.5 [2.5-19.7] in the direct-to-catheter group, p > 0.05). QoL before and after ablation was also similar (p > 0.05). On multivariate analysis, history of AAD use was not predictive of AF recurrence (p > 0.05). CONCLUSION: Prior AAD therapy had minimal impact on atrial remodeling and QoL improvement, and limited benefit on AF recurrence and burden post-ablation, in patients with PersAF. Additional studies are warranted to explore the efficacy of catheter ablation as first-line therapy in PersAF.
ABSTRACT
BACKGROUND: Catheter ablation is recognized as an effective treatment for atrial fibrillation (AF). Despite its effectiveness, significant sex-specific differences have been observed that influence procedural outcomes. This study explores these differences in a cohort of patients with persistent AF. We aimed to assess sex differences in baseline characteristics, symptoms, quality of life, imaging findings, and response to catheter ablation in patients with persistent AF. METHODS: This post hoc analysis of the DECAAF II trial evaluated 815 patients (161 female, 646 male). Between July 2016 and January 2020, participants were enrolled and randomly assigned to receive either personalized ablation targeting left atrial (LA) fibrosis using DE-MRI in conjunction with pulmonary vein isolation (PVI) or PVI alone. In this analysis, we compared female and male patients in the full cohort in terms of demographics, risk factors, medications, and outcomes, including AF recurrence, AF burden, LA volume reduction assessed by LGE-MRI before and 3 months after ablation, quality of life assessed by the SF-36 score, and safety outcomes. Statistical methods included t-tests, chi-square tests, and multivariable Cox regression. RESULTS: Females were generally older, had more comorbidities, and experienced higher rates of arrhythmia recurrence post-ablation (53.3% vs. 40.2%, p < 0.01). Females also showed a higher AF burden (21% vs. 16%, p < 0.01) and a smaller reduction in left atrial volume indexed to body surface area post-ablation compared with male patients (8.36 [9.94] vs. 11.35 [13.12], p = 0.019). Quality-of-life scores were significantly worse in females both pre- and post-ablation (54 vs. 66 pre-ablation; 69 vs. 81 post-ablation, both p < 0.01), despite similar improvements across sexes. Safety outcomes and procedural parameters were similar between male and female patients. CONCLUSION: The study highlights significant sex differences in outcomes of catheter ablation for persistent AF, with female patients showing worse quality of life, higher AF recurrence and burden after ablation, and less favorable LA remodeling.
ABSTRACT
AIMS: Although myocardial scar assessment using late gadolinium enhancement (LGE) cardiac magnetic resonance (CMR) imaging is frequently indicated for patients with implantable cardioverter defibrillators (ICDs), metal artefact can degrade image quality. With the new wideband technique designed to mitigate device-related artefact, CMR is increasingly used in this population. However, the common clinical indications for CMR referral and the impact on clinical decision-making and prognosis are not well defined. Our study was designed to address these knowledge gaps. METHODS AND RESULTS: One hundred seventy-nine consecutive patients with an ICD (age 59 ± 13 years, 75% male) underwent CMR using cine and wideband pulse sequences for LGE imaging. Electronic medical records were reviewed to determine the reason for CMR referral, whether there was a change in clinical decision-making, and the occurrence of major adverse cardiac events (MACEs). The most common referral indication was evaluation of ventricular tachycardia (VT) substrate (n = 114, 64%), followed by cardiomyopathy (n = 53, 30%). Overall, CMR resulted in a new or changed diagnosis in 64 (36%) patients and impacted clinical management in 51 (28%); the effect on management change was highest in patients presenting with VT. A total of 77 patients (43%) experienced MACE during the follow-up period (median 1.7 years), including 65 patients with evidence of LGE. Kaplan-Meier analysis showed that ICD patients with LGE had worse outcomes than those without LGE (P = 0.006). CONCLUSION: The clinical yield from LGE CMR is high, providing management-changing and meaningful prognostic information in a significant proportion of patients with ICDs.
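The survival comparison described here can be reproduced in outline with the lifelines package. The sketch below fits Kaplan-Meier estimates and a log-rank test on invented follow-up data; it is a schematic of the method, not the study's analysis.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

# Invented follow-up data: time to MACE (years) and event indicator,
# split by presence of LGE on wideband CMR.
t_lge, e_lge = rng.exponential(2.0, 100), rng.integers(0, 2, 100)
t_none, e_none = rng.exponential(4.0, 79), rng.integers(0, 2, 79)

km = KaplanMeierFitter()
km.fit(t_lge, e_lge, label="LGE present")
print(km.median_survival_time_)

result = logrank_test(t_lge, t_none,
                      event_observed_A=e_lge, event_observed_B=e_none)
print(f"log-rank p = {result.p_value:.3f}")
```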
Subject(s)
Defibrillators, Implantable; Tachycardia, Ventricular; Humans; Male; Middle Aged; Aged; Female; Defibrillators, Implantable/adverse effects; Contrast Media; Magnetic Resonance Imaging, Cine/methods; Gadolinium; Arrhythmias, Cardiac/etiology; Tachycardia, Ventricular/diagnostic imaging; Tachycardia, Ventricular/therapy; Magnetic Resonance Spectroscopy; Predictive Value of Tests
ABSTRACT
BACKGROUND: The TAM receptors (Tyro3, Axl, and Mer) and their ligands (the vitamin K-dependent proteins Gas6 and protein S) are crucial modulators of inflammation, which may be relevant in chronic kidney disease (CKD). Gas6 and Axl have multiple roles in mediating vascular atherosclerosis and injury, thrombosis, and inflammation, yet nothing is known about the Gas6-Axl pathway in humans with CKD. Given the prevalence of chronic inflammation and vascular disease in this population, we measured TAM ligands in patients with various levels of renal function. METHODS: Gas6 and protein S were quantified in plasma by ELISA in three groups: patients with end-stage renal disease on chronic hemodialysis (HD), patients with CKD, and normal controls. RESULTS: Significantly increased levels of Gas6 and protein S were found in CKD patients compared with normal controls (P < 0.01 and P < 0.001, respectively). In HD patients, Gas6 levels were elevated compared with controls (P < 0.001) and positively associated with low albumin (r = 0.33; P = 0.01), dialysis vintage (r = 0.36; P = 0.008), and IV iron administration (r = 0.33; P = 0.01). Gas6 levels rose with CKD stage and were inversely associated with estimated GFR (P < 0.0001). CONCLUSIONS: Dysregulation of circulating Gas6 is associated with renal disease and inversely proportional to renal function. Low albumin and higher IV iron administration were associated with higher Gas6 levels, suggesting a possible connection between inflammation and iron-mediated oxidative stress. Protein S levels were also elevated in CKD patients, but the relevance of this finding needs further investigation.
Subject(s)
Intercellular Signaling Peptides and Proteins/blood; Kidney Failure, Chronic/blood; Adult; Aged; Aged, 80 and over; Biomarkers/blood; Enzyme-Linked Immunosorbent Assay; Female; Humans; Male; Middle Aged; Protein Precursors/blood; Protein S/analysis; Prothrombin; Renal Dialysis
ABSTRACT
OBJECTIVE: Continuous glucose monitoring (CGM) improves diabetes management, but its reliability in individuals on hemodialysis is poorly understood and potentially affected by interstitial and intravascular volume variations. RESEARCH DESIGN AND METHODS: We assessed the accuracy of a factory-calibrated CGM using venous blood glucose measurements (vBGM) during hemodialysis sessions and self-monitoring of blood glucose (SMBG) at home. RESULTS: Twenty participants completed the protocol. The mean absolute relative difference (MARD) of the CGM was 13.8% when calculated against SMBG (n = 684) and 14.4% against vBGM (n = 624), with 98.7% and 100% of values, respectively, falling in zones A/B of the Parkes error grid. Throughout 181 days of CGM monitoring, the median time in range (70-180 mg/dL) was 38.5% (interquartile range 29.3-57.9), with 28.7% (7.8-40.6) of the time spent >250 mg/dL. CONCLUSIONS: The overall performance of a factory-calibrated CGM appears reasonably accurate and clinically relevant for use in practice by individuals on hemodialysis and health professionals to improve diabetes management.
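The two headline metrics here are straightforward to compute: MARD is the mean of |CGM - reference| / reference expressed as a percentage, and time in range is the fraction of readings within 70-180 mg/dL. The Python sketch below uses invented paired readings; Parkes error grid zoning is omitted because it requires the published zone boundaries.

```python
import numpy as np

def mard(cgm: np.ndarray, ref: np.ndarray) -> float:
    """Mean absolute relative difference (%) of paired CGM/reference values."""
    return float(np.mean(np.abs(cgm - ref) / ref) * 100)

def time_in_range(cgm: np.ndarray, lo: float = 70, hi: float = 180) -> float:
    """Percentage of CGM readings within [lo, hi] mg/dL."""
    return float(np.mean((cgm >= lo) & (cgm <= hi)) * 100)

# Invented paired readings (mg/dL) around a dialysis session.
ref = np.array([95, 140, 210, 180, 260, 120])
cgm = np.array([88, 155, 190, 170, 300, 110])

print(f"MARD = {mard(cgm, ref):.1f}%")
print(f"TIR 70-180 mg/dL = {time_in_range(cgm):.1f}%")
```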
Subject(s)
Blood Glucose; Diabetes Mellitus, Type 1; Blood Glucose Self-Monitoring/methods; Humans; Renal Dialysis; Reproducibility of Results
ABSTRACT
A 45-year-old man with stage IV melanoma presented with incessant nonsustained wide complex tachycardia. He was found to have a right ventricular intracardiac metastasis that created a nidus for ventricular tachycardia refractory to multiple therapeutic interventions. The patient underwent catheter ablation for this rare indication, with successful arrhythmia control by direct ablation over the tumor surface. (Level of Difficulty: Advanced.)
ABSTRACT
BACKGROUND AND OBJECTIVE: Variable age thresholds are often used at transplant centers for simultaneous heart and kidney transplantation (HKT). We hypothesized that selected older recipients enjoy outcomes comparable to those of younger recipients in the current era of HKT. METHODS: We performed a retrospective analysis of HKT outcomes in the United Network for Organ Sharing (UNOS) registry from 2006 to 2018, classifying patients by age at transplant as ≥65 or <65 years. The primary outcome was patient death. Secondary outcomes included all-cause kidney graft failure and death-censored kidney allograft failure. RESULTS: Of 973 patients, 774 (80%) were younger than 65 years (mean 52 ± 10 years) and 199 (20%) were 65 years or older (mean 67 ± 2 years). The older HKT cohort included fewer black patients (22% vs 35%, P = .01) and fewer women (12% vs 18%, P = .04). Fewer older patients received dialysis (30% vs 54%, P < .001) or mechanical support (36% vs 45%, P = .03) before HKT. Older recipients received organs from slightly older donors. The median follow-up time was shorter for patients 65 years or older than for the younger group (2.3 vs 3.3 years, P < .001). Patient survival was similar between the groups (mean 8.8 vs 9.8 years, P = .3), with the most common causes of death being cardiovascular (29%) and infectious complications (28%). There was no difference in all-cause kidney graft survival (mean 8.7 vs 9.3 years, P = .8). Most commonly, recipients died with a functioning renal allograft (59.8%), and this occurred more often in older patients (81.4% vs 54.8%, P = .001). Cox proportional hazards modeling showed that higher donor age (hazard ratio [HR] 1.015, P = .01; HR 1.022, P = .02) and use of pre-transplant dialysis (HR 1.5, P = .004; HR 1.8, P = .006) increased the risk of all-cause and death-censored kidney allograft failure, respectively. CONCLUSIONS: Our study showed that carefully selected older patients have outcomes similar to those of a younger cohort and argues for comprehensive evaluation of recipients, with age considered as part of comorbidity assessment, rather than use of an arbitrary age threshold for candidacy.
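As a schematic of the survival modeling used in this kind of registry analysis, the following lifelines sketch fits a Cox proportional hazards model on synthetic data. The variable names (`donor_age`, `pre_dialysis`) are stand-ins chosen to mirror the covariates named above, not the actual UNOS schema.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 300

# Synthetic stand-in for registry data; names and distributions are invented.
df = pd.DataFrame({
    "time_years":   rng.exponential(8, n).round(2),  # follow-up time
    "graft_failed": rng.integers(0, 2, n),           # event indicator
    "donor_age":    rng.normal(40, 12, n).round(),
    "pre_dialysis": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="graft_failed")
cph.print_summary()  # hazard ratios are exp(coef), as reported above
```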