Results 1 - 20 of 37

1.
J Card Fail ; 29(12): 1593-1602, 2023 12.
Article in English | MEDLINE | ID: mdl-37451602

ABSTRACT

BACKGROUND: Duration of recovery and long-term outcomes have not been well described in a large cohort of patients with heart failure with recovered ejection fraction (HFrecEF) owing to nonischemic cardiomyopathy. The aim of this study was to characterize the duration of recovery and long-term outcomes of patients with HFrecEF. METHODS AND RESULTS: We performed a retrospective analysis of our institution's databases. Only patients with nonischemic cardiomyopathy, a chronic HF diagnosis, and a previous left ventricular ejection fraction (LVEF) of ≤35% who had a subsequent LVEF of ≥50% were considered to have recovery. Patients with an LVEF of ≤35% who did not recover served as the comparison group. Included were 2319 patients with an LVEF of ≤35%, of whom 465 (20% [18.4%-21.7%]) met the above criteria for recovery (HFrecEF group). Recovery in the HFrecEF group was temporary in most cases, with 50% of patients experiencing a decline in LVEF to <50% within 3.5 [interquartile range 2.4-4.9] years after the day of recovery. Age- and sex-adjusted rates of death and hospitalization were lower in the HFrecEF group than in the HFrEF group (HR 0.29 [95% CI 0.20-0.41] for death and HR 0.44 [95% CI 0.32-0.60] for HF hospitalization; P < .0001 for both). Longer recovery was associated with better survival, with patients spending >5 years in recovery (LVEF of ≥50%) displaying the highest survival rates (83% alive at 10 years after recovery). Survival after recurrence of LV dysfunction was longer for those whose recovery duration was >1 year. CONCLUSIONS: Patients with nonischemic HFrecEF display a unique clinical course. Although recovery is temporary in most cases, patients with HFrecEF have lower mortality and hospitalization rates, and the more durable the recovery of LV systolic function, the longer the anticipated survival.
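The recovery criterion above (a prior LVEF of ≤35% followed by a later LVEF of ≥50%) is straightforward to apply to longitudinal echocardiography records. A minimal pandas sketch, assuming hypothetical columns patient_id, echo_date, and lvef rather than the authors' actual database schema:

```python
import pandas as pd

def flag_recovery(echos: pd.DataFrame) -> pd.DataFrame:
    """Flag patients whose LVEF was <=35% and later reached >=50%.

    Assumes (hypothetical) columns: patient_id, echo_date, lvef (percent).
    """
    echos = echos.sort_values(["patient_id", "echo_date"])
    rows = []
    for pid, grp in echos.groupby("patient_id"):
        low = grp[grp["lvef"] <= 35]
        if low.empty:
            continue  # never met the LVEF <=35% criterion
        first_low_date = low["echo_date"].iloc[0]
        later = grp[grp["echo_date"] > first_low_date]
        recovered = later[later["lvef"] >= 50]
        rows.append({
            "patient_id": pid,
            "recovered": not recovered.empty,
            "recovery_date": recovered["echo_date"].iloc[0] if not recovered.empty else pd.NaT,
        })
    return pd.DataFrame(rows)
```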


Subject(s)
Cardiomyopathies , Heart Failure , Humans , Heart Failure/diagnosis , Heart Failure/therapy , Stroke Volume , Ventricular Function, Left , Retrospective Studies , Follow-Up Studies , Cardiomyopathies/diagnosis , Cardiomyopathies/therapy , Prognosis
2.
J Card Fail ; 2023 Oct 27.
Article in English | MEDLINE | ID: mdl-37890655

ABSTRACT

BACKGROUND: Positron emission tomography (PET) myocardial flow reserve (MFR) is a noninvasive method of detecting cardiac allograft vasculopathy in recipients of heart transplants (HTs). There are limited data on longitudinal change in and predictors of MFR following HT. METHODS: We conducted a retrospective analysis of HT recipients undergoing PET myocardial perfusion imaging at an academic center. Multivariable linear and Cox regression models were constructed to identify longitudinal trends, predictors, and the prognostic value of MFR after HT. RESULTS: In all, 183 HT recipients underwent 658 PET studies. The average MFR was 2.34 ± 0.70. MFR initially increased during the first 3 years following HT (+0.12 per year; P = 0.01) before declining at a rate of -0.06 per year (P < 0.001). MFR declined preceding acute rejection and improved after treatment. Treatment with mammalian target of rapamycin (mTOR) inhibitors (37.2%) slowed the rate of annual MFR decline (P = 0.03). Higher-intensity statin therapy was associated with improved MFR. Longer time post-transplant (P < 0.001), hypertension (P < 0.001), chronic kidney disease (P < 0.001), diabetes mellitus (P = 0.038), antibody-mediated rejection (P = 0.040), and cytomegalovirus infection (P = 0.034) were associated with reduced MFR. Reduced MFR (HR: 7.6, 95% CI: 4.4-13.4; P < 0.001) and PET-defined ischemia (HR: 2.3, 95% CI: 1.4-3.9; P < 0.001) were associated with a higher risk of the composite outcome of mortality, retransplantation, heart failure hospitalization, acute coronary syndrome, or revascularization. CONCLUSION: MFR declines after the third post-transplant year and is prognostic for cardiovascular events. Cardiometabolic risk-factor modification and treatment with higher-intensity statin therapy and mTOR inhibitors are associated with a higher MFR.
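The reported trajectory (MFR rising for roughly 3 years after HT, then declining) is the kind of pattern a linear-spline (piecewise) mixed model can capture, since each recipient contributes repeated PET studies. A minimal sketch with statsmodels, assuming hypothetical columns patient_id, years_post_ht, and mfr; this is not the authors' actual model specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_piecewise_mfr(df: pd.DataFrame, knot_years: float = 3.0):
    """Linear-spline mixed model for MFR versus years after transplant.

    Assumes (hypothetical) columns: patient_id, years_post_ht, mfr.
    The coefficient on years_post_ht is the early slope; the coefficient on
    years_after_knot is the change in slope after the knot.
    """
    d = df.copy()
    d["years_after_knot"] = np.clip(d["years_post_ht"] - knot_years, 0.0, None)
    model = smf.mixedlm("mfr ~ years_post_ht + years_after_knot",
                        data=d, groups=d["patient_id"])
    return model.fit()
```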

3.
Clin Endocrinol (Oxf) ; 99(3): 285-295, 2023 09.
Article in English | MEDLINE | ID: mdl-37041100

ABSTRACT

OBJECTIVE: For patients with obesity and diabetes, bariatric surgery can lead to the remission of both diseases. However, the possible impact of diabetes on the magnitude of weight loss outcomes after bariatric surgery has not been precisely quantified. RESEARCH DESIGN AND METHODS: Data from the Michigan Bariatric Surgery Cohort (MI-BASiC) were extracted to examine the effect of baseline diabetes on weight loss outcomes. Consecutive patients older than 18 years of age undergoing gastric bypass (GB) or sleeve gastrectomy (SG) for obesity at the University of Michigan between January 2008 and November 2013 were included. Repeated measures analysis was used to determine whether diabetes was a predictor of weight loss outcomes over 5 years after surgery. RESULTS: Of the 714 included patients, 380 underwent GB [mean BMI 47.3 ± 0.4 kg/m2, diabetes in 149 (39.2%)] and 334 underwent SG [mean BMI 49.9 ± 0.5 kg/m2, diabetes in 108 (32.3%)]. Multivariable repeated measures analysis showed, after adjusting for covariates, that individuals with diabetes had a significantly lower percentage of total (p = .0023) and excess weight loss (p = .0212) compared to individuals without diabetes. CONCLUSIONS: Our data demonstrate that patients with diabetes undergoing bariatric surgery experience less weight loss than patients without diabetes.
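The two outcomes compared here, percent total weight loss and percent excess weight loss, are simple ratios; excess weight is commonly referenced to the weight at a BMI of 25 kg/m2, an assumption not stated in the abstract. A small sketch:

```python
def percent_weight_loss(baseline_kg: float, current_kg: float,
                        height_m: float, ref_bmi: float = 25.0):
    """Return (%TWL, %EWL).

    %TWL = weight lost / baseline weight x 100.
    %EWL = weight lost / excess weight x 100, where excess weight is baseline
    weight minus the weight at a reference BMI (assumed here to be 25 kg/m2).
    """
    lost = baseline_kg - current_kg
    reference_kg = ref_bmi * height_m ** 2
    pct_twl = 100.0 * lost / baseline_kg
    pct_ewl = 100.0 * lost / (baseline_kg - reference_kg)
    return pct_twl, pct_ewl

# Example: 140 kg at baseline, 100 kg at follow-up, height 1.70 m
print(percent_weight_loss(140, 100, 1.70))  # approximately (28.6, 59.0)
```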


Subject(s)
Bariatric Surgery , Diabetes Mellitus, Type 2 , Gastric Bypass , Obesity, Morbid , Humans , Obesity, Morbid/surgery , Follow-Up Studies , Michigan , Gastric Bypass/adverse effects , Obesity/surgery , Obesity/etiology , Diabetes Mellitus, Type 2/complications , Diabetes Mellitus, Type 2/surgery , Weight Loss , Treatment Outcome , Retrospective Studies
4.
J Card Fail ; 28(5): 765-774, 2022 05.
Article in English | MEDLINE | ID: mdl-34961663

ABSTRACT

BACKGROUND: The Fried Frailty Phenotype predicts adverse outcomes in geriatric populations but has not been well studied in advanced heart failure (HF). The Registry Evaluation of Vital Information for Ventricular Assist Devices (VADs) in Ambulatory Life (REVIVAL) study prospectively collected frailty measures in patients with advanced HF to determine relevant assessments and their impact on clinical outcomes. METHODS AND RESULTS: HF-Fried Frailty was defined by 5 baseline components (1 point each): (1) weakness: hand grip strength less than 25% of body weight; (2) slowness: based on time to walk 15 feet; (3) weight loss of more than 10 lbs in the past year; (4) inactivity; and (5) exhaustion, the latter two assessed by the Kansas City Cardiomyopathy Questionnaire. A score of 0 or 1 was deemed nonfrail, 2 prefrail, and 3 or greater frail. The primary composite outcome was durable mechanical circulatory support implantation, cardiac transplant, or death at 1 year. Event-free survival for each group was determined by the Kaplan-Meier method, and the hazards for prefrail and frail patients were compared with nonfrail patients using proportional hazards modeling. Among 345 patients with all 5 frailty domains assessed, frailty was present in 17%, prefrailty in 40%, and 43% were nonfrail, with 67% (n = 232) meeting the criterion for inactivity and 54% (n = 186) for exhaustion. Frail patients had an increased risk of the primary composite outcome (unadjusted hazard ratio [HR] 2.82, 95% confidence interval [CI] 1.52-5.24; adjusted HR 3.41, 95% CI 1.79-6.52), as did prefrail patients (unadjusted HR 1.97, 95% CI 1.14-3.41; adjusted HR 2.11, 95% CI 1.21-3.66), compared with nonfrail patients; however, the predictive value of the HF-Fried Frailty criteria was modest (Harrell's C-statistic 0.603, P = .004). CONCLUSIONS: The HF-Fried Frailty criteria had only modest predictive power in identifying ambulatory patients with advanced HF at high risk for durable mechanical circulatory support, transplant, or death within 1 year, driven primarily by the assessments of inactivity and exhaustion. Focus on these patient-reported measures may better inform clinical trajectories in this population.
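The 5-component definition translates directly into a small scoring function. A sketch, assuming a placeholder walking-time cutoff for the slowness component, since REVIVAL's exact threshold is not given in the abstract:

```python
def hf_fried_frailty(grip_kg, body_weight_kg, walk_15ft_seconds,
                     weight_loss_lbs_past_year, inactive, exhausted,
                     slow_walk_cutoff_s=7.0):
    """Score the 5 HF-Fried components (1 point each) described above.

    The slowness cutoff (seconds to walk 15 feet) is a placeholder, not
    REVIVAL's actual threshold.
    """
    score = 0
    score += grip_kg < 0.25 * body_weight_kg           # weakness
    score += walk_15ft_seconds > slow_walk_cutoff_s    # slowness (assumed cutoff)
    score += weight_loss_lbs_past_year > 10            # weight loss
    score += bool(inactive)                            # inactivity (KCCQ-based)
    score += bool(exhausted)                           # exhaustion (KCCQ-based)
    category = "frail" if score >= 3 else "prefrail" if score == 2 else "nonfrail"
    return score, category

# Example
print(hf_fried_frailty(18, 90, 8.5, 12, inactive=True, exhausted=False))  # (4, 'frail')
```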


Subject(s)
Frailty , Heart Failure , Aged , Fatigue , Frail Elderly , Frailty/diagnosis , Frailty/epidemiology , Hand Strength , Heart Failure/diagnosis , Heart Failure/epidemiology , Heart Failure/therapy , Humans , Patient Reported Outcome Measures , Registries
5.
J Card Fail ; 26(4): 316-323, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31809791

ABSTRACT

BACKGROUND: Worsening heart failure (HF) and health-related quality of life (HRQOL) have been shown to impact the decision to proceed with left ventricular assist device (LVAD) implantation, but little is known about how socioeconomic factors influence expressed patient preference for LVAD. METHODS AND RESULTS: Ambulatory patients with advanced systolic HF (n=353) reviewed written information about LVAD therapy and completed a brief survey to indicate whether they would want an LVAD to treat their current level of HF. Ordinal logistic regression analyses identified clinical and demographic predictors of LVAD preference. Higher New York Heart Association (NYHA) class, worse HRQOL measured by the Kansas City Cardiomyopathy Questionnaire, lower education level, and lower income were significant univariable predictors of patients wanting an LVAD. In the multivariable model, higher NYHA class (OR [odds ratio]: 1.43, CI [confidence interval]: 1.08-1.90, P = .013) and lower income level (OR: 2.10, CI: 1.18-3.76, P = .012 for <$40,000 vs >$80,000) remained significantly associated with wanting an LVAD. CONCLUSION: Among ambulatory patients with advanced systolic HF, treatment preference for LVAD was influenced by level of income independent of HF severity. Understanding the impact of socioeconomic factors on willingness to consider LVAD therapy may help tailor counseling towards individual needs.
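Ordinal logistic regression of a graded preference outcome can be sketched with statsmodels; the column names and category coding below are assumptions, not the study's actual variables:

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_preference_model(df: pd.DataFrame):
    """Ordinal logit of LVAD preference (ordered categories such as
    'no' < 'unsure' < 'yes') on NYHA class and income bracket.
    Column names are hypothetical."""
    y = df["lvad_preference"]  # an ordered pd.Categorical
    X = pd.get_dummies(df[["nyha_class", "income_bracket"]],
                       drop_first=True).astype(float)
    return OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)
```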


Subject(s)
Heart Failure , Heart-Assist Devices , Heart Failure/therapy , Humans , Prospective Studies , Quality of Life , Socioeconomic Factors , Treatment Outcome
6.
AJR Am J Roentgenol ; 213(5): W188-W193, 2019 11.
Article in English | MEDLINE | ID: mdl-31268731

ABSTRACT

OBJECTIVE. The objective of our study was to explore whether clinical factors historically associated with contrast material-induced kidney injury (contrast-induced nephrotoxicity [CIN]) increase risk after use of IV iodinated low-osmolality contrast material (LOCM) in patients with stage IIIb-V chronic kidney disease. MATERIALS AND METHODS. In this retrospective hypothesis-generating study, 1:1 propensity score matching was used to assess post-CT acute kidney injury (AKI) after unenhanced or contrast-enhanced CT in patients with stable estimated glomerular filtration rate (eGFR; 1112 patients with an eGFR of 30-44 mL/min/1.73 m2 and 86 patients with an eGFR < 30 mL/min/1.73 m2 and no dialysis). Historical risk factors including diabetes mellitus, age more than 60 years, hypertension, loop diuretic use, hydrochlorothiazide use, and cardiovascular disease were evaluated for modulation of CIN risk. Stepwise multivariable logistic regression was performed. RESULTS. Overall, IV LOCM was an independent risk factor for post-CT AKI in patients with an eGFR of less than 30 mL/min/1.73 m2 (odds ratio, 3.96 [95% CI, 1.29-12.21]; p = 0.016) but not in those with an eGFR of 30-44 mL/min/1.73 m2 (p = 0.24). In patients with an eGFR of less than 30 mL/min/1.73 m2, the tested covariates did not significantly modify the risk of CIN (p = 0.096-0.832). In patients with an eGFR of 30-44 mL/min/1.73 m2, risk of CIN emerged in those with cardiovascular disease (p = 0.015; number needed to harm from LOCM = 11 patients); the other tested cofactors had no significant effect (p = 0.108-0.822). CONCLUSION. CIN was observed when eGFR was less than 30 mL/min/1.73 m2. In those with an eGFR of 30-44 mL/min/1.73 m2, CIN was not observed with LOCM alone but was observed in the presence of cardiovascular disease. Other cofactors historically thought to increase CIN risk (e.g., diabetes mellitus) did not increase the risk of CIN. Further study is needed to determine whether these exploratory results are true associations.
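1:1 propensity score matching of contrast-enhanced to unenhanced CT encounters can be sketched as a greedy nearest-neighbor match on the logit of the propensity score (no caliper, without replacement); this is a generic illustration, not the study's actual matching specification, and the column names are assumptions:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_match_1to1(df: pd.DataFrame, treat_col: str, covariates: list):
    """Greedy 1:1 match on the logit of the propensity score, without
    replacement and without a caliper. Returns (treated, control) index pairs.
    """
    ps = LogisticRegression(max_iter=1000).fit(
        df[covariates].values, df[treat_col].values
    ).predict_proba(df[covariates].values)[:, 1]
    score = pd.Series(np.log(ps / (1 - ps)), index=df.index)   # logit of PS
    controls = set(df.index[df[treat_col] == 0])
    pairs = []
    for t in df.index[df[treat_col] == 1]:
        if not controls:
            break
        best = min(controls, key=lambda c: abs(score[c] - score[t]))
        pairs.append((t, best))
        controls.remove(best)                                  # without replacement
    return pairs
```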


Subject(s)
Contrast Media/adverse effects , Drug-Related Side Effects and Adverse Reactions , Glomerular Filtration Rate , Iodine/adverse effects , Kidney Diseases/chemically induced , Adult , Aged , Contrast Media/chemistry , Female , Humans , Injections, Intravenous , Male , Middle Aged , Osmolar Concentration , Propensity Score , Retrospective Studies , Risk Factors , Tomography, X-Ray Computed
7.
Am J Geriatr Psychiatry ; 26(4): 476-483, 2018 04.
Article in English | MEDLINE | ID: mdl-29066038

ABSTRACT

OBJECTIVE: In 2011-2012 the U.S. Food and Drug Administration (FDA) issued safety announcements cautioning providers against prescribing high doses of citalopram, given concerns about QT prolongation. The authors evaluated Veterans Affairs (VA) national trends in citalopram use and dose compared with alternative antidepressants after the FDA warnings. METHODS: Time series analyses estimated the effect of the FDA warnings on citalopram and alternative antidepressant use across three periods: before the first FDA warning in August 2011, after the 2011 FDA warning until the second warning in March 2012, and after the 2012 FDA warning. Adult outpatients in the national VA health system prescribed citalopram or alternative antidepressants from February 2010 to September 2013 were studied. Outcomes included outpatient use of high-dose citalopram (>40 mg daily, or >20 mg daily in adults aged >60 years), the proportion of patients prescribed citalopram, and differences between study periods. RESULTS: Between the first and second FDA warnings, high-dose citalopram use decreased by 2.0% per month among patients aged 18-60 years (p < 0.001) and by 1.9% per month among older adults (p < 0.001). After the second FDA warning in 2012, 30.7% of older patients remained on doses higher than the newly recommended dose of 20 mg. Reductions in overall use of citalopram were accompanied by significant increases in prescriptions of alternative antidepressants, with sertraline most widely prescribed. CONCLUSION: Although trends in high-dose citalopram use declined after the 2011-2012 FDA warnings, roughly one-third of older adults still remained on higher than recommended doses. Concomitant increases in sertraline and other antidepressant prescriptions suggest potential substitution of these medications for citalopram.
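A three-period time series analysis of this kind is commonly specified as a segmented (interrupted time series) regression with a level and slope change at each warning. A minimal sketch with statsmodels, assuming a monthly series with hypothetical columns month_index and pct_high_dose, and placeholder warning-month indices:

```python
import pandas as pd
import statsmodels.formula.api as smf

def segmented_trend(monthly: pd.DataFrame, warn1: int = 18, warn2: int = 25):
    """Segmented regression with a level and slope change at each FDA warning.

    warn1/warn2 are placeholder month indices for the August 2011 and
    March 2012 warnings relative to a February 2010 start.
    """
    d = monthly.copy()
    d["post1"] = (d["month_index"] >= warn1).astype(int)        # level change 1
    d["slope1"] = (d["month_index"] - warn1).clip(lower=0)      # slope change 1
    d["post2"] = (d["month_index"] >= warn2).astype(int)        # level change 2
    d["slope2"] = (d["month_index"] - warn2).clip(lower=0)      # slope change 2
    return smf.ols("pct_high_dose ~ month_index + post1 + slope1 + post2 + slope2",
                   data=d).fit()
```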


Subject(s)
Antidepressive Agents/administration & dosage , Citalopram/administration & dosage , Drug Labeling , Drug Prescriptions/statistics & numerical data , Hospitals, Veterans/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Antidepressive Agents/adverse effects , Citalopram/adverse effects , Female , Humans , Long QT Syndrome/chemically induced , Male , Middle Aged , Sertraline/administration & dosage , United States , United States Food and Drug Administration , Young Adult
8.
J Nerv Ment Dis ; 206(2): 155-158, 2018 02.
Article in English | MEDLINE | ID: mdl-29373459

ABSTRACT

Mental disorders have been linked to unemployment among veterans. Improving mental health conditions, such as depression, can improve veteran employment outcomes. This study compared mental health treatment among unemployed Operation Enduring Freedom (OEF; in Afghanistan) and Operation Iraqi Freedom (OIF; in Iraq) veterans and veterans from other service eras. The study included 3165 unemployed veterans from six Veterans Affairs medical centers with a positive screen indicating a possible mental disorder. Chi-squared tests and logistic regression analyses assessed whether service era was associated with mental health treatment. Unemployed OEF/OIF veterans were less likely to receive psychotropic medication and four or more psychotherapy sessions compared to veterans from other eras. Multivariable analyses controlling for age found that these differences were explained by younger age rather than service era. Younger unemployed veterans received fewer mental health services, which is concerning because reducing mental health symptoms may increase employment, and employment may in turn reduce symptoms; both are key factors in reintegration into civilian life.


Subject(s)
Mental Health Services/statistics & numerical data , Patient Acceptance of Health Care/statistics & numerical data , Unemployment/statistics & numerical data , Veterans/statistics & numerical data , Adolescent , Adult , Afghan Campaign 2001- , Aged , Female , Humans , Iraq War, 2003-2011 , Male , Mental Disorders/psychology , Mental Disorders/therapy , Middle Aged , Patient Acceptance of Health Care/psychology , Unemployment/psychology , United States , Young Adult
9.
Am J Gastroenterol ; 111(6): 838-44, 2016 06.
Article in English | MEDLINE | ID: mdl-27021199

ABSTRACT

OBJECTIVES: Access to subspecialty care may be difficult for patients with liver disease, but it is unknown whether access influences outcomes among this population. Our objectives were to determine rates and predictors of access to ambulatory gastrointestinal (GI) subspecialty care for patients with liver disease and to determine whether access to subspecialty GI care is associated with better survival. METHODS: We studied 28,861 patients within the Veterans Administration VISN 11 Liver Disease cohort who had an ICD-9-CM diagnosis code for liver disease from 1 January 2000 through 30 May 2011. Access was defined as a completed outpatient clinic visit with a gastroenterologist or hepatologist at any time after diagnosis. Multivariable logistic regression was used to determine predictors of access to a GI subspecialist. Survival curves were compared between those who did and those who did not see a specialist, with propensity score adjustment to account for other covariates that may affect access. RESULTS: Overall, 10,710 patients (37%) had a completed GI visit. On multivariable regression, older patients (odds ratio (OR) 0.98, P<0.001), those with more comorbidities (OR 0.98, P=0.01), and those living farther from a tertiary-care center (OR 0.998/mi, P<0.001) were less likely to be seen in clinic. Patients who were more likely to be seen included those who had hepatitis C (OR 1.5, P<0.001) or cirrhosis (OR 3.5, P<0.001) diagnoses prior to their initial visit. Patients with an ambulatory GI visit at any time after diagnosis were less likely to die at 5 years when compared with propensity-score-matched controls (hazard ratio 0.81, P<0.001). CONCLUSIONS: Access to ambulatory GI care was associated with improved 5-year survival for patients with liver disease. Innovative care coordination techniques may prove beneficial in extending access to care to liver disease patients.


Subject(s)
Ambulatory Care , Health Services Accessibility , Liver Diseases/therapy , Comorbidity , Female , Humans , Liver Function Tests , Male , Middle Aged , Propensity Score , Specialization , Survival Rate , United States , Veterans
10.
J Clin Psychopharmacol ; 36(5): 445-52, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27580492

ABSTRACT

OBJECTIVE: Although previous studies have assessed whether depression is a mortality risk factor, few have examined whether antidepressant medications (ADMs) influence mortality risk. METHODS: We estimated hazards of 1-year all-cause mortality associated with ADMs, with use occurring within 90 days of depression diagnosis, among 720,821 patients who received treatment in a Veterans Health Administration facility during fiscal year 2006. We addressed treatment selection biases using conventional Cox regression, propensity-stratified Cox regression (propensity score), and 2 forms of marginal structural models. Models accounted for multiple potential clinical and demographic confounders, and sensitivity analyses compared findings by antidepressant class. RESULTS: Antidepressant medication use compared with no use was associated with significantly lower hazards of 1-year mortality in Cox (hazard ratio [HR], 0.93; 95% confidence interval [CI], 0.90-0.97) and propensity score estimates (HR, 0.94; 95% CI, 0.91-0.98), whereas marginal structural model-based estimates showed no difference in mortality risk when the exposure was specified as "as-treated" in each 90-day interval of the 1-year follow-up (HR, 0.91; 95% CI, 0.66-1.26) but showed increased risk when specified as "intent-to-treat" (HR, 1.07; 95% CI, 1.02-1.13). CONCLUSIONS: Among patients treated with ADMs belonging to a single class in the first 90 days, there were no significant differences in 1-year all-cause mortality risks. When accounting for clinical and demographic characteristics and treatment selection bias, ADM use was associated with no excess harm.
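Of the approaches named here, the propensity-stratified Cox model is the simplest to sketch: fit a propensity model for ADM exposure, divide patients into propensity quintiles, and stratify the Cox baseline hazard by quintile. A minimal sketch with lifelines and scikit-learn, using hypothetical column names; the marginal structural models are not reproduced:

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

def propensity_stratified_cox(df: pd.DataFrame, covariates: list):
    """Cox model of 1-year mortality on ADM exposure, with the baseline
    hazard stratified by propensity-score quintile. Column names
    (adm_use, time_days, died) are hypothetical.
    """
    ps = LogisticRegression(max_iter=1000).fit(
        df[covariates], df["adm_use"]).predict_proba(df[covariates])[:, 1]
    d = df[["adm_use", "time_days", "died"]].copy()
    d["ps_quintile"] = pd.qcut(ps, 5, labels=False)
    cph = CoxPHFitter()
    cph.fit(d, duration_col="time_days", event_col="died", strata=["ps_quintile"])
    return cph
```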


Subject(s)
Antidepressive Agents/adverse effects , Depressive Disorder/drug therapy , Mortality , United States Department of Veterans Affairs/statistics & numerical data , Adolescent , Adult , Aged , Aged, 80 and over , Depressive Disorder/epidemiology , Female , Follow-Up Studies , Humans , Male , Middle Aged , Risk , United States/epidemiology , Young Adult
11.
Obes Surg ; 33(12): 3814-3828, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37940737

ABSTRACT

OBJECTIVE: Obesity and associated comorbidities, such as nonalcoholic fatty liver disease (NAFLD), impose a major healthcare burden worldwide. Bariatric surgery remains the most successful approach for sustained weight loss and the resolution of obesity-related complications. However, the impact of preexisting NAFLD on weight loss after bariatric surgery has not been previously studied. The goal of this study was to assess the impact of preexisting NAFLD on weight loss outcomes up to 5 years after weight loss surgery. RESEARCH DESIGN AND METHODS: Data from the Michigan Bariatric Surgery Cohort (MI-BASiC) were extracted to examine the effect of baseline NAFLD on weight loss outcomes. The cohort included a total of 714 patients older than 18 years of age undergoing gastric bypass (GB; 380 patients) or sleeve gastrectomy (SG; 334 patients) at the University of Michigan between January 2008 and November 2013. Repeated measures analysis was used to determine whether preexisting NAFLD was a predictor of weight loss outcomes up to 5 years post-surgery. RESULTS: We identified 221 patients with an established clinical diagnosis of NAFLD at baseline. Multivariable repeated measures analysis with adjustment for covariates showed that patients with preexisting NAFLD had a significantly lower percentage of total and excess weight loss compared to patients without preexisting NAFLD. Furthermore, our data show that baseline dyslipidemia is an indicator of the persistence of NAFLD after bariatric surgery. CONCLUSIONS: Our data show that patients' weight loss in response to bariatric surgery is affected by factors such as preexisting NAFLD. Additionally, we show that NAFLD may persist or recur in a subset of patients after surgery, and thus careful continued follow-up is recommended.


Subject(s)
Bariatric Surgery , Gastric Bypass , Non-alcoholic Fatty Liver Disease , Obesity, Morbid , Humans , Non-alcoholic Fatty Liver Disease/complications , Obesity, Morbid/surgery , Treatment Outcome , Obesity/surgery , Weight Loss/physiology , Gastrectomy
12.
Clin Diabetes Endocrinol ; 8(1): 7, 2022 Oct 24.
Article in English | MEDLINE | ID: mdl-36280885

ABSTRACT

BACKGROUND: Several systemic and sociodemographic factors have been associated with the development and progression of diabetic retinopathy (DR). However, there has been limited investigation of the potential role sociodemographic factors may play in augmenting systemic risk factors for DR. We hypothesized that age, sex, race, ethnicity, income, and insurance payor have an impact on hemoglobin A1c (HbA1c), body mass index, and systolic blood pressure, and therefore an upstream effect on the development of DR and vision-threatening forms of DR (VTDR). METHODS: Multivariable analysis of longitudinal electronic health record data at a large academic retina clinic was performed. Sociodemographic factors included race, ethnicity, income, and insurance payor. Systemic risk factors for DR included HbA1c, systolic blood pressure (SBP), and body mass index (BMI). VTDR was identified from encounter diagnostic codes indicating proliferative retinopathy or diabetic macular edema. Patient-reported primary address zip codes were used to approximate income level, stratified into quartiles. RESULTS: From 2016 to 2018, 3,470 patients with diabetes, totaling 11,437 visits, were identified. Black patients had higher HbA1c and SBP compared to White patients. White patients had higher BMI and SBP, and greater odds of VTDR, than patients of unknown/other race. Patients of Hispanic ethnicity had significantly higher SBP than non-Hispanic patients. Low-income patients had higher BMI and SBP, and greater odds of VTDR, than high-income patients. Medicaid recipients had greater odds of VTDR than those with Blue Care Network (BCN) and Blue Cross Blue Shield (BCBS) insurance. Medicaid and Medicare recipients had higher SBP compared to BCBS recipients. Finally, higher HbA1c and higher SBP were each associated with greater odds of VTDR. There were no differences in odds of VTDR between White and Black patients or between Hispanic and non-Hispanic patients. CONCLUSION: Significant associations exist between certain sociodemographic factors and well-known risk factors for DR. Income and payor were associated with increased severity of systemic risk factors and the presence of VTDR. These results warrant further investigation of how risk factor optimization and disease prevention may be further improved by targeted intervention on these modifiable sociodemographic factors.

13.
J Heart Lung Transplant ; 41(1): 104-112, 2022 01.
Article in English | MEDLINE | ID: mdl-34629234

ABSTRACT

INTRODUCTION: Patients with ambulatory advanced heart failure (HF) are increasingly considered for durable mechanical circulatory support (MCS) and heart transplantation, and their effective triage requires careful assessment of the clinical trajectory. METHODS: REVIVAL, a prospective, observational study, enrolled 400 ambulatory advanced HF patients from 21 MCS/transplant centers in 2015-2016. The study design included a clinical reassessment of Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) profile within 120 days after enrollment. The prognostic impact of a worsening INTERMACS Profile assigned by the treating physician was assessed at 1 year after the Early Relook. RESULTS: Early Relook was done in 325 of 400 patients (81%), of whom 24% had a worsened INTERMACS Profile, associated with longer HF history and worse baseline INTERMACS profile, but no difference in baseline LVEF (median 0.20), 6-minute walk, quality of life, or other baseline parameters. Early worsening predicted a higher rate of the combined primary endpoint of death, urgent MCS, or urgent transplant by 1 year after Early Relook (28% vs 15%), with a hazard ratio of 2.2 (95% CI 1.2-3.8; p = .006) even after adjusting for baseline INTERMACS Profile and Seattle HF Model score. Deterioration to urgent MCS occurred in 14% vs 5% (p = .006) during the year after Early Relook. CONCLUSIONS: Early Relook identified worsening of INTERMACS Profile in a substantial proportion of patients with ambulatory advanced HF, who had worse outcomes over the subsequent year. Early reassessment of ambulatory advanced HF patients should be performed to better define the trajectory of illness and inform triage to advanced therapies.


Subject(s)
Heart Failure/therapy , Aged , Female , Heart-Assist Devices , Humans , Male , Middle Aged , Prospective Studies , Registries , Risk Assessment , Severity of Illness Index
14.
Am J Epidemiol ; 173(12): 1380-7, 2011 Jun 15.
Article in English | MEDLINE | ID: mdl-21571871

ABSTRACT

In this paper, the authors describe a simple method for making longitudinal comparisons of alternative markers of a subsequent event. The method is based on the aggregate prediction gain from knowing whether or not a marker has occurred at any particular age. An attractive feature of the method is the exact decomposition of the measure into 2 components: 1) discriminatory ability, which is the difference in the mean time to the subsequent event for individuals for whom the marker has and has not occurred, and 2) prevalence factor, which is related to the proportion of individuals who are positive for the marker at a particular age. Development of the method was motivated by a study that evaluated proposed markers of the menopausal transition, where the markers are measures based on successive menstrual cycles and the subsequent event is the final menstrual period. Here, results from application of the method to 4 alternative proposed markers of the menopausal transition are compared with previous findings.
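The two components of the decomposition are concrete quantities: the discriminatory ability is a difference in mean times to the subsequent event between marker-positive and marker-negative individuals at a given age, and the prevalence factor relates to the proportion who are marker-positive. A sketch computing just these two ingredients at one age; how the paper combines them into the aggregate prediction gain (the exact form of the prevalence factor) is not given in the abstract, so no particular combination is assumed:

```python
import numpy as np

def marker_components(marker_positive, time_to_event):
    """Compute the two ingredients described above at a single age.

    marker_positive: boolean array, whether the marker has occurred by this age.
    time_to_event:   times from this age to the subsequent event
                     (e.g., the final menstrual period).

    Returns (discriminatory_ability, proportion_positive).
    """
    m = np.asarray(marker_positive, dtype=bool)
    t = np.asarray(time_to_event, dtype=float)
    discriminatory_ability = t[~m].mean() - t[m].mean()
    proportion_positive = m.mean()
    return discriminatory_ability, proportion_positive
```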


Subject(s)
Menopause , Reproductive History , Adult , Age Factors , Biomarkers , Female , Humans , Longitudinal Studies , Menstrual Cycle , Middle Aged , Predictive Value of Tests , Prevalence
15.
Am J Epidemiol ; 173(9): 1078-84, 2011 May 01.
Article in English | MEDLINE | ID: mdl-21422059

ABSTRACT

In longitudinal studies of developmental and disease processes, participants are followed prospectively with intermediate milestones identified as they occur. Frequently, studies enroll participants over a range of ages including ages at which some participants' milestones have already passed. Ages at milestones that occur prior to study entry are left censored if individuals are enrolled in the study or left truncated if they are not. The authors examined the bias incurred by ignoring these issues when estimating the distribution of age at milestones or the time between 2 milestones. Methods that account for left truncation and censoring are considered. Data on the menopausal transition are used to illustrate the problem. Simulations show that bias can be substantial and that standard errors can be severely underestimated in naïve analyses that ignore left truncation. Bias can be reduced when analyses account for left truncation, although the results are unstable when the fraction truncated is high. Simulations suggest that a better solution, when possible, is to modify the study design so that information on current status (i.e., whether or not a milestone has passed) is collected on all potential participants, analyzing those who are past the milestone at the time of recruitment as left censored rather than excluding such individuals from the analysis.
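The bias from ignoring left truncation can be illustrated with a small simulation: enroll subjects over a range of ages, drop those whose milestone already occurred, and compare a naive Kaplan-Meier fit with one that accounts for delayed entry. A sketch with lifelines, using a hypothetical age-at-milestone distribution:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 5000
age_at_milestone = 45 + 10 * rng.weibull(2.0, n)   # hypothetical ages at the milestone
age_at_entry = rng.uniform(40, 55, n)              # enrollment spread over a range of ages

# Left truncation: anyone whose milestone occurred before study entry never enrolls.
enrolled = age_at_milestone > age_at_entry

# Naive fit: ignores the delayed entry of the enrolled subjects (biased upward).
naive = KaplanMeierFitter().fit(
    age_at_milestone[enrolled], event_observed=np.ones(enrolled.sum()))

# Fit accounting for left truncation via the `entry` argument.
adjusted = KaplanMeierFitter().fit(
    age_at_milestone[enrolled], event_observed=np.ones(enrolled.sum()),
    entry=age_at_entry[enrolled])

print(naive.median_survival_time_, adjusted.median_survival_time_)
```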


Subject(s)
Bias , Biomedical Research , Longitudinal Studies , Research Design , Disease Progression , Female , Human Development , Humans , Menopause , Middle Aged
16.
J Am Heart Assoc ; 10(14): e019901, 2021 07 20.
Article in English | MEDLINE | ID: mdl-34250813

ABSTRACT

Background Heart failure (HF) imposes significant burden on patients and caregivers. Longitudinal data on caregiver health-related quality of life (HRQOL) and burden in ambulatory advanced HF are limited. Methods and Results Ambulatory patients with advanced HF (n=400) and their participating caregivers (n=95) enrolled in REVIVAL (Registry Evaluation of Vital Information for VADs [Ventricular Assist Devices] in Ambulatory Life) were followed up for 24 months, or until patient death, left ventricular assist device implantation, heart transplantation, or loss to follow-up. Caregiver HRQOL (EuroQol Visual Analog Scale) and burden (Oberst Caregiving Burden Scale) did not change significantly from baseline to follow-up. At time of caregiver enrollment, better patient HRQOL by Kansas City Cardiomyopathy Questionnaire was associated with better caregiver HRQOL (P=0.007) and less burden by both time spent (P<0.0001) and difficulty (P=0.0007) of caregiving tasks. On longitudinal analyses adjusted for baseline values, better patient HRQOL (P=0.034) and being a married caregiver (P=0.016) were independently associated with better caregiver HRQOL. Patients with participating caregivers (versus without) were more likely to prefer left ventricular assist device therapy over time (odds ratio, 1.43; 95% CI, 1.03-1.99; P=0.034). Among patients with participating caregivers, those with nonmarried (versus married) caregivers were at higher composite risk of HF hospitalization, death, heart transplantation or left ventricular assist device implantation (hazard ratio, 2.99; 95% CI, 1.29-6.96; P=0.011). Conclusions Patient and caregiver characteristics may impact their HRQOL and other health outcomes over time. Understanding the patient-caregiver relationship may better inform medical decision making and outcomes in ambulatory advanced HF.


Subject(s)
Caregivers/psychology , Heart Failure/therapy , Quality of Life , Aged , Cost of Illness , Female , Heart Transplantation , Heart-Assist Devices , Hospitalization , Humans , Male , Middle Aged , Multivariate Analysis , Prospective Studies , Registries , Regression Analysis
17.
PLoS One ; 15(4): e0231508, 2020.
Article in English | MEDLINE | ID: mdl-32298308

ABSTRACT

OBJECTIVE: To determine whether findings of "cartilage icing" and chondrocalcinosis on knee radiography can differentiate between gout and calcium pyrophosphate deposition (CPPD). METHODS: IRB approval was obtained and informed consent was waived for this retrospective study. Electronic medical records from over 2.3 million patients were searched for keywords to identify subjects with knee aspiration-proven cases of gout or CPPD. Radiographs were reviewed by two fellowship-trained musculoskeletal radiologists in randomized order, blinded to the patients' diagnoses. Images were evaluated for the presence or absence of cartilage icing, chondrocalcinosis, tophi, gastrocnemius tendon calcification, and joint effusion. Descriptive statistics, sensitivity, specificity, positive and negative predictive values, and accuracy were calculated. RESULTS: From 49 knee radiographic studies in 46 subjects (31 males and 15 females; mean age 66 ± 13 years), 39% (19/49) showed gout and 61% (30/49) CPPD on aspiration. On knee radiographs, cartilage icing showed a higher sensitivity for CPPD than for gout (53-67% and 26%, respectively). Chondrocalcinosis also showed a higher sensitivity for CPPD than for gout (50-57% versus 5%), with 95% specificity and 94% positive predictive value for the diagnosis of CPPD versus gout. Soft tissue tophus-like opacities were present at the patellar tendon in gout (5%, 1/19) and at the popliteus groove in CPPD (15%, 4/27). Gastrocnemius tendon calcification was present in 30% (8/27) of subjects with CPPD and 5% (1/19) of subjects with gout. CONCLUSION: In subjects with joint aspiration-proven crystal disease of the knee, the radiographic finding of cartilage icing was seen in both gout and CPPD. Chondrocalcinosis (overall and hyaline cartilage) as well as gastrocnemius tendon calcification positively correlated with the diagnosis of CPPD over gout.
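The reported test characteristics follow from a 2x2 table of each radiographic finding against the aspiration diagnosis. A minimal sketch; the example counts are illustrative (chosen to be consistent with the reported 95% specificity and 94% PPV for chondrocalcinosis), not the study's actual per-reader tables:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and accuracy from 2x2 counts,
    e.g., chondrocalcinosis as a marker of CPPD versus gout."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only: 30 CPPD and 19 gout studies.
print(diagnostic_metrics(tp=16, fp=1, fn=14, tn=18))
```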


Subject(s)
Calcinosis/diagnostic imaging , Calcium Pyrophosphate/metabolism , Cartilage, Articular/diagnostic imaging , Chondrocalcinosis/diagnostic imaging , Gout/diagnostic imaging , Knee Joint/diagnostic imaging , Aged , Calcinosis/diagnosis , Cartilage, Articular/pathology , Chondrocalcinosis/diagnosis , Diagnosis, Differential , Female , Gout/diagnosis , Humans , Male , Radiography , Retrospective Studies
18.
ESC Heart Fail ; 7(5): 2074-2081, 2020 10.
Article in English | MEDLINE | ID: mdl-32578953

ABSTRACT

AIMS: Statins improve survival and reduce rejection and cardiac allograft vasculopathy after heart transplantation (HT). The impact of different statin intensities on clinical outcomes has never been assessed. We set out to determine the impact of statin exposure on cardiovascular outcomes after HT. METHODS AND RESULTS: We performed a retrospective study of 346 adult patients who underwent HT from 2006 to 2018. Statin intensity was determined longitudinally after HT based on American College of Cardiology/American Heart Association (ACC/AHA) guidelines. The primary outcome was the time to the first primary event, defined as the composite of heart failure hospitalization, myocardial infarction, revascularization, and all-cause mortality. Secondary outcomes included time to significant rejection and time to moderate-severe cardiac allograft vasculopathy. Adverse events were evaluated for subjects on high-intensity statin therapy. A Cox proportional hazards model was used to evaluate the relationship between clinical variables, statin intensity, and outcomes. Most subjects were treated with low-intensity statin therapy, although this declined from 89.9% of the population at 1 month after HT to 42.8% at 5 years after HT. History of ischaemic cardiomyopathy, significant acute rejection, older donor age, and lower statin intensity (p ≤ 0.001) were associated with reduced time to the primary outcome in a multivariable Cox model. Greater intensity of statin therapy was most beneficial early after HT. There were no statin-related adverse events among the 14 subjects on high-intensity statin therapy. CONCLUSIONS: Greater statin intensity was associated with a reduction in adverse cardiovascular outcomes after HT.


Subject(s)
Heart Diseases , Heart Transplantation , Hydroxymethylglutaryl-CoA Reductase Inhibitors , Adult , American Heart Association , Humans , Retrospective Studies , United States/epidemiology
19.
Int J Nurs Stud ; 104: 103531, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32062053

ABSTRACT

BACKGROUND: In 2010, the Veterans Health Administration Office of Nursing Services (VHA ONS) issued a Staffing Methodology (SM) Directive, standardizing the method of determining appropriate nurse staffing for VHA facilities. OBJECTIVES: To assess associations between the Directive, nurse staffing trends, and healthcare-associated infections. RESEARCH DESIGN: We conducted multi-level interrupted time series analyses of nurse staffing trends and the rates of two healthcare-associated infections before and after implementation of the Directive, October 1, 2008 - June 30, 2014. SUBJECTS: Acute care, critical care, mental health acute care, and long-term care nursing units (called Community Living Centers, CLC, in VHA) among 285 VHA facilities were included in the nurse staffing trend analyses, while acute and critical care units in 123 facilities were used in the analysis of infection rates. MEASURES: Monthly rates were calculated at the facility unit level and included nursing hours per patient day (NHPPD) for all nursing personnel and the number of catheter-associated urinary tract infections (CAUTI) and central line-associated bloodstream infections (CLABSI) per 1000 device days. RESULTS: Nursing hours per patient day increased in both time periods; however, the differential change in the rate of nursing hours per patient day following implementation of the Directive was not statistically significant. On average, we found a statistically significant decrease of 0.05 units in post-Directive central line-associated bloodstream infection rates associated with a unit increase in nursing hours per patient day. CONCLUSIONS: System-wide implementation of Staffing Methodology may be one contributing factor impacting patient outcomes.
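The monthly measures described here are simple unit-level rates. A sketch of the calculation, assuming hypothetical column names for one row per nursing unit per month:

```python
import pandas as pd

def monthly_unit_measures(unit_months: pd.DataFrame) -> pd.DataFrame:
    """Rates per 1000 device days (CAUTI, CLABSI) and nursing hours per
    patient day, as described above. Assumes (hypothetical) columns:
    cauti_count, catheter_days, clabsi_count, central_line_days,
    nursing_hours, patient_days.
    """
    d = unit_months.copy()
    d["cauti_per_1000"] = 1000 * d["cauti_count"] / d["catheter_days"]
    d["clabsi_per_1000"] = 1000 * d["clabsi_count"] / d["central_line_days"]
    d["nhppd"] = d["nursing_hours"] / d["patient_days"]   # nursing hours per patient day
    return d
```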


Subject(s)
Cross Infection/epidemiology , Interrupted Time Series Analysis , Nursing Staff, Hospital/statistics & numerical data , Personnel Staffing and Scheduling/statistics & numerical data , Delivery of Health Care , Humans
20.
J Clin Psychiatry ; 80(2)2019 02 05.
Article in English | MEDLINE | ID: mdl-30840786

ABSTRACT

OBJECTIVE: Prescriptions for sedative hypnotics are routinely initiated and renewed to treat insomnia, despite evidence supporting nonpharmacologic treatments as comparable and more favorable over time. We used national Veterans Health Administration data to assess patient characteristics associated with high-dose and long-term zolpidem use. METHOD: The study included outpatients with new zolpidem prescriptions (January 1, 2013, to June 3, 2014). We defined high-dose use as use of doses above those recommended in the 2013 FDA safety warning (> 5 mg for women, > 10 mg for men) and defined long-term use as at least 180 days of continued supply. We fit separate logistic regression models by sex, adjusting for facility, to evaluate how patient factors predicted high-dose and long-term use. RESULTS: Of 139,525 new zolpidem users, < 1% of men and 41% of women used high doses within 180 days of initiation, and 20% continued to use zolpidem long term. Prior-year use of other sleep medications was associated with both high-dose and long-term use. Substance abuse/dependence was associated with high-dose use in women (odds ratio = 1.20, P < .001). Although long-term use was less likely in those over the age of 85 years, about 1 in 5 users aged 65 to 85 continued long term. In both sexes, individuals of Hispanic ethnicity and nonwhite races were less likely to use zolpidem long term, whereas those with ICD-9-CM-defined psychiatric and sleep disorder diagnoses were more likely to do so. CONCLUSIONS: Zolpidem use at a higher-than-recommended dose was common among women who were new zolpidem users. In both sexes, 1 in 5 users continued to use zolpidem for at least 180 days. Efforts to improve access to effective nonpharmacologic treatment alternatives may benefit from attention to subpopulations at higher risk of high-dose and long-term use.
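The dose and duration definitions above translate into simple flags. A sketch over a hypothetical per-patient table; the column names are assumptions:

```python
import pandas as pd

def flag_zolpidem_use(rx: pd.DataFrame) -> pd.DataFrame:
    """Flag high-dose and long-term zolpidem use per the definitions above.

    Assumes (hypothetical) columns: sex ('F'/'M'), daily_dose_mg,
    and days_supplied (cumulative continued supply).
    """
    d = rx.copy()
    dose_ceiling = d["sex"].map({"F": 5, "M": 10})      # 2013 FDA recommended maxima
    d["high_dose"] = d["daily_dose_mg"] > dose_ceiling
    d["long_term"] = d["days_supplied"] >= 180
    return d
```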


Subject(s)
Sleep Aids, Pharmaceutical/therapeutic use , Substance-Related Disorders/epidemiology , Zolpidem/therapeutic use , Adult , Age Factors , Aged , Aged, 80 and over , Databases, Factual/statistics & numerical data , Drug Administration Schedule , Female , Humans , Male , Middle Aged , Sex Factors , Time Factors , United States/epidemiology , Veterans/statistics & numerical data , Young Adult