Results 1 - 20 of 114
1.
J Gen Intern Med ; 2024 Oct 07.
Article in English | MEDLINE | ID: mdl-39375318

ABSTRACT

IMPORTANCE: Traditional risk prediction and risk adjustment models have focused on clinical characteristics, but accounting for social determinants of health (SDOH) and complex health conditions could improve understanding of sepsis outcomes and our ability to predict outcomes, treat patients, and assess quality of care. OBJECTIVE: To evaluate the impact of SDOH and health scales in sepsis mortality risk prediction and hospital performance assessment. DESIGN: Observational cohort study. SETTING: One hundred twenty-nine hospitals in the nationwide Veterans Affairs (VA) Healthcare System between 2017 and 2021. PARTICIPANTS: Veterans admitted through emergency departments with community-acquired sepsis. EXPOSURES: Individual- and community-level SDOH (race, housing instability, marital status, Area Deprivation Index [ADI], and rural residence) and two health scales (the Care Assessment Need [CAN] score and Claims-Based Frailty Index [CFI]). MAIN OUTCOMES AND MEASURES: The primary outcome was 90-day mortality from emergency department arrival; secondary outcomes included 30-day mortality and in-hospital mortality. RESULTS: Among 144,889 patients admitted to the hospital with community-acquired sepsis, 139,080 were men (96.0%), median (IQR) age was 71 (64-77) years, and median (IQR) ADI was 60 (38-81). Multivariable regression models had good calibration and discrimination across models that adjusted for different sets of variables (e.g., AUROC, 0.782; Brier score, 1.33; and standardized mortality rate, 1.00). Risk-adjusted hospital performance was similar across all models. Among 129 VA hospitals, three hospitals shifted from the lowest or highest quintile of performance when comparing models that excluded SDOH to models that adjusted for all variables. Models that adjusted for ADI reported odds ratios (CI) of 1.00 (1.00-1.00), indicating that ADI does not significantly predict sepsis mortality in this cohort of patients. 
CONCLUSION AND RELEVANCE: In patients with community-acquired sepsis, adjusting for community SDOH variables such as ADI did not improve 90-day sepsis mortality predictions in mortality models and did not substantively alter hospital performance within the VA Healthcare System. Understanding the role of SDOH in risk prediction and risk adjustment models is vital because it could prevent hospitals from being negatively evaluated for treating less advantaged patients. However, we found that in VA hospitals, the potential impact of SDOH on 90-day sepsis mortality was minimal.
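The discrimination and calibration metrics reported above (AUROC, Brier score) can be computed directly from predicted probabilities and observed outcomes. A minimal pure-Python sketch with made-up data (not the study's cohort):

```python
def auroc(y_true, y_prob):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a randomly chosen event outranks a non-event."""
    pos = [p for p, y in zip(y_prob, y_true) if y == 1]
    neg = [p for p, y in zip(y_prob, y_true) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(y_true, y_prob):
    """Mean squared difference between predicted probability and the
    0/1 outcome; 0 is perfect calibration and discrimination."""
    return sum((p - y) ** 2 for p, y in zip(y_prob, y_true)) / len(y_true)

# Hypothetical predictions for six patients (1 = died within 90 days)
y = [0, 0, 1, 0, 1, 1]
p = [0.10, 0.20, 0.80, 0.30, 0.60, 0.90]
print(auroc(y, p))  # 1.0 here: every event is ranked above every non-event
print(brier(y, p))
```

The abstract's "Brier score, 1.33" is presumably on a rescaled (per-100) basis, since the raw score is bounded by 1.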

2.
J Stroke Cerebrovasc Dis ; 33(12): 108087, 2024 Oct 12.
Article in English | MEDLINE | ID: mdl-39401577

ABSTRACT

OBJECTIVE: To compare changes in cognitive trajectories after stroke between younger (18-64) and older (65+) adults, accounting for pre-stroke cognitive trajectories. MATERIALS AND METHODS: Pooled cohort study using individual participant data from 3 US cohorts (1971-2019): the Atherosclerosis Risk In Communities Study (ARIC), Framingham Offspring Study (FOS), and REasons for Geographic And Racial Differences in Stroke Study (REGARDS). Linear mixed effect models evaluated the association between age and the initial change (intercept) and rate of change (slope) in cognition after compared to before stroke. Outcomes were global cognition (primary), memory, and executive function. RESULTS: We included 1,292 participants with stroke; 197 younger (47.2% female, 32.5% Black race) and 1,095 older (50.2% female, 46.4% Black race). Median (IQR) age at stroke was 59.7 (56.6-61.7) years (younger group) and 75.2 (70.5-80.2) years (older group). Compared to the younger group, older participants had greater declines in global cognition (-1.69 points [95% CI, -2.82 to -0.55] greater), memory (-1.05 points [95% CI, -1.92 to -0.17] greater), and executive function (-3.72 points [95% CI, -5.23 to -2.21] greater) initially after stroke. Older age was associated with faster declines in global cognition (-0.18 points per year [95% CI, -0.36 to -0.01] faster) and executive function (-0.16 points per year [95% CI, -0.26 to -0.06] for every 10 years of higher age), but not memory (-0.006 [95% CI, -0.15 to 0.14]), after compared to before stroke. CONCLUSION: Older age was associated with greater post-stroke cognitive declines, accounting for differences in pre-stroke cognitive trajectories between the old and the young.
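The pre- versus post-stroke intercept and slope changes described above are estimated with linear mixed effect models; the fixed-effect structure can be sketched with ordinary least squares on a piecewise-linear design matrix. A toy illustration on synthetic data (no random effects, purely to show the parameterization):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cognition scores over years t, stroke at t = 0.
# True model: pre-stroke slope -0.5/yr, acute drop -2.0 at stroke,
# and an additional post-stroke slope of -0.3/yr.
t = np.linspace(-5, 5, 200)
post = (t >= 0).astype(float)
y = 60 - 0.5 * t - 2.0 * post - 0.3 * post * t + rng.normal(0, 0.1, t.size)

# Columns: intercept, pre-stroke slope, post-stroke drop, slope change
X = np.column_stack([np.ones_like(t), t, post, post * t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # approximately [60, -0.5, -2.0, -0.3]
```

In the study itself these fixed effects are further interacted with age, and random effects capture person-level trajectories.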

3.
J Gen Intern Med ; 38(3): 699-706, 2023 02.
Article in English | MEDLINE | ID: mdl-35819683

ABSTRACT

BACKGROUND: Patterns of opioid use vary, including prescribed use without aberrancy, limited aberrant use, and potential opioid use disorder (OUD). In clinical practice, similar opioid-related International Classification of Disease (ICD) codes are applied across this spectrum, limiting understanding of how groups vary by sociodemographic factors, comorbidities, and long-term risks. OBJECTIVE: (1) Examine how Veterans assigned opioid abuse/dependence ICD codes vary at diagnosis and with respect to long-term risks. (2) Determine whether those with limited aberrant use share more similarities to likely OUD vs those using opioids as prescribed. DESIGN: Longitudinal observational cohort study. PARTICIPANTS: National sample of Veterans categorized as having (1) likely OUD, (2) limited aberrant opioid use, or (3) prescribed, non-aberrant use based upon enhanced medical chart review. MAIN MEASURES: Comparison of sociodemographic and clinical factors at diagnosis and rates of age-adjusted mortality, non-fatal opioid overdose, and hospitalization after diagnosis. An exploratory machine learning analysis investigated how closely those with limited aberrant use resembled those with likely OUD. KEY RESULTS: Veterans (n = 483) were categorized as likely OUD (62.1%), limited aberrant use (17.8%), and prescribed, non-aberrant use (20.1%). Age, proportion experiencing homelessness, chronic pain, anxiety disorders, and non-opioid substance use disorders differed by group. All-cause mortality was high (44.2 per 1000 person-years (95% CI 33.9, 56.7)). Hospitalization rates per 1000 person-years were highest in the likely OUD group (831.5 (95% CI 771.0, 895.5)), compared to limited aberrant use (739.8 (95% CI 637.1, 854.4)) and prescribed, non-aberrant use (411.9 (95% CI 342.6, 490.4)). The exploratory analysis reclassified 29.1% of those with limited aberrant use as having likely OUD with high confidence.
CONCLUSIONS: Veterans assigned opioid abuse/dependence ICD codes are heterogeneous and face variable long-term risks. Limited aberrant use confers increased risk compared to no aberrant use, and some may already have OUD. Findings warrant future investigation of this understudied population.


Subject(s)
Ill-Housed Persons; Opiate Overdose; Opioid-Related Disorders; Veterans; Humans; Opioid-Related Disorders/diagnosis; Opioid-Related Disorders/epidemiology; Opioid-Related Disorders/drug therapy; Analgesics, Opioid/adverse effects; Opiate Overdose/drug therapy
4.
Health Care Manag Sci ; 26(1): 93-116, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36284034

ABSTRACT

Preventing chronic diseases is an essential aspect of medical care. To prevent chronic diseases, physicians focus on monitoring their risk factors and prescribing the necessary medication. The optimal monitoring policy depends on the patient's risk factors and demographics. Monitoring too frequently may be unnecessary and costly; on the other hand, monitoring the patient infrequently means the patient may forgo needed treatment and experience adverse events related to the disease. We propose a finite horizon and finite-state Markov decision process to define monitoring policies. To build our Markov decision process, we estimate stochastic models based on longitudinal observational data from electronic health records for a large cohort of patients seen in the national U.S. Veterans Affairs health system. We use our model to study policies for whether or when to assess the need for cholesterol-lowering medications. We further use our model to investigate the role of gender and race on optimal monitoring policies.
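A finite-horizon, finite-state Markov decision process of the kind described is typically solved by backward induction over decision epochs. A toy sketch with a hypothetical two-state, two-action monitoring problem (the transition probabilities and rewards below are invented for illustration, not the paper's estimated model):

```python
# States: 0 = low risk, 1 = high risk. Actions: 0 = wait, 1 = monitor.
# P[a][s][s'] = transition probability; r[a][s] = immediate reward.
P = [
    [[0.90, 0.10], [0.20, 0.80]],  # wait: high risk tends to persist
    [[0.95, 0.05], [0.50, 0.50]],  # monitor: catches and treats high risk
]
r = [[1.0, -2.0],   # wait: no cost, but harmful if patient is high risk
     [0.5, -0.5]]   # monitor: small cost, mitigates high-risk harm

T = 10  # number of decision epochs in the finite horizon

V = [0.0, 0.0]  # terminal value at the end of the horizon
policy = []
for _ in range(T):
    # Q[s][a]: expected value of taking action a in state s now,
    # then acting optimally for the remaining epochs.
    Q = [[r[a][s] + sum(P[a][s][sp] * V[sp] for sp in (0, 1))
          for a in (0, 1)] for s in (0, 1)]
    policy.append([max((0, 1), key=lambda a: Q[s][a]) for s in (0, 1)])
    V = [max(Q[s]) for s in (0, 1)]
policy.reverse()  # policy[t][s] = optimal action at epoch t in state s
print(V, policy)
```

The paper's actual model has a much richer state space (risk factors, demographics) estimated from VA electronic health records; the solution structure is the same.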


Subject(s)
Anticholesteremic Agents; Cardiovascular Diseases; Humans; Cardiovascular Diseases/prevention & control; Risk Factors
5.
JAMA ; 330(8): 715-724, 2023 08 22.
Article in English | MEDLINE | ID: mdl-37606674

ABSTRACT

Importance: Aspirin is an effective and low-cost option for reducing atherosclerotic cardiovascular disease (CVD) events and improving mortality rates among individuals with established CVD. To guide efforts to mitigate the global CVD burden, there is a need to understand current levels of aspirin use for secondary prevention of CVD. Objective: To report and evaluate aspirin use for secondary prevention of CVD across low-, middle-, and high-income countries. Design, Setting, and Participants: Cross-sectional analysis using pooled, individual participant data from nationally representative health surveys conducted between 2013 and 2020 in 51 low-, middle-, and high-income countries. Included surveys contained data on self-reported history of CVD and aspirin use. The sample of participants included nonpregnant adults aged 40 to 69 years. Exposures: Countries' per capita income levels and world region; individuals' socioeconomic demographics. Main Outcomes and Measures: Self-reported use of aspirin for secondary prevention of CVD. Results: The overall pooled sample included 124 505 individuals. The median age was 52 (IQR, 45-59) years, and 50.5% (95% CI, 49.9%-51.1%) were women. A total of 10 589 individuals had a self-reported history of CVD (8.1% [95% CI, 7.6%-8.6%]). Among individuals with a history of CVD, aspirin use for secondary prevention in the overall pooled sample was 40.3% (95% CI, 37.6%-43.0%). By income group, estimates were 16.6% (95% CI, 12.4%-21.9%) in low-income countries, 24.5% (95% CI, 20.8%-28.6%) in lower-middle-income countries, 51.1% (95% CI, 48.2%-54.0%) in upper-middle-income countries, and 65.0% (95% CI, 59.1%-70.4%) in high-income countries. Conclusion and Relevance: Worldwide, aspirin is underused in secondary prevention, particularly in low-income countries. National health policies and health systems must develop, implement, and evaluate strategies to promote aspirin therapy.


Subject(s)
Aspirin; Cardiovascular Diseases; Secondary Prevention; Adult; Aged; Female; Humans; Male; Middle Aged; Aspirin/therapeutic use; Cardiovascular Diseases/epidemiology; Cardiovascular Diseases/mortality; Cardiovascular Diseases/prevention & control; Cross-Sectional Studies; Developed Countries/economics; Developed Countries/statistics & numerical data; Developing Countries/economics; Developing Countries/statistics & numerical data; Secondary Prevention/economics; Secondary Prevention/methods; Secondary Prevention/statistics & numerical data; Self Report/economics; Self Report/statistics & numerical data; Cardiovascular Agents/therapeutic use
6.
J Thromb Thrombolysis ; 54(4): 639-646, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35699872

ABSTRACT

Recent trials suggest that aspirin for primary prevention may do more harm than good for some, including adults over 70 years of age. We sought to assess how primary care providers (PCPs) use aspirin for primary prevention in older patients and to identify barriers to use according to recent guidelines, which recommend against routine use in patients over age 70. We surveyed PCPs about whether they would recommend aspirin in clinical vignettes of a 75-year-old patient with a 10-year atherosclerotic cardiovascular disease risk of 25%. We also queried perceived difficulty following guideline recommendations, as well as perceived barriers and facilitators. We obtained responses from 372 PCPs (47.9% response). In the patient vignette, 45.4% of clinicians recommended aspirin use, which did not vary by whether the patient was using aspirin initially (p = 0.21); 41.7% believed aspirin was beneficial. Perceived barriers to guideline-based aspirin use included concern about patients being upset (41.6%), possible malpractice claims (25.0%), and not having a strategy for discussing aspirin use (24.5%). The estimated adjusted probability of rating the guideline as "hard to follow" was higher in clinicians who believed aspirin was beneficial (29.4% vs. 8.0%; p < 0.001) and who worried the patient would be upset if told to stop aspirin (26.7% vs. 12.5%; p = 0.001). Internists vary considerably in their recommendations for aspirin use for primary prevention in older patients. A high proportion of PCPs continue to believe aspirin is beneficial in this setting. These results can inform de-implementation efforts to optimize evidence-based aspirin use.


Subject(s)
Aspirin; Physicians; Humans; Aged; Aged, 80 and over; Aspirin/therapeutic use; Attitude of Health Personnel; Surveys and Questionnaires
7.
BMC Health Serv Res ; 22(1): 739, 2022 Jun 03.
Article in English | MEDLINE | ID: mdl-35659234

ABSTRACT

BACKGROUND: Hospital-specific template matching (HS-TM) is a newer method of hospital performance assessment. OBJECTIVE: To assess the interpretability, credibility, and usability of HS-TM-based vs. regression-based performance assessments. RESEARCH DESIGN: We surveyed hospital leaders (January-May 2021) and completed follow-up semi-structured interviews. Surveys included four hypothetical performance assessment vignettes, with method (HS-TM, regression) and hospital mortality randomized. SUBJECTS: Nationwide Veterans Affairs Chiefs of Staff, Medicine, and Hospital Medicine. MEASURES: Correct interpretation; self-rated confidence in interpretation; and self-rated trust in assessment (via survey). Concerns about credibility and main uses (via thematic analysis of interview transcripts). RESULTS: In total, 84 participants completed 295 survey vignettes. Respondents correctly interpreted 81.8% HS-TM vs. 56.5% regression assessments, p < 0.001. Respondents "trusted the results" for 70.9% HS-TM vs. 58.2% regression assessments, p = 0.03. Nine concerns about credibility were identified: inadequate capture of case-mix and/or illness severity; inability to account for specialized programs (e.g., transplant center); comparison to geographically disparate hospitals; equating mortality with quality; lack of criterion standards; low power; comparison to dissimilar hospitals; generation of rankings; and lack of transparency. Five concerns were equally relevant to both methods, one more pertinent to HS-TM, and three more pertinent to regression. Assessments were mainly used to trigger further quality evaluation (a "check oil light") and motivate behavior change. CONCLUSIONS: HS-TM-based performance assessments were more interpretable and more credible to VA hospital leaders than regression-based assessments. However, leaders had a similar set of concerns related to credibility for both methods and felt both were best used as a screen for further evaluation.


Subject(s)
Diagnosis-Related Groups; Hospitals; Delivery of Health Care; Hospital Mortality; Humans; Surveys and Questionnaires
8.
Ann Intern Med ; 174(12): 1666-1673, 2021 12.
Article in English | MEDLINE | ID: mdl-34606315

ABSTRACT

BACKGROUND: There are 2 approaches to intensifying antihypertensive treatment when target blood pressure is not reached, adding a new medication and maximizing dose. Which strategy is better is unknown. OBJECTIVE: To assess the frequency of intensification by adding a new medication versus maximizing dose, as well as the association of each method with intensification sustainability and follow-up systolic blood pressure (SBP). DESIGN: Large-scale, population-based, retrospective cohort study. Observational data were used to emulate a target trial with 2 groups, new medication and maximizing dose, who underwent intensification of their drug regimen. SETTING: Veterans Health Administration (2011 to 2013). PATIENTS: Veterans aged 65 years or older with hypertension, an SBP of 130 mm Hg or higher, and at least 1 antihypertensive medication at less than the maximum dose. MEASUREMENTS: The following 2 intensification approaches were emulated: adding a new medication, defined as a total dose increase with new medication, and maximizing dose, defined as a total dose increase without new medication. Inverse probability weighting was used to assess the observational effectiveness of the intensification approach on sustainability of intensified treatment and follow-up SBP at 3 and 12 months. RESULTS: Among 178 562 patients, 45 575 (25.5%) had intensification by adding a new medication and 132 987 (74.5%) by maximizing dose. Compared with maximizing dose, adding a new medication was associated with less intensification sustainability (average treatment effect, -15.2% [95% CI, -15.7% to -14.6%] at 3 months and -15.1% [CI, -15.6% to -14.5%] at 12 months) but a slightly larger reduction in mean SBP (-0.8 mm Hg [CI, -1.2 to -0.4 mm Hg] at 3 months and -1.1 mm Hg [CI, -1.6 to -0.6 mm Hg] at 12 months). LIMITATION: Observational data; largely male population. 
CONCLUSION: Adding a new antihypertensive medication was less frequent and was associated with less intensification sustainability but slightly larger reductions in SBP. Trials would provide the most definitive support for our findings. PRIMARY FUNDING SOURCE: National Institute on Aging and Veterans Health Administration.
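Inverse probability weighting, as used above to emulate the target trial, reweights each patient by the inverse of the probability of receiving the treatment they actually received, so that the weighted treated and untreated groups resemble the full cohort. A minimal sketch with hypothetical data and known propensity scores (in practice the propensities are themselves estimated from covariates):

```python
def ipw_ate(treated, outcome, propensity):
    """Horvitz-Thompson inverse-probability-weighted estimate of the
    average treatment effect E[Y(1)] - E[Y(0)]."""
    n = len(treated)
    y1 = sum(a * y / e
             for a, y, e in zip(treated, outcome, propensity)) / n
    y0 = sum((1 - a) * y / (1 - e)
             for a, y, e in zip(treated, outcome, propensity)) / n
    return y1 - y0

# Hypothetical cohort: a = 1 means "intensified by adding a new medication"
a = [1, 1, 0, 0, 1, 0]
y = [130, 128, 140, 138, 125, 142]   # follow-up SBP, mm Hg
e = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]   # propensity of being treated
print(ipw_ate(a, y, e))  # with uniform e = 0.5 this reduces to the
                         # simple difference in group means
```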


Subject(s)
Antihypertensive Agents/therapeutic use; Hypertension/drug therapy; Aged; Antihypertensive Agents/administration & dosage; Dose-Response Relationship, Drug; Female; Humans; Male; Retrospective Studies; United States; Veterans
10.
Health Care Manag Sci ; 24(1): 1-25, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33483911

ABSTRACT

Atherosclerotic cardiovascular disease (ASCVD) is among the leading causes of death in the US. Although research has shown that ASCVD has genetic elements, the understanding of how genetic testing influences its prevention and treatment has been limited. To this end, we model the health trajectory of patients stochastically and determine treatment and testing decisions simultaneously. Since the cholesterol level of patients is one controllable risk factor for ASCVD events, we model cholesterol treatment plans as Markov decision processes. We determine whether and when patients should receive a genetic test using value of information analysis. By simulating the health trajectory of over 64 million adult patients, we find that 6.73 million patients undergo genetic testing. The optimal treatment plans informed with clinical and genetic information save 5,487 more quality-adjusted life-years while costing $1.18 billion less than the optimal treatment plans informed with clinical information only. As precision medicine becomes increasingly important, understanding the impact of genetic information becomes essential.
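The value-of-information analysis used to decide whether and when to order a genetic test compares the expected payoff of choosing a treatment plan with the test result in hand versus committing to a single plan without it. A toy sketch with hypothetical payoffs and two genotypes (numbers invented for illustration):

```python
# Hypothetical: two equally likely genotypes, two treatment plans.
# payoff[g][plan] = expected QALYs under each plan for genotype g.
p_genotype = [0.5, 0.5]
payoff = [[10.0, 8.0],   # genotype 0: plan A is better
          [7.0, 9.5]]    # genotype 1: plan B is better

# Without the test: commit to the plan with the best expected payoff.
best_without = max(
    sum(p * payoff[g][plan] for g, p in enumerate(p_genotype))
    for plan in (0, 1))

# With a perfect test: pick the best plan per genotype, then average.
best_with = sum(p * max(payoff[g]) for g, p in enumerate(p_genotype))

evpi = best_with - best_without  # expected value of perfect information
print(best_without, best_with, evpi)
```

Here the expected value of perfect information works out to 1.0 QALY; testing is worthwhile whenever this value exceeds the cost of the test (in QALY-equivalent terms).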


Subject(s)
Atherosclerosis/prevention & control; Cardiovascular Diseases/prevention & control; Genetic Testing; Hypercholesterolemia/drug therapy; Adult; Anticholesteremic Agents/therapeutic use; Atherosclerosis/drug therapy; Atherosclerosis/genetics; Cardiovascular Diseases/genetics; Computer Simulation; Female; Genetic Predisposition to Disease; Humans; Male; Markov Chains; Middle Aged; Quality-Adjusted Life Years
11.
BMC Health Serv Res ; 21(1): 561, 2021 Jun 07.
Article in English | MEDLINE | ID: mdl-34098973

ABSTRACT

BACKGROUND: Although risk prediction has become an integral part of clinical practice guidelines for cardiovascular disease (CVD) prevention, multiple studies have shown that patients' risk still plays almost no role in clinical decision-making. Because little is known about why this is so, we sought to understand providers' views on the opportunities, barriers, and facilitators of incorporating risk prediction to guide their use of cardiovascular preventive medicines. METHODS: We conducted semi-structured interviews with primary care providers (n = 33) at VA facilities in the Midwest. Facilities were chosen using a maximum variation approach according to their geography, size, proportion of MD to non-MD providers, and percentage of full-time providers. Providers included MD/DO physicians, physician assistants, nurse practitioners, and clinical pharmacists. Providers were asked about their reaction to a hypothetical situation in which the VA would introduce a risk prediction-based approach to CVD treatment. We conducted matrix and content analysis to identify providers' reactions to risk prediction, reasons for their reaction, and exemplar quotes. RESULTS: Most providers were classified as Enthusiastic (n = 14) or Cautious Adopters (n = 15), with only a few Non-Adopters (n = 4). Providers described four key concerns toward adopting risk prediction. Their primary concern was that risk prediction is not always compatible with a "whole patient" approach to patient care. Other concerns included questions about the validity of the proposed risk prediction model, potential workflow burdens, and whether risk prediction adds value to existing clinical practice. Enthusiastic, Cautious, and Non-Adopters all expressed both doubts about and support for risk prediction categorizable in the above four key areas of concern. 
CONCLUSIONS: Providers were generally supportive of adopting risk prediction into CVD prevention, but many had misgivings, which included concerns about impact on workflow, validity of predictive models, the value of making this change, and possible negative effects on providers' ability to address the whole patient. These concerns have likely contributed to the slow introduction of risk prediction into clinical practice. These concerns will need to be addressed for risk prediction, and other approaches relying on "big data" including machine learning and artificial intelligence, to have a meaningful role in clinical practice.


Subject(s)
Artificial Intelligence; Physicians; Attitude; Attitude of Health Personnel; Health Personnel; Humans; Qualitative Research
12.
Curr Opin Crit Care ; 26(5): 500-507, 2020 10.
Article in English | MEDLINE | ID: mdl-32773618

ABSTRACT

PURPOSE OF REVIEW: Critical illness survivorship is associated with new and worsening physical, cognitive, and emotional status. Survivors are vulnerable to further health setbacks, most commonly because of infection and exacerbation of chronic medical conditions. Awareness of survivors' challenges is important given the anticipated rise in critical illness survivors because of SARS-CoV-2 viral sepsis. RECENT FINDINGS: Studies continue to document challenges of critical illness survivorship. Beyond the cognitive, physical, and mental health sequelae encompassed by postintensive care syndrome, patients commonly experience persistent immunosuppression, re-hospitalization, inability to resume prior employment, and reduced quality of life. Although recommended practices for enhancing recovery from sepsis are associated with better outcomes, only a minority of patients receive all recommended practices. ICU follow-up programs or peer support groups remain important interventions to learn about and address the multifaceted challenges of critical illness survivorship, but there is little evidence of benefit to date. SUMMARY: Survivors of sepsis and critical illness commonly experience impaired health status, reduced quality of life, and inability to return to prior employment. Although the challenges of critical illness survivorship are increasingly well documented, there are relatively few studies on enhancing recovery. Future studies must focus on identifying best practices for optimizing recovery and strategies to promote their implementation.


Subject(s)
Critical Illness; Intensive Care Units; Survivorship; Betacoronavirus; COVID-19; Coronavirus Infections; Health Status; Humans; Pandemics; Pneumonia, Viral; Quality of Life; Return to Work; SARS-CoV-2
13.
Stroke ; 50(7): 1669-1675, 2019 07.
Article in English | MEDLINE | ID: mdl-31138085

ABSTRACT

Background and Purpose- Effective stroke prevention depends on accurate stroke risk prediction. We determined the discriminative ability of NfL (neurofilament light chain) levels for distinguishing between adults with diabetes mellitus who develop incident stroke and those who remain stroke free during a 7-year follow-up period. Methods- We performed a case-control study of participants selected from the previously completed ACCORD trial (Action to Control Cardiovascular Risk in Diabetes). Cases were all ACCORD subjects who were stroke free at enrollment and developed incident stroke during follow-up (n=113). Control subjects (n=250) were randomly selected ACCORD subjects who had no stroke events either before or after randomization. NfL was measured in baseline samples using Single Molecule Array technology (Quanterix). Results- Baseline NfL levels were higher in stroke subjects, compared to controls, after adjusting for age, race, blood pressure, weight, and the Framingham Stroke Risk Score. Relative to the subjects in the lowest quintile of NfL levels, the hazard ratios of incident stroke for subjects in the second to fifth quintiles were 3.91 (1.45-10.53), 4.05 (1.52-10.79), 5.63 (2.16-14.66), and 9.75 (3.84-27.71), respectively, after adjusting for race and Framingham Stroke Risk Score. Incorporating NfL levels into a predictive score that already included race and Framingham Stroke Risk Score increased the score's C statistic from 0.71 (95% CI, 0.66-0.77) to 0.78 (95% CI, 0.73-0.83), P<0.001. Older age, nonwhite race, higher systolic blood pressure, glomerular filtration rate <60, and higher hemoglobin A1C were independent predictors of serum NfL in this cohort but diastolic blood pressure, durations of hypertension or diabetes mellitus, and lipid levels were not. In total, cardiovascular disease risk factors explained 19.2% of the variability in baseline NfL levels. 
Conclusions- Serum NfL levels predict incident stroke and add considerably to the discriminatory power of the Framingham Stroke Risk Score in a cohort of middle-aged and older adults with diabetes mellitus.


Subject(s)
Diabetes Complications/blood; Diabetes Complications/epidemiology; Neurofilament Proteins/blood; Stroke/blood; Stroke/epidemiology; Adult; Age Factors; Aged; Case-Control Studies; Cohort Studies; Ethnicity; Female; Glomerular Filtration Rate; Glycated Hemoglobin/analysis; Humans; Hypertension/epidemiology; Incidence; Male; Middle Aged; Risk Assessment; Socioeconomic Factors
15.
Ann Intern Med ; 169(1): 20-29, 2018 07 03.
Article in English | MEDLINE | ID: mdl-29868850

ABSTRACT

Background: The 2013 pooled cohort equations (PCEs) are central in prevention guidelines for cardiovascular disease (CVD) but can misestimate CVD risk. Objective: To improve the clinical accuracy of CVD risk prediction by revising the 2013 PCEs using newer data and statistical methods. Design: Derivation and validation of risk equations. Setting: Population-based. Participants: 26 689 adults aged 40 to 79 years without prior CVD from 6 U.S. cohorts. Measurements: Nonfatal myocardial infarction, death from coronary heart disease, or fatal or nonfatal stroke. Results: The 2013 PCEs overestimated 10-year risk for atherosclerotic CVD by an average of 20% across risk groups. Misestimation of risk was particularly prominent among black adults, of whom 3.9 million (33% of eligible black persons) had extreme risk estimates (<70% or >250% those of white adults with otherwise-identical risk factor values). Updating these equations improved accuracy among all race and sex subgroups. Approximately 11.8 million U.S. adults previously labeled high-risk (10-year risk ≥7.5%) by the 2013 PCEs would be relabeled lower-risk by the updated equations. Limitations: Updating the 2013 PCEs with data from modern cohorts reduced the number of persons considered to be at high risk. Clinicians and patients should consider the potential benefits and harms of reducing the number of persons recommended aspirin, blood pressure, or statin therapy. Our findings also indicate that risk equations will generally become outdated over time and require routine updating. Conclusion: Revised PCEs can improve the accuracy of CVD risk estimates. Primary Funding Source: National Institutes of Health.


Subject(s)
Coronary Artery Disease/etiology; Risk Assessment/methods; Adult; Black or African American/statistics & numerical data; Aged; Coronary Artery Disease/epidemiology; Coronary Artery Disease/mortality; Female; Humans; Male; Middle Aged; Models, Statistical; Risk Factors; United States/epidemiology; White People/statistics & numerical data
16.
J Gen Intern Med ; 33(12): 2132-2137, 2018 12.
Article in English | MEDLINE | ID: mdl-30284172

ABSTRACT

BACKGROUND: Implementation of new practice guidelines for statin use has been poor. OBJECTIVE: To test a multi-component quality improvement intervention to encourage use of new guidelines for statin use. DESIGN: Cluster-randomized, usual-care controlled trial. PARTICIPANTS: The study population was primary care visits for patients who were recommended statins by the 2013 guidelines, but were not receiving them. We excluded patients who were over 75 years old, or had an ICD9 or ICD10 code for end-stage renal disease, muscle pain, pregnancy, or in vitro fertilization in the 2 years prior to the study visit. INTERVENTIONS: A novel quality improvement intervention consisting of a personalized decision support tool, an educational program, a performance measure, and an audit and feedback system. Randomization was at the level of the primary care team. MAIN MEASURES: Our primary outcome was prescription of a medium- or high-strength statin. We studied how receiving the intervention changed care during the quality improvement intervention compared to before it and whether that change continued after the intervention. KEY RESULTS: Among 3787 visits to 43 primary care providers, being in the intervention arm tripled the odds of patients being prescribed an appropriate statin (OR 3.0, 95% CI 1.8-4.9), though the effect resolved after the personalized decision support ended (OR 1.7, 95% CI 0.99-2.77). CONCLUSIONS: A simple, personalized quality improvement intervention is promising for enabling the adoption of new guidelines. CLINICALTRIALS.GOV IDENTIFIER: NCT02820870.


Subject(s)
Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use; Precision Medicine/standards; Primary Health Care/standards; Quality Improvement/standards; United States Department of Veterans Affairs/standards; Veterans; Aged; Cardiovascular Diseases/drug therapy; Cardiovascular Diseases/epidemiology; Cluster Analysis; Female; Humans; Male; Middle Aged; Precision Medicine/trends; Primary Health Care/trends; Quality Improvement/trends; United States/epidemiology; United States Department of Veterans Affairs/trends
17.
J Gen Intern Med ; 33(1): 34-41, 2018 01.
Article in English | MEDLINE | ID: mdl-28905179

ABSTRACT

BACKGROUND: Deintensification of diabetic therapy is often clinically appropriate for older adults, because the benefit of aggressive diabetes treatment declines with age, while the risks increase. OBJECTIVE: We examined rates of overtreatment and deintensification of therapy for older adults with diabetes, and whether these rates differed by medical, demographic, and socioeconomic characteristics. DESIGN, SUBJECTS, AND MAIN MEASURES: We analyzed Medicare claims data from 10 states, linked to outpatient laboratory values to identify patients potentially overtreated for diabetes (HbA1c < 6.5% with fills for any diabetes medications beyond metformin, 1/1/2011-6/30/2011). We examined characteristics associated with deintensification for potentially overtreated diabetic patients. We used multinomial logistic regression to examine whether patient characteristics associated with overtreatment of diabetes differed from those associated with undertreatment (i.e. HbA1c > 9.0%). KEY RESULTS: Of 78,792 Medicare recipients with diabetes, 8560 (10.9%) were potentially overtreated. Overtreatment of diabetes was more common among those who were over 75 years of age and enrolled in Medicaid (p < 0.001), and was less common among Hispanics (p = 0.009). Therapy was deintensified for 14% of overtreated diabetics. Appropriate deintensification of diabetic therapy was more common for patients with six or more chronic conditions, more outpatient visits, or living in urban areas; deintensification was less common for those over age 75. Only 6.9% of Medicare recipients with diabetes were potentially undertreated. Variables associated with overtreatment of diabetes differed from those associated with undertreatment. CONCLUSIONS: Medicare recipients are more frequently overtreated than undertreated for diabetes. Medicare recipients who are overtreated for diabetes rarely have their regimens deintensified.


Subject(s)
Diabetes Mellitus/epidemiology, Diabetes Mellitus/therapy, Hypoglycemic Agents/administration & dosage, Medical Overuse/prevention & control, Medicare/standards, Aged, Aged 80 and over, Blood Glucose/drug effects, Blood Glucose/metabolism, Cohort Studies, Female, Humans, Hypoglycemic Agents/adverse effects, Male, United States/epidemiology
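A minimal sketch of the multinomial logistic regression setup this abstract describes, on synthetic data only: the covariates, coefficients, class frequencies, and three-way outcome coding here are invented for illustration, not taken from the Medicare cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
# Synthetic patient covariates (hypothetical stand-ins for the study's predictors)
X = np.column_stack([
    rng.integers(65, 95, n).astype(float),   # age
    rng.integers(0, 2, n).astype(float),     # Medicaid enrollment (0/1)
    rng.integers(0, 10, n).astype(float),    # chronic condition count
])
# Generate a 3-class outcome: 0 = neither, 1 = potentially overtreated,
# 2 = potentially undertreated (illustrative coefficients, not estimates)
logits = np.column_stack([
    np.zeros(n),                                  # reference class
    0.04 * (X[:, 0] - 75) + 0.5 * X[:, 1] - 2.0,  # overtreatment rises with age, Medicaid
    -0.02 * (X[:, 0] - 75) - 1.5,                 # undertreatment pattern differs
])
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=row) for row in p])

# Multinomial logistic regression: one model contrasts both outcome classes
# against the reference, letting the two sets of associations differ
model = LogisticRegression(max_iter=1000).fit(X, y)
probs = model.predict_proba(X)   # per-patient class probabilities
```

Fitting the three classes jointly, rather than two separate binary models, is what lets the analysis ask whether the characteristics associated with overtreatment differ from those associated with undertreatment.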
18.
Ann Intern Med ; 166(5): 354-360, 2017 Mar 07.
Article in English | MEDLINE | ID: mdl-28055048

ABSTRACT

BACKGROUND: Two recent randomized trials produced discordant results when testing the benefits and harms of treatment to reduce blood pressure (BP) in patients with cardiovascular disease (CVD). OBJECTIVE: To perform a theoretical modeling study to identify whether large, clinically important differences in benefit and harm among patients (heterogeneous treatment effects [HTEs]) can be hidden in, and explain discordant results between, treat-to-target BP trials. DESIGN: Microsimulation. DATA SOURCES: Results of 2 trials comparing standard (systolic BP target <140 mm Hg) with intensive (systolic BP target <120 mm Hg) BP treatment and data from the National Health and Nutrition Examination Survey (2013 to 2014). TARGET POPULATION: U.S. adults. TIME HORIZON: 5 years. PERSPECTIVE: Societal. INTERVENTION: BP treatment. OUTCOME MEASURES: CVD events and mortality. RESULTS OF BASE-CASE ANALYSIS: Clinically important HTEs could explain differences in outcomes between 2 trials of intensive BP treatment, particularly diminishing benefit with each additional BP agent (for example, adding a second agent reduces CVD risk [hazard ratio, 0.61], but adding a fourth agent to a third has no benefit) and increasing harm at low diastolic BP. RESULTS OF SENSITIVITY ANALYSIS: Conventional treat-to-target trial designs had poor (<5%) statistical power to detect the HTEs, despite large samples (n > 20 000), and produced biased effect estimates. In contrast, a trial with sequential randomization to more intensive therapy achieved greater than 80% power and unbiased HTE estimates, despite small samples (n = 3500). LIMITATIONS: Only HTEs as a function of the number of BP agents were explored. Simulated aggregate data from the trials were used as model inputs because individual-participant data were not available.
CONCLUSION: Clinically important heterogeneity in intensive BP treatment effects remains undetectable in conventional trial designs but can be detected in sequential randomization trial designs. PRIMARY FUNDING SOURCE: National Institutes of Health and U.S. Department of Veterans Affairs.


Subject(s)
Antihypertensive Agents/therapeutic use, Hypertension/drug therapy, Randomized Controlled Trials as Topic/standards, Research Design/standards, Adult, Cardiovascular Diseases/complications, Cardiovascular Diseases/prevention & control, Computer Simulation, Humans, Hypertension/complications, Risk Factors
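The sensitivity analysis this abstract describes is, at its core, a Monte Carlo power calculation. A toy sketch of that idea, not the authors' microsimulation: the event rate, arm size, use of a two-proportion z-test, and treating the hazard ratio as a risk ratio over a short horizon are all illustrative assumptions.

```python
import numpy as np

def detection_power(n_per_arm, p_event_control, hazard_ratio,
                    n_sims=2000, seed=0):
    """Monte Carlo power of a two-arm comparison (two-sided two-proportion
    z-test at alpha = 0.05) to detect the effect of adding one more BP agent.
    The hazard ratio is approximated as a risk ratio over a short horizon."""
    rng = np.random.default_rng(seed)
    p_treat = p_event_control * hazard_ratio
    rejections = 0
    for _ in range(n_sims):
        events_control = rng.binomial(n_per_arm, p_event_control)
        events_treat = rng.binomial(n_per_arm, p_treat)
        # Pooled standard error for the difference in event proportions
        p_pool = (events_control + events_treat) / (2 * n_per_arm)
        se = (p_pool * (1 - p_pool) * 2 / n_per_arm) ** 0.5
        diff = (events_control - events_treat) / n_per_arm
        if abs(diff) > 1.96 * se:  # reject at alpha = 0.05
            rejections += 1
    return rejections / n_sims
```

With roughly trial-sized arms, `detection_power(1750, 0.08, 0.61)` comes out high (a second-agent-like benefit is easy to see when patients are randomized at that escalation step), while `detection_power(1750, 0.08, 1.0)` stays near the alpha level (a fourth agent with no benefit), which is the intuition behind the sequential-randomization design's advantage.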
20.
Circulation ; 133(9): 840-8, 2016 Mar 01.
Article in English | MEDLINE | ID: mdl-26762520

ABSTRACT

BACKGROUND: The World Health Organization aims to reduce mortality from chronic diseases including cardiovascular disease (CVD) by 25% by 2025. High blood pressure is a leading CVD risk factor. We sought to compare 3 strategies for treating blood pressure in China and India: a treat-to-target (TTT) strategy emphasizing lowering blood pressure to a target, a benefit-based tailored treatment (BTT) strategy emphasizing lowering CVD risk, or a hybrid strategy currently recommended by the World Health Organization. METHODS AND RESULTS: We developed a microsimulation model of adults aged 30 to 70 years in China and in India to compare the 2 treatment approaches across a 10-year policy-planning horizon. In the model, a BTT strategy treating adults with a 10-year CVD event risk of ≥ 10% used similar financial resources but averted ≈ 5 million more disability-adjusted life-years in both China and India than a TTT approach based on current US guidelines. The hybrid strategy in the current World Health Organization guidelines produced no substantial benefits over TTT. BTT was more cost-effective at $205 to $272/disability-adjusted life-year averted, which was $142 to $182 less per disability-adjusted life-year than TTT or hybrid strategies. The comparative effectiveness of BTT was robust to uncertainties in CVD risk estimation and to variations in the age range analyzed, the BTT treatment threshold, or rates of treatment access, adherence, or concurrent statin therapy. CONCLUSIONS: In model-based analyses, a simple BTT strategy was more effective and cost-effective than TTT or hybrid strategies in reducing mortality.


Subject(s)
Cardiovascular Diseases/mortality, Cardiovascular Diseases/therapy, Computer Simulation, Goals, Hypertension/mortality, Hypertension/therapy, Adult, Aged, Blood Pressure/physiology, Cardiovascular Diseases/diagnosis, China/epidemiology, Cost-Benefit Analysis/methods, Female, Humans, Hypertension/diagnosis, India/epidemiology, Male, Middle Aged, Risk Factors
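The $/DALY comparisons in this abstract reduce to simple ratios. A hedged sketch with invented inputs: the function names and the numbers below are illustrative only, not the paper's model outputs.

```python
def cost_per_daly_averted(total_cost, dalys_averted):
    """Average cost-effectiveness ratio, in $ per DALY averted."""
    return total_cost / dalys_averted

def incremental_ratio(cost_ref, dalys_ref, cost_new, dalys_new):
    """ICER: extra cost of the new strategy per extra DALY it averts,
    relative to a reference strategy."""
    return (cost_new - cost_ref) / (dalys_new - dalys_ref)

# Illustrative only: a strategy costing $1.025M that averts 5,000 DALYs
btt_ratio = cost_per_daly_averted(1_025_000, 5_000)  # 205.0 $/DALY
```

A strategy "dominates" in this framework when, like BTT here, it uses similar resources but averts more DALYs, driving its ratio below the comparator's.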