ABSTRACT
OBJECTIVES: Quality-adjusted life years (QALYs) have been challenged as a measure of benefit for people with disabilities, particularly for those in low-utility health states or with irreversible disability. This study examined the impact of a QALY-based assessment on the threshold price of a hypothetical treatment for Duchenne muscular dystrophy (DMD), a progressive, genetic neuromuscular disease. METHODS: A previously published, 5-state model, which analyzed treatments for early ambulatory (EA) DMD patients, was replicated, validated, and adapted to include early nonambulatory (ENA) DMD patients. The model was used to assess a QALY-based threshold price (maximum cost-effective price) for a hypothetical treatment for 13-year-old ENA and 5-year-old EA patients (initial health states with lower and higher utility, respectively). All inputs were replicated, including willingness-to-pay thresholds of $50 000 to $200 000/QALY. RESULTS: In contrast to EA patients, ENA patients had a 98% modeled decline in QALY-based threshold price at a willingness-to-pay of $150 000/QALY or higher, despite equal treatment benefit (delayed progression/death). At $100 000/QALY or lower, net nontreatment costs exceeded health benefits, implying that no treatment for ENA patients would be considered cost-effective, even at a price of $0 and even if it indefinitely paused disease progression. CONCLUSIONS: For certain severe, disabling conditions, traditional approaches are likely to conclude that treatments are not cost-effective at any price once a patient progresses to a disabled health state with low utility value. These findings elucidate theoretical/ethical concerns regarding potentially discriminatory properties of traditional QALY assessments for people with disabilities, particularly those who have lost ambulation or have other physical limitations.
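The threshold-price logic described above can be illustrated with a minimal sketch (hypothetical numbers only, not the published model's inputs or results): at a given willingness-to-pay (WTP), the maximum cost-effective price is the monetized incremental QALYs minus the incremental non-treatment costs, so a low-utility starting health state can push the threshold price below $0.

```python
def qaly_threshold_price(wtp, delta_qalys, delta_nontreatment_cost):
    """Maximum cost-effective (threshold) price at a given WTP.

    Solves: price + delta_nontreatment_cost == wtp * delta_qalys.
    A negative result means no price -- even $0 -- is cost-effective.
    """
    return wtp * delta_qalys - delta_nontreatment_cost

# Hypothetical scenarios: a higher-utility starting state (EA) yields more
# incremental QALYs for the same survival gain than a low-utility state (ENA).
ea_price = qaly_threshold_price(wtp=150_000, delta_qalys=3.0,
                                delta_nontreatment_cost=100_000)
ena_price = qaly_threshold_price(wtp=150_000, delta_qalys=0.5,
                                 delta_nontreatment_cost=150_000)
print(ea_price)   # 350000.0 -> a positive threshold price exists
print(ena_price)  # -75000.0 -> not cost-effective at any price
```

The sign flip for the low-utility scenario is the mechanism the abstract describes: once the utility weight shrinks the QALY gain, added survival costs can outweigh the monetized benefit at any price.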
ABSTRACT
BACKGROUND: Colchicine has a narrow therapeutic index. Its toxicity can be increased by concomitant exposure to drugs inhibiting its metabolic pathway, namely cytochrome P450 3A4 (CYP3A4) and P-glycoprotein (P-gp). OBJECTIVE: To examine clinical outcomes associated with colchicine drug interactions using spontaneous reports from the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS). METHODS: We conducted a disproportionality analysis using FAERS data from January 2004 through June 2020. The reporting odds ratio (ROR) and observed-to-expected ratio (O/E) with shrinkage for adverse events related to colchicine's toxicity (ie, rhabdomyolysis/myopathy, agranulocytosis, hemorrhage, acute renal failure, hepatic failure, arrhythmias, torsade de pointes/QT prolongation, and cardiac failure) were compared between FAERS reports. RESULTS: A total of 787 reports included the combined mention of colchicine, a drug inhibiting both CYP3A4 and P-gp, and an adverse event of interest. Among reports that indicated the severity, 61% mentioned hospitalization and 24% death. A total of 37 ROR and 34 O/E safety signals involving colchicine and a CYP3A4/P-gp inhibitor were identified. The strongest ROR signal was for colchicine + atazanavir and rhabdomyolysis/myopathy (ROR = 35.4, 95% CI: 12.8-97.6), and the strongest O/E signal was for colchicine + atazanavir and agranulocytosis (O/E = 3.79, 95% credibility interval: 3.44-4.03). CONCLUSION AND RELEVANCE: This study identified numerous safety signals for colchicine combined with CYP3A4/P-gp inhibitor drugs. Avoiding the interaction, or monitoring for toxicity when colchicine and these agents are co-prescribed, is highly recommended.
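As a hedged illustration of the disproportionality method named above, the reporting odds ratio and its approximate 95% confidence interval can be computed from a 2×2 table of report counts; the counts below are invented, not FAERS data.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR with an approximate 95% CI from a 2x2 table of report counts.

    a: reports mentioning the drug pair and the event of interest
    b: reports mentioning the drug pair, other events
    c: reports without the drug pair, event of interest
    d: reports without the drug pair, other events
    """
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Invented counts (not FAERS data):
ror, lo, hi = reporting_odds_ratio(20, 80, 200, 8000)
print(round(ror, 1), round(lo, 1), round(hi, 1))  # ROR 10.0, CI roughly 6.0-16.6
```

A signal is typically flagged when the lower confidence bound exceeds 1, as with the colchicine + atazanavir signals reported above.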
Subject(s)
Colchicine , Cytochrome P-450 CYP3A , Humans , United States , Pharmaceutical Preparations , Colchicine/adverse effects , Cytochrome P-450 CYP3A/metabolism , ATP Binding Cassette Transporter, Subfamily B, Member 1 , Atazanavir Sulfate , Signal Detection, Psychological , ATP Binding Cassette Transporter, Subfamily B , Adverse Drug Reaction Reporting Systems , United States Food and Drug Administration
ABSTRACT
BACKGROUND AND AIMS: To determine the cost-effectiveness of anti-obesity medications (AOM): tirzepatide, semaglutide, liraglutide, phentermine plus topiramate (PpT), and naltrexone plus bupropion (NpB). METHODS AND RESULTS: From a U.S. perspective, we developed a Markov model to simulate weight change over a 40-year time horizon using results from clinical studies. The model's mutually exclusive health states were defined by body mass index (BMI), cardiovascular disease, diabetes, and mortality risk. Costs of AOM, adverse events, cardiovascular events, and diabetes were included. We applied a 3% per-year discount rate and calculated incremental cost-effectiveness ratios (ICERs) as cost per quality-adjusted life-year (QALY) gained. Probabilistic sensitivity analyses incorporated uncertainty in input parameters, and a deterministic analysis was conducted to assess the robustness of the model. The modeled cohort was 78.2% female, with a mean age of 45 years and a mean BMI of 37.1 (SD 4.9) for females and 36.8 (SD 4.9) for males. NpB and PpT were the least costly medications, and all medications differed by no more than 0.5 QALYs. The ICER for tirzepatide was $355,616 per QALY. Liraglutide and semaglutide were dominated by PpT. CONCLUSION: Compared with other AOM, PpT was the lowest-cost treatment, with nearly identical QALYs to the other agents.
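The core ICER arithmetic used in models like this one can be sketched as follows; the cost and utility streams are invented placeholders, while the 3% discount rate matches the abstract.

```python
def discounted_total(annual_values, rate=0.03):
    """Present value of a yearly stream at a constant discount rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Two toy strategies over a 40-year horizon (invented numbers):
cost_a = discounted_total([12_000] * 40)  # pricier agent, higher drug cost
cost_b = discounted_total([3_000] * 40)   # cheaper comparator
qaly_a = discounted_total([0.85] * 40)    # slightly higher utility per year
qaly_b = discounted_total([0.84] * 40)
print(round(icer(cost_a, qaly_a, cost_b, qaly_b)))  # -> 900000 per QALY
```

"Dominated" in the conclusion means a strategy is both costlier and yields fewer (or no more) QALYs than a comparator, so no ICER is reported for it.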
Subject(s)
Anti-Obesity Agents , Cost-Effectiveness Analysis , Male , Female , Humans , Middle Aged , Liraglutide/adverse effects , Cost-Benefit Analysis , Quality-Adjusted Life Years , Anti-Obesity Agents/adverse effects
ABSTRACT
OBJECTIVES: To evaluate the relationship between a modified Tisdale QTc-risk score (QTc-RS) and inpatient mortality and length of stay in a broad inpatient population with an order for a medication with a known risk of torsades de pointes (TdP). BACKGROUND: Managing the risk of TdP is challenging due to the number of medications with known risk of TdP and the complexity of precipitating factors. A model to predict risk of mortality may be useful to guide treatment decisions. METHODS: This was a retrospective observational study using inpatient data from 28 healthcare facilities in the western United States. The risk score ranges from zero to 23, with weights applied to each risk factor based on a previous validation study. Logistic regression and a generalized linear model were used to assess the relationship between QTc-RS and mortality and length of stay. RESULTS: Between April and December 2020, a QTc-RS was calculated for 92,383 hospitalized patients. Common risk factors were female sex (55.0%), age > 67 years (32.1%), and receipt of a medication with known risk of TdP (24.5%). A total of 2770 (3%) patients died during their hospitalization. Relative to patients with QTc-RS < 7, the odds ratio for mortality was 4.80 (95% CI: 4.42-5.21) for patients with QTc-RS = 7-10 and 11.51 (95% CI: 10.23-12.94) for those with QTc-RS ≥ 11. Length of hospital stay increased by 0.7 day for every unit increase in the risk score (p < 0.0001). CONCLUSION: An increasing QTc-RS was strongly associated with both higher mortality and longer duration of hospitalization.
Subject(s)
Long QT Syndrome , Torsades de Pointes , Humans , Female , Aged , Male , Inpatients , Long QT Syndrome/etiology , Electrocardiography , Risk Factors , Torsades de Pointes/etiology , DNA-Binding Proteins
ABSTRACT
People with type 2 diabetes receiving a second-generation basal insulin (BI) analog may be switched to a first-generation formulation for financial reasons or changes in health insurance. However, because second-generation BI analogs have more even pharmacokinetic profiles, longer durations of action (>24 vs. ≤24 hours), and more stable action profiles than first-generation BI analogs, such a change may result in suboptimal treatment persistence and/or adherence. This study compared treatment persistence, treatment adherence, rates of hypoglycemia, and health care resource utilization outcomes in people with type 2 diabetes who either continued treatment with the second-generation BI Gla-300 or switched to a first-generation BI. The study showed that continuing with Gla-300 was associated with a lower risk of discontinuing therapy, fewer emergency department visits, and lower hypoglycemia event rates than switching to a first-generation BI.
ABSTRACT
ABSTRACT: Previous research has identified risk factors that may affect the risk of bleeding when individuals are exposed to oral anticoagulants. It is unclear whether these risks persist with the direct oral anticoagulants (DOACs). The purpose of this study was to assess the risk of bleeding in patients on DOACs (apixaban, rivaroxaban, dabigatran, edoxaban, and betrixaban) based on known risk factors, including demographics, medical conditions, and concomitant medications. This study was a retrospective analysis using electronic health record data from the University of Utah Hospital (Division of Cardiovascular Medicine) for individuals receiving a DOAC from 2015 to 2020. The primary outcome of interest was bleeding events [gastrointestinal (GI) bleeding, other anatomical site bleeding (excluding GI), and any bleeding] recorded in the electronic health record, identified using International Classification of Diseases, 9th and 10th revision, codes. Known risk factors were used to predict bleeding using multivariate logistic regression. A total of 5492 patients received a DOAC during the study period. Less than half the study population was female (2287, 41.6%). During follow-up, 988 patients (18.0%) experienced a bleeding event; of these, 351 patients (35.5%) had a GI bleeding event. Significant risk factors for GI bleeding included clopidogrel [odds ratio (OR) 1.71; 95% confidence interval (95% CI), 1.16-2.52] and previous GI bleeding episodes (OR 7.73; 95% CI, 5.36-11.16). Exposure to corticosteroids (OR 1.50; 95% CI, 1.20-1.87) and previous GI bleeding (OR 1.61; 95% CI, 1.10-2.35) were associated with increased bleeding at other anatomical sites (excluding GI).
Subject(s)
Academic Medical Centers , Factor Xa Inhibitors , Humans , Female , Male , Factor Xa Inhibitors/adverse effects , Cohort Studies , Retrospective Studies
ABSTRACT
OBJECTIVES: To evaluate the prescription sequence symmetry analysis assumption that, in the absence of prescribing cascades, marker drug (i.e., medication used to treat a drug-induced adverse event) initiation rates are balanced before and after initiation of an index drug (i.e., medication that is potentially associated with the drug-induced adverse event), we used a well-described example: loop diuretic initiation to treat dihydropyridine calcium channel blocker (DH CCB)-induced edema. STUDY DESIGN AND SETTING: The University of Florida Health Integrated Data Repository from June 2011 to July 2018 was used to assess temporal prescribing of DH CCB and loop diuretics within the prescription sequence symmetry analysis framework. Validation of the prescribing cascade was performed via clinical expert chart review. RESULTS: Among patients without heart failure who were initiated on DH CCB, 26 and 64 loop diuretic initiators started within 360 days before versus after DH CCB initiation, respectively, resulting in an adjusted sequence ratio (aSR) of 2.27 (95% CI, 1.44-3.58). Overall, 35 (54.7%) patients were determined to have a prescribing cascade. Removing patients who experienced a prescribing cascade resulted in an aSR of 1.05 (95% CI, 0.62-1.78). CONCLUSION: Loop diuretic initiation rates before and after DH CCB initiation for reasons other than a prescribing cascade were similar, confirming the prescription sequence symmetry analysis assumption.
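A minimal sketch of the sequence-ratio arithmetic underlying this design: the crude sequence ratio compares marker-drug initiations after versus before the index drug, and the adjusted ratio divides out a null-effect ratio reflecting background prescribing trends. The before/after counts come from the abstract; the null-effect value here is a hypothetical placeholder, not the study's estimate.

```python
def crude_sequence_ratio(n_after, n_before):
    """Marker-drug initiations after vs. before index-drug initiation."""
    return n_after / n_before

def adjusted_sequence_ratio(n_after, n_before, null_effect_sr):
    """Divide the crude SR by the null-effect SR, which captures the
    ratio expected from background prescribing trends alone."""
    return crude_sequence_ratio(n_after, n_before) / null_effect_sr

# 64 loop-diuretic initiations after vs. 26 before DH CCB initiation
# (from the abstract); the null-effect SR of 1.08 is an invented placeholder.
csr = crude_sequence_ratio(64, 26)
asr = adjusted_sequence_ratio(64, 26, null_effect_sr=1.08)
print(round(csr, 2), round(asr, 2))  # 2.46 2.28
```

An adjusted ratio near 1, as in the cascade-removed analysis above, indicates symmetric initiation and thus no signal.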
Subject(s)
Heart Failure , Hypertension , Antihypertensive Agents/therapeutic use , Calcium Channel Blockers/adverse effects , Edema/drug therapy , Heart Failure/drug therapy , Humans , Hypertension/drug therapy , Prescriptions , Sodium Potassium Chloride Symporter Inhibitors/adverse effects
ABSTRACT
BACKGROUND: Tizanidine's potent muscle relaxant properties and short onset of action make it desirable for pain management. However, concomitant use of tizanidine with ciprofloxacin, a strong inhibitor of cytochrome P450 1A2 (CYP1A2), the enzyme that metabolizes tizanidine, can result in increased tizanidine plasma levels and associated adverse outcomes, particularly hypotension. The aim of this study was to assess the risk of hypotension with coadministration of tizanidine and ciprofloxacin. METHODS: An observational nested cohort study of patients 18 years or older on tizanidine was conducted using data from electronic health records from 2000 to 2018 in the US. We estimated the prevalence and risk of hypotension associated with the drug-drug interaction (DDI) between tizanidine and ciprofloxacin using multivariable logistic regression models. RESULTS: Our analysis included 70,110 encounters of patients on tizanidine across 221 hospitals. Most encounters involved females (65.7%) and whites (82.4%), with an average age of 56 years (SD 14.9) and a mean Elixhauser comorbidity index of 1.6 (SD 2.3). Ciprofloxacin was co-administered with tizanidine in 2487 encounters (3.6%). Compared with patients who did not receive ciprofloxacin, co-administration of tizanidine and ciprofloxacin was associated with an increased likelihood of hypotension (adjusted odds ratio: 1.43, 95% confidence interval: 1.25-1.63, p < 0.001). CONCLUSIONS: Our findings suggest that the concomitant use of tizanidine and ciprofloxacin is associated with an elevated risk of hypotension. The prevalence of co-administration of drugs with a documented interaction highlights the need for continuous provider education to avoid DDI-related adverse events and further complications and to improve patient outcomes.
Subject(s)
Ciprofloxacin , Hypotension , Ciprofloxacin/adverse effects , Clonidine/adverse effects , Clonidine/analogs & derivatives , Cohort Studies , Drug Interactions , Female , Humans , Hypotension/chemically induced , Hypotension/epidemiology , Middle Aged
ABSTRACT
Colchicine is increasingly used as the number of potential indications expands. However, it has a narrow therapeutic index and is associated with bothersome to severe side effects. When used concomitantly with medications inhibiting its metabolism, higher plasma levels result, increasing the likelihood of colchicine toxicity. We conducted a cohort study using electronic health records comparing encounters with colchicine plus a macrolide versus colchicine plus a non-macrolide antibiotic. We assessed the relationship between the two groups and the risk of rhabdomyolysis, pancytopenia, muscular weakness, heart failure, acute hepatic failure, and death using adjusted multivariate logistic regression models. A total of 12,670 patients on colchicine plus a non-macrolide antibiotic were compared with 2199 patients exposed to colchicine plus a macrolide. Patients exposed to colchicine and a macrolide were predominantly men (n = 1329, 60.4%) and white (n = 1485, 67.5%), in their late sixties (mean age 68.4 years, SD 15.6). Heart failure was more frequent in the colchicine plus macrolide cohort (n = 402, 18.3%) than in the non-macrolide cohort (n = 1153, 9.1%) (p < 0.0001), and the macrolide cohort also had a higher mortality rate [85 (3.87%) vs 289 (2.28%), p < 0.0001, macrolide vs non-macrolide cohorts, respectively]. When the sample was limited to individuals exposed to colchicine and either clarithromycin or erythromycin, the adjusted OR was 2.47 (95% CI 1.04-5.91) for acute hepatic failure and 2.06 (95% CI 1.07-3.97) for death. There is a significant increase in the risk of hepatic failure and mortality when colchicine is concomitantly administered with a macrolide. Colchicine should not be used concomitantly with these antibiotics, or should be temporarily discontinued, to avoid toxic colchicine levels.
Subject(s)
Clarithromycin , Macrolides , Anti-Bacterial Agents/adverse effects , Clarithromycin/adverse effects , Cohort Studies , Colchicine/adverse effects , Erythromycin/therapeutic use , Humans , Macrolides/adverse effects , Male
ABSTRACT
BACKGROUND: Drug-induced long-QT syndrome (diLQTS) is a major concern among patients who are hospitalized, for whom prediction models capable of identifying individualized risk could be useful to guide monitoring. We have previously demonstrated the feasibility of machine learning to predict the risk of diLQTS, in which deep learning models provided superior accuracy for risk prediction, although these models were limited by a lack of interpretability. OBJECTIVE: In this investigation, we sought to examine the potential trade-off between interpretability and predictive accuracy with the use of more complex models to identify patients at risk for diLQTS. We planned to compare a deep learning algorithm to predict diLQTS with a more interpretable algorithm based on cluster analysis that would allow medication- and subpopulation-specific evaluation of risk. METHODS: We examined the risk of diLQTS among 35,639 inpatients treated between 2003 and 2018 with at least 1 of 39 medications associated with risk of diLQTS and who had an electrocardiogram in the system performed within 24 hours of medication administration. Predictors included over 22,000 diagnoses and medications at the time of medication administration, with cases of diLQTS defined as a corrected QT interval over 500 milliseconds after treatment with a culprit medication. The interpretable model was developed using cluster analysis (K=4 clusters), and risk was assessed for specific medications and classes of medications. The deep learning model was created using all predictors within a 6-layer neural network, based on previously identified hyperparameters. RESULTS: Among the medications, we found that class III antiarrhythmic medications were associated with increased risk across all clusters, and that in patients who are noncritically ill without cardiovascular disease, propofol was associated with increased risk, whereas ondansetron was associated with decreased risk. 
Compared with deep learning, the interpretable approach was less accurate (area under the receiver operating characteristic curve: 0.65 vs 0.78), with comparable calibration. CONCLUSIONS: In summary, we found that an interpretable modeling approach was less accurate, but more clinically applicable, than deep learning for the prediction of diLQTS. Future investigations should consider this trade-off in the development of methods for clinical prediction.
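The area under the receiver operating characteristic curve compared above can be computed directly as the Mann-Whitney probability that a randomly chosen case outscores a randomly chosen control; the scores below are toy values, not the study's predictions.

```python
def auc(case_scores, control_scores):
    """AUC as the Mann-Whitney probability that a randomly chosen case
    scores higher than a randomly chosen control (ties count 1/2)."""
    wins = sum((c > k) + 0.5 * (c == k)
               for c in case_scores for k in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# Toy risk scores for the same 4 cases and 4 controls under two models:
interpretable_auc = auc([0.60, 0.55, 0.40, 0.35], [0.50, 0.45, 0.30, 0.20])
deep_auc = auc([0.90, 0.80, 0.70, 0.45], [0.50, 0.40, 0.30, 0.20])
print(interpretable_auc, deep_auc)  # 0.75 0.9375
```

Because the AUC depends only on score rankings, two models can differ in discrimination (as 0.65 vs 0.78 above) while remaining comparably calibrated.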
Subject(s)
Electronic Health Records , Long QT Syndrome , Humans , Machine Learning , Long QT Syndrome/chemically induced , Long QT Syndrome/diagnosis , Electrocardiography , Cluster Analysis
ABSTRACT
BACKGROUND: Survivors of opioid overdose have substantially increased mortality risk, although this risk is not evenly distributed across individuals. No study has focused on predicting an individual's risk of death after a nonfatal opioid overdose. OBJECTIVE: To predict risk of death after a nonfatal opioid overdose. DESIGN AND PARTICIPANTS: This retrospective cohort study included 9686 Pennsylvania Medicaid beneficiaries with an emergency department or inpatient claim for nonfatal opioid overdose in 2014-2016. The index date was the first overdose claim during this period. EXPOSURES, MAIN OUTCOME, AND MEASURES: Predictor candidates were measured in the 180 days before the index overdose. Primary outcome was 180-day all-cause mortality. Using a gradient boosting machine model, we classified beneficiaries into six subgroups according to their risk of mortality (< 25th percentile of the risk score, 25th to < 50th, 50th to < 75th, 75th to < 90th, 90th to < 98th, ≥ 98th). We then measured receipt of medication for opioid use disorder (OUD), risk mitigation interventions (e.g., prescriptions for naloxone), and prescription opioids filled in the 180 days after the index overdose, by risk subgroup. KEY RESULTS: Of eligible beneficiaries, 347 (3.6%) died within 180 days after the index overdose. The C-statistic of the mortality prediction model was 0.71. In the highest risk subgroup, the observed 180-day mortality rate was 20.3%, while in the lowest risk subgroup, it was 1.5%. Medication for OUD and risk mitigation interventions after overdose were more commonly seen in lower risk groups, while opioid prescriptions were more likely to be used in higher risk groups (both p trends < .001). CONCLUSIONS: A risk prediction model performed well for classifying mortality risk after a nonfatal opioid overdose. This prediction score can identify high-risk subgroups to target interventions to improve outcomes among overdose survivors.
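The percentile-based risk subgroups described above can be sketched with a simple empirical-percentile assignment; the boundary conventions and toy scores here are illustrative assumptions, not the study's exact implementation.

```python
def percentile_subgroup(score, cohort_scores):
    """Label a risk score with the subgroup cut-points used above
    (<25th, 25-<50th, 50-<75th, 75-<90th, 90-<98th, >=98th percentile).
    Boundary handling here is one convention, not the study's exact rule."""
    rank = sum(s <= score for s in cohort_scores) / len(cohort_scores)
    for cut, label in [(0.98, ">=98th"), (0.90, "90-<98th"),
                       (0.75, "75-<90th"), (0.50, "50-<75th"),
                       (0.25, "25-<50th")]:
        if rank >= cut:
            return label
    return "<25th"

scores = list(range(1, 101))  # toy risk scores for a cohort of 100
print(percentile_subgroup(99, scores))  # >=98th
print(percentile_subgroup(60, scores))  # 50-<75th
```

Binning a continuous risk score this way trades granularity for actionable subgroups, such as the top-2% group with 20.3% observed mortality.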
Subject(s)
Drug Overdose , Opiate Overdose , Opioid-Related Disorders , Analgesics, Opioid/therapeutic use , Drug Overdose/drug therapy , Emergency Service, Hospital , Hospitals , Humans , Opioid-Related Disorders/drug therapy , Pennsylvania/epidemiology , Retrospective Studies , United States/epidemiology
ABSTRACT
OBJECTIVES: Little is known about relationships between opioid- and gabapentinoid-use patterns and healthcare expenditures that may be affected by pain management and risk of adverse outcomes. This study examined the association between patients' opioid and gabapentinoid prescription filling/refilling trajectories and direct medical expenditures in US Medicare. METHODS: This cross-sectional study included a 5% national sample (2011-2016) of fee-for-service beneficiaries with fibromyalgia, low back pain, neuropathy, or osteoarthritis newly initiating opioids or gabapentinoids. Using group-based multitrajectory modeling, this study identified patients' distinct opioid and gabapentinoid (OPI-GABA) dose and duration patterns, based on standardized daily doses, within a year of initiating opioids and/or gabapentinoids. Concurrent direct medical expenditures within the same year were estimated using inverse probability of treatment weighted multivariable generalized linear regression, adjusting for sociodemographic and health status factors. RESULTS: Among 67 827 eligible beneficiaries (mean age ± SD = 63.6 ± 14.8 years, female = 65.8%, white = 77.1%), 11 distinct trajectories were identified (3 opioid-only, 4 gabapentinoid-only, and 4 concurrent OPI-GABA trajectories). Compared with opioid-only early discontinuers ($13 830, 95% confidence interval = $13 643-14 019), gabapentinoid-only early discontinuers and consistent low-dose and moderate-dose gabapentinoid-only users were associated with 11% to 23% lower health expenditures (adjusted mean expenditure = $10 607-$11 713). Consistent low-dose opioid-only users, consistent high-dose opioid-only users, consistent low-dose OPI-GABA users, consistent low-dose opioid and high-dose gabapentinoid users, and consistent high-dose opioid and moderate-dose gabapentinoid users were associated with 14% to 106% higher healthcare expenditures (adjusted mean expenditure = $15 721-$28 464). 
CONCLUSIONS: Dose and duration patterns of concurrent OPI-GABA varied substantially among fee-for-service Medicare beneficiaries. Consistent opioid-only users and all concurrent OPI-GABA users were associated with higher healthcare expenditures compared to opioid-only discontinuers.
Subject(s)
Analgesics, Opioid/therapeutic use , Analgesics/therapeutic use , Gabapentin/therapeutic use , Medicare/economics , Pain/drug therapy , Aged , Aged, 80 and over , Analgesics/administration & dosage , Analgesics, Opioid/administration & dosage , Cross-Sectional Studies , Drug Utilization , Fee-for-Service Plans/economics , Female , Gabapentin/administration & dosage , Humans , Male , Middle Aged , United States
ABSTRACT
Patient-reported outcomes (PROs), such as symptoms, function, and other health-related quality-of-life aspects, are increasingly evaluated in cancer randomised controlled trials (RCTs) to provide information about treatment risks, benefits, and tolerability. However, expert opinion and critical review of the literature showed no consensus on optimal methods of PRO analysis in cancer RCTs, hindering interpretation of results. The Setting International Standards in Analyzing Patient-Reported Outcomes and Quality of Life Endpoints Data Consortium was formed to establish PRO analysis recommendations. Four issues were prioritised: developing a taxonomy of research objectives that can be matched with appropriate statistical methods, identifying appropriate statistical methods for PRO analysis, standardising statistical terminology related to missing data, and determining appropriate ways to manage missing data. This Policy Review presents recommendations for PRO analysis developed through critical literature reviews and a structured collaborative process with diverse international stakeholders, which provides a foundation for endorsement; ongoing developments of these recommendations are also discussed.
Subject(s)
Neoplasms/therapy , Patient Reported Outcome Measures , Quality of Life , Randomized Controlled Trials as Topic/standards , Research Design/standards , Consensus , Humans
ABSTRACT
The International Society for Pharmacoeconomics and Outcomes Research (ISPOR)'s "Good Practices Task Force" reports are highly cited, multistakeholder perspective expert guidance reports that reflect international standards for health economics and outcomes research (HEOR) and their use in healthcare decision making. In this report, we discuss the criteria, development, and evaluation/consensus review and approval process for initiating a task force. The rationale for a task force must include a justification, including why this good practice guidance is important and its potential impact on the scientific community. The criteria include: (1) necessity (why is this task force required?); (2) a methodology-oriented focus (focus on research methods, approaches, analysis, interpretation, and dissemination); (3) relevance (to ISPOR's mission and its members); (4) durability over time; (5) broad applicability; and (6) an evidence-based approach. In addition, the proposal must be a priority specifically for ISPOR. These reports are valuable to researchers, academics, students, health technology assessors, medical technology developers and service providers, those working in other commercial entities, regulators, and payers. These stakeholder perspectives are represented in task force membership to ensure the report's overall usefulness and relevance to the global ISPOR membership. We hope that this discussion will bring transparency to the process of initiating, approving, and producing these task force reports and encourage participation from a diverse range of experts within and outside ISPOR.
Subject(s)
Advisory Committees , Economics, Pharmaceutical , Outcome Assessment, Health Care/standards , Research Report/standards , Evidence-Based Practice , Humans , Internationality , Research Design
ABSTRACT
Low concordance between drug-drug interaction (DDI) knowledge bases is a well-documented concern. One potential cause of inconsistency is variability between drug experts in approach to assessing evidence about potential DDIs. In this study, we examined the face validity and inter-rater reliability of a novel DDI evidence evaluation instrument designed to be simple and easy to use. METHODS: A convenience sample of participants with professional experience evaluating DDI evidence was recruited. Participants independently evaluated pre-selected evidence items for 5 drug pairs using the new instrument. For each drug pair, participants labeled each evidence item as sufficient or insufficient to establish the existence of a DDI based on the evidence categories provided by the instrument. Participants also decided if the overall body of evidence supported a DDI involving the drug pair. Agreement was computed both at the evidence item and drug pair levels. A cut-off of ≥ 70% was chosen as the agreement threshold for percent agreement, while a coefficient > 0.6 was used as the cut-off for chance-corrected agreement. Open ended comments were collected and coded to identify themes related to the participants' experience using the novel approach. RESULTS: The face validity of the new instrument was established by two rounds of evaluation involving a total of 6 experts. Fifteen experts agreed to participate in the reliability assessment, and 14 completed the study. Participant agreement on the sufficiency of 22 of the 34 evidence items (65%) did not exceed the a priori agreement threshold. Similarly, agreement on the sufficiency of evidence for 3 of the 5 drug pairs (60%) was poor. Chance-corrected agreement at the drug pair level further confirmed the poor interrater reliability of the instrument (Gwet's AC1 = 0.24, Conger's Kappa = 0.24). 
Participant comments suggested several possible reasons for the disagreements, including unaddressed subjectivity in assessing an evidence item's type and study design, an infeasible separation of evidence evaluation from the consideration of clinical relevance, and potential issues related to the evaluation of DDI case reports. CONCLUSIONS: Even though the key findings were negative, the study's results shed light on how experts approach DDI evidence assessment, including the importance of situating evidence assessment within the context of clinical relevance. Analysis of participant comments in the context of the negative findings identified several promising future research directions, including: novel computer-based support for evidence assessment; formal evaluation of a more comprehensive evidence assessment approach that requires consideration of specific, explicitly stated clinical consequences; and more formal investigation of DDI case report assessment instruments.
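For reference, the chance-corrected agreement statistic reported above (Gwet's AC1) has a simple closed form for two raters and two categories; the ratings below are hypothetical, not the study's data.

```python
def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 for two raters and two categories (1 = sufficient,
    0 = insufficient). Chance agreement uses the binary-case form
    Pe = 2 * pi * (1 - pi), where pi is the mean prevalence of '1'."""
    n = len(ratings_a)
    pa = sum(x == y for x, y in zip(ratings_a, ratings_b)) / n
    pi = (sum(ratings_a) + sum(ratings_b)) / (2 * n)
    pe = 2 * pi * (1 - pi)
    return (pa - pe) / (1 - pe)

# Hypothetical sufficiency judgments on ten evidence items by two raters:
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
b = [1, 0, 0, 1, 1, 1, 0, 0, 1, 0]
print(round(gwet_ac1(a, b), 2))  # 0.41
```

Values below the study's 0.6 threshold, like this toy result and the reported AC1 of 0.24, indicate agreement only modestly better than chance.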
Subject(s)
Pharmaceutical Preparations , Drug Interactions , Humans , Reproducibility of Results
ABSTRACT
BACKGROUND: Preferences for health states for Duchenne muscular dystrophy (DMD) are necessary to assess costs and benefits of novel therapies. Because DMD progression begins in childhood, the impact of DMD on health-related quality of life (HRQoL) affects preferences of both DMD patients and their families. The objective of this review was to synthesize published evidence for health state utility from the DMD patient and caregiver perspectives. METHODS: A systematic review was performed using MEDLINE and Embase, according to best practices. Data were extracted from studies reporting DMD patient or caregiver utilities, including study and patient characteristics, health states considered, and utility estimates. Quality appraisal of studies was performed. RESULTS: From 888 abstracts, eight publications describing five studies were identified. DMD utility estimates were from preference-based measures, presented stratified by ambulatory status, ventilation, and age. Patient (or patient-proxy) utility estimates ranged from 0.75 (early ambulatory DMD) to 0.05 (day-and-night ventilation). Caregiver utilities ranged from 0.87 (for caregivers of adults with DMD) to 0.71 (for caregivers of predominantly childhood patients). Both patient and caregiver utilities trended lower with higher disease severity. Variability in utilities was observed based on instrument, respondent type, and country. Utility estimates for health states within non-ambulatory DMD are underreported, and no utilities were identified for DMD-related health states such as scoliosis or preserved upper limb function. CONCLUSION: Published health state utilities document the substantial HRQoL impacts of DMD, particularly with disease progression. Additional research on patient utilities for additional health states, particularly in non-ambulatory DMD patients, is warranted.
Subject(s)
Health Services/standards, Muscular Dystrophy, Duchenne/therapy, Quality of Life/psychology, Adolescent, Adult, Child, Female, Humans, Male
ABSTRACT
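The stratified utility estimates above feed directly into QALY calculations: each health state's utility is weighted by the time spent in that state. A minimal sketch, using the abstract's reported utility values but hypothetical durations (the years per state below are illustrative placeholders, not study data):

```python
# Hedged illustration: how stratified health-state utilities combine into a
# QALY total. Utilities (0.75, 0.05) are the abstract's reported estimates;
# the years spent in each state are hypothetical placeholders.

health_states = [
    # (state name, utility, hypothetical years in state)
    ("early ambulatory", 0.75, 5.0),
    ("day-and-night ventilation", 0.05, 2.0),
]

qalys = sum(utility * years for _, utility, years in health_states)
print(qalys)  # → 3.85
```

This also shows why low-utility states contribute little to a QALY total: two years on day-and-night ventilation adds only 0.1 QALYs here, the dynamic underlying the threshold-price concerns raised elsewhere in this collection.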
BACKGROUND: Clinical decision support (CDS) design best practices are intended to provide a narrative representation of factors that influence the success of CDS tools. However, they provide incomplete direction on evidence-based implementation principles. OBJECTIVE: This study aims to describe an integrated approach toward applying an existing implementation science (IS) framework with CDS design best practices to improve the effectiveness, sustainability, and reproducibility of CDS implementations. METHODS: We selected the Practical Robust Implementation and Sustainability Model (PRISM) IS framework. We identified areas where PRISM and CDS design best practices complemented each other and defined methods to address each. Lessons learned from applying these methods were then used to further refine the integrated approach. RESULTS: Our integrated approach to applying PRISM with CDS design best practices consists of 5 key phases that iteratively interact and inform each other: multilevel stakeholder engagement, designing the CDS, design and usability testing, thoughtful deployment, and performance evaluation and maintenance. The approach is led by a dedicated implementation team that includes clinical informatics and analyst builder expertise. CONCLUSIONS: Integrating PRISM with CDS design best practices extends user-centered design and accounts for the multilevel, interacting, and dynamic factors that influence CDS implementation in health care. This integration also synthesizes the many known contextual factors that can influence the success of CDS tools, thereby enhancing the reproducibility and sustainability of CDS implementations. Others can adapt this approach to their situation to maximize and sustain CDS implementation success.
Subject(s)
Decision Support Systems, Clinical/standards, Implementation Science, Humans, Reproducibility of Results
ABSTRACT
OBJECTIVE: To quantify the extent and identify predictors of potentially inappropriate antidepressant use among older adults with dementia and newly diagnosed major depressive disorders (MDD). METHODS: This retrospective cohort study included older adults (aged ≥65 years) with dementia and newly diagnosed MDD using Medicare 5% sample claims data (2012-2013). Based on Healthcare Effectiveness Data and Information Set guidelines, the intake period for new antidepressant medication use was from May 1, 2012, through April 30, 2013. The index prescription start date was the first date of an antidepressant prescription claim during the intake period. The dependent variable of this study was potentially inappropriate antidepressant use as defined by the Beers Criteria and the Screening Tool of Older Persons' potentially inappropriate Prescriptions criteria. The authors conducted multiple logistic regression analysis to identify individual-level predictors of potentially inappropriate antidepressant use. RESULTS: The authors' final study sample consisted of 7,625 older adults with dementia and newly diagnosed MDD, among which 7.59% (N = 579) initiated treatment with a potentially inappropriate antidepressant. Paroxetine (N = 394) was the most commonly initiated potentially inappropriate antidepressant, followed by amitriptyline (N = 104), nortriptyline (N = 35), and doxepin (N = 32). Initiation of a potentially inappropriate antidepressant was associated with age and baseline use of anxiolytic medications. CONCLUSION: More than 7% of older adults in the study sample initiated a potentially inappropriate antidepressant, and the authors identified a few individual-level factors significantly associated with it. Appropriately tailored interventions to address modifiable and nonmodifiable factors significantly associated with potentially inappropriate antidepressant prescribing are required to minimize risks in this vulnerable population.
Subject(s)
Antidepressive Agents/therapeutic use, Dementia/drug therapy, Depressive Disorder, Major/drug therapy, Inappropriate Prescribing/statistics & numerical data, Potentially Inappropriate Medication List, Aged, Aged, 80 and over, Female, Humans, Male, Medicare/statistics & numerical data, Retrospective Studies, United States
ABSTRACT
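The flagging step in the study above is, at its core, a set membership check against a criteria-derived drug list. A minimal sketch (not the study's actual code; the claim structure and field names are hypothetical, and the drug set is limited to the four agents the abstract reports):

```python
# Hypothetical sketch: flag index antidepressant prescriptions as potentially
# inappropriate by matching against a Beers/STOPP-derived drug list. Only the
# four drugs named in the abstract are included here.

POTENTIALLY_INAPPROPRIATE = {"paroxetine", "amitriptyline", "nortriptyline", "doxepin"}

def inappropriate_share(rx_claims):
    """Return the proportion of index prescriptions flagged as potentially inappropriate."""
    flagged = [rx for rx in rx_claims if rx["drug"].lower() in POTENTIALLY_INAPPROPRIATE]
    return len(flagged) / len(rx_claims)

# Toy cohort mirroring the reported counts (579 flagged of 7,625 total)
claims = [{"drug": "sertraline"}] * 7046 + [{"drug": "paroxetine"}] * 579
print(round(100 * inappropriate_share(claims), 2))  # → 7.59
```

In the actual study this binary flag would then serve as the dependent variable in the multiple logistic regression on individual-level predictors.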
OBJECTIVES: To evaluate expenditures and sources of payment for prescription drugs in the United States from 1997 to 2015. METHODS: The Medical Expenditures Panel Survey (MEPS) was used for this analysis. Individuals with one or more prescription medicines were eligible for inclusion. Outcomes were the inflation-adjusted cost per prescription across all payment sources (self or family, public, private, and other sources) before and after the Medicare Part D benefit and the Affordable Care Act. RESULTS: The cost per prescription increased from $38.56 in 1997 to $73.34 in 2015. Nevertheless, consumers' out-of-pocket expenditures decreased from $18.19 to $9.61, whereas public program expenditures per prescription increased from $5.61 to $34.43 over this time. Out-of-pocket expenditure shares for individuals in the low-income and near-poor groups showed larger declines (from 51.4% to 20.4% and from 46.5% to 17.2%, respectively) than those for individuals in higher-income groups before and after implementation of Medicare Part D. Over 90% of prescription purchases were covered by medical insurance by 2015. The per-prescription cost for medications consumed by uninsured individuals increased at a lower rate, from $31.83 to $54.96, versus $40.12 to $75.58 for the privately insured and $36.00 to $70.96 for the publicly insured (P < .001). CONCLUSIONS: Prescription drug expenditures have increased over the past 2 decades, but public sources now pay for a growing proportion of prescription drug costs regardless of health insurance coverage or income level. Out-of-pocket expenditures have significantly decreased for persons with lower incomes since the implementation of Medicare Part D and the Affordable Care Act.
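The per-prescription figures above imply a sharp fall in the consumer's share of each prescription's cost. A quick arithmetic check, assuming the share is simply out-of-pocket cost divided by total cost per prescription (the abstract does not state its exact share definition):

```python
# Quick check of the implied out-of-pocket share per prescription, using the
# overall-population figures reported in the abstract. The share definition
# (out-of-pocket / total cost per prescription) is an assumption.

def oop_share(out_of_pocket: float, total: float) -> float:
    return out_of_pocket / total

share_1997 = oop_share(18.19, 38.56)
share_2015 = oop_share(9.61, 73.34)
print(f"1997: {share_1997:.1%}, 2015: {share_2015:.1%}")  # → 1997: 47.2%, 2015: 13.1%
```

Under this assumption, the overall consumer share fell from roughly 47% to 13%, consistent with the larger stratum-specific declines the abstract reports for low-income and near-poor groups.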