ABSTRACT
BACKGROUND: The opioid crisis highlights the need to increase access to naloxone, possibly through regulatory approval for over-the-counter sales. To address industry-perceived barriers to such access, the Food and Drug Administration (FDA) developed a model drug facts label for such sales and assessed whether consumers understood its key statements for safe and effective use. METHODS: In this label-comprehension study, we conducted individual structured interviews with 710 adults and adolescents, including 430 adults who use opioids and their family and friends. Eight primary end points were developed to assess user comprehension of each of the key steps in the label. Each of these end points included a prespecified target threshold ranging from 80 to 90% that was evaluated by comparing the lower boundary of the exact 95% confidence interval against the threshold. RESULTS: The results for performance on six primary end points met or exceeded thresholds, including the steps "Check for a suspected overdose" (threshold, 85%; point estimate [PE], 95.8%; 95% confidence interval [CI], 94.0 to 97.1) and "Give the first dose" (threshold, 85%; PE, 98.2%; 95% CI, 96.9 to 99.0). The lower boundaries for four other primary end points ranged from 88.8 to 94.0%. One exception was comprehension of "Call 911 immediately," which closely approximated the target of 90% (PE, 90.3%; 95% CI, 87.9 to 92.4). The other exception was comprehension of the composite step of "Check, give, and call 911 immediately" (threshold, 85%; PE, 81.1%; 95% CI, 78.0 to 83.9). CONCLUSIONS: Consumers met thresholds for sufficient understanding of six of eight components of the instructions in the drug facts label for naloxone use and came close on the two others. Overall, the FDA found that the model label was adequate for use in the development of a naloxone product intended for over-the-counter sales.
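The threshold rule described in the methods reduces to a one-sided check of an exact (Clopper-Pearson) binomial confidence bound. Below is a minimal Python sketch of that rule, not the study's actual analysis code; the correct-response count is hypothetical, chosen only to land near the reported 98.2% point estimate for "Give the first dose".

```python
# Hedged sketch: exact (Clopper-Pearson) CI for a comprehension proportion,
# compared against a prespecified threshold. Counts are hypothetical.
from scipy.stats import binomtest

n_participants = 710   # interviewed sample size (from the abstract)
n_correct = 697        # hypothetical count, ~98.2% point estimate
threshold = 0.85       # prespecified target for this end point

result = binomtest(n_correct, n_participants)
ci = result.proportion_ci(confidence_level=0.95, method="exact")
print(f"PE={result.statistic:.3f}, 95% CI=({ci.low:.3f}, {ci.high:.3f})")
print("meets threshold:", ci.low >= threshold)
```

An end point passes when the lower CI bound is at or above its threshold, which is why "Call 911 immediately" (lower bound 87.9% vs a 90% target) narrowly missed despite a 90.3% point estimate.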
Subject(s)
Analgesics, Opioid/poisoning , Comprehension , Drug Labeling , Drug Overdose/drug therapy , Naloxone/therapeutic use , Narcotic Antagonists/therapeutic use , Nonprescription Drugs/therapeutic use , Adolescent , Adult , Drug Labeling/legislation & jurisprudence , Drug Overdose/therapy , Government Regulation , Humans , Interviews as Topic , United States , United States Food and Drug Administration
ABSTRACT
BACKGROUND: Current methods of burn estimation can lead to incorrect estimates of the total body surface area (TBSA) burned, especially among injured children. Inaccurate estimation of burn size can impact initial management, including unnecessary transfer to burn centres and fluid overload during resuscitation. To address these challenges, we developed a smartphone application (EasyTBSA) that calculates the TBSA of a burn using a body-part-by-body-part approach. The aims of this study were to assess the accuracy of the EasyTBSA application and compare its performance to three established methods of burn size estimation (Lund-Browder Chart, Rule of Nines and Rule of Palms). METHODS: Twenty-four healthcare providers used each method to estimate burn sizes on moulaged manikins. The manikins represented different ages (infant, child and adult) with different TBSA burns (small <20%, medium 20%-49% and large >49%). We calculated the accuracy of each method as the difference between the user-estimated and actual TBSA. The true value of the complete body surface area of the manikins was obtained from three-dimensional scans. We used multivariable modelling to control for manikin size and method. RESULTS: Among all age groups and burn sizes, the EasyTBSA application had the greatest accuracy for burn size estimation (-0.01%, SD 3.59%), followed by the Rule of Palms (3.92%, SD 10.71%), the Lund-Browder Chart (4.42%, SD 5.52%) and the Rule of Nines (5.05%, SD 6.87%). CONCLUSIONS: The EasyTBSA application may improve the estimation of TBSA compared with existing methods.
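A body-part-by-body-part calculation of this kind is just a weighted sum: each part contributes its percentage of total body surface area multiplied by the fraction of that part burned. The sketch below illustrates the idea with adult Lund-Browder-style percentages; these values and the function are illustrative assumptions, not the EasyTBSA app's actual tables or code.

```python
# Illustrative body-part-by-body-part TBSA calculation. The surface-area
# percentages are adult Lund-Browder-style values for illustration only.
ADULT_BSA_PERCENT = {
    "head": 7.0, "neck": 2.0, "anterior_trunk": 13.0, "posterior_trunk": 13.0,
    "right_arm": 4.0, "left_arm": 4.0, "right_forearm": 3.0, "left_forearm": 3.0,
    "right_hand": 2.5, "left_hand": 2.5, "buttocks": 5.0, "genitalia": 1.0,
    "right_thigh": 9.5, "left_thigh": 9.5, "right_leg": 7.0, "left_leg": 7.0,
    "right_foot": 3.5, "left_foot": 3.5,
}  # sums to 100%

def estimate_tbsa(burned_fraction_by_part):
    """Sum each part's contribution: part %BSA x fraction of that part burned."""
    return sum(ADULT_BSA_PERCENT[part] * frac
               for part, frac in burned_fraction_by_part.items())

# Example: half of the anterior trunk and the entire left forearm burned.
print(estimate_tbsa({"anterior_trunk": 0.5, "left_forearm": 1.0}))  # 9.5 (%TBSA)
```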
Subject(s)
Burns , Child , Adult , Infant , Humans , Body Surface Area , Burns/therapy , Burn Units , Resuscitation/methods , Health Personnel
ABSTRACT
Argininosuccinate lyase (ASL) is essential for the NO-dependent regulation of tyrosine hydroxylase (TH) and thus for catecholamine production. Using a conditional mouse model with loss of ASL in catecholamine neurons, we demonstrate that ASL is expressed in dopaminergic neurons in the substantia nigra pars compacta, including the ALDH1A1+ subpopulation that is pivotal for the pathogenesis of Parkinson disease (PD). Neuronal loss of ASL results in catecholamine deficiency, in accumulation and formation of tyrosine aggregates, in elevation of α-synuclein, and phenotypically in motor and cognitive deficits. NO supplementation rescues the formation of aggregates as well as the motor deficiencies. Our data point to a potential metabolic link between accumulation of tyrosine and seeding of pathological aggregates in neurons as initiators of the pathological processes involved in neurodegeneration. Hence, interventions in tyrosine metabolism via regulation of NO levels may be therapeutically beneficial for the treatment of catecholamine-related neurodegenerative disorders.
Subject(s)
Aldehyde Dehydrogenase 1 Family/genetics , Aldehyde Dehydrogenase 1 Family/metabolism , Argininosuccinate Lyase/genetics , Argininosuccinate Lyase/metabolism , Dopaminergic Neurons/metabolism , Parkinson Disease/genetics , Parkinson Disease/metabolism , Animals , Disease Models, Animal , Humans , Mice , Phenotype , Retinal Dehydrogenase/genetics , Retinal Dehydrogenase/metabolism
ABSTRACT
BACKGROUND: Urea cycle disorders (UCDs) are among the most common inborn errors of liver metabolism. As therapies for hyperammonemia associated with urea cycle dysfunction have improved, chronic complications, such as liver disease, have become increasingly apparent in individuals with UCDs. Liver disease in UCDs may be associated with hepatic inflammation, hepatic fibrosis, portal hypertension, liver cancer and even liver failure. However, except for monitoring serum aminotransferases, there are no clear guidelines for screening and/or monitoring individuals with UCDs for liver disease. Thus, we systematically evaluated the potential utility of several non-invasive biomarkers for liver fibrosis in UCDs. METHODS: We evaluated grey-scale ultrasonography, liver stiffness obtained from shear wave elastography (SWE), and various serum biomarkers for hepatic fibrosis and necroinflammation, in a cohort of 28 children and adults with various UCDs. RESULTS: Overall, we demonstrate a high burden of liver disease in our participants: 46% had an abnormal grey-scale ultrasound pattern of the liver parenchyma, and 52% had increased liver stiffness. The analysis of serum biomarkers revealed that 32% of participants had an elevated FibroTest™ score, a marker for hepatic fibrosis, and 25% of participants had an increased ActiTest™ score, a marker for necroinflammation. Interestingly, liver stiffness did not correlate with ultrasound appearance or FibroTest™. CONCLUSION: Our results demonstrate the high overall burden of liver disease in UCDs and highlight the need for further studies exploring new tools for identifying and monitoring individuals with UCDs who are at risk for this complication. TRIAL REGISTRATION: This study has been registered in ClinicalTrials.gov (NCT03721367).
Subject(s)
Argininosuccinate Lyase/blood , Genetic Diseases, Inborn/blood , Liver Cirrhosis/blood , Liver Diseases/blood , Urea Cycle Disorders, Inborn/blood , Adolescent , Adult , Biomarkers/blood , Child , Child, Preschool , Elasticity Imaging Techniques , Female , Genetic Diseases, Inborn/diagnostic imaging , Genetic Diseases, Inborn/genetics , Genetic Diseases, Inborn/pathology , Humans , Hyperammonemia/blood , Hyperammonemia/genetics , Hyperammonemia/metabolism , Hyperammonemia/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Cirrhosis/diagnostic imaging , Liver Cirrhosis/genetics , Liver Cirrhosis/pathology , Liver Diseases/genetics , Liver Diseases/metabolism , Liver Diseases/pathology , Male , Metabolism, Inborn Errors/genetics , Middle Aged , Ultrasonography , Urea Cycle Disorders, Inborn/genetics , Urea Cycle Disorders, Inborn/metabolism , Urea Cycle Disorders, Inborn/pathology , Young Adult
ABSTRACT
BACKGROUND: Standard treatment for both uncomplicated and severe malaria is artemisinin derivatives. Delayed parasite clearance times preceded the appearance of artemisinin treatment failures in Southeast Asia. Most worldwide malaria cases are in sub-Saharan Africa (SSA), where clinically significant artemisinin resistance or treatment failure has not yet been detected. The recent emergence of a resistance-conferring genetic mutation in the Plasmodium falciparum parasite in Africa warrants continued monitoring throughout the continent. METHODS: An analysis was performed on data from a retrospective cohort study of Malawian children with cerebral malaria admitted between 2010 and 2019 to a public referral hospital, ascertaining parasite clearance times across years. Data were collected from patients treated for severe malaria with quinine or artesunate, an artemisinin derivative. Parasite density was determined at admission and every 6 h thereafter until parasitaemia was below 1000 parasites/µl. The median parasite clearance time in all children admitted in any one year was compared to the parasite clearance time in 2014, the first year of artesunate use in Malawi. RESULTS: The median population parasite clearance time was slower from 2010 to 2013 (quinine-treated patients) compared to 2014, the first year of artesunate use in Malawi (30 h (95% CI: 30-30) vs 18 h (95% CI: 18-24)). After adjustment for admission parasite count, there was no statistically significant difference in the median population parasite clearance time when comparing 2014 with any subsequent year. CONCLUSION: Malaria parasite clearance times in Malawian children with cerebral malaria remained constant between 2014 and 2019, arguing against evolving artemisinin resistance in parasites in this region.
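Given 6-hourly densities, a clearance time can be read off as the first sampling time at which parasitaemia falls below the 1000 parasites/µl cutoff. The sketch below (hypothetical series, assumed data layout) shows that derivation.

```python
# A minimal sketch of deriving a parasite clearance time from serial counts:
# the clearance time is taken as the first 6-hourly sampling time with a
# density below 1000 parasites/µl. The series below is hypothetical.
def clearance_time(counts_by_hour, threshold=1000):
    """Return the first sampling hour with a count below threshold, else None."""
    for hour, count in sorted(counts_by_hour.items()):
        if count < threshold:
            return hour
    return None  # never cleared during observation

series = {0: 250_000, 6: 90_000, 12: 20_000, 18: 4_000, 24: 600}
print(clearance_time(series))  # 24
```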
Subject(s)
Antimalarials/therapeutic use , Artesunate/therapeutic use , Malaria, Cerebral/parasitology , Malaria, Falciparum/parasitology , Plasmodium falciparum/drug effects , Quinine/therapeutic use , Adolescent , Antimalarials/pharmacology , Artesunate/pharmacology , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Malaria, Cerebral/drug therapy , Malaria, Falciparum/drug therapy , Malawi , Male , Quinine/pharmacology , Retrospective Studies , Time Factors
ABSTRACT
STUDY OBJECTIVE: During the COVID-19 pandemic, health care workers have had the highest risk of infection among essential workers. Although personal protective equipment (PPE) use is associated with lower infection rates, appropriate use of PPE has been variable among health care workers, even in settings with COVID-19 patients. We aimed to evaluate patterns of PPE adherence during emergency department resuscitations that included aerosol-generating procedures. METHODS: We conducted a retrospective, video-based review of pediatric resuscitations involving one or more aerosol-generating procedures during the first 3 months of the COVID-19 pandemic in the United States (March to June 2020). Adherence (complete, inadequate, absent) with the recommended use of 5 PPE items (headwear, eyewear, masks, gowns, gloves) and the duration of potential exposure were evaluated for individuals in the room after aerosol-generating procedure initiation. RESULTS: Among the 345 health care workers observed during 19 resuscitations, 306 (88.7%) were nonadherent (inadequate or absent adherence) with the recommended use of at least 1 PPE type at some time during the resuscitation, and 23 (6.7%) had no PPE at all. One hundred and forty health care workers (40.6%) altered or removed at least 1 type of PPE during the event. The aggregate time in the resuscitation room for health care workers across all events was 118.7 hours. During this time, providers had either absent or inadequate eyewear for 46.4 hours (39.1%) and absent or inadequate masks for 35.2 hours (29.7%). CONCLUSION: Full adherence with recommended PPE use was limited in a setting at increased risk for SARS-CoV-2 virus aerosolization. In addition to ensuring appropriate donning, approaches are needed for ensuring ongoing adherence with PPE recommendations during exposure.
Subject(s)
COVID-19/prevention & control , Emergency Service, Hospital/standards , Guideline Adherence , Infection Control/standards , Pandemics , Personal Protective Equipment/standards , Resuscitation , COVID-19/epidemiology , COVID-19/transmission , Child , Hospitals, Pediatric , Humans , Infection Control/methods , Patient Care Team/standards , Practice Guidelines as Topic , Retrospective Studies , SARS-CoV-2
ABSTRACT
PURPOSE: Erythropoiesis-stimulating agents (ESAs), indicated for treating some patients with chemotherapy-induced anemia (CIA), may increase the risk of tumor progression and mortality. The FDA required a Risk Evaluation and Mitigation Strategy (REMS) to mitigate these risks. We assessed the impact of the REMS on ESA administration and red blood cell (RBC) transfusion as surrogate metrics for REMS effectiveness. METHODS: Retrospective cohort study including data from January 1, 2006 to December 31, 2018 for beneficiaries ≥65 years enrolled in Centers for Medicare & Medicaid Services (CMS) Medicare Parts A/B with a cancer diagnosis; patients with other indications for ESA use were excluded. Study time was divided into five periods demarcated by issuance of the CMS National Coverage Determination (NCD) (pre-NCD, pre-REMS) and REMS milestones (Grace Period, REMS, post-REMS). Study outcomes were the monthly proportions of chemotherapy episodes (CTEs) with concomitant ESA administration, with post-CTE ESA administration, and with RBC transfusions. RESULTS: Of 1 778 855 beneficiaries treated with chemotherapy, 308 742 received concomitant ESAs for CIA. The proportion of CTEs with concomitant and post-CTE ESA administration decreased during the pre-REMS period (by 9.0 percentage points [pp] and 3.5 pp, respectively). There were no significant post-REMS changes in the proportion of CTEs with concomitant (0.0 pp) or post-CTE ESA administration (0.1 pp). Fluctuation in RBC transfusions was <4 pp throughout the study period. CONCLUSIONS: Medicare beneficiaries showed a substantive decrease in ESA administration after the NCD, with minimal impact from the REMS and its removal. Small changes in RBC transfusion over the study period were likely due to a national secular trend.
Subject(s)
Anemia , Antineoplastic Agents , Hematinics , Aged , Anemia/chemically induced , Anemia/drug therapy , Anemia/epidemiology , Antineoplastic Agents/adverse effects , Blood Transfusion , Erythropoiesis , Hematinics/adverse effects , Humans , Medicare , Retrospective Studies , Risk Evaluation and Mitigation , United States/epidemiology
ABSTRACT
BACKGROUND: Epidemiological study reporting is improving but is not transparent enough for easy evaluation or replication. One barrier is insufficient detail about design elements in published studies. METHODS: Using a previously conducted drug safety evaluation in claims data as a test case, we investigated the impact of small changes in five key design elements on risk estimation. These elements were the index day of incident exposure (which anchors the look-back and follow-up periods), the exposure duration algorithm, heparin exposure exclusion, propensity score model variables, and Cox proportional hazards model stratification. We covaried these elements using a fractional factorial design, resulting in 24 risk estimates for one outcome. We repeated eight of these combinations for two additional outcomes. We measured design effects on cohort sizes, follow-up time, and risk estimates. RESULTS: Small changes in the specifications of the index day and the exposure algorithm affected the risk estimation process the most. They affected cohort size on average by 8 to 10%, follow-up time by up to 31%, and the magnitude of log hazard ratios by up to 0.22. Other elements affected the cohort before matching or the precision of the risk estimate, but not its magnitude. Any change in design substantially altered the matched control-group subjects in 1:1 matching. CONCLUSIONS: Exposure-related design elements require attention from investigators initiating, evaluating, or wishing to replicate a study, and from analysts standardizing definitions. The methods we developed, using a factorial design and mapping design effects onto the causal estimation process, are applicable to the planning of sensitivity analyses in similar studies.
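To make "covaried these elements" concrete, the sketch below enumerates combinations of binary design choices and feeds each combination to one run of an estimation pipeline. The element names follow the abstract; the two levels per element and the analysis function are placeholders, and the study's specific fractional subset (24 of the possible combinations) is not reproduced here.

```python
# Schematic enumeration of design-choice combinations, one analysis run each.
# Element names follow the abstract; levels and analysis are placeholders.
from itertools import product

ELEMENTS = {
    "index_day_rule": ["first_claim", "day_after_first_claim"],
    "exposure_duration_algorithm": ["days_supply", "days_supply_plus_grace"],
    "exclude_heparin_exposure": [False, True],
    "ps_model_variables": ["parsimonious", "expanded"],
    "cox_stratification": ["unstratified", "stratified_by_match"],
}

def estimate_hazard_ratio(design):
    """Placeholder for the full cohort-building and estimation pipeline."""
    return f"HR for {design}"

names = list(ELEMENTS)
for combo in product(*(ELEMENTS[n] for n in names)):
    # Full 2^5 grid = 32 runs; a fractional design analyzes a chosen subset.
    design = dict(zip(names, combo))
    estimate_hazard_ratio(design)
```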
Subject(s)
Cohort Studies , Incidence , Insurance Claim Review/statistics & numerical data , Pharmacoepidemiology/statistics & numerical data , Research Design , Risk , Humans
ABSTRACT
Examining medical products' benefits and risks in different population subsets is often necessary for informing public health decisions. In observational cohort studies, safety analyses in pre-specified subgroups can be adequately powered and are informative about different population subsets' risks if the study designs or analyses adequately control for confounding. However, few guidelines exist on how to simultaneously control for confounding and conduct subgroup analyses. In this simulation study, we evaluated the performance, in terms of bias, efficiency and coverage, of six propensity score methods in 24 scenarios, estimating subgroup-specific hazard ratios for the average treatment effect in the treated with Cox regression models. The subgroup analysis methods control for confounding either by propensity score matching or by inverse probability of treatment weighting. These methods vary as to whether they subset information or borrow it across subgroups to estimate the propensity score. Simulation scenarios varied by size of subgroup, strength of association of subgroup with exposure, strength of association of subgroup with outcome (simulated survival), and outcome incidence. Results indicated that subsetting the data by the subgrouping variable, to estimate both the propensity score and the hazard ratio, has the smallest bias, a benefit that far exceeds any penalty in precision. Moreover, weighting methods pay a heavier price in bias than matching methods do when the propensity score model is misspecified and the subgrouping variable is a strong confounder.
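The subset-versus-borrow distinction can be sketched as two propensity score estimators: one fit separately within each subgroup, one fit once on the pooled data with the subgrouping variable as a covariate. Column names (`treated`, `subgroup`) and the logistic model are illustrative assumptions, not the simulation study's code.

```python
# Schematic contrast between "subset" and "borrow" propensity score fitting.
# Assumes a binary `treated` column and a numerically encoded `subgroup` column.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def ps_subset(df, confounders):
    """Fit a separate propensity score model inside each subgroup."""
    out = pd.Series(index=df.index, dtype=float)
    for _, idx in df.groupby("subgroup").groups.items():
        sub = df.loc[idx]
        model = LogisticRegression(max_iter=1000).fit(sub[confounders], sub["treated"])
        out.loc[idx] = model.predict_proba(sub[confounders])[:, 1]
    return out

def ps_pooled(df, confounders):
    """Fit one propensity score model, borrowing information across subgroups."""
    features = confounders + ["subgroup"]
    model = LogisticRegression(max_iter=1000).fit(df[features], df["treated"])
    return model.predict_proba(df[features])[:, 1]
```

Either score can then feed matching or weighting before the subgroup-specific Cox model; the abstract's finding favors the subset approach when the subgrouping variable strongly confounds.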
Subject(s)
Research Design/statistics & numerical data , Survival Analysis , Computer Simulation , Data Interpretation, Statistical , Humans , Models, Statistical , Propensity Score , Risk Assessment , Risk Factors
ABSTRACT
Case-crossover study designs are observational studies used to assess postmarket safety of medical products (eg, vaccines or drugs). As a case-crossover study is self-controlled, its advantages include better control for confounding, because the design controls for any time-invariant measured and unmeasured confounding, and potentially greater feasibility, as only data from those experiencing an event (or cases) are required. However, self-matching also introduces correlation between case and control periods within a subject or matched unit. To estimate sample size in a case-crossover study, investigators currently use Dupont's formula (Biometrics 1988; 44:1157-1168), which was originally developed for a matched case-control study. This formula is relevant because it takes into account the correlation in exposure between controls and cases, which is expected to be high in self-controlled studies. However, in our study, we show that Dupont's formula and other currently used methods to determine sample size for case-crossover studies may be inadequate. Specifically, these formulas tend to underestimate the true required sample size, determined through simulations, for a range of values in the parameter space. We present mathematical derivations to explain where some currently used methods fail and propose two new sample size estimation methods that provide a more accurate estimate of the true required sample size.
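For context on the style of closed-form calculation at issue, the sketch below implements a standard matched-pairs (McNemar-type) approximation in the Schlesselman style. It is a simplified stand-in, not Dupont's full formula and not the authors' proposed corrections, and per the abstract such formulas can understate the true requirement for case-crossover designs.

```python
# Simplified matched-pairs sample size approximation (Schlesselman-style),
# NOT Dupont's full formula and NOT the paper's proposed methods.
from math import sqrt
from scipy.stats import norm

def matched_pairs_n(psi, p_discordant, alpha=0.05, power=0.8):
    """Approximate matched pairs needed to detect odds ratio `psi`,
    given the probability that a pair is exposure-discordant."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p = psi / (1 + psi)  # P(case exposed | pair discordant) under psi
    m = (z_a / 2 + z_b * sqrt(p * (1 - p))) ** 2 / (p - 0.5) ** 2
    return m / p_discordant  # discordant pairs -> total pairs

print(round(matched_pairs_n(psi=2.0, p_discordant=0.3)))  # ~228 pairs
```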
Subject(s)
Cross-Over Studies , Sample Size , Case-Control Studies , Humans , Models, Statistical , Observational Studies as Topic/methods , Proportional Hazards Models
ABSTRACT
PURPOSE: Develop a flexible analytic tool for the Food and Drug Administration's (FDA's) Sentinel System to assess adherence to safe use recommendations, with two capabilities: characterize adherence to patient monitoring recommendations for a drug, and characterize concomitant medication use before, during, and/or after drug therapy. METHODS: We applied the tool in the Sentinel Distributed Database to assess adherence to the labeled recommendation that patients treated with dronedarone undergo electrocardiogram (ECG) testing no less often than every 3 months. Measures of length of treatment, time to first ECG, number of ECGs, and time between ECGs were assessed. We also assessed concomitant use of contraception among female users of mycophenolate per label recommendations (concomitancy 4 weeks before through 6 weeks after discontinuation of mycophenolate). Unadjusted results were stratified by age, month-year, and sex. RESULTS: We identified 21 457 new episodes of dronedarone use of ≥90 days (July 2009 to September 2015); 86% had ≥1 ECG, and 22% met the recommendation of an ECG no less often than every 3 months. We identified 21 942 new episodes of mycophenolate use among females 12 to 55 years (January 2006 to September 2015); 16% had ≥1 day of concomitant contraception dispensed, and 12% had concomitant contraception use for ≥50% of the period from 4 weeks before initiation through 6 weeks after mycophenolate discontinuation; younger females had more concomitancy. These results may be underestimates, as the analyses are limited to claims data. CONCLUSIONS: We developed a tool for use in databases formatted to the Sentinel Common Data Model that can assess adherence to safe use recommendations involving patient monitoring and concomitant drug use over time.
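One reasonable operationalization of the dronedarone monitoring rule, sketched below with hypothetical dates rather than the Sentinel tool's code, checks that no gap between successive ECGs (including from episode start to the first ECG, and from the last ECG to episode end) exceeds 90 days.

```python
# Hedged sketch of an "ECG at least every ~3 months" check over a
# treatment episode. Dates are hypothetical; 90 days stands in for 3 months.
from datetime import date

def meets_ecg_recommendation(episode_start, episode_end, ecg_dates, max_gap=90):
    """True if every gap across the episode, bounded by its start and end,
    is no longer than `max_gap` days."""
    checkpoints = [episode_start] + sorted(ecg_dates) + [episode_end]
    return all((b - a).days <= max_gap
               for a, b in zip(checkpoints, checkpoints[1:]))

print(meets_ecg_recommendation(
    date(2012, 1, 1), date(2012, 12, 31),
    [date(2012, 3, 15), date(2012, 6, 10), date(2012, 9, 1), date(2012, 11, 20)],
))  # True: all gaps are <= 90 days
```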
Subject(s)
Adverse Drug Reaction Reporting Systems/organization & administration , Anti-Arrhythmia Agents/administration & dosage , Dronedarone/administration & dosage , Drug Monitoring/methods , Mycophenolic Acid/administration & dosage , Anti-Arrhythmia Agents/adverse effects , Contraception/statistics & numerical data , Databases, Factual , Dronedarone/adverse effects , Drug Interactions , Electrocardiography , Humans , Medication Adherence , Mycophenolic Acid/adverse effects , United States , United States Food and Drug Administration
ABSTRACT
In a retrospective cohort study of patients enrolled in the UK Clinical Practice Research Datalink during 2000-2013, we evaluated long-term risks of death, stroke, and acute myocardial infarction (AMI) in adults prescribed clarithromycin. Patients were outpatients aged 40-85 years who were prescribed clarithromycin (n = 287,748), doxycycline (n = 267,729), or erythromycin (n = 442,999), or Helicobacter pylori eradication therapy with a proton pump inhibitor, amoxicillin, and either clarithromycin (n = 27,639) or metronidazole (n = 14,863). We analyzed time to death, stroke, or AMI with Cox proportional hazards regression. The long-term hazard ratio for death following 1 clarithromycin versus 1 doxycycline prescription was 1.29 (95% CI: 1.21, 1.25), increasing to 1.62 (95% CI: 1.43, 1.84) for ≥5 prescriptions of clarithromycin versus ≥5 prescriptions of doxycycline. Erythromycin showed smaller risks in comparison with doxycycline. Stroke and AMI incidences were also increased after clarithromycin, but with smaller hazard ratios than for mortality. For H. pylori eradication, the hazard ratio for mortality following clarithromycin versus metronidazole regimens was 1.09 (95% CI: 1.00, 1.18) overall, and it was higher (hazard ratio = 1.65, 95% CI: 0.88, 3.08) following ≥2 prescriptions in subjects not on statins at baseline. Outpatient clarithromycin use was associated with long-term mortality increases, with evidence for a similar, smaller increase with erythromycin.
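The analysis style described, Cox proportional hazards regression on time to an event, looks like the following in Python's `lifelines` package; the bundled demo dataset stands in for the CPRD cohort, which is not public.

```python
# Schematic Cox proportional hazards fit with lifelines. The bundled
# time-to-arrest demo dataset is a stand-in, not the study's data.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()  # columns: duration ("week"), event ("arrest"), covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()  # hazard ratios are exp(coef) for each covariate
```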
Subject(s)
Anti-Bacterial Agents/adverse effects , Clarithromycin/adverse effects , Mortality/trends , Myocardial Infarction/mortality , Stroke/mortality , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Clarithromycin/therapeutic use , Doxycycline/adverse effects , Drug Therapy, Combination , Erythromycin/adverse effects , Female , Helicobacter Infections/drug therapy , Humans , Male , Middle Aged , Proportional Hazards Models , Proton Pump Inhibitors/therapeutic use , Retrospective Studies , Time Factors , United Kingdom
ABSTRACT
The tree-based scan statistic is a statistical data mining tool that has been used for signal detection with a self-controlled design in vaccine safety studies. This disproportionality statistic adjusts for multiple testing in evaluation of thousands of potential adverse events. However, many drug safety questions are not well suited for self-controlled analysis. We propose a method that combines tree-based scan statistics with propensity score-matched analysis of new initiator cohorts, a robust design for investigations of drug safety. We conducted plasmode simulations to evaluate performance. In multiple realistic scenarios, tree-based scan statistics in cohorts that were propensity score matched to adjust for confounding outperformed tree-based scan statistics in unmatched cohorts. In scenarios where confounding moved point estimates away from the null, adjusted analyses recovered the prespecified type 1 error while unadjusted analyses inflated type 1 error. In scenarios where confounding moved point estimates toward the null, adjusted analyses preserved power, whereas unadjusted analyses greatly reduced power. Although complete adjustment of true confounders had the best performance, matching on a moderately mis-specified propensity score substantially improved type 1 error and power compared with no adjustment. When there was true elevation in risk of an adverse event, there were often co-occurring signals for clinically related concepts. TreeScan with propensity score matching shows promise as a method for screening and prioritization of potential adverse events. It should be followed by clinical review and safety studies specifically designed to quantify the magnitude of effect, with confounding control targeted to the outcome of interest.
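As a toy illustration of the statistic itself (not the TreeScan software, which additionally handles the tree structure and Monte Carlo multiplicity adjustment), the Bernoulli form scores each outcome node by a log-likelihood ratio: with 1:1 matching and comparable follow-up, each event falls in the exposed arm with probability 1/2 under the null. Node names and counts below are hypothetical.

```python
# Toy Bernoulli log-likelihood-ratio score for outcome nodes under 1:1
# matching; the highest-scoring node is the strongest candidate signal.
from math import log

def bernoulli_llr(c_exposed, n_total, p0=0.5):
    """LLR for one node; 0 unless the exposed arm has an excess of events."""
    if n_total == 0 or c_exposed / n_total <= p0:
        return 0.0
    p_hat = c_exposed / n_total
    return (c_exposed * log(p_hat / p0)
            + (n_total - c_exposed) * log((1 - p_hat) / (1 - p0)))

# Hypothetical (exposed events, total events) per node of an outcome tree.
nodes = {"GI bleed": (30, 40), "any bleed": (55, 90), "rash": (12, 25)}
best = max(nodes, key=lambda name: bernoulli_llr(*nodes[name]))
print(best, round(bernoulli_llr(*nodes[best]), 2))  # GI bleed 5.23
```

In the full method, significance of the maximum score is judged against its Monte Carlo null distribution, which is what delivers the multiple-testing adjustment across thousands of nodes.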
Subject(s)
Data Mining/methods , Drug-Related Side Effects and Adverse Reactions/epidemiology , Confounding Factors, Epidemiologic , Humans , Propensity Score , Software , Statistics as Topic
ABSTRACT
PURPOSE: The US Food and Drug Administration's Sentinel system developed tools for sequential surveillance. METHODS: In patients with non-valvular atrial fibrillation, we sequentially compared outcomes for new users of rivaroxaban versus warfarin, employing propensity score matching and Cox regression. A total of 36 173 rivaroxaban and 79 520 warfarin initiators were variable-ratio matched within 2 monitoring periods. RESULTS: Statistically significant signals were observed for ischemic stroke (IS) (first period) and intracranial hemorrhage (ICH) (second period) favoring rivaroxaban, and gastrointestinal bleeding (GIB) (second period) favoring warfarin. In follow-up analyses using primary position diagnoses from inpatient encounters for increased definition specificity, the hazard ratios (HR) for rivaroxaban vs warfarin new users were 0.61 (0.47, 0.79) for IS, 1.47 (1.29, 1.67) for GIB, and 0.71 (0.50, 1.01) for ICH. For GIB, the HR varied by age: <66 HR = 0.88 (0.60, 1.30) and 66+ HR = 1.49 (1.30, 1.71). CONCLUSIONS: This study demonstrates the capability of Sentinel to conduct prospective safety monitoring and raises no new concerns about rivaroxaban safety.
Subject(s)
Adverse Drug Reaction Reporting Systems/statistics & numerical data , Factor Xa Inhibitors/adverse effects , Rivaroxaban/adverse effects , United States Food and Drug Administration/statistics & numerical data , Aged , Aged, 80 and over , Atrial Fibrillation/complications , Atrial Fibrillation/drug therapy , Brain Infarction/epidemiology , Brain Infarction/etiology , Brain Infarction/prevention & control , Factor Xa Inhibitors/administration & dosage , Female , Follow-Up Studies , Gastrointestinal Hemorrhage/chemically induced , Gastrointestinal Hemorrhage/epidemiology , Humans , Intracranial Hemorrhages/chemically induced , Intracranial Hemorrhages/epidemiology , Male , Middle Aged , Pilot Projects , Prospective Studies , Rivaroxaban/administration & dosage , United States/epidemiology , Warfarin/administration & dosage , Warfarin/adverse effects
ABSTRACT
BACKGROUND: Dabigatran (150 mg twice daily) has been associated with lower rates of stroke than warfarin in trials of atrial fibrillation, but large-scale evaluations in clinical practice are limited. OBJECTIVE: To compare incidence of stroke, bleeding, and myocardial infarction in patients receiving dabigatran versus warfarin in practice. DESIGN: Retrospective cohort. SETTING: National U.S. Food and Drug Administration Sentinel network. PATIENTS: Adults with atrial fibrillation initiating dabigatran or warfarin therapy between November 2010 and May 2014. MEASUREMENTS: Ischemic stroke, intracranial hemorrhage, extracranial bleeding, and myocardial infarction identified from hospital claims among propensity score-matched patients starting treatment with dabigatran or warfarin. RESULTS: Among 25 289 patients starting dabigatran therapy and 25 289 propensity score-matched patients starting warfarin therapy, those receiving dabigatran did not have significantly different rates of ischemic stroke (0.80 vs. 0.94 events per 100 person-years; hazard ratio [HR], 0.92 [95% CI, 0.65 to 1.28]) or extracranial hemorrhage (2.12 vs. 2.63 events per 100 person-years; HR, 0.89 [CI, 0.72 to 1.09]) but were less likely to have intracranial bleeding (0.39 vs. 0.77 events per 100 person-years; HR, 0.51 [CI, 0.33 to 0.79]) and more likely to have myocardial infarction (0.77 vs. 0.43 events per 100 person-years; HR, 1.88 [CI, 1.22 to 2.90]). However, the strength and significance of the association between dabigatran use and myocardial infarction varied in sensitivity analyses and by exposure definition (HR range, 1.13 [CI, 0.78 to 1.64] to 1.43 [CI, 0.99 to 2.08]). Older patients and those with kidney disease had higher gastrointestinal bleeding rates with dabigatran. LIMITATION: Inability to examine outcomes by dabigatran dose (unacceptable covariate balance between matched patients) or quality of warfarin anticoagulation (few patients receiving warfarin had available international normalized ratio values). CONCLUSION: In matched adults with atrial fibrillation treated in practice, the incidences of stroke and bleeding with dabigatran versus warfarin were consistent with those seen in trials. The possible relationship between dabigatran and myocardial infarction warrants further investigation. PRIMARY FUNDING SOURCE: U.S. Food and Drug Administration.
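The reported rates are event counts divided by person-time; the arithmetic below reproduces the intracranial-bleeding figures using hypothetical person-time denominators (the abstract does not give them), and the crude rate ratio is only an informal approximation to the matched Cox hazard ratio.

```python
# Worked "events per 100 person-years" arithmetic. Person-time values are
# hypothetical, chosen only to reproduce the reported 0.39 and 0.77 rates.
def rate_per_100py(events, person_years):
    return 100 * events / person_years

dabigatran_rate = rate_per_100py(events=78, person_years=20_000)   # 0.39
warfarin_rate = rate_per_100py(events=154, person_years=20_000)    # 0.77
print(dabigatran_rate, warfarin_rate)
print(round(dabigatran_rate / warfarin_rate, 2))  # crude ratio ~0.51
```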
Subject(s)
Anticoagulants/therapeutic use , Antithrombins/therapeutic use , Atrial Fibrillation/complications , Dabigatran/therapeutic use , Warfarin/therapeutic use , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Antithrombins/adverse effects , Dabigatran/adverse effects , Female , Hemorrhage/chemically induced , Hemorrhage/epidemiology , Humans , Incidence , Male , Middle Aged , Myocardial Infarction/epidemiology , Myocardial Infarction/prevention & control , Propensity Score , Retrospective Studies , Stroke/epidemiology , Stroke/prevention & control , Warfarin/adverse effects
ABSTRACT
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.
Subject(s)
Legislation, Drug/trends , Pharmaceutical Preparations/standards , Safety/legislation & jurisprudence , Safety/standards , Biometry , Humans , Meta-Analysis as Topic , United States , United States Food and Drug Administration
Subject(s)
Data Collection/methods , Databases, Factual/statistics & numerical data , Drug Approval/methods , Product Surveillance, Postmarketing/methods , Computer Communication Networks/statistics & numerical data , Datasets as Topic , Decision Making, Organizational , Drug Approval/organization & administration , Humans , Pilot Projects , Product Surveillance, Postmarketing/statistics & numerical data , Propensity Score , Prospective Studies , United States , United States Food and Drug Administration/legislation & jurisprudence , United States Food and Drug Administration/organization & administration
ABSTRACT
The use of master protocols allows for innovative approaches to clinical trial designs, potentially enabling new approaches to operations and analytics and creating value for patients and drug developers. Pediatric research has been conducted for many decades, but the use of novel designs such as master protocols in pediatric research is not well understood. This study aims to provide a systematic review of the utilization of master protocols in pediatric drug development. A search was performed in September 2022 using two data sources (PubMed and ClinicalTrials.gov) and included studies conducted in the past 10 years. General study information was extracted, such as study type, study status, therapeutic area, and clinical trial phase. Study characteristics that are specific to pediatric studies (such as age of the participants and pediatric drug dosing) and important study design elements (such as number of test drug arms and whether randomization and/or concurrent control was used) were also collected. Our results suggest that master protocol studies are being used in pediatrics, with platform and basket trials more common than umbrella trials. Most of this experience is in oncology and early-phase studies. Use has risen since 2020, largely in oncology and COVID-19 trials. However, adoption of master protocols in pediatric clinical research is still on a small scale and could be substantially expanded. Work is required to further understand the barriers to implementing pediatric master protocols, from setting up infrastructure to interpreting study findings.
Subject(s)
Pediatrics , Research Design , Child , Humans , Clinical Trials as Topic , COVID-19 , Drug Development
ABSTRACT
BACKGROUND: Ornithine transcarbamylase deficiency (OTCD), due to an X-linked OTC mutation, is responsible for moderate to severe hyperammonemia (HA) with substantial morbidity and mortality. About 80% of females with OTCD remain apparently "asymptomatic", with limited studies of their clinical characteristics and long-term health vulnerabilities. Multimodal neuroimaging studies and executive function testing have shown that asymptomatic females exhibit limitations when stressed to perform at higher cognitive load and have reduced activation of the prefrontal cortex. This retrospective study aims to improve understanding of factors that might predict development of defined complications and serious illness in apparently asymptomatic females. A proband and her daughter are presented to highlight the utility of multimodal neuroimaging studies and to underscore that asymptomatic females with OTCD are not always asymptomatic. METHODS: We review data from 302 heterozygous females with OTCD enrolled in the Urea Cycle Disorders Consortium (UCDC) longitudinal natural history database. We apply multiple neuroimaging modalities in the workup of a proband and her daughter. RESULTS: Among the females in the database, 143 were noted as symptomatic at baseline (Sym). We focused on females who were asymptomatic (Asx, n = 111) and those who were asymptomatic initially upon enrollment in the study but became symptomatic sometime during follow-up (Asx/Sym, n = 22). The majority of Asx (86%) and Asx/Sym (75%) subjects did not restrict protein at baseline, and ~38% of Asx and 33% of Asx/Sym subjects suffered from mild to severe neuropsychiatric conditions such as mood disorder and sleep problems. The risk of mild to severe HA sometime later in life for the Asx and Asx/Sym subjects as a combined group was ~4% (5/133), with ammonia ranging from 77 to 470 µM and at least half (2/4) of these subjects requiring hospital admission and nitrogen scavenger therapy. For this combined group, the median age of first HA crisis was 50 years, whereas the median age of first symptom, which included neuropsychiatric and/or behavioral symptoms, was 17 years. The multimodal neuroimaging studies also underscore that asymptomatic female heterozygotes with OTCD (e.g., the proband) are not always asymptomatic. CONCLUSIONS: Analysis of Asx and Asx/Sym females with OTCD in this study suggests that future evidence-based management guidelines and/or a clinical risk score calculator for this cohort could be useful management tools to reduce morbidity and improve long-term quality of life.