ABSTRACT
BACKGROUND: The opioid crisis highlights the need to increase access to naloxone, possibly through regulatory approval for over-the-counter sales. To address industry-perceived barriers to such access, the Food and Drug Administration (FDA) developed a model drug facts label for such sales and assessed whether consumers understood its key statements for safe and effective use. METHODS: In this label-comprehension study, we conducted individual structured interviews with 710 adults and adolescents, including 430 adults who use opioids and their family and friends. Eight primary end points were developed to assess user comprehension of each of the key steps in the label. Each of these end points had a prespecified target threshold ranging from 80 to 90%, which was evaluated by comparing the lower boundary of the exact 95% confidence interval with the threshold. RESULTS: Performance on six primary end points met or exceeded thresholds, including the steps "Check for a suspected overdose" (threshold, 85%; point estimate [PE], 95.8%; 95% confidence interval [CI], 94.0 to 97.1) and "Give the first dose" (threshold, 85%; PE, 98.2%; 95% CI, 96.9 to 99.0). The lower boundaries for four other primary end points ranged from 88.8 to 94.0%. One exception was comprehension of "Call 911 immediately," which closely approximated the target of 90% (PE, 90.3%; 95% CI, 87.9 to 92.4). The other exception was comprehension of the composite step "Check, give, and call 911 immediately" (threshold, 85%; PE, 81.1%; 95% CI, 78.0 to 83.9). CONCLUSIONS: Consumers met thresholds for sufficient understanding of six of eight components of the instructions in the drug facts label for naloxone use and came close on two others. Overall, the FDA found that the model label was adequate for use in the development of a naloxone product intended for over-the-counter sales.
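The threshold evaluation described above hinges on the lower boundary of an exact (Clopper-Pearson) binomial confidence interval. As a sketch of that comparison — the counts below are hypothetical, not taken from the study — the lower boundary can be computed from the binomial distribution by bisection:

```python
from math import comb

def exact_ci_lower(x, n, alpha=0.05):
    """Clopper-Pearson (exact) lower confidence bound for a binomial proportion.

    Returns the p at which P(X >= x | n, p) equals alpha/2, found by bisection.
    """
    if x == 0:
        return 0.0

    def upper_tail(p):
        # P(X >= x) for X ~ Binomial(n, p); increases monotonically in p.
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x, n + 1))

    lo, hi = 0.0, 1.0
    for _ in range(100):  # bisection; 100 halvings gives ample precision
        mid = (lo + hi) / 2
        if upper_tail(mid) < alpha / 2:
            lo = mid  # p too small: observing >= x successes is still too unlikely
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical counts: 95 of 100 participants comprehend a step (not study data).
lb = exact_ci_lower(95, 100)
print(round(lb, 3))   # ~0.887
print(lb >= 0.85)     # would meet an 85% threshold
print(lb >= 0.90)     # would not meet a 90% threshold
```

An end point "meets" its threshold when this lower boundary is at or above the prespecified target, which is the comparison the abstract reports.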
Subjects
Analgesics, Opioid/poisoning , Comprehension , Drug Labeling , Drug Overdose/drug therapy , Naloxone/therapeutic use , Narcotic Antagonists/therapeutic use , Nonprescription Drugs/therapeutic use , Adolescent , Adult , Drug Labeling/legislation & jurisprudence , Drug Overdose/therapy , Government Regulation , Humans , Interviews as Topic , United States , United States Food and Drug Administration
ABSTRACT
BACKGROUND: Current methods of burn estimation can lead to incorrect estimates of the total body surface area (TBSA) burned, especially among injured children. Inaccurate estimation of burn size can impact initial management, including unnecessary transfer to burn centres and fluid overload during resuscitation. To address these challenges, we developed a smartphone application (EasyTBSA) that calculates the TBSA of a burn using a body-part by body-part approach. The aims of this study were to assess the accuracy of the EasyTBSA application and compare its performance to three established methods of burn size estimation (Lund-Browder Chart, Rule of Nines and Rule of Palms). METHODS: Twenty-four healthcare providers used each method to estimate burn sizes on moulaged manikins. The manikins represented different ages (infant, child and adult) with different TBSA burns (small <20%, medium 20%-49% and large >49%). We calculated the accuracy of each method as the difference between the user-estimated and actual TBSA. The true value of the complete body surface area of the manikins was obtained by three-dimensional scans. We used multivariable modelling to control for manikin size and method. RESULTS: Among all age groups and burn sizes, the EasyTBSA application had the greatest accuracy for burn size estimation (-0.01%, SD 3.59%) followed by the Rule of Palms (3.92%, SD 10.71%), the Lund-Browder Chart (4.42%, SD 5.52%) and the Rule of Nines (5.05%, SD 6.87%). CONCLUSIONS: The EasyTBSA application may improve the estimation of TBSA compared with existing methods.
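For context, the adult Rule of Nines compared above assigns each body region a fixed percentage of TBSA. A minimal sketch — the region percentages are the standard adult values, and the burned-region list is hypothetical:

```python
# Standard adult Rule of Nines percentages of total body surface area (TBSA).
RULE_OF_NINES_ADULT = {
    "head_and_neck": 9.0,
    "right_arm": 9.0,
    "left_arm": 9.0,
    "anterior_trunk": 18.0,
    "posterior_trunk": 18.0,
    "right_leg": 18.0,
    "left_leg": 18.0,
    "perineum": 1.0,
}

def tbsa_rule_of_nines(burned_regions):
    """Sum the standard percentages for fully burned regions."""
    return sum(RULE_OF_NINES_ADULT[r] for r in burned_regions)

print(sum(RULE_OF_NINES_ADULT.values()))                   # regions total 100.0
print(tbsa_rule_of_nines(["head_and_neck", "right_arm"]))  # 18.0
```

The coarse whole-region granularity of such fixed charts, and the fact that children's head-to-leg proportions differ from adults', are among the reasons they can misestimate burn size — which motivates a body-part-by-body-part tool like the one studied here.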
Subjects
Burns , Child , Adult , Infant , Humans , Body Surface Area , Burns/therapy , Burn Units , Resuscitation/methods , Health Personnel
ABSTRACT
Argininosuccinate lyase (ASL) is essential for the NO-dependent regulation of tyrosine hydroxylase (TH) and thus for catecholamine production. Using a conditional mouse model with loss of ASL in catecholamine neurons, we demonstrate that ASL is expressed in dopaminergic neurons in the substantia nigra pars compacta, including the ALDH1A1+ subpopulation that is pivotal for the pathogenesis of Parkinson disease (PD). Neuronal loss of ASL results in catecholamine deficiency, in accumulation of tyrosine and formation of tyrosine aggregates, in elevation of α-synuclein, and phenotypically in motor and cognitive deficits. NO supplementation rescues the formation of aggregates as well as the motor deficiencies. Our data point to a potential metabolic link between accumulation of tyrosine and seeding of pathological aggregates in neurons as initiators of the pathological processes involved in neurodegeneration. Hence, interventions in tyrosine metabolism via regulation of NO levels may be therapeutically beneficial for the treatment of catecholamine-related neurodegenerative disorders.
Subjects
Aldehyde Dehydrogenase 1 Family/genetics , Aldehyde Dehydrogenase 1 Family/metabolism , Argininosuccinate Lyase/genetics , Argininosuccinate Lyase/metabolism , Dopaminergic Neurons/metabolism , Parkinson Disease/genetics , Parkinson Disease/metabolism , Animals , Disease Models, Animal , Humans , Mice , Phenotype , Retinal Dehydrogenase/genetics , Retinal Dehydrogenase/metabolism
ABSTRACT
BACKGROUND: Urea cycle disorders (UCDs) are among the most common inborn errors of liver metabolism. As therapies for hyperammonemia associated with urea cycle dysfunction have improved, chronic complications, such as liver disease, have become increasingly apparent in individuals with UCDs. Liver disease in UCDs may be associated with hepatic inflammation, hepatic fibrosis, portal hypertension, liver cancer and even liver failure. However, except for monitoring serum aminotransferases, there are no clear guidelines for screening and/or monitoring individuals with UCDs for liver disease. Thus, we systematically evaluated the potential utility of several non-invasive biomarkers for liver fibrosis in UCDs. METHODS: We evaluated grey-scale ultrasonography, liver stiffness obtained from shear wave elastography (SWE), and various serum biomarkers for hepatic fibrosis and necroinflammation in a cohort of 28 children and adults with various UCDs. RESULTS: Overall, we found a high burden of liver disease: 46% of participants had an abnormal grey-scale ultrasound pattern of the liver parenchyma, and 52% had increased liver stiffness. The analysis of serum biomarkers revealed that 32% of participants had an elevated FibroTest™ score, a marker of hepatic fibrosis, and 25% had an increased ActiTest™ score, a marker of necroinflammation. Interestingly, liver stiffness did not correlate with ultrasound appearance or FibroTest™ score. CONCLUSION: Overall, our results demonstrate a high overall burden of liver disease in UCDs and highlight the need for further studies exploring new tools for identifying and monitoring individuals with UCDs who are at risk of this complication. TRIAL REGISTRATION: This study has been registered at ClinicalTrials.gov (NCT03721367).
Subjects
Argininosuccinate Lyase/blood , Genetic Diseases, Inborn/blood , Liver Cirrhosis/blood , Liver Diseases/blood , Urea Cycle Disorders, Inborn/blood , Adolescent , Adult , Biomarkers/blood , Child , Child, Preschool , Elasticity Imaging Techniques , Female , Genetic Diseases, Inborn/diagnostic imaging , Genetic Diseases, Inborn/genetics , Genetic Diseases, Inborn/pathology , Humans , Hyperammonemia/blood , Hyperammonemia/genetics , Hyperammonemia/metabolism , Hyperammonemia/pathology , Liver/diagnostic imaging , Liver/pathology , Liver Cirrhosis/diagnostic imaging , Liver Cirrhosis/genetics , Liver Cirrhosis/pathology , Liver Diseases/genetics , Liver Diseases/metabolism , Liver Diseases/pathology , Male , Metabolism, Inborn Errors/genetics , Middle Aged , Ultrasonography , Urea Cycle Disorders, Inborn/genetics , Urea Cycle Disorders, Inborn/metabolism , Urea Cycle Disorders, Inborn/pathology , Young Adult
ABSTRACT
BACKGROUND: Artemisinin derivatives are the standard treatment for both uncomplicated and severe malaria. Delayed parasite clearance times preceded the appearance of artemisinin treatment failures in Southeast Asia. Most worldwide malaria cases occur in sub-Saharan Africa (SSA), where clinically significant artemisinin resistance or treatment failure has not yet been detected. The recent emergence of a resistance-conferring genetic mutation in the Plasmodium falciparum parasite in Africa warrants continued monitoring throughout the continent. METHODS: An analysis was performed on data from a retrospective cohort study of Malawian children with cerebral malaria admitted between 2010 and 2019 to a public referral hospital, ascertaining parasite clearance times across years. Data were collected from patients treated for severe malaria with quinine or artesunate, an artemisinin derivative. Parasite density was determined at admission and every subsequent 6 h until parasitaemia was below 1000 parasites/µl. The median parasite clearance time in all children admitted in any one year was compared to the parasite clearance time in 2014, the first year of artesunate use in Malawi. RESULTS: The median population parasite clearance time was slower from 2010 to 2013 (quinine-treated patients) compared to 2014, the first year of artesunate use in Malawi (30 h (95% CI: 30-30) vs 18 h (95% CI: 18-24)). After adjustment for admission parasite count, there was no statistically significant difference in the median population parasite clearance time when comparing 2014 with any subsequent year. CONCLUSION: Malaria parasite clearance times in Malawian children with cerebral malaria remained constant between 2014 and 2019, arguing against evolving artemisinin resistance in parasites in this region.
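The clearance-time definition in the Methods above — 6-hourly counts until parasitaemia falls below 1000 parasites/µl — can be sketched as a simple scan over the density series. The series below is hypothetical, not patient data:

```python
def parasite_clearance_time(densities, threshold=1000.0):
    """Return the first sampling hour at which density drops below threshold.

    densities: list of (hours since admission, parasites per microlitre),
    sampled every 6 h per the study protocol.
    Returns None if the threshold is never reached in the series.
    """
    for hour, density in densities:
        if density < threshold:
            return hour
    return None

# Hypothetical 6-hourly series for one patient.
series = [(0, 250_000.0), (6, 90_000.0), (12, 12_000.0), (18, 800.0)]
print(parasite_clearance_time(series))  # 18
```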
Subjects
Antimalarials/therapeutic use , Artesunate/therapeutic use , Malaria, Cerebral/parasitology , Malaria, Falciparum/parasitology , Plasmodium falciparum/drug effects , Quinine/therapeutic use , Adolescent , Antimalarials/pharmacology , Artesunate/pharmacology , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Malaria, Cerebral/drug therapy , Malaria, Falciparum/drug therapy , Malawi , Male , Quinine/pharmacology , Retrospective Studies , Time Factors
ABSTRACT
STUDY OBJECTIVE: During the COVID-19 pandemic, health care workers have had the highest risk of infection among essential workers. Although personal protective equipment (PPE) use is associated with lower infection rates, appropriate use of PPE has been variable among health care workers, even in settings with COVID-19 patients. We aimed to evaluate patterns of PPE adherence during emergency department resuscitations that included aerosol-generating procedures. METHODS: We conducted a retrospective, video-based review of pediatric resuscitations involving one or more aerosol-generating procedures during the first 3 months of the COVID-19 pandemic in the United States (March to June 2020). Adherence with the recommended use of 5 PPE items (headwear, eyewear, masks, gowns, gloves), categorized as complete, inadequate, or absent, and the duration of potential exposure were evaluated for individuals in the room after aerosol-generating procedure initiation. RESULTS: Among the 345 health care workers observed during 19 resuscitations, 306 (88.7%) were nonadherent (inadequate or absent adherence) with the recommended use of at least 1 PPE type at some time during the resuscitation, 23 (6.7%) of whom had no PPE. One hundred and forty health care workers (40.6%) altered or removed at least 1 type of PPE during the event. The aggregate time in the resuscitation room for health care workers across all events was 118.7 hours. During this time, providers had either absent or inadequate eyewear for 46.4 hours (39.1%) and absent or inadequate masks for 35.2 hours (29.7%). CONCLUSION: Full adherence with recommended PPE use was limited in a setting at increased risk for SARS-CoV-2 virus aerosolization. In addition to ensuring appropriate donning, approaches are needed for ensuring ongoing adherence with PPE recommendations during exposure.
Subjects
COVID-19/prevention & control , Emergency Service, Hospital/standards , Guideline Adherence , Infection Control/standards , Pandemics , Personal Protective Equipment/standards , Resuscitation , COVID-19/epidemiology , COVID-19/transmission , Child , Hospitals, Pediatric , Humans , Infection Control/methods , Patient Care Team/standards , Practice Guidelines as Topic , Retrospective Studies , SARS-CoV-2
ABSTRACT
PURPOSE: Erythropoiesis-stimulating agents (ESAs), indicated for treating some patients with chemotherapy-induced anemia (CIA), may increase the risk of tumor progression and mortality. FDA required a Risk Evaluation and Mitigation Strategy (REMS) to mitigate these risks. We assessed REMS impact on ESA administration and red blood cell (RBC) transfusion as surrogate metrics for REMS effectiveness. METHODS: Retrospective cohort study including data from January 1, 2006 to December 31, 2018 for beneficiaries ≥65 years enrolled in Centers for Medicare & Medicaid Services (CMS) Medicare Parts A/B with a cancer diagnosis; patients with other indications for ESA use were excluded. Study time was divided into five periods demarcated by issuance of the CMS National Coverage Determination (NCD) (Pre-NCD, Pre-REMS) and REMS milestones (Grace Period, REMS, post-REMS). Study outcomes were the monthly proportion of chemotherapy episodes (CTEs) with concomitant ESA administration, with post-CTE ESA administration, and with RBC transfusions. RESULTS: Of 1 778 855 beneficiaries treated with chemotherapy, 308 742 received concomitant ESA for CIA. The proportion of CTEs with concomitant and post-CTE ESA administration decreased Pre-REMS (9.0 percentage points [pp] and 3.5 pp, respectively). There were no significant post-REMS changes in the proportion of CTEs with concomitant (0.0 pp) and post-CTE ESA administration (0.1 pp). Fluctuation in RBC transfusions was <4 pp throughout the study period. CONCLUSIONS: Medicare beneficiaries showed a substantive decrease in ESA administration after the NCD, with minimal impact by the REMS and its removal. Small changes in RBC transfusion over the study period were likely due to a national secular trend.
Subjects
Anemia , Antineoplastic Agents , Hematinics , Aged , Anemia/chemically induced , Anemia/drug therapy , Anemia/epidemiology , Antineoplastic Agents/adverse effects , Blood Transfusion , Erythropoiesis , Hematinics/adverse effects , Humans , Medicare , Retrospective Studies , Risk Evaluation and Mitigation , United States/epidemiology
ABSTRACT
BACKGROUND: Epidemiological study reporting is improving but is not transparent enough for easy evaluation or replication. One barrier is insufficient detail about design elements in published studies. METHODS: Using a previously conducted drug safety evaluation in claims data as a test case, we investigated the impact of small changes in five key design elements on risk estimation. These elements were the index day of incident exposure (which determines the look-back and follow-up periods), the exposure duration algorithm, exclusion of heparin exposure, propensity score model variables, and stratification of the Cox proportional hazards model. We covaried these elements using a fractional factorial design, resulting in 24 risk estimates for one outcome. We repeated eight of these combinations for two additional outcomes. We measured design effects on cohort sizes, follow-up time, and risk estimates. RESULTS: Small changes in the specifications of the index day and the exposure algorithm affected the risk estimation process the most. They affected cohort size on average by 8 to 10%, follow-up time by up to 31%, and the magnitude of log hazard ratios by up to 0.22. Other elements affected the cohort before matching or the risk estimate's precision but not its magnitude. Any change in design substantially altered the matched control-group subjects in 1:1 matching. CONCLUSIONS: Exposure-related design elements require attention from investigators initiating, evaluating, or wishing to replicate a study, and from analysts standardizing definitions. The methods we developed, using a factorial design and mapping design effects on the causal estimation process, are applicable to the planning of sensitivity analyses in similar studies.
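Covarying design elements factorially, as in the Methods above, amounts to enumerating combinations of element settings. A minimal full-factorial sketch with hypothetical two-level settings (the study itself used a fractional factorial, i.e., a structured subset of such runs):

```python
from itertools import product

# Hypothetical two-level settings for the five design elements.
design_elements = {
    "index_day": ["first_dispensing", "second_dispensing"],
    "exposure_duration": ["days_supply", "days_supply_plus_grace"],
    "heparin_exclusion": [True, False],
    "ps_model": ["base_covariates", "extended_covariates"],
    "cox_stratification": [True, False],
}

# Full factorial: every combination of settings (2**5 = 32 analysis runs);
# a fractional design would keep a balanced subset of these rows.
runs = [dict(zip(design_elements, combo))
        for combo in product(*design_elements.values())]
print(len(runs))  # 32
```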
Subjects
Cohort Studies , Incidence , Insurance Claim Review/statistics & numerical data , Pharmacoepidemiology/statistics & numerical data , Research Design , Risk , Humans
ABSTRACT
Examining medical products' benefits and risks in different population subsets is often necessary for informing public health decisions. In observational cohort studies, safety analyses by pre-specified subgroup can be powered, and are informative about different population subsets' risks, if the study design or analysis adequately controls for confounding. However, few guidelines exist on how to simultaneously control for confounding and conduct subgroup analyses. In this simulation study, we evaluated the performance, in terms of bias, efficiency and coverage, of six propensity score methods in 24 scenarios by estimating subgroup-specific hazard ratios of the average treatment effect in the treated with Cox regression models. The subgroup analysis methods control for confounding either by propensity score matching or by inverse probability of treatment weighting. These methods vary as to whether they subset information or borrow it across subgroups to estimate the propensity score. Simulation scenarios varied by size of subgroup, strength of association of subgroup with exposure, strength of association of subgroup with outcome (simulated survival), and outcome incidence. Results indicated that subsetting the data by the subgrouping variable, to estimate the propensity score and hazard ratio, has the smallest bias, with the gain far outweighing any penalty in precision. Moreover, weighting methods pay a heavier price in bias than do matching methods when the propensity score model is misspecified and the subgrouping variable is a strong confounder.
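The best-performing approach in this simulation — subsetting the data by the subgrouping variable before estimating the propensity score and matching — can be sketched schematically. The greedy 1:1 nearest-neighbour matcher, the caliper, and the toy scores below are illustrative assumptions, not the authors' implementation; in practice the score would be estimated (e.g., by logistic regression) within each subset:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour propensity score matching without replacement.

    treated, controls: {subject_id: propensity_score}. Returns matched id pairs.
    """
    pairs = []
    available = dict(controls)
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # matching without replacement
    return pairs

def match_within_subgroups(subjects):
    """Subset by the subgrouping variable, then match the PS within each subset."""
    matched = {}
    for subgroup in {s["subgroup"] for s in subjects}:
        sub = [s for s in subjects if s["subgroup"] == subgroup]
        treated = {s["id"]: s["ps"] for s in sub if s["treated"]}
        controls = {s["id"]: s["ps"] for s in sub if not s["treated"]}
        matched[subgroup] = greedy_match(treated, controls)
    return matched

# Toy data with precomputed (hypothetical) propensity scores.
subjects = [
    {"id": 1, "subgroup": "A", "treated": True,  "ps": 0.30},
    {"id": 2, "subgroup": "A", "treated": False, "ps": 0.32},
    {"id": 3, "subgroup": "B", "treated": True,  "ps": 0.70},
    {"id": 4, "subgroup": "B", "treated": False, "ps": 0.69},
]
print(match_within_subgroups(subjects)["A"])  # [(1, 2)]
```

Subgroup-specific hazard ratios would then be estimated on each subgroup's matched set.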
Subjects
Research Design/statistics & numerical data , Survival Analysis , Computer Simulation , Data Interpretation, Statistical , Humans , Models, Statistical , Propensity Score , Risk Assessment , Risk Factors
ABSTRACT
Case-crossover study designs are observational studies used to assess the postmarket safety of medical products (eg, vaccines or drugs). As a case-crossover study is self-controlled, its advantages include better control for confounding, because the design controls for any time-invariant measured and unmeasured confounding, and potentially greater feasibility, as only data from those experiencing an event (or cases) are required. However, self-matching also introduces correlation between case and control periods within a subject or matched unit. To estimate sample size in a case-crossover study, investigators currently use Dupont's formula (Biometrics 1988; 44:1157-1168), which was originally developed for a matched case-control study. This formula is relevant as it takes into account correlation in exposure between controls and cases, which is expected to be high in self-controlled studies. However, in our study, we show that Dupont's formula and other currently used methods to determine sample size for case-crossover studies may be inadequate. Specifically, these formulas tend to underestimate the true required sample size, determined through simulations, for a range of values in the parameter space. We present mathematical derivations to explain where some currently used methods fail and propose two new sample size estimation methods that provide a more accurate estimate of the true required sample size.
Subjects
Cross-Over Studies , Sample Size , Case-Control Studies , Humans , Models, Statistical , Observational Studies as Topic/methods , Proportional Hazards Models
ABSTRACT
PURPOSE: To develop a flexible analytic tool for the Food and Drug Administration's (FDA's) Sentinel System to assess adherence to safe use recommendations with two capabilities: characterize adherence to patient monitoring recommendations for a drug, and characterize concomitant medication use before, during, and/or after drug therapy. METHODS: We applied the tool in the Sentinel Distributed Database to assess adherence to the labeled recommendation that patients treated with dronedarone undergo electrocardiogram (ECG) testing no less often than every 3 months. Measures of length of treatment, time to first ECG, number of ECGs, and time between ECGs were assessed. We also assessed concomitant use of contraception among female users of mycophenolate per label recommendations (concomitancy 4 weeks before initiation through 6 weeks after discontinuation of mycophenolate). Unadjusted results were stratified by age, month-year, and sex. RESULTS: We identified 21 457 new episodes of dronedarone use of greater than or equal to 90 days (July 2009 to September 2015); 86% had greater than or equal to one ECG, and 22% met the recommendation of an ECG no less often than every 3 months. We identified 21 942 new episodes of mycophenolate use among females 12 to 55 years of age (January 2006 to September 2015); 16% had greater than or equal to 1 day of concomitant contraception dispensed, and 12% had concomitant contraception use for greater than or equal to 50% of the period from 4 weeks before initiation through 6 weeks after discontinuation of mycophenolate; younger females had more concomitancy. These results may be underestimates as the analyses were limited to claims data. CONCLUSIONS: We developed a tool for use in databases formatted to the Sentinel Common Data Model that can assess adherence to safe use recommendations involving patient monitoring and concomitant drug use over time.
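The dronedarone monitoring measure above — an ECG no less often than every 3 months during treatment — can be sketched as a gap check over dates within a treatment episode. The 90-day operationalization of "3 months" and the dates below are illustrative assumptions, not the tool's actual specification:

```python
from datetime import date

def meets_ecg_recommendation(start, end, ecg_dates, max_gap_days=90):
    """True if no interval between consecutive checkpoints exceeds max_gap_days.

    Checkpoints are treatment start, each ECG, and treatment end, so both the
    time to the first ECG and the time after the last ECG are also checked.
    """
    checkpoints = sorted([start, *ecg_dates, end])
    gaps = [(b - a).days for a, b in zip(checkpoints, checkpoints[1:])]
    return all(g <= max_gap_days for g in gaps)

episode_start, episode_end = date(2014, 1, 1), date(2014, 7, 1)
print(meets_ecg_recommendation(episode_start, episode_end,
                               [date(2014, 3, 15), date(2014, 6, 1)]))  # True
print(meets_ecg_recommendation(episode_start, episode_end,
                               [date(2014, 6, 1)]))                     # False
```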
Subjects
Adverse Drug Reaction Reporting Systems/organization & administration , Anti-Arrhythmia Agents/administration & dosage , Dronedarone/administration & dosage , Drug Monitoring/methods , Mycophenolic Acid/administration & dosage , Anti-Arrhythmia Agents/adverse effects , Contraception/statistics & numerical data , Databases, Factual , Dronedarone/adverse effects , Drug Interactions , Electrocardiography , Humans , Medication Adherence , Mycophenolic Acid/adverse effects , United States , United States Food and Drug Administration
ABSTRACT
In a retrospective cohort study of patients enrolled in the UK Clinical Practice Research Datalink during 2000-2013, we evaluated long-term risks of death, stroke, and acute myocardial infarction (AMI) in adults prescribed clarithromycin. Patients were outpatients aged 40-85 years who were prescribed clarithromycin (n = 287,748), doxycycline (n = 267,729), or erythromycin (n = 442,999), or Helicobacter pylori eradication therapy with a proton pump inhibitor, amoxicillin, and either clarithromycin (n = 27,639) or metronidazole (n = 14,863). We analyzed time to death, stroke, or AMI with Cox proportional hazards regression. The long-term hazard ratio for death following 1 clarithromycin prescription versus 1 doxycycline prescription was 1.29 (95% confidence interval (CI): 1.21, 1.25), increasing to 1.62 (95% CI: 1.43, 1.84) for ≥5 clarithromycin prescriptions versus ≥5 doxycycline prescriptions. Erythromycin showed smaller risk increases in comparison with doxycycline. Stroke and AMI incidences were also increased after clarithromycin, but with smaller hazard ratios than for mortality. For H. pylori eradication, the hazard ratio for mortality following clarithromycin-containing versus metronidazole-containing regimens was 1.09 (95% CI: 1.00, 1.18) overall, and it was higher (hazard ratio = 1.65; 95% CI: 0.88, 3.08) following ≥2 prescriptions in subjects not on statins at baseline. Outpatient clarithromycin use was associated with long-term increases in mortality, with evidence for a similar, smaller increase with erythromycin.
Subjects
Anti-Bacterial Agents/adverse effects , Clarithromycin/adverse effects , Mortality/trends , Myocardial Infarction/mortality , Stroke/mortality , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Clarithromycin/therapeutic use , Doxycycline/adverse effects , Drug Therapy, Combination , Erythromycin/adverse effects , Female , Helicobacter Infections/drug therapy , Humans , Male , Middle Aged , Proportional Hazards Models , Proton Pump Inhibitors/therapeutic use , Retrospective Studies , Time Factors , United Kingdom
ABSTRACT
The tree-based scan statistic is a statistical data mining tool that has been used for signal detection with a self-controlled design in vaccine safety studies. This disproportionality statistic adjusts for multiple testing in evaluation of thousands of potential adverse events. However, many drug safety questions are not well suited for self-controlled analysis. We propose a method that combines tree-based scan statistics with propensity score-matched analysis of new initiator cohorts, a robust design for investigations of drug safety. We conducted plasmode simulations to evaluate performance. In multiple realistic scenarios, tree-based scan statistics in cohorts that were propensity score matched to adjust for confounding outperformed tree-based scan statistics in unmatched cohorts. In scenarios where confounding moved point estimates away from the null, adjusted analyses recovered the prespecified type 1 error while unadjusted analyses inflated type 1 error. In scenarios where confounding moved point estimates toward the null, adjusted analyses preserved power, whereas unadjusted analyses greatly reduced power. Although complete adjustment of true confounders had the best performance, matching on a moderately mis-specified propensity score substantially improved type 1 error and power compared with no adjustment. When there was true elevation in risk of an adverse event, there were often co-occurring signals for clinically related concepts. TreeScan with propensity score matching shows promise as a method for screening and prioritization of potential adverse events. It should be followed by clinical review and safety studies specifically designed to quantify the magnitude of effect, with confounding control targeted to the outcome of interest.
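At each node of the outcome hierarchy, a tree-based scan statistic compares observed with expected events via a log-likelihood ratio. One common Poisson-based form, conditional on the total event count across the tree, is sketched below; this is a generic illustration of the statistic, not the authors' TreeScan configuration:

```python
from math import log

def poisson_llr(c, expected, total):
    """Log-likelihood ratio for one node of the tree.

    c: observed events at the node; expected: events expected there under the
    null; total: all events on the tree. Signals only when c > expected.
    """
    if c <= expected or c <= 0:
        return 0.0
    llr = c * log(c / expected)
    if c < total:
        llr += (total - c) * log((total - c) / (total - expected))
    return llr

# A node with twice its expected count stands out; one at expectation does not.
print(poisson_llr(10, 5, 100) > 0)  # True
print(poisson_llr(5, 5, 100))       # 0.0
```

The multiple-testing adjustment comes from evaluating the maximum LLR over all nodes against its Monte Carlo null distribution, rather than testing each node separately.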
Subjects
Data Mining/methods , Drug-Related Side Effects and Adverse Reactions/epidemiology , Confounding Factors, Epidemiologic , Humans , Propensity Score , Software , Statistics as Topic
ABSTRACT
PURPOSE: The US Food and Drug Administration's Sentinel system developed tools for sequential surveillance. METHODS: In patients with non-valvular atrial fibrillation, we sequentially compared outcomes for new users of rivaroxaban versus warfarin, employing propensity score matching and Cox regression. A total of 36 173 rivaroxaban and 79 520 warfarin initiators were variable-ratio matched within 2 monitoring periods. RESULTS: Statistically significant signals were observed for ischemic stroke (IS) (first period) and intracranial hemorrhage (ICH) (second period) favoring rivaroxaban, and gastrointestinal bleeding (GIB) (second period) favoring warfarin. In follow-up analyses using primary position diagnoses from inpatient encounters for increased definition specificity, the hazard ratios (HR) for rivaroxaban vs warfarin new users were 0.61 (0.47, 0.79) for IS, 1.47 (1.29, 1.67) for GIB, and 0.71 (0.50, 1.01) for ICH. For GIB, the HR varied by age: <66 HR = 0.88 (0.60, 1.30) and 66+ HR = 1.49 (1.30, 1.71). CONCLUSIONS: This study demonstrates the capability of Sentinel to conduct prospective safety monitoring and raises no new concerns about rivaroxaban safety.
Subjects
Adverse Drug Reaction Reporting Systems/statistics & numerical data , Factor Xa Inhibitors/adverse effects , Rivaroxaban/adverse effects , United States Food and Drug Administration/statistics & numerical data , Aged , Aged, 80 and over , Atrial Fibrillation/complications , Atrial Fibrillation/drug therapy , Brain Infarction/epidemiology , Brain Infarction/etiology , Brain Infarction/prevention & control , Factor Xa Inhibitors/administration & dosage , Female , Follow-Up Studies , Gastrointestinal Hemorrhage/chemically induced , Gastrointestinal Hemorrhage/epidemiology , Humans , Intracranial Hemorrhages/chemically induced , Intracranial Hemorrhages/epidemiology , Male , Middle Aged , Pilot Projects , Prospective Studies , Rivaroxaban/administration & dosage , United States/epidemiology , Warfarin/administration & dosage , Warfarin/adverse effects
ABSTRACT
BACKGROUND: Dabigatran (150 mg twice daily) has been associated with lower rates of stroke than warfarin in trials of atrial fibrillation, but large-scale evaluations in clinical practice are limited. OBJECTIVE: To compare incidence of stroke, bleeding, and myocardial infarction in patients receiving dabigatran versus warfarin in practice. DESIGN: Retrospective cohort. SETTING: National U.S. Food and Drug Administration Sentinel network. PATIENTS: Adults with atrial fibrillation initiating dabigatran or warfarin therapy between November 2010 and May 2014. MEASUREMENTS: Ischemic stroke, intracranial hemorrhage, extracranial bleeding, and myocardial infarction identified from hospital claims among propensity score-matched patients starting treatment with dabigatran or warfarin. RESULTS: Among 25 289 patients starting dabigatran therapy and 25 289 propensity score-matched patients starting warfarin therapy, those receiving dabigatran did not have significantly different rates of ischemic stroke (0.80 vs. 0.94 events per 100 person-years; hazard ratio [HR], 0.92 [95% CI, 0.65 to 1.28]) or extracranial hemorrhage (2.12 vs. 2.63 events per 100 person-years; HR, 0.89 [CI, 0.72 to 1.09]) but were less likely to have intracranial bleeding (0.39 vs. 0.77 events per 100 person-years; HR, 0.51 [CI, 0.33 to 0.79]) and more likely to have myocardial infarction (0.77 vs. 0.43 events per 100 person-years; HR, 1.88 [CI, 1.22 to 2.90]). However, the strength and significance of the association between dabigatran use and myocardial infarction varied in sensitivity analyses and by exposure definition (HR range, 1.13 [CI, 0.78 to 1.64] to 1.43 [CI, 0.99 to 2.08]). Older patients and those with kidney disease had higher gastrointestinal bleeding rates with dabigatran. 
LIMITATION: Inability to examine outcomes by dabigatran dose (unacceptable covariate balance between matched patients) or quality of warfarin anticoagulation (few patients receiving warfarin had available international normalized ratio values). CONCLUSION: In matched adults with atrial fibrillation treated in practice, the incidences of stroke and bleeding with dabigatran versus warfarin were consistent with those seen in trials. The possible relationship between dabigatran and myocardial infarction warrants further investigation. PRIMARY FUNDING SOURCE: U.S. Food and Drug Administration.
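The event rates quoted in the Results above are incidence rates per 100 person-years. As a generic illustration with hypothetical counts (not the study's data):

```python
def rate_per_100_person_years(events, person_years):
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100 * events / person_years

# Hypothetical: 40 events observed over 5000 person-years of follow-up.
print(rate_per_100_person_years(40, 5000))  # 0.8
```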
Subjects
Anticoagulants/therapeutic use , Antithrombins/therapeutic use , Atrial Fibrillation/complications , Dabigatran/therapeutic use , Warfarin/therapeutic use , Aged , Aged, 80 and over , Anticoagulants/adverse effects , Antithrombins/adverse effects , Dabigatran/adverse effects , Female , Hemorrhage/chemically induced , Hemorrhage/epidemiology , Humans , Incidence , Male , Middle Aged , Myocardial Infarction/epidemiology , Myocardial Infarction/prevention & control , Propensity Score , Retrospective Studies , Stroke/epidemiology , Stroke/prevention & control , Warfarin/adverse effects
ABSTRACT
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history of, and recent events relating to, quantitative drug safety evaluation at the FDA. It then focuses on five active areas of quantitative drug safety evaluation and the role the Division of Biometrics VII (DBVII) plays in them: meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended-release/long-acting opioids. The focus throughout is on developments related to quantitative drug safety evaluation, not on the many additional developments in drug safety in general.
Subjects
Legislation, Drug/trends, Pharmaceutical Preparations/standards, Safety/legislation & jurisprudence, Safety/standards, Biometry, Humans, Meta-Analysis as Topic, United States, United States Food and Drug Administration
Subjects
Data Collection/methods, Databases, Factual/statistics & numerical data, Drug Approval/methods, Product Surveillance, Postmarketing/methods, Computer Communication Networks/statistics & numerical data, Datasets as Topic, Decision Making, Organizational, Drug Approval/organization & administration, Humans, Pilot Projects, Product Surveillance, Postmarketing/statistics & numerical data, Propensity Score, Prospective Studies, United States, United States Food and Drug Administration/legislation & jurisprudence, United States Food and Drug Administration/organization & administration
ABSTRACT
BACKGROUND: Ornithine transcarbamylase deficiency (OTCD), due to an X-linked OTC mutation, is responsible for moderate to severe hyperammonemia (HA) with substantial morbidity and mortality. About 80% of females with OTCD remain apparently "asymptomatic," yet studies of their clinical characteristics and long-term health vulnerabilities are limited. Multimodal neuroimaging studies and executive function testing have shown that asymptomatic females exhibit limitations when stressed to perform at higher cognitive load and show reduced activation of the prefrontal cortex. This retrospective study aims to improve understanding of factors that might predict the development of defined complications and serious illness in apparently asymptomatic females. A proband and her daughter are presented to highlight the utility of multimodal neuroimaging studies and to underscore that asymptomatic females with OTCD are not always asymptomatic. METHODS: We review data from 302 heterozygous females with OTCD enrolled in the Urea Cycle Disorders Consortium (UCDC) longitudinal natural history database. We apply multiple neuroimaging modalities in the workup of a proband and her daughter. RESULTS: Among the females in the database, 143 were noted as symptomatic at baseline (Sym). We focused on females who were asymptomatic (Asx, n = 111) and those who were asymptomatic at enrollment but became symptomatic sometime during follow-up (Asx/Sym, n = 22). The majority of Asx (86%) and Asx/Sym (75%) subjects did not restrict protein at baseline, and ~38% of Asx and 33% of Asx/Sym subjects suffered from mild to severe neuropsychiatric conditions such as mood disorders and sleep problems. The risk of mild to severe HA sometime later in life for the Asx and Asx/Sym subjects as a combined group was ~4% (5/133), with ammonia ranging from 77 to 470 µM and at least half (2/4) of affected subjects requiring hospital admission and nitrogen scavenger therapy.
For this combined group, the median age at the first HA crisis was 50 years, whereas the median age at the first symptom, which included neuropsychiatric and/or behavioral symptoms, was 17 years. The multimodal neuroimaging findings further underscore that asymptomatic female heterozygotes with OTCD (e.g., the proband) are not always asymptomatic. CONCLUSIONS: Analysis of the Asx and Asx/Sym females with OTCD in this study suggests that future evidence-based management guidelines and/or a clinical risk score calculator for this cohort could be useful management tools to reduce morbidity and improve long-term quality of life.
Subjects
Ornithine Carbamoyltransferase Deficiency Disease, Adolescent, Female, Humans, Middle Aged, Hyperammonemia/etiology, Longitudinal Studies, Ornithine Carbamoyltransferase Deficiency Disease/diagnosis, Ornithine Carbamoyltransferase Deficiency Disease/genetics, Retrospective Studies, Urea Cycle Disorders, Inborn/epidemiology, Asymptomatic Diseases, Databases, Factual
ABSTRACT
Although industry and regulators have long been interested in decentralized clinical trials (DCTs), the Covid-19 pandemic accelerated and broadened the adoption of, and experience with, these trials. The key idea in decentralization is to bring trial activities, traditionally conducted on-site, closer to the patient's experience (on-site or off-site). Potential benefits of DCTs thus include reducing the burden of trial participation, broadening access to a more diverse population, and using innovative endpoints collected off-site. This paper helps researchers carefully evaluate the added value and the implications of DCTs beyond the operational aspects of their implementation. The proposed approach uses the ICH E9(R1) estimand framework to guide the strategic decisions around each decentralization component. The framework can also guide clinical trialists in systematically considering the implications of decentralization for each attribute of the estimand. We illustrate this approach with a fully decentralized trial case study and show that the proposed systematic process can uncover the scientific opportunities, assumptions, and potential risks associated with using decentralization components in the design of a trial. The process can also highlight the benefits of specifying estimand attributes in a granular way. We thus demonstrate that bringing a decentralization component into a design not only affects estimators and estimation but can also correspond to addressing more granular questions, thereby uncovering new target estimands.