ABSTRACT
BACKGROUND: Giant axonal neuropathy is a rare, autosomal recessive, pediatric, polysymptomatic, neurodegenerative disorder caused by biallelic loss-of-function variants in GAN, the gene encoding gigaxonin. METHODS: We conducted an intrathecal dose-escalation study of scAAV9/JeT-GAN (a self-complementary adeno-associated virus-based gene therapy containing the GAN transgene) in children with giant axonal neuropathy. Safety was the primary end point. The key secondary clinical end point was at least a 95% posterior probability of slowing the rate of change (i.e., slope) in the 32-item Motor Function Measure total percent score at 1 year after treatment, as compared with the pretreatment slope. RESULTS: One of four intrathecal doses of scAAV9/JeT-GAN was administered to 14 participants: 3.5×10¹³ total vector genomes (vg) (in 2 participants), 1.2×10¹⁴ vg (in 4), 1.8×10¹⁴ vg (in 5), and 3.5×10¹⁴ vg (in 3). During a median observation period of 68.7 months (range, 8.6 to 90.5), of 48 serious adverse events that had occurred, 1 (fever) was possibly related to treatment; 129 of 682 adverse events were possibly related to treatment. The mean pretreatment slope in the total cohort was -7.17 percentage points per year (95% credible interval, -8.36 to -5.97). At 1 year after treatment, posterior mean changes in slope were -0.54 percentage points (95% credible interval, -7.48 to 6.28) with the 3.5×10¹³-vg dose, 3.23 percentage points (95% credible interval, -1.27 to 7.65) with the 1.2×10¹⁴-vg dose, 5.32 percentage points (95% credible interval, 1.07 to 9.57) with the 1.8×10¹⁴-vg dose, and 3.43 percentage points (95% credible interval, -1.89 to 8.82) with the 3.5×10¹⁴-vg dose. The corresponding posterior probabilities for slowing the slope were 44% (95% credible interval, 43 to 44); 92% (95% credible interval, 92 to 93); 99% (95% credible interval, 99 to 99), which was above the efficacy threshold; and 90% (95% credible interval, 89 to 90). Between 6 and 24 months after gene transfer, sensory-nerve action potential amplitudes increased, stopped declining, or became recordable after being absent in 6 participants but remained absent in 8. CONCLUSIONS: Intrathecal gene transfer with scAAV9/JeT-GAN for giant axonal neuropathy was associated with adverse events and resulted in a possible benefit in motor function scores and other measures at some vector doses over a year. Further studies are warranted to determine the safety and efficacy of intrathecal AAV-mediated gene therapy in this disorder. (Funded by the National Institute of Neurological Disorders and Stroke and others; ClinicalTrials.gov number, NCT02362438.).
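As a rough illustration of the Bayesian end point above, the sketch below shows how a posterior probability of slowing the slope could be computed from posterior draws of the pre- and post-treatment slopes; the draws here are simulated stand-ins, not the trial's data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative posterior draws ONLY -- stand-ins for what a Bayesian
# mixed-effects model of the MFM32 total percent score might produce.
pre_slope = rng.normal(-7.2, 0.6, size=20_000)    # points/year before treatment
post_slope = rng.normal(-1.9, 2.1, size=20_000)   # points/year after treatment

change = post_slope - pre_slope                   # positive = slowing of decline
print(f"posterior mean change in slope: {change.mean():.2f} points/year")
print(f"95% credible interval: {np.percentile(change, [2.5, 97.5]).round(2)}")

# Posterior probability that the decline slowed; the trial's efficacy
# threshold for this end point was a posterior probability of at least 0.95.
print(f"P(slope slowed) = {(change > 0).mean():.3f}")
```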
Subject(s)
Gene Transfer Techniques; Genetic Therapy; Giant Axonal Neuropathy; Child; Humans; Cytoskeletal Proteins/genetics; Genetic Therapy/adverse effects; Genetic Therapy/methods; Giant Axonal Neuropathy/genetics; Giant Axonal Neuropathy/therapy; Transgenes; Injections, Spinal
ABSTRACT
We focus on Bayesian inference for survival probabilities in a prime-boost vaccination regimen in the development of an Ebola vaccine. We are interested in the heterologous prime-boost regimen (unmatched vaccine deliveries using the same antigen) due to its demonstrated durable immunity, well-tolerated safety profile, and suitability as a population vaccination strategy. Our research is motivated by the need to estimate the survival probability given the administered dosage. To do so, we establish two key relationships. First, we model the connection between the designed dose concentration and the induced antibody count using a Bayesian response surface model. Second, we model the association between the antibody count and the probability of survival when experimental subjects are exposed to the Ebola virus in a controlled setting using a Bayesian probability of survival model. Finally, we employ a combination of the two models with dose concentration as the predictor of the survival probability for a future vaccinated population. We implement our two-level Bayesian model in Stan and illustrate its use with simulated and real-world data. Performance of this model is evaluated via simulation. Our work offers a new application of drug synergy models to examine prime-boost vaccine efficacy, and does so using a hierarchical Bayesian framework that allows us to use dose concentration to predict survival probability.
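A minimal Python sketch of the composed prediction, assuming a log-linear response surface with an interaction term for the antibody count and a logistic survival model; the parameter draws below are made-up stand-ins for the Stan posterior, and all dose values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws = 10_000

def antibody_mean(prime, boost, theta):
    """Assumed dose-response surface: mean log10 antibody count as a function of
    the prime and boost dose concentrations, with an interaction (synergy) term."""
    b0, b1, b2, b12 = theta
    lp, lb = np.log1p(prime), np.log1p(boost)
    return b0 + b1 * lp + b2 * lb + b12 * lp * lb

def survival_prob(log_ab, phi):
    """Assumed logistic link from log10 antibody count to survival probability
    after controlled challenge."""
    a, b = phi
    return 1.0 / (1.0 + np.exp(-(a + b * log_ab)))

# Stand-ins for posterior draws of each submodel's parameters; in the paper these
# would come from the Stan fits to the immunogenicity and challenge data.
theta_draws = rng.normal([1.5, 0.6, 0.8, 0.1], [0.2, 0.1, 0.1, 0.05], size=(n_draws, 4))
phi_draws = rng.normal([-4.0, 1.8], [0.5, 0.2], size=(n_draws, 2))

prime_dose, boost_dose = 2.0, 8.0   # hypothetical dose concentrations
log_ab = antibody_mean(prime_dose, boost_dose, theta_draws.T) + rng.normal(0, 0.3, n_draws)
p_surv = survival_prob(log_ab, phi_draws.T)

print(f"predicted survival probability: mean {p_surv.mean():.2f}, "
      f"95% interval {np.percentile(p_surv, [2.5, 97.5]).round(2)}")
```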
Subject(s)
Ebola Vaccines; Hemorrhagic Fever, Ebola; Humans; Immunization, Secondary; Ebola Vaccines/pharmacology; Hemorrhagic Fever, Ebola/prevention & control; Bayes Theorem; Vaccination
ABSTRACT
In developing products for rare diseases, statistical challenges arise due to the limited number of patients available for participation in drug trials and other clinical research. Bayesian adaptive clinical trial designs offer the possibility of increased statistical efficiency, reduced development cost and ethical hazard prevention via their incorporation of evidence from external sources (historical data, expert opinions, and real-world evidence), and flexibility in the specification of interim looks. In this paper, we propose a novel Bayesian adaptive commensurate design that borrows adaptively from historical information and also uses a particular payoff function to optimize the timing of the study's interim analysis. The trial payoff is a function of how many samples can be saved via early stopping and the probability of making correct early decisions for either futility or efficacy. We calibrate our Bayesian algorithm to have acceptable long-run frequentist properties (Type I error and power) via simulation at the design stage. We illustrate our approach using a pediatric trial design setting testing the effect of a new drug for a rare genetic disease. The optimIA R package available at https://github.com/wxwx1993/Bayesian_IA_Timing provides an easy-to-use implementation of our approach.
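The payoff idea can be sketched with a toy beta-binomial example: at a candidate interim time, count the enrollments saved by early stopping and weight them by the chance that the early call is correct, then compare candidate interim timings. All response rates, cutoffs, and scenario weights below are assumptions; this is not the optimIA implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def simulate_interim(n_max, n_interim, p_true, p_null=0.3, n_sims=5000,
                     fut_cut=0.05, eff_cut=0.975):
    """Beta-binomial sketch of a single interim look for a one-arm binary-endpoint
    trial: returns the average number of enrollments saved by early stopping and
    the probability that any early decision made is the correct one."""
    saved, correct = np.zeros(n_sims), np.ones(n_sims, dtype=bool)
    x = rng.binomial(n_interim, p_true, n_sims)
    # Posterior probability (Beta(1,1) prior) that the response rate beats p_null.
    post = 1.0 - stats.beta.cdf(p_null, 1 + x, 1 + n_interim - x)
    stop_eff, stop_fut = post > eff_cut, post < fut_cut
    saved[stop_eff | stop_fut] = n_max - n_interim
    correct[stop_eff] = p_true > p_null
    correct[stop_fut] = p_true <= p_null
    return saved.mean(), correct.mean()

# Payoff = expected samples saved, weighted by the chance the early call is right,
# averaged over one efficacious and one futile scenario (all numbers are assumed).
for n_interim in (10, 20, 30, 40):
    payoff = np.mean([np.prod(simulate_interim(50, n_interim, p_true))
                      for p_true in (0.3, 0.5)])
    print(f"interim look at n={n_interim}: payoff ≈ {payoff:.1f}")
```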
Subject(s)
Medical Futility; Research Design; Algorithms; Bayes Theorem; Child; Computer Simulation; Humans
ABSTRACT
In most drug development settings, the regulatory approval process is accompanied by extensive studies performed to understand the drug's pharmacokinetic (PK) and pharmacodynamic (PD) properties. In this article, we attempt to utilize the rich PK/PD data to inform the borrowing of information from adults during pediatric drug development. In pediatric settings, it is especially crucial that we are parsimonious with the patients recruited for experimentation. We illustrate our approaches in the context of clinical trials of cinacalcet for treating secondary hyperparathyroidism in pediatric and adult patients with chronic kidney disease, where we model both parathyroid hormone (efficacy endpoint) and corrected calcium levels (safety endpoint). We use population PK/PD modeling of the cinacalcet data to quantitatively assess the similarity between adults and children, and use this information in various hierarchical Bayesian adult borrowing rules whose statistical properties can then be evaluated. In particular, we simulate the bias and mean square error performance of our approaches in settings where borrowing is and is not warranted to inform guidelines for the future use of our methods.
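A minimal sketch of the kind of bias and mean squared error simulation described above, using a simple normal-normal shrinkage estimator as a stand-in for the hierarchical borrowing rules; the effect sizes and variances are assumptions, not the cinacalcet data.

```python
import numpy as np

rng = np.random.default_rng(3)

def borrowed_estimate(y_ped, se_ped, y_adult, se_adult, tau):
    """Normal-normal shrinkage: the pediatric estimate is pulled toward the adult
    estimate by an amount governed by the between-source SD tau
    (small tau = strong borrowing, large tau = essentially no borrowing)."""
    w = 1.0 / (se_adult**2 + tau**2)   # precision of adult info about the pediatric effect
    v = 1.0 / se_ped**2
    return (v * y_ped + w * y_adult) / (v + w)

def simulate(theta_ped, theta_adult, se_ped=2.0, se_adult=0.5, tau=1.0, n_sims=20_000):
    y_ped = rng.normal(theta_ped, se_ped, n_sims)
    y_adult = rng.normal(theta_adult, se_adult, n_sims)
    est = borrowed_estimate(y_ped, se_ped, y_adult, se_adult, tau)
    return est.mean() - theta_ped, np.mean((est - theta_ped) ** 2)

# Borrowing warranted (adults and children truly similar) vs. not warranted.
for label, theta_adult in [("similar", -10.0), ("dissimilar", -4.0)]:
    bias, mse = simulate(theta_ped=-10.0, theta_adult=theta_adult)
    print(f"{label:10s}: bias={bias:+.2f}, MSE={mse:.2f}")
```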
Subject(s)
Cinacalcet/pharmacokinetics; Clinical Trials as Topic/statistics & numerical data; Drug Development/statistics & numerical data; Hyperparathyroidism, Secondary/drug therapy; Research Design/statistics & numerical data; Age Factors; Bayes Theorem; Biomarkers/blood; Calcium/blood; Cinacalcet/adverse effects; Computer Simulation; Data Interpretation, Statistical; Humans; Hyperparathyroidism, Secondary/blood; Hyperparathyroidism, Secondary/diagnosis; Models, Statistical; Parathyroid Hormone/blood; Time Factors; Treatment Outcome
ABSTRACT
BACKGROUND: A recent focus in the health sciences has been the development of personalized medicine, which includes determining the population for which a given treatment is effective. Due to limited data, identifying the true benefiting population is a challenging task. To tackle this difficulty, the credible subgroups approach provides a pair of bounding subgroups for the true benefiting subgroup, constructed so that one is contained by the benefiting subgroup while the other contains the benefiting subgroup with high probability. However, the method has so far only been developed for parametric linear models. METHODS: In this article, we develop the details required to follow the credible subgroups approach in more realistic settings by considering nonlinear and semiparametric regression models, supported for regulatory science by conditional power simulations. We also present an improved multiple testing approach using a step-down procedure. We evaluate our approach via simulations and apply it to data from four trials of Alzheimer's disease treatments carried out by AbbVie. RESULTS: Semiparametric modeling yields credible subgroups that are more robust to violations of linear treatment effect assumptions, and careful choice of the population of interest, together with the step-down multiple testing procedure, results in a higher rate of detection of benefiting types of patients. The approach allows us to identify types of patients that benefit from treatment in the Alzheimer's disease trials. CONCLUSION: Attempts to identify benefiting subgroups of patients in clinical trials are often met with skepticism due to a lack of multiplicity control and unrealistically restrictive assumptions. Our proposed approach merges two techniques, credible subgroups and semiparametric regression, which together avoid these problems and make benefiting subgroup identification practical and reliable.
Subject(s)
Clinical Trials as Topic/methods; Models, Statistical; Precision Medicine/methods; Age Factors; Algorithms; Alzheimer Disease/drug therapy; Alzheimer Disease/genetics; Computer Simulation; Humans; Monte Carlo Method; Regression Analysis; Research Design; Severity of Illness Index; Sex Factors
ABSTRACT
Children represent a large underserved population of "therapeutic orphans," as an estimated 80% of children are treated off-label. However, pediatric drug development often faces substantial challenges, including economic, logistical, technical, and ethical barriers, among others. Among many efforts trying to remove these barriers, increased recent attention has been paid to extrapolation; that is, the leveraging of available data from adults or older age groups to draw conclusions for the pediatric population. The Bayesian statistical paradigm is natural in this setting, as it permits the combining (or "borrowing") of information across disparate sources, such as the adult and pediatric data. In this paper, authored by the pediatric subteam of the Drug Information Association Bayesian Scientific Working Group and Adaptive Design Working Group, we develop, illustrate, and provide suggestions on Bayesian statistical methods that could be used to design improved pediatric development programs that use all available information in the most efficient manner. A variety of relevant Bayesian approaches are described, several of which are illustrated through 2 case studies: extrapolating adult efficacy data to expand the labeling for Remicade to include pediatric ulcerative colitis and extrapolating adult exposure-response information for antiepileptic drugs to pediatrics.
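For concreteness, one simple borrowing device from this toolbox is the power prior, which raises the adult likelihood to a power a0 between 0 (no borrowing) and 1 (full pooling). A sketch for a binomial endpoint with a conjugate Beta prior; the counts below are assumed, not the Remicade or antiepileptic case-study data.

```python
import numpy as np
from scipy import stats

# Power prior for a pediatric response rate with a binomial endpoint:
# the adult likelihood is raised to the power a0, which with a Beta(1, 1)
# initial prior keeps the posterior conjugate.
x_adult, n_adult = 120, 200   # assumed adult responders / adult sample size
x_ped, n_ped = 14, 25         # assumed pediatric responders / pediatric sample size

for a0 in (0.0, 0.25, 0.5, 1.0):      # 0 = no borrowing, 1 = full pooling
    a_post = 1 + a0 * x_adult + x_ped
    b_post = 1 + a0 * (n_adult - x_adult) + (n_ped - x_ped)
    lo, hi = stats.beta.ppf([0.025, 0.975], a_post, b_post)
    print(f"a0={a0:.2f}: posterior mean={a_post / (a_post + b_post):.3f}, "
          f"95% CrI=({lo:.3f}, {hi:.3f})")
```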
Subject(s)
Clinical Trials as Topic; Adult; Bayes Theorem; Colitis, Ulcerative; Drug Evaluation; Humans; Models, Statistical; Research Design
ABSTRACT
Many new experimental treatments benefit only a subset of the population. Identifying the baseline covariate profiles of patients who benefit from such a treatment, rather than determining whether or not the treatment has a population-level effect, can substantially lessen the risk in undertaking a clinical trial and expose fewer patients to treatments that do not benefit them. The standard analyses for identifying patient subgroups that benefit from an experimental treatment either do not account for multiplicity, or focus on testing for the presence of treatment-covariate interactions rather than the resulting individualized treatment effects. We propose a Bayesian credible subgroups method to identify two bounding subgroups for the benefiting subgroup: one for which it is likely that all members simultaneously have a treatment effect exceeding a specified threshold, and another for which it is likely that no members do. We examine frequentist properties of the credible subgroups method via simulations and illustrate the approach using data from an Alzheimer's disease treatment trial. We conclude with a discussion of the advantages and limitations of this approach to identifying patients for whom the treatment is beneficial.
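A minimal sketch of how the two bounding subgroups can be read off posterior draws of covariate-profile-specific treatment effects, here via simultaneous credible bands over a one-dimensional covariate grid. The data, model, threshold, and error level are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative posterior draws of personalized treatment effects Delta(x) on a
# grid of covariate profiles x (these would come from a fitted regression model).
n_draws, n_profiles = 8000, 50
x = np.linspace(-2, 2, n_profiles)
true_effect = 1.5 * x                                # assumed: benefit grows with x
draws = (true_effect
         + rng.normal(0, 0.6, (n_draws, n_profiles))
         + rng.normal(0, 0.4, (n_draws, 1)))         # shared draw-level uncertainty

delta, alpha = 0.0, 0.20                             # benefit threshold and error level
m, s = draws.mean(axis=0), draws.std(axis=0)

# Simultaneous credible band: scale pointwise SDs by the (1 - alpha) quantile of
# the maximum standardized deviation across profiles within each draw.
w = np.quantile(np.abs((draws - m) / s).max(axis=1), 1 - alpha)
lower, upper = m - w * s, m + w * s

exclusive = x[lower > delta]   # profiles that very likely ALL benefit
inclusive = x[upper > delta]   # profiles outside this set very likely do NOT benefit
print("exclusive credible subgroup (all very likely benefit):",
      f"x >= {exclusive.min():.2f}" if exclusive.size else "(empty)")
print("inclusive credible subgroup (others very likely do not):",
      f"x >= {inclusive.min():.2f}" if inclusive.size else "(empty)")
```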
Subject(s)
Models, Statistical; Patient Selection; Remission Induction; Alzheimer Disease/drug therapy; Bayes Theorem; Clinical Trials as Topic; Computer Simulation; Data Interpretation, Statistical; Humans; Treatment Outcome
ABSTRACT
Network meta-analysis (NMA), also known as multiple treatment comparisons, is commonly used to incorporate direct and indirect evidence comparing treatments. With recent advances in methods and software, Bayesian approaches to NMA have become quite popular and allow models of previously unanticipated complexity. However, when direct and indirect evidence differ in an NMA, the model is said to suffer from inconsistency. Current inconsistency detection in NMA is usually based on contrast-based (CB) models; however, this approach has certain limitations. In this work, we propose an arm-based random effects model, where we detect discrepancy of direct and indirect evidence for comparing two treatments using the fixed effects in the model while flagging extreme trials using the random effects. We define discrepancy factors to characterize evidence of inconsistency for particular treatment comparisons, which is novel in NMA research. Our approaches permit users to address issues previously tackled via CB models. We compare sources of inconsistency identified by our approach and existing loop-based CB methods using real and simulated datasets and demonstrate that our methods can offer powerful inconsistency detection.
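For intuition only, the snippet below shows the classical loop-based (Bucher-style) contrast-based check of direct versus indirect evidence that the arm-based approach is compared against; it is not the proposed discrepancy-factor method, and the estimates are made up.

```python
import numpy as np

# Bucher-style check of one A-B-C loop on the log odds ratio scale.
d_AB, se_AB = -0.35, 0.12   # direct A vs B estimate and standard error (made up)
d_AC, se_AC = -0.60, 0.15   # direct A vs C
d_CB, se_CB = 0.10, 0.14    # direct C vs B

indirect_AB = d_AC + d_CB                     # indirect A vs B via C
se_indirect = np.sqrt(se_AC**2 + se_CB**2)

diff = d_AB - indirect_AB                     # inconsistency in this loop
se_diff = np.sqrt(se_AB**2 + se_indirect**2)
print(f"direct - indirect = {diff:.2f} (z = {diff / se_diff:.2f})")
```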
Subject(s)
Bayes Theorem; Meta-Analysis as Topic; Humans; Network Meta-Analysis; Software
ABSTRACT
AIMS: X-linked adrenoleukodystrophy (X-ALD) is a peroxisomal disorder, most commonly affecting boys, associated with increased very long chain fatty acids (C26:0) in all tissues, causing cerebral demyelination and adrenocortical insufficiency. Certain monounsaturated long chain fatty acids, including oleic and erucic acids, known as Lorenzo's oil (LO), lower plasma C26:0 levels. The aims of this study were to characterize the effect of LO administration on plasma C26:0 concentrations and to determine whether there is an association between plasma concentrations of erucic acid or C26:0 and the likelihood of developing brain MRI abnormalities in asymptomatic boys. METHODS: Non-linear mixed effects modelling was performed on 2384 samples collected during an open label single arm trial. The subjects (n = 104) were administered LO daily at ~2-3 mg kg⁻¹ with a mean follow-up of 4.88 ± 2.76 years. The effect of erucic acid exposure on plasma C26:0 concentrations was characterized by an inhibitory fractional Emax model. A Weibull model was used to characterize the time to development of MRI abnormality. RESULTS: The population estimate for the fractional maximum reduction of C26:0 plasma concentrations was 0.76 (bootstrap 95% CI 0.73, 0.793). Our time-to-event analyses showed that every mg L⁻¹ increase in the time-weighted average of erucic acid and C26:0 plasma concentrations was, respectively, associated with a 3.7% reduction and a 753% increase in the hazard of developing MRI abnormality. However, these results were not significant (P = 0.5344 and 0.1509, respectively). CONCLUSIONS: LO administration significantly reduces the abnormally high plasma C26:0 concentrations in X-ALD patients. Further studies to evaluate the effect of LO on the likelihood of developing brain MRI abnormality are warranted.
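The two model components can be written down compactly. The sketch below encodes an inhibitory fractional Emax relationship and a Weibull hazard; only the fractional maximum reduction of 0.76 is taken from the abstract, while the IC50, hazard parameters, and units are assumed for illustration.

```python
import numpy as np

def c26_fraction(erucic_conc, imax=0.76, ic50=50.0):
    """Inhibitory fractional Emax model: plasma C26:0 relative to baseline as a
    function of erucic acid exposure. imax = 0.76 is the population estimate from
    the abstract; ic50 (mg/L) is an assumed value for illustration."""
    return 1.0 - imax * erucic_conc / (ic50 + erucic_conc)

def weibull_hazard(t, lam=0.05, k=1.4):
    """Weibull hazard for time to MRI abnormality; lam (rate) and k (shape) are assumed."""
    return lam * k * (lam * t) ** (k - 1)

for conc in (0.0, 25.0, 100.0, 400.0):
    print(f"erucic acid {conc:6.1f} mg/L -> C26:0 at {100 * c26_fraction(conc):.0f}% of baseline")

t = np.linspace(0.5, 8, 4)
print("hazard of MRI abnormality at years", t, "=", weibull_hazard(t).round(3))
```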
Subject(s)
Adrenoleukodystrophy/metabolism; Adrenoleukodystrophy/pathology; Brain/pathology; Erucic Acids/blood; Erucic Acids/pharmacokinetics; Erucic Acids/therapeutic use; Fatty Acids/blood; Models, Biological; Triolein/pharmacokinetics; Triolein/therapeutic use; Adrenoleukodystrophy/blood; Child; Child, Preschool; Drug Combinations; Erucic Acids/pharmacology; Humans; Infant; Magnetic Resonance Imaging; Male; Neuroimaging; Triolein/pharmacology
ABSTRACT
BACKGROUND: Many clinical trial designs are impractical for community-based clinical intervention trials. Stepped wedge trial designs provide practical advantages, but few descriptions exist of their clinical implementational features, statistical design efficiencies, and limitations. OBJECTIVES: Enhance efficiency of stepped wedge trial designs by evaluating the impact of design characteristics on statistical power for the British Columbia Telehealth Trial. METHODS: The British Columbia Telehealth Trial is a community-based, cluster-randomized, controlled clinical trial in rural and urban British Columbia. To determine the effect of an Internet-based telehealth intervention on healthcare utilization, 1000 subjects with an existing diagnosis of congestive heart failure or type 2 diabetes will be enrolled from 50 clinical practices. Hospital utilization is measured using a composite of disease-specific hospital admissions and emergency visits. The intervention comprises online telehealth data collection and counseling provided to support a disease-specific action plan developed by the primary care provider. The planned intervention is sequentially introduced across all participating practices. We adopt a fully Bayesian, Markov chain Monte Carlo-driven statistical approach, wherein we use simulation to determine the effect of cluster size, sample size, and crossover interval choice on type I error and power to evaluate differences in hospital utilization. RESULTS: For our Bayesian stepped wedge trial design, simulations suggest moderate decreases in power when crossover intervals from control to intervention are reduced from every 3 to 2 weeks, and dramatic decreases in power as the numbers of clusters decrease. Power and type I error performance were not notably affected by the addition of nonzero cluster effects or a temporal trend in hospitalization intensity. CONCLUSION/LIMITATIONS: Stepped wedge trial designs that intervene in small clusters across longer periods can provide enhanced power to evaluate comparative effectiveness, while offering practical implementation advantages in geographic stratification, temporal change, use of existing data, and resource distribution. Current population estimates were used; however, models may not reflect actual event rates during the trial. In addition, temporal or spatial heterogeneity can bias treatment effect estimates.
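A crude version of the Monte Carlo power calculation described above: simulate stepped wedge data with cluster effects and a secular trend, analyze each replicate, and count rejections. Here an ordinary least-squares z-test on cluster-period means stands in for the trial's fully Bayesian MCMC analysis, and all design parameters and effect sizes are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def simulate_power(n_clusters=50, n_per_cluster_period=5, n_periods=6,
                   effect=-0.3, cluster_sd=0.4, time_trend=-0.02,
                   resid_sd=1.0, n_sims=500, alpha=0.05):
    """Clusters cross from control to intervention in evenly spaced waves; the
    continuous utilization outcome is analyzed with a plain OLS z-test on the
    treatment indicator (a crude stand-in for a hierarchical Bayesian model)."""
    crossover = np.repeat(np.arange(1, n_periods),
                          int(np.ceil(n_clusters / (n_periods - 1))))[:n_clusters]
    rejections = 0
    for _ in range(n_sims):
        rows = []
        cluster_eff = rng.normal(0, cluster_sd, n_clusters)
        for c in range(n_clusters):
            for p in range(n_periods):
                treated = float(p >= crossover[c])
                y = (cluster_eff[c] + time_trend * p + effect * treated
                     + rng.normal(0, resid_sd, n_per_cluster_period))
                rows.append((treated, p, y.mean()))      # cluster-period mean
        treated_v, period_v, y_v = map(np.array, zip(*rows))
        X = np.column_stack([np.ones_like(y_v), treated_v, period_v])
        beta, res, *_ = np.linalg.lstsq(X, y_v, rcond=None)
        sigma2 = res[0] / (len(y_v) - X.shape[1])
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        if abs(beta[1] / se) > stats.norm.ppf(1 - alpha / 2):
            rejections += 1
    return rejections / n_sims

print(f"estimated power ≈ {simulate_power():.2f}")
```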
Subject(s)
Comparative Effectiveness Research/methods; Diabetes Mellitus/therapy; Heart Failure/therapy; Patient Compliance; Randomized Controlled Trials as Topic/methods; Telemedicine; Bayes Theorem; British Columbia; Cross-Over Studies; Emergency Service, Hospital/statistics & numerical data; Hospitalization; Humans; Internet; Markov Chains; Monte Carlo Method; Patient Care Planning; Pragmatic Clinical Trials as Topic; Research Design
ABSTRACT
X-linked adrenoleukodystrophy (X-ALD) is a rare, progressive, and typically fatal neurodegenerative disease. Lorenzo's oil (LO) is one of the few X-ALD treatments available, but little has been done to establish its clinical efficacy or indications for its use. In this article, we analyze data on 116 male asymptomatic pediatric patients who were administered LO. We offer a hierarchical Bayesian statistical approach to understand LO pharmacokinetics (PK) and pharmacodynamics (PD) resulting from an accumulation of very long-chain fatty acids. We experiment with individual- and observational-level errors and various choices of prior distributions and deal with the limitation of having just one observation per administration of the drug, as opposed to the more usual multiple observations per administration. We link LO dose to the plasma erucic acid concentrations by PK modeling, and then link this concentration to a biomarker (C26, a very long-chain fatty acid) by PD modeling. Next, we design a Bayesian Phase IIa study to estimate precisely what improvements in the biomarker can arise from various LO doses while simultaneously modeling a binary toxicity endpoint. Our Bayesian adaptive algorithm emerges as reasonably robust and efficient while still retaining good classical (frequentist) operating characteristics. Future work looks toward using the results of this trial to design a Phase III study linking LO dose to actual improvements in health status, as measured by the appearance of brain lesions observed via magnetic resonance imaging.
Subject(s)
Adrenoleukodystrophy/drug therapy; Bayes Theorem; Clinical Trials, Phase II as Topic; Erucic Acids/pharmacokinetics; Research Design; Triolein/pharmacokinetics; Dose-Response Relationship, Drug; Drug Combinations; Erucic Acids/blood; Erucic Acids/therapeutic use; Humans; Male; Orphan Drug Production; Triolein/therapeutic use
ABSTRACT
Stochastic process models are widely employed for analyzing spatiotemporal datasets in various scientific disciplines including, but not limited to, environmental monitoring, ecological systems, forestry, hydrology, meteorology, and public health. After inferring on a spatiotemporal process for a given dataset, inferential interest may turn to estimating rates of change, or gradients, over space and time. This manuscript develops fully model-based inference on spatiotemporal gradients under continuous space, continuous time settings. Our contribution is to offer, within a flexible spatiotemporal process model setting, a framework to estimate arbitrary directional gradients over space at any given timepoint, temporal derivatives at any given spatial location and, finally, mixed spatiotemporal gradients that reflect rapid change in spatial gradients over time and vice-versa. We achieve such inference without compromising on rich and flexible spatiotemporal process models and use nonseparable covariance structures. We illustrate our methodology using a simulated data example and subsequently apply it to a dataset of daily PM2.5 concentrations in California, where the spatiotemporal gradient process reveals the effects of California's unique topography on pollution and detects the aftermath of a devastating series of wildfires.
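The central computation can be illustrated in a purely spatial toy example: because differentiation is a linear operation, the gradient of a Gaussian process posterior mean is obtained by differentiating the cross-covariance function. The sketch below uses a squared-exponential kernel with assumed parameters, not the nonseparable spatiotemporal covariances used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated observations of a smooth surface f(s) = sin(s1) + 0.5*cos(s2) plus noise.
n = 150
S = rng.uniform(0, 4, (n, 2))
y = np.sin(S[:, 0]) + 0.5 * np.cos(S[:, 1]) + rng.normal(0, 0.05, n)

sigma2, ell, tau2 = 1.0, 0.8, 0.05**2   # assumed covariance and noise parameters

def k(A, B):
    """Squared-exponential covariance between two sets of locations."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sigma2 * np.exp(-0.5 * d2 / ell**2)

K_inv_y = np.linalg.solve(k(S, S) + tau2 * np.eye(n), y)

def posterior_mean_gradient(s0):
    """Gradient of the GP posterior mean at s0: differentiate the cross-covariance
    k(s0, s_i) with respect to s0 and apply the usual kriging weights."""
    diff = s0[None, :] - S                          # (n, 2)
    k_vec = sigma2 * np.exp(-0.5 * (diff**2).sum(1) / ell**2)
    grad_k = -(diff / ell**2) * k_vec[:, None]      # d k(s0, s_i) / d s0
    return grad_k.T @ K_inv_y

s0 = np.array([2.0, 1.0])
print("estimated gradient:", posterior_mean_gradient(s0).round(2))
print("true gradient:     ", np.array([np.cos(2.0), -0.5 * np.sin(1.0)]).round(2))
```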
Subject(s)
Air Pollution/statistics & numerical data; Bayes Theorem; Fires/statistics & numerical data; Models, Statistical; Particulate Matter/analysis; Spatio-Temporal Analysis; Air Pollution/analysis; California; Computer Simulation; Data Interpretation, Statistical; Humans; Reproducibility of Results; Sensitivity and Specificity
ABSTRACT
Availability of individual patient-level data (IPD) broadens the scope of network meta-analysis (NMA) and enables us to incorporate patient-level information. Although IPD is a potential gold mine in biomedical areas, methodological development has been slow owing to limited access to such data. In this paper, we propose a Bayesian IPD NMA modeling framework for multiple continuous outcomes under both contrast-based and arm-based parameterizations. We incorporate individual covariate-by-treatment interactions to facilitate personalized decision making. Furthermore, we can find subpopulations performing well with a certain drug in terms of predictive outcomes. We also impute missing individual covariates via an MCMC algorithm. We illustrate this approach using diabetes data that include continuous bivariate efficacy outcomes and three baseline covariates and show its practical implications. Finally, we close with a discussion of our results, a review of computational challenges, and a brief description of areas for future research.
Subject(s)
Biomarkers; Diabetes Mellitus/therapy; Medical Records; Meta-Analysis as Topic; Algorithms; Bayes Theorem; Humans; Middle Aged; Outcome Assessment, Health Care/statistics & numerical data
ABSTRACT
Network meta-analysis (NMA) expands the scope of a conventional pairwise meta-analysis to simultaneously handle multiple treatment comparisons. However, some trials may appear to deviate markedly from the others and thus be inappropriate to be synthesized in the NMA. In addition, the inclusion of these trials in evidence synthesis may lead to bias in estimation. We call such trials trial-level outliers. To the best of our knowledge, while heterogeneity and inconsistency in NMA have been extensively discussed and well addressed, few previous papers have considered the proper detection and handling of trial-level outliers. In this paper, we propose several Bayesian outlier detection measures, which are then applied to a diabetes data set. Simulation studies comparing our approaches in both arm-based and contrast-based model settings are provided in two supporting appendices.
Subject(s)
Bayes Theorem; Bias; Clinical Trials as Topic/statistics & numerical data; Meta-Analysis as Topic; Clinical Trials as Topic/methods; Humans; Hypoglycemic Agents/pharmacology; Normal Distribution
ABSTRACT
Although numerous studies have found a positive association between the density of alcohol establishments and various types of crime, few have examined how neighborhood attributes (e.g., schools, parks) could moderate this association. We used data from Minneapolis, MN with neighborhood as the unit of analysis (n = 83). We examined eight types of crime (assault, rape, robbery, vandalism, nuisance crime, public alcohol consumption, driving while intoxicated, underage alcohol possession/consumption) and measured density as the total number of establishments per roadway mile. Neighborhood attributes assessed as potential moderators included non-alcohol businesses, schools, parks, religious institutions, neighborhood activism, neighborhood quality, and number of condemned houses. Using Bayesian techniques, we created a model for each crime outcome (accounting for spatial auto-correlation and controlling for relevant demographics) with an interaction term (moderator × density) to test each potential moderating effect. Few interaction terms were statistically significant. The presence of at least one college was the only neighborhood attribute that consistently moderated the density-crime association, with the presence of a college attenuating the association between the density and three types of crime (assaults, nuisance crime, and public consumption). However, caution should be used when interpreting the moderating effect of college presence because of the small number of colleges in our sample. The lack of moderating effects of neighborhood attributes, except for presence of a college, suggests that the addition of alcohol establishments to any neighborhood, regardless of its other attributes, could result in an increase in a wide range of crime.
Subject(s)
Alcohol Drinking; Crime; Residence Characteristics; Restaurants; Humans; Minnesota
ABSTRACT
Trial investigators often have a primary interest in the estimation of the survival curve in a population for which there exists acceptable historical information from which to borrow strength. However, borrowing strength from a historical trial that is non-exchangeable with the current trial can result in biased conclusions. In this article we propose a fully Bayesian semiparametric method for the purpose of attenuating bias and increasing efficiency when jointly modeling time-to-event data from two possibly non-exchangeable sources of information. We illustrate the mechanics of our methods by applying them to a pair of post-market surveillance datasets regarding adverse events in persons on dialysis that had either a bare metal or drug-eluting stent implanted during a cardiac revascularization surgery. We finish with a discussion of the advantages and limitations of this approach to evidence synthesis, as well as directions for future work in this area. The article's Supplementary Materials offer simulations to show our procedure's bias, mean squared error, and coverage probability properties in a variety of settings.
Subject(s)
Bayes Theorem; Models, Statistical; Product Surveillance, Postmarketing/methods; Survival Analysis; Adult; Aged; Computer Simulation; Drug-Eluting Stents/standards; Humans; Middle Aged; Percutaneous Coronary Intervention/methods; Peritoneal Dialysis
ABSTRACT
BACKGROUND: In the absence of sufficient data directly comparing multiple treatments, indirect comparisons using network meta-analyses (NMAs) can provide useful information. Under current contrast-based (CB) methods for binary outcomes, patient-centered measures, including treatment-specific event rates and risk differences (RDs), are not provided, which may create unnecessary obstacles for patients trying to comprehensively trade off efficacy and safety measures. PURPOSE: We aim to develop NMA methods that accurately estimate treatment-specific event rates. METHODS: A Bayesian hierarchical model is developed to illustrate how treatment-specific event rates, RDs, and risk ratios (RRs) can be estimated. We first compare our approach to alternative methods using two hypothetical NMAs assuming a fixed RR or RD, and then use two published NMAs to illustrate the improved reporting. RESULTS: In the hypothetical NMAs, our approach outperforms current CB NMA methods in terms of bias. In the two published NMAs, noticeable differences are observed in the magnitude of relative treatment effects and in several pairwise statistical significance tests compared with the previous reports. LIMITATIONS: First, to facilitate the estimation, each study is assumed to hypothetically compare all treatments, with unstudied arms being missing at random. It is plausible that investigators may have selected treatment arms on purpose based on the results of previous trials, which may lead to 'nonignorable missingness' and potentially bias our estimates. Second, we have not considered methods to identify and account for potential inconsistency between direct and indirect comparisons. CONCLUSIONS: The proposed NMA method can accurately estimate treatment-specific event rates, RDs, and RRs and is recommended.
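Once an arm-based model yields posterior draws of treatment-specific event probabilities, the patient-centered summaries follow directly. A sketch with fabricated draws standing in for a fitted NMA:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-ins for posterior draws of population-level event probabilities for
# three treatments, as an arm-based NMA would produce (numbers are made up;
# here the draws are independent, whereas a fitted model would typically
# produce correlated draws across arms).
draws = {
    "A": rng.beta(30, 70, 10_000),
    "B": rng.beta(24, 76, 10_000),
    "C": rng.beta(18, 82, 10_000),
}

def summarize(x, label):
    lo, hi = np.percentile(x, [2.5, 97.5])
    print(f"{label:28s} {x.mean():.3f} ({lo:.3f}, {hi:.3f})")

for t, p in draws.items():
    summarize(p, f"event rate, treatment {t}")

# Risk differences and risk ratios versus treatment A, computed draw by draw so
# that any posterior dependence between arms carries through to the intervals.
for t in ("B", "C"):
    summarize(draws[t] - draws["A"], f"risk difference {t} vs A")
    summarize(draws[t] / draws["A"], f"risk ratio      {t} vs A")
```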
Subject(s)
Meta-Analysis as Topic; Randomized Controlled Trials as Topic; Bayes Theorem; Humans; Odds Ratio
ABSTRACT
The Drug Information Association Bayesian Scientific Working Group (BSWG) was formed in 2011 with a vision to ensure that Bayesian methods are well understood and broadly utilized for design and analysis throughout the medical product development process, and to improve industrial, regulatory, and economic decision making. The group, composed of individuals from academia, industry, and regulatory agencies, has as its mission to facilitate the appropriate use of, and contribute to the progress of, Bayesian methodology. In this paper, the safety sub-team of the BSWG explores the use of Bayesian methods in drug safety meta-analysis and network meta-analysis. Guidance is presented on the conduct and reporting of such analyses. We also discuss different structural model assumptions and considerations for prior specification. The work is illustrated through a case study involving a network meta-analysis related to the cardiovascular safety of non-steroidal anti-inflammatory drugs.
Subject(s)
Anti-Inflammatory Agents, Non-Steroidal/adverse effects; Bayes Theorem; Meta-Analysis as Topic; Cardiovascular Diseases/chemically induced; Drug Discovery; Humans
ABSTRACT
BACKGROUND: Postmarket device surveillance studies often have important primary objectives tied to estimating a survival function at some future time T with a certain amount of precision. PURPOSE: This article presents the details and various operating characteristics of a Bayesian adaptive design for device surveillance, as well as a method for estimating a sample size vector (determined by the maximum sample size and a preset number of interim looks) that will deliver the desired power. METHODS: We adopt a Bayesian adaptive framework, which recognizes the fact that persons enrolled in a study report their results over time, not all at once. At each interim look, we assess whether we expect to achieve our goals with only the current group or the achievement of such goals is extremely unlikely even for the maximum sample size. RESULTS: Our Bayesian adaptive design can outperform two nonadaptive frequentist methods currently recommended by Food and Drug Administration (FDA) guidance documents in many settings. LIMITATIONS: Our method's performance can be sensitive to model misspecification and changes in the trial's enrollment rate. CONCLUSIONS: The proposed design provides a more efficient framework for conducting postmarket surveillance of medical devices.
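The interim rule described above is commonly operationalized with posterior predictive probabilities. A simplified sketch for a binary "event-free at time T" endpoint with a conjugate beta-binomial model; the thresholds, interim counts, and maximum sample size are assumptions, not the paper's survival model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def predictive_prob_success(x, n, n_target, goal_rate=0.85, post_cut=0.975,
                            a0=1.0, b0=1.0, n_sims=20_000):
    """Posterior predictive probability that, once n_target subjects are observed,
    the posterior probability that the event-free rate exceeds goal_rate will
    clear post_cut. x = current successes, n = current enrollment."""
    a, b = a0 + x, b0 + n - x
    p_draw = rng.beta(a, b, n_sims)                   # plausible true event-free rates
    x_future = rng.binomial(n_target - n, p_draw)     # imagine the remaining data
    post_final = 1 - stats.beta.cdf(goal_rate, a + x_future, b + (n_target - n) - x_future)
    return np.mean(post_final > post_cut)

# Interim look with 52 of 60 subjects event-free at time T, maximum sample size 200.
pp = predictive_prob_success(x=52, n=60, n_target=200)
print(f"predictive probability of eventual success: {pp:.2f}")
# Typical (assumed) rules: claim expected success if this probability is already
# high with the current group alone; stop for futility if it is tiny even at the
# maximum sample size.
```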
Subject(s)
Bayes Theorem; Equipment Failure/statistics & numerical data; Equipment and Supplies/statistics & numerical data; Probability Theory; Data Collection; Humans; Research Design
ABSTRACT
BACKGROUND: Prospective trial design often occurs in the presence of 'acceptable' historical control data. Typically, these data are only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. PURPOSE: We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al., succeeded a similar trial reported by Saltz et al., and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. METHODS: The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS), characterizing relative informativeness defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial's frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. RESULTS: Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure lead to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. LIMITATIONS: Adaptive randomization methods in general are sensitive to population drift and more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. CONCLUSIONS: The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on preexisting information is unavoidable because the control therapy is exceptionally hazardous, expensive, or the disease is rare.
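A toy version of the allocation idea, with a normal endpoint and a single commensurability variance standing in for the paper's piecewise exponential survival model and commensurate priors: the smaller the apparent heterogeneity between historical and concurrent controls, the larger the effective historical sample size credited to the control arm, and the more new patients are steered to the novel therapy. The function names and all numbers are illustrative assumptions.

```python
import numpy as np

def effective_historical_n(var_hist_mean, tau2, var_per_patient):
    """Rough effective historical sample size: the historical mean contributes
    information with variance (var_hist_mean + tau2), where tau2 is the
    commensurability variance between historical and concurrent controls."""
    return var_per_patient / (var_hist_mean + tau2)

def allocation_prob_novel(n_control_current, ehss, n_novel_current):
    """Randomize so total information (concurrent + effective historical) is
    balanced across arms: the control arm gets credit for the EHSS."""
    info_control = n_control_current + ehss
    p = info_control / (info_control + n_novel_current)
    return float(np.clip(p, 0.1, 0.9))   # cap to keep some randomization to both arms

# Example: 200 historical controls, per-patient variance 1.0, and two settings of
# the commensurability variance tau2 (small = historical data look exchangeable).
for tau2 in (0.001, 0.05):
    ehss = effective_historical_n(var_hist_mean=1.0 / 200, tau2=tau2, var_per_patient=1.0)
    p_novel = allocation_prob_novel(n_control_current=30, ehss=ehss, n_novel_current=40)
    print(f"tau2={tau2:.3f}: EHSS ≈ {ehss:.0f}, "
          f"P(assign next patient to novel arm) = {p_novel:.2f}")
```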