Results 1 - 20 of 24
1.
PLoS Biol; 20(11): e3001886, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36417471

ABSTRACT

The influence of protocol standardization between laboratories on the replicability of preclinical results has not been addressed in a systematic way. While standardization is considered good research practice as a means to control for undesired external noise (i.e., highly variable results), some reports suggest that standardized protocols may lead to idiosyncratic results, thus undermining replicability. Through the EQIPD consortium, a multi-lab collaboration between academic and industry partners, we aimed to elucidate parameters that impact the replicability of preclinical animal studies. To this end, 3 experimental protocols were implemented across 7 laboratories. The replicability of results was determined using the distance travelled in an open field after administration of pharmacological compounds known to modulate locomotor activity (MK-801, diazepam, and clozapine) in C57BL/6 mice as a worked example. The goal was to determine whether harmonization of study protocols across laboratories improves the replicability of the results and whether replicability can be further improved by systematic variation (heterogenization) of 2 environmental factors (time of testing and light intensity during testing) within laboratories. Protocols were tested in 3 consecutive stages and differed in the extent of harmonization across laboratories and standardization within laboratories: stage 1, minimally aligned across sites (local protocol); stage 2, fully aligned across sites (harmonized protocol) with and without systematic variation (standardized and heterogenized cohorts); and stage 3, fully aligned across sites (standardized protocol) with a different compound. All protocols resulted in consistent treatment effects across laboratories, which were also replicated within laboratories across the different stages. Harmonization of protocols across laboratories substantially reduced between-lab variability compared to each lab using its local protocol. In contrast, the environmental factors chosen to introduce systematic variation within laboratories did not affect the behavioral outcome. Therefore, heterogenization did not reduce between-lab variability further compared to harmonization of the standardized protocol. Altogether, these findings demonstrate that subtle variations between lab-specific study protocols may introduce variation across independent replicate studies even after protocol harmonization and that systematic heterogenization of environmental factors may not be sufficient to account for such between-lab variation. Differences in the replicability of results within and between laboratories highlight the ubiquity of study-specific variation due to between-lab variability, the importance of transparent and fine-grained reporting of methodologies and research protocols, and the importance of independent study replication.
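As an illustration of how between-lab variability of this kind can be quantified, the Python sketch below fits a random-intercept mixed model to simulated open-field data with laboratory as a grouping factor; the lab labels, sample sizes and effect sizes are invented and do not represent the EQIPD data or analysis.

    # Hypothetical sketch: estimating between-lab variability of a treatment effect
    # with a random-intercept mixed model. Data are simulated, not the EQIPD data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    rows = []
    for lab in [f"lab{i}" for i in range(7)]:
        lab_shift = rng.normal(0, 5)            # lab-specific baseline shift (between-lab noise)
        for treat in (0, 1):                    # 0 = vehicle, 1 = compound
            for _ in range(12):                 # 12 mice per arm per lab
                dist = 50 + lab_shift + 15 * treat + rng.normal(0, 8)
                rows.append({"lab": lab, "treat": treat, "distance": dist})
    df = pd.DataFrame(rows)

    # Random intercept per laboratory; fixed effect of treatment
    model = smf.mixedlm("distance ~ treat", df, groups=df["lab"]).fit()
    print(model.summary())                      # 'Group Var' estimates between-lab variance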


Subjects
Reproducibility of Results, Research Design, Animals, Mice, Mice, Inbred C57BL
2.
Muscle Nerve; 69(6): 719-729, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38593477

ABSTRACT

INTRODUCTION/AIMS: Biomarkers have shown promise in amyotrophic lateral sclerosis (ALS) research, but the quest for reliable biomarkers remains active. This study evaluates the effect of debamestrocel on cerebrospinal fluid (CSF) biomarkers, an exploratory endpoint. METHODS: A total of 196 participants randomly received debamestrocel or placebo. Seven CSF samples were to be collected from all participants. Forty-five biomarkers were analyzed in the overall study and in two subgroups defined by the ALS Functional Rating Scale-Revised (ALSFRS-R). A prespecified model was employed to predict clinical outcomes leveraging biomarkers and disease characteristics. Causal inference was used to analyze relationships between neurofilament light chain (NfL) and ALSFRS-R. RESULTS: We observed significant changes with debamestrocel in 64% of the biomarkers studied, spanning pathways implicated in ALS pathology (63% neuroinflammation, 50% neurodegeneration, and 89% neuroprotection). Biomarker changes with debamestrocel show biological activity in trial participants, including those with advanced ALS. CSF biomarkers were predictive of clinical outcomes in debamestrocel-treated participants (baseline NfL, baseline latency-associated peptide/transforming growth factor beta1 [LAP/TGFβ1], change in galectin-1, all p < .01), with baseline NfL and LAP/TGFβ1 remaining (p < .05) when disease characteristics (p < .005) were incorporated. Change from baseline to the last measurement showed that debamestrocel-driven reductions in NfL were associated with less decline in ALSFRS-R. Debamestrocel significantly reduced NfL from baseline compared with placebo (11% vs. 1.6%, p = .037). DISCUSSION: Following debamestrocel treatment, many biomarkers showed increases (anti-inflammatory/neuroprotective) or decreases (inflammatory/neurodegenerative), suggesting a possible treatment effect. Neuroinflammatory and neuroprotective biomarkers were predictive of clinical response, suggesting a potential multimodal mechanism of action. These results offer preliminary insights that need to be confirmed.


Subjects
Amyotrophic Lateral Sclerosis, Biomarkers, Neurofilament Proteins, Adult, Aged, Female, Humans, Male, Middle Aged, Amyotrophic Lateral Sclerosis/cerebrospinal fluid, Amyotrophic Lateral Sclerosis/drug therapy, Amyotrophic Lateral Sclerosis/diagnosis, Biomarkers/cerebrospinal fluid, Double-Blind Method, Neurofilament Proteins/cerebrospinal fluid, Treatment Outcome
3.
Molecules; 27(23), 2022 Nov 28.
Article in English | MEDLINE | ID: mdl-36500399

ABSTRACT

In the pharmaceutical field, and more precisely in quality control laboratories, robust liquid chromatographic methods are needed to separate and analyze mixtures of compounds. The development of such chromatographic methods for new mixtures can be a long and tedious process, even when the design of experiments methodology is used. However, development can be accelerated with the help of in silico screening. In this work, the usefulness of a strategy combining response surface methodology (RSM) with multicriteria decision analysis (MCDA), applied to predictions from a quantitative structure-retention relationship (QSRR) model, is demonstrated. The strategy shows that selecting the equations of the retention-time prediction models according to the pKa of the compound allows flexibility in the models. The MCDA is shown to help make decisions on different criteria while remaining robust to the user's choice of weights for each criterion. This strategy is proposed for the screening phase of the method lifecycle. It allows the user to select chromatographic conditions based on multiple criteria without being overly sensitive to the importance assigned to each. The conditions with the highest desirability are defined as the starting point for further optimization steps.
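One generic way to combine several chromatographic criteria into a single score, broadly in the spirit of the MCDA step described above, is a weighted Derringer-type desirability function; the Python sketch below uses invented criteria, limits and weights and is not the specific MCDA of this work.

    # Generic sketch of a weighted desirability score for ranking predicted
    # chromatographic conditions (criteria, targets and weights are invented).
    import numpy as np

    def desirability(value, low, high):
        """Linear 'larger is better' desirability scaled to [0, 1]."""
        return float(np.clip((value - low) / (high - low), 0.0, 1.0))

    def overall_score(criteria, weights):
        """Weighted geometric mean of individual desirabilities."""
        d = np.array([desirability(*c) for c in criteria])
        w = np.array(weights, dtype=float)
        w = w / w.sum()
        return float(np.prod(d ** w)) if np.all(d > 0) else 0.0

    # Example: (value, worst acceptable, best) for resolution and an inverted run-time score
    conditions = {
        "cond_A": [(2.1, 1.5, 3.0), (0.7, 0.0, 1.0)],
        "cond_B": [(1.8, 1.5, 3.0), (0.9, 0.0, 1.0)],
    }
    weights = [2.0, 1.0]    # resolution weighted twice as heavily as run time
    ranked = sorted(conditions, key=lambda k: overall_score(conditions[k], weights), reverse=True)
    print(ranked)           # condition with the highest desirability first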


Subjects
Chromatography, Reverse-Phase, Chromatography, High Pressure Liquid/methods, Chromatography, Liquid, Pharmaceutical Preparations
4.
Pharm Stat; 20(2): 245-255, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33025743

ABSTRACT

The use of Bayesian methods to support pharmaceutical product development has grown in recent years. In clinical statistics, the drive to provide faster access for patients to medical treatments has led to a heightened focus by industry and regulatory authorities on innovative clinical trial designs, including those that apply Bayesian methods. In nonclinical statistics, Bayesian applications have also made advances; however, they have been embraced far more slowly than in the clinical setting. In this article, we explore some of the reasons for this slower rate of adoption. We also present the results of a survey conducted to understand the current state of Bayesian application in nonclinical areas and to identify areas of priority for the DIA/ASA-BIOP Nonclinical Bayesian Working Group. The survey explored current usage, hurdles, perceptions, and training needs for Bayesian methods among nonclinical statisticians. Based on the survey results, a set of recommendations is provided to help guide the future advancement of Bayesian applications in nonclinical pharmaceutical statistics.


Subjects
Pharmaceutical Preparations, Research Personnel, Bayes Theorem, Drug Evaluation, Preclinical, Forecasting, Humans
5.
Pharm Stat; 17(6): 701-709, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30112804

ABSTRACT

The USP<1032> guidelines recommend the screening of bioassay data for outliers prior to performing a relative potency (RP) analysis. The guidelines, however, do not offer advice on the size or type of outlier that should be removed prior to model fitting and calculation of RP. For biotherapeutics and vaccines, biological activity, measured through a potency bioassay, is considered a critical quality attribute in manufacturing, and outliers in potency data may result in the false acceptance/rejection of a bad/good lot of drug product. If the concentration-response potency curve of a test sample is deemed similar in shape to that of the reference standard, the curves are said to exhibit constant RP, an essential criterion for the interpretation of an RP. One or more outliers in the concentration-response data, however, may result in a failure to declare similarity or may yield a biased RP estimate. Computer simulation was used to investigate the consequences of ignoring the USP<1032> guidance to remove outliers. Concentration-response curves for test and reference were computer generated with constant RP from four-parameter logistic curves. Single-outlier, multiple-outlier, and whole-curve outlier scenarios were explored for their effects on similarity testing and on RP estimation. Though the simulations point to situations for which outlier removal is unnecessary, the results generally support the USP<1032> recommendation and illustrate the impact on the RP calculation when outlier-removal procedures are not applied.
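A minimal sketch of this kind of simulation, assuming a four-parameter logistic (4PL) model, a constant relative potency and a single injected outlier; all parameter values and the outlier size are invented for illustration.

    # Sketch: simulate reference and test 4PL concentration-response curves with
    # constant relative potency, inject one outlier, and refit (values are invented).
    import numpy as np
    from scipy.optimize import curve_fit

    def fpl(x, a, d, c, b):
        """Four-parameter logistic: a = lower, d = upper, c = EC50, b = slope."""
        return a + (d - a) / (1.0 + (x / c) ** b)

    rng = np.random.default_rng(42)
    conc = np.array([0.5, 1, 2, 4, 8, 16, 32, 64], dtype=float)
    true_rp = 0.8                                     # test EC50 shifted by a constant RP
    ref = fpl(conc, 5, 100, 8.0, 1.2) + rng.normal(0, 3, conc.size)
    test = fpl(conc, 5, 100, 8.0 / true_rp, 1.2) + rng.normal(0, 3, conc.size)
    test[3] += 40                                     # single outlier on the test curve

    p0 = [5, 100, 8, 1]
    ref_fit, _ = curve_fit(fpl, conc, ref, p0=p0, maxfev=10000)
    test_fit, _ = curve_fit(fpl, conc, test, p0=p0, maxfev=10000)
    print("estimated RP:", ref_fit[2] / test_fit[2])  # EC50_ref / EC50_test, distorted by the outlier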


Subjects
Biological Assay, Data Interpretation, Statistical, Computer Simulation, Dose-Response Relationship, Drug, Guidelines as Topic, Humans
6.
J Biopharm Stat; 25(2): 247-259, 2015.
Article in English | MEDLINE | ID: mdl-25360720

ABSTRACT

The concept of quality by design (QbD) as published in ICH-Q8 is currently one of the most recurrent topics in the pharmaceutical literature. This guideline recommends the use of information and prior knowledge gathered during pharmaceutical development studies to provide a scientific rationale for the manufacturing process of a product and a guarantee of future quality. This poses several challenges from a statistical standpoint and requires a shift in paradigm from traditional statistical practices. First, providing "assurance of quality" of future lots implies the need to make predictions regarding quality given past evidence and data. Second, the quality attributes described in the Q8 guideline are not always a set of unique, independent measurements. In many cases, these criteria involve longitudinal data with successive acceptance criteria over a defined period of time. A common example is a dissolution profile for a modified- or extended-release solid dosage form that must fall within acceptance limits at several time points. A Bayesian approach for longitudinal data obtained under the various conditions of a design of experiments is provided to elegantly address the ICH-Q8 recommendation to provide assurance of quality and derive a scientifically sound design space.
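To make the "assurance of quality" idea concrete, the sketch below draws from a toy posterior of a first-order dissolution model and estimates the predictive probability that all time-point acceptance limits are met jointly; the model, parameter values and limits are invented and far simpler than the longitudinal Bayesian model of the paper.

    # Sketch: predictive probability that a simulated dissolution profile meets
    # acceptance limits at every time point (toy posterior, invented limits).
    import numpy as np

    rng = np.random.default_rng(0)
    times = np.array([1.0, 2.0, 4.0, 8.0])            # hours
    lower = np.array([20.0, 40.0, 60.0, 80.0])        # % dissolved, lower limits
    upper = np.array([45.0, 70.0, 90.0, 105.0])       # % dissolved, upper limits

    n_draws = 20000
    # Toy "posterior" for a first-order release model: pct(t) = 100 * (1 - exp(-k*t))
    k = rng.normal(0.35, 0.05, n_draws)               # posterior draws of the rate constant
    sigma = np.abs(rng.normal(3.0, 0.5, n_draws))     # residual SD draws

    profiles = 100.0 * (1.0 - np.exp(-np.outer(k, times)))
    profiles += rng.normal(0.0, 1.0, profiles.shape) * sigma[:, None]

    meets_all = np.all((profiles >= lower) & (profiles <= upper), axis=1)
    print("predictive P(all criteria met):", meets_all.mean())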


Subjects
Biopharmaceutics/statistics & numerical data, Models, Statistical, Technology, Pharmaceutical/statistics & numerical data, Bayes Theorem, Biopharmaceutics/standards, Chemistry, Pharmaceutical, Data Interpretation, Statistical, Delayed-Action Preparations, Guidelines as Topic, Kinetics, Quality Control, Solubility, Tablets, Technology, Pharmaceutical/methods, Technology, Pharmaceutical/standards, Time Factors
7.
J Biopharm Stat; 25(2): 260-268, 2015.
Article in English | MEDLINE | ID: mdl-25357001

ABSTRACT

Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims to show that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies, such as design of experiments, statistical modeling, and probabilistic statements, also have their role to play. The outcome of analytical procedure validation is likewise an analytical procedure design space, and from it a control strategy can be established.


Subjects
Biopharmaceutics/statistics & numerical data, Models, Statistical, Technology, Pharmaceutical/statistics & numerical data, Bayes Theorem, Biopharmaceutics/standards, Chemistry, Pharmaceutical, Data Interpretation, Statistical, Guidelines as Topic, Probability, Quality Control, Reproducibility of Results, Technology, Pharmaceutical/methods, Technology, Pharmaceutical/standards
8.
J Biopharm Stat; 23(6): 1330-1351, 2013.
Article in English | MEDLINE | ID: mdl-24138435

ABSTRACT

The International Conference on Harmonisation (ICH) has released regulatory guidelines for pharmaceutical development. In the document ICH Q8, the design space of a process is presented as the set of factor settings providing satisfactory results. However, ICH Q8 does not propose any practical methodology to define, derive, and compute the design space. In parallel, over the last decades, the diversity and quality of analytical methods have grown rapidly, allowing substantial gains in selectivity and sensitivity. However, there is still a lack of rationale for developing robust separation methods in a systematic way. Applying ICH Q8 to analytical methods provides a methodology for predicting a region of the factor space in which results will be reliable. Combining design of experiments and standard Bayesian multivariate regression, the predictive distribution of a new response vector was derived and used, under noninformative as well as informative prior distributions of the parameters. From the responses and their predictive distribution, various critical quality attributes can be easily derived. This Bayesian framework was then extended to the multicriteria setting to estimate the predictive probability that several critical quality attributes will be jointly achieved in the future use of an analytical method. An example based on a high-performance liquid chromatography (HPLC) method is given. For this example, a constrained sampling scheme was applied to ensure the modeled responses have desirable properties.
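The joint-probability idea can be sketched with simple Monte Carlo: for each candidate factor setting, draw responses from an assumed predictive distribution and count how often all critical quality attributes meet their specifications simultaneously. The toy models, factor grid and limits below are invented; the paper itself uses an analytically derived Bayesian multivariate predictive distribution.

    # Sketch: joint predictive probability of meeting two CQAs over a factor grid
    # (toy normal models with invented coefficients; not the paper's HPLC model).
    import numpy as np

    rng = np.random.default_rng(7)
    ph_grid = np.linspace(2.0, 5.0, 31)               # factor: mobile-phase pH

    def draw_cqas(ph, n=5000):
        # CQA 1: resolution of the critical peak pair; CQA 2: run time (min)
        res = rng.normal(0.8 * ph - 0.5, 0.25, n)     # toy linear model + predictive noise
        run = rng.normal(25.0 - 2.0 * ph, 1.5, n)
        return res, run

    prob = []
    for ph in ph_grid:
        res, run = draw_cqas(ph)
        prob.append(np.mean((res >= 1.5) & (run <= 20.0)))   # joint specification

    best = ph_grid[int(np.argmax(prob))]
    print(f"pH with highest joint probability: {best:.2f} (P = {max(prob):.2f})")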


Subjects
Bayes Theorem, Data Interpretation, Statistical, Models, Statistical, Multivariate Analysis, Research Design/statistics & numerical data, Technology, Pharmaceutical/statistics & numerical data, Chromatography, High Pressure Liquid/statistics & numerical data, Linear Models, Reproducibility of Results, Uncertainty
9.
Orphanet J Rare Dis; 16(1): 3, 2021 Jan 06.
Article in English | MEDLINE | ID: mdl-33407688

ABSTRACT

BACKGROUND: Centronuclear myopathies are severe rare congenital diseases. The clinical variability and genetic heterogeneity of these myopathies result in major challenges in clinical trial design. Alternative strategies to large placebo-controlled trials that have been used in other rare diseases (e.g., the use of surrogate markers or of historical controls) have limitations that Bayesian statistics may address. Here we present a Bayesian model that uses each patient's own natural history study data to predict progression in the absence of treatment. This prospective multicentre natural history study evaluated 4-year follow-up data from 59 patients carrying mutations in the MTM1 or DNM2 genes. METHODS: Our approach focused on evaluation of forced expiratory volume in 1 s (FEV1) in 6- to 18-year-old children. A patient was defined as a responder if an improvement was observed after treatment and the predictive probability of such improvement in the absence of intervention was less than 0.01. An FEV1 response was considered clinically relevant if it corresponded to an increase of more than 8%. RESULTS: The key endpoint of a clinical trial using this model is the rate of response. The power of the study is based on the posterior probability that the observed rate of response is greater than the rate of response that would be expected in the absence of treatment, as predicted from each patient's own natural history. In order to appropriately control the Type 1 error, the threshold posterior probability that the difference in response rates exceeds zero was set to 91%, ensuring a 5% overall Type 1 error rate for the trial. CONCLUSIONS: Bayesian statistical analysis of natural history data allowed us to reliably simulate the evolution of symptoms for individual patients over time and to probabilistically compare these simulated trajectories to actual observed post-treatment outcomes. The proposed model adequately predicted the natural evolution of patients over the duration of the study and will facilitate a sufficiently powerful trial design that can cope with the disease's rarity. Further research and ongoing dialog with regulatory authorities are needed to allow for more applications of Bayesian statistics in orphan disease research.
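A toy version of the responder rule described above: a patient is flagged as a responder if the observed post-treatment FEV1 change is clinically relevant (more than 8%, per the abstract) and its predictive probability under the patient's simulated natural history is below 0.01. The natural-history distribution below is a placeholder normal model with invented numbers, not the paper's model.

    # Sketch of the responder rule: compare an observed FEV1 change with the
    # predictive distribution of untreated change (toy normal natural-history model).
    import numpy as np

    rng = np.random.default_rng(3)

    def is_responder(observed_change_pct, nh_draws, clinically_relevant=8.0, alpha=0.01):
        """Responder if the change is clinically relevant and unlikely under natural history."""
        p_no_treatment = np.mean(nh_draws >= observed_change_pct)   # predictive probability
        return observed_change_pct > clinically_relevant and p_no_treatment < alpha

    # Invented natural-history predictive draws for one patient: slow decline expected
    natural_history = rng.normal(loc=-4.0, scale=3.0, size=50000)   # % change in FEV1 at 1 year

    print(is_responder(observed_change_pct=10.0, nh_draws=natural_history))  # True
    print(is_responder(observed_change_pct=2.0, nh_draws=natural_history))   # False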


Subjects
Myopathies, Structural, Congenital, Adolescent, Bayes Theorem, Child, Clinical Trials as Topic, Disease Progression, Humans, Prospective Studies
10.
Chemosphere; 71(2): 379-387, 2008 Mar.
Article in English | MEDLINE | ID: mdl-17950777

ABSTRACT

Dioxin analysis in food and feed can be characterized as an analytical application where very high accuracy is required at very low levels of contamination. Gas chromatography (GC) in combination with ¹³C-labelled isotope dilution (ID) high-resolution mass spectrometry (HRMS) is the reference congener-specific technique, characterized by pronounced selectivity, precision and trueness at parts-per-trillion (ppt) and sub-ppt levels. The quality of the analytical data produced routinely by a laboratory should be adequate for its intended purpose, i.e., one will seek a compromise between the cost and time needed and the consequences of incorrect decisions due to erroneous results. The requirements for reproducibility usually depend on the analyte concentrations and have been expressed in various empirical functions. While the Horwitz or modified functions are widely useful for many purposes, it would be difficult to expect these functions to cover every analytical problem. This study reports on precision characteristics achieved by the GC-ID-HRMS reference method for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (DL-PCBs) in food and feed in two interlaboratory method-performance studies among expert laboratories with long-standing experience in this field. A strikingly linear relationship on a log-log scale between the reproducibility standard deviation and the congener level is observed over a concentration range of 10⁻⁸ to 10⁻¹⁴ g per g fresh weight. The data fit very well a Horwitz-type function of the form sR = 0.153·c^0.904, where sR and c are dimensionless mass ratios expressed in pg g⁻¹ on a fresh-weight basis, regardless of the nature of the toxic congeners, food and feed matrices, or sample preparation methods. We demonstrate that the proposed function is suitable for use as a fitness-for-purpose criterion for proficiency assessment in interlaboratory comparisons on dioxins and related compounds in food.
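As a worked example of the fitted precision function, the short sketch below evaluates sR = 0.153·c^0.904 at a few congener levels and converts it into a relative standard deviation; the concentration values are arbitrary illustration points.

    # Worked example of the fitted Horwitz-type function sR = 0.153 * c**0.904,
    # with sR and c both expressed in pg/g fresh weight (per the study).
    def s_r(c_pg_per_g):
        return 0.153 * c_pg_per_g ** 0.904

    for c in (0.1, 1.0, 10.0, 100.0):          # arbitrary example levels, pg/g
        sr = s_r(c)
        print(f"c = {c:7.1f} pg/g  sR = {sr:7.3f} pg/g  RSD = {100 * sr / c:5.1f} %")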


Subjects
Benzofurans/analysis, Food Analysis/methods, Food Contamination/analysis, Polychlorinated Biphenyls/analysis, Polychlorinated Dibenzodioxins/analogs & derivatives, Animals, Biological Assay, Carbon Isotopes/chemistry, Dibenzofurans, Polychlorinated, Gas Chromatography-Mass Spectrometry, Indicator Dilution Techniques, Least-Squares Analysis, Polychlorinated Dibenzodioxins/analysis, Reproducibility of Results, Sensitivity and Specificity
11.
Anal Chim Acta; 1019: 1-13, 2018 Aug 17.
Article in English | MEDLINE | ID: mdl-29625674

ABSTRACT

In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of the results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, particularly in metabolomic studies. The usual approach is to use the proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependence on subjective human choices, only standard processing possibilities and an absence of objective quality criteria to evaluate pre-processing quality. To meet these needs, this paper introduces PepsNMR, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold-standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines.
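For orientation only, the sketch below chains two of the simplest pre-processing steps mentioned (a crude polynomial baseline correction and constant-sum normalisation) on a synthetic spectrum using plain NumPy; it does not reproduce PepsNMR's algorithms or its R interface.

    # Toy pre-processing chain for a 1D spectrum: polynomial baseline removal
    # followed by constant-sum normalisation (synthetic data; not the PepsNMR code).
    import numpy as np

    rng = np.random.default_rng(5)
    ppm = np.linspace(10, 0, 2048)
    baseline = 0.02 * ppm**2 + 0.1 * ppm                      # slowly varying baseline
    peaks = 5 * np.exp(-((ppm - 3.2) ** 2) / 0.002) + 2 * np.exp(-((ppm - 7.1) ** 2) / 0.004)
    spectrum = peaks + baseline + rng.normal(0, 0.05, ppm.size)

    # 1) crude baseline estimate: low-order polynomial fit to the whole spectrum
    coeffs = np.polyfit(ppm, spectrum, deg=3)
    corrected = spectrum - np.polyval(coeffs, ppm)

    # 2) constant-sum normalisation so intensities are comparable across spectra
    normalised = corrected / np.abs(corrected).sum()
    print(np.abs(normalised).sum())                           # = 1 after normalisation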


Subjects
Metabolomics, Proton Magnetic Resonance Spectroscopy, Software
12.
J Chromatogr A; 1158(1-2): 111-125, 2007 Jul 27.
Article in English | MEDLINE | ID: mdl-17420026

ABSTRACT

All analysts face the same situation: method validation is the process of proving that an analytical method is acceptable for its intended purpose. To address this problem, the analyst refers to regulatory or guidance documents, and the validity of analytical methods therefore depends on the guidance, terminology and methodology proposed in these documents. It is therefore of prime importance to have clear definitions of the different validation criteria used to assess this validity. It is also necessary to have methodologies in accordance with these definitions and, consequently, to use statistical methods that are relevant to these definitions, to the objective of the validation and to the objective of the analytical method. The main purpose of this paper is to outline the inconsistencies between some definitions of the criteria and the experimental procedures proposed to evaluate those criteria in recent documents dedicated to the validation of analytical methods in the pharmaceutical field, together with the risks and problems encountered when trying to cope with contradictory, and sometimes scientifically irrelevant, requirements and definitions.


Subjects
Chemistry Techniques, Analytical, Drug Industry/legislation & jurisprudence, Sensitivity and Specificity
13.
J Chromatogr A; 1158(1-2): 126-137, 2007 Jul 27.
Article in English | MEDLINE | ID: mdl-17420025

ABSTRACT

It is recognized that the purpose of validation of analytical methods is to demonstrate that the method is suited for its intended purpose. Validation is not only required by regulatory authorities, but is also a decisive phase before the routine use of the method. For a quantitative analytical method, the objective is to quantify the target analytes with a known and suitable accuracy. For that purpose, first, a decision about the validity of the method based on prediction is proposed: a method is declared fit for routine application if it is considered that most of the future results generated will be accurate enough. This can be achieved by using the "beta-expectation tolerance interval" (accuracy profile) as the decision tool to assess the validity of the analytical method. Moreover, the concept of "fit-for-purpose" is also proposed here to select the most relevant response function as the calibration curve, i.e. choosing a response function based solely on the results this model is predicted to provide. This paper reports four case studies where the results obtained with quality control samples in routine use were compared to predictions made in the validation phase. Predictions made using the "beta-expectation tolerance interval" are shown to be accurate and trustworthy for decision making. It is therefore suggested that an adequate way to reconcile the objectives of the analytical method in routine analysis with those of the validation step is to take the decision about the validity of the analytical method based on prediction of future results using the most appropriate response function, i.e. the fit-for-future-purpose concept.
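For normally distributed validation results, the beta-expectation tolerance interval named here coincides with a prediction interval; the sketch below computes it at one concentration level from invented recovery data and checks it against ±λ acceptance limits.

    # Sketch: beta-expectation tolerance interval (normal prediction interval) at one
    # concentration level, compared with +/- lambda acceptance limits (invented data).
    import numpy as np
    from scipy import stats

    beta = 0.95
    lam = 15.0                                     # acceptance limits, % relative error
    recoveries = np.array([98.2, 101.5, 96.8, 103.1, 99.4, 100.9, 97.6, 102.3])  # % of nominal

    n = recoveries.size
    mean, sd = recoveries.mean(), recoveries.std(ddof=1)
    t = stats.t.ppf((1 + beta) / 2, df=n - 1)
    half_width = t * sd * np.sqrt(1 + 1 / n)

    low, high = mean - half_width, mean + half_width
    print(f"beta-expectation interval: [{low:.1f}, {high:.1f}] % of nominal")
    print("method valid at this level:", (low >= 100 - lam) and (high <= 100 + lam))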


Subjects
Chromatography, Liquid/methods, Spectrophotometry, Ultraviolet/methods, Tandem Mass Spectrometry/methods, Pharmaceutical Preparations/analysis
14.
J Chromatogr A; 1120(1-2): 102-111, 2006 Jul 07.
Article in English | MEDLINE | ID: mdl-16643932

ABSTRACT

Nonaqueous capillary electrophoresis (NACE) was successfully applied to the enantiomeric purity determination of S-timolol maleate using heptakis(2,3-di-O-methyl-6-O-sulfo)-beta-cyclodextrin (HDMS-beta-CD) as chiral selector. With a background electrolyte made up of a methanolic solution of 0.75 M formic acid, 30 mM potassium camphorsulfonate and containing 30 mM HDMS-beta-CD, the determination of 0.1% of R-timolol in S-timolol could be performed with an enantiomeric resolution of 8.5. Pyridoxine was selected as internal standard. The NACE method was then fully validated by applying a novel strategy using accuracy profiles. It is based on beta-expectation tolerance intervals for the total measurement error which includes trueness and intermediate precision. The uncertainty of measurements derived from beta-expectation tolerance intervals was estimated at each concentration level of the validation standards. To confirm the suitability of the developed and validated method, several real samples of S-timolol maleate containing R-timolol maleate at different concentrations were analysed and the results were compared to those obtained by liquid chromatography.


Subjects
Electrophoresis, Capillary/methods, Timolol/isolation & purification, beta-Cyclodextrins/chemistry, Antihypertensive Agents/chemistry, Antihypertensive Agents/isolation & purification, Molecular Structure, Reference Standards, Regression Analysis, Reproducibility of Results, Stereoisomerism, Timolol/chemistry, beta-Cyclodextrins/standards
15.
Eur J Pharm Biopharm; 80(1): 226-234, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21983606

ABSTRACT

From a quality by design perspective, the aim of the present study was to demonstrate the applicability of a Bayesian statistical methodology to identify the Design Space (DS) of a spray-drying process. Following the ICH Q8 guideline, the DS is defined as the "multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." Thus, a predictive risk-based approach was set up to account for the uncertainties and correlations found in the process and in the derived critical quality attributes, such as the yield, the moisture content, the inhalable fraction of powder, the compressibility index, and the Hausner ratio. This made it possible to quantify the assurance, and the risk, that the process will run according to specifications. These specifications describe satisfactory quality outputs and were defined a priori for safety, efficiency, and economic reasons. Within the identified DS, the optimal condition was validated. The optimized process was shown to perform as expected, providing a product whose quality is built in by the design and controlled setup of the equipment with regard to the identified critical process parameters: the inlet temperature, the feed rate, and the spray flow rate.


Subjects
Chemistry, Pharmaceutical/methods, Technology, Pharmaceutical/methods, Bayes Theorem, Inhalation, Linear Models, Models, Statistical, Multivariate Analysis, Particle Size, Powders/chemistry, Quality Control, Reproducibility of Results, Temperature, Thermogravimetry/methods
16.
Anal Chim Acta; 705(1-2): 193-206, 2011 Oct 31.
Article in English | MEDLINE | ID: mdl-21962362

ABSTRACT

Method validation is mandatory in order to assess the fitness for purpose of the developed analytical method. Of core importance at the end of the validation is the evaluation of the reliability of the individual results that will be generated during the routine application of the method. Regulatory guidelines provide a general framework to assess the validity of a method, but none address the issue of results reliability. In this study, a Bayesian approach is proposed to address this concern. Results reliability is defined here as "the probability (π) of an analytical method to provide analytical results (X) within predefined acceptance limits (±λ) around their reference or conventional true concentration values (µ(T)) over a defined concentration range and under given environmental and operating conditions." By providing the minimum reliability probability (π(min)) needed for the subsequent routine application of the method, as well as specifications or acceptance limits (±λ), the proposed Bayesian approach provides the effective probability of obtaining reliable future analytical results over the whole concentration range investigated. This is summarised in a single graph: the reliability profile. This Bayesian reliability profile is also compared to two frequentist approaches, the first derived from the work of Dewé et al. [W. Dewé, B. Govaerts, B. Boulanger, E. Rozet, P. Chiap, Ph. Hubert, Chemometr. Intell. Lab. Syst. 85 (2007) 262-268] and the second proposed by Govaerts et al. [B. Govaerts, W. Dewé, M. Maumy, B. Boulanger, Qual. Reliab. Eng. Int. 24 (2008) 667-680]. Furthermore, to illustrate the applicability of the Bayesian reliability profile, this approach is also applied to a bioanalytical method dedicated to the determination of ketoglutaric acid (KG) and hydroxymethylfurfural (HMF) in human plasma by SPE-HPLC-UV.
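Keeping to the abstract's definition of reliability, π = P(|X − µT| ≤ λ), the sketch below estimates that probability by Monte Carlo from an assumed predictive distribution of future results at a few concentration levels; the bias, precision, λ and π(min) values are invented.

    # Sketch: reliability pi = P(|X - mu_T| <= lambda) from simulated future results
    # at several concentration levels (assumed bias/precision values, not the paper's).
    import numpy as np

    rng = np.random.default_rng(11)
    lam_rel = 0.15                                  # acceptance limits: +/-15% of the true value
    pi_min = 0.90                                   # minimum reliability required in routine use

    levels = [5.0, 20.0, 80.0]                      # invented concentration levels (e.g., ug/mL)
    for mu_t in levels:
        bias = 0.02 * mu_t                          # assumed relative bias of 2%
        sd = 0.05 * mu_t                            # assumed intermediate precision of 5%
        x = rng.normal(mu_t + bias, sd, 100000)     # predictive draws of future results
        pi = np.mean(np.abs(x - mu_t) <= lam_rel * mu_t)
        print(f"level {mu_t:5.1f}: pi = {pi:.3f}  reliable = {pi >= pi_min}")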


Assuntos
Cromatografia Líquida de Alta Pressão/métodos , Reprodutibilidade dos Testes , Extração em Fase Sólida/métodos , Teorema de Bayes , Simulação por Computador , Furaldeído/análogos & derivados , Furaldeído/sangue , Humanos , Ácidos Cetoglutáricos/sangue , Modelos Estatísticos , Probabilidade , Espectrofotometria Ultravioleta/métodos
17.
Anal Chim Acta; 691(1-2): 33-42, 2011 Apr 08.
Article in English | MEDLINE | ID: mdl-21458628

ABSTRACT

HPLC separations of an unknown sample mixture and a pharmaceutical formulation were optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs used to model the retention times of the compounds of interest. The prediction accuracy and the robustness of the optimal separation, including an uncertainty study, were then evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie within a selected acceptance range. Furthermore, the chromatograms were read automatically. Peak detection and peak matching were carried out with a previously developed methodology based on independent component analysis, published by B. Debrus et al. in 2009. The present successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods.


Subjects
Chromatography, High Pressure Liquid/methods, Algorithms, Automation, Models, Chemical, Research Design
18.
J Chromatogr B Analyt Technol Biomed Life Sci; 877(23): 2235-2243, 2009 Aug 01.
Article in English | MEDLINE | ID: mdl-19577965

ABSTRACT

The 3rd American Association of Pharmaceutical Scientists (AAPS)/Food and Drug Administration (FDA) Bioanalytical Workshop in 2006 concluded with several new recommendations regarding the validation of bioanalytical methods, in a report published in 2007. The workshop aimed to reconcile or adapt validation principles for small and large molecules and offered an opportunity to revisit some of the major decision rules related to acceptance criteria, given the experience accumulated since 1990. The purpose here is to provide a "risk-based" reading of the recommendations of the 3rd AAPS/FDA Bioanalytical Workshop. Five decision rules were compared using simulations: the proposed pre-study FDA and Total Error Rules, the rules based on the beta-Expectation Tolerance Interval and the beta-gamma-Content Tolerance Interval and, finally, the 4-6-20 rule for in-study acceptance of runs. The simulation results demonstrated that the beta-Expectation Tolerance Rule appropriately controls the risk. The beta-gamma-Content Tolerance Interval was found to be too conservative, depending on the objective, and to lead to a high rate of rejection of procedures that could be considered acceptable. On the other hand, the FDA rule and the AAPS/FDA workshop Total Error Rule, whether combined or not, did not achieve their intended objective. With these rules, there is a high risk of delivering in-study results that do not meet the targeted acceptance criteria. This can be explained by confusion, first, between the quality of a procedure and the fitness for purpose of the results it could produce and, second, between the initial performance of a procedure, as evaluated for example during pre-study validation, and the quality of the future results.
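As a point of reference for the in-study rule mentioned, the sketch below implements a plain 4-6-20 check (at least four of six quality-control results within ±20% of nominal) on invented QC data; real run-acceptance rules add further conditions not shown here.

    # Sketch of the 4-6-20 in-study acceptance rule: a run passes if at least 4 of
    # the 6 QC results fall within +/-20% of their nominal value (toy data).
    def run_accepted(qc_results, limit=0.20, min_ok=4):
        ok = sum(abs(measured - nominal) <= limit * nominal
                 for nominal, measured in qc_results)
        return ok >= min_ok

    run = [(3.0, 3.2), (3.0, 2.5), (10.0, 10.9), (10.0, 13.1), (30.0, 28.4), (30.0, 31.8)]
    print(run_accepted(run))   # True: 5 of 6 QCs are within 20% of nominal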


Subjects
Biological Assay/standards, Chemistry Techniques, Analytical/standards, Guidelines as Topic, Biological Assay/methods, Chemistry Techniques, Analytical/methods, Congresses as Topic/legislation & jurisprudence, Validation Studies as Topic
19.
Talanta; 79(1): 77-85, 2009 Jun 30.
Article in English | MEDLINE | ID: mdl-19376347

ABSTRACT

One of the major issues in the fully automated development of chromatographic methods is the automated detection and identification of peaks coming from complex samples such as multi-component pharmaceutical formulations or stability studies of these formulations. The same problem can also occur with plant materials or biological matrices. This step is thus critical and time-consuming, especially when a Design of Experiments (DOE) approach is used to generate the chromatograms. The use of DOE will often maximize the changes in the analytical conditions in order to explore an experimental domain. Unfortunately, this generally provides very different and "unpredictable" chromatograms which can be difficult to interpret, thus complicating peak detection and peak tracking (i.e. matching peaks among all the chromatograms). In this context, Independent Component Analysis (ICA), a statistically based signal-processing method, was investigated to solve this problem. The ICA principle assumes that the observed signal is the combined result of several phenomena (known as sources) and that all these sources are statistically independent. Under those assumptions, ICA is able to recover the sources, which have a high probability of representing the constitutive components of a chromatogram. In the present study, ICA was successfully applied for the first time to HPLC-UV-DAD chromatograms, and it was shown that ICA allows noise and artifact components to be differentiated from those of interest by applying clustering methods based on high-order statistics computed on these components. Furthermore, on the basis of the described numerical strategy, it was also possible to reconstruct a cleaned chromatogram with minimal influence of noise and baseline artifacts. This represents a significant advance towards the objective of providing helpful tools for the automated development of liquid chromatography (LC) methods, and analytical investigations could be shortened by using this type of methodology.
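A minimal sketch of the general idea (not the paper's algorithm or its clustering step): FastICA from scikit-learn applied to synthetic multi-wavelength chromatograms recovers elution-profile-like sources from their mixtures; the peak shapes, spectra and noise level are invented.

    # Sketch: unmixing synthetic HPLC-DAD data with FastICA (scikit-learn).
    # Rows = time points, columns = detection wavelengths; sources ~ elution profiles.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(2)
    t = np.linspace(0, 10, 600)                       # retention time (min)

    def peak(center, width):
        return np.exp(-((t - center) ** 2) / (2 * width ** 2))

    S = np.column_stack([peak(3.0, 0.15), peak(5.5, 0.20), peak(7.2, 0.25)])  # true elution profiles
    A = np.array([[1.0, 0.4, 0.2],                    # invented UV spectra (mixing matrix)
                  [0.3, 1.0, 0.5],
                  [0.2, 0.6, 1.0],
                  [0.5, 0.3, 0.9]])
    X = S @ A.T + rng.normal(0, 0.01, (t.size, 4))    # observed chromatograms at 4 wavelengths

    ica = FastICA(n_components=3, random_state=0)
    S_est = ica.fit_transform(X)                      # estimated sources (up to sign/scale/order)
    print(S_est.shape)                                # (600, 3)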


Subjects
Chromatography, High Pressure Liquid/methods, Complex Mixtures/analysis, Signal Processing, Computer-Assisted, Automation, Chromatography, High Pressure Liquid/instrumentation, Cluster Analysis, Principal Component Analysis/methods
20.
Electrophoresis; 27(12): 2386-2399, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16718642

ABSTRACT

Analysis of variance was applied to evaluate the precision and practicality of a CD-based NACE assay for R-timolol after enantiomeric separation of R- and S-timolol. Data were collected in an interlaboratory study by 11 participating laboratories located in Europe and North America. General qualitative method performance was examined using suitability descriptors (i.e. resolution, selectivity, migration times and S/N), while precision was determined by quantification of the variances in the determination of R-timolol at four different impurity levels in S-timolol maleate samples. The interlaboratory trials were designed in accordance with ISO guideline 5725-2. This allowed estimation, for each sample, of the different variance components, i.e. between-laboratory (s²Laboratories), between-day (s²Days) and between-replicate (s²Replicates). The repeatability (s²r) and reproducibility (s²R) variances were then calculated. The estimated uncertainty, derived from the precision estimates, appears to be concentration-dependent above a given threshold. This example of R-timolol illustrates how a laboratory can evaluate uncertainty in general.
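To illustrate the precision estimates named here, the sketch below computes repeatability and reproducibility variances from a balanced laboratory-by-replicate layout using the classical one-way ANOVA estimators; the data are invented and the between-day level of the full ISO 5725-2 analysis is omitted for brevity.

    # Sketch: repeatability (s2_r) and reproducibility (s2_R) variances from a
    # balanced lab x replicate layout via one-way ANOVA estimators (invented data).
    import numpy as np

    # rows = laboratories, columns = replicate determinations of % R-timolol
    data = np.array([
        [0.102, 0.098, 0.101],
        [0.097, 0.095, 0.099],
        [0.105, 0.108, 0.104],
        [0.100, 0.103, 0.099],
    ])
    p, n = data.shape                                  # p labs, n replicates per lab

    lab_means = data.mean(axis=1)
    grand_mean = data.mean()
    ms_within = ((data - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
    ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)

    s2_r = ms_within                                   # repeatability variance
    s2_L = max((ms_between - ms_within) / n, 0.0)      # between-laboratory variance
    s2_R = s2_r + s2_L                                 # reproducibility variance
    print(f"s_r = {np.sqrt(s2_r):.4f}   s_R = {np.sqrt(s2_R):.4f}")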


Subjects
Adrenergic beta-Antagonists/analysis, Electrophoresis, Capillary/methods, Technology Transfer, Timolol/analysis, Drug Contamination, Reproducibility of Results, Uncertainty