Results 1 - 20 of 207
1.
Semin Cancer Biol ; 97: 30-41, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37979714

ABSTRACT

Cardiotoxicity is a common side effect of many cancer therapeutics; however, to date there has been very little push to understand the mechanisms underlying this group of pathologies. This has led to the emergence of cardio-oncology, a field of medicine focused on understanding the effects of cancer and its treatment on the human heart. Here, we describe how mechanistic modeling approaches have been applied to study open questions in the cardiovascular system and how these approaches are increasingly being applied to advance knowledge of the underlying effects of cancer treatments on the human heart. A variety of mechanistic, mathematical modeling techniques have been applied to explore the link between common cancer treatments, such as chemotherapy, radiation, targeted therapy, and immunotherapy, and cardiotoxicity; nevertheless, coverage of the different types of cardiac dysfunction that may be associated with these treatments remains limited. Moreover, cardiac modeling has a rich mathematical heritage and is well suited to the further development of novel approaches for understanding the cardiotoxicities associated with cancer therapeutics. There are many opportunities to combine mechanistic, bottom-up approaches with data-driven, top-down approaches to improve personalized, precision oncology and to better understand, and ultimately mitigate, cardiac dysfunction in cancer patients.


Subject(s)
Antineoplastic Agents , Cardiovascular System , Heart Diseases , Neoplasms , Humans , Neoplasms/pathology , Cardiotoxicity/etiology , Cardiotoxicity/drug therapy , Antineoplastic Agents/adverse effects , Precision Medicine , Heart Diseases/drug therapy , Cardiovascular System/pathology
2.
Brief Bioinform ; 23(6), 2022 Nov 19.
Article in English | MEDLINE | ID: mdl-36184188

ABSTRACT

In recent years, artificial intelligence (AI)/machine learning has emerged as a plausible alternative to systems biology for the elucidation of biological phenomena and for attaining specified design objectives in synthetic biology. Although considered highly disruptive, with numerous notable successes so far, we seek to bring attention to both the fundamental and practical pitfalls of its usage, especially in illuminating emergent behaviors from chaotic or stochastic systems in biology. Without deliberating on its suitability and the required data qualities and pre-processing approaches beforehand, the research and development community could experience 'AI winters' similar to those that have plagued other fields. Instead, we anticipate the integration or combination of the two approaches, where appropriate, moving forward.


Subject(s)
Artificial Intelligence , Systems Biology , Machine Learning
3.
Biotechnol Bioeng ; 2024 Jun 10.
Article in English | MEDLINE | ID: mdl-38853778

ABSTRACT

The fifth modeling workshop (5MW) was held in June 2023 at Favrholm, Denmark, and sponsored by the Recovery of Biological Products Conference Series. The goal of the workshop was to assemble modeling practitioners to review and discuss the current state, progress since the fourth mini modeling workshop (4MMW), and gaps and opportunities for the development, deployment, and maintenance of models in bioprocess applications. The workshop focused on four categories: biophysics and molecular modeling, mechanistic modeling, computational fluid dynamics (CFD), and plant modeling. Highlights of the workshop included significant advancements in applying biophysical/molecular modeling to novel protein constructs, mechanistic models for filtration, and initial forays into CFD modeling of multiphase systems for a bioreactor, mapped strategically to cell line selection/facility fit. A significant impediment to more fully quantitative and calibrated models for biophysics is the lack of large, anonymized datasets. A potential solution would be the use of specific descriptors in a database that would allow detailed analyses without sharing proprietary information. Another gap identified was the lack of a consistent framework for the use of models that are included in, or support, a regulatory filing beyond the high-level guidance in ICH Q8-Q11. One perspective is that modeling can be viewed as a component or precursor of machine learning (ML) and artificial intelligence (AI). Another outcome was alignment on a key definition for "mechanistic modeling." Feedback from participants was that there had been progress in all the fields of modeling within the scope of the conference. Some areas (e.g., biophysics and molecular modeling) have opportunities for significant research investment to realize their full impact. However, the need for ongoing research and development across all model types does not preclude their application in supporting process development, manufacturing, and regulatory filings. Analogous to ML and AI, given the current state of the four modeling types, a prospective investment in educating interdisciplinary subject matter experts (e.g., in data science and chromatography) is essential to advancing the modeling community.

4.
Biotechnol Bioeng ; 121(5): 1609-1625, 2024 May.
Article in English | MEDLINE | ID: mdl-38454575

ABSTRACT

Digitalization has paved the way for new paradigms such as digital shadows and digital twins for fermentation processes, opening the door to real-time process monitoring, control, and optimization. With a digital shadow, real-time model adaptation to accommodate complex metabolic phenomena, such as metabolic shifts of a process, can be monitored. Despite the many benefits of digitalization, its potential has not been fully realized in industry. This study investigates the development of a digital shadow for a fungal fermentation process at pilot scale at Novonesis that is highly complex in terms of microbial physiology and fermentation operation, and the challenges thereof. The process has historically been difficult to optimize and control due to a lack of offline measurements and the absence of biomass measurements. Pilot-scale and lab-scale fermentations were conducted for model development and validation. With all available pilot-scale data, a data-driven soft sensor was developed to estimate the main substrate concentration (glucose) with a normalized root mean squared error (N-RMSE) of 2%. This robust data-driven soft sensor was able to estimate accurately at lab scale (volume < 20× pilot) with an N-RMSE of 7.8%. A hybrid soft sensor was developed by combining the data-driven soft sensor with a mass balance to estimate the glycerol and biomass concentrations on pilot-scale data, with N-RMSEs of 11% and 21%, respectively. A digital shadow modeling framework was developed by coupling a mechanistic model (MM) with the hybrid soft sensor. The digital shadow modeling framework significantly improved predictability compared with the MM alone. This study brings the application of digital shadows closer to industrial implementation. It demonstrates the high potential of using this type of modeling framework for scale-up and leads the way to a new generation of in silico-based process development.
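As a quick illustration of the reported error metric, here is a minimal N-RMSE sketch; normalizing by the observed range is our assumption (mean-based normalizations are also common), and the glucose values are hypothetical.

```python
import numpy as np

def n_rmse(y_true, y_pred):
    """Normalized root mean squared error, as a fraction of the observed range."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (np.max(y_true) - np.min(y_true))

# Toy check: ~2% N-RMSE corresponds to small errors relative to the signal range.
glucose_obs = np.array([20.0, 15.0, 10.0, 5.0, 1.0])   # hypothetical g/L
glucose_est = np.array([19.6, 15.3, 9.8, 5.2, 1.1])    # hypothetical soft-sensor output
print(f"N-RMSE: {n_rmse(glucose_obs, glucose_est):.1%}")
```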


Subject(s)
Bioreactors , Glucose , Fermentation , Bioreactors/microbiology , Glycerol , Biomass
5.
Biotechnol Bioeng ; 2024 May 02.
Article in English | MEDLINE | ID: mdl-38695152

ABSTRACT

The in vitro transcription (IVT) reaction used in the production of messenger RNA vaccines and therapies remains poorly understood quantitatively. Mechanistic modeling of IVT could inform reaction design, scale-up, and control. In this work, we develop a mechanistic model of IVT that includes nucleation and growth of magnesium pyrophosphate crystals and the subsequent agglomeration of crystals and DNA. To help generalize the model to different constructs, a novel quantitative description is included for the rate of transcription as a function of target sequence length, DNA concentration, and T7 RNA polymerase concentration. The model explains previously unexplained trends in IVT data and quantitatively predicts the effect of adding the pyrophosphatase enzyme to the reaction system. The model is validated on additional literature data, showing an ability to predict transcription rates as a function of RNA sequence length.
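The following is a highly simplified sketch of a batch IVT mass balance of the kind such a mechanistic model builds on; it omits the crystallization and agglomeration phenomena described above, and every parameter value, including the saturable rate form, is an illustrative assumption.

```python
from scipy.integrate import solve_ivp

# Minimal batch-IVT sketch (illustrative, not the authors' full model): nucleotide
# incorporation is saturable in NTPs, proportional to T7 RNAP engaged on template,
# and one full RNA of length L nt consumes L nucleotides. All values are assumed.
k_cat, K_m, K_D = 100.0, 1.0, 1e-5   # nt/s per polymerase; mM; mM (assumed)
T7, DNA = 1e-4, 5e-5                 # total polymerase and template, mM (assumed)
L = 2000.0                           # transcript length, nt

def rhs(t, y):
    ntp, rna = y
    active = T7 * DNA / (K_D + DNA)            # polymerase bound to template, mM
    v_nt = k_cat * active * ntp / (K_m + ntp)  # nucleotide incorporation, mM/s
    return [-v_nt, v_nt / L]                   # each full RNA uses L nucleotides

sol = solve_ivp(rhs, (0.0, 7200.0), [40.0, 0.0])   # 2 h reaction, 40 mM total NTPs
print(f"RNA after 2 h: {sol.y[1, -1]:.3f} mM")
```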

6.
Biotechnol Bioeng ; 121(2): 719-734, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37942560

ABSTRACT

Recombinant adeno-associated viral vectors (rAAVs) have become an industry-standard technology in the field of gene therapy, but challenges remain in their biomanufacturing. One of the biggest challenges is the removal of capsid species other than the one containing the gene of interest. In this work, we develop a mechanistic model for the removal of empty capsids-those that contain no genetic material-and the enrichment of full rAAV using anion-exchange membrane chromatography. The mechanistic model was calibrated using linear gradient experiments, resulting in good agreement with the experimental data. The model was then applied to optimize the purification process through maximization of yield, studying the impact of mobile-phase salt concentration and pH, isocratic wash and elution lengths, flow rate, percent-full (purity) requirement, loading density (challenge), and the use of single-step or two-step elution modes. A solution from the optimization, with a purity of 90% and a recovery yield of 84%, was selected and successfully validated: the model predicted the recovery yield with remarkable fidelity and found process conditions that led to significant enrichment. To the best of our knowledge, this is the first case study applying de novo mechanistic modeling to the enrichment of full capsids in rAAV manufacturing, and it demonstrates the potential of mechanistic modeling in rAAV process development.


Subject(s)
Dependovirus , Genetic Vectors , Chromatography, Ion Exchange/methods , Dependovirus/genetics , Genetic Therapy , Capsid/chemistry
7.
Pharm Res ; 41(7): 1369-1379, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38918309

ABSTRACT

PURPOSE: Recently, there has been rapid development in model-informed drug development, which has the potential to reduce animal experiments and accelerate drug discovery. Physiologically based pharmacokinetic (PBPK) and machine learning (ML) models are commonly used in early drug discovery to predict drug properties. However, basic PBPK models require a large number of molecule-specific inputs from in vitro experiments, which hinders the efficiency and accuracy of these models. To address this issue, this paper introduces a new computational platform that combines ML and PBPK models. The platform predicts molecule PK profiles with high accuracy and without the need for experimental data. METHODS: This study developed a whole-body PBPK model and ML models of plasma protein fraction unbound (fup), Caco-2 cell permeability, and total plasma clearance to predict the PK of small molecules after intravenous administration. Pharmacokinetic profiles were simulated using a "bottom-up" PBPK modeling approach with ML inputs. Additionally, 40 compounds were used to evaluate the platform's accuracy. RESULTS: Results showed that the ML-PBPK model predicted the area under the concentration-time curve (AUC) with 65.0% accuracy within a 2-fold range, higher than the 47.5% accuracy obtained using in vitro inputs. CONCLUSION: The ML-PBPK model platform provides high prediction accuracy and reduces the number of experiments and the time required compared with traditional PBPK approaches. The platform successfully predicts human PK parameters without in vitro and in vivo experiments and can potentially guide early drug discovery and development.
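A within-fold accuracy metric of the kind reported above can be computed as a simple sketch; the AUC values below are hypothetical.

```python
import numpy as np

def pct_within_fold(pred, obs, fold=2.0):
    """Fraction of predictions within a given fold range of the observations."""
    ratio = np.asarray(pred) / np.asarray(obs)
    return np.mean((ratio >= 1.0 / fold) & (ratio <= fold))

# Hypothetical AUC predictions vs. observations (ng*h/mL) for illustration.
auc_obs  = np.array([120.0, 45.0, 800.0, 10.0])
auc_pred = np.array([150.0, 30.0, 2100.0, 12.0])
print(f"within 2-fold: {pct_within_fold(auc_pred, auc_obs):.1%}")
```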


Subject(s)
Machine Learning , Models, Biological , Humans , Caco-2 Cells , Computer Simulation , Pharmacokinetics , Drug Discovery/methods , Area Under Curve , Administration, Intravenous , Male , Pharmaceutical Preparations/metabolism , Blood Proteins/metabolism
8.
Proc Natl Acad Sci U S A ; 118(15), 2021 Apr 13.
Article in English | MEDLINE | ID: mdl-33837149

ABSTRACT

Many animals restrict their movements to a characteristic home range. This constrained pattern of space use is thought to result from the foraging benefits of memorizing the locations and quality of heterogeneously distributed resources. However, due to the confounding effects of sensory perception, the role of memory in home-range movement behavior lacks definitive evidence in the wild. Here, we analyze the foraging decisions of a large mammal during a field resource manipulation experiment designed to disentangle the effects of memory and perception. We parametrize a mechanistic model of spatial transitions using experimental data to quantify the cognitive processes underlying animal foraging behavior and to predict how individuals respond to resource heterogeneity in space and time. We demonstrate that roe deer (Capreolus capreolus) rely on memory, not perception, to track the spatiotemporal dynamics of resources within their home range. Roe deer foraging decisions were primarily based on recent experience (half-lives of 0.9 and 5.6 d for attribute and spatial memory, respectively), enabling them to adapt to sudden changes in resource availability. The proposed memory-based model was able to both quantify the cognitive processes underlying roe deer behavior and accurately predict how they shifted resource use during the experiment. Our study highlights the fact that animal foraging decisions are based on incomplete information on the locations of available resources, a factor that is critical to developing accurate predictions of animal spatial behavior but is typically not accounted for in analyses of animal movement in the wild.
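To make the reported half-lives concrete, here is a sketch of exponential memory discounting, in which the weight of an experience t days old decays as 2^(-t/h); the exact functional form of the paper's memory-based model may differ.

```python
import numpy as np

def memory_weight(t_days, half_life_days):
    """Weight given to an experience t days old, halving every half-life."""
    return 0.5 ** (np.asarray(t_days) / half_life_days)

t = np.array([0.0, 1.0, 5.6, 14.0])          # days since the experience
print("attribute memory (h=0.9 d):", np.round(memory_weight(t, 0.9), 3))
print("spatial memory   (h=5.6 d):", np.round(memory_weight(t, 5.6), 3))
```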


Subject(s)
Deer/physiology , Feeding Behavior , Memory , Animals , Cognition , Decision Making , Movement
9.
Proc Natl Acad Sci U S A ; 118(40), 2021 Oct 05.
Article in English | MEDLINE | ID: mdl-34599095

ABSTRACT

Far from a uniform band, the biodiversity found across Earth's tropical moist forests varies widely between the high diversity of the Neotropics and Indomalaya and the relatively lower diversity of the Afrotropics. Explanations for this variation across regions, the "pantropical diversity disparity" (PDD), remain contentious, due to the difficulty of teasing apart the effects of contemporary climate and paleoenvironmental history. Here, we assess the ubiquity of the PDD in over 150,000 species of terrestrial plants and vertebrates and investigate the relationship between present-day climate and patterns of species richness. We then investigate the consequences of paleoenvironmental dynamics for the emergence of biodiversity gradients using a spatially explicit model of diversification coupled with paleoenvironmental and plate tectonic reconstructions. Contemporary climate is insufficient to explain the PDD; instead, a simple model of diversification and temperature niche evolution coupled with paleoaridity constraints successfully reproduces the variation in species richness and phylogenetic diversity seen repeatedly among plant and animal taxa, suggesting a prevalent role of paleoenvironmental dynamics in combination with niche conservatism. The model indicates that high biodiversity in Neotropical and Indomalayan moist forests is driven by complex macroevolutionary dynamics associated with mountain uplift. In contrast, lower diversity in Afrotropical forests is associated with lower speciation rates and higher extinction rates driven by sustained aridification over the Cenozoic. Our analyses provide a mechanistic understanding of the emergence of uneven diversity in tropical moist forests across 110 Ma of Earth's history, highlighting the importance of deep-time paleoenvironmental legacies in determining biodiversity patterns.


Subject(s)
Biodiversity , Forests , Tropical Climate , Animals , Biological Evolution , Earth, Planet
10.
Mol Biol Evol ; 39(4), 2022 Apr 11.
Article in English | MEDLINE | ID: mdl-35298641

ABSTRACT

Research over the past two decades has made substantial inroads into our understanding of somatic mutations. Recently, these studies have focused on understanding their presence in homeostatic tissue. In parallel, agent-based mechanistic models have emerged as an important tool for understanding somatic mutation in tissue, yet no common methodology currently exists to provide base-pair-resolution data for these models. Here, we present Gattaca as the first method for introducing and tracking somatic mutations at base-pair resolution within agent-based models, which typically lack nuclei. With nuclei that incorporate human reference genomes, mutational context, and sequence coverage/error information, Gattaca is able to realistically evolve sequence data, facilitating comparisons between in silico cell tissue modeling and experimental human somatic mutation data. This user-friendly method, incorporated into each in silico cell, allows us to fully capture somatic mutation spectra and evolution.
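A minimal sketch of the general idea, inspired by but not reproducing Gattaca, is shown below: each in silico cell carries the set of its mutated genome positions, and divisions add Poisson-distributed de novo mutations; the genome size and per-division mutation rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

GENOME_SIZE = 3_000_000_000   # roughly the human genome, bp
MU = 1.2                      # assumed new mutations per division

def divide(parent_mutations):
    """Return a daughter cell's mutation set: inherited plus de novo sites."""
    n_new = rng.poisson(MU)
    new_sites = set(map(int, rng.integers(0, GENOME_SIZE, size=n_new)))
    return parent_mutations | new_sites

lineage = set()
for _ in range(10):           # follow one lineage through ten divisions
    lineage = divide(lineage)
print(f"mutations carried after 10 divisions: {len(lineage)}")
```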


Subject(s)
Genome, Human , Neoplasms , Clonal Evolution , Humans , Mutation , Neoplasms/genetics
11.
New Phytol ; 240(5): 1743-1757, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37753542

ABSTRACT

The oxygen isotope composition (δ18O) of tree-ring cellulose is used to evaluate tree physiological responses to climate, but its interpretation is still limited due to the complexity of the isotope fractionation pathways. We assessed the relative contribution of seasonal needle and xylem water δ18O variations to the intra-annual tree-ring cellulose δ18O signature of larch trees at two sites with contrasting soil water availability in the Swiss Alps. We combined biweekly δ18O measurements of soil water, needle water, and twig xylem water with intra-annual δ18O measurements of tree-ring cellulose, xylogenesis analysis, and mechanistic and structural equation modeling. Intra-annual cellulose δ18O values resembled source water δ18O mean levels better than needle water δ18O. Large parts of the rings were formed under a high proportional exchange with unenriched xylem water (pex). Maximum pex values were reached in August and imprinted on sections at 50-75% of the ring. High pex values were associated with periods of high atmospheric evaporative demand (VPD). While VPD governed needle water δ18O variability, we estimated a limited Péclet effect at both sites. Due to the variable pex, source water has a strong influence over large parts of the intra-annual tree-ring cellulose δ18O variations, potentially masking signals coming from needle-level processes.
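For context, the formulations commonly used in this literature for cellulose δ18O and the Péclet effect (Barbour-Farquhar style) are sketched below; these are background equations, not necessarily the authors' exact model. Here pex is the proportional exchange with unenriched source water, px the proportion of unenriched water at the site of synthesis, εwc the biosynthetic fractionation (~27‰), E the transpiration rate, L the effective path length, C the molar concentration of water, and D the diffusivity of H218O.

```latex
\[
  \Delta^{18}\mathrm{O}_{\text{cellulose}}
    = \Delta^{18}\mathrm{O}_{\text{leaf water}}\,\bigl(1 - p_{ex}\,p_{x}\bigr)
      + \varepsilon_{wc},
  \qquad
  \Delta^{18}\mathrm{O}_{\text{leaf water}}
    = \Delta^{18}\mathrm{O}_{e}\,\frac{1 - e^{-\wp}}{\wp},
  \qquad
  \wp = \frac{E\,L}{C\,D}
\]
```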


Subject(s)
Trees , Water , Trees/metabolism , Water/metabolism , Oxygen Isotopes/metabolism , Xylem/metabolism , Cellulose/metabolism , Soil/chemistry , Carbon Isotopes/metabolism
12.
Biotechnol Bioeng ; 2023 May 31.
Article in English | MEDLINE | ID: mdl-37256724

ABSTRACT

An optimal purification process for biopharmaceutical products is important both to meet strict safety regulations and for economic benefits. To find the global optimum, it is desirable to screen the overall design space. Advanced model-based approaches enable screening of a broad range of the design space, in contrast to traditional statistical or heuristic approaches. However, chromatographic mechanistic modeling (MM), one of these advanced model-based approaches, can be speed-limiting for flowsheet optimization, which evaluates every purification possibility (e.g., the type and order of purification techniques and their operating conditions). Therefore, we propose to use artificial neural networks (ANNs) during global optimization to select the most promising flowsheets, reducing the number of flowsheets for final local optimization and, consequently, the overall optimization time. Employing ANNs during global optimization reduced the number of flowsheets from 15 to only 3. From these three, one flowsheet was optimized locally, and similar final results were found when using the global outcome of either the ANN or the MM as the starting condition. Moreover, the overall flowsheet optimization time was reduced by 50% when using ANNs during global optimization. This approach accelerates early purification process design; moreover, it is generic, flexible, and independent of the sample material's type.
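The surrogate idea can be sketched as follows: train an ANN on condition-to-objective pairs generated by the slow mechanistic model, then rank many candidate conditions cheaply before running local MM-based optimization on a shortlist. The two-dimensional input and the quadratic stand-in for the MM below are purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Training data: (operating conditions -> objective) pairs that a mechanistic
# model would normally produce; a quadratic function stands in for the MM here.
X = rng.uniform(0, 1, size=(200, 2))                          # e.g., scaled salt, pH
y = 1.0 - (X[:, 0] - 0.6) ** 2 - 0.5 * (X[:, 1] - 0.4) ** 2   # stand-in MM yield

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X, y)

# Cheap global screen: rank many candidates, keep a shortlist for local MM runs.
candidates = rng.uniform(0, 1, size=(1000, 2))
scores = surrogate.predict(candidates)
top = candidates[np.argsort(scores)[-3:]]
print("shortlisted conditions:\n", np.round(top, 3))
```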

13.
Mol Pharm ; 20(11): 5616-5630, 2023 Nov 06.
Article in English | MEDLINE | ID: mdl-37812508

ABSTRACT

Accurate prediction of human pharmacokinetics (PK) remains one of the key objectives of drug metabolism and PK (DMPK) scientists in drug discovery projects. This is typically performed using in vitro-in vivo extrapolation (IVIVE) based on mechanistic PK models. In recent years, machine learning (ML), with its ability to harness patterns from previous outcomes to predict future events, has gained increased popularity in application to absorption, distribution, metabolism, and excretion (ADME) sciences. This study compares the performance of various ML and mechanistic models for the prediction of human IV clearance for a large set of 645 diverse compounds with literature human IV PK data, as well as relevant measured in vitro endpoints. ML models were built using multiple approaches for the descriptors: (1) calculated physical properties and structural descriptors based on chemical structure alone (classical QSAR/QSPR); (2) in vitro measured inputs only, with no structure-based descriptors (ML IVIVE); and (3) in silico ML IVIVE, using in silico model predictions for the in vitro inputs. For the mechanistic models, well-stirred and parallel-tube liver models were considered, with and without empirical scaling factors and with and without renal clearance. The best ML model for the prediction of in vivo human intrinsic clearance (CLint) was an in vitro ML IVIVE model using only six in vitro inputs, with an average absolute fold error (AAFE) of 2.5. The best mechanistic model used the parallel-tube liver model with empirical scaling factors, resulting in an AAFE of 2.8. The corresponding mechanistic model with full in silico inputs achieved an AAFE of 3.3. These relative performances were confirmed by predicting 16 Pfizer drug candidates that were not part of the original data set. Results show that ML IVIVE models are comparable or superior to their best mechanistic counterparts. We also show that ML IVIVE models can be used to derive insights into factors for improving mechanistic PK prediction.
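The AAFE metric used above has a standard definition, sketched here with hypothetical clearance values.

```python
import numpy as np

def aafe(pred, obs):
    """Average absolute fold error: 10 ** mean(|log10(pred / obs)|)."""
    return 10.0 ** np.mean(np.abs(np.log10(np.asarray(pred) / np.asarray(obs))))

# Hypothetical clearance predictions vs. observations (mL/min/kg) for illustration.
cl_obs  = np.array([10.0, 3.0, 50.0, 7.0])
cl_pred = np.array([18.0, 2.0, 70.0, 9.0])
print(f"AAFE = {aafe(cl_pred, cl_obs):.2f}")
```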


Subject(s)
Body Fluids , Humans , Computer Simulation , Drug Discovery , Kinetics , Machine Learning , Models, Biological , Metabolic Clearance Rate
14.
Mol Pharm ; 20(11): 5332-5344, 2023 Nov 06.
Article in English | MEDLINE | ID: mdl-37783568

ABSTRACT

Dry powder inhaler (DPI) products are commonly formulated as a mixture of micronized drug particles and large carrier particles, with or without additional fine particle excipients, followed by final powder filling into dose containment systems such as capsules, blisters, or reservoirs. DPI product manufacturing consists of a series of unit operations, including particle size reduction, blending, and filling. This review provides an overview of the relevant critical process parameters used for jet milling, high-shear blending, and dosator/drum capsule filling operations across commonly utilized instruments. Further, this review describes the recent achievements regarding the application of empirical and mechanistic models, especially discrete element method (DEM) simulation, in DPI process development. Although to date only limited modeling/simulation work has been accomplished, in the authors' perspective, process design and development are destined to be more modeling/simulation driven with the emphasis on evaluating the impact of material attributes/process parameters on process performance. The advancement of computational power is expected to enable modeling/simulation approaches to tackle more complex problems with better accuracy when dealing with real-world DPI process operations.


Subject(s)
Drug Carriers , Dry Powder Inhalers , Powders , Drug Compounding/methods , Administration, Inhalation , Computer Simulation , Particle Size , Aerosols
15.
J Theor Biol ; 557: 111334, 2023 Jan 21.
Article in English | MEDLINE | ID: mdl-36306828

ABSTRACT

The COVID-19 pandemic has underscored the need to understand the dynamics of SARS-CoV-2 respiratory infection and protection provided by the immune response. SARS-CoV-2 infections are characterized by a particularly high viral load, and further by the small number of inhaled virions sufficient to generate a high viral titer in the nasal passage a few days after exposure. SARS-CoV-2 specific antibodies (Ab), induced from vaccines, previous infection, or inhaled monoclonal Ab, have proven effective against SARS-CoV-2 infection. Our goal in this work is to model the protective mechanisms that Ab can provide and to assess the degree of protection from individual and combined mechanisms at different locations in the respiratory tract. Neutralization, in which Ab bind to virion spikes and inhibit them from binding to and infecting target cells, is one widely reported protective mechanism. A second mechanism of Ab protection is muco-trapping, in which Ab crosslink virions to domains on mucin polymers, effectively immobilizing them in the mucus layer. When muco-trapped, the continuous clearance of the mucus barrier by coordinated ciliary propulsion entrains the trapped viral load toward the esophagus to be swallowed. We model and simulate the protection provided by either and both mechanisms at different locations in the respiratory tract, parametrized by the Ab titer and binding-unbinding rates of Ab to viral spikes and mucin domains. Our results illustrate limits in the degree of protection by neutralizing Ab alone, the powerful protection afforded by muco-trapping Ab, and the potential for dual protection by muco-trapping and neutralizing Ab to arrest a SARS-CoV-2 infection. This manuscript was submitted as part of a theme issue on "Modelling COVID-19 and Preparedness for Future Pandemics".
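A minimal two-compartment sketch of the two antibody mechanisms is shown below; the rate constants and their units are illustrative assumptions, not the authors' parametrization.

```python
from scipy.integrate import solve_ivp

# Free virions V are neutralized by antibody binding (k_n*Ab) or crosslinked to
# mucins, i.e. muco-trapped (k_t*Ab); trapped virus T is cleared by mucociliary
# transport (k_mcc); V is also lost to cell infection (beta). All rates assumed.
k_n, k_t, k_mcc, beta = 0.5, 0.8, 0.3, 0.1   # 1/h (illustrative)
Ab = 1.0                                      # dimensionless antibody titer

def rhs(t, y):
    V, T = y
    return [-(k_n * Ab + k_t * Ab + beta) * V,   # free virus lost to all routes
            k_t * Ab * V - k_mcc * T]            # trapped virus, then cleared

sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0])
print(f"free virus remaining after 24 h: {sol.y[0, -1]:.3e}")
```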


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Pandemics , Antibodies, Viral , Respiratory System , Mucins
16.
J Theor Biol ; 565: 111470, 2023 May 21.
Article in English | MEDLINE | ID: mdl-36965846

ABSTRACT

The SARS-CoV-2 coronavirus continues to evolve, with scores of mutations of the spike, membrane, envelope, and nucleocapsid structural proteins that impact pathogenesis. Infection data from nasal swabs, nasal PCR assays, upper respiratory samples, ex vivo cell cultures, and nasal epithelial organoids reveal extreme variability in SARS-CoV-2 RNA titers within and between the variants. Some of this variability is attributable to clinical testing protocols and experimental controls. Here we focus on nasal viral load sensitivity arising from the timing of sample collection relative to the onset of infection and from heterogeneity in the kinetics of cellular infection, uptake, replication, and shedding of viral RNA copies. The sources of between-variant variability are likely SARS-CoV-2 structural protein mutations, whereas within-variant population variability is likely due to heterogeneity in the cellular response to that particular variant. With the physiologically faithful, agent-based mechanistic model of inhaled exposure and infection from (Chen et al., 2022), we perform statistical sensitivity analyses of the progression of nasal viral titers in the first 0-48 h post infection, focusing on three kinetic mechanisms. Model simulations reveal that shorter latency times of infected cells (spanning cellular uptake and viral RNA replication up to the onset of viral RNA shedding) exponentially accelerate nasal viral load. Further, the rate of infectious RNA copies shed per day has a proportional influence on nasal viral load. Finally, there is a very weak, negative correlation of viral load with the probability of infection per virus-cell encounter, the model's proxy for spike-receptor binding affinity.


Subject(s)
COVID-19 , SARS-CoV-2 , Humans , SARS-CoV-2/genetics , RNA, Viral/genetics , Viral Load , COVID-19 Testing
17.
Crit Rev Food Sci Nutr ; 63(26): 8275-8291, 2023.
Article in English | MEDLINE | ID: mdl-35380483

ABSTRACT

Drying is one of the oldest and most widely used methods of food preservation. It reduces the availability of moisture and inhibits microbial and enzymatic spoilage in food products. Foam mat drying is a mild drying technique used for semiliquid and liquid foodstuffs. It is useful for heat-sensitive and sticky liquid food products. In this process, liquid food is converted into foam using surfactant additives, which can be a foaming agent or a foam stabilizer. These additives are surface-active compounds of plant or animal origin. The foamed material is then convectively dried using hot air. Foam mat drying is an efficient and economical technique. With the emergence of hybrid techniques such as foam mat freeze drying, foamed spray drying, foamed vacuum drying, and microwave-assisted foam mat drying, the physical, chemical, and functional properties of the resulting powders have been enhanced many-fold. These strategies have shown very promising results in terms of cost and time efficiency in almost all cases, barring a few exceptions. This review article attempts to comprehensively summarize the mechanisms dictating the foam mat drying process, novel technological tools for modeling, mathematical and computational modeling, the effects of various foaming additives, and the various hybrid techniques applied to foam mat drying.


Subject(s)
Desiccation , Hot Temperature , Animals , Desiccation/methods , Freeze Drying , Surface-Active Agents , Excipients
18.
Pharm Res ; 40(2): 359-373, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35169960

ABSTRACT

PURPOSE: In drug discovery, rats are widely used for pharmacological and toxicological studies. We previously reported that a mechanism-based oral absorption model, the gastrointestinal unified theoretical framework (GUT framework), can appropriately predict the fraction of a dose absorbed (Fa) in humans and dogs. However, there are large species differences between humans and rats. The purpose of the present study was to evaluate the predictability of the GUT framework for rat Fa. METHOD: The Fa values of 20 model drugs (a total of 39 Fa data) were predicted in a bottom-up manner. Based on the literature survey, the bile acid concentration (Cbile) and the intestinal fluid volume were set to 15 mM and 4 mL/kg, respectively, five and two times higher than in humans. LogP, pKa, molecular weight, intrinsic solubility, bile micelle partition coefficients, and Caco-2 permeability were used as input data. RESULTS: The Fa values were appropriately predicted for highly soluble drugs (absolute average fold error (AAFE) = 1.65, 18 Fa data) and poorly soluble drugs (AAFE = 1.57, 21 Fa data). When the species difference in Cbile was ignored, Fa was over- and under-predicted for permeability and solubility limited cases, respectively. High Cbile in rats reduces the free fraction of drug molecules available for epithelial membrane permeation while increasing the solubility of poorly soluble drugs. CONCLUSION: The Fa values in rats were appropriately predicted by the GUT framework. This result would be of great help for a better understanding of species differences and model-informed preclinical formulation development.
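As an illustration of the bile-micelle effect described in the conclusion, here is a sketch using a simple linear partition model for the free (monomer) fraction; the functional form and the Kbm value are assumptions for illustration, not the GUT framework's exact equations.

```python
# Higher bile acid concentration cuts the free fraction available for epithelial
# permeation while raising apparent solubility (simple partition model sketch).
def free_fraction(k_bm, c_bile_mM):
    """Fraction of dissolved drug outside bile micelles, assuming fu = 1/(1 + Kbm*Cbile)."""
    return 1.0 / (1.0 + k_bm * c_bile_mM)

K_BM = 0.2                          # per-mM micelle partition coefficient (assumed)
for c_bile in (3.0, 15.0):          # human-like vs. rat-like bile acid levels, mM
    print(f"Cbile = {c_bile:4.1f} mM -> free fraction = {free_fraction(K_BM, c_bile):.2f}")
```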


Subject(s)
Bile , Intestinal Absorption , Humans , Rats , Animals , Dogs , Administration, Oral , Caco-2 Cells , Drug Discovery , Solubility , Permeability
19.
J Sep Sci ; 46(9): e2300031, 2023 May.
Article in English | MEDLINE | ID: mdl-36846902

ABSTRACT

In process development and characterization, the scale-up of chromatographic steps is crucial and brings a number of challenges. Usually, scale-down models are used to represent the process step, and constant column properties are assumed. The scaling is then typically based on the concept of linear scale-up. In this work, a mechanistic model describing the anti-Langmuirian to Langmuirian elution behavior of a polypeptide, calibrated with a pre-packed 1 mL column, is applied to demonstrate scalability to larger column volumes of up to 28.2 mL. Using individual column parameters for each column size, scaling to similar eluting salt concentrations, peak heights, and peak shapes is experimentally demonstrated by considering the model's relationship between the normalized gradient slope and the eluting salt concentration. Further scale-up simulations show improved model predictions when radial inhomogeneities in packing quality are considered.
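A sketch of the linear scale-up convention referenced above: volumetric quantities scale with column volume, while residence time and gradient length in column volumes (CV) stay fixed. The small-column numbers are hypothetical; the 28.2× factor mirrors the 1 mL to 28.2 mL scaling in the study.

```python
# Linear scale-up sketch: scale volumetric quantities by the column-volume ratio,
# keep residence time, gradient length (in CV), and load density constant.
def linear_scale_up(small, scale_factor):
    return {
        "column_volume_mL": small["column_volume_mL"] * scale_factor,
        "flow_rate_mL_min": small["flow_rate_mL_min"] * scale_factor,  # same residence time
        "load_mg":          small["load_mg"] * scale_factor,           # same mg/mL load
        "gradient_CV":      small["gradient_CV"],                      # unchanged in CV
    }

small_col = {"column_volume_mL": 1.0, "flow_rate_mL_min": 0.5,
             "load_mg": 20.0, "gradient_CV": 20.0}
print(linear_scale_up(small_col, 28.2))   # hypothetical 1 mL -> 28.2 mL scale-up
```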

20.
Am J Epidemiol ; 191(1): 1-6, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-34447984

ABSTRACT

Dynamical models, commonly used in infectious disease epidemiology, are formal mathematical representations of time-changing systems or processes. For many chronic disease epidemiologists, the link between dynamical models and predominant causal inference paradigms is unclear. In this commentary, we explain the use of dynamical models for representing causal systems and the relevance of dynamical models for causal inference. In certain simple settings, dynamical modeling and conventional statistical methods (e.g., regression-based methods) are equivalent, but dynamical modeling has advantages over conventional statistical methods for many causal inference problems. Dynamical models can be used to transparently encode complex biological knowledge, interference and spillover, effect modification, and variables that influence each other in continuous time. As our knowledge of biological and social systems and access to computational resources increases, there will be growing utility for a variety of mathematical modeling tools in epidemiology.
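As a concrete example of the kind of dynamical model discussed here, a minimal SIR sketch follows; the rates and population size are illustrative. The force-of-infection term beta*S*I/N is what encodes dependence between individuals (interference/spillover), which regression on independent observations cannot represent directly.

```python
from scipy.integrate import solve_ivp

# Minimal SIR model: susceptibles S are infected at rate beta*S*I/N and
# infecteds I recover at rate gamma. Parameter values are illustrative.
beta, gamma, N = 0.3, 0.1, 1_000_000.0   # 1/day rates; population size

def sir(t, y):
    S, I, R = y
    new_infections = beta * S * I / N    # force of infection couples individuals
    return [-new_infections, new_infections - gamma * I, gamma * I]

sol = solve_ivp(sir, (0.0, 365.0), [N - 1.0, 1.0, 0.0], max_step=1.0)
print(f"peak prevalence: {sol.y[1].max():.0f} infections")
```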


Subject(s)
Causality , Epidemiologic Methods , Models, Theoretical , Humans , Time Factors