ABSTRACT
BACKGROUND: The transition from explanatory modeling of fitted data to predictive modeling of unseen data in systems biology requires the effective recovery of reaction parameters. Yet the relative efficacy of optimization algorithms in doing so remains under-studied with respect to the specific reaction kinetics and the effect of measurement noise. To this end, we simulate the reactions of an artificial pathway using four kinetic formulations: generalized mass action (GMA), Michaelis-Menten, linear-logarithmic, and convenience kinetics. We then compare the effectiveness of five evolutionary algorithms (CMAES, DE, SRES, ISRES, G3PCX) for objective function optimization in kinetic parameter hyperspace to determine the corresponding estimated parameters. RESULTS: We quickly dropped the DE algorithm due to its poor performance. Barring measurement noise, we find that the CMAES algorithm requires only a fraction of the computational cost incurred by the other EAs for both GMA and linear-logarithmic kinetics while performing as well by other criteria. However, with increasing noise, SRES and ISRES perform more reliably for GMA kinetics, but at considerably higher computational cost. Conversely, G3PCX is among the most efficacious for estimating Michaelis-Menten parameters regardless of noise, while achieving severalfold savings in computational cost. Cost aside, we find SRES to be versatile across GMA, Michaelis-Menten, and linear-logarithmic kinetics, with good resilience to noise. Nonetheless, we could not identify the parameters of convenience kinetics using any algorithm. CONCLUSIONS: Altogether, we identify a protocol for predicting reaction parameters under marked measurement noise, as a step towards predictive modeling for systems biology.
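The estimation task described above can be sketched as minimizing a sum-of-squared-errors objective over kinetic parameters with an evolutionary search. The toy example below recovers hypothetical Michaelis-Menten parameters with a bare-bones elitist (mu + lambda) evolution strategy; the data, parameter values, and algorithm settings are illustrative stand-ins, not the study's actual EAs or benchmark pathway.

```python
import random

random.seed(0)

# Hypothetical "true" Michaelis-Menten parameters to recover from data.
VMAX_TRUE, KM_TRUE = 2.0, 0.5
substrates = [0.05 * i for i in range(1, 41)]
rates = [VMAX_TRUE * s / (KM_TRUE + s) for s in substrates]  # noise-free for clarity

def sse(params):
    """Sum-of-squared-errors objective over the kinetic parameter space."""
    vmax, km = params
    return sum((vmax * s / (km + s) - r) ** 2 for s, r in zip(substrates, rates))

def evolve(pop_size=30, generations=300, sigma=0.3):
    # Elitist (mu + lambda) strategy: mutate random parents, pool parents and
    # children, keep the best; anneal the mutation step size with a small floor.
    pop = [[random.uniform(0.1, 5.0), random.uniform(0.1, 5.0)] for _ in range(pop_size)]
    for _ in range(generations):
        children = [[max(1e-6, p + random.gauss(0.0, sigma)) for p in random.choice(pop)]
                    for _ in range(pop_size)]
        pop = sorted(pop + children, key=sse)[:pop_size]
        sigma = max(sigma * 0.98, 0.01)
    return pop[0]

best_vmax, best_km = evolve()
```

In this noise-free setting the search typically recovers the generating parameters closely; the study's point is precisely that this recovery degrades differently across algorithms once measurement noise and kinetic formulation vary.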
Subjects
Algorithms, Kinetics, Systems Biology/methods, Biological Models, Computer Simulation, Biological Evolution
ABSTRACT
Cardiotoxicity is a common side effect of many cancer therapeutics; however, to date there has been very little push to understand the mechanisms underlying this group of pathologies. This has led to the emergence of cardio-oncology, a field of medicine focused on understanding the effects of cancer and its treatment on the human heart. Here, we describe how mechanistic modeling approaches have been applied to study open questions in the cardiovascular system and how these approaches are increasingly being applied to advance knowledge of the underlying effects of cancer treatments on the human heart. A variety of mechanistic, mathematical modeling techniques have been applied to explore the link between common cancer treatments, such as chemotherapy, radiation, targeted therapy, and immunotherapy, and cardiotoxicity; nevertheless, coverage of the different types of cardiac dysfunction that may be associated with these treatments remains limited. Moreover, cardiac modeling has a rich heritage of mathematical modeling and is well suited for the further development of novel approaches to understanding the cardiotoxicities associated with cancer therapeutics. There are many opportunities to combine mechanistic, bottom-up approaches with data-driven, top-down approaches to improve personalized, precision oncology and to better understand, and ultimately mitigate, cardiac dysfunction in cancer patients.
Subjects
Antineoplastic Agents, Cardiovascular System, Heart Diseases, Neoplasms, Humans, Neoplasms/pathology, Cardiotoxicity/etiology, Cardiotoxicity/drug therapy, Antineoplastic Agents/adverse effects, Precision Medicine, Heart Diseases/drug therapy, Cardiovascular System/pathology
ABSTRACT
In recent years, artificial intelligence (AI)/machine learning has emerged as a plausible alternative to systems biology for the elucidation of biological phenomena and for attaining specified design objectives in synthetic biology. Although considered highly disruptive, with numerous notable successes so far, we seek to bring attention to both the fundamental and practical pitfalls of their usage, especially in illuminating emergent behaviors from chaotic or stochastic systems in biology. Without deliberating beforehand on their suitability and the required data qualities and pre-processing approaches, the research and development community could experience 'AI winters' similar to those that have plagued other fields. Instead, we anticipate the integration or combination of the two approaches, where appropriate, moving forward.
Subjects
Artificial Intelligence, Systems Biology, Machine Learning
ABSTRACT
Recombinant adeno-associated virus (rAAV) is a commonly used in vivo gene therapy vector because of its nonpathogenicity, long-term transgene expression, broad tropism, and ability to transduce both dividing and nondividing cells. However, rAAV vector production via transient transfection of mammalian cells typically yields a low fraction of filled-to-total capsids (~1%-30% of total capsids produced). Analysis of our previously developed mechanistic model for rAAV2/5 production attributed these low fill fractions to a poorly coordinated timeline between capsid synthesis and viral DNA replication and to the repression of later-phase capsid formation by Rep proteins. Here, we extend the model by quantifying the expression dynamics of total Rep proteins and their influence on the key steps of rAAV2/5 production using a multiple-dosing transfection of human embryonic kidney 293 (HEK293) cells. We report that the availability of preformed empty capsids and viral DNA copies per cell is not limiting to the capsid-filling reaction. However, optimal expression of Rep proteins (<240 ± 13 ag per cell) enables enrichment of the filled capsid population (>12% of total capsids/cell) upstream. Our analysis suggests that increased enrichment of filled capsids by regulating the expression of Rep proteins is possible, but at the expense of per-cell capsid titer in a triple plasmid transfection. Our study reveals an intrinsic limitation of scaling rAAV2/5 vector genome (vg) production and underscores the need for approaches that allow for regulating the expression of Rep proteins to maximize vg titer per cell upstream.
ABSTRACT
The in vitro transcription (IVT) reaction used in the production of messenger RNA vaccines and therapies remains poorly understood in quantitative terms. Mechanistic modeling of IVT could inform reaction design, scale-up, and control. In this work, we develop a mechanistic model of IVT that includes nucleation and growth of magnesium pyrophosphate crystals and subsequent agglomeration of crystals and DNA. To help generalize this model to different constructs, a novel quantitative description is included for the rate of transcription as a function of target sequence length, DNA concentration, and T7 RNA polymerase concentration. The model explains previously unexplained trends in IVT data and quantitatively predicts the effect of adding the pyrophosphatase enzyme to the reaction system. The model is validated on additional literature data, showing an ability to predict transcription rates as a function of RNA sequence length.
Subjects
Crystallization, Diphosphates, Genetic Transcription, Diphosphates/metabolism, Diphosphates/chemistry, DNA-Directed RNA Polymerases/genetics, DNA-Directed RNA Polymerases/metabolism, DNA-Directed RNA Polymerases/chemistry, DNA/chemistry, DNA/genetics, DNA/metabolism, Magnesium Compounds/chemistry, Viral Proteins
ABSTRACT
The fifth modeling workshop (5MW) was held in June 2023 at Favrholm, Denmark, and sponsored by the Recovery of Biological Products Conference Series. The goal of the workshop was to assemble modeling practitioners to review and discuss the current state, progress since the fourth mini modeling workshop (4MMW), and gaps and opportunities for development, deployment, and maintenance of models in bioprocess applications. Focus fell into four categories: biophysics and molecular modeling, mechanistic modeling, computational fluid dynamics (CFD), and plant modeling. Highlights of the workshop included significant advancements in applying biophysical/molecular modeling to novel protein constructs, mechanistic models for filtration, and initial forays into CFD modeling of multiphase systems in a bioreactor, mapped strategically to cell line selection and facility fit. A significant impediment to more fully quantitative and calibrated models for biophysics is the lack of large, anonymized datasets. A potential solution would be the use of specific descriptors in a database that would allow detailed analyses without sharing proprietary information. Another gap identified was the lack of a consistent framework for the use of models that are included in or support a regulatory filing beyond the high-level guidance in ICH Q8-Q11. One perspective is that modeling can be viewed as a component or precursor of machine learning (ML) and artificial intelligence (AI). Another outcome was alignment on a key definition of "mechanistic modeling." Feedback from participants was that there had been progress in all of the fields of modeling within the scope of the conference. Some areas (e.g., biophysics and molecular modeling) present opportunities for significant research investment to realize their full impact. However, the need for ongoing research and development across all model types does not preclude their application to support process development, manufacturing, and use in regulatory filings. Analogous to ML and AI, given the current state of the four modeling types, a prospective investment in educating inter-disciplinary subject matter experts (e.g., in data science and chromatography) is essential to advancing the modeling community.
Subjects
Computer Simulation, Biological Models, Pharmaceutical Industry
ABSTRACT
Digitalization has paved the way for new paradigms such as digital shadows and digital twins for fermentation processes, opening the door for real-time process monitoring, control, and optimization. With a digital shadow, real-time model adaptation to accommodate complex metabolic phenomena, such as metabolic shifts, can be monitored. Despite the many benefits of digitalization, its potential has not been fully reached in industry. This study investigates the development of a digital shadow for a fungal fermentation process at Novonesis that is very complex in terms of microbial physiology and pilot-scale fermentation operation, and the challenges thereof. The process has historically been difficult to optimize and control due to a lack of offline measurements and the absence of biomass measurements. Pilot-scale and lab-scale fermentations were conducted for model development and validation. With all available pilot-scale data, a data-driven soft sensor was developed to estimate the main substrate concentration (glucose) with a normalized root mean squared error (N-RMSE) of 2%. This robust data-driven soft sensor was also able to estimate accurately at lab scale (volume < 20× pilot), with an N-RMSE of 7.8%. A hybrid soft sensor was developed by combining the data-driven soft sensor with a mass balance to estimate the glycerol and biomass concentrations on pilot-scale data, with N-RMSEs of 11% and 21%, respectively. A digital shadow modeling framework was developed by coupling a mechanistic model (MM) with the hybrid soft sensor. The digital shadow modeling framework significantly improved predictability compared with the MM alone. This study brings the application of digital shadows closer to industrial implementation, demonstrates the high potential of this type of modeling framework for scale-up, and leads the way to a new generation of in silico-based process development.
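As a hedged illustration of the error metric reported above, the sketch below computes an N-RMSE assuming normalization by the observed range; the study may use a different normalization convention (e.g., by the mean or maximum of the observations).

```python
import math

def n_rmse(predicted, observed):
    # Root mean squared error, normalized by the observed range and
    # reported as a percentage (normalization convention is an assumption).
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))
    return 100.0 * rmse / (max(observed) - min(observed))

# Hypothetical soft-sensor estimates vs. offline glucose measurements:
error_pct = n_rmse([1, 2, 3, 4], [1, 2, 3, 5])
```

A 2% N-RMSE on pilot-scale data, as reported, therefore corresponds to a typical estimation error of about 2% of the observed concentration span.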
Subjects
Bioreactors, Glucose, Fermentation, Bioreactors/microbiology, Glycerol, Biomass
ABSTRACT
Recombinant adeno-associated viral vectors (rAAVs) have become an industry-standard technology in the field of gene therapy, but challenges remain in their biomanufacturing. One of the biggest challenges is the removal of capsid species other than that containing the gene of interest. In this work, we develop a mechanistic model for the removal of empty capsids-those that contain no genetic material-and the enrichment of full rAAV using anion-exchange membrane chromatography. The mechanistic model was calibrated using linear gradient experiments, resulting in good agreement with the experimental data. The model was then applied to optimize the purification process by maximizing yield while studying the impact of mobile phase salt concentration and pH, isocratic wash and elution length, flow rate, percent-full (purity) requirement, loading density (challenge), and the use of single-step or two-step elution modes. A solution from the optimization with a purity of 90% and a recovery yield of 84% was selected and successfully validated: the model predicted the recovery yield with remarkable fidelity and found process conditions that led to significant enrichment. To the best of our knowledge, this is the first case study applying de novo mechanistic modeling to the enrichment of full capsids in rAAV manufacturing, and it demonstrates the potential of mechanistic modeling in rAAV process development.
Subjects
Dependovirus, Genetic Vectors, Ion Exchange Chromatography/methods, Dependovirus/genetics, Genetic Therapy, Capsid/chemistry
ABSTRACT
PURPOSE: Recently, there has been rapid development in model-informed drug development, which has the potential to reduce animal experiments and accelerate drug discovery. Physiologically based pharmacokinetic (PBPK) and machine learning (ML) models are commonly used in early drug discovery to predict drug properties. However, basic PBPK models require a large number of molecule-specific inputs from in vitro experiments, which hinders the efficiency and accuracy of these models. To address this issue, this paper introduces a new computational platform that combines ML and PBPK models. The platform predicts molecule PK profiles with high accuracy and without the need for experimental data. METHODS: This study developed a whole-body PBPK model and ML models of plasma protein fraction unbound (fup), Caco-2 cell permeability, and total plasma clearance to predict the PK of small molecules after intravenous administration. Pharmacokinetic profiles were simulated using a "bottom-up" PBPK modeling approach with ML inputs. Additionally, 40 compounds were used to evaluate the platform's accuracy. RESULTS: Results showed that the ML-PBPK model predicted the area under the concentration-time curve (AUC) with 65.0% accuracy within a 2-fold range, which was higher than using in vitro inputs with 47.5% accuracy. CONCLUSION: The ML-PBPK model platform provides high accuracy in prediction and reduces the number of experiments and time required compared to traditional PBPK approaches. The platform successfully predicts human PK parameters without in vitro and in vivo experiments and can potentially guide early drug discovery and development.
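The 2-fold accuracy criterion used above can be illustrated as follows; the helper name and data are hypothetical, but the metric itself (the fraction of predictions whose ratio to the observed value lies within [1/2, 2]) is standard in PK model evaluation.

```python
def fraction_within_fold(predicted, observed, fold=2.0):
    # Percentage of predictions within a `fold`-range of the observed value,
    # i.e. 1/fold <= predicted/observed <= fold.
    hits = sum(1 for p, o in zip(predicted, observed) if 1.0 / fold <= p / o <= fold)
    return 100.0 * hits / len(observed)

# Hypothetical AUC predictions vs. observations (arbitrary units):
accuracy_pct = fraction_within_fold([1, 3, 10], [2, 2, 2])
```

Under this definition, the reported 65.0% vs. 47.5% comparison counts how many of the 40 evaluation compounds fall inside the 2-fold band for each input strategy.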
Subjects
Machine Learning, Biological Models, Humans, Caco-2 Cells, Computer Simulation, Pharmacokinetics, Drug Discovery/methods, Area Under Curve, Intravenous Administration, Male, Pharmaceutical Preparations/metabolism, Blood Proteins/metabolism
ABSTRACT
Development of a Quantitative Systems Pharmacology (QSP) model is a long process with many iterative steps. The lack of standard practices for publishing QSP models has resulted in limited model reproducibility within the field. Multiple studies have identified model reproducibility as a major challenge, especially for QSP models. This work aimed to investigate the causes of QSP model reproducibility issues and to suggest standard practices as a potential solution to ensure QSP models are reproducible. In addition, a protocol is suggested as guidance towards a better publication strategy across journals, with the aim of preserving QSP knowledge.
ABSTRACT
Many animals restrict their movements to a characteristic home range. This constrained pattern of space use is thought to result from the foraging benefits of memorizing the locations and quality of heterogeneously distributed resources. However, due to the confounding effects of sensory perception, the role of memory in home-range movement behavior lacks definitive evidence in the wild. Here, we analyze the foraging decisions of a large mammal during a field resource manipulation experiment designed to disentangle the effects of memory and perception. We parametrize a mechanistic model of spatial transitions using experimental data to quantify the cognitive processes underlying animal foraging behavior and to predict how individuals respond to resource heterogeneity in space and time. We demonstrate that roe deer (Capreolus capreolus) rely on memory, not perception, to track the spatiotemporal dynamics of resources within their home range. Roe deer foraging decisions were primarily based on recent experience (half-lives of 0.9 and 5.6 d for attribute and spatial memory, respectively), enabling them to adapt to sudden changes in resource availability. The proposed memory-based model was able to both quantify the cognitive processes underlying roe deer behavior and accurately predict how they shifted resource use during the experiment. Our study highlights the fact that animal foraging decisions are based on incomplete information on the locations of available resources, a factor that is critical to developing accurate predictions of animal spatial behavior but is typically not accounted for in analyses of animal movement in the wild.
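The reported memory half-lives imply an exponential down-weighting of past foraging experience. A minimal sketch, assuming a simple exponential-decay kernel (the paper's full spatial-transition model is considerably richer):

```python
def memory_weight(days_since_visit, half_life_days):
    # Weight of a past foraging experience under exponential decay:
    # the weight halves every `half_life_days` (e.g., 0.9 d for attribute
    # memory, 5.6 d for spatial memory in the reported estimates).
    return 0.5 ** (days_since_visit / half_life_days)

# A patch visited 0.9 d ago under a 0.9 d half-life carries half its
# original weight in the animal's decision:
attribute_weight = memory_weight(0.9, 0.9)
```

Under this kernel, short half-lives mean recent experience dominates, which is consistent with the finding that roe deer adapt quickly to sudden changes in resource availability.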
Subjects
Deer/physiology, Feeding Behavior, Memory, Animals, Cognition, Decision Making, Movement
ABSTRACT
Far from a uniform band, the biodiversity found across Earth's tropical moist forests varies widely between the high diversity of the Neotropics and Indomalaya and the relatively lower diversity of the Afrotropics. Explanations for this variation across regions, the "pantropical diversity disparity" (PDD), remain contentious, due to the difficulty of teasing apart the effects of contemporary climate and paleoenvironmental history. Here, we assess the ubiquity of the PDD in over 150,000 species of terrestrial plants and vertebrates and investigate the relationship between present-day climate and patterns of species richness. We then investigate the consequences of paleoenvironmental dynamics for the emergence of biodiversity gradients using a spatially explicit model of diversification coupled with paleoenvironmental and plate tectonic reconstructions. Contemporary climate is insufficient to explain the PDD; instead, a simple model of diversification and temperature niche evolution coupled with paleoaridity constraints successfully reproduces the variation in species richness and phylogenetic diversity seen repeatedly among plant and animal taxa, suggesting a prevalent role of paleoenvironmental dynamics in combination with niche conservatism. The model indicates that high biodiversity in Neotropical and Indomalayan moist forests is driven by complex macroevolutionary dynamics associated with mountain uplift. In contrast, lower diversity in Afrotropical forests is associated with lower speciation rates and higher extinction rates driven by sustained aridification over the Cenozoic. Our analyses provide a mechanistic understanding of the emergence of uneven diversity in tropical moist forests across 110 Ma of Earth's history, highlighting the importance of deep-time paleoenvironmental legacies in determining biodiversity patterns.
Subjects
Biodiversity, Forests, Tropical Climate, Animals, Biological Evolution, Planet Earth
ABSTRACT
Research over the past two decades has made substantial inroads into our understanding of somatic mutations. Recently, these studies have focused on understanding their presence in homeostatic tissue. In parallel, agent-based mechanistic models have emerged as an important tool for understanding somatic mutation in tissue; yet no common methodology currently exists to provide base-pair resolution data for these models. Here, we present Gattaca as the first method for introducing and tracking somatic mutations at the base-pair resolution within agent-based models that typically lack nuclei. With nuclei that incorporate human reference genomes, mutational context, and sequence coverage/error information, Gattaca is able to realistically evolve sequence data, facilitating comparisons between in silico cell tissue modeling with experimental human somatic mutation data. This user-friendly method, incorporated into each in silico cell, allows us to fully capture somatic mutation spectra and evolution.
Subjects
Human Genome, Neoplasms, Clonal Evolution, Humans, Mutation, Neoplasms/genetics
ABSTRACT
The oxygen isotope composition (δ18O) of tree-ring cellulose is used to evaluate tree physiological responses to climate, but its interpretation is still limited by the complexity of the isotope fractionation pathways. We assessed the relative contribution of seasonal needle and xylem water δ18O variations to the intra-annual tree-ring cellulose δ18O signature of larch trees at two sites with contrasting soil water availability in the Swiss Alps. We combined biweekly δ18O measurements of soil water, needle water, and twig xylem water with intra-annual δ18O measurements of tree-ring cellulose, xylogenesis analysis, and mechanistic and structural equation modeling. Intra-annual cellulose δ18O values resembled mean source water δ18O levels better than needle water δ18O. Large parts of the rings were formed under high proportional exchange with unenriched xylem water (pex). Maximum pex values were reached in August and imprinted on sections at 50-75% of the ring. High pex values were associated with periods of high atmospheric evaporative demand (vapor pressure deficit, VPD). While VPD governed needle water δ18O variability, we estimated a limited Péclet effect at both sites. Due to the variable pex, source water has a strong influence over large parts of the intra-annual tree-ring cellulose δ18O variations, potentially masking signals coming from needle-level processes.
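The role of pex can be illustrated with a simplified two-pool mixing sketch, assuming a constant biosynthetic fractionation term (commonly taken as roughly 27 permil); this is a caricature of Barbour-type cellulose models, which include additional terms (e.g., a Péclet-modified leaf water signal) not reproduced here.

```python
def cellulose_delta18o(delta_source, delta_needle, p_ex, epsilon=27.0):
    # Two-pool mixing: a fraction p_ex of the oxygen exchanges with
    # unenriched source/xylem water, while the remainder carries the
    # evaporatively enriched needle water signal; epsilon is the assumed
    # constant biosynthetic fractionation (permil).
    return p_ex * delta_source + (1.0 - p_ex) * delta_needle + epsilon

# With high p_ex (hypothetical values, permil), the cellulose signal
# tracks source water rather than needle water:
high_exchange = cellulose_delta18o(-10.0, 5.0, p_ex=0.9)
```

In this toy form, the higher the pex during ring formation, the more the cellulose δ18O reflects source water, which is the qualitative result the study reports for large parts of the ring.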
Subjects
Trees, Water, Trees/metabolism, Water/metabolism, Oxygen Isotopes/metabolism, Xylem/metabolism, Cellulose/metabolism, Soil/chemistry, Carbon Isotopes/metabolism
ABSTRACT
An optimal purification process for biopharmaceutical products is important to meet strict safety regulations and for economic benefit. To find the global optimum, it is desirable to screen the overall design space. Advanced model-based approaches enable screening of a broad range of the design space, in contrast to traditional statistical or heuristic approaches. However, chromatographic mechanistic modeling (MM), one of these advanced model-based approaches, can be speed-limiting for flowsheet optimization, which evaluates every purification possibility (e.g., the type and order of purification techniques and their operating conditions). Therefore, we propose using artificial neural networks (ANNs) during global optimization to select the most promising flowsheets, reducing the number of flowsheets for final local optimization and consequently the overall optimization time. Employing ANNs during global optimization reduced the number of flowsheets from 15 to only 3. From these three, one flowsheet was optimized locally, and similar final results were found when using the global outcome of either the ANN or the MM as the starting condition. Moreover, the overall flowsheet optimization time was reduced by 50% when using ANNs during global optimization. This approach accelerates early purification process design; moreover, it is generic, flexible, and independent of the type of sample material.
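The surrogate-assisted screening pattern described above can be sketched generically; the candidate list and toy scoring function below are hypothetical, standing in for an ANN trained to approximate the mechanistic model's objective.

```python
def screen_flowsheets(candidates, surrogate_cost, keep=3):
    # Rank all candidate flowsheets with a cheap surrogate (e.g., a trained
    # ANN) and retain only the best few for expensive mechanistic
    # local optimization.
    return sorted(candidates, key=surrogate_cost)[:keep]

# Hypothetical example: 15 candidate flowsheets scored by a toy surrogate
# in which flowsheet 7 happens to be optimal.
flowsheets = list(range(15))
shortlist = screen_flowsheets(flowsheets, lambda f: abs(f - 7))
```

The design choice mirrors the abstract: the surrogate does the broad, cheap global pass (15 candidates down to 3 here), and the slow mechanistic model is reserved for the final local refinement.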
ABSTRACT
Accurate prediction of human pharmacokinetics (PK) remains one of the key objectives of drug metabolism and PK (DMPK) scientists in drug discovery projects. This is typically performed by using in vitro-in vivo extrapolation (IVIVE) based on mechanistic PK models. In recent years, machine learning (ML), with its ability to harness patterns from previous outcomes to predict future events, has gained increased popularity in application to absorption, distribution, metabolism, and excretion (ADME) sciences. This study compares the performance of various ML and mechanistic models for the prediction of human IV clearance for a large (645) set of diverse compounds with literature human IV PK data, as well as measured relevant in vitro end points. ML models were built using multiple approaches for the descriptors: (1) calculated physical properties and structural descriptors based on chemical structure alone (classical QSAR/QSPR); (2) in vitro measured inputs only, with no structure-based descriptors (ML IVIVE); and (3) in silico ML IVIVE using in silico model predictions for the in vitro inputs. For the mechanistic models, well-stirred and parallel-tube liver models were considered, with and without the use of empirical scaling factors and with and without renal clearance. The best ML model for the prediction of in vivo human intrinsic clearance (CLint) was an in vitro ML IVIVE model using only six in vitro inputs, with an average absolute fold error (AAFE) of 2.5. The best mechanistic model used the parallel-tube liver model with empirical scaling factors, resulting in an AAFE of 2.8. The corresponding mechanistic model with full in silico inputs achieved an AAFE of 3.3. These relative performances were confirmed with the prediction of 16 Pfizer drug candidates that were not part of the original data set. Results show that ML IVIVE models are comparable or superior to their best mechanistic counterparts. We also show that ML IVIVE models can be used to derive insights into factors for the improvement of mechanistic PK prediction.
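The AAFE metric used to compare the models is conventionally computed as 10 raised to the mean absolute log10 fold error; a minimal sketch with hypothetical data:

```python
import math

def aafe(predicted, observed):
    # Average absolute fold error: 10 ** mean(|log10(pred/obs)|).
    # An AAFE of 2.5 means predictions are off by 2.5-fold on average,
    # in either direction.
    logs = [abs(math.log10(p / o)) for p, o in zip(predicted, observed)]
    return 10 ** (sum(logs) / len(logs))

# One 10-fold over-prediction and one 10-fold under-prediction both
# contribute a full log unit, giving an AAFE of 10:
fold_error = aafe([10.0, 1.0], [1.0, 10.0])
```

Because the metric works in log space, over- and under-predictions are penalized symmetrically, which is why it is preferred over plain relative error for clearance spanning several orders of magnitude.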
Subjects
Body Fluids, Humans, Computer Simulation, Drug Discovery, Kinetics, Machine Learning, Biological Models, Metabolic Clearance Rate
ABSTRACT
Dry powder inhaler (DPI) products are commonly formulated as a mixture of micronized drug particles and large carrier particles, with or without additional fine particle excipients, followed by final powder filling into dose containment systems such as capsules, blisters, or reservoirs. DPI product manufacturing consists of a series of unit operations, including particle size reduction, blending, and filling. This review provides an overview of the relevant critical process parameters used for jet milling, high-shear blending, and dosator/drum capsule filling operations across commonly utilized instruments. Further, this review describes recent achievements in the application of empirical and mechanistic models, especially discrete element method (DEM) simulation, to DPI process development. Although to date only limited modeling/simulation work has been accomplished, from the authors' perspective, process design and development are destined to be more modeling/simulation driven, with the emphasis on evaluating the impact of material attributes/process parameters on process performance. The advancement of computational power is expected to enable modeling/simulation approaches to tackle more complex problems with better accuracy when dealing with real-world DPI process operations.
Subjects
Drug Carriers, Dry Powder Inhalers, Powders, Drug Compounding/methods, Inhalation Administration, Computer Simulation, Particle Size, Aerosols
ABSTRACT
The COVID-19 pandemic has underscored the need to understand the dynamics of SARS-CoV-2 respiratory infection and protection provided by the immune response. SARS-CoV-2 infections are characterized by a particularly high viral load, and further by the small number of inhaled virions sufficient to generate a high viral titer in the nasal passage a few days after exposure. SARS-CoV-2 specific antibodies (Ab), induced from vaccines, previous infection, or inhaled monoclonal Ab, have proven effective against SARS-CoV-2 infection. Our goal in this work is to model the protective mechanisms that Ab can provide and to assess the degree of protection from individual and combined mechanisms at different locations in the respiratory tract. Neutralization, in which Ab bind to virion spikes and inhibit them from binding to and infecting target cells, is one widely reported protective mechanism. A second mechanism of Ab protection is muco-trapping, in which Ab crosslink virions to domains on mucin polymers, effectively immobilizing them in the mucus layer. When muco-trapped, the continuous clearance of the mucus barrier by coordinated ciliary propulsion entrains the trapped viral load toward the esophagus to be swallowed. We model and simulate the protection provided by either and both mechanisms at different locations in the respiratory tract, parametrized by the Ab titer and binding-unbinding rates of Ab to viral spikes and mucin domains. Our results illustrate limits in the degree of protection by neutralizing Ab alone, the powerful protection afforded by muco-trapping Ab, and the potential for dual protection by muco-trapping and neutralizing Ab to arrest a SARS-CoV-2 infection. This manuscript was submitted as part of a theme issue on "Modelling COVID-19 and Preparedness for Future Pandemics".
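The idea of dual protection can be caricatured with a single hedged expression: if neutralization and muco-trapping are treated as independent mechanisms (a simplifying assumption, not the paper's spatially resolved model), a virion must evade both to reach a target cell.

```python
def escape_probability(p_neutralized, p_trapped):
    # Toy independence assumption: a virion infects only if it is neither
    # neutralized by Ab bound to its spikes nor crosslinked to mucins
    # and cleared by ciliary transport.
    return (1.0 - p_neutralized) * (1.0 - p_trapped)

# Hypothetical per-virion probabilities: neutralization alone vs. adding
# muco-trapping at the same per-mechanism effectiveness.
alone = escape_probability(0.9, 0.0)
combined = escape_probability(0.9, 0.9)
```

Even this crude multiplicative picture conveys the abstract's qualitative conclusion: combining the two mechanisms suppresses viral escape far more than either alone, although the real degree of protection depends on Ab titers and binding-unbinding kinetics.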
Subjects
COVID-19, SARS-CoV-2, Humans, Pandemics, Viral Antibodies, Respiratory System, Mucins
ABSTRACT
The SARS-CoV-2 coronavirus continues to evolve, with scores of mutations of the spike, membrane, envelope, and nucleocapsid structural proteins that impact pathogenesis. Infection data from nasal swabs, nasal PCR assays, upper respiratory samples, ex vivo cell cultures, and nasal epithelial organoids reveal extreme variability in SARS-CoV-2 RNA titers within and between the variants. Some of this variability is attributable to clinical testing protocols and experimental controls. Here we focus on nasal viral load sensitivity arising from the timing of sample collection relative to the onset of infection and from heterogeneity in the kinetics of cellular infection, uptake, replication, and shedding of viral RNA copies. The sources of between-variant variability are likely SARS-CoV-2 structural protein mutations, whereas within-variant population variability is likely due to heterogeneity in the cellular response to that particular variant. With the physiologically faithful, agent-based mechanistic model of inhaled exposure and infection from (Chen et al., 2022), we perform statistical sensitivity analyses of the progression of nasal viral titers in the first 0-48 h post infection, focusing on three kinetic mechanisms. Model simulations reveal that shorter latency times of infected cells (spanning cellular uptake and viral RNA replication, up to the onset of viral RNA shedding) exponentially accelerate nasal viral load. Further, the rate of infectious RNA copies shed per day has a proportional influence on nasal viral load. Finally, there is a very weak, negative correlation of viral load with the probability of infection per virus-cell encounter, the model proxy for spike-receptor binding affinity.
Subjects
COVID-19, SARS-CoV-2, Humans, SARS-CoV-2/genetics, Viral RNA/genetics, Viral Load, COVID-19 Testing
ABSTRACT
Drying is one of the oldest and most widely used methods of food preservation. It reduces the availability of moisture and inhibits microbial and enzymatic spoilage in food products. Foam mat drying is a mild drying technique used for semiliquid and liquid foodstuffs; it is useful for heat-sensitive and sticky liquid food products. In this process, liquid food is converted into foam using surfactant additives, which can be a foaming agent or a foam stabilizer. These additives are surface-active compounds of vegetative and animal origin. The foamed material is then convectively dried using hot air. Foam mat drying is an efficient and economical technique. With the emergence of different hybrid techniques, such as foam mat freeze drying, foamed spray drying, foamed vacuum drying, and microwave-assisted foam mat drying, the physical, chemical, and functional properties of the powders have been enhanced manyfold. These strategies have shown very promising results in terms of cost and time efficiency in almost all cases, barring a few exceptions. This review article attempts to comprehensively summarize the mechanisms dictating the foam mat drying process, novel technological tools for modeling, mathematical and computational modeling, the effects of various foaming additives, and the various hybrid techniques employed in foam mat drying.