ABSTRACT
Minimizing a company's operational risk by optimizing the performance of the manufacturing and distribution supply chain is a complex task that involves multiple elements, each with its own supply-line constraints. Traditional approaches to optimization often assume determinism as the underlying principle. However, this paper, adopting an entropy approach, emphasizes the significance of subjective and objective uncertainty in achieving optimized decisions by incorporating stochastic fluctuations into the supply chain structure. Stochasticity, representing randomness, quantifies the level of uncertainty or risk involved. In this study, we focus on a processing production plant as a model for a chain of operations and supply chain actions. We consider the stochastically varying production and transportation costs from the site to the plant, as well as from the plant to the customer base. Through stochastic optimization, we demonstrate that the plant producer can benefit from improved financial outcomes by setting higher sale prices while simultaneously lowering optimized production costs. This can be accomplished by selectively choosing producers whose production-cost probability density function follows a Pareto distribution. Notably, a lower Pareto exponent yields better supply chain cost optimization predictions. Alternatively, a Gaussian stochastic fluctuation may be a more suitable choice when trading off optimization against simplicity: although this may result in slightly less optimal performance, it offers advantages in ease of implementation and computational efficiency.
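As a rough illustration of the distributional comparison discussed above (not the authors' actual optimization model), the following sketch draws production-cost fluctuations from Pareto and Gaussian distributions and summarizes their spread and tail behaviour; the base cost, exponents and noise amplitude are hypothetical placeholders.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 100_000
    base_cost = 100.0        # hypothetical deterministic production cost (arbitrary units)

    def summarize(name, costs):
        """Print a few summary statistics of a simulated cost distribution."""
        print(f"{name:>18}: mean={costs.mean():7.2f}  std={costs.std():7.2f}  "
              f"95th pct={np.percentile(costs, 95):7.2f}")

    # Pareto-distributed fluctuations: the exponent alpha controls the tail weight.
    for alpha in (1.5, 2.5, 3.5):
        costs = base_cost * (1.0 + rng.pareto(alpha, n_samples))
        summarize(f"Pareto alpha={alpha}", costs)

    # Gaussian fluctuations: simpler to handle, with much lighter tails.
    summarize("Gaussian sigma=10", base_cost + rng.normal(0.0, 10.0, n_samples))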
ABSTRACT
In our recently proposed stochastic version of discretized kinetic theory, the exchange of wealth in a society is modelled through a large system of Langevin equations. The deterministic part of the equations is based on non-linear transition probabilities between income classes. The noise terms can be additive, multiplicative or mixed, each with either a white or an Ornstein-Uhlenbeck spectrum. The most important measured correlations are those between the Gini inequality index G and social mobility M, between total income and G, and between M and total income. We describe numerical results concerning these correlations and a quantity that gives the average stochastic deviation from the equilibrium solutions as a function of the noise amplitude.
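A minimal sketch of the kind of measurement described above: simulate an ensemble of incomes driven by additive and multiplicative white noise and compute a Gini index together with a rank-based mobility proxy. The mean-reverting drift below is a placeholder standing in for the model's non-linear transition probabilities, and all parameter values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    def gini(x):
        """Gini inequality index of a non-negative sample."""
        x = np.sort(x)
        n = x.size
        cum = np.cumsum(x)
        return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

    # Toy Langevin dynamics: a mean-reverting drift (placeholder, not the model's
    # nonlinear transition probabilities) plus additive and multiplicative white noise.
    n_agents, n_steps, dt = 1000, 2000, 0.01
    sigma_add, sigma_mul = 0.05, 0.10
    w0 = rng.exponential(1.0, n_agents)          # initial incomes (placeholder)
    w = w0.copy()
    for _ in range(n_steps):
        drift = -0.1 * (w - w.mean())
        noise = (sigma_add * rng.normal(size=n_agents)
                 + sigma_mul * w * rng.normal(size=n_agents))
        w = np.clip(w + drift * dt + np.sqrt(dt) * noise, 0.0, None)

    # Crude mobility proxy: decorrelation of income ranks between start and end.
    rank0 = np.argsort(np.argsort(w0))
    rank1 = np.argsort(np.argsort(w))
    M = 1.0 - np.corrcoef(rank0, rank1)[0, 1]
    print(f"Gini index G = {gini(w):.3f},  mobility proxy M = {M:.3f}")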
ABSTRACT
Linguistic analysis of protein sequences is an underexploited technique. Here, we capitalize on the concept of the lipogram to characterize sequences at the proteome levels. A lipogram is a literary composition which omits one or more letters. A protein lipogram likewise omits one or more types of amino acid. In this article, we establish a usable terminology for the decomposition of a sequence collection in terms of the lipogram. Next, we characterize Uniref50 using a lipogram decomposition. At the global level, protein lipograms exhibit power-law properties. A clear correlation with metabolic cost is seen. Finally, we use the lipogram construction to assign proteomes to the four branches of the tree-of-life: archaea, bacteria, eukaryotes and viruses. We conclude from this pilot study that the lipogram demonstrates considerable potential as an additional tool for sequence analysis and proteome classification.
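A minimal sketch of what a lipogram decomposition can look like in practice: list which of the 20 standard amino acids are absent from each sequence and group sequences by that omission signature. The example sequences are made up, and the terminology is only loosely modeled on the article's usage; real work would stream a collection such as UniRef50.

    from collections import Counter

    AMINO_ACIDS = set("ACDEFGHIKLMNPQRSTVWY")   # the 20 standard residues

    def lipogram_signature(seq):
        """Return the sorted tuple of amino-acid types absent from `seq`."""
        return tuple(sorted(AMINO_ACIDS - set(seq.upper())))

    # Made-up example sequences.
    sequences = {
        "seq1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        "seq2": "GGGSSGGGSSGGGSS",
        "seq3": "MWWCHHCPPMW",
    }

    signatures = {name: lipogram_signature(s) for name, s in sequences.items()}
    for name, sig in signatures.items():
        print(f"{name}: omits {len(sig)} residue types -> {''.join(sig)}")

    # Group sequences that share the same omission signature.
    groups = Counter(signatures.values())
    print("distinct lipogram classes:", len(groups))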
Subject(s)
Amino Acid Sequence , Proteins/chemistry , Proteome/classification , Archaea , Bacteria , Eukaryota , Evolution, Molecular , Pilot Projects , Viruses
ABSTRACT
MOTIVATION: Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. RESULTS: Here, we offer a fresh, statistical physics-based perspective on the question of alignment-free comparison, adapting results from 'first passage probability distributions' to summarize the statistics of ensemble-averaged amino acid propensity values. In this article, we introduce and elaborate this approach.
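A schematic sketch of the general idea (not the authors' exact statistic): map a sequence to a per-residue propensity profile, accumulate it as a random-walk-like signal, and record first-passage times across a threshold as alignment-free descriptors. The propensity scale used here is the Kyte-Doolittle hydropathy scale, and the threshold and example sequence are illustrative placeholders.

    import numpy as np

    # Kyte-Doolittle hydropathy scale, used here as an illustrative propensity scale.
    HYDROPATHY = {
        "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5, "E": -3.5,
        "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8,
        "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2,
    }

    def first_passage_times(seq, threshold=3.0):
        """Steps at which the cumulative propensity walk first crosses +/- `threshold`;
        the walk is reset after each crossing."""
        times, walk, start = [], 0.0, 0
        for i, aa in enumerate(seq.upper()):
            walk += HYDROPATHY.get(aa, 0.0)
            if abs(walk) >= threshold:
                times.append(i - start + 1)
                walk, start = 0.0, i + 1
        return np.array(times)

    seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"          # made-up example sequence
    fpt = first_passage_times(seq)
    print("first-passage times:", fpt)
    print("summary (mean, std):", fpt.mean(), fpt.std())   # alignment-free descriptors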
Subject(s)
Sequence Analysis, Protein/methods , Algorithms , Data Interpretation, Statistical , Physics , Sequence Alignment
ABSTRACT
In this study, we present an immuno-epidemic model to understand mitigation options during an epidemic outbreak. The model incorporates comorbidity and multiple vaccine doses through a system of coupled integro-differential equations to analyze the epidemic rate and intensity from a knowledge of the basic reproduction number and time-distributed rate functions. Our modeling results show that the interval between vaccine doses is a key control parameter that can be tuned to significantly influence disease spread. We show that multiple doses induce a hysteresis effect in immunity levels that offers a better mitigation alternative to frequent vaccination, which is less cost-effective and more intrusive. Optimal dosing intervals, which emphasize the cost-effectiveness of each vaccination effort and are determined by factors such as the level of immunity and the efficacy of vaccines against different strains, appear to be crucial in disease management. The model is sufficiently generic that it can be extended to accommodate specific disease forms.
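A highly simplified compartmental sketch (not the paper's integro-differential system) showing how a dosing-interval parameter can be exposed as a control knob: first-dosed individuals receive a second dose a fixed number of days later, with partial immunity waning in between. The compartment structure, rates and interval values below are hypothetical placeholders.

    def simulate(tau_dose, days=300, beta=0.25, gamma=0.10, nu=0.01, waning=0.005):
        """Toy daily S/I/R/V1/V2 update with a second dose administered `tau_dose`
        days after the first; all rates are hypothetical placeholders."""
        S, I, R, V1, V2 = 0.99, 0.01, 0.0, 0.0, 0.0
        first_dose = [0.0] * days      # cohort sizes awaiting their second dose
        peak = 0.0
        for t in range(days):
            new_inf   = beta * S * I             # new infections today
            new_dose1 = nu * S                   # first doses given today
            recovered = gamma * I
            waned     = waning * V1              # partial immunity wanes back to S
            boosted   = first_dose[t - tau_dose] if t >= tau_dose else 0.0
            boosted   = min(boosted, V1)
            S  += waned - new_inf - new_dose1
            I  += new_inf - recovered
            R  += recovered
            V1 += new_dose1 - waned - boosted
            V2 += boosted
            first_dose[t] = new_dose1
            peak = max(peak, I)
        return peak

    for tau in (14, 28, 56, 112):
        print(f"dose interval {tau:>3} days -> peak infected fraction {simulate(tau):.3f}")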
Subject(s)
Vaccine Efficacy , Humans , Vaccination/methods , COVID-19/prevention & control , COVID-19/immunology , COVID-19/virology , COVID-19 Vaccines/immunology , COVID-19 Vaccines/administration & dosage , SARS-CoV-2/immunology , Immunization Schedule , Basic Reproduction Number
ABSTRACT
Virtual screening (VS) is a computational strategy that uses automated in silico protein docking, inter alia, to rank potential ligands, or by extension to rank protein-ligand pairs, thereby identifying potential drug candidates. Most docking methods use preferred sets of physicochemical descriptors (PCDs) to model the interactions between host and guest molecules. Thus, conventional VS is often data-specific, method-dependent and of demonstrably differing utility in identifying candidate drugs. This study proposes four universality classes of novel consensus scoring (CS) algorithms that combine docking scores derived from ten docking programs (ADFR, DOCK, Gemdock, Ledock, PLANTS, PSOVina, QuickVina2, Smina, AutoDock Vina and VinaXB), using decoys from the DUD-E repository ( http://dude.docking.org/ ) against 29 MRSA-oriented targets, to create a general VS formulation that can identify active ligands for any suitable protein target. Our results demonstrate that CS provides improved ligand-protein docking fidelity when compared to individual docking platforms. This approach requires only a small number of docking combinations and can serve as a viable and parsimonious alternative to more computationally expensive docking approaches. Predictions from our CS algorithm are compared against independent machine learning evaluations using the same docking data, complementing the CS outcomes. Our method is a reliable approach for identifying protein targets and high-affinity ligands that can be tested as high-probability candidates for drug repositioning.
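A minimal rank-based consensus-scoring sketch (one simple CS variant, not necessarily one of the four universality classes proposed in the paper): docking scores from several programs are converted to per-program ranks and averaged. The score matrix below is random placeholder data standing in for real docking output.

    import numpy as np

    rng = np.random.default_rng(2)
    programs = ["ADFR", "DOCK", "Gemdock", "Ledock", "PLANTS",
                "PSOVina", "QuickVina2", "Smina", "Vina", "VinaXB"]
    n_ligands = 200

    # Placeholder docking scores (more negative = better), one column per program.
    scores = rng.normal(-7.0, 1.5, size=(n_ligands, len(programs)))

    # Rank-average consensus: rank ligands within each program, then average ranks.
    ranks = scores.argsort(axis=0).argsort(axis=0)        # 0 = best (most negative)
    consensus = ranks.mean(axis=1)
    top10 = np.argsort(consensus)[:10]
    print("top-10 ligand indices by consensus rank:", top10)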
Subject(s)
Algorithms , Proteins , Ligands , Consensus , Proteins/chemistry , Molecular Docking Simulation , Protein Binding
ABSTRACT
Extracting "high ranking" or "prime protein targets" (PPTs) as potent MRSA drug candidates from a given set of ligands is a key challenge in efficient molecular docking. This study combines protein-versus-ligand matching molecular docking (MD) data extracted from 10 independent molecular docking (MD) evaluations - ADFR, DOCK, Gemdock, Ledock, Plants, Psovina, Quickvina2, smina, vina, and vinaxb to identify top MRSA drug candidates. Twenty-nine active protein targets (APT) from the enhanced DUD-E repository ( http://DUD-E.decoys.org ) are matched against 1040 ligands using "forward modeling" machine learning for initial "data mining and modeling" (DDM) to extract PPTs and the corresponding high affinity ligands (HALs). K-means clustering (KMC) is then performed on 400 ligands matched against 29 PTs, with each cluster accommodating HALs, and the corresponding PPTs. Performance of KMC is then validated against randomly chosen head, tail, and middle active ligands (ALs). KMC outcomes have been validated against two other clustering methods, namely, Gaussian mixture model (GMM) and density based spatial clustering of applications with noise (DBSCAN). While GMM shows similar results as with KMC, DBSCAN has failed to yield more than one cluster and handle the noise (outliers), thus affirming the choice of KMC or GMM. Databases obtained from ADFR to mine PPTs are then ranked according to the number of the corresponding HAL-PPT combinations (HPC) inside the derived clusters, an approach called "reverse modeling" (RM). From the set of 29 PTs studied, RM predicts high fidelity of 5 PPTs (17%) that bind with 76 out of 400, i.e., 19% ligands leading to a prediction of next-generation MRSA drug candidates: PPT2 (average HPC is 41.1%) is the top choice, followed by PPT14 (average HPC 25.46%), and then PPT15 (average HPC 23.12%). This algorithm can be generically implemented irrespective of pathogenic forms and is particularly effective for sparse data.
Subject(s)
Drug Design , Proteins , Molecular Docking Simulation , Algorithms , Machine Learning
ABSTRACT
We analyze conflict between a citizenry and an insurgent group over a fixed resource such as land. The citizenry has an elected leader who proposes a division such that the lower the land ceded to the insurgents, the higher the cost of conflict. Leaders differ in ability and ideology: the higher the leader's ability, the lower the cost of conflict, and the more hawkish the leader, the higher his utility from retaining land. We show that conflict arises from the political process, with re-election motives causing leaders to cede too little land in order to signal their ability. We also show that when the rents of office are high, the political equilibrium and the second best diverge; in particular, the policy under the political equilibrium is more hawkish than the second best. When both ideology and ability are unknown, we provide a plausible condition under which the probability of re-election increases in the leader's hawkishness, thereby providing an explanation for why hawkish politicians may have a natural advantage under the electoral process.
Subject(s)
Politics
ABSTRACT
The devastating trail of Covid-19 is characterized by one of the highest mortality-to-infected ratios for a pandemic. Restricted therapeutic options and early-stage vaccination still render social exclusion through lockdown the key containment mode. To understand the dynamics, we propose PHIRVD, a mechanistic infection propagation model that uses Bayesian Markov Chain Monte Carlo machine learning to infer the evolution of six infection stages, namely healthy susceptible (H), predisposed comorbid susceptible (P), infected (I), recovered (R), herd immunized (V) and mortality (D), providing a highly reliable mortality prediction profile for 18 countries at varying stages of lockdown. Trained on data from 10 February to 29 June 2020, PHIRVD accurately predicts mortality profiles up to November 2020, including the second-wave kinetics. The model also suggests the mortality-to-infection ratio as a more dynamic pandemic descriptor than the reproduction number. PHIRVD establishes the importance of early, prolonged but strategic lockdown to contain future relapse, complementing the impact of future vaccines.
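A schematic six-compartment ODE sketch following the stage names listed above (H, P, I, R, V, D). The transition structure and rate values are hypothetical placeholders, and the actual PHIRVD model infers its parameters with Bayesian MCMC rather than fixing them as done here.

    import numpy as np
    from scipy.integrate import solve_ivp

    def phirvd_like(t, y, beta_h=0.25, beta_p=0.40, gamma=0.10, mu=0.01, v_rate=0.002):
        """Toy stage structure loosely following PHIRVD's compartments:
        H healthy susceptible, P predisposed comorbid susceptible, I infected,
        R recovered, V herd-immunized, D mortality. Rates are placeholders."""
        H, P, I, R, V, D = y
        dH = -beta_h * H * I - v_rate * H
        dP = -beta_p * P * I - v_rate * P
        dI = (beta_h * H + beta_p * P) * I - (gamma + mu) * I
        dR = gamma * I
        dV = v_rate * (H + P)
        dD = mu * I
        return [dH, dP, dI, dR, dV, dD]

    y0 = [0.85, 0.14, 0.01, 0.0, 0.0, 0.0]
    sol = solve_ivp(phirvd_like, (0, 300), y0, t_eval=np.linspace(0, 300, 301))
    print("final mortality fraction D(300) =", round(sol.y[5, -1], 4))
    print("peak infected fraction          =", round(sol.y[2].max(), 4))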
Subject(s)
COVID-19/epidemiology , COVID-19/prevention & control , Basic Reproduction Number , Bayes Theorem , COVID-19/etiology , Communicable Disease Control/methods , Comorbidity , Disease Susceptibility , Humans , Immunity, Herd , India/epidemiology , Kinetics , Machine Learning , Markov Chains , Models, Theoretical , Monte Carlo Method , Mortality , United Kingdom/epidemiology
ABSTRACT
Poverty, the quintessential denominator of a developing nation, has traditionally been defined against an arbitrary poverty line; individuals (or countries) below this line are deemed poor and those above it, not so! This has two pitfalls. First, absolute reliance on a single poverty line, based on basic food consumption rather than on the total consumption distribution, yields at best a partial poverty index. Second, a single expense descriptor is an exogenous quantity that does not evolve from income-expenditure statistics. Using extensive income-expenditure statistics from India, here we show how a self-consistent endogenous poverty line can be derived from an agent-based stochastic model of market exchange, combining all expenditure modes (basic food, other food and non-food), whose parameters are probabilistically estimated using advanced Machine Learning tools. Our mathematical study establishes a consumption-based poverty measure that combines labor, commodity, and asset market outcomes, delivering an excellent tool for economic policy formulation.
ABSTRACT
Studies on the influence of a modern lifestyle in abetting Coronary Heart Disease (CHD) have mostly focused on detrimental health factors, such as smoking, alcohol intake, cheese consumption and average systolic blood pressure (SBP), largely disregarding the impact of a healthy lifestyle in mitigating CHD risk. In this study, more than 30 years of World Health Organization (WHO) data have been analyzed, using a wide array of advanced Machine Learning techniques, to quantify how regulated reliance on positive health indicators, e.g. fruits/vegetables and cereals, can offset CHD risk factors over a period of time. Our research ranks the impact of the negative outliers on CHD and then quantifies the impact of the positive health factors in mitigating the negative risk factors. Our research outcomes, presented through simple mathematical equations, outline the best CHD prevention strategy using lifestyle control only. We show that a 20% increase in fruit/vegetable intake leads to a 3-6% decrease in SBP; a 10% increase in cereal intake lowers SBP by 3%; and a simultaneous 10% increase in fruit/vegetable intake can further offset SBP by 6%. Our analysis establishes the gender independence of lifestyle effects on CHD, refuting long-held assumptions and unqualified beliefs. We show that CHD risk can be lowered with incremental changes in lifestyle and diet, e.g. fruit/vegetable intake ameliorating the effects of alcohol, smoking and fatty food. Our multivariate data model also estimates functional relationships among lifestyle factors that can potentially redefine the diagnostics of Framingham score-based CHD prediction.
Subject(s)
Coronary Disease/prevention & control , Diet , Healthy Lifestyle , Machine Learning , Adolescent , Adult , Aged , Aged, 80 and over , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prognosis , Risk Factors , Young Adult
ABSTRACT
We study memory effects in a kinetic roughening model. For d=1, a different dynamic scaling is uncovered in the memory-dominated phases; the Kardar-Parisi-Zhang scaling is restored in the absence of noise. d_c=2 represents the critical dimension where memory is shown to smooth the roughening front (alpha
ABSTRACT
Cilia and flagella are hairlike extensions of eukaryotic cells which generate oscillatory beat patterns that can propel micro-organisms and create fluid flows near cellular surfaces. The evolutionarily highly conserved core of cilia and flagella consists of a cylindrical arrangement of nine microtubule doublets, called the axoneme. The axoneme is an actively bending structure whose motility results from the action of dynein motor proteins cross-linking microtubule doublets and generating stresses that induce bending deformations. The periodic beat patterns are the result of a mechanical feedback that leads to self-organized bending waves along the axoneme. Using a theoretical framework to describe planar beating motion, we derive a nonlinear wave equation that describes the fundamental Fourier mode of the axonemal beat. We study the role of nonlinearities and investigate how the amplitude of oscillations increases in the vicinity of an oscillatory instability. We furthermore present numerical solutions of the nonlinear wave equation for different boundary conditions. We find that the nonlinear waves are well approximated by the linearly unstable modes for amplitudes of beat patterns similar to those observed experimentally.
Subject(s)
Axoneme/physiology , Biological Clocks/physiology , Cilia/physiology , Flagella/physiology , Models, Biological , Movement/physiology , Computer Simulation , Nonlinear Dynamics
ABSTRACT
A phenomenological mean-field theory is presented to describe the role of external magnetic field, pressure and chemical substitution on the nature of the ferromagnetic (FM) to paramagnetic (PM) phase transition in manganites. The application of an external field (or pressure) shifts the transition, leading to a field- (or pressure-) dependent phase boundary along which a tricritical point is shown to exist where the first-order FM-PM transition becomes second order. We show that the effect of chemical substitution on the FM transition is analogous to that of external perturbations (magnetic field and pressure); this includes the existence of a tricritical point at which the order of the transition changes. Our theoretical predictions satisfactorily explain the nature of the FM-PM transition observed in several systems. The modeling hypothesis has been critically verified against our experimental data from a wide range of colossal magnetoresistive manganite single crystals, such as Sm0.52Sr0.48MnO3. The theoretically predicted tricritical point is validated by these experiments, which underscores the strength of the proposed model.
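For orientation, a generic Landau-type free-energy expansion reproduces the scenario described above (a first-order line terminating at a tricritical point). The form below is the standard textbook construction, quoted only as a sketch and not as the authors' exact functional dependence:

    F(m) = F_0 + \frac{a}{2}\,(T - T_c)\,m^2 + \frac{b}{4}\,m^4 + \frac{c}{6}\,m^6 - h\,m, \qquad a, c > 0.

For b > 0 the zero-field transition is continuous, for b < 0 it becomes first order, and b = 0 marks the tricritical point; external field h, pressure or chemical substitution shift the effective coefficients and hence the phase boundary.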
ABSTRACT
We study the strong-coupling (SC) limit of the anisotropic Kardar-Parisi-Zhang (KPZ) model. A systematic mapping of the continuum model to its lattice equivalent shows that in the SC limit, anisotropic perturbations destroy all spatial correlations but retain a temporal scaling which shows a remarkable crossover along one of the two spatial directions, the choice of direction depending on the relative strength of the anisotropy. The results agree with exact numerics and are expected to settle the long-standing SC problem of the KPZ model in the infinite-range limit.
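For reference, the anisotropic KPZ equation is commonly written in the continuum as (standard literature form, quoted for orientation; the lattice mapping and strong-coupling analysis of the paper are not reproduced here):

    \partial_t h = \nu_x \partial_x^2 h + \nu_y \partial_y^2 h + \frac{\lambda_x}{2}(\partial_x h)^2 + \frac{\lambda_y}{2}(\partial_y h)^2 + \eta(x, y, t),

with Gaussian white noise \langle \eta(\mathbf{x},t)\,\eta(\mathbf{x}',t')\rangle = 2D\,\delta^2(\mathbf{x}-\mathbf{x}')\,\delta(t-t'); anisotropy enters through \nu_x \neq \nu_y and, crucially, through the relative sign and magnitude of \lambda_x and \lambda_y.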
ABSTRACT
The "double diffusivity" model was proposed in the late 1970s, and reworked in the early 1980s, as a continuum counterpart to existing discrete models of diffusion corresponding to high diffusivity paths, such as grain boundaries and dislocation lines. It was later rejuvenated in the 1990s to interpret experimental results on diffusion in polycrystalline and nanocrystalline specimens where grain boundaries and triple grain boundary junctions act as high diffusivity paths. Technically, the model pans out as a system of coupled Fick-type diffusion equations to represent "regular" and "high" diffusivity paths with "source terms" accounting for the mass exchange between the two paths. The model remit was extended by analogy to describe flow in porous media with double porosity, as well as to model heat conduction in media with two nonequilibrium local temperature baths, e.g., ion and electron baths. Uncoupling of the two partial differential equations leads to a higher-ordered diffusion equation, solutions of which could be obtained in terms of classical diffusion equation solutions. Similar equations could also be derived within an "internal length" gradient (ILG) mechanics formulation applied to diffusion problems, i.e., by introducing nonlocal effects, together with inertia and viscosity, in a mechanics based formulation of diffusion theory. While being remarkably successful in studies related to various aspects of transport in inhomogeneous media with deterministic microstructures and nanostructures, its implications in the presence of stochasticity have not yet been considered. This issue becomes particularly important in the case of diffusion in nanopolycrystals whose deterministic ILG-based theoretical calculations predict a relaxation time that is only about one-tenth of the actual experimentally verified time scale. This article provides the "missing link" in this estimation by adding a vital element in the ILG structure, that of stochasticity, that takes into account all boundary layer fluctuations. Our stochastic-ILG diffusion calculation confirms rapprochement between theory and experiment, thereby benchmarking a new generation of gradient-based continuum models that conform closer to real-life fluctuating environments.
ABSTRACT
Starting from a stochastic agent-based model representing market exchange in a developing economy, we study time variations of the probability density function of income together with the simultaneous variation of consumption deprivation (CD), where CD represents the shortfall in consumption from the saturation level of an essential commodity, cereal. Together, these two models combine income-expenditure-based market dynamics with time variations in consumption due to income. In this new unified theoretical structure, trade in assets is allowed only when income exceeds consumption deprivation, while CD itself is obtained endogenously from a separate kinetic model. Our results reveal that the time variation of the CD function leads to a downward trend in the threshold level of consumption of basic necessities, suggesting a possible dietary transition toward a lower saturation level of food-grain consumption, possibly through an improvement in the level of living. The new poverty index, defined as CD, is amenable to approximate probabilistic prediction within a short time horizon. A major achievement of this work is the intrinsic independence of the poverty index from an exogenous poverty line, making it more objective for policy formulation than existing poverty indices in the literature.
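A minimal agent-based sketch of the gating rule described above (trade allowed only when income exceeds consumption deprivation). The exchange kernel, saving propensity and CD relaxation below are hypothetical placeholders and do not reproduce the model's actual kinetics.

    import numpy as np

    rng = np.random.default_rng(4)
    n_agents, n_steps = 1000, 50_000
    income = rng.exponential(1.0, n_agents)      # initial income distribution (placeholder)
    cd = np.full(n_agents, 0.5)                  # consumption deprivation per agent (placeholder)
    saving = 0.3                                 # saving propensity (placeholder)

    for _ in range(n_steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        # Trade is allowed only when both agents' incomes exceed their CD level.
        if income[i] > cd[i] and income[j] > cd[j]:
            pot = (1 - saving) * (income[i] + income[j])   # tradable (non-saved) income
            eps = rng.random()
            income[i] = saving * income[i] + eps * pot
            income[j] = saving * income[j] + (1 - eps) * pot
        # Toy CD relaxation toward a falling saturation level (stand-in for the kinetic model).
        cd *= 0.99999

    below = np.mean(income < cd)
    print(f"fraction of agents below the endogenous CD threshold: {below:.3f}")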
ABSTRACT
The dynamical evolution of dislocations in plastically deformed metals is controlled by both deterministic factors arising from applied loads and stochastic effects appearing due to fluctuations of the internal stress. Such stochastic dislocation processes and the associated spatially inhomogeneous modes lead to randomness in the observed deformation structure. Previous studies have analyzed the role of randomness in such textural evolution, but none of these models has considered the impact of a finite decay time of the stochastic perturbations on the overall dynamics of the system (all previous models assumed instantaneous relaxation, which is unphysical). The present article bridges this knowledge gap by introducing colored noise, in the form of Ornstein-Uhlenbeck noise, into the analysis of the class of linear and nonlinear Wiener and Ornstein-Uhlenbeck processes onto which these structural dislocation dynamics can be mapped. Based on an analysis of the relevant Fokker-Planck model, our results show that linear Wiener processes remain unaffected by the second time scale in the problem, but all nonlinear processes, both of the Wiener type and of the Ornstein-Uhlenbeck type, scale as a function of the noise decay time τ. The results are expected to have ramifications for existing experimental observations and to inspire new numerical and laboratory tests to gain further insight into the competition between deterministic and random effects in modeling plastically deformed samples.
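A minimal Euler-Maruyama sketch contrasting white noise with Ornstein-Uhlenbeck (colored) noise of decay time tau driving a simple nonlinear Langevin process. The bistable drift and all parameter values are placeholders, intended only to show where the extra time scale enters, not to reproduce the dislocation dynamics analyzed above.

    import numpy as np

    rng = np.random.default_rng(5)
    dt, n_steps, sigma, tau = 1e-3, 100_000, 0.5, 0.05

    def drift(x):
        return x - x**3            # placeholder nonlinear (bistable) drift

    x_white, x_ou, eta = 0.0, 0.0, 0.0
    sq_white, sq_ou = [], []
    for _ in range(n_steps):
        # White-noise-driven process.
        x_white += drift(x_white) * dt + sigma * np.sqrt(dt) * rng.normal()
        # Ornstein-Uhlenbeck noise with correlation (decay) time tau ...
        eta += (-eta / tau) * dt + (sigma / tau) * np.sqrt(dt) * rng.normal()
        # ... driving the same nonlinear process.
        x_ou += (drift(x_ou) + eta) * dt
        sq_white.append(x_white**2)
        sq_ou.append(x_ou**2)

    print("long-time <x^2>, white noise:", round(np.mean(sq_white[-25_000:]), 4))
    print("long-time <x^2>, OU noise   :", round(np.mean(sq_ou[-25_000:]), 4))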
ABSTRACT
In oscillatory reaction-diffusion systems, time-delay feedback can lead to the instability of uniform oscillations with respect to formation of standing waves. Here, we investigate how the presence of additive, Gaussian white noise can induce the appearance of standing waves. Combining analytical solutions of the model with spatiotemporal simulations, we find that noise can promote standing waves in regimes where the deterministic uniform oscillatory modes are stabilized. As the deterministic phase boundary is approached, the spatiotemporal correlations become stronger, such that even small noise can induce standing waves in this parameter regime. With larger noise strengths, standing waves could be induced at finite distances from the (deterministic) phase boundary. The overall dynamics is defined through the interplay of noisy forcing with the inherent reaction-diffusion dynamics.
ABSTRACT
Loss of coherence with increasing excitation amplitude and spatial size modulation is a fundamental problem in designing Raman fiber lasers. While it is known that ramping up laser pump power increases the amplitude of stochastic excitations, such higher energy inputs can also lead to a transition from a linearly stable, coherent laminar regime to an undesirable disordered turbulent state. This report presents a new statistical methodology, based on first-passage statistics, that classifies lasing regimes in Raman fiber lasers, enabling fast and highly accurate identification of the strong instability that drives the laminar-turbulent phase transition through a self-consistently defined order parameter. The results are consistent across a wide range of pump power values, heralding a breakthrough in the non-invasive analysis of fiber laser dynamics.
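A schematic sketch of a first-passage-based classifier for intensity time series, in the spirit of the method described above. The synthetic "laminar-like" and "turbulent-like" traces, the threshold and the order-parameter definition are all illustrative placeholders, not the paper's actual construction.

    import numpy as np

    rng = np.random.default_rng(6)

    def first_passage_times(signal, threshold):
        """Intervals between successive upward crossings of `threshold`."""
        above = signal > threshold
        crossings = np.flatnonzero(~above[:-1] & above[1:])
        return np.diff(crossings)

    n = 200_000
    # Synthetic stand-ins: a quiet (laminar-like) trace and an intermittent (turbulent-like) one.
    laminar = 1.0 + 0.05 * rng.normal(size=n)
    turbulent = 1.0 + 0.05 * rng.normal(size=n) + (rng.random(n) < 0.01) * rng.exponential(1.0, n)

    for name, trace in [("laminar-like", laminar), ("turbulent-like", turbulent)]:
        fpt = first_passage_times(trace, threshold=1.1)
        # Order-parameter-like summary: dispersion of the first-passage intervals.
        order_param = fpt.std() / fpt.mean() if fpt.size else 0.0
        print(f"{name:>15}: {fpt.size:6d} crossings, "
              f"FPT coefficient of variation = {order_param:.2f}")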