Results 1 - 20 of 103

1.
BMC Med Res Methodol ; 24(1): 11, 2024 Jan 13.
Article in English | MEDLINE | ID: mdl-38218799

ABSTRACT

BACKGROUND: In this article we describe the methodology of the time-to-event continual reassessment method in the presence of partial orders (PO-TITE-CRM) and the process of implementing this trial design in a phase I trial in head and neck cancer called ADePT-DDR. The ADePT-DDR trial aims to find the maximum tolerated dose of an ATR inhibitor given in conjunction with radiotherapy in patients with head and neck squamous cell carcinoma. METHODS: The PO-TITE-CRM is a phase I trial design that builds upon the time-to-event continual reassessment method (TITE-CRM) to allow for the presence of partial ordering of doses. Partial orders occur in the case where the monotonicity assumption does not hold and the ordering of doses in terms of toxicity is not fully known. RESULTS: We arrived at a parameterisation of the design which performed well over a range of scenarios. Results from simulations were used iteratively to determine the best parameterisation of the design and we present the final set of simulations. We provide details on the methodology as well as insight into how it is applied to the trial. CONCLUSIONS: Whilst the design is very efficient, we highlight some of the difficulties and challenges that come with implementing it. As the issue of partial ordering may become more frequent due to the increasing investigation of combination therapies, we believe this account will be beneficial to those wishing to implement a design with partial orders. TRIAL REGISTRATION: ADePT-DDR was added to the European Clinical Trials Database (EudraCT number: 2020-001034-35) on 2020-08-07.
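As a rough illustration of the underlying machinery, the sketch below implements a plain one-parameter TITE-CRM dose-selection step (power model with a normal prior on the slope, weighted likelihood for partially followed patients). It omits the partial-ordering extension that defines the PO-TITE-CRM, and the skeleton, prior standard deviation, and target rate are illustrative assumptions, not values from the ADePT-DDR trial.

```python
import numpy as np

def tite_crm_next_dose(doses_given, dlt, weights, skeleton,
                       target=0.25, prior_sd=1.34):
    """One-parameter TITE-CRM (power model p_i = skeleton_i ** exp(a)).

    doses_given : dose index received by each enrolled patient
    dlt         : 1 if the patient has had a dose-limiting toxicity, else 0
    weights     : fraction of the DLT window observed so far (1.0 once a DLT occurs)
    skeleton    : prior guesses of the DLT probability at each dose
    """
    skeleton = np.asarray(skeleton, dtype=float)
    grid = np.linspace(-4.0, 4.0, 801)            # grid for the model parameter a
    log_post = -0.5 * (grid / prior_sd) ** 2      # N(0, prior_sd^2) prior, up to a constant
    for d, y, w in zip(doses_given, dlt, weights):
        p = skeleton[d] ** np.exp(grid)
        log_post += np.where(y == 1, np.log(p), np.log(1.0 - w * p))
    post = np.exp(log_post - log_post.max())
    post /= post.sum()                            # normalise on the (uniform) grid
    # posterior mean DLT probability at each dose; recommend the dose closest to target
    p_hat = np.array([(skeleton[i] ** np.exp(grid) * post).sum()
                      for i in range(len(skeleton))])
    return int(np.argmin(np.abs(p_hat - target))), p_hat

# toy call: three patients so far, the third with a DLT at dose level 1
next_dose, p_hat = tite_crm_next_dose([0, 0, 1], [0, 0, 1], [1.0, 0.6, 1.0],
                                      skeleton=[0.05, 0.12, 0.25, 0.40])
print(next_dose, np.round(p_hat, 3))
```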


Subjects
Head and Neck Neoplasms; Research Design; Humans; Head and Neck Neoplasms/drug therapy; Combined Modality Therapy; Maximum Tolerated Dose; Dose-Response Relationship, Drug; Computer Simulation
2.
Biometrics ; 79(4): 3126-3139, 2023 12.
Article in English | MEDLINE | ID: mdl-36905172

ABSTRACT

Natural direct and indirect effects are mediational estimands that decompose the average treatment effect and describe how outcomes would be affected by contrasting levels of a treatment through changes induced in mediator values (in the case of the indirect effect) or not through induced changes in the mediator values (in the case of the direct effect). Natural direct and indirect effects are not generally point-identified in the presence of a treatment-induced confounder; however, they may be identified if one is willing to assume monotonicity between the treatment and the treatment-induced confounder. We argue that this assumption may be reasonable in the relatively common encouragement-design trial setting, where the intervention is randomized treatment assignment and the treatment-induced confounder is whether or not treatment was actually taken/adhered to. We develop efficiency theory for the natural direct and indirect effects under this monotonicity assumption, and use it to propose a nonparametric, multiply robust estimator. We demonstrate the finite sample properties of this estimator using a simulation study, and apply it to data from the Moving to Opportunity Study to estimate the natural direct and indirect effects of being randomly assigned to receive a Section 8 housing voucher (the most common form of federal housing assistance) on the risk of developing any mood or externalizing disorder among adolescent boys, possibly operating through various school and community characteristics.
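For orientation, the following sketch computes plug-in natural direct and indirect effects via the mediation formula on simulated data, assuming no treatment-induced confounding. The paper's multiply robust estimator for the encouragement-design setting with a treatment-induced confounder is considerably more involved and is not reproduced here; all variable names and models below are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

# simulate toy data: randomized A, binary mediator M, outcome Y, baseline covariate X
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)
A = rng.integers(0, 2, size=n)
M = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 1.0 * A + 0.5 * X))))
Y = 1 + 0.7 * A + 1.2 * M + 0.5 * X + rng.normal(size=n)

# nuisance models: mediator given (A, X) and outcome given (A, M, X)
m_mod = LogisticRegression().fit(np.c_[A, X], M)
y_mod = LinearRegression().fit(np.c_[A, M, X], Y)

def E_Y(a, m):   # predicted outcome with A and M set to fixed values
    return y_mod.predict(np.c_[np.full(n, a), np.full(n, m), X])

def P_M1(a):     # P(M = 1 | A = a, X)
    return m_mod.predict_proba(np.c_[np.full(n, a), X])[:, 1]

# mediation formula: E[Y(a, M(a*))] = sum_m E[Y | a, m, X] P(M = m | a*, X), averaged over X
def psi(a, a_star):
    p1 = P_M1(a_star)
    return np.mean(E_Y(a, 1) * p1 + E_Y(a, 0) * (1 - p1))

nde = psi(1, 0) - psi(0, 0)   # natural direct effect
nie = psi(1, 1) - psi(1, 0)   # natural indirect effect
print(round(nde, 3), round(nie, 3))
```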


Subjects
Models, Statistical; Schools; Male; Adolescent; Humans; Computer Simulation
3.
Bull Math Biol ; 85(5): 39, 2023 03 31.
Article in English | MEDLINE | ID: mdl-37000280

ABSTRACT

Continuous-time Markov chains are frequently used as stochastic models for chemical reaction networks, especially in the growing field of systems biology. A fundamental problem for these Stochastic Chemical Reaction Networks (SCRNs) is to understand the dependence of the stochastic behavior of these systems on the chemical reaction rate parameters. Towards solving this problem, in this paper we develop theoretical tools called comparison theorems that provide stochastic ordering results for SCRNs. These theorems give sufficient conditions for monotonic dependence on parameters in these network models, which allow us to obtain, under suitable conditions, information about transient and steady-state behavior. These theorems exploit structural properties of SCRNs, beyond those of general continuous-time Markov chains. Furthermore, we derive two theorems to compare stationary distributions and mean first passage times for SCRNs with different parameter values, or with the same parameters and different initial conditions. These tools are developed for SCRNs taking values in a generic (finite or countably infinite) state space and can also be applied for non-mass-action kinetics models. When propensity functions are bounded, our method of proof gives an explicit method for coupling two comparable SCRNs, which can be used to simultaneously simulate their sample paths in a comparable manner. We illustrate our results with applications to models of enzymatic kinetics and epigenetic regulation by chromatin modifications.
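The idea of coupling two comparable chains can be illustrated on the simplest SCRN, an immigration-death network, where a joint simulation preserves the ordering of the sample paths when one birth rate dominates the other. This toy construction only hints at the paper's general comparison theorems; the network, rates, and coupling below are assumptions chosen for illustration.

```python
import numpy as np

def coupled_immigration_death(b1, b2, d, T, x0=0, seed=0):
    """Jointly simulate two immigration-death chains (birth rates b1 <= b2,
    per-capita death rate d) so that X1(t) <= X2(t) for all t: a monotone coupling."""
    assert b1 <= b2
    rng = np.random.default_rng(seed)
    t, x1, x2 = 0.0, x0, x0
    path = [(t, x1, x2)]
    while t < T:
        rates = np.array([b1,              # birth in both chains
                          b2 - b1,         # extra birth only in the larger chain
                          d * x1,          # death of a shared individual (both chains)
                          d * (x2 - x1)])  # death of an individual present only in X2
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1 / total)
        k = rng.choice(4, p=rates / total)
        if k == 0:   x1, x2 = x1 + 1, x2 + 1
        elif k == 1: x2 += 1
        elif k == 2: x1, x2 = x1 - 1, x2 - 1
        else:        x2 -= 1
        path.append((t, x1, x2))
    return path

path = coupled_immigration_death(b1=1.0, b2=2.0, d=0.5, T=50.0)
assert all(x1 <= x2 for _, x1, x2 in path)   # the ordering is preserved pathwise
```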


Subjects
Algorithms; Mathematical Concepts; Stochastic Processes; Epigenesis, Genetic; Models, Biological; Markov Chains; Kinetics
4.
J Math Biol ; 86(3): 32, 2023 01 25.
Article in English | MEDLINE | ID: mdl-36695934

ABSTRACT

To explore the influence of spatial heterogeneity on mosquito-borne diseases, we formulate a reaction-diffusion model with general incidence rates. The basic reproduction ratio R0 for this model is introduced and the threshold dynamics in terms of R0 are obtained. In the case where the model is spatially homogeneous, the global asymptotic stability of the endemic equilibrium is proved when R0 > 1. Under appropriate conditions, we establish the asymptotic profiles of R0 in the case of small or large diffusion rates, and investigate the monotonicity of R0 with respect to the heterogeneous diffusion coefficients. Numerically, the proposed model is applied to study dengue fever transmission. By performing simulations on the impacts of certain factors on R0 and disease dynamics, we find some novel and interesting phenomena which can provide valuable information for the targeted implementation of disease control measures.
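For context, the threshold quantity R0 is typically defined through the next-generation approach; the notation below is generic and not taken from the paper (F and the operator calligraphic-F collect new infections, V and calligraphic-B the remaining transitions, including diffusion in the heterogeneous case).

```latex
% Standard next-generation characterisations of the basic reproduction ratio
% (generic notation, an assumption of this sketch rather than the paper's):
\mathcal{R}_0 = \rho\left(F V^{-1}\right) \quad \text{(spatially homogeneous case)}, \qquad
\mathcal{R}_0 = r\left(-\mathcal{F}\,\mathcal{B}^{-1}\right) \quad \text{(spatially heterogeneous case)},
```

where rho and r denote the spectral radius of the next-generation matrix and of the next-generation operator, respectively.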


Subjects
Models, Biological; Vector Borne Diseases; Animals; Humans; Computer Simulation; Basic Reproduction Number; Vector Borne Diseases/epidemiology
5.
J Math Biol ; 87(6): 86, 2023 11 14.
Article in English | MEDLINE | ID: mdl-37957406

ABSTRACT

In this paper, we propose and study several inverse problems of identifying/determining unknown coefficients for a class of coupled PDE systems by measuring the average flux data on part of the underlying boundary. In these coupled systems, we mainly consider the non-negative solutions of the coupled equations, which are consistent with realistic settings in biology and ecology. There are several salient features of our inverse problem study: the drastic reduction of the measurement/observation data due to averaging effects, the nonlinear coupling of multiple equations, and the non-negative constraints on the solutions, which pose significant challenges to the inverse problems. We develop a new and effective scheme to tackle the inverse problems and achieve unique identifiability results by properly controlling the injection of different source terms to obtain multiple sets of mean flux data. The approach relies on certain monotonicity properties which are related to the intrinsic structures of the coupled PDE system. We also connect our study to biological applications of practical interest.


Subjects
Biology; Ecology; Mathematics
6.
Sensors (Basel) ; 23(3)2023 Feb 03.
Article in English | MEDLINE | ID: mdl-36772742

ABSTRACT

Road traffic safety can be influenced by road hypnosis. Accurate detection of a driver's road hypnosis is an important function urgently required in driver assistance systems. Road hypnosis recurs frequently over a period of driving and tends to occur in typical monotonous scenes such as tunnels and highways. Taking tunnel and highway scenes as typical examples, road hypnosis was studied through simulated driving experiments and vehicle driving experiments. A road hypnosis recognition model based on principal component analysis (PCA) and a long short-term memory network (LSTM) was proposed, where PCA was used to extract features from the parameters collected by the eye tracker, and the LSTM model was constructed to identify road hypnosis. Accuracy rates of 93.27% and 97.01% were obtained in the simulated driving experiments and the vehicle driving experiments, respectively. The proposed method was compared with k-nearest neighbor (KNN) and random forest (RF). The results showed that the proposed PCA-LSTM model had better performance. This paper provides a novel and convenient method for realizing road hypnosis detection in intelligent driver assistance systems in practical applications.
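A minimal sketch of such a PCA-LSTM pipeline, assuming per-timestep eye-tracker features that are reduced with PCA and then fed to a small LSTM classifier; the array shapes, layer sizes, and training settings are illustrative guesses, not the configuration reported in the paper.

```python
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA

# toy eye-tracker data: n windows, T timesteps per window, f raw features per timestep
rng = np.random.default_rng(0)
n, T, f, k = 200, 50, 12, 4
X = rng.normal(size=(n, T, f)).astype("float32")
y = rng.integers(0, 2, size=n)                     # 1 = road hypnosis, 0 = normal driving

# PCA on the per-timestep feature vectors, then reassemble sequences of PCA scores
pca = PCA(n_components=k).fit(X.reshape(-1, f))
Z = pca.transform(X.reshape(-1, f)).reshape(n, T, k).astype("float32")

# small LSTM classifier on the PCA-reduced sequences
model = tf.keras.Sequential([
    tf.keras.Input(shape=(T, k)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(Z, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```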

7.
Linguist Philos ; 46(2): 259-289, 2023.
Article in English | MEDLINE | ID: mdl-36974334

ABSTRACT

We discuss a well-known puzzle about the lexicalization of logical operators in natural language, in particular connectives and quantifiers. Of the many logically possible operators, only a few appear in the lexicon of natural languages: the connectives in English, for example, are conjunction and, disjunction or, and negated disjunction nor; the lexical quantifiers are all, some and no. The logically possible nand (negated conjunction) and nall (negated universal) are not expressed by lexical entries in English, nor in any natural language. Moreover, the lexicalized operators are all upward or downward monotone, an observation known as the Monotonicity Universal. We propose a logical explanation of lexical gaps and of the Monotonicity Universal, based on the dynamic behaviour of connectives and quantifiers. We define update potentials for logical operators as procedures to modify the context, under the assumption that an update by ϕ depends on the logical form of ϕ and on the speech act performed: assertion or rejection. We conjecture that the adequacy of update potentials determines the limits of lexicalizability for logical operators in natural language. Finally, we show that in this framework the Monotonicity Universal follows from the logical properties of the updates that correspond to each operator.
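The Monotonicity Universal can be checked mechanically on small finite models; the brute-force sketch below does so for a handful of determiners. It also shows why monotonicity alone cannot explain the lexical gaps: nall (not all) comes out downward monotone yet is unattested, which is the kind of fact the paper's update-potential account is meant to capture. The encoding of quantifiers as relations between finite sets is an assumption of this sketch.

```python
from itertools import combinations

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

DOMAIN = frozenset(range(4))

# determiners as relations between a restrictor set A and a scope set B
QUANTS = {
    "all":  lambda A, B: A <= B,
    "some": lambda A, B: len(A & B) > 0,
    "no":   lambda A, B: len(A & B) == 0,
    "nall": lambda A, B: not (A <= B),   # 'not all': logically possible but unattested
}

def monotone(q, direction):
    """Brute-force check of right upward/downward monotonicity on a small domain."""
    for A in subsets(DOMAIN):
        for B in subsets(DOMAIN):
            for B2 in subsets(DOMAIN):
                grows = (B <= B2) if direction == "up" else (B2 <= B)
                if q(A, B) and grows and not q(A, B2):
                    return False
    return True

for name, q in QUANTS.items():
    kind = "upward" if monotone(q, "up") else "downward" if monotone(q, "down") else "non-monotone"
    print(name, kind)
# all/some come out upward monotone, no and nall downward monotone: monotonicity is
# necessary but not sufficient to explain why nall is never lexicalized.
```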

8.
Stat Med ; 41(19): 3837-3877, 2022 Aug 30.
Article in English | MEDLINE | ID: mdl-35851717

ABSTRACT

The ICH E9(R1) addendum (2019) proposed principal stratification (PS) as one of five strategies for dealing with intercurrent events. Therefore, understanding the strengths, limitations, and assumptions of PS is important for the broad community of clinical trialists. Many approaches have been developed under the general framework of PS in different areas of research, including experimental and observational studies. These applications have utilized a diverse set of tools and assumptions. Thus, there is a need to present these approaches in a unifying manner. The goal of this tutorial is threefold. First, we provide a coherent and unifying description of PS. Second, we emphasize that estimation of effects within PS relies on strong assumptions, and we thoroughly examine the consequences of these assumptions to understand in which situations certain assumptions are reasonable. Finally, we provide an overview of a variety of key methods for PS analysis and use a real clinical trial example to illustrate them. Examples of code for implementation of some of these approaches are given in Supplemental Materials.

9.
Qual Life Res ; 31(1): 49-59, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34476671

ABSTRACT

PURPOSE: In Mokken scaling, the Crit index was proposed and is sometimes used as evidence (or lack thereof) of violations of some common model assumptions. The main goal of our study was twofold: to make the formulation of the Crit index explicit and accessible, and to investigate its distribution under various measurement conditions. METHODS: We conducted two simulation studies in the context of dichotomously scored item responses. We manipulated the type of assumption violation, the proportion of violating items, sample size, and quality. False positive rates and power to detect assumption violations were our main outcome variables. Furthermore, we applied the Crit coefficient in a Mokken scale analysis to a set of responses to the General Health Questionnaire (GHQ-12), a self-administered questionnaire for assessing current mental health. RESULTS: We found that the false positive rates of Crit were close to the nominal rate in most conditions, and that power to detect misfit depended on the sample size, type of violation, and number of assumption-violating items. Overall, in small samples Crit lacked the power to detect misfit, and in larger samples power differed considerably depending on the type of violation and proportion of misfitting items. Furthermore, we also found in our empirical example that even in large samples the Crit index may fail to detect assumption violations. DISCUSSION: Even in large samples, the Crit coefficient showed limited usefulness for detecting moderate and severe violations of monotonicity. Our findings are relevant to researchers and practitioners who use Mokken scaling for scale and questionnaire construction and revision.
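A rough sketch of the kind of manifest-monotonicity check that underlies such diagnostics: estimate P(item = 1 | rest score) within rest-score groups and count decreases. The Crit index itself aggregates several weighted counts of this sort, as implemented in the mokken R package, and that exact formula is not reproduced here; the data-generating model below is an illustrative assumption.

```python
import numpy as np

def manifest_monotonicity(X, item, min_group=50):
    """Estimate P(X_item = 1 | rest score) and count decreases across rest-score
    groups -- the kind of check behind monotonicity diagnostics in Mokken analysis."""
    rest = X.sum(axis=1) - X[:, item]
    probs = []
    for r in np.unique(rest):
        mask = rest == r
        if mask.sum() >= min_group:                 # skip very small groups
            probs.append((r, X[mask, item].mean()))
    probs = np.array(probs)
    violations = int(np.sum(np.diff(probs[:, 1]) < 0))
    return probs, violations

# toy data: 1000 respondents, 6 dichotomous items from a simple one-parameter IRT model
rng = np.random.default_rng(0)
theta = rng.normal(size=(1000, 1))
b = np.linspace(-1.5, 1.5, 6)
X = (rng.random((1000, 6)) < 1 / (1 + np.exp(-(theta - b)))).astype(int)
probs, violations = manifest_monotonicity(X, item=0)
print(probs, violations)
```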


Subjects
Quality of Life; Research Design; Computer Simulation; Humans; Mental Health; Quality of Life/psychology; Surveys and Questionnaires
10.
Entropy (Basel) ; 24(10)2022 Sep 24.
Article in English | MEDLINE | ID: mdl-37420372

ABSTRACT

We use finite-dimensional monotonicity methods to investigate problems connected with the discrete s(x,·)-Laplacian on simple, connected, undirected, weighted, and finite graphs with nonlinearities given in a non-potential form. Positive solutions are also considered.

11.
Stat Med ; 40(25): 5605-5627, 2021 11 10.
Article in English | MEDLINE | ID: mdl-34288021

ABSTRACT

Causal inference methods are gaining increasing prominence in pharmaceutical drug development in light of the recently published addendum on estimands and sensitivity analysis in clinical trials to the E9 guideline of the International Council for Harmonisation. The E9 addendum emphasises the need to account for post-randomization or 'intercurrent' events that can potentially influence the interpretation of a treatment effect estimate at a trial's conclusion. Instrumental Variables (IV) methods have been used extensively in economics, epidemiology, and academic clinical studies for 'causal inference,' but less so in the pharmaceutical industry setting until now. In this tutorial article we review the basic tools for causal inference, including graphical diagrams and potential outcomes, as well as several conceptual frameworks that an IV analysis can sit within. We discuss in detail how to map these approaches to the Treatment Policy, Principal Stratum and Hypothetical 'estimand strategies' introduced in the E9 addendum, and provide details of their implementation using standard regression models. Specific attention is given to discussing the assumptions each estimation strategy relies on in order to be consistent, the extent to which they can be empirically tested and sensitivity analyses in which specific assumptions can be relaxed. We finish by applying the methods described to simulated data closely matching two recent pharmaceutical trials to further motivate and clarify the ideas.
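To make the core IV idea concrete, a small simulation: randomized assignment Z serves as an instrument for treatment actually taken D, which is confounded with the outcome Y by an unmeasured U. The naive regression is biased, while the Wald (two-stage) estimator recovers the causal effect; under the usual monotonicity (no-defiers) assumption it is interpreted as a complier average effect. All numbers and variable names are illustrative, not taken from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
U = rng.normal(size=n)                        # unmeasured confounder of treatment and outcome
Z = rng.integers(0, 2, size=n)                # randomized assignment (the instrument)
# treatment actually taken depends on assignment and on U (non-adherence)
D = (0.2 + 0.5 * Z + 0.3 * U + rng.normal(scale=0.5, size=n) > 0.5).astype(float)
Y = 1.0 + 2.0 * D + 1.5 * U + rng.normal(size=n)   # true causal effect of D is 2.0

# naive regression of Y on D is biased by U; the Wald / two-stage estimator is not
naive = np.polyfit(D, Y, 1)[0]
wald = (Y[Z == 1].mean() - Y[Z == 0].mean()) / (D[Z == 1].mean() - D[Z == 0].mean())
print(round(naive, 2), round(wald, 2))        # naive is biased upward; wald is close to 2.0
```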


Subjects
Drug Development; Research Design; Causality; Data Interpretation, Statistical; Drug Industry; Humans
12.
J Math Biol ; 83(6-7): 68, 2021 12 04.
Article in English | MEDLINE | ID: mdl-34870739

ABSTRACT

We consider an age-structured density-dependent population model on several temporally variable patches. There are two key assumptions on which we base model setup and analysis. First, intraspecific competition is limited to competition between individuals of the same age (pure intra-cohort competition) and it affects density-dependent mortality. Second, dispersal between patches ensures that each patch can be reached from every other patch, directly or through several intermediary patches, within individual reproductive age. Using strong monotonicity we prove existence and uniqueness of a solution and analyze its large-time behavior in cases of constant, periodically variable and irregularly variable environment. In analogy to the next generation operator, we introduce the net reproductive operator and the basic reproduction number R0 for time-independent and periodic models and establish the permanence dichotomy: if R0 < 1, extinction on all patches is imminent, and if R0 > 1, permanence on all patches is guaranteed. We show that a solution for the general time-dependent problem can be bounded from above and below by solutions to the associated periodic problems. Using two-sided estimates, we establish uniform boundedness and uniform persistence of a solution for the general time-dependent problem and describe its asymptotic behaviour.

13.
J Math Biol ; 83(3): 25, 2021 08 07.
Article in English | MEDLINE | ID: mdl-34363540

ABSTRACT

We propose an alternative delayed population growth difference equation model based on a modification of the Beverton-Holt recurrence, assuming a delay only in the growth contribution and taking into account that individuals that die during the delay do not contribute to growth. The model introduced differs from a delayed logistic difference equation, known as the delayed Pielou or delayed Beverton-Holt model, that was formulated as a discretization of the Hutchinson model. The analysis of our delayed difference equation model identifies a critical delay threshold. If the time delay exceeds this threshold, the model predicts that the population will go extinct for all non-negative initial conditions. If the delay is below this threshold, the population survives and its size converges to a positive globally asymptotically stable equilibrium that is decreasing in size as the delay increases. We show global asymptotic stability of the positive equilibrium using two different techniques. For one set of parameter values, a contraction mapping result is applied, while the proof for the remaining set of parameter values relies on showing that the map is eventually componentwise monotone.
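To illustrate the kind of delay threshold described, here is a simulation of one plausible delayed Beverton-Holt-type recurrence, x[t+1] = s*x[t] + sigma^tau * r * x[t-tau] / (1 + x[t-tau]), in which only a fraction sigma^tau of recruits survives the delay. This specific functional form and its parameter values are assumptions made for illustration and are not taken from the paper; for this form the population persists only while sigma^tau * r > 1 - s.

```python
import numpy as np

def final_size(tau, r=3.0, s=0.4, sigma=0.8, x0=1.0, steps=400):
    """Iterate x[t+1] = s*x[t] + sigma**tau * r * x[t-tau] / (1 + x[t-tau])."""
    x = [x0] * (tau + 1)
    for _ in range(steps):
        x.append(s * x[-1] + sigma ** tau * r * x[-1 - tau] / (1.0 + x[-1 - tau]))
    return x[-1]

# for this assumed form the positive equilibrium exists only while sigma**tau * r > 1 - s,
# i.e. tau < log((1 - s) / r) / log(sigma), roughly 7.2 with the values above
for tau in range(12):
    print(tau, round(final_size(tau), 4))   # the positive limit shrinks with tau, then extinction
```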


Subjects
Models, Biological; Population Growth; Humans; Population Dynamics
14.
Biom J ; 63(5): 1028-1051, 2021 06.
Article in English | MEDLINE | ID: mdl-33734453

ABSTRACT

Expectile regression, in contrast to classical linear regression, allows for heteroscedasticity and omits a parametric specification of the underlying distribution. This model class can be seen as a quantile-like generalization of least squares regression. As in quantile regression, the whole distribution can be modeled with expectiles, while still offering the same flexibility in the use of semiparametric predictors as modern mean regression. However, even with no parametric assumption for the distribution of the response in expectile regression, the model is still constructed with a linear relationship between the fitted value and the predictor. If the true underlying relationship is nonlinear, then severe biases can be observed in the parameter estimates as well as in quantities derived from them, such as model predictions. We observed this problem during the analysis of the distribution of a self-reported hearing score with limited range. Classical expectile regression should in theory adhere to these constraints; however, we observed predictions that exceeded the maximum score. We propose to include a response function between the fitted value and the predictor, as in generalized linear models. However, including a fixed response function would imply an assumption on the shape of the underlying distribution function. Such assumptions would be counterintuitive in expectile regression. Therefore, we propose to estimate the response function jointly with the covariate effects. We design the response function as a monotonically increasing P-spline, which may also contain constraints on the target set. This results in valid estimates for a self-reported listening effort score through nonlinear estimates of the response function. We observed strong associations with the speech reception threshold.
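For readers new to expectiles: a linear expectile fit solves an asymmetrically weighted least-squares problem and can be computed by iteratively reweighted least squares, as in the sketch below. The paper's contribution, jointly estimating a monotonically increasing P-spline response function, is not included here; the data and tau values are illustrative.

```python
import numpy as np

def expectile_fit(X, y, tau=0.5, n_iter=200, tol=1e-10):
    """Linear expectile regression by iteratively reweighted least squares:
    minimise sum_i w_i(beta) * (y_i - x_i'beta)^2 with w_i = tau when the
    residual is positive and (1 - tau) otherwise."""
    X1 = np.column_stack([np.ones(len(y)), X])       # add an intercept column
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]     # start from the OLS fit
    for _ in range(n_iter):
        w = np.where(y - X1 @ beta > 0, tau, 1 - tau)
        beta_new = np.linalg.solve(X1.T @ (w[:, None] * X1), X1.T @ (w * y))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 500)
y = 1 + 2 * x + (0.2 + x) * rng.normal(size=500)     # heteroscedastic noise
print(expectile_fit(x[:, None], y, tau=0.1), expectile_fit(x[:, None], y, tau=0.9))
```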


Subjects
Linear Models; Bias; Humans
15.
Entropy (Basel) ; 23(6)2021 Jun 02.
Article in English | MEDLINE | ID: mdl-34199499

ABSTRACT

Feature selection is one of the core contents of rough set theory and application. Since the reduction ability and classification performance of many feature selection algorithms based on rough set theory and its extensions are not ideal, this paper proposes a feature selection algorithm that combines the information-theoretic view and the algebraic view in the neighborhood decision system. First, the neighborhood relationship in the neighborhood rough set model is used to retain the classification information of continuous data and to study some uncertainty measures of neighborhood information entropy. Second, to fully reflect the decision ability and classification performance of the neighborhood system, the neighborhood credibility and neighborhood coverage are defined and introduced into the neighborhood joint entropy. Third, a feature selection algorithm based on neighborhood joint entropy is designed, which addresses the shortcoming that most feature selection algorithms consider only the information-theoretic definition or the algebraic definition. Finally, experiments and statistical analyses on nine data sets show that the algorithm can effectively select the optimal feature subset, and the selection result can maintain or improve the classification performance of the data set.

16.
Entropy (Basel) ; 23(10)2021 Oct 14.
Article in English | MEDLINE | ID: mdl-34682059

ABSTRACT

While the languages of the world vary greatly, they exhibit systematic patterns as well. Semantic universals are restrictions on the variation in meaning exhibited cross-linguistically (e.g., that, in all languages, expressions of a certain type can only denote meanings with a certain special property). This paper pursues an efficient communication analysis to explain the presence of semantic universals in a domain of function words: quantifiers. Two experiments measure how well languages do in optimally trading off between competing pressures of simplicity and informativeness. First, we show that artificial languages which more closely resemble natural languages are more optimal. Then, we introduce information-theoretic measures of degrees of semantic universals and show that these are not correlated with optimality in a random sample of artificial languages. These results suggest that efficient communication shapes semantic typology in both content and function word domains, and that semantic universals may not stand in need of independent explanation.

17.
Synthese ; 199(1-2): 2597-2627, 2021.
Article in English | MEDLINE | ID: mdl-34866664

ABSTRACT

Do causes necessitate their effects? Causal necessitarianism (CN) is the view that they do. One major objection, the "monotonicity objection", runs roughly as follows. For many particular causal relations, we can easily find a possible "blocker": an additional causal factor that, had it also been there, would have prevented the cause from producing its effect. However, the objection goes on, if the cause really necessitated its effect in the first place, it would have produced it anyway, despite the blocker. Thus, CN must be false. Though different from Hume's famous attacks against CN, the monotonicity objection is no less important. In one form or another, it has actually been invoked by various opponents to CN, past and present. And indeed, its intuitive appeal is quite powerful. Yet, this paper argues that, once carefully analysed, the objection can be resisted, and should be. First, I show how its success depends on three implicit assumptions concerning, respectively, the notion of cause, the composition of causal factors, and the relation of necessitation. Second, I present general motivations for rejecting at least one of those assumptions: appropriate variants of them threaten views that even opponents to CN would want to preserve, in particular the popular thesis of grounding necessitarianism. Finally, I argue that the assumption we should reject is the one concerning how causes should be understood: causes, I suggest, include an element of completeness that excludes blockers. In particular, I propose a way of understanding causal completeness that avoids common difficulties.

18.
BMC Med Res Methodol ; 20(1): 236, 2020 09 21.
Article in English | MEDLINE | ID: mdl-32957931

ABSTRACT

BACKGROUND: The population attributable fraction (PAF) is the fraction of disease cases in a sample that can be attributed to an exposure. Estimating the PAF often involves the estimation of the probability of having the disease given the exposure while adjusting for confounders. In many settings, the exposure can interact with confounders. Additionally, the exposure may have a monotone effect on the probability of having the disease, and this effect is not necessarily linear. METHODS: We develop a semiparametric approach for estimating the probability of having the disease and, consequently, for estimating the PAF, controlling for the interaction between the exposure and a confounder. We use a tensor product of univariate B-splines to model the interaction under the monotonicity constraint. The model fitting procedure is formulated as a quadratic programming problem and thus can be easily solved using standard optimization packages. We conduct simulations to compare the performance of the developed approach with the conventional B-splines approach without the monotonicity constraint, and with the logistic regression approach. To illustrate our method, we estimate the PAF of hopelessness and depression for suicidal ideation among elderly depressed patients. RESULTS: The proposed estimator exhibited better performance than the other two approaches in the simulation settings we tried. The estimated PAF is 67.99% (95% confidence interval: 42.10% to 97.42%) for hopelessness and 22.36% (95% confidence interval: 12.77% to 56.49%) for depression. CONCLUSIONS: The developed approach is easy to implement and supports flexible modeling of possible non-linear relationships between a disease and an exposure of interest.
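A simplified plug-in version of the PAF computation, using ordinary logistic regression with an exposure-confounder interaction in place of the paper's monotone tensor-product B-spline model fitted by quadratic programming; the simulated variables and coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
C = rng.normal(size=n)                                     # confounder
E = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.8 * C))))    # exposure (e.g. hopelessness)
D = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 1.2 * E + 0.5 * C + 0.4 * E * C))))  # disease

# model P(D=1 | E, C) with an exposure-confounder interaction, then the plug-in PAF:
# PAF = 1 - mean_i P(D=1 | E=0, C_i) / mean_i P(D=1 | E_i, C_i)
X = np.c_[E, C, E * C]
fit = LogisticRegression().fit(X, D)
p_obs = fit.predict_proba(X)[:, 1]
p_unexposed = fit.predict_proba(np.c_[np.zeros(n), C, np.zeros(n)])[:, 1]
paf = 1 - p_unexposed.mean() / p_obs.mean()
print(round(paf, 3))
```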


Subjects
Logistic Models; Aged; Computer Simulation; Humans; Probability
19.
Clin Trials ; 17(5): 522-534, 2020 10.
Article in English | MEDLINE | ID: mdl-32631095

ABSTRACT

BACKGROUND/AIMS: In oncology, new combined treatments make it difficult to order dose levels according to monotonically increasing toxicity. New flexible dose-finding designs that take into account uncertainty in dose-level ordering were compared with classical designs through simulations in the setting of a monotonicity assumption violation. We give recommendations for the choice of dose-finding design. METHODS: Motivated by a clinical trial for patients with high-risk neuroblastoma, we considered designs that require a monotonicity assumption, the Bayesian Continual Reassessment Method, the modified Toxicity Probability Interval, and the Bayesian Optimal Interval design, and designs that relax the monotonicity assumption, the Bayesian Partial Ordering Continual Reassessment Method and the No Monotonicity Assumption design. We considered 15 scenarios including monotonic and non-monotonic dose-toxicity relationships among six dose levels. RESULTS: The No Monotonicity Assumption and Partial Ordering Continual Reassessment Method designs were robust to violation of the monotonicity assumption. Under non-monotonic scenarios, the No Monotonicity Assumption design selected the correct dose level more often than alternative methods on average. Under the majority of monotonic scenarios, the Partial Ordering Continual Reassessment Method selected the correct dose level more often than the No Monotonicity Assumption design. Other designs were affected by violation of the monotonicity assumption, with a proportion of correct selections below 20% in most scenarios. Under monotonic scenarios, the highest proportions of correct selections were achieved using the Continual Reassessment Method and the Bayesian Optimal Interval design (between 52.8% and 73.1%). The costs of relaxing the monotonicity assumption with the No Monotonicity Assumption design and the Partial Ordering Continual Reassessment Method were decreases in the proportions of correct selections under monotonic scenarios, ranging from 5.3% to 20.7% and from 1.4% to 16.1%, respectively, compared with the best-performing design, and higher proportions of patients allocated to toxic dose levels during the trial. CONCLUSIONS: Innovative oncology treatments may no longer follow a monotonic dose-level ordering, which makes standard phase I methods fail. In such a setting, appropriate designs, such as the No Monotonicity Assumption or Partial Ordering Continual Reassessment Method designs, should be used to safely determine the recommended phase II dose.
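The "proportion of correct selections" metric used in such comparisons is straightforward to estimate by simulation; the skeleton below does so for a deliberately crude model-free escalation/selection rule (not one of the designs compared in the paper) under one monotone and one non-monotone toxicity scenario. Cohort sizes, sample size, and scenarios are illustrative assumptions.

```python
import numpy as np

def toy_select(doses, dlts, n_doses, target=0.25):
    """Toy model-free rule (not one of the designs compared in the paper):
    recommend the tried dose whose observed DLT rate is closest to the target,
    escalating one level when even the highest tried dose looks safe."""
    doses, dlts = np.asarray(doses), np.asarray(dlts)
    tried = sorted(set(doses.tolist()))
    rates = {d: dlts[doses == d].mean() for d in tried}
    best = min(tried, key=lambda d: abs(rates[d] - target))
    if best == max(tried) and rates[best] < target:
        return min(best + 1, n_doses - 1)
    return best

def pcs(true_tox, mtd, n_sims=500, n_cohorts=10, cohort=3):
    """Proportion of simulated trials in which the rule selects the target dose."""
    hits = 0
    for s in range(n_sims):
        rng = np.random.default_rng(s)
        doses, dlts, dose = [], [], 0
        for _ in range(n_cohorts):
            doses += [dose] * cohort
            dlts += list((rng.random(cohort) < true_tox[dose]).astype(int))
            dose = toy_select(doses, dlts, len(true_tox))
        hits += int(dose == mtd)
    return hits / n_sims

# one monotone and one non-monotone scenario, target DLT rate 0.25
print(pcs([0.05, 0.12, 0.25, 0.40, 0.55, 0.70], mtd=2))
print(pcs([0.10, 0.30, 0.20, 0.25, 0.45, 0.60], mtd=3))
```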


Subjects
Clinical Trials, Phase I as Topic/methods; Maximum Tolerated Dose; Neuroblastoma/drug therapy; Research Design; Antineoplastic Agents/therapeutic use; Antineoplastic Agents/toxicity; Bayes Theorem; Clinical Trials, Phase I as Topic/statistics & numerical data; Computer Simulation; Dose-Response Relationship, Drug; Drug-Related Side Effects and Adverse Reactions; Humans; Models, Statistical; Neuroblastoma/epidemiology
20.
Proc Natl Acad Sci U S A ; 114(7): 1462-1467, 2017 02 14.
Article in English | MEDLINE | ID: mdl-28137861

ABSTRACT

This work presents a mathematical study of tissue dynamics. We combine within-cell genome dynamics and diffusion between cells, so that the synthesis of the two gives rise to the emergence of function, akin to establishing "tissue homeostasis." We introduce two concepts, monotonicity and a weak version of hardwiring. These together are sufficient for global convergence of the tissue dynamics.


Subjects
Cell Physiological Phenomena; Models, Biological; Algorithms; Cell Communication; Cells/metabolism; Diffusion; Genome; Homeostasis; Morphogenesis