Results 1 - 20 of 205
1.
Proc Natl Acad Sci U S A ; 119(35): e2202789119, 2022 08 30.
Article in English | MEDLINE | ID: mdl-35998221

ABSTRACT

Humans and other animals often infer spurious associations among unrelated events. However, such superstitious learning is usually accounted for by conditioned associations, raising the question of whether an animal could develop more complex cognitive structures independent of reinforcement. Here, we tasked monkeys with discovering the serial order of two pictorial sets: a "learnable" set, in which the stimuli were implicitly ordered and monkeys were rewarded for choosing the higher-rank stimulus, and an "unlearnable" set, in which stimuli were unordered and feedback was random regardless of the choice. We replicated prior results showing that monkeys reliably learn the implicit order of the learnable set. Surprisingly, the monkeys behaved as though some ordering also existed in the unlearnable set, showing a consistent choice preference that transferred to novel untrained pairs in this set, even under a preference-discouraging reward schedule that gave rewards more frequently to the stimulus that was selected less often. In simulations, a model-free reinforcement learning algorithm (Q-learning) displayed a degree of consistent ordering within the unlearnable set but, unlike the monkeys, failed to do so under the preference-discouraging reward schedule. Our results suggest that monkeys infer abstract structures from objectively random events using heuristics that extend beyond stimulus-outcome conditional learning to more cognitive, model-based learning mechanisms.
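As a rough illustration of the model-free baseline mentioned above, the sketch below runs a tabular Q-learning agent on randomly rewarded pairwise choices; the set size, learning rate, and softmax temperature are assumed values, not the authors' simulation settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stim, alpha, beta = 7, 0.1, 5.0   # assumed set size, learning rate, softmax temperature
Q = np.zeros(n_stim)                # one learned value per stimulus in the "unlearnable" set

for trial in range(5000):
    a, b = rng.choice(n_stim, size=2, replace=False)   # random pair of stimuli
    p_a = 1.0 / (1.0 + np.exp(-beta * (Q[a] - Q[b])))  # softmax choice within the pair
    chosen = a if rng.random() < p_a else b
    reward = float(rng.random() < 0.5)                 # feedback is random regardless of choice
    Q[chosen] += alpha * (reward - Q[chosen])          # model-free value update

# A stable but arbitrary ranking of the stimuli can emerge from random feedback alone.
print(np.argsort(-Q))
```

Under a preference-discouraging schedule the rarely chosen option would be rewarded more often, which tends to erode such rankings for a purely model-free learner.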


Subjects
Association Learning, Psychological Reinforcement, Superstitions, Animals, Classical Conditioning, Haplorhini, Humans, Reward, Superstitions/psychology
2.
Stat Med ; 43(6): 1194-1212, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38243729

ABSTRACT

In recent decades, several randomization designs have been proposed in the literature as better alternatives to the traditional permuted block design (PBD), providing higher allocation randomness under the same restriction of the maximum tolerated imbalance (MTI). However, PBD remains the most frequently used method for randomizing subjects in clinical trials. This status quo may reflect an inadequate awareness and appreciation of the statistical properties of these randomization designs, and a lack of simple methods for their implementation. This manuscript presents the analytic results of statistical properties for five randomization designs with MTI restriction based on their steady-state probabilities of the treatment imbalance Markov chain and compares them to those of the PBD. A unified framework for randomization sequence generation and real-time on-demand treatment assignment is proposed for the straightforward implementation of randomization algorithms with explicit formulas of conditional allocation probabilities. Topics associated with the evaluation, selection, and implementation of randomization designs are discussed. It is concluded that for two-arm equal allocation trials, several randomization designs offer stronger protection against selection bias than the PBD does, and their implementation is not necessarily more difficult than the implementation of the PBD.
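For orientation, the sketch below implements the familiar conditional allocation probability of a 1:1 permuted block design, the kind of explicit formula the unified framework relies on; the block size of 4 is an assumed example, not a recommendation, and the MTI designs discussed in the paper follow the same pattern with different formulas.

```python
import random

def pbd_next_assignment(n_a, n_b, block_size=4):
    """Conditional allocation probability for a 1:1 permuted block design.

    n_a, n_b: assignments already made to arms A and B within the current block.
    Illustrative sketch; the block size is an assumed, not prescribed, value.
    """
    m = block_size // 2
    remaining = block_size - n_a - n_b
    p_a = (m - n_a) / remaining          # explicit conditional probability of arm A
    return "A" if random.random() < p_a else "B"

# Generate one block of four assignments.
n_a = n_b = 0
seq = []
for _ in range(4):
    arm = pbd_next_assignment(n_a, n_b)
    seq.append(arm)
    n_a += arm == "A"
    n_b += arm == "B"
print(seq)   # every completed block contains two A's and two B's
```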


Assuntos
Modelos Estatísticos , Projetos de Pesquisa , Humanos , Distribuição Aleatória , Viés de Seleção , Probabilidade
3.
BMC Med Res Methodol ; 24(1): 52, 2024 Feb 28.
Article in English | MEDLINE | ID: mdl-38418968

ABSTRACT

BACKGROUND: The design of a multi-center randomized controlled trial (RCT) involves multiple considerations, such as the choice of the sample size, the number of centers and their geographic location, and the strategy for recruitment of study participants, among others. There are plenty of methods to sequentially randomize patients in a multi-center RCT, with or without considering stratification factors. The goal of this paper is to perform a systematic assessment of such randomization methods for a multi-center 1:1 RCT assuming a competitive policy for the patient recruitment process. METHODS: We considered a Poisson-gamma model for the patient recruitment process with a uniform distribution of center activation times. We investigated 16 randomization methods (4 unstratified, 4 region-stratified, 4 center-stratified, 3 dynamic balancing randomization (DBR) designs, and a complete randomization design) to sequentially randomize n = 500 patients. Statistical properties of the recruitment process and the randomization procedures were assessed using Monte Carlo simulations. The operating characteristics included time to complete recruitment, the number of centers that recruited a given number of patients, several measures of treatment imbalance and estimation efficiency under a linear model for the response, the expected proportions of correct guesses under two different guessing strategies, and the expected proportion of deterministic assignments in the allocation sequence. RESULTS: Maximum tolerated imbalance (MTI) randomization methods such as the big stick design, Ehrenfest urn design, and block urn design result in a better balance-randomness tradeoff than the conventional permuted block design (PBD) with or without stratification. Unstratified randomization, region-stratified randomization, and center-stratified randomization provide control of imbalance at a chosen level (trial, region, or center) but may fail to achieve balance at the other two levels. By contrast, DBR does a very good job controlling imbalance at all 3 levels while maintaining the randomized nature of treatment allocation. Adding more centers to the study helps accelerate the recruitment process, but at the expense of increasing the number of centers that recruit very few (or no) patients, which may increase center-level imbalances for center-stratified and DBR procedures. Increasing the block size or the MTI threshold(s) may help obtain designs with an improved randomness-balance tradeoff. CONCLUSIONS: The choice of a randomization method is an important component of planning a multi-center RCT. Dynamic balancing randomization with carefully chosen MTI thresholds could be a very good strategy for trials with a competitive patient recruitment policy.
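A minimal sketch of one MTI method named above, the big stick design: a fair coin is used until the treatment imbalance reaches the MTI boundary, at which point the assignment is forced to the lagging arm. The sample size and threshold below are illustrative assumptions, and this is not the paper's full simulation framework.

```python
import random

def big_stick_sequence(n_patients=500, mti=3, seed=1):
    """Big stick design: fair coin unless the imbalance hits the MTI boundary."""
    random.seed(seed)
    n_a = n_b = 0
    seq = []
    for _ in range(n_patients):
        imbalance = n_a - n_b
        if imbalance >= mti:       # forced assignment to the lagging arm
            arm = "B"
        elif imbalance <= -mti:
            arm = "A"
        else:                      # otherwise an unrestricted fair coin
            arm = "A" if random.random() < 0.5 else "B"
        seq.append(arm)
        n_a += arm == "A"
        n_b += arm == "B"
    return seq, abs(n_a - n_b)

seq, final_imbalance = big_stick_sequence()
print(final_imbalance)   # the imbalance never exceeds the MTI threshold
```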


Assuntos
Projetos de Pesquisa , Humanos , Distribuição Aleatória , Tamanho da Amostra , Seleção de Pacientes
4.
Microsc Microanal ; 30(2): 306-317, 2024 Apr 29.
Article in English | MEDLINE | ID: mdl-38498601

ABSTRACT

The quantitative description of biological structures is a valuable yet difficult task in the life sciences. This is commonly accomplished by imaging samples using fluorescence microscopy and analyzing resulting images using Pearson's correlation or Manders' co-occurrence intensity-based colocalization paradigms. Though conceptually and computationally simple, these approaches are critically flawed due to their reliance on signal overlap, sensitivity to cursory signal qualities, and inability to differentiate true and incidental colocalization. Point pattern analysis provides a framework for quantitative characterization of spatial relationships between spatial patterns using the distances between observations rather than their overlap, thus overcoming these issues. Here we introduce an image analysis tool called Spatial Pattern Analysis using Closest Events (SPACE) that leverages nearest neighbor-based point pattern analysis to characterize the spatial relationship of fluorescence microscopy signals from image data. The utility of SPACE is demonstrated by assessing the spatial association between mRNA and cell nuclei from confocal images of cardiac myocytes. Additionally, we use synthetic and empirical images to characterize the sensitivity of SPACE to image segmentation parameters and cursory image qualities such as signal abundance and image resolution. Ultimately, SPACE delivers performance superior to traditional colocalization methods and offers a valuable addition to the microscopist's toolbox.
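The sketch below illustrates the nearest-neighbor idea behind SPACE: compare observed signal-to-reference distances against a complete-spatial-randomness null. The point patterns, window size, and number of Monte Carlo replicates are made up for illustration; this is not the published SPACE implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

# Hypothetical point patterns extracted from two segmented fluorescence channels.
signal = rng.uniform(0, 100, size=(300, 2))     # e.g. mRNA spot centroids
reference = rng.uniform(0, 100, size=(50, 2))   # e.g. nuclear centroids

observed = cKDTree(reference).query(signal)[0]  # nearest-neighbor distances, signal -> reference

# Monte Carlo null: relocate the signal pattern uniformly (complete spatial randomness).
null_means = []
for _ in range(200):
    shuffled = rng.uniform(0, 100, size=signal.shape)
    null_means.append(cKDTree(reference).query(shuffled)[0].mean())

# Distances shorter than the null suggest spatial association rather than mere signal overlap.
print(observed.mean(), np.mean(null_means))
```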


Subjects
Computer-Assisted Image Processing, Fluorescence Microscopy, Computer-Assisted Image Processing/methods, Fluorescence Microscopy/methods, Cardiac Myocytes, Animals, Cell Nucleus, Spatial Analysis, Messenger RNA/genetics, Messenger RNA/analysis, Confocal Microscopy/methods
5.
Int J Mol Sci ; 25(12)2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38928027

ABSTRACT

A hypothesis is presented to explain how the ageing process might be influenced by optimizing mitochondrial efficiency to reduce intracellular entropy. Research-based quantifications of entropy are scarce. Non-equilibrium metabolic reactions and compartmentalization were found to contribute most to lowering entropy in the cells. Like the cells, mitochondria are thermodynamically open systems exchanging matter and energy with their surroundings, the rest of the cell. Based on calculations from cancer cells, glycolysis was reported to produce less entropy than mitochondrial oxidative phosphorylation. However, these estimations depended on the CO2 concentration, so that at slightly increased CO2 it was oxidative phosphorylation that produced less entropy. Also, the thermodynamic efficiency of mitochondrial respiratory complexes varies depending on the respiratory state and oxidant/antioxidant balance. Therefore, despite long-standing theoretical and practical efforts, more measurements are needed to resolve the issue, including in isolated mitochondria with intact and with suboptimal respiration. Entropy increases in ageing while mitochondrial efficiency of energy conversion, quality control, and turnover mechanisms deteriorate. Optimally functioning mitochondria are necessary to meet energy demands for cellular defence and repair processes to attenuate ageing. The intuitive approach of simply supplying more metabolic fuels (more nutrients) often has the opposite effect, namely a decrease in energy production in the case of nutrient overload. Excessive nutrient intake and obesity accelerate ageing, while calorie restriction without malnutrition can prolong life. Balanced nutrient intake adapted to needs, that is, to an activity-based high ATP requirement, increases mitochondrial respiratory efficiency and leads to multiple alterations in gene expression and metabolic adaptations. Therefore, rather than overfeeding, it is necessary to fine-tune energy production by optimizing mitochondrial function and reducing oxidative stress; the evidence is discussed in this paper.


Subjects
Aging, Entropy, Mitochondria, Reactive Oxygen Species, Mitochondria/metabolism, Humans, Aging/metabolism, Reactive Oxygen Species/metabolism, Animals, Energy Metabolism, Oxidative Stress, Oxidative Phosphorylation
6.
Behav Res Methods ; 56(7): 7831-7848, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38954396

ABSTRACT

Whether and how well people can behave randomly is of interest in many areas of psychological research. The ability to generate randomness is often investigated using random number generation (RNG) tasks, in which participants are asked to generate a sequence of numbers that is as random as possible. However, there is no consensus on how best to quantify the randomness of responses in human-generated sequences. Traditionally, psychologists have used measures of randomness that directly assess specific features of human behavior in RNG tasks, such as the tendency to avoid repetition or to systematically generate numbers that have not been generated in the recent choice history, a behavior known as cycling. Other disciplines have proposed measures of randomness that are based on a more rigorous mathematical foundation and are less restricted to specific features of randomness, such as algorithmic complexity. More recently, variants of these measures have been proposed to assess systematic patterns in short sequences. We report the first large-scale integrative study to compare measures of specific aspects of randomness with entropy-derived measures based on information theory and measures based on algorithmic complexity. We compare the ability of the different measures to discriminate between human-generated sequences and truly random sequences based on atmospheric noise, and provide a systematic analysis of how the usefulness of randomness measures is affected by sequence length. We conclude with recommendations that can guide the selection of appropriate measures of randomness in psychological research.
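Two of the simplest measures discussed above can be written in a few lines: the rate of immediate repetitions (a feature-specific measure) and the Shannon entropy of adjacent digrams (an information-theoretic one). The sequences below are hypothetical stand-ins for a cycling participant and a random source, not data from the study.

```python
import math
import random
from collections import Counter

def repetition_rate(seq):
    """Fraction of immediate repetitions; humans typically suppress these below chance."""
    return sum(a == b for a, b in zip(seq, seq[1:])) / (len(seq) - 1)

def digram_entropy(seq):
    """Shannon entropy (in bits) of adjacent pairs, one simple information-theoretic measure."""
    counts = Counter(zip(seq, seq[1:]))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

random.seed(3)
cycling = [1, 2, 3, 4, 5, 6, 7, 8, 9, 0] * 10              # hypothetical "cycling" participant
truly_random = [random.randrange(10) for _ in range(100)]  # stand-in for an ideal random source

print(repetition_rate(cycling), repetition_rate(truly_random))   # ~0.00 vs ~0.10 expected
print(digram_entropy(cycling), digram_entropy(truly_random))     # low vs higher digram entropy
```

As the paper notes, how informative such measures are depends strongly on sequence length.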


Assuntos
Algoritmos , Humanos , Masculino , Feminino , Teoria da Informação , Adulto , Entropia
7.
Entropy (Basel) ; 26(4)2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38667895

ABSTRACT

We investigate whether it is possible to distinguish chaotic time series from random time series using network theory. To this end, we selected four methods to generate graphs from time series: the natural visibility graph, the horizontal visibility graph, the limited penetrable horizontal visibility graph, and the phase space reconstruction method. These methods have been claimed to distinguish chaos from randomness by studying the degree distribution of the generated graphs. We evaluated the methods by computing results for chaotic time series from 2D torus automorphisms, the chaotic Lorenz system, and a random sequence drawn from the normal distribution. Although the results confirm previous studies, we found that the distinction of chaos from randomness is not generally possible in the context of the above methodologies.
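A compact sketch of one of the four constructions, the horizontal visibility graph: two samples are linked if every sample strictly between them lies below both. Here the logistic map at r = 4 stands in for a chaotic series and i.i.d. Gaussian noise for a random one; the series length and the simple degree-count comparison are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

def horizontal_visibility_degrees(x):
    """Degree sequence of the horizontal visibility graph of a time series.

    Samples i < j are linked if every sample between them is lower than both x[i] and x[j].
    """
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                degree[i] += 1
                degree[j] += 1
    return degree

n_samples = 300
rng = np.random.default_rng(4)
noise = rng.normal(size=n_samples)          # i.i.d. random series

chaos = np.empty(n_samples)                 # logistic map at r = 4 as a simple chaotic series
chaos[0] = 0.4
for t in range(1, n_samples):
    chaos[t] = 4 * chaos[t - 1] * (1 - chaos[t - 1])

# The claimed distinction rests on differences in the two degree distributions.
print(np.bincount(horizontal_visibility_degrees(noise)))
print(np.bincount(horizontal_visibility_degrees(chaos)))
```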

8.
Stud Hist Philos Sci ; 106: 196-207, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39059029

ABSTRACT

The first formal definition of randomness, seen as a property of sequences of events or experimental outcomes, dates back to Richard von Mises' work in the foundations of probability and statistics. The randomness notion introduced by von Mises is nowadays widely regarded as being too weak. This is, to a large extent, due to the work of Jean Ville, which is often described as having dealt the death blow to von Mises' approach, and which was integral to the development of algorithmic randomness-the now-standard theory of randomness for elements of a probability space. The main goal of this article is to trace the history and provide an in-depth appraisal of two lesser-known, yet historically and methodologically notable proposals for how to modify von Mises' definition so as to avoid Ville's objection. The first proposal is due to Abraham Wald, while the second one is due to Claus-Peter Schnorr. We show that, once made precise in a natural way using computability theory, Wald's proposal constitutes a much more radical departure from von Mises' framework than intended. Schnorr's proposal, on the other hand, does provide a partial vindication of von Mises' approach: it demonstrates that it is possible to obtain a satisfactory randomness notion-indeed, a canonical algorithmic randomness notion-by characterizing randomness in terms of the invariance of limiting relative frequencies. More generally, we argue that Schnorr's proposal, together with a number of little-known related results, reveals that there is more continuity than typically acknowledged between von Mises' approach and algorithmic randomness. Even though von Mises' exclusive focus on limiting relative frequencies did not survive the passage to the theory of algorithmic randomness, another crucial aspect of his conception of randomness did endure; namely, the idea that randomness amounts to a certain type of stability or invariance under an appropriate class of transformations.


Assuntos
Algoritmos , História do Século XX , Probabilidade , História do Século XIX
9.
Stroke ; 54(7): 1909-1919, 2023 07.
Article in English | MEDLINE | ID: mdl-37078281

ABSTRACT

From 2016 to 2021, the National Institutes of Health Stroke Trials Network funded by the National Institutes of Health/National Institute of Neurological Disorders and Stroke initiated ten multicenter randomized controlled clinical trials. An optimal subject randomization design must satisfy 4 critical properties: (1) protection of treatment assignment randomness, (2) achievement of the desired treatment allocation ratio, (3) balancing of baseline covariates, and (4) ease of implementation. For acute stroke trials, it is necessary to minimize the time between eligibility assessment and treatment initiation. This article reviews the randomization designs for 3 trials currently enrolling in the Stroke Trials Network funded by the National Institutes of Health/National Institute of Neurological Disorders and Stroke: the SATURN (Statins in Intracerebral Hemorrhage Trial), the MOST (Multiarm Optimization of Stroke Thrombolysis Trial), and the FASTEST (Recombinant Factor VIIa for Hemorrhagic Stroke Trial). Randomization methods utilized in these trials include minimal sufficient balance, the block urn design, the big stick design, and step-forward randomization. Their advantages and limitations are reviewed and compared with those of the traditional stratified permuted block design and minimization.


Assuntos
National Institute of Neurological Disorders and Stroke (USA) , Acidente Vascular Cerebral , Humanos , Hemorragia Cerebral/terapia , Estudos Multicêntricos como Assunto , National Institutes of Health (U.S.) , Distribuição Aleatória , Acidente Vascular Cerebral/tratamento farmacológico , Estados Unidos , Ensaios Clínicos Controlados Aleatórios como Assunto
10.
Stat Med ; 42(3): 228-245, 2023 02 10.
Article in English | MEDLINE | ID: mdl-36415044

ABSTRACT

Explained variation is well understood under linear regression models and has been extended to models for survival data. In this article, we consider the mixture cure models. We propose two approaches to define explained variation under the mixture cure models, one based on the Kullback-Leibler information gain and the other based on residual sum of squares. We show that the proposed measures have desired properties as measures of explained variation, similar to those under other regression models. A simulation study is conducted to demonstrate the properties of the proposed measures. They are also applied to real data analyses to illustrate the use of explained variation.
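For intuition, the residual-sum-of-squares notion of explained variation takes the familiar form 1 - RSS(model)/RSS(null). The sketch below shows it for ordinary regression on simulated data; the paper adapts both this idea and the Kullback-Leibler version to mixture cure models, which is not reproduced here.

```python
import numpy as np

def explained_variation_rss(y, y_hat):
    """Residual-sum-of-squares explained variation: 1 - RSS(model) / RSS(null)."""
    rss_model = np.sum((y - y_hat) ** 2)
    rss_null = np.sum((y - y.mean()) ** 2)   # intercept-only (null) model
    return 1 - rss_model / rss_null

rng = np.random.default_rng(5)
x = rng.normal(size=200)
y = 2 * x + rng.normal(size=200)             # signal variance 4, noise variance 1
slope, intercept = np.polyfit(x, y, 1)
print(explained_variation_rss(y, slope * x + intercept))   # close to 0.8 for this signal-to-noise ratio
```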


Assuntos
Modelos Estatísticos , Humanos , Modelos de Riscos Proporcionais , Simulação por Computador , Modelos Lineares , Análise de Sobrevida
11.
Proc Natl Acad Sci U S A ; 117(47): 29555-29560, 2020 Nov 24.
Article in English | MEDLINE | ID: mdl-33154159

ABSTRACT

The exotic properties of quantum spin liquids (QSLs) have continually been of interest since Anderson's 1973 ground-breaking idea. Geometrical frustration, quantum fluctuations, and low dimensionality are the material characteristics most often invoked as favoring a long-range fluctuating spin state that does not freeze into an ordered magnet or a spin glass at low temperatures. Among the few known QSL candidates, organic crystals have the advantage of a rich chemistry capable of finely tuning their microscopic parameters. Here, we demonstrate the emergence of a QSL state in [EDT-TTF-CONH2]2 +[[Formula: see text]] (EDT-BCO), in which spin-1/2 EDT molecules on a triangular lattice form layers separated by a sublattice of BCO molecular rotors. Through several magnetic measurements, we show that the subtle random potential of frozen BCO Brownian rotors suppresses magnetic order down to the lowest temperatures. Our study identifies the relevance of disorder in the stabilization of QSLs.

12.
Acta Biotheor ; 71(2): 12, 2023 Mar 18.
Article in English | MEDLINE | ID: mdl-36933070

ABSTRACT

Mutations are often described as being "random with respect to fitness." Here we show that the experiments used to establish randomness with respect to fitness are only capable of showing that mutations are random with respect to current external selection. Current debates about whether or not mutations are directed may be at least partially resolved by making use of this distinction. Additionally, this distinction has important mathematical, experimental, and inferential implications.


Assuntos
Condicionamento Físico Animal , Seleção Genética , Animais , Aptidão Genética , Mutação
13.
Entropy (Basel) ; 25(10)2023 Sep 28.
Article in English | MEDLINE | ID: mdl-37895511

ABSTRACT

Null models are crucial tools for investigating network topological structures. However, research on null models for higher-order networks is still relatively scarce. In this study, we introduce an innovative method to construct null models for hypergraphs, namely the hyperedge swapping-based method. By preserving certain network properties while altering others, we generate six hyper-null models with various orders and analyze their interrelationships. To validate our approach, we first employ hypergraph entropy to assess the randomness of these null models across four datasets. Furthermore, we examine the differences in important statistical properties between the various null models and the original networks. Lastly, we investigate the impact of hypergraph randomness on network dynamics using the proposed hyper-null models, focusing on dismantling and epidemic contagion. The findings show that our proposed hyper-null models are applicable to various scenarios. By introducing a comprehensive framework for generating and analyzing hyper-null models, this research opens up avenues for further exploration of the intricacies of network structures and their real-world implications.
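One common form of hyperedge swapping, shown below, exchanges a node between two hyperedges so that hyperedge sizes and node hyperdegrees are both preserved. It is an illustrative variant under assumed conventions, not necessarily the exact construction behind each of the paper's six hyper-null models.

```python
import random

def hyperedge_swap(hyperedges, n_swaps=1000, seed=6):
    """Randomize a hypergraph by swapping nodes between pairs of hyperedges.

    Each accepted swap preserves every hyperedge's size and every node's hyperdegree.
    """
    random.seed(seed)
    edges = [set(e) for e in hyperedges]
    for _ in range(n_swaps):
        e1, e2 = random.sample(range(len(edges)), 2)
        only1 = list(edges[e1] - edges[e2])
        only2 = list(edges[e2] - edges[e1])
        if not only1 or not only2:
            continue                      # nothing swappable without creating duplicate members
        u, v = random.choice(only1), random.choice(only2)
        edges[e1].remove(u); edges[e1].add(v)
        edges[e2].remove(v); edges[e2].add(u)
    return edges

original = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {1, 5}]
print(hyperedge_swap(original))
```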

14.
Entropy (Basel) ; 25(3)2023 Mar 04.
Article in English | MEDLINE | ID: mdl-36981338

ABSTRACT

In the large-scale measurement field, deployment planning usually uses the Monte Carlo method for simulation analysis, which has high algorithmic complexity. At the same time, traditional station planning is inefficient and unable to calculate overall accessibility due to the occlusion of tooling. To solve this problem, in this study we first introduced a Poisson-like randomness strategy and an enhanced randomness strategy to improve the remora optimization algorithm (ROA), yielding the PROA. Its convergence speed and robustness were verified in different dimensions using the CEC benchmark functions: the PROA converges faster than the ROA on 67.5-74% of the results and is more robust on 66.67-75%. Second, a deployment model was established for the large-scale measurement field to obtain the maximum visible area of the target to be measured. Finally, the PROA was used as the optimizer to solve the optimal deployment planning, and its performance was verified by simulation analysis. In the case of six stations, the maximum visible area reached by the PROA is 83.02%, which is 18.07% higher than that of the ROA. Compared with the traditional method, this model shortens the deployment time and calculates the overall accessibility, which is of practical significance for improving assembly efficiency in large-scale measurement field environments.

15.
Entropy (Basel) ; 25(1)2023 Jan 13.
Article in English | MEDLINE | ID: mdl-36673302

ABSTRACT

Assuming that there is no way of sending signals that propagate faster than light and that free will exists, the observed loophole-free violation of Bell's inequalities demonstrates that at least one of three fundamental hypotheses involved in the derivation and observation of the inequalities is false: Locality, Realism, or Ergodicity. An experiment is proposed to obtain evidence about which one is false. It is based on recording the time evolution of the rate of non-random series of outcomes generated in a specially designed Bell setup. The results of such an experiment would be important not only to the foundations of Quantum Mechanics, but would also have an immediate practical impact on the efficient use of quantum-based random number generators and the security of Quantum Key Distribution using entangled states.

16.
Entropy (Basel) ; 25(12)2023 Nov 28.
Article in English | MEDLINE | ID: mdl-38136472

ABSTRACT

Random pulse computing (RPC), the third paradigm alongside digital and quantum computing, draws inspiration from biology, particularly the functioning of neurons. Here, we study information processing in random pulse computing circuits intended for the summation of numbers. Based on the information-theoretic merits of entropy budget and relative Kolmogorov-Sinai entropy, we investigate the prior art and propose new circuits: three deterministic adders with significantly improved output entropy and one exact nondeterministic adder that requires much less additional entropy than the previous art. All circuits are realized and tested experimentally, using quantum entropy sources and reconfigurable logic devices. Not only do the proposed circuits yield a precise mathematical result and have output entropy near the maximum, satisfying the needs of a programmable random pulse computer, but they also provide affordable hardware options for generating additional entropy.
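As background, the textbook random-pulse (stochastic computing) adder is a multiplexer driven by an auxiliary random select stream, whose output rate approximates the scaled sum (p_a + p_b)/2. The sketch below simulates that baseline in software with assumed pulse probabilities; the improved deterministic and exact nondeterministic adders proposed in the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
p_a, p_b = 0.30, 0.55            # numbers encoded as pulse probabilities

a = rng.random(n) < p_a          # independent random pulse trains
b = rng.random(n) < p_b
select = rng.random(n) < 0.5     # auxiliary entropy source driving the multiplexer

out = np.where(select, a, b)     # classic MUX adder: output rate approximates (p_a + p_b) / 2

print(out.mean(), (p_a + p_b) / 2)
```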

17.
Educ Stud Math ; 112(1): 3-24, 2023.
Article in English | MEDLINE | ID: mdl-36277373

ABSTRACT

The many studies with coin-tossing tasks in the literature show that the concept of randomness is challenging for adults as well as children. Systematic errors observed in coin-tossing tasks are often related to the representativeness heuristic, a mental shortcut used to judge randomness by evaluating how well a set of random events represents the typical example of random events we hold in mind. Representative thinking is explained by our tendency to seek patterns in our surroundings. In the present study, the coin-toss predictions of 302 third-graders were explored. The findings suggest that in the third grade of elementary school, children make correct as well as different types of erroneous predictions, and individual differences exist. Moreover, erroneous predictions that were in line with representative thinking were positively associated with an early spontaneous focus on regularities, assessed when the children were in their second year of preschool. We concluded that previous studies might have underestimated children's reasoning about randomness in coin-tossing contexts and that representative thinking is indeed associated with pattern-based thinking tendencies.

18.
Semin Cell Dev Biol ; 97: 86-92, 2020 01.
Article in English | MEDLINE | ID: mdl-31301356

ABSTRACT

A fundamental question in evolutionary biology is how heritable variability originates. Darwinian evolution placed a central focus on the idea that random variations are the material basis for the action of natural selection. Although randomness in Darwinian evolution can be interpreted from different angles, here I will focus on the assumption of equiprobability of mutations. I have reviewed the literature on epigenetic mechanisms causing biased genetic variability. Although it is interesting to find correlations between somatic epigenetic marks and evolution, causation between epigenetic changes and genomic evolutionary novelties can only be established when the epigenetic changes are interrogated in the germ line. Epigenetic changes are reported to influence the emergence of single nucleotide polymorphisms and copy number variations. On the other hand, epigenetic changes are known to be influenced by environmental exposures. This dual ability of epigenetic changes could mean that germ-line, epigenetically influenced mutations have an important role in the emergence of genomic evolutionary novelties. The emerging knowledge on the relation between epigenetic changes and mutations will help us understand an underappreciated role of the environment in speciation and genomic divergence: that of an influencer of genomic changes.


Assuntos
Evolução Biológica , Epigênese Genética/genética , Genômica/métodos , Genótipo , Humanos
19.
Stat Med ; 41(10): 1846-1861, 2022 05 10.
Article in English | MEDLINE | ID: mdl-35176811

ABSTRACT

Minimal sufficient balance (MSB) is a recently suggested method for adaptively controlling covariate imbalance in randomized controlled trials in a manner that reduces the impact on allocation randomness relative to other approaches by intervening only when the imbalance is sufficiently significant. Despite these improvements, the approach cannot weight covariates by their relative clinical importance or by the magnitude of their imbalance, and it ignores any imbalance that is not statistically significant, even when such imbalances may collectively justify intervention. We propose the common scale MSB (CS-MSB) method, which addresses these limitations, and present simulation studies comparing our proposed method to MSB. We demonstrate that CS-MSB requires less intervention than MSB to achieve the same level of covariate balance, and does not adversely impact either statistical power or Type-I error.
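A schematic of the underlying minimal-sufficient-balance idea for a single continuous covariate: test the current between-arm imbalance and intervene with a biased coin only when it is significant. The significance level and bias below are assumed values, and the published MSB and CS-MSB procedures combine evidence across many covariates rather than testing one; this sketch is for orientation only.

```python
import numpy as np
from scipy import stats

def msb_style_probability(x_new, x_a, x_b, alpha=0.05, bias=0.7):
    """Return the probability of assigning the new patient to arm A.

    Intervene with a biased coin only when the covariate imbalance is significant;
    otherwise keep the allocation a fair coin.
    """
    if len(x_a) > 1 and len(x_b) > 1:
        _, p_value = stats.ttest_ind(x_a, x_b, equal_var=False)
        if p_value < alpha:
            # Favor the arm whose post-assignment mean would sit closer to the other arm's.
            gap_if_a = abs(np.mean(np.append(x_a, x_new)) - np.mean(x_b))
            gap_if_b = abs(np.mean(x_a) - np.mean(np.append(x_b, x_new)))
            return bias if gap_if_a < gap_if_b else 1 - bias
    return 0.5  # imbalance not significant: leave the allocation fully random

# New patient with covariate value 55 joins a trial where arm A currently runs high.
print(msb_style_probability(55.0, np.array([70, 72, 68, 71.0]), np.array([60, 58, 62, 61.0])))  # -> 0.7
```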


Assuntos
Projetos de Pesquisa , Simulação por Computador , Humanos , Razão de Chances , Distribuição Aleatória
20.
Proc Natl Acad Sci U S A ; 116(46): 23091-23099, 2019 11 12.
Article in English | MEDLINE | ID: mdl-31659052

ABSTRACT

Conventional kinesin, responsible for directional transport of cellular vesicles, takes multiple nearly uniform 8.2-nm steps by consuming one ATP molecule per step as it walks toward the plus end of the microtubule (MT). Despite decades of intensive experimental and theoretical studies, there are gaps in the elucidation of key steps in the catalytic cycle of kinesin. How the motor waits for ATP to bind to the leading head is controversial. Two experiments using a similar protocol have arrived at different conclusions. One asserts that kinesin waits for ATP in a state with both the heads bound to the MT, whereas the other shows that ATP binds to the leading head after the trailing head detaches. To discriminate between the 2 scenarios, we developed a minimal model, which analytically predicts the outcomes of a number of experimental observable quantities (the distribution of run length, the distribution of velocity [[Formula: see text]], and the randomness parameter) as a function of an external resistive force (F) and ATP concentration ([T]). The differences in the predicted bimodality in [Formula: see text] as a function of F between the 2 models may be amenable to experimental testing. Most importantly, we predict that the F and [T] dependence of the randomness parameters differ qualitatively depending on the waiting states. The randomness parameters as a function of F and [T] can be quantitatively measured from stepping trajectories with very little prejudice in data analysis. Therefore, an accurate measurement of the randomness parameter and the velocity distribution as a function of load and nucleotide concentration could resolve the apparent controversy.
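For a renewal stepper, the randomness parameter reduces to the squared coefficient of variation of the dwell time between steps, Var(tau)/mean(tau)^2, so k sequential rate-limiting transitions per step give r = 1/k. The sketch below checks this identity with gamma-distributed dwell times; it is a generic renewal-process relation, not the authors' specific kinetic model for the two waiting-state scenarios.

```python
import numpy as np

rng = np.random.default_rng(8)

def randomness_parameter(dwell_times):
    """Randomness parameter of a renewal stepping process: Var(tau) / mean(tau)^2."""
    tau = np.asarray(dwell_times)
    return tau.var() / tau.mean() ** 2

# Hypothetical dwell times: k sequential exponential (rate-limiting) transitions per 8.2-nm step.
for k in (1, 2, 4):
    tau = rng.gamma(shape=k, scale=1.0 / k, size=50_000)   # mean dwell time fixed at 1
    print(k, round(randomness_parameter(tau), 3))           # approaches 1/k
```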


Subjects
Adenosine Triphosphate/metabolism, Kinesins/metabolism, Microtubules/metabolism, Chemical Models, Kinetics