Results 1 - 14 of 14
1.
Biomed Opt Express ; 14(12): 6442-6469, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-38420310

ABSTRACT

Optical tweezers (OT) have become an essential technique in several fields of physics, chemistry, and biology as precise micromanipulation tools and microscopic force transducers. Quantitative measurements require accurate calibration of the trap stiffness and of the diffusion constant of the optically trapped particle. This is typically done with statistical estimators constructed from the position signal of the particle, which is recorded by a digital camera or a quadrant photodiode. The finite integration time and sampling frequency of the detector need to be properly taken into account. Here, we present a general approach based on the joint probability density function of the sampled trajectory that corrects exactly the biases due to the detector's finite integration time and limited sampling frequency, providing theoretical formulas for the most widely employed calibration methods: equipartition, mean squared displacement, autocorrelation, power spectral density, and force reconstruction via maximum-likelihood-estimator analysis (FORMA). Our results, tested with experiments and Monte Carlo simulations, will permit users of OT to confidently estimate the trap stiffness and diffusion constant, extending their use to a broader set of experimental conditions.
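
The equipartition method named in this abstract can be sketched as follows: simulate a trapped bead as an Ornstein-Uhlenbeck process and recover the stiffness from k = k_B T / Var(x). All parameter values (`k_true`, `gamma`, `dt`) are illustrative choices, not from the paper, and the sketch deliberately omits the finite-integration-time corrections that are the paper's actual contribution.

```python
import numpy as np

# Illustrative parameters (not from the paper)
kB_T = 4.11e-21       # thermal energy at room temperature, J
k_true = 1e-6         # trap stiffness, N/m
gamma = 6e-9          # drag coefficient, N*s/m
dt = 1e-4             # sampling interval, s
n_steps = 200_000

rng = np.random.default_rng(0)

# Overdamped bead in a harmonic trap: Ornstein-Uhlenbeck process,
# using the exact discrete-time update for step dt.
tau = gamma / k_true                          # trap relaxation time
a = np.exp(-dt / tau)
sigma = np.sqrt(kB_T / k_true * (1 - a**2))   # keeps the stationary variance exact
x = np.empty(n_steps)
x[0] = 0.0
noise = rng.standard_normal(n_steps - 1)
for i in range(1, n_steps):
    x[i] = a * x[i - 1] + sigma * noise[i - 1]

# Equipartition estimate: k = kB*T / Var(x)
k_est = kB_T / np.var(x)
print(f"true k = {k_true:.3e} N/m, equipartition estimate = {k_est:.3e} N/m")
```

Because the discrete update is exact, the only error here is statistical; a real detector's finite integration time would bias Var(x) downward and hence overestimate k, which is the bias the paper corrects.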

2.
Phys Rev E ; 103(5-1): 052121, 2021 May.
Article in English | MEDLINE | ID: mdl-34134259

ABSTRACT

A 1929 Gedankenexperiment proposed by Szilárd, often referred to as "Szilárd's engine", has served as a foundation for computing fundamental thermodynamic bounds to information processing. While Szilárd's original box could be partitioned into two halves and contained one gas molecule, here we calculate the maximal average work that can be extracted in a system with N particles and q partitions, given an observer who counts the molecules in each partition and a work extraction mechanism limited to pressure equalization. We find that the average extracted work is proportional to the mutual information between the one-particle position and the vector containing the counts of how many particles are in each partition. We optimize this quantity over the initial locations of the dividing walls, and find that there exists a critical number of particles N^{★}(q) below which the extracted work is maximized by a symmetric configuration of the q partitions, and above which the optimal partitioning is asymmetric. Overall, the average extracted work is maximized for a number of particles N̂(q)
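
The mutual-information quantity above can be computed exactly in the simplest coarse-grained case: q = 2 partitions with a wall at position c, i.i.d. uniform particles, and B the binary label "particle 1 is left of the wall" (a simplification of the full one-particle position used in the paper). The count in the left partition is Binomial(N, c), and P(B=1 | n) = n/N:

```python
from math import comb, log

def mutual_information(N, c):
    """I(B; n) in nats: B = 'particle 1 is left of the wall',
    n = number of particles in the left partition, wall at c in (0, 1)."""
    I = 0.0
    for n in range(N + 1):
        pn = comb(N, n) * c**n * (1 - c) ** (N - n)   # P(n) is Binomial(N, c)
        pb1 = n / N                                    # P(B=1 | n)
        if pb1 > 0:
            I += pn * pb1 * log(pb1 / c)
        if pb1 < 1:
            I += pn * (1 - pb1) * log((1 - pb1) / (1 - c))
    return I

# N = 1, symmetric wall: recovers Szilard's classic k_B T ln 2 per cycle
print(mutual_information(1, 0.5))  # -> ln 2 ≈ 0.6931
```

For N = 1 the counter pins down the particle's side exactly, so I = ln 2 and the extractable work is k_B T ln 2; for larger N the information per particle, and hence the work, degrades.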

3.
Phys Rev E ; 102(5-2): 059904, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33327215

ABSTRACT

This corrects the article DOI: 10.1103/PhysRevE.90.050103.

4.
Phys Rev E ; 100(3-1): 032134, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31639925

ABSTRACT

The problem of efficiently reconstructing tomographic images can be mapped into a Bayesian inference problem over the space of pixel densities. Solutions to this problem are given by pixel assignments that are compatible with tomographic measurements and maximize a posterior probability density. This maximization can be performed with standard local optimization tools when the log-posterior is a convex function, but it is generally intractable when introducing realistic nonconcave priors that reflect typical image features such as smoothness or sharpness. We introduce a new method to reconstruct images obtained from Radon projections by using expectation propagation, which allows us to approximate the intractable posterior. We show, by means of extensive simulations, that, compared to state-of-the-art algorithms for this task, expectation propagation paired with very simple but non-log-concave priors is often able to reconstruct images with a smaller error while using less information per pixel. We provide estimates for the critical rate of information per pixel above which recovery is error-free by means of simulations on ensembles of phantom and real images.

5.
Phys Rev E ; 98(2-1): 020102, 2018 Aug.
Article in English | MEDLINE | ID: mdl-30253505

ABSTRACT

We develop a theoretical approach to compute the conditioned spectral density of N×N noninvariant random matrices in the limit N→∞. This large deviation observable, defined as the eigenvalue distribution conditioned to have a fixed fraction k of eigenvalues smaller than x∈R, provides the spectrum of random matrix samples that deviate atypically from the average behavior. We apply our theory to sparse random matrices and unveil strikingly different and generic properties, namely, (i) their conditioned spectral density has compact support, (ii) it does not experience any abrupt transition for k around its typical value, and (iii) its eigenvalues do not accumulate at x. Moreover, our work points towards other types of transitions in the conditioned spectral density for values of k away from its typical value. These properties follow from the weak or absent eigenvalue repulsion in sparse ensembles and they are in sharp contrast to those displayed by classic or rotationally invariant random matrices. The exactness of our theoretical findings is confirmed through numerical diagonalization of finite random matrices.
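
The observable described above can be probed empirically at small N by brute force: sample sparse Erdős-Rényi adjacency matrices, record the fraction of eigenvalues below x, and average the spectrum over the atypical samples only. The sizes and the mean degree below are arbitrary illustration values, and crude conditioning on the top decile stands in for the paper's exact large-deviation calculation:

```python
import numpy as np

rng = np.random.default_rng(2)
N, c, x = 50, 4.0, 0.0          # matrix size, mean degree, threshold
p = c / N
n_samples = 400

fractions, spectra = [], []
for _ in range(n_samples):
    # symmetric Erdos-Renyi adjacency matrix with edge probability c/N
    upper = rng.random((N, N)) < p
    A = np.triu(upper, 1)
    A = (A + A.T).astype(float)
    ev = np.linalg.eigvalsh(A)
    fractions.append(np.mean(ev < x))
    spectra.append(ev)

fractions = np.array(fractions)
spectra = np.array(spectra)

# Empirical "conditioned" spectrum: average over samples whose fraction
# of eigenvalues below x is atypically large (top decile)
atypical = spectra[fractions >= np.quantile(fractions, 0.9)]
print("typical fraction below x:", fractions.mean())
print("conditioned mean eigenvalue:", atypical.mean())
```

Brute-force sampling only reaches mildly atypical values of k; the point of the paper's theory is precisely to access the exponentially rare regime that such direct simulation cannot.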

6.
Sci Rep ; 8(1): 4825, 2018 Mar 19.
Article in English | MEDLINE | ID: mdl-29556047

ABSTRACT

When complex systems are driven to depletion by some external factor, their non-stationary dynamics can present an intermittent behaviour between relative tranquility and bursts of activity whose consequences are often catastrophic. To understand and ultimately be able to predict such dynamics, we propose an underlying mechanism based on sharp thresholds of a local generalized energy density that naturally leads to negative feedback. We find a transition from a continuous regime to an intermittent one, in which avalanches can be predicted despite the stochastic nature of the process. This model may have applications in many natural and social complex systems where a rapid depletion of resources or generalized energy drives the dynamics. In particular, we show how this model accurately describes the time evolution and avalanches present in a real social system.
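
The threshold-plus-negative-feedback mechanism can be caricatured with a sandpile-like toy model: slow driving, a sharp local threshold, and dissipative redistribution acting as the depletion/negative-feedback term. This is a generic sketch, not the authors' model; all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites, threshold, n_steps = 100, 1.0, 20_000
energy = rng.random(n_sites) * threshold     # local generalized energy density
avalanche_sizes = []

for _ in range(n_steps):
    # slow external drive: deposit energy on a random site
    energy[rng.integers(n_sites)] += 0.1
    # sharp local threshold: overloaded sites dump their energy onto
    # random sites, with 10% dissipated (the negative-feedback/depletion term)
    size = 0
    active = np.flatnonzero(energy > threshold)
    while active.size:
        size += active.size
        for i in active:
            dumped = 0.9 * energy[i]
            energy[i] = 0.0
            targets = rng.integers(0, n_sites, size=2)
            np.add.at(energy, targets, dumped / 2)
        active = np.flatnonzero(energy > threshold)
    if size:
        avalanche_sizes.append(size)

print("avalanches:", len(avalanche_sizes), "max size:", max(avalanche_sizes))
```

The dissipation guarantees every avalanche terminates and keeps the total energy bounded, producing the intermittent alternation between quiet loading and bursts that the abstract describes.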

7.
Entropy (Basel) ; 20(11)2018 Nov 17.
Article in English | MEDLINE | ID: mdl-33266609

ABSTRACT

Pseudo-random number generators are widely used in many branches of science, mainly in applications related to Monte Carlo methods, although they are deterministic in design and, therefore, unsuitable for tackling fundamental problems in security and cryptography. The natural laws of the microscopic realm provide a fairly simple method to generate non-deterministic sequences of random numbers, based on measurements of quantum states. In practice, however, the experimental devices on which quantum random number generators are based are often unable to pass some tests of randomness. In this review, we briefly discuss two such tests, point out the challenges that we have encountered in experimental implementations and finally present a fairly simple method that successfully generates non-deterministic maximally random sequences.
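
One of the simplest randomness tests of the kind this review discusses is a Borel-normality-style frequency check: every binary word of length m should appear among non-overlapping m-blocks with frequency close to 2^-m. The tolerance sqrt(log2(n)/n) follows Calude's formulation of the criterion; treat it as an assumption of this sketch.

```python
import numpy as np
from itertools import product

def borel_normality_check(bits, max_block=2):
    """Check that every binary word of length m <= max_block occurs among
    non-overlapping m-blocks with frequency within sqrt(log2(n)/n) of 2^-m.
    Returns (word -> empirical frequency, overall pass flag)."""
    n = len(bits)
    s = "".join(map(str, bits))
    tol = np.sqrt(np.log2(n) / n)
    ok, freqs = True, {}
    for m in range(1, max_block + 1):
        blocks = [s[i:i + m] for i in range(0, n - m + 1, m)]
        for word in ("".join(w) for w in product("01", repeat=m)):
            f = blocks.count(word) / len(blocks)
            freqs[word] = f
            if abs(f - 2.0**-m) > tol:
                ok = False
    return freqs, ok

rng = np.random.default_rng(4)
bits = rng.integers(0, 2, 100_000)       # stand-in for a QRNG output stream
freqs, ok = borel_normality_check(bits)
print("passes:", ok, " freq('01') =", round(freqs["01"], 4))
```

A pseudo-random stream passes such frequency tests easily; the review's point is that physical quantum devices, despite being non-deterministic in principle, often fail them because of experimental imperfections.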

8.
Sci Rep ; 7(1): 3096, 2017 06 08.
Article in English | MEDLINE | ID: mdl-28596593

ABSTRACT

Random number generation plays an essential role in technology, with important applications in areas ranging from cryptography to Monte Carlo methods and other probabilistic algorithms. All such applications require high-quality sources of random numbers, yet effective methods for assessing whether a source produces truly random sequences are still missing. Current methods either do not rely on a formal description of randomness (the NIST test suite) or are inapplicable in principle (the characterization derived from Algorithmic Information Theory), since the latter requires testing all possible computer programs that could produce the sequence under analysis. Here we present a rigorous method that overcomes these problems based on Bayesian model selection. We derive analytic expressions for a model's likelihood, which is then used to compute its posterior distribution. Our method proves to be more rigorous than NIST's suite and the Borel-normality criterion, and its implementation is straightforward. We applied our method to an experimental device based on the process of spontaneous parametric downconversion to confirm that it behaves as a genuine quantum random number generator. As our approach relies on Bayesian inference, our scheme transcends individual sequence analysis, leading to a characterization of the source itself.
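
The Bayesian model selection idea can be shown in miniature with two models of a bit stream: a fair i.i.d. coin versus a coin with unknown bias under a Beta prior, compared through their exact marginal likelihoods (a textbook Beta-Bernoulli computation, far simpler than the model class used in the paper):

```python
from math import lgamma, log

def log_evidence_fair(n_heads, n_tails):
    # P(data | fair coin): every bit has probability 1/2
    return -(n_heads + n_tails) * log(2)

def log_evidence_biased(n_heads, n_tails, a=1.0, b=1.0):
    # Marginal likelihood under a Beta(a, b) prior on the bias:
    # B(a + h, b + t) / B(a, b), computed in log space
    def lbeta(x, y):
        return lgamma(x) + lgamma(y) - lgamma(x + y)
    return lbeta(a + n_heads, b + n_tails) - lbeta(a, b)

# A balanced sequence favours the fair model (positive log Bayes factor)...
print(log_evidence_fair(5000, 5000) - log_evidence_biased(5000, 5000))
# ...while a skewed one strongly favours the biased model (negative)
print(log_evidence_fair(6000, 4000) - log_evidence_biased(6000, 4000))
```

Because the comparison is between source models rather than a property of one sequence, accumulating data sharpens the verdict about the generator itself, which is the sense in which the approach "transcends individual sequence analysis".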

9.
Phys Rev Lett ; 117(10): 104101, 2016 Sep 02.
Article in English | MEDLINE | ID: mdl-27636476

ABSTRACT

We present a general method to obtain the exact rate function Ψ_{[a,b]}(k) controlling the large deviation probability Prob[I_{N}[a,b]=kN]≍e^{-NΨ_{[a,b]}(k)} that an N×N sparse random matrix has I_{N}[a,b]=kN eigenvalues inside the interval [a,b]. The method is applied to study the eigenvalue statistics in two distinct examples: (i) the shifted index number of eigenvalues for an ensemble of Erdős-Rényi graphs and (ii) the number of eigenvalues within a bounded region of the spectrum for the Anderson model on regular random graphs. A salient feature of the rate function in both cases is that, unlike rotationally invariant random matrices, it is asymmetric with respect to its minimum. The asymmetric character depends on the disorder in a way that is compatible with the distinct eigenvalue statistics corresponding to localized and delocalized eigenstates. The results also show that the level compressibility κ_{2}/κ_{1} for the Anderson model on a regular graph satisfies 0<κ_{2}/κ_{1}<1 in the bulk regime, in contrast with the behavior found in Gaussian random matrices. Our theoretical findings are thoroughly compared to numerical diagonalization in both cases, showing reasonably good agreement.

10.
Phys Rev E Stat Nonlin Soft Matter Phys ; 90(5-1): 050103, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25493721

ABSTRACT

We study the statistics of the condition number κ=λ_{max}/λ_{min} (the ratio between the largest and smallest squared singular values) of N×M Gaussian random matrices. Using a Coulomb fluid technique, we derive analytically and for large N the cumulative distribution P(κ<x) and its complement P(κ>x). We find that these distributions decay as P(κ<x)≈exp[-βNΦ_{-}(x)] and P(κ>x)≈exp[-βNΦ_{+}(x)], where β is the Dyson index of the ensemble. The left and right rate functions Φ_{±}(x) are independent of β and calculated exactly for any choice of the rectangularity parameter α=M/N-1>0. Interestingly, they show a weak nonanalytic behavior at their minimum 〈κ〉 (corresponding to the average condition number), a direct consequence of a phase transition in the associated Coulomb fluid problem. Matching the behavior of the rate functions around 〈κ〉, we determine exactly the scale of typical fluctuations ∼O(N^{-2/3}) and the tails of the limiting distribution of κ. The analytical results are in excellent agreement with numerical simulations.
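
The typical statistics of κ are easy to sample directly (the large-deviation tails, by contrast, are exponentially rare and need the analytical rate functions or importance sampling). A minimal Monte Carlo sketch, with illustrative sizes chosen so that α = M/N - 1 = 1:

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, n_samples = 40, 80, 2000            # rectangularity alpha = M/N - 1 = 1

kappas = np.empty(n_samples)
for i in range(n_samples):
    G = rng.standard_normal((N, M))
    s = np.linalg.svd(G, compute_uv=False)   # singular values, descending
    kappas[i] = (s[0] / s[-1]) ** 2          # ratio of squared singular values

print("mean condition number:", kappas.mean())
print("std of fluctuations:", kappas.std())
```

For α > 0 the smallest squared singular value stays away from zero (a "hard edge" at a distance), so κ concentrates around a finite 〈κ〉 with the O(N^{-2/3}) fluctuations quoted in the abstract.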

11.
Phys Rev E Stat Nonlin Soft Matter Phys ; 90(5-1): 052708, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25493817

ABSTRACT

The mechanical properties of molecules are today captured by single-molecule manipulation experiments, so that polymer features are tested at a nanometric scale. Yet devising mathematical models to get further insight beyond the commonly studied force-elongation relation is typically hard. Here we draw from techniques developed in the context of disordered systems to solve models for single- and double-stranded DNA stretching in the limit of a long polymeric chain. Since we directly derive the marginals for the molecule's local orientation, our approach allows us to readily calculate the experimental elongation as well as other observables at will. As an example, we evaluate the correlation length as a function of the stretching force. Furthermore, we are able to fit our solution successfully to real experimental data. Although the model is admittedly phenomenological, our findings are very sound. For single-stranded DNA our solution yields the correct (monomer) scale and, more importantly, the right persistence length of the molecule. In the double-stranded case, our model reproduces the well-known overstretching transition and correctly captures the ratio between native DNA and overstretched DNA. Also in this case the model yields a persistence length in good agreement with consensus, and it gives interesting insights into the bending stiffness of the native and overstretched molecule, respectively.
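
For context, the force-elongation relation that such models go beyond has a classic closed form in the simplest polymer model, the freely jointed chain: relative extension equals the Langevin function of Fb/k_B T. This textbook formula is a stand-in for illustration, not the paper's disordered-systems solution, and the Kuhn length below is an assumed value.

```python
import numpy as np

def fjc_extension(force, kuhn_length=1.5e-9, temperature=300.0):
    """Relative extension <x>/L of a freely jointed chain under force F:
    the Langevin function L(u) = coth(u) - 1/u with u = F*b / (kB*T)."""
    kB = 1.380649e-23                     # Boltzmann constant, J/K
    u = force * kuhn_length / (kB * temperature)
    return 1.0 / np.tanh(u) - 1.0 / u

forces = np.array([0.1e-12, 1e-12, 10e-12, 65e-12])   # newtons
ext = fjc_extension(forces)
for F, e in zip(forces, ext):
    print(f"F = {F*1e12:5.1f} pN  ->  x/L = {e:.3f}")
```

The extension rises linearly at low force and saturates toward full length at high force; features like the dsDNA overstretching plateau near 65 pN are exactly what this simple model misses and the paper's model reproduces.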

12.
Article in English | MEDLINE | ID: mdl-25375421

ABSTRACT

We compute the full order statistics of a one-dimensional gas of spinless fermions (or, equivalently, hard bosons) in a harmonic trap at zero temperature, including its large deviation tails. The problem amounts to computing the probability distribution of the kth smallest eigenvalue λ(k) of a large dimensional Gaussian random matrix. We find that this probability behaves for large N as P[λ(k)=x]≈exp[-βN²ψ(k/N,x)], where β is the Dyson index of the ensemble. The rate function ψ(c,x), computed explicitly as a function of x in terms of the intensive label c=k/N, has a quadratic behavior modulated by a weak logarithmic singularity at its minimum. This is shown to be related to phase transitions in the associated Coulomb gas problem. The connection with statistics of extreme eigenvalues and order statistics of random matrices is also discussed. We find that, as a function of c and keeping the value of x fixed, the rate function ψ(c,x) describes the statistics of the shifted index number, generalizing known results on its typical fluctuations; as a function of x and keeping the fraction c=k/N fixed, the rate function ψ(c,x) also describes the statistics of the kth eigenvalue in the bulk, generalizing as well the results on its typical fluctuations. Moreover, for k=1 (respectively, for k=N), the rate function captures both the fluctuations to the left and to the right of the typical value of λ(1) (respectively, of λ(N)).
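
The order statistics in question are easy to sample at the typical-fluctuation level (the N² large-deviation tails are out of reach of direct sampling). A sketch for the GOE (β = 1), with eigenvalues scaled so the semicircle support is [-√2, √2]; sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
N, n_samples = 100, 300                   # GOE, Dyson index beta = 1

kth = {1: [], N // 2: [], N: []}          # smallest, bulk, largest eigenvalue
for _ in range(n_samples):
    # GOE sample: symmetrized Gaussian matrix, eigenvalues scaled by sqrt(N)
    G = rng.standard_normal((N, N))
    H = (G + G.T) / 2.0
    ev = np.linalg.eigvalsh(H) / np.sqrt(N)   # sorted ascending
    for k in kth:
        kth[k].append(ev[k - 1])

for k, vals in kth.items():
    print(f"k = {k:3d}: typical lambda(k) = {np.mean(vals):+.3f}")
```

The typical value of λ(k) tracks the c = k/N quantile of the semicircle (≈ -√2 for k=1, ≈ 0 for k=N/2, ≈ +√2 for k=N), and the fluctuations around these values are what the rate function ψ(c,x) generalizes to large deviations.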

13.
Phys Rev E Stat Nonlin Soft Matter Phys ; 82(4 Pt 1): 040104, 2010 Oct.
Article in English | MEDLINE | ID: mdl-21230224

ABSTRACT

We consider the large deviations of the smallest eigenvalue of the Wishart-Laguerre ensemble. Using the Coulomb gas picture, we obtain rate functions for the large fluctuations to the left and the right of the hard edge. Our results are compared with known exact results for β=1, finding good agreement. We also consider the case of almost square matrices, finding new universal rate functions describing large fluctuations.

14.
Proc Natl Acad Sci U S A ; 106(8): 2607-11, 2009 Feb 24.
Article in English | MEDLINE | ID: mdl-19196991

ABSTRACT

Understanding the organization of reaction fluxes in cellular metabolism from the stoichiometry and the topology of the underlying biochemical network is a central issue in systems biology. In this task, it is important to devise reasonable approximation schemes that rely on the stoichiometric data only, because full-scale kinetic approaches are computationally affordable only for small networks (e.g., red blood cells, approximately 50 reactions). Methods commonly used are based on finding the stationary flux configurations that satisfy mass-balance conditions for metabolites, often coupling them to local optimization rules (e.g., maximization of biomass production) to reduce the size of the solution space to a single point. Such methods have been widely applied and have proven able to reproduce experimental findings for relatively simple organisms in specific conditions. Here, we define and study a constraint-based model of cellular metabolism where neither mass balance nor flux stationarity is postulated and where the relevant flux configurations optimize the global growth of the system. In the case of Escherichia coli, steady flux states are recovered as solutions, although mass-balance conditions are violated for some metabolites, implying a nonzero net production of the latter. Such solutions furthermore turn out to provide the correct statistics of fluxes for the bacterium E. coli in different environments and compare well with the available experimental evidence on individual fluxes. Conserved metabolic pools play a key role in determining growth rate and flux variability. Finally, we are able to connect phenomenological gene essentiality with "frozen" fluxes (i.e., fluxes with smaller allowed variability) in E. coli metabolism.
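
The standard constraint-based scheme that this paper generalizes (mass balance S·v = 0 plus biomass maximization, i.e. flux balance analysis) reduces to a linear program. A minimal sketch on an invented four-reaction toy network, not any real metabolic model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake A_ext->A (v1), A->B (v2), A->C (v3), B+C->biomass (v4).
# Rows: internal metabolites A, B, C; columns: fluxes v1..v4.
S = np.array([
    [1, -1, -1,  0],   # A: produced by v1, consumed by v2 and v3
    [0,  1,  0, -1],   # B: produced by v2, consumed by v4
    [0,  0,  1, -1],   # C: produced by v3, consumed by v4
], dtype=float)

c = np.array([0, 0, 0, -1.0])                        # maximize v4 -> minimize -v4
bounds = [(0, 10), (0, None), (0, None), (0, None)]  # uptake capped at 10

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print("optimal biomass flux v4 =", res.x[3])   # -> 5.0 (v1=10, v2=v3=5)
```

Here mass balance forces v1 = v2 + v3 and v2 = v3 = v4, so the uptake cap yields v4 = 5. The paper's model drops exactly these hard equality constraints, letting some metabolites carry a nonzero net production.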


Subjects
Escherichia coli/genetics, Bacterial Genes, Essential Genes, Escherichia coli/metabolism