Results 1 - 20 of 84
1.
Entropy (Basel) ; 25(6)2023 Jun 10.
Article in English | MEDLINE | ID: mdl-37372263

ABSTRACT

Using Luchko's general fractional calculus (GFC) and its extension in the form of the multi-kernel general fractional calculus of arbitrary order (GFC of AO), a nonlocal generalization of probability is suggested. The nonlocal and general fractional (GF) extensions of probability density functions (PDFs), cumulative distribution functions (CDFs) and probability are defined and their properties are described. Examples of general nonlocal probability distributions of AO are considered. An application of the multi-kernel GFC allows us to consider a wider class of operator kernels and a wider class of nonlocality in probability theory.

2.
Stud Hist Philos Sci ; 101: 48-60, 2023 10.
Article in English | MEDLINE | ID: mdl-37690232

ABSTRACT

Problems with uniform probabilities on an infinite support show up in contemporary cosmology. This paper focuses on the context of inflation theory, where it complicates the assignment of a probability measure over pocket universes. The measure problem in cosmology, whereby it seems impossible to pick out a uniquely well-motivated measure, is associated with a paradox that occurs in standard probability theory and crucially involves uniformity on an infinite sample space. This problem has been discussed by physicists, albeit without reference to earlier work on this topic. The aim of this article is both to introduce philosophers of probability to these recent discussions in cosmology and to familiarize physicists and philosophers working on cosmology with relevant foundational work by Kolmogorov, de Finetti, Jaynes, and other probabilists. As such, the main goal is not to solve the measure problem, but to clarify the exact origin of some of the current obstacles. The analysis of the assumptions going into the paradox indicates that there exist multiple ways of dealing consistently with uniform probabilities on infinite sample spaces. Taking a pluralist stance towards the mathematical methods used in cosmology shows there is some room for progress with assigning probabilities in cosmological theories.


Subject(s)
Cultural Diversity, Insufflation, Probability, Probability Theory
3.
Entropy (Basel) ; 24(9)2022 Sep 12.
Article in English | MEDLINE | ID: mdl-36141172

ABSTRACT

Kolmogorov's axioms of probability theory are extended to conditional probabilities among distinct (and sometimes intertwining) contexts. Formally, this amounts to row stochastic matrices whose entries characterize the conditional probability to find some observable (postselection) in one context, given an observable (preselection) in another context. As the respective probabilities need not (but, depending on the physical/model realization, can) be of the Born rule type, this generalizes approaches to quantum probabilities by Auffèves and Grangier, which in turn are inspired by Gleason's theorem.
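
As a small, hedged illustration of the row-stochastic bookkeeping described above (a generic sketch with made-up numbers, not the authors' construction), each row of the matrix collects the conditional probabilities of postselected outcomes in one context given a preselected outcome in another, and therefore sums to one:

```python
import numpy as np

# Hypothetical 3-outcome contexts A (preselection) and B (postselection).
# Entry P[i, j] = probability of observing outcome j in context B given
# that outcome i was prepared in context A. Row-stochasticity means every
# row is a probability distribution; the entries need not follow the Born rule.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
])
assert np.allclose(P.sum(axis=1), 1.0)

# Chaining contexts A -> B -> C composes by matrix multiplication,
# and the product of row-stochastic matrices is again row-stochastic.
Q = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.4, 0.6],
])
assert np.allclose((P @ Q).sum(axis=1), 1.0)
```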

4.
Group Decis Negot ; 31(2): 491-528, 2022.
Article in English | MEDLINE | ID: mdl-35228778

ABSTRACT

We present a consensus improvement mechanism based on prospect theory and quantum probability theory (QPT) that captures the irrational and uncertain behaviors of decision makers (DMs) in linguistic distribution group decision making. In this framework, the DMs pursue the possibility of working with different partial agreements on prospect values. Considering that the reference information should be comprehensive and accurate, since it guides information modification and affects consensus efficiency, objective and subjective information are integrated to obtain it. Several studies have verified that an interference effect occurs when beliefs flow toward different decision classification paths. To address this, QPT is introduced into the information integration, and the optimized value of the interference term is obtained from a multi-objective programming model based on maximum individual utility. Finally, as the reference point changes during the preference adjustment process, a dynamic reference point-oriented consensus model is constructed to obtain the optimized modification. A case study is performed on the emergency plan for the selection of designated hospitals, and comparative analyses are performed to demonstrate the feasibility and advantages of the proposed model. Several important insights are offered to simulate the most likely flow of consciousness into different decision classifications for DMs and moderators.
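
A minimal numerical sketch of the interference term that quantum probability adds on top of the classical law of total probability (the generic quantum-cognition form with made-up probabilities and phase, not the paper's multi-objective programming model):

```python
import numpy as np

# Classical law of total probability: P(d) = P(d|path1)P(path1) + P(d|path2)P(path2).
# Quantum probability adds an interference term 2*sqrt(p1*p2)*cos(theta),
# where theta is the phase between the two decision paths.
p1, p2 = 0.35 * 0.5, 0.20 * 0.5      # made-up joint probabilities along two paths
theta = np.pi / 3                     # made-up interference phase

classical = p1 + p2
quantum = p1 + p2 + 2 * np.sqrt(p1 * p2) * np.cos(theta)
print(f"classical: {classical:.3f}, with interference: {quantum:.3f}")
```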

5.
Proc Natl Acad Sci U S A ; 115(39): 9803-9806, 2018 09 25.
Article in English | MEDLINE | ID: mdl-30201714

ABSTRACT

The universal law of generalization describes how animals discriminate between alternative sensory stimuli. On an appropriate perceptual scale, the probability that an organism perceives two stimuli as similar typically declines exponentially with the difference on the perceptual scale. Exceptions often follow a Gaussian probability pattern rather than an exponential pattern. Previous explanations have been based on underlying theoretical frameworks such as information theory, Kolmogorov complexity, or empirical multidimensional scaling. This article shows that the few inevitable invariances that must apply to any reasonable perceptual scale provide a sufficient explanation for the universal exponential law of generalization. In particular, reasonable measurement scales of perception must be invariant to shift by a constant value, which by itself leads to the exponential form. Similarly, reasonable measurement scales of perception must be invariant to multiplication, or stretch, by a constant value, which leads to the conservation of the slope of discrimination with perceptual difference. In some cases, an additional assumption about exchangeability or rotation of underlying perceptual dimensions leads to a Gaussian pattern of discrimination, which can be understood as a special case of the more general exponential form. The three measurement invariances of shift, stretch, and rotation provide a sufficient explanation for the universally observed patterns of perceptual generalization. All of the additional assumptions and language associated with information, complexity, and empirical scaling are superfluous with regard to the broad patterns of perception.
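
As a hedged illustration of the two functional forms discussed above (arbitrary scale parameters, not the paper's derivation), the probability of perceiving two stimuli as similar can be written as an exponential, or, in the rotation/exchangeability special case, a Gaussian function of the perceptual difference d:

```python
import numpy as np

def exponential_generalization(d, lam=1.0):
    """P(similar) declines exponentially with perceptual difference d."""
    return np.exp(-lam * np.abs(d))

def gaussian_generalization(d, sigma=1.0):
    """Special case: Gaussian decline, e.g. under rotational invariance."""
    return np.exp(-(d / sigma) ** 2 / 2)

d = np.linspace(0, 3, 7)
print(exponential_generalization(d))   # shift-invariance alone gives this form
print(gaussian_generalization(d))      # extra exchangeability/rotation assumption
```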


Subject(s)
Psychological Generalization, Perception, Animals, Psychological Discrimination, Psychological Models, Normal Distribution, Probability
6.
Entropy (Basel) ; 23(6)2021 May 27.
Article in English | MEDLINE | ID: mdl-34071931

ABSTRACT

Attaining reliable gradient profiles is of utmost relevance for many physical systems. In many situations, the estimation of the gradient is inaccurate due to noise. It is common practice to first estimate the underlying system and then compute the gradient profile by taking the subsequent analytic derivative of the estimated system. The underlying system is often estimated by fitting or smoothing the data using other techniques. Taking the subsequent analytic derivative of an estimated function can be ill-posed. This becomes worse as the noise in the system increases. As a result, the uncertainty generated in the gradient estimate increases. In this paper, a theoretical framework for a method to estimate the gradient profile of discrete noisy data is presented. The method was developed within a Bayesian framework. Comprehensive numerical experiments were conducted on synthetic data at different levels of noise. The accuracy of the proposed method was quantified. Our findings suggest that the proposed gradient profile estimation method outperforms the state-of-the-art methods.
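
A small synthetic sketch (naive finite differences, not the authors' Bayesian estimator) of why taking derivatives of noisy data is ill-posed: independent noise of standard deviation sigma contributes roughly sigma/(h*sqrt(2)) of error to a central-difference derivative with step h, so the error grows as the data become denser or noisier.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 2 * np.pi, 200)
y_true = np.sin(x)
y_noisy = y_true + 0.05 * rng.standard_normal(x.size)   # 5% noise level

# Naive central differences amplify the noise instead of suppressing it.
grad_naive = np.gradient(y_noisy, x)
grad_true = np.cos(x)
print("RMS error of naive gradient:",
      np.sqrt(np.mean((grad_naive - grad_true) ** 2)))
```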

7.
Entropy (Basel) ; 23(2)2021 Feb 19.
Article in English | MEDLINE | ID: mdl-33669835

ABSTRACT

The Sigma-Pi structure investigated in this work consists of the sum of products of an increasing number of identically distributed random variables. It appears in stochastic processes with random coefficients and also in models of growth of entities such as business firms and cities. We study the Sigma-Pi structure with Bernoulli random variables and find that its probability distribution is always bounded from below by a power-law function, regardless of whether the random variables are mutually independent or duplicated. In particular, we investigate the case in which the asymptotic probability distribution always has upper and lower power-law bounds with the same tail-index, which depends on the parameters of the distribution of the random variables. We illustrate the Sigma-Pi structure in the context of a simple growth model with successively born entities growing according to a stochastic proportional growth law, taking both Bernoulli random variables, which confirm the theoretical results, and half-normal random variables, for which the numerical results can be rationalized using insights from the Bernoulli case. We analyze the interdependence among entities represented by the product terms within the Sigma-Pi structure, the possible presence of memory in growth factors, and the contribution of each product term to the whole Sigma-Pi structure. We highlight the influence of the degree of interdependence among entities on the number of terms that effectively contribute to the total sum of sizes, reaching the limiting case of a single term dominating extreme values of the Sigma-Pi structure when all entities grow independently.
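
A minimal Monte Carlo sketch of the Sigma-Pi structure with independent two-valued (Bernoulli-type) growth factors; the factor values and probability below are illustrative choices, used only to show the heavy tail discussed above:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigma_pi(n_terms, n_samples, hi=1.5, lo=0.6, p=0.5):
    """Sum of products of an increasing number of i.i.d. two-valued
    growth factors: S = X1 + X1*X2 + ... + X1*...*Xn.
    The factor values hi/lo and probability p are illustrative choices."""
    x = np.where(rng.random((n_samples, n_terms)) < p, hi, lo)
    return np.cumprod(x, axis=1).sum(axis=1)

s = sigma_pi(n_terms=50, n_samples=100_000)
# Rapidly growing upper quantiles signal the power-law-bounded tail.
for q in (0.9, 0.99, 0.999):
    print(f"quantile {q}: {np.quantile(s, q):.1f}")
```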

8.
Stud Hist Philos Sci ; 90: 160-167, 2021 12.
Article in English | MEDLINE | ID: mdl-34695623

ABSTRACT

Earman (2018) has recently argued that the Principal Principle, a principle of rationality connecting objective chance and credence, is a theorem of quantum probability theory. This paper critiques Earman's argument, while also offering a positive proposal for how to understand the status of the Principal Principle in quantum probability theory.


Subject(s)
Probability Theory, Quantum Theory, Dissent and Disputes, Probability
9.
Sensors (Basel) ; 20(5)2020 Feb 25.
Article in English | MEDLINE | ID: mdl-32106442

ABSTRACT

Drafting involves cycling so close behind another person that wind resistance is significantly reduced, which is illegal during most long-distance and several short-distance triathlon and duathlon events. In this paper, a proof of concept for a drafting detection system based on computer vision is proposed. After detecting and tracking a bicycle through the various scenes, the distance to this object is estimated through computational geometry. The probability of drafting is then determined through statistical analysis of subsequent measurements over an extended period of time. These algorithms are tested using a static recording and a recording that simulates a race situation, with ground truth distances obtained from a Light Detection And Ranging (LiDAR) system. The most accurate of the developed distance estimation methods yields an average error of 0.46 m in our test scenario. When sampling the distances at periods of 1 or 2 s, simulations demonstrate that a drafting violation is detected quickly for cyclists riding at 2 m or more below the limit, while generally avoiding false positives during the race-like test set-up and five-hour race simulations.
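
A hedged sketch of the statistical step described above; the draft-zone limit, sampling period, window length and decision rule are illustrative assumptions rather than the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

DRAFT_LIMIT_M = 10.0        # assumed draft-zone length; rules vary by event
SIGMA_M = 0.46              # assumed distance-measurement error (std), see abstract
SAMPLE_PERIOD_S = 2.0       # assumed sampling period
WINDOW_S = 30.0             # assumed observation window before a decision

def drafting_probability(true_distance_m):
    """Fraction of noisy distance samples in the window below the limit."""
    n = int(WINDOW_S / SAMPLE_PERIOD_S)
    samples = true_distance_m + SIGMA_M * rng.standard_normal(n)
    return np.mean(samples < DRAFT_LIMIT_M)

for d in (7.0, 9.5, 12.0):  # riding well below, just below, and above the limit
    print(f"true distance {d:4.1f} m -> estimated drafting probability "
          f"{drafting_probability(d):.2f}")
```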


Subject(s)
Bicycling, Photography/instrumentation, Wind, Algorithms, Computer Simulation, Humans, Probability
10.
Entropy (Basel) ; 22(3)2020 Mar 19.
Article in English | MEDLINE | ID: mdl-33286131

ABSTRACT

Using first principles from inference, we design a set of functionals for the purposes of ranking joint probability distributions with respect to their correlations. Starting with a general functional, we impose its desired behavior through the Principle of Constant Correlations (PCC), which constrains the correlation functional to behave in a consistent way under statistically independent inferential transformations. The PCC guides us in choosing the appropriate design criteria for constructing the desired functionals. Since the derivations depend on a choice of partitioning the variable space into n disjoint subspaces, the general functional we design is the n-partite information (NPI), of which the total correlation and mutual information are special cases. Thus, these functionals are found to be uniquely capable of determining whether a certain class of inferential transformations, ρ → ∗ρ′, preserve, destroy or create correlations. This provides conceptual clarity by ruling out other possible global correlation quantifiers. Finally, the derivation and results allow us to quantify non-binary notions of statistical sufficiency. Our results express what percentage of the correlations are preserved under a given inferential transformation or variable mapping.
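
As a small, hedged illustration of the two special cases named above (a made-up 2x2 joint distribution), the total correlation of a bipartite split coincides with the mutual information and can be computed directly from the joint probabilities:

```python
import numpy as np

# Made-up joint distribution of two binary variables X and Y.
p_xy = np.array([[0.40, 0.10],
                 [0.15, 0.35]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y

# Total correlation: sum_xy p(x,y) * log( p(x,y) / (p(x)p(y)) ).
# With a bipartite split (n = 2) this is exactly the mutual information.
tc = np.sum(p_xy * np.log(p_xy / (p_x * p_y)))
print(f"total correlation / mutual information: {tc:.4f} nats")
```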

11.
Entropy (Basel) ; 22(3)2020 Mar 13.
Article in English | MEDLINE | ID: mdl-33286105

ABSTRACT

Steady-state vowels are vowels that are uttered with a momentarily fixed vocal tract configuration and with steady vibration of the vocal folds. In this steady-state, the vowel waveform appears as a quasi-periodic string of elementary units called pitch periods. Humans perceive this quasi-periodic regularity as a definite pitch. Likewise, so-called pitch-synchronous methods exploit this regularity by using the duration of the pitch periods as a natural time scale for their analysis. In this work, we present a simple pitch-synchronous method using a Bayesian approach for estimating formants that slightly generalizes the basic approach of modeling the pitch periods as a superposition of decaying sinusoids, one for each vowel formant, by explicitly taking into account the additional low-frequency content in the waveform which arises not from formants but rather from the glottal pulse. We model this low-frequency content in the time domain as a polynomial trend function that is added to the decaying sinusoids. The problem then reduces to a rather familiar one in macroeconomics: estimate the cycles (our decaying sinusoids) independently from the trend (our polynomial trend function); in other words, detrend the waveforms of steady-state vowels. We show how to do this efficiently.
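
A hedged least-squares sketch of the signal model described above, with two made-up formants, a quadratic trend and synthetic data; the paper's actual estimator is Bayesian and also infers the formant parameters, which are assumed known here for brevity:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 16_000                              # sample rate (Hz), assumed
t = np.arange(int(0.008 * fs)) / fs      # one ~8 ms pitch period

# Made-up "formants": (frequency Hz, decay rate 1/s) pairs.
formants = [(700.0, 300.0), (1200.0, 500.0)]

# Synthetic waveform: decaying sinusoids + slow polynomial trend + noise.
signal = sum(np.exp(-d * t) * np.cos(2 * np.pi * f * t) for f, d in formants)
trend_true = 0.5 - 30.0 * t + 2000.0 * t ** 2
y = signal + trend_true + 0.02 * rng.standard_normal(t.size)

# Design matrix: decaying cos/sin pairs plus a quadratic trend in time.
cols = []
for f, d in formants:
    cols += [np.exp(-d * t) * np.cos(2 * np.pi * f * t),
             np.exp(-d * t) * np.sin(2 * np.pi * f * t)]
cols += [np.ones_like(t), t, t ** 2]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated trend coefficients (const, t, t^2):", np.round(coef[-3:], 2))
# Subtracting A[:, -3:] @ coef[-3:] from y gives the "detrended" pitch period.
```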

12.
J Theor Biol ; 469: 172-179, 2019 05 21.
Article in English | MEDLINE | ID: mdl-30831174

ABSTRACT

The traditional log-linear inactivation kinetics model considers microbial inactivation as a process that follows first-order kinetics. A basic concept of log reduction is the decimal reduction time (D-value), i.e., the time or dose required to kill 90% of the relevant microorganisms. A D-value based on the first-order survival kinetics model is insufficient for reliable estimations of bacterial survivors following inactivation treatment, because the model does not consider the curvature of the inactivation curve or the variability in bacterial inactivation. Nevertheless, despite these limitations, the D-value is widely used for risk assessment and sterilization time estimation. In this study, stochastic inactivation models are used in place of the conventional D-value to describe the probability of a population containing survivors. As representative bacterial inactivation normally follows a log-linear or log-Weibull model, we calculate the time required for a specific decrease in the number of cells, and the number of surviving cells, as probability distributions using the stochastic inactivation of individual cells in a population. We compare the probability of a population containing survivors calculated via the D-value, an inactivation kinetics model, and the stochastic formula. The stochastic calculation can be approximated by a kinetic model with curvature, with less than 5% difference when the probability of a population containing survivors is below 0.1. The stochastic formula indicates that the D-value model would over- or under-estimate the probability of a population containing survivors when applied to inactivation kinetics with curvature. The results presented in this study show that stochastic analysis using mathematical models that account for variability in individual cell inactivation times and in the initial cell number leads to a realistic and probabilistic estimation of bacterial inactivation.
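
A hedged numerical sketch (made-up D-value and initial cell count) contrasting the deterministic log-linear calculation with the stochastic view in which each cell survives independently, so that the probability of a population containing survivors follows from a binomial model:

```python
D_VALUE_MIN = 2.0        # assumed decimal reduction time (minutes)
N0 = 1_000               # assumed initial number of cells

def per_cell_survival(t_min):
    """Log-linear kinetics: each cell survives time t with prob 10**(-t/D)."""
    return 10.0 ** (-t_min / D_VALUE_MIN)

def prob_population_has_survivors(t_min, n0=N0):
    """Stochastic view: 1 - P(all n0 cells are inactivated)."""
    p = per_cell_survival(t_min)
    return 1.0 - (1.0 - p) ** n0         # ~ 1 - exp(-n0 * p) for small p

for t in (6, 8, 10, 12):                  # minutes of treatment
    expected = N0 * per_cell_survival(t)  # deterministic expected survivors
    print(f"t={t:2d} min: expected survivors {expected:8.4f}, "
          f"P(population contains survivors) {prob_population_has_survivors(t):.4f}")
```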


Subject(s)
Bacteria/cytology, Cell Count, Kinetics, Linear Models, Probability, Stochastic Processes, Survivors, Time Factors
13.
Sensors (Basel) ; 19(22)2019 Nov 06.
Article in English | MEDLINE | ID: mdl-31698686

ABSTRACT

The massive amount of data generated by structural health monitoring (SHM) systems usually affects the system's capacity for data transmission and analysis. This paper proposes a novel concept based on probability theory for data reduction in SHM systems. The salient feature of the proposed method is that it alleviates the burden of collecting and analyzing the entire strain data via a relative damage approach. In this methodology, the rate of variation of strain distributions is related to the rate of damage. In order to verify the accuracy of the approach, experimental and numerical studies were conducted on a thin steel plate subjected to cyclic in-plane tension loading. Circular holes with various sizes were made in the plate to define damage states. Rather than measuring the entire strain response, the cumulative durations of strain events at different predefined strain levels were obtained for each damage scenario. Then, the distribution of the calculated cumulative times was used to detect the damage progression. The results show that the presented technique can efficiently detect the damage progression. The damage detection accuracy can be improved by increasing the number of predefined strain levels. The proposed concept can lead to over 2500% reduction in data storage requirement, which can be particularly important for data generation and data handling in on-line SHM systems.
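
A small sketch of the data-reduction idea described above, using a synthetic strain history and arbitrary thresholds: instead of storing the full strain response, only the cumulative time spent at or above each predefined strain level is kept.

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 100.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 60.0, 1.0 / fs)             # one minute of cyclic loading
strain = (400e-6 * np.abs(np.sin(2 * np.pi * 0.5 * t))
          + 20e-6 * rng.standard_normal(t.size))   # synthetic strain signal

levels = [100e-6, 200e-6, 300e-6, 350e-6]    # predefined strain levels, assumed

# Data reduction: keep only the cumulative duration (s) at or above each level;
# the full time history (t.size samples) shrinks to len(levels) numbers.
for lv in levels:
    duration = (strain >= lv).sum() / fs
    print(f"strain >= {lv * 1e6:.0f} microstrain: {duration:.2f} s cumulative")
```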

14.
Entropy (Basel) ; 22(1)2019 Dec 31.
Article in English | MEDLINE | ID: mdl-33285833

ABSTRACT

In 2000, Kennedy and O'Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice's uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all the possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral depending on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves too, we show that "learning" or optimising those parameters has little meaning when data is little and, thus, justify all our mathematical efforts. The recent hype about machine learning has long spilled over to computational engineering but fails to acknowledge that machine learning is a big data problem and that, in computational engineering, we usually face a little data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E.T. Jaynes and find that generalization to an arbitrary number of levels of fidelity and parallelisation becomes rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application where high-fidelity data is little but low-fidelity data is not. We then apply the method to quantify the uncertainties in finite element simulations of impedance cardiography of aortic dissection. Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography, too, is a clinical standard tool and promises to allow earlier diagnoses as well as to detect patients that otherwise go under the radar for too long.
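
A compact sketch of the "marginalize rather than optimize" idea on a one-dimensional toy problem (RBF kernel, fixed noise, a grid over a single length-scale; all of this is illustrative and not the paper's multi-fidelity formulation): each candidate hyperparameter value is weighted by its marginal likelihood and predictions are averaged under those weights.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 8)                      # scarce ("little") data
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
xs = np.linspace(0, 1, 100)                   # prediction grid
noise = 0.1 ** 2                              # assumed, fixed noise variance

def rbf(a, b, ell):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_predict_and_logml(ell):
    """GP posterior mean and log marginal likelihood for one length-scale."""
    K = rbf(x, x, ell) + noise * np.eye(x.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = rbf(xs, x, ell) @ alpha
    logml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
             - 0.5 * x.size * np.log(2 * np.pi))
    return mean, logml

ells = np.linspace(0.05, 0.5, 30)             # grid over the length-scale
means, logmls = zip(*(gp_predict_and_logml(l) for l in ells))
w = np.exp(np.array(logmls) - max(logmls))
w /= w.sum()                                  # weights from marginal likelihood
marginal_mean = (w[:, None] * np.array(means)).sum(axis=0)
print("marginalized prediction near x = 0.5:", round(float(marginal_mean[50]), 3))
```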

15.
Proc Natl Acad Sci U S A ; 112(7): E667-76, 2015 Feb 17.
Article in English | MEDLINE | ID: mdl-25646459

ABSTRACT

Insulin secretion is key for glucose homeostasis. Insulin secretory granules (SGs) exist in different functional pools, with young SGs being more mobile and preferentially secreted. However, the principles governing the mobility of age-distinct SGs remain undefined. Using the time-reporter insulin-SNAP to track age-distinct SGs we now show that their dynamics can be classified into three components: highly dynamic, restricted, and nearly immobile. Young SGs display all three components, whereas old SGs are either restricted or nearly immobile. Both glucose stimulation and F-actin depolymerization recruit a fraction of nearly immobile young, but not old, SGs for highly dynamic, microtubule-dependent transport. Moreover, F-actin marks multigranular bodies/lysosomes containing aged SGs. These data demonstrate that SGs lose their responsiveness to glucose stimulation and competence for microtubule-mediated transport over time while changing their relationship with F-actin.


Subject(s)
Actins/metabolism, Insulin/physiology, Microtubules/physiology, Secretory Vesicles/metabolism, Animals, Tumor Cell Line, Cellular Senescence, Confocal Microscopy, Rats
16.
Sensors (Basel) ; 18(11)2018 Nov 02.
Article in English | MEDLINE | ID: mdl-30400158

ABSTRACT

Efficient matching of incoming events of data streams to persistent queries is fundamental to event stream processing systems in wireless sensor networks. These applications require dealing with high-volume and continuous data streams with fast processing time on distributed complex event processing (CEP) systems. Therefore, a well-managed parallel processing technique is needed for improving the performance of the system. However, the specific properties of pattern operators in CEP systems increase the difficulty of the parallel processing problem. To address these issues, a parallelization model and an adaptive parallel processing strategy are proposed for complex event processing by introducing a histogram and utilizing probability and queueing theory. The proposed strategy can estimate the optimal event splitting policy, which suits the most recent workload conditions such that the selected policy has the least expected waiting time for further processing of the arriving events. The proposed strategy keeps the CEP system running fast under variation of the operators' time window sizes and the streams' input rates. Finally, the utility of our work is demonstrated through experiments on the StreamBase system.
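
A hedged queueing-theory sketch of the policy-selection step (textbook M/M/1 waiting-time formula with made-up arrival and service rates; the paper's workload model is more elaborate): among candidate event-splitting policies, choose the one with the least expected waiting time under the current workload.

```python
# Expected time in an M/M/1 queue: W = 1 / (mu - lam) for lam < mu.
# Each candidate splitting policy changes the effective per-operator
# arrival rate lam and service rate mu; the rates below are made up.

def expected_waiting_time(lam, mu):
    if lam >= mu:
        return float("inf")          # unstable queue: work piles up
    return 1.0 / (mu - lam)

# (policy name, effective arrival rate, effective service rate), events/second
policies = [
    ("round-robin split",  800.0, 1000.0),
    ("key-based split",    600.0,  700.0),
    ("window-based split", 500.0,  900.0),
]

best = min(policies, key=lambda p: expected_waiting_time(p[1], p[2]))
for name, lam, mu in policies:
    print(f"{name:18s} W = {expected_waiting_time(lam, mu) * 1000:6.2f} ms")
print("selected policy:", best[0])
```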

17.
Risk Anal ; 37(8): 1477-1494, 2017 08.
Article in English | MEDLINE | ID: mdl-28437867

ABSTRACT

Robustness measures a system's ability to remain insensitive to disturbances. Previous studies assessed the robustness of transportation networks to a single disturbance without considering multiple simultaneously occurring events. The purpose of this article is to address this problem and propose a new framework to assess the robustness of an urban transportation network. The framework consists of two layers. The upper layer defines the robustness index based on the impact evaluation in different scenarios obtained from the lower layer, whereas the lower layer evaluates the performance of each hypothetical disrupted road network given by the upper layer. The upper layer has two varieties, that is, robustness against random failures and robustness against intentional attacks. This robustness measurement framework is validated by application to a real-world urban road network in Hong Kong. The results show that the robustness of a transport network with consideration of multiple events is quite different from, and more comprehensive than, that with consideration of only a single disruption. We also propose a Monte Carlo method and a heuristic algorithm to handle different scenarios with multiple hazard events, which prove to be quite efficient. This methodology can also be applied to conduct risk analysis of other systems where multiple failures or disruptions exist.
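
A minimal Monte Carlo sketch of robustness against random failures with multiple simultaneous disruptions, using a toy grid network (via the third-party networkx package) and a connectivity-based performance index rather than the paper's real road network and travel-cost measures:

```python
import random
import networkx as nx

random.seed(6)
G = nx.grid_2d_graph(6, 6)               # toy road network (6x6 grid)
edges = list(G.edges())

def performance(graph):
    """Toy index: fraction of node pairs that remain mutually reachable."""
    n = graph.number_of_nodes()
    reachable = sum(len(c) * (len(c) - 1) for c in nx.connected_components(graph))
    return reachable / (n * (n - 1))

def robustness_random_failures(n_failures, n_trials=500):
    """Average residual performance when n_failures links fail at once."""
    total = 0.0
    for _ in range(n_trials):
        H = G.copy()
        H.remove_edges_from(random.sample(edges, n_failures))
        total += performance(H)
    return total / n_trials

for k in (1, 3, 5, 10):
    print(f"{k:2d} simultaneous link failures -> mean performance "
          f"{robustness_random_failures(k):.3f}")
```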

18.
Stat Med ; 35(29): 5391-5400, 2016 12 20.
Article in English | MEDLINE | ID: mdl-27501057

ABSTRACT

Random mutation and natural selection act in a mathematically predictable way, and understanding this behavior leads to approaches that reduce and prevent treatment failure when these selection pressures are exploited to treat infections and cancers. The underlying principle for impairing the random mutation and natural selection phenomenon is to use combination therapy, which forces the population to evolve against multiple selection pressures simultaneously and thereby invokes the multiplication rule of probabilities. Recently, it has been seen that combination therapy for the treatment of malaria has failed to prevent the emergence of drug-resistant variants. Using this empirical example and the principles of probability theory, the equations describing this treatment failure are derived. These equations give guidance on how to use combination therapy for the treatment of cancers and infectious diseases and prevent the emergence of drug resistance. Copyright © 2016 John Wiley & Sons, Ltd.
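
A hedged arithmetic sketch of the multiplication-rule argument; the per-drug resistance probabilities and population size below are illustrative placeholders, and independence of the resistance mechanisms is assumed:

```python
import math

# Illustrative per-replication probabilities that a pathogen acquires
# resistance to each individual drug (made-up orders of magnitude).
p_resistance = {"drug_A": 1e-9, "drug_B": 1e-8, "drug_C": 1e-9}
population = 1e12          # assumed number of pathogens in the host

def prob_any_resistant(drugs, n=population):
    """P(at least one pathogen resists ALL listed drugs simultaneously),
    assuming independent resistance mechanisms (multiplication rule).
    Uses the Poisson approximation 1 - exp(-n * p) for numerical stability."""
    p_all = math.prod(p_resistance[d] for d in drugs)
    return -math.expm1(-n * p_all)

print("monotherapy (drug_A):       ", prob_any_resistant(["drug_A"]))
print("two-drug combination (A+B): ", prob_any_resistant(["drug_A", "drug_B"]))
print("three-drug combination:     ", prob_any_resistant(["drug_A", "drug_B", "drug_C"]))
```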


Subject(s)
Microbial Drug Resistance/genetics, Mutation, Genetic Selection, Bacteria/genetics, Humans, Theoretical Models, Probability
19.
Theor Biol Med Model ; 13: 13, 2016 Apr 14.
Article in English | MEDLINE | ID: mdl-27075996

ABSTRACT

BACKGROUND: This study is mainly motivated by the need to understand how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, and hence whether a randomly walking biomolecule is also subject to a long-range force field driving it to its target. METHOD: By means of the Continuous Time Random Walk (CTRW) technique, the topic of random walk in a random environment is considered here for a passively diffusing particle among randomly moving and interacting obstacles. RESULTS: The relevant physical quantity worked out is the diffusion coefficient of the passive tracer, computed as a function of the average inter-obstacle distance. CONCLUSIONS: The results reported here suggest that if a biomolecule, call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
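
A minimal sketch (a free two-dimensional random walk with no obstacles) of how the quantity named under RESULTS, the tracer's diffusion coefficient, is typically extracted from simulated trajectories: fit the mean-squared displacement MSD(t) ≈ 2 d D t, where d is the spatial dimension.

```python
import numpy as np

rng = np.random.default_rng(7)
n_walkers, n_steps, dt, dim = 2000, 500, 1.0, 2
step_std = 1.0                                   # per-step displacement scale, arbitrary

# Free Brownian walkers; with obstacles, the steps would be constrained instead.
steps = step_std * rng.standard_normal((n_walkers, n_steps, dim))
positions = np.cumsum(steps, axis=1)

msd = np.mean(np.sum(positions ** 2, axis=2), axis=0)   # MSD as a function of time
t = dt * np.arange(1, n_steps + 1)

# MSD(t) = 2 * dim * D * t  ->  D from a least-squares fit through the origin.
D_est = np.sum(msd * t) / (2 * dim * np.sum(t ** 2))
print(f"estimated diffusion coefficient D ~ {D_est:.3f} "
      f"(theory: {step_std ** 2 / (2 * dt):.3f})")
```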


Subject(s)
Biological Models, Motion (Physics), Algorithms, Cytoplasm/metabolism, Diffusion, Macromolecular Substances, Statistical Models, Probability, Stochastic Processes
20.
Philos Trans A Math Phys Eng Sci ; 374(2068)2016 May 28.
Article in English | MEDLINE | ID: mdl-27091169

ABSTRACT

We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference, applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out, yields, without introducing any concept of quantum theory, the quantum theoretical description of the Stern-Gerlach and Einstein-Podolsky-Rosen-Bohm experiments in terms of the Schrödinger or the Pauli equation. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is, common sense, applied to reproducible and robust experimental data.
