Results 1 - 5 of 5
1.
Entropy (Basel) ; 25(1)2023 Jan 10.
Article in English | MEDLINE | ID: mdl-36673284

ABSTRACT

When we compare the influences of two causes on an outcome, Simpson's Paradox arises if the conclusion drawn from every subgroup contradicts the conclusion drawn from the aggregated data. The Existing Causal Inference Theory (ECIT) eliminates the paradox by removing the confounder's influence, which makes the overall conclusion consistent with the subgroup conclusions. The ECIT uses the relative risk difference Pd = max(0, (R - 1)/R), where R denotes the risk ratio, as the probability of causation. In contrast, the philosopher Fitelson uses the confirmation measure D (posterior probability minus prior probability) to measure the strength of causation. Fitelson concludes that, from the perspective of Bayesian confirmation, we should directly accept the overall conclusion without considering the paradox. The author previously proposed a Bayesian confirmation measure b* similar to Pd. To overcome the contradiction between the ECIT and Bayesian confirmation, the author uses the semantic information method with the minimum cross-entropy criterion to deduce the causal confirmation measure Cc = (R - 1)/max(R, 1). Cc is like Pd but is normalized (ranging between -1 and 1) and is cause-symmetric. It is especially suited to cases where a cause restrains an outcome, such as a COVID-19 vaccine controlling infection. Examples concerning kidney stone treatments and COVID-19 show that Pd and Cc are more reasonable than D, and that Cc is more useful than Pd.
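
The two formulas above lend themselves to direct computation. Below is a minimal sketch (not from the paper; function names are illustrative) that evaluates Pd, Cc, and Fitelson's D, showing that Cc, unlike Pd, goes negative when the risk ratio R is below 1 (a restraining cause such as a vaccine).

```python
# Illustrative comparison of the measures discussed above (hypothetical helper names).
# R is the risk ratio P(outcome | cause) / P(outcome | no cause).

def prob_of_causation_pd(r: float) -> float:
    """Relative risk difference Pd = max(0, (R - 1) / R), used by the ECIT."""
    return max(0.0, (r - 1.0) / r)

def causal_confirmation_cc(r: float) -> float:
    """Cc = (R - 1) / max(R, 1); normalized to [-1, 1] and cause-symmetric."""
    return (r - 1.0) / max(r, 1.0)

def fitelson_d(posterior: float, prior: float) -> float:
    """Fitelson's confirmation measure D = P(H | E) - P(H)."""
    return posterior - prior

if __name__ == "__main__":
    for r in (0.2, 0.5, 1.0, 2.0, 5.0):
        print(f"R={r:4.1f}  Pd={prob_of_causation_pd(r):+.3f}  Cc={causal_confirmation_cc(r):+.3f}")
```

For R = 0.2 the sketch gives Pd = 0 but Cc = -0.8, which is the sense in which Cc captures a cause that suppresses the outcome.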

2.
Stud Hist Philos Sci ; 75: 43-50, 2019 06.
Article in English | MEDLINE | ID: mdl-31426946

ABSTRACT

In this paper, we distinguish Quine's thesis of holism from the related Duhem-Quine problem. We discuss the construal of holism according to which the effect of falsification is felt by a conjunction of hypotheses. The Duhem-Quine problem claims that there is no principled way of knowing how falsification affects the individual conjuncts. This latter claim relies on holism plus an additional commitment to the hypothetico-deductive model of theory confirmation, so the problem need not arise in non-deductive accounts. Whereas existing personalist Bayesian treatments of the problem make this point by assuming values for the priors of the conjuncts, we arrive at the same conclusion without invoking such assumptions. Our discussion focuses on the falsification of equiprobable conjuncts and highlights the role played by their alternatives in ascertaining their relative disconfirmation. The equiprobability of conjuncts is discussed alongside a historical case study.
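
As a rough illustration of the role played by the alternatives, the toy model below (assumed numbers, not the paper's own treatment, and closer to the personalist Bayesian approach the paper contrasts itself with) shows that evidence falsifying a conjunction H1 & H2 can disconfirm two equiprobable conjuncts to very different degrees, depending on how well their alternatives predict that evidence.

```python
# Toy Bayesian illustration (assumed values, not the paper's model) of unequal
# disconfirmation of equiprobable conjuncts after their conjunction is falsified.

# Prior over the four joint hypotheses; H1 and H2 are individually equiprobable.
prior = {(True, True): 0.25, (True, False): 0.25, (False, True): 0.25, (False, False): 0.25}

# Likelihood P(E | h1, h2). The conjunction entails not-E, so P(E | True, True) = 0.
# The alternatives' likelihoods are chosen only for illustration.
likelihood = {(True, True): 0.0, (True, False): 0.9, (False, True): 0.3, (False, False): 0.5}

evidence_prob = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence_prob for h in prior}

p_h1_post = sum(p for (h1, _), p in posterior.items() if h1)
p_h2_post = sum(p for (_, h2), p in posterior.items() if h2)
print(f"P(H1 | E) = {p_h1_post:.3f}, P(H2 | E) = {p_h2_post:.3f}")
# Unequal posteriors (about 0.529 vs 0.176) despite equal priors: the relative
# disconfirmation is fixed by how well each conjunct's alternatives predict E.
```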

3.
J Biomed Inform ; 53: 291-9, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25499899

ABSTRACT

BACKGROUND: Metabolomics is an emerging field concerned with ascertaining a metabolic profile from a combination of small molecules, and it has applications in health. Metabolomic methods are currently applied to discover diagnostic biomarkers and to identify the pathophysiological pathways involved in disease. However, metabolomic data are complex and are usually analyzed by statistical methods. Although these methods have been widely described, most have not been standardized or validated. Data analysis is the foundation of a robust methodology, so new mathematical methods need to be developed to assess and complement current methods. We therefore applied, for the first time, the dominance-based rough set approach (DRSA) to metabolomics data; we also assessed the complementarity of this method with standard statistical methods. Some attributes were transformed in a way that allowed us to discover global and local monotonic relationships between condition and decision attributes. We used previously published metabolomics data (18 variables) for amyotrophic lateral sclerosis (ALS) and non-ALS patients.

RESULTS: Principal Component Analysis (PCA) and Orthogonal Partial Least Squares Discriminant Analysis (OPLS-DA) allowed satisfactory discrimination (72.7%) between ALS and non-ALS patients. Several discriminant metabolites were identified: acetate, acetone, pyruvate and glutamine. The concentrations of acetate and pyruvate were also identified by univariate analysis as significantly different between ALS and non-ALS patients. DRSA correctly classified 68.7% of the cases and established rules involving some of the metabolites highlighted by OPLS-DA (acetate and acetone). Some rules identified potential biomarkers not revealed by OPLS-DA (beta-hydroxybutyrate). After applying Bayesian confirmation measures, we also found a large number of common discriminating metabolites, particularly acetate, pyruvate, acetone and ascorbate, consistent with the pathophysiological pathways involved in ALS.

CONCLUSION: DRSA provides a complementary method for improving the predictive performance of the multivariate data analysis usually used in metabolomics. This method could help identify metabolites involved in disease pathogenesis. Interestingly, these different strategies mostly identified the same metabolites as discriminant. The selection of strong decision rules with high values of Bayesian confirmation provides useful information about relevant condition-decision relationships not otherwise revealed in metabolomics data.
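
As a rough sketch of the rule-ranking step mentioned in the conclusion (the counts and helper name are assumed, not taken from the study), the following computes one simple Bayesian confirmation measure, the difference P(H|E) - P(H), for a decision rule "if condition E then decision H" from case counts; rules with higher values are the stronger candidates.

```python
# Hedged sketch: ranking a DRSA-style decision rule by a simple Bayesian
# confirmation measure computed from case counts (illustrative numbers).

def rule_confirmation(n_e_and_h: int, n_e: int, n_h: int, n_total: int) -> float:
    """Difference measure d(H, E) = P(H | E) - P(H); positive values confirm H."""
    p_h_given_e = n_e_and_h / n_e
    p_h = n_h / n_total
    return p_h_given_e - p_h

# Hypothetical example: 30 of 40 cases matching a rule's condition are ALS,
# while 45 of 100 cases overall are ALS.
print(f"d = {rule_confirmation(30, 40, 45, 100):+.3f}")  # +0.300: the rule confirms ALS
```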


Subject(s)
Amyotrophic Lateral Sclerosis/diagnosis , Biomarkers/chemistry , Computational Biology/methods , Metabolomics/methods , 3-Hydroxybutyric Acid/chemistry , Acetates/chemistry , Acetone/chemistry , Aged , Algorithms , Bayes Theorem , Decision Making , Discriminant Analysis , Female , Humans , Least-Squares Analysis , Magnetic Resonance Spectroscopy , Male , Middle Aged , Multivariate Analysis , Principal Component Analysis
4.
Cogn Sci ; 45(1): e12919, 2021 01.
Article in English | MEDLINE | ID: mdl-33398915

ABSTRACT

In a series of three behavioral experiments, we found a systematic distortion of probability judgments concerning elementary visual stimuli. Participants were briefly shown a set of figures that had two features (e.g., a geometric shape and a color) with two possible values each (e.g., triangle or circle, and black or white). A figure was then drawn, and participants were told the value of one of its features (e.g., that the figure was a "circle") and had to predict the value of the other feature (e.g., whether the figure was "black" or "white"). We repeated this procedure for various sets of figures and, by varying the statistical association between features within the sets, we manipulated the probability of a feature given the evidence of another (e.g., the posterior probability of the hypothesis "black" given the evidence "circle") as well as the support provided by one feature to another (e.g., the impact, or confirmation, of the evidence "circle" on the hypothesis "black"). Results indicated that participants' judgments were deeply affected by impact, although they should have depended only on the probability distributions over the features, and that dissociation between evidential impact and posterior probability increased the number of errors. The implications of these findings for lower- and higher-level cognitive models are discussed.
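
As a worked illustration of the dissociation between posterior probability and evidential impact (the counts are assumed for illustration, not the experiments' actual stimuli), the sketch below builds a set of figures in which "circle" confirms "black" even though "white" remains the more probable answer given "circle", so a judgment driven by impact rather than by the posterior would err.

```python
# Illustrative figure set (assumed counts) showing positive impact with a posterior below 0.5.

counts = {  # counts of figures by (shape, color)
    ("circle", "black"): 4, ("circle", "white"): 6,
    ("triangle", "black"): 1, ("triangle", "white"): 9,
}

n_total = sum(counts.values())
n_circle = counts[("circle", "black")] + counts[("circle", "white")]
n_black = counts[("circle", "black")] + counts[("triangle", "black")]

posterior = counts[("circle", "black")] / n_circle  # P(black | circle)
prior = n_black / n_total                           # P(black)
impact = posterior - prior                          # confirmation of "black" by "circle"

print(f"P(black|circle) = {posterior:.2f}, P(black) = {prior:.2f}, impact = {impact:+.2f}")
# -> 0.40, 0.25, +0.15: "circle" supports "black", yet "white" is still the better prediction.
```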


Subject(s)
Judgment , Humans , Probability
5.
Forensic Sci Int ; 301: e59-e63, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31178229

ABSTRACT

In forensic science, it is not rare for common sayings to be used to support particular inferences. A typical example is the adage 'The absence of evidence is not evidence of absence'. This paper analyzes the rationale behind this statement and offers a structured way to approach the analysis of this particular adage through a careful examination of four different scenarios.
