Results 1 - 20 of 114
1.
Proc Natl Acad Sci U S A ; 121(10): e2313542121, 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38412121

ABSTRACT

Studying the pathways of ligand-receptor binding is essential to understand the mechanism of target recognition by small molecules. The binding free energy and kinetics of protein-ligand complexes can be computed using molecular dynamics (MD) simulations, often in quantitative agreement with experiments. However, only a qualitative picture of the ligand binding/unbinding paths can be obtained through a conventional analysis of the MD trajectories. Moreover, the substantial manual effort involved in analyzing pathways limits the applicability of such analysis in large-scale drug discovery. Here, we address this limitation by introducing an automated approach for analyzing molecular transition paths, with a particular focus on protein-ligand dissociation. Our method is based on the dynamic time warping algorithm, originally designed for speech recognition. We accurately classified molecular trajectories using a very generic descriptor set of contacts or distances. Our approach outperforms manual classification by distinguishing between parallel dissociation channels within the pathways identified by visual inspection. Most notably, we could compute exit-path-specific ligand-dissociation kinetics. The unbinding timescale along the fastest path agrees with the experimental residence time, providing a physical interpretation of our entirely data-driven protocol. In combination with appropriate enhanced sampling algorithms, this technique can be used for the initial exploration of ligand-dissociation pathways as well as for calculating path-specific thermodynamic and kinetic properties.
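A minimal version of such a trajectory classifier can be sketched with a plain dynamic time warping distance plus nearest-reference assignment. This is an illustrative reconstruction, not the authors' code: the reference channels, descriptor values, and the one-dimensional absolute-difference cost are invented for the example; a real application would use the multivariate contact/distance descriptors described above.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D descriptor sequences."""
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion, deletion.
            D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
    return D[n][m]

# Classify a dissociation trajectory by its DTW distance to reference paths
# (toy 1-D "ligand-receptor separation" descriptors; values invented).
reference_paths = {"channel_A": [0.2, 0.5, 1.0, 1.8],
                   "channel_B": [0.2, 0.3, 0.4, 2.0]}
trajectory = [0.2, 0.45, 0.9, 1.7]
label = min(reference_paths,
            key=lambda k: dtw_distance(trajectory, reference_paths[k]))
```

Exit-path-specific kinetics would then follow by grouping trajectories by `label` and estimating first-passage statistics within each group.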

2.
Proc Natl Acad Sci U S A ; 121(7): e2318731121, 2024 Feb 13.
Article in English | MEDLINE | ID: mdl-38315841

ABSTRACT

Capturing rare yet pivotal events poses a significant challenge for molecular simulations. Path sampling provides a unique approach to tackle this issue without altering the potential energy landscape or dynamics, enabling recovery of both thermodynamic and kinetic information. However, despite its exponential acceleration compared to standard molecular dynamics, generating numerous trajectories can still require a long time. By harnessing our recent algorithmic innovations, particularly subtrajectory moves with high acceptance coupled with asynchronous replica exchange featuring infinite swaps, we establish a highly parallelizable and rapidly converging path sampling protocol, compatible with diverse high-performance computing architectures. We demonstrate our approach on the liquid-vapor phase transition in superheated water, the unfolding of the chignolin protein, and water dissociation. The latter, performed at the ab initio level, achieves comparable statistical accuracy within days, in contrast to a previous study that required over a year.

3.
Annu Rev Phys Chem ; 75(1): 137-162, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38941527

ABSTRACT

Dynamical reweighting techniques aim to recover the correct molecular dynamics from a simulation at a modified potential energy surface. They are important for unbiasing enhanced sampling simulations of molecular rare events. Here, we review the theoretical frameworks of dynamical reweighting for modified potentials. Based on an overview of kinetic models with increasing level of detail, we discuss techniques to reweight two-state dynamics, multistate dynamics, and path integrals. We explore the natural link to transition path sampling and how the effect of nonequilibrium forces can be reweighted. We end by providing an outlook on how dynamical reweighting integrates with techniques for optimizing collective variables and with modern potential energy surfaces.

4.
Proc Natl Acad Sci U S A ; 119(29): e2122237119, 2022 Jul 19.
Article in English | MEDLINE | ID: mdl-35858324

ABSTRACT

We use the continuum micromagnetic framework to derive the formulas for compact skyrmion lifetime due to thermal noise in ultrathin ferromagnetic films with relatively weak interfacial Dzyaloshinskii-Moriya interaction. In the absence of a saddle point connecting the skyrmion solution to the ferromagnetic state, we interpret the skyrmion collapse event as "capture by an absorber" at microscale. This yields an explicit Arrhenius collapse rate with both the barrier height and the prefactor as functions of all the material parameters, as well as the dynamical paths to collapse.
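The resulting rate expression takes the generic Arrhenius form. The abstract states that both the barrier height and the prefactor are explicit functions of the material parameters, so a schematic version (symbols chosen here for illustration, not taken from the paper) is:

```latex
k_{\mathrm{collapse}}
  = \Gamma_0(\text{material parameters})\,
    \exp\!\left(-\frac{\Delta E(\text{material parameters})}{k_B T}\right),
```

where $\Delta E$ is the energy barrier to skyrmion collapse, $\Gamma_0$ the attempt-rate prefactor, and $k_B T$ the thermal energy.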

5.
Stat Med ; 43(4): 706-730, 2024 02 20.
Article in English | MEDLINE | ID: mdl-38111986

ABSTRACT

Rare events are events that occur with low frequency. They often arise in clinical trials or cohort studies where the data are arranged in binary contingency tables. In this article, we investigate the estimation of effect heterogeneity for the risk-ratio parameter in meta-analysis of rare events studies through two likelihood-based nonparametric mixture approaches: an arm-based and a contrast-based model. Maximum likelihood estimation is achieved using the EM algorithm. Special attention is given to the choice of initial values. Inspired by the classification likelihood, a strategy is implemented that repeatedly uses random allocation of the studies to the mixture components as the choice of initial values. The likelihoods under the contrast-based and arm-based approaches are compared and differences are highlighted. We use simulations to assess the performance of these two methods. Under the design of sampling studies with nested treatment groups, the results show that the nonparametric mixture model based on the contrast-based approach is more appropriate in terms of model selection criteria such as AIC and BIC. Under the arm-based design, the arm-based model performs well, although in some cases it is outperformed by the contrast-based model. Comparisons of the estimators are provided in terms of bias and mean squared error. Also included in the comparison are the mixed Poisson regression model and the classical DerSimonian-Laird model (using the Mantel-Haenszel estimator for the common effect). In the simulations, the contrast-based method appears to estimate effect heterogeneity better than the compared methods, although differences become negligible for large within-study sample sizes. We illustrate the methodologies using several meta-analytic data sets in medicine.
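The random-allocation restart strategy can be sketched on the simplest related building block: a two-component binomial mixture fitted by EM, where each restart begins from a random classification of the studies and the run with the best log-likelihood is kept. This is an illustration under assumptions, not the paper's model (which is nonparametric and comes in arm-based and contrast-based forms); the component count, restart budget, and toy data are invented, and the binomial coefficient is dropped since it does not depend on the parameters.

```python
import math
import random

def em_binomial_mixture(events, totals, k=2, restarts=10, iters=200, seed=0):
    """Fit a k-component binomial mixture by EM, keeping the best of several
    runs started from random allocations of studies to components."""
    rng = random.Random(seed)
    n = len(events)
    best_ll, best_fit = -float("inf"), None
    for _ in range(restarts):
        # Classification-likelihood-inspired start: random study allocation.
        z = [rng.randrange(k) for _ in range(n)]
        w = [(z.count(c) + 1) / (n + k) for c in range(k)]
        p = [(sum(e for e, zi in zip(events, z) if zi == c) + 0.5) /
             (sum(t for t, zi in zip(totals, z) if zi == c) + 1.0)
             for c in range(k)]
        for _ in range(iters):
            # E-step: responsibility of each component for each study.
            resp = []
            for e, t in zip(events, totals):
                lik = [w[c] * p[c] ** e * (1 - p[c]) ** (t - e) for c in range(k)]
                s = sum(lik) or 1e-300
                resp.append([l / s for l in lik])
            # M-step: update mixture weights and event probabilities.
            w = [sum(r[c] for r in resp) / n for c in range(k)]
            p = [sum(r[c] * e for r, e in zip(resp, events)) /
                 max(sum(r[c] * t for r, t in zip(resp, totals)), 1e-12)
                 for c in range(k)]
        ll = sum(math.log(sum(w[c] * p[c] ** e * (1 - p[c]) ** (t - e)
                              for c in range(k)) or 1e-300)
                 for e, t in zip(events, totals))
        if ll > best_ll:
            best_ll, best_fit = ll, (w, p)
    return best_ll, best_fit

# Toy rare-events data: six study arms, two latent risk groups.
ll, (weights, probs) = em_binomial_mixture([0, 1, 2, 8, 9, 10], [100] * 6)
```

Restarting from random allocations guards against the sensitivity of EM to its starting point, which is the issue the abstract's initial-value strategy addresses.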


Subject(s)
Meta-Analysis as Topic , Humans , Computer Simulation , Likelihood Functions , Odds Ratio , Sample Size
6.
BMC Med Res Methodol ; 24(1): 219, 2024 Sep 27.
Article in English | MEDLINE | ID: mdl-39333867

ABSTRACT

BACKGROUND: There is a growing trend to include non-randomised studies of interventions (NRSIs) in rare events meta-analyses of randomised controlled trials (RCTs) to complement the evidence from the latter. An important consideration when combining RCTs and NRSIs is how to address potential bias and down-weighting of NRSIs in the pooled estimates. The aim of this study is to explore the use of a power prior approach in a Bayesian framework for integrating RCTs and NRSIs to assess the effect of rare events. METHODS: We proposed a method of specifying the down-weighting factor based on judgments of the relative magnitude (no information, and low, moderate, serious and critical risk of bias) of the overall risk of bias for each NRSI using the ROBINS-I tool. The methods were illustrated using two meta-analyses, with particular interest in the risk of diabetic ketoacidosis (DKA) in patients using sodium/glucose cotransporter-2 (SGLT-2) inhibitors compared with active comparators, and the association between low-dose methotrexate exposure and melanoma. RESULTS: No significant results were observed for these two analyses when the data from RCTs only were pooled (risk of DKA: OR = 0.82, 95% confidence interval (CI): 0.25-2.69; risk of melanoma: OR = 1.94, 95% CI: 0.72-5.27). When RCTs and NRSIs were directly combined without distinction in the same meta-analysis, both meta-analyses showed significant results (risk of DKA: OR = 1.50, 95% CI: 1.11-2.03; risk of melanoma: OR = 1.16, 95% CI: 1.08-1.24). Using Bayesian analysis to account for NRSI bias, there was a 90% probability of an increased risk of DKA in users receiving SGLT-2 inhibitors and a 91% probability of an increased risk of melanoma in patients using low-dose methotrexate. CONCLUSIONS: Our study showed that including NRSIs in a meta-analysis of RCTs for rare events could increase the certainty and comprehensiveness of the evidence. The estimates obtained from NRSIs are generally considered to be biased, and the possible influence of NRSIs on the certainty of the combined evidence needs to be carefully investigated.
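The down-weighting idea can be sketched in the simplest conjugate case: a beta prior on an arm-level event probability, where RCT data enter the likelihood at full weight and the NRSI likelihood is raised to a power a0 between 0 and 1 chosen from the ROBINS-I judgment. The specific a0 values, helper names, and counts below are illustrative assumptions, not the mapping or data used in the paper, and a full analysis would model the log odds ratio across studies rather than a single arm probability.

```python
# Illustrative mapping from ROBINS-I overall risk-of-bias judgments to a
# power-prior down-weighting factor a0 (values are assumptions for this sketch).
ROBINS_TO_A0 = {"low": 0.8, "moderate": 0.6, "serious": 0.4, "critical": 0.2}

def power_prior_posterior(x_rct, n_rct, x_nrsi, n_nrsi, rob, a=1.0, b=1.0):
    """Beta(a, b) prior updated with RCT data at full weight and NRSI data
    down-weighted: the NRSI binomial likelihood enters to the power a0."""
    a0 = ROBINS_TO_A0[rob]
    alpha = a + x_rct + a0 * x_nrsi
    beta = b + (n_rct - x_rct) + a0 * (n_nrsi - x_nrsi)
    return alpha, beta  # parameters of the Beta posterior

# Toy numbers: 3/200 events in RCTs, 40/1000 in an NRSI judged 'serious'.
alpha, beta = power_prior_posterior(3, 200, 40, 1000, "serious")
posterior_mean = alpha / (alpha + beta)
```

With a0 = 0 the NRSI is ignored entirely; with a0 = 1 it is pooled as if it were randomized evidence.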


Subject(s)
Bayes Theorem , Meta-Analysis as Topic , Randomized Controlled Trials as Topic , Sodium-Glucose Transporter 2 Inhibitors , Humans , Randomized Controlled Trials as Topic/methods , Randomized Controlled Trials as Topic/statistics & numerical data , Sodium-Glucose Transporter 2 Inhibitors/therapeutic use , Sodium-Glucose Transporter 2 Inhibitors/adverse effects , Methotrexate/therapeutic use , Diabetic Ketoacidosis/chemically induced , Melanoma/drug therapy
7.
Appetite ; 201: 107597, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-38972638

ABSTRACT

We investigated how promoting diverse, healthy food options affects long-term dietary choices. We hypothesized that encouraging exploration of nutritious plant-based foods would lead to lasting improvements in diet. Participants (N = 211) were randomly assigned to two groups for a 6-week intervention: the fixed menu group was given the same large menu every week, while the changing menu group received a new small menu each week. At the end of the intervention, both groups were exposed to the same menu suggestions. Food diversity evaluation was based on weekly reports collected during the intervention. Self-reported adherence to Mediterranean diet components was assessed using the I-MEDAS screener. The proportion of plant-based foods in participants' diets was estimated using a 0-100% scale based on self-report. Both items were evaluated using online questionnaires given to participants at baseline, at the end of the intervention, and three and six months after the intervention concluded. Results (mean (SD)) demonstrated that participants in the fixed menu group explored a significantly wider array of items (26.33 (11.64)) than those in the changing menu group (19.79 (10.29); t(202) = 4.25, p < 0.001, Cohen's d = 0.60). A repeated-measures analysis of covariance (rmANCOVA) revealed that a short-term increase in I-MEDAS and PBD scores occurred in both groups; however, only participants with the fixed menu sustained this increase at follow-up (diff = 1.50, t(132) = 4.50, p < 0.001). Our findings suggest that manipulating the rate of exposure to food suggestions may affect overall dietary variety. It seems that early presentation of options may increase overall dietary variety and may even support longer-term habits. This study contributes to developing effective interventions and highlights the challenge of promoting exploratory behavior in nutrition.


Subject(s)
Diet, Mediterranean , Patient Compliance , Humans , Female , Male , Adult , Middle Aged , Surveys and Questionnaires , Cooking/methods , Diet, Healthy/psychology , Diet, Healthy/methods , Food Preferences/psychology , Young Adult , Feeding Behavior/psychology , Choice Behavior
8.
Pharm Stat ; 2024 Apr 16.
Article in English | MEDLINE | ID: mdl-38628051

ABSTRACT

The meta-analysis of rare events presents unique methodological challenges owing to the small number of events. Bayesian methods are often used to combine rare events data to inform decision-making, as they can incorporate prior information and handle studies with zero events without the need for continuity corrections. However, the comparative performances of different Bayesian models in pooling rare events data are not well understood. We conducted a simulation to compare the statistical properties of four parameterizations based on the binomial-normal hierarchical model, using two different priors for the treatment effect: weakly informative prior (WIP) and non-informative prior (NIP), pooling randomized controlled trials with rare events using the odds ratio metric. We also considered the beta-binomial model proposed by Kuss and the random intercept and slope generalized linear mixed models. The simulation scenarios varied based on the treatment effect, sample size ratio between the treatment and control arms, and level of heterogeneity. Performance was evaluated using median bias, root mean square error, median width of 95% credible or confidence intervals, coverage, Type I error, and empirical power. Two reviews are used to illustrate these methods. The results demonstrate that the WIP outperforms the NIP within the same model structure. Among the compared models, the model that included the treatment effect parameter in the risk model for the control arm did not perform well. Our findings confirm that rare events meta-analysis faces the challenge of being underpowered, highlighting the importance of reporting the power of results in empirical studies.

9.
Sensors (Basel) ; 24(15)2024 Aug 02.
Article in English | MEDLINE | ID: mdl-39124055

ABSTRACT

Rare events are occurrences that take place with a significantly lower frequency than more common, regular events. These events can be categorized into distinct classes, from frequently rare to extremely rare, based on factors such as the distribution of the data and significant differences in rarity levels. In manufacturing domains, predicting such events is particularly important, as they lead to unplanned downtime, shortened equipment lifespans, and high energy consumption. Usually, the rarity of events is inversely correlated with the maturity of a manufacturing industry. Typically, this rarity renders the multivariate data generated within a manufacturing process highly imbalanced, which leads to bias in predictive models. This paper evaluates the role of data enrichment techniques combined with supervised machine learning techniques for rare event detection and prediction. We use time series data augmentation and sampling to address the data scarcity while maintaining its patterns, and imputation techniques to handle null values. Evaluating 15 learning models, we find that data enrichment improves the F1 measure by up to 48% in rare event detection and prediction. Our empirical and ablation experiments provide novel insights, and we also investigate model interpretability.
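Two of the enrichment steps named above, pattern-preserving augmentation and null-value imputation, can be sketched in a few lines. The jitter amplitude, seed, and nearest-neighbour averaging rule are assumptions for illustration; the paper's exact augmentation and imputation techniques are not reproduced here.

```python
import random

def augment_jitter(series, n_copies, sigma=0.02, seed=0):
    """Oversample a rare-event time series by adding small Gaussian jitter,
    keeping the temporal pattern of the original window."""
    rng = random.Random(seed)
    return [[x + rng.gauss(0.0, sigma) for x in series] for _ in range(n_copies)]

def impute_nulls(series):
    """Replace None values by the average of the nearest available
    neighbours, falling back to forward/backward fill at the edges."""
    vals = list(series)
    n = len(vals)
    for i in range(n):
        if vals[i] is None:
            left = next((vals[j] for j in range(i - 1, -1, -1)
                         if vals[j] is not None), None)
            right = next((vals[j] for j in range(i + 1, n)
                          if vals[j] is not None), None)
            if left is None:
                vals[i] = right
            elif right is None:
                vals[i] = left
            else:
                vals[i] = (left + right) / 2.0
    return vals
```

Augmented minority-class windows like these can then be fed to the supervised learners alongside the original data to reduce the class imbalance.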

10.
Proteomics ; 23(21-22): e2200290, 2023 Nov.
Article in English | MEDLINE | ID: mdl-36852539

ABSTRACT

The evolution of omics and computational competency has accelerated discoveries of the underlying biological processes in an unprecedented way. High-throughput methodologies, such as flow cytometry, can reveal deeper insights into cell processes, thereby allowing opportunities for scientific discoveries related to health and diseases. However, working with cytometry data often imposes complex computational challenges due to the high dimensionality, large size, and nonlinearity of the data structure. In addition, cytometry data frequently exhibit diverse patterns across biomarkers and suffer from substantial class imbalances, which can further complicate the problem. The existing methods of cytometry data analysis either predict cell populations or perform feature selection. Through this study, we propose a "wisdom of the crowd" approach to simultaneously predict rare cell populations and perform feature selection by integrating a pool of modern machine learning (ML) algorithms. Given that our approach integrates superior-performing ML models across different normalization techniques based on entropy and rank, our method can detect diverse patterns existing across the model features. Furthermore, the method identifies a dynamic biomarker structure that divides the features into persistently selected, unselected, and fluctuating assemblies, indicating the role of each biomarker in rare cell prediction, which can subsequently aid in studies of disease progression.
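The rank-based integration and the three-way biomarker split can be sketched as follows. The marker names, importances, and top-k cut-off rule are invented for illustration; the study's actual entropy- and rank-based normalization across models is more elaborate.

```python
def aggregate_feature_ranks(importances_per_model, top_k):
    """Rank features within each model, average the ranks across the pool
    ('wisdom of the crowd'), and classify each feature by how often it
    makes the per-model top-k cut."""
    features = list(importances_per_model[0])
    n_models = len(importances_per_model)
    avg_rank, status = {}, {}
    for f in features:
        ranks, selected = [], 0
        for imp in importances_per_model:
            ordered = sorted(features, key=lambda g: -imp[g])
            r = ordered.index(f) + 1
            ranks.append(r)
            selected += r <= top_k
        avg_rank[f] = sum(ranks) / n_models
        frac = selected / n_models
        status[f] = ("persistent" if frac == 1.0 else
                     "unselected" if frac == 0.0 else "fluctuating")
    return avg_rank, status

# Toy importances from three models over three cytometry markers (invented).
models = [{"CD3": 0.9, "CD19": 0.5, "FSC": 0.1},
          {"CD3": 0.8, "CD19": 0.2, "FSC": 0.3},
          {"CD3": 0.7, "CD19": 0.6, "FSC": 0.2}]
avg_rank, status = aggregate_feature_ranks(models, top_k=2)
```

The persistent/unselected/fluctuating split mirrors the dynamic biomarker structure described in the abstract.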


Subject(s)
Algorithms , Machine Learning , Biomarkers/analysis
11.
Philos Trans A Math Phys Eng Sci ; 381(2250): 20220245, 2023 Jul 10.
Article in English | MEDLINE | ID: mdl-37211032

ABSTRACT

Discrete state Markov chains in discrete or continuous time are widely used to model phenomena in the social, physical and life sciences. In many cases, the model can feature a large state space, with extreme differences between the fastest and slowest transition timescales. Analysis of such ill-conditioned models is often intractable with finite precision linear algebra techniques. In this contribution, we propose a solution to this problem, namely partial graph transformation, to iteratively eliminate and renormalize states, producing a low-rank Markov chain from an ill-conditioned initial model. We show that the error induced by this procedure can be minimized by retaining both the renormalized nodes that represent metastable superbasins, and those through which reactive pathways concentrate, i.e. the dividing surface in the discrete state space. This procedure typically returns a much lower rank model, where trajectories can be efficiently generated with kinetic path sampling. We apply this approach to an ill-conditioned Markov chain for a model multi-community system, measuring the accuracy by direct comparison with trajectories and transition statistics. This article is part of a discussion meeting issue 'Supercomputing simulations of advanced materials'.
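The elimination step at the heart of (partial) graph transformation can be sketched for a discrete-time chain: removing a state x folds all pathways through x into direct, renormalized edges. This sketch handles only the transition probabilities; the full procedure also renormalizes mean waiting times, and the three-state example chain is invented.

```python
def eliminate_state(P, x):
    """Remove state x from a row-stochastic transition matrix (dict of
    dicts), folding paths through x into direct edges:
        P'[i][j] = P[i][j] + P[i][x] * P[x][j] / (1 - P[x][x])."""
    keep = [s for s in P if s != x]
    escape = 1.0 - P[x].get(x, 0.0)  # probability of leaving x per attempt
    return {i: {j: P[i].get(j, 0.0)
                   + P[i].get(x, 0.0) * P[x].get(j, 0.0) / escape
                for j in keep}
            for i in keep}

# Toy 3-state chain A <-> B <-> C; eliminate the intermediate state B.
P = {"A": {"A": 0.5, "B": 0.5, "C": 0.0},
     "B": {"A": 0.25, "B": 0.5, "C": 0.25},
     "C": {"A": 0.0, "B": 0.5, "C": 0.5}}
P2 = eliminate_state(P, "B")
```

The reduced chain stays row-stochastic, and repeated elimination of numerically troublesome states yields the lower-rank model on which trajectories can be regenerated, e.g. with kinetic path sampling.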

12.
Pharm Stat ; 22(3): 492-507, 2023.
Article in English | MEDLINE | ID: mdl-36585125

ABSTRACT

A stratified analysis of the differences in proportions has been widely employed in epidemiological research, social sciences, and drug development. It provides a useful framework for combining data across strata to produce a common effect. However, for rare events with incidence rates close to zero, popular confidence intervals for risk differences in a stratified analysis may not have appropriate coverage probabilities that approach the nominal confidence levels and the algorithms may fail to produce a valid confidence interval because of zero events in both the arms of a stratum. The main objective of this study is to evaluate the performance of certain methods commonly employed to construct confidence intervals for stratified risk differences when the response probabilities are close to a boundary value of zero or one. Additionally, we propose an improved stratified Miettinen-Nurminen confidence interval that exhibits a superior performance over standard methods while avoiding computational difficulties involving rare events. The proposed method can also be employed when the response probabilities are close to one.
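As a point of reference for the interval methods discussed, the stratified point estimate itself can be written down directly with Mantel-Haenszel weights. This sketch gives only the common risk difference; the proposed stratified Miettinen-Nurminen interval construction is not reproduced here, and the toy strata are invented.

```python
def mh_risk_difference(strata):
    """Mantel-Haenszel-weighted common risk difference across strata.
    Each stratum is (x1, n1, x0, n0): events and sizes in arms 1 and 0."""
    num = den = 0.0
    for x1, n1, x0, n0 in strata:
        w = n1 * n0 / (n1 + n0)          # MH weight for the risk difference
        num += w * (x1 / n1 - x0 / n0)
        den += w
    return num / den

# Two strata, including one with zero events in an arm (a rare-events case).
rd = mh_risk_difference([(1, 10, 0, 10), (0, 20, 2, 20)])
```

Note that, unlike some interval algorithms, this point estimate remains defined even when a stratum has zero events in one arm; strata with zero events in both arms contribute a zero difference.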


Subject(s)
Confidence Intervals , Humans , Probability
13.
Sensors (Basel) ; 23(12)2023 Jun 09.
Article in English | MEDLINE | ID: mdl-37420632

ABSTRACT

We report on the development of scintillating bolometers based on lithium molybdate crystals containing molybdenum depleted in the double-β active isotope 100Mo (Li2100deplMoO4). We used two cubic Li2100deplMoO4 samples, each with 45 mm sides and a mass of 0.28 kg; these samples were produced following the purification and crystallization protocols developed for double-β decay search experiments with 100Mo-enriched Li2MoO4 crystals. Bolometric Ge detectors were utilized to register the scintillation photons emitted by the Li2100deplMoO4 crystal scintillators. The measurements were performed in the CROSS cryogenic set-up at the Canfranc Underground Laboratory (Spain). We observed that the Li2100deplMoO4 scintillating bolometers were characterized by an excellent spectrometric performance (∼3-6 keV FWHM at 0.24-2.6 MeV γs), a moderate scintillation signal (∼0.3-0.6 keV/MeV scintillation-to-heat energy ratio, depending on the light collection conditions), and high radiopurity (228Th and 226Ra activities below a few µBq/kg), which is comparable with the best reported results for low-temperature detectors based on Li2MoO4 with natural or 100Mo-enriched molybdenum content. The prospects of Li2100deplMoO4 bolometers for use in rare-event search experiments are briefly discussed.


Subject(s)
Molybdenum , Radium , Isotopes , Scintillation Counting/methods , Lithium , Ions
14.
Biom J ; 65(3): e2200132, 2023 03.
Article in English | MEDLINE | ID: mdl-36216590

ABSTRACT

Meta-analysis of binary data is challenging when the event under investigation is rare, and standard models for random-effects meta-analysis perform poorly in such settings. In this simulation study, we investigate the performance of different random-effects meta-analysis models in terms of point and interval estimation of the pooled log odds ratio in rare events meta-analysis. First and foremost, we evaluate the performance of a hypergeometric-normal model from the family of generalized linear mixed models (GLMMs), which has been recommended, but has not yet been thoroughly investigated for rare events meta-analysis. Performance of this model is compared to performance of the beta-binomial model, which yielded favorable results in previous simulation studies, and to the performance of models that are frequently used in rare events meta-analysis, such as the inverse variance model and the Mantel-Haenszel method. In addition to considering a large number of simulation parameters inspired by real-world data settings, we study the comparative performance of the meta-analytic models under two different data-generating models (DGMs) that have been used in past simulation studies. The results of this study show that the hypergeometric-normal GLMM is useful for meta-analysis of rare events when moderate to large heterogeneity is present. In addition, our study reveals important insights with regard to the performance of the beta-binomial model under different DGMs from the binomial-normal family. In particular, we demonstrate that although misalignment of the beta-binomial model with the DGM affects its performance, it shows more robustness to the DGM than its competitors.


Subject(s)
Models, Statistical , Odds Ratio , Computer Simulation , Linear Models
15.
Am J Epidemiol ; 191(3): 487-498, 2022 02 19.
Article in English | MEDLINE | ID: mdl-34718388

ABSTRACT

Estimating incidence of rare cancers is challenging for exceptionally rare entities and in small populations. In a previous study, investigators in the Information Network on Rare Cancers (RARECARENet) provided Bayesian estimates of expected numbers of rare cancers and 95% credible intervals for 27 European countries, using data collected by population-based cancer registries. In that study, slightly different results were found by implementing a Poisson model in integrated nested Laplace approximation/WinBUGS platforms. In this study, we assessed the performance of a Poisson modeling approach for estimating rare cancer incidence rates, oscillating around an overall European average and using small-count data in different scenarios/computational platforms. First, we compared the performance of frequentist, empirical Bayes, and Bayesian approaches for providing 95% confidence/credible intervals for the expected rates in each country. Second, we carried out an empirical study using 190 rare cancers to assess different lower/upper bounds of a uniform prior distribution for the standard deviation of the random effects. For obtaining a reliable measure of variability for country-specific incidence rates, our results suggest the suitability of using 1 as the lower bound for that prior distribution and selecting the random-effects model through an averaged indicator derived from 2 Bayesian model selection criteria: the deviance information criterion and the Watanabe-Akaike information criterion.


Subject(s)
Neoplasms , Bayes Theorem , Europe/epidemiology , Humans , Incidence , Neoplasms/epidemiology , Registries
16.
Philos Trans A Math Phys Eng Sci ; 380(2226): 20210036, 2022 Jun 27.
Article in English | MEDLINE | ID: mdl-35527637

ABSTRACT

Transitional localized turbulence in shear flows is known to either decay to an absorbing laminar state or to proliferate via splitting. The average passage times from one state to the other depend super-exponentially on the Reynolds number and lead to a crossing Reynolds number above which proliferation is more likely than decay. In this paper, we apply a rare-event algorithm, Adaptive Multilevel Splitting, to the deterministic Navier-Stokes equations to study transition paths and estimate large passage times in channel flow more efficiently than direct simulations. We establish a connection with extreme value distributions and show that transition between states is mediated by a regime that is self-similar with the Reynolds number. The super-exponential variation of the passage times is linked to the Reynolds number dependence of the parameters of the extreme value distribution. Finally, motivated by instantons from Large Deviation theory, we show that decay or splitting events approach a most-probable pathway. This article is part of the theme issue 'Mathematical problems in physical fluid dynamics (part 2)'.
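The skeleton of adaptive multilevel splitting can be sketched on a toy problem: estimating the probability that a drifted random walk reaches a high level before an absorbing low one. The dynamics, levels, killing rule, and guard threshold are all invented for this sketch; the paper applies the method to the deterministic Navier-Stokes equations, not to this one-dimensional surrogate.

```python
import random

def simulate_path(x0, rng, x_a=-1.0, x_b=1.0, drift=-1.0, dt=0.1):
    """Euler-Maruyama walk from x0 until absorption at x_a or success at x_b."""
    path, x = [x0], x0
    while x_a < x < x_b:
        x += drift * dt + dt ** 0.5 * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def ams_probability(n_walkers=100, seed=1):
    """Adaptive multilevel splitting estimate of P(reach x_b before x_a):
    repeatedly kill the walker with the lowest maximum excursion and
    rebranch it from a surviving trajectory at that level."""
    rng = random.Random(seed)
    paths = [simulate_path(0.0, rng) for _ in range(n_walkers)]
    p = 1.0
    while True:
        scores = [max(path) for path in paths]
        level = min(scores)
        if level >= 1.0:               # every walker has reached x_b
            return p
        p *= (n_walkers - 1) / n_walkers
        worst = scores.index(level)
        donor = rng.choice([i for i in range(n_walkers) if i != worst])
        dpath = paths[donor]
        k = next(t for t, x in enumerate(dpath) if x >= level)
        paths[worst] = dpath[:k + 1] + simulate_path(dpath[k], rng)[1:]
        if p < 1e-12:                  # guard for this toy setting
            return p

p_rare = ams_probability()
```

Each killing step multiplies the estimate by (N - 1)/N, so events far rarer than the per-walker hitting probability remain accessible with modest ensembles.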

17.
Philos Trans A Math Phys Eng Sci ; 380(2218): 20210088, 2022 Mar 07.
Article in English | MEDLINE | ID: mdl-35034489

ABSTRACT

Intense fluctuations of energy dissipation rate in turbulent flows result from the self-amplification of strain rate via a quadratic nonlinearity, with contributions from vorticity (via the vortex stretching mechanism) and pressure-Hessian-which are analysed here using direct numerical simulations of isotropic turbulence on up to [Formula: see text] grid points, and Taylor-scale Reynolds numbers in the range 140-1300. We extract the statistics involved in amplification of strain and condition them on the magnitude of strain. We find that strain is self-amplified by the quadratic nonlinearity, and depleted via vortex stretching, whereas pressure-Hessian acts to redistribute strain fluctuations towards the mean-field and hence depletes intense strain. Analysing the intense fluctuations of strain in terms of its eigenvalues reveals that the net amplification is solely produced by the third eigenvalue, resulting in strong compressive action. By contrast, the self-amplification acts to deplete the other two eigenvalues, whereas vortex stretching acts to amplify them, with both effects cancelling each other almost perfectly. The effect of the pressure-Hessian for each eigenvalue is qualitatively similar to that of vortex stretching, but significantly weaker in magnitude. Our results conform with the familiar notion that intense strain is organized in sheet-like structures, which are in the vicinity of, but never overlap with tube-like regions of intense vorticity due to fundamental differences in their amplifying mechanisms. This article is part of the theme issue 'Scaling the turbulence edifice (part 1)'.

18.
J Biomed Inform ; 129: 104072, 2022 05.
Article in English | MEDLINE | ID: mdl-35421602

ABSTRACT

BACKGROUND: Medical decision-making impacts both individual and public health. Clinical scores are commonly used among various decision-making models to determine the degree of disease deterioration at the bedside. AutoScore was proposed as a useful clinical score generator based on machine learning and a generalized linear model. However, its current framework still leaves room for improvement when addressing unbalanced data of rare events. METHODS: Using machine intelligence approaches, we developed AutoScore-Imbalance, which comprises three components: training dataset optimization, sample weight optimization, and adjusted AutoScore. Baseline techniques for performance comparison included the original AutoScore, full logistic regression, stepwise logistic regression, least absolute shrinkage and selection operator (LASSO), full random forest, and random forest with a reduced number of variables. These models were evaluated based on their area under the curve (AUC) in the receiver operating characteristic analysis and balanced accuracy (i.e., mean value of sensitivity and specificity). By utilizing a publicly accessible dataset from Beth Israel Deaconess Medical Center, we assessed the proposed model and baseline approaches to predict inpatient mortality. RESULTS: AutoScore-Imbalance outperformed baselines in terms of AUC and balanced accuracy. The nine-variable AutoScore-Imbalance sub-model achieved the highest AUC of 0.786 (0.732-0.839), while the eleven-variable original AutoScore obtained an AUC of 0.723 (0.663-0.783), and the logistic regression with 21 variables obtained an AUC of 0.743 (0.685-0.801). The AutoScore-Imbalance sub-model (using a down-sampling algorithm) yielded an AUC of 0.771 (0.718-0.823) with only five variables, demonstrating a good balance between performance and variable sparsity. 
Furthermore, AutoScore-Imbalance obtained the highest balanced accuracy of 0.757 (0.702-0.805), compared to 0.698 (0.643-0.753) by the original AutoScore and the maximum of 0.720 (0.664-0.769) by other baseline models. CONCLUSIONS: We have developed an interpretable tool to handle clinical data imbalance, presented its structure, and demonstrated its superiority over baselines. The AutoScore-Imbalance tool can be applied to highly unbalanced datasets to gain further insight into rare medical events and facilitate real-world clinical decision-making.
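Two ingredients named above are easy to make concrete: the balanced-accuracy metric and a down-sampling step for the majority class. The helper names and toy labels are ours; AutoScore-Imbalance itself wraps these ideas in a larger score-generation pipeline.

```python
import random

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity, the metric used to evaluate
    score models on unbalanced outcomes."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return (sens + spec) / 2.0

def downsample_majority(X, y, seed=0):
    """Randomly down-sample the majority class to the minority-class size."""
    rng = random.Random(seed)
    pos = [i for i, t in enumerate(y) if t == 1]
    neg = [i for i, t in enumerate(y) if t == 0]
    major, minor = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    keep = sorted(minor + rng.sample(major, len(minor)))
    return [X[i] for i in keep], [y[i] for i in keep]
```

On a mortality-style outcome with ~1% positives, plain accuracy rewards predicting "no event" for everyone; balanced accuracy does not, which is why it is reported alongside AUC above.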


Subject(s)
Algorithms , Machine Learning , Clinical Decision-Making , Logistic Models , ROC Curve
19.
Br J Anaesth ; 129(5): 647-649, 2022 11.
Article in English | MEDLINE | ID: mdl-36030133

ABSTRACT

The response to the COVID-19 pandemic and the approach to patient safety share three important concepts: the challenges of preventing rare events, use of rules, and tolerance for uncertainty. We discuss how each of these ideas can be utilised in perioperative safety to create a high-reliability system.


Subject(s)
COVID-19 , Humans , Pandemics/prevention & control , Patient Safety , Reproducibility of Results , Uncertainty
20.
Article in English | MEDLINE | ID: mdl-38624818

ABSTRACT

Automated valuation models (AVMs) are widely used by financial institutions to estimate the property value for a residential mortgage. The distribution of pricing errors obtained from AVMs generally shows fat tails (Pender, 2016; Demiroglu and James, Management Science, 64(4), 1747-1760, 2018). The extreme events on the tails are usually known as "black swans" (Taleb, 2010) in finance, and their existence complicates financial risk management, assessment, and regulation. We show via theory, Monte Carlo experiments, and an empirical example that a direct relation exists between the non-normality of the pricing errors and the goodness-of-fit of the house pricing models. Specifically, we provide an empirical example using US housing prices where we demonstrate an almost perfectly linear relation between the estimated degrees of freedom for a Student's t distribution and the goodness-of-fit of sophisticated evaluation models with spatial and spatiotemporal dependence.
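The reported link between fitted degrees of freedom and model fit can be probed with a minimal maximum-likelihood fit of a scaled Student's t to a vector of pricing errors. The grid values and toy residuals are assumptions; lower fitted degrees of freedom indicate fatter tails, i.e. more "black swan" risk left in the model's errors.

```python
import math

def t_loglik(errors, df, scale):
    """Log-likelihood of i.i.d. errors under a scaled Student-t(df) density."""
    c = (math.lgamma((df + 1) / 2.0) - math.lgamma(df / 2.0)
         - 0.5 * math.log(df * math.pi) - math.log(scale))
    return sum(c - (df + 1) / 2.0 * math.log1p((e / scale) ** 2 / df)
               for e in errors)

def fit_t_df(errors, dfs=(2, 3, 4, 6, 8, 12, 20, 50, 200)):
    """Grid-search MLE over (df, scale); low df means fat-tailed errors."""
    m = sum(errors) / len(errors)
    s = (sum((e - m) ** 2 for e in errors) / len(errors)) ** 0.5
    grid = [(df, s * f) for df in dfs for f in (0.25, 0.5, 0.75, 1.0, 1.25)]
    return max(grid, key=lambda p: t_loglik(errors, p[0], p[1]))

# Toy residuals: mostly small errors plus two extreme outliers (invented).
errors = [0.1, -0.1, 0.2, -0.2, 0.05, -0.05, 5.0, -4.0]
df_hat, scale_hat = fit_t_df(errors)
```

A better-specified pricing model shrinks the outliers, and the fitted df drifts upward toward the near-Gaussian end of the grid, which is the direction of the linear relation the paper documents.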
