Results 1 - 20 of 33
1.
Biomimetics (Basel) ; 9(8)2024 Aug 14.
Article in English | MEDLINE | ID: mdl-39194473

ABSTRACT

The collapsible tubular mast (CTM) can be compactly folded for transport and deployed in orbit to serve as a key structural element. Once deployed, the CTM is vulnerable to buckling under axial load and bending moments, compromising its load-bearing capacity. The intricate relationship between the CTM's cross-section and its buckling behavior poses a significant challenge for designers. This is due to the ultra-thin nature of the CTM, which gives rise to highly localized buckling modes rather than global ones. To overcome this challenge, we developed surrogate models using a neural network (NN) trained with data from finite element analysis (FEA). These NN-based surrogate models provide high computational accuracy in predicting nonlinear buckling loads under axial force and bending moments around the two principal axes of the CTM's cross-section, achieving R2 values of 0.9906, 0.9987, and 0.9628, respectively. These models also significantly improve computational efficiency, reducing prediction time to a fraction of a second compared to several minutes with FEA. Furthermore, the NN-based surrogate models enable the use of the non-dominated sorting genetic algorithm (NSGA-II) for multi-objective optimization (MOO) of the CTMs. These models can be integrated into the NSGA-II algorithm to evaluate the objective function of existing and new individuals until a set of 1000 non-dominated solutions, i.e., cross-sectional configurations optimizing buckling performance, is identified. The proposed approach enables the design of ultra-thin CTMs with optimized stability and structural integrity by promoting design decisions based on the quantitative information provided by the NN-based surrogate models.
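The workflow above (NN surrogate trained on FEA data, then used inside a multi-objective search) can be sketched in miniature. This is an illustrative stand-in, not the paper's method: the "FEA" data, the cross-section parameters, and both objectives are synthetic, and a brute-force non-dominated filter stands in for NSGA-II's survival step.

```python
# Minimal sketch: NN surrogate of buckling load + non-dominated filtering.
# All data, parameter names, and objectives are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic "FEA" dataset: cross-section parameters -> critical buckling load.
X = rng.uniform(0.1, 1.0, size=(500, 3))   # e.g. radius, thickness, subtended angle
y = 5 * X[:, 0] * X[:, 1] - 2 * X[:, 2] ** 2 + 0.01 * rng.standard_normal(500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                         random_state=0).fit(X_tr, y_tr)
r2 = r2_score(y_te, surrogate.predict(X_te))   # mimics the paper's R^2 check

def non_dominated(F):
    """Indices of non-dominated rows of F (all objectives minimized)."""
    keep = []
    for i, fi in enumerate(F):
        dominated = any(np.all(fj <= fi) and np.any(fj < fi)
                        for j, fj in enumerate(F) if j != i)
        if not dominated:
            keep.append(i)
    return keep

# Two illustrative objectives: maximize buckling load (minimize its negative)
# and minimize a cross-sectional "mass" proxy.
cand = rng.uniform(0.1, 1.0, size=(200, 3))
F = np.column_stack([-surrogate.predict(cand), cand[:, 0] * cand[:, 1]])
pareto = non_dominated(F)
```

In a real NSGA-II loop the surrogate would score every offspring each generation, which is where the fraction-of-a-second prediction time pays off.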

2.
Environ Pollut ; 362: 124820, 2024 Aug 26.
Article in English | MEDLINE | ID: mdl-39197641

ABSTRACT

To analyze the surface cumulative mass of VOCs from residual sources in dual-media fractured rocks and assess environmental health risks, complex 3D numerical models were constructed. These models comprehensively considered fracture-rock interactions, density-driven effects, and surface pressure fluctuations. The investigation identified the key control factors affecting surface cumulative mass, including the fracture aperture, pollutant source location, and fracture density. Additionally, a regression-based general surrogate model was established using the obtained representative dataset. According to U.S. EPA's Respiratory Inhalation Reference Concentrations, the cumulative mass of CH2ClCHCl2 in one day for one-third of the model domain exceeds the concentration limit. Benzene and TCE concentrations reached 29 and 740 times the reference limits, significantly impacting air quality and health. Surrogate model analysis showed that in the worst-case scenario, one minute's surface cumulative mass could cause benzene concentrations to exceed the limit by 57 times. The implication of the study is that even after groundwater remediation in the saturated zone, residual VOCs in the capillary zones can still significantly impact surface environmental health risks. This investigation also presents an effective framework that integrates complex, time-consuming numerical modeling with simple, efficient statistical modeling to predict the variables of concern and their uncertainties. This study provides a reference basis for the control of environmental pollution pertaining to VOC volatilization from buried capillary zones at specific depths.
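The "regression surrogate + exceedance risk" framework above can be sketched as follows. Everything here is an illustrative assumption: the features, coefficients, and the reference limit are synthetic stand-ins for the numerical-model dataset.

```python
# Hedged sketch: regression surrogate of cumulative mass + exceedance risk.
# Features, coefficients, and the limit are illustrative, not the paper's.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Features: fracture aperture, source depth, fracture density (arbitrary units).
X = rng.uniform(0, 1, size=(300, 3))
log_mass = (2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 2]
            + 0.1 * rng.standard_normal(300))

reg = LinearRegression().fit(X, log_mass)
resid_sd = np.std(log_mass - reg.predict(X))   # residual spread = uncertainty

def exceedance_prob(x, log_limit, n=5000):
    """Monte Carlo probability that cumulative mass exceeds the limit."""
    draws = reg.predict(x[None, :])[0] + resid_sd * rng.standard_normal(n)
    return float(np.mean(draws > log_limit))

# Worst-case-like input: wide aperture, shallow source, dense fractures.
p = exceedance_prob(np.array([0.9, 0.1, 0.9]), log_limit=1.0)
```

The residual-based Monte Carlo step is one simple way to attach an uncertainty to the surrogate's prediction, in the spirit of the framework described above.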

3.
MethodsX ; 13: 102840, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39071996

ABSTRACT

The enhanced multi-objective symbolic discretization for time series (eMODiTS) uses an evolutionary process to identify an appropriate discretization scheme for the Time Series Classification (TSC) task. It discretizes using a unique alphabet cut for each word segment. However, this kind of scheme has a higher computational cost. Therefore, this study implemented surrogate models to minimize this cost. The general procedure is summarized below.
• K-nearest neighbor regression, support vector regression, and Radial Basis Function neural networks were implemented as surrogate models to estimate the objective values of eMODiTS, including the discretization process.
• An archive-based update strategy was introduced to maintain diversity in the training set.
• Finally, the model update process uses a hybrid (fixed and dynamic) approach for the surrogate model's evolution control.
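The first bullet above can be illustrated with one of the three surrogates, k-nearest-neighbor regression, estimating an expensive objective from an archive of already-evaluated individuals. The objective function here is a cheap synthetic stand-in for the real discretization-and-classification evaluation.

```python
# Hedged sketch: KNN-regression surrogate estimating an expensive objective.
# The objective and the 2-D encoding of a "scheme" are illustrative stand-ins.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)

def expensive_objective(schemes):
    """Stand-in for evaluating a discretization scheme on the TSC task."""
    return np.sin(3 * schemes[:, 0]) + schemes[:, 1] ** 2

# Archive of already-evaluated schemes = surrogate training set.
archive_X = rng.uniform(0, 1, size=(200, 2))
archive_y = expensive_objective(archive_X)
surrogate = KNeighborsRegressor(n_neighbors=5).fit(archive_X, archive_y)

# New individuals are scored by the surrogate instead of the expensive
# evaluation; under the hybrid evolution control described above, a controlled
# subset would be re-evaluated exactly and appended to the archive.
candidates = rng.uniform(0, 1, size=(50, 2))
estimates = surrogate.predict(candidates)
exact = expensive_objective(candidates)
mae = float(np.mean(np.abs(estimates - exact)))
```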

4.
Comput Methods Programs Biomed ; 252: 108234, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38823206

ABSTRACT

BACKGROUND AND OBJECTIVE: Patient-specific 3D computational fluid dynamics (CFD) models are increasingly being used to understand and predict transarterial radioembolization procedures used for hepatocellular carcinoma treatment. While sensitivity analyses of these CFD models can help to determine the most impactful input parameters, such analyses are computationally costly. Therefore, we aim to use surrogate modelling to allow relatively cheap sensitivity analysis. As an example, we compute Sobol's sensitivity indices for three input waveform shape parameters. METHODS: We extracted three characteristic shape parameters from our input mass flow rate waveform (peak systolic mass flow rate, heart rate, systolic duration) and defined our 3D input parameter space by varying these parameters within 75 %-125 % of their nominal values. To fit our surrogate model with a minimal number of costly CFD simulations, we developed an adaptive design of experiments (ADOE) algorithm. The ADOE uses 100 Latin hypercube sampled points in 3D input space to define the initial design of experiments (DOE). Subsequently, we re-sample input space with 10,000 Latin Hypercube sampled points and cheaply estimate the outputs using the surrogate model. In each of 27 equivolume bins which divide our input space, we determine the most uncertain prediction of the 10,000 points, compute the true outputs using CFD, and add these points to the DOE. For each ADOE iteration, we calculate Sobol's sensitivity indices, and we continue to add batches of 27 samples to the DOE until the Sobol indices have stabilized. RESULTS: We tested our ADOE algorithm on the Ishigami function and showed that we can reliably obtain Sobol's indices with an absolute error <0.1. Applying ADOE to our waveform sensitivity problem, we found that the first-order sensitivity indices were 0.0550, 0.0191 and 0.407 for the peak systolic mass flow rate, heart rate, and the systolic duration, respectively. 
CONCLUSIONS: Although the current study was an illustrative case, the ADOE allows reliable sensitivity analysis with a limited number of complex model evaluations, and performs well even when the optimal DOE size is a priori unknown. This enables us to identify the highest-impact input parameters of our model, and other novel, costly models in the future.
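The Ishigami validation step above can be reproduced in miniature with the standard Saltelli pick-freeze estimator of first-order Sobol indices and Latin hypercube sampling. This sketch evaluates the true function rather than a surrogate, and does not implement the authors' ADOE loop.

```python
# Hedged sketch: first-order Sobol indices of the Ishigami test function via
# the Saltelli pick-freeze estimator. The ADOE algorithm itself is not shown.
import numpy as np
from scipy.stats import qmc

def ishigami(X, a=7.0, b=0.1):
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

n, d = 16384, 3
sampler = qmc.LatinHypercube(d=d, seed=0)
# Two independent sample blocks scaled to [-pi, pi]^3.
A = qmc.scale(sampler.random(n), -np.pi, np.pi)
B = qmc.scale(sampler.random(n), -np.pi, np.pi)

fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # freeze all columns except the i-th
    S[i] = np.mean(fB * (ishigami(ABi) - fA)) / var
# Analytical first-order indices: roughly 0.314, 0.442, and 0.
```

The estimator converges to the analytical Ishigami indices, matching the <0.1 absolute-error criterion the authors report for their surrogate-based version.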


Subjects
Algorithms, Hepatocellular Carcinoma, Therapeutic Embolization, Liver Neoplasms, Humans, Liver Neoplasms/radiotherapy, Hepatocellular Carcinoma/radiotherapy, Therapeutic Embolization/methods, Normal Distribution, Liver, Computer Simulation, Hydrodynamics, Regression Analysis, Three-Dimensional Imaging
5.
Virol Sin ; 39(3): 434-446, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38556051

ABSTRACT

The Ebola virus (EBOV) is a member of the Orthoebolavirus genus, Filoviridae family, which causes severe hemorrhagic diseases in humans and non-human primates (NHPs), with a case fatality rate of up to 90%. The development of countermeasures against EBOV has been hindered by the lack of ideal animal models, as EBOV requires handling in biosafety level (BSL)-4 facilities. Therefore, accessible and convenient animal models are urgently needed to promote prophylactic and therapeutic approaches against EBOV. In this study, a recombinant vesicular stomatitis virus expressing Ebola virus glycoprotein (VSV-EBOV/GP) was constructed and applied as a surrogate virus, establishing a lethal infection in hamsters. Following infection with VSV-EBOV/GP, 3-week-old female Syrian hamsters exhibited disease signs such as weight loss, multi-organ failure, severe uveitis, high viral loads, and developed severe systemic diseases similar to those observed in human EBOV patients. All animals succumbed at 2-3 days post-infection (dpi). Histopathological changes indicated that VSV-EBOV/GP targeted liver cells, suggesting that the tissue tropism of VSV-EBOV/GP was comparable to wild-type EBOV (WT EBOV). Notably, the pathogenicity of the VSV-EBOV/GP was found to be species-specific, age-related, gender-associated, and challenge route-dependent. Subsequently, equine anti-EBOV immunoglobulins and a subunit vaccine were validated using this model. Overall, this surrogate model represents a safe, effective, and economical tool for rapid preclinical evaluation of medical countermeasures against EBOV under BSL-2 conditions, which would accelerate technological advances and breakthroughs in confronting Ebola virus disease.


Subjects
Animal Disease Models, Ebolavirus, Ebola Virus Disease, Mesocricetus, Animals, Ebola Virus Disease/virology, Ebola Virus Disease/pathology, Ebolavirus/genetics, Ebolavirus/pathogenicity, Female, Humans, Vesiculovirus/genetics, Vesiculovirus/pathogenicity, Antiviral Antibodies/blood, Cricetinae, Viral Load, Glycoproteins/genetics, Glycoproteins/immunology
6.
Water Res ; 252: 121202, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38290237

ABSTRACT

Hydrodynamic models can accurately simulate flood inundation but are limited by their high computational demand that scales non-linearly with model complexity, resolution, and domain size. Therefore, it is often not feasible to use high-resolution hydrodynamic models for real-time flood predictions or when a large number of predictions are needed for probabilistic flood design. Computationally efficient surrogate models have been developed to address this issue. The recently developed Low-fidelity, Spatial analysis, and Gaussian Process Learning (LSG) model has shown strong performance in both computational efficiency and simulation accuracy. The LSG model is a physics-guided surrogate model that simulates flood inundation by first using an extremely coarse and simplified (i.e. low-fidelity) hydrodynamic model to provide an initial estimate of flood inundation. Then, the low-fidelity estimate is upskilled via Empirical Orthogonal Functions (EOF) analysis and Sparse Gaussian Process models to provide accurate high-resolution predictions. Despite the promising results achieved thus far, the LSG model has not been benchmarked against other surrogate models. Such a comparison is needed to fully understand the value of the LSG model and to provide guidance for future research efforts in flood inundation simulation. This study compares the LSG model to four state-of-the-art surrogate flood inundation models. The surrogate models are assessed for their ability to simulate the temporal and spatial evolution of flood inundation for events both within and beyond the range used for model training. The models are evaluated for three distinct case studies in Australia and the United Kingdom. The LSG model is found to be superior in accuracy for both flood extent and water depth, including when applied to flood events outside the range of training data used, while achieving high computational efficiency. 
In addition, the low-fidelity model is found to play a crucial role in achieving the overall superior performance of the LSG model.
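The LSG pipeline described above (coarse hydrodynamic estimate, upskilled through an EOF basis and Gaussian processes) can be sketched on toy 1-D data. All dimensions, the stand-in "flood" model, and the coarse proxy are illustrative assumptions, not the benchmarked models.

```python
# Hedged sketch of the LSG idea: coarse profile -> EOF/PCA coefficients via
# Gaussian processes -> fine-resolution profile. Toy data, not a real model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)
x_fine = np.linspace(0, 1, 100)

def fine_field(a):
    """Stand-in high-fidelity model: flood-depth profile for forcing a."""
    return np.maximum(0.0, a - 4 * (x_fine - 0.5) ** 2)

a_train = rng.uniform(0.2, 1.0, 40)
Y_fine = np.array([fine_field(a) for a in a_train])                 # (40, 100)
X_coarse = Y_fine[:, ::10] + 0.01 * rng.standard_normal((40, 10))   # low-fidelity proxy

pca = PCA(n_components=3).fit(Y_fine)        # EOF basis of the fine fields
coeffs = pca.transform(Y_fine)
gps = [GaussianProcessRegressor(normalize_y=True).fit(X_coarse, coeffs[:, k])
       for k in range(3)]

def upskill(coarse):
    """Map a coarse profile to a fine-resolution prediction."""
    c = np.array([gp.predict(coarse[None, :])[0] for gp in gps])
    return pca.inverse_transform(c[None, :])[0]

a_test = 0.7
pred = upskill(fine_field(a_test)[::10])
err = float(np.max(np.abs(pred - fine_field(a_test))))
```

Compressing the fine fields first keeps the number of Gaussian processes small, which is the design choice that makes this family of surrogates cheap.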


Subjects
Floods, Water, Computer Simulation, Algorithms, Spatial Analysis
7.
J Contam Hydrol ; 261: 104288, 2024 02.
Article in English | MEDLINE | ID: mdl-38176294

ABSTRACT

Petroleum pollution in soil and groundwater has emerged as a significant environmental concern worldwide. As a sustainable and cost-effective in-situ remediation technique, Monitored Natural Attenuation (MNA) exhibits significant promise in addressing sites contaminated by petrochemicals. This study specifically targets a typical petrochemical-contaminated site in northern China and employs GMS software to establish a comprehensive physical model. The model relies on time-series monitoring data of phenol concentrations spanning from 2018 to 2020, effectively simulating both the leakage and natural attenuation of phenol. Within this study, the adsorption coefficient and maximum adsorption capacity emerge as the foremost influential factors shaping the outcomes of the model. Given the inherent heterogeneity of the site and the variability of hydrochemical conditions, parameters such as dispersion, porosity, and adsorption coefficient exhibit significant uncertainties. Consequently, relying on traditional deterministic models to predict the feasibility of MNA technology is not reliable. Therefore, this study employs machine learning (ML) methods to construct stochastic parameter models based on physical processes. The Random Forest Regression (RFR) algorithm, once trained, demonstrates strong alignment with numerical model output, exhibiting an average Nash-Sutcliffe Efficiency (NSE) >0.96. Using a stochastic approach, RFR iteratively computes phenol concentration across 6000 sets of parameters. Applying probability statistics, the model shows a notable reduction in the likelihood of phenol concentrations exceeding a threshold, dropping from 64.0% before natural attenuation to 15.7% after. Under parameter uncertainty, the stochastic model demonstrates natural attenuation's efficacy in mitigating phenol pollution risk (porosity being the most influential factor). 
This case study proposes a novel method to quickly assess pollution risks at petrochemical sites under uncertainty in pollutant transport and reaction parameters. The results can provide a reference for pollution risk assessment at petrochemical sites, especially sites with high stratigraphic heterogeneity or insufficient transport parameter data.
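The RFR-based stochastic workflow above can be sketched end to end: train a random forest to emulate the transport model, score it with the Nash-Sutcliffe Efficiency, then run it over many random parameter sets to estimate an exceedance probability. The "transport model", parameter ranges, and threshold are synthetic stand-ins.

```python
# Hedged sketch: RF surrogate of a transport model + stochastic exceedance.
# The stand-in model and its parameters are illustrative, not the site model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

def transport_model(theta):
    """Stand-in model: (dispersivity, porosity, Kd) -> phenol concentration."""
    return 10 * np.exp(-2 * theta[:, 2]) * theta[:, 1] + 0.5 * theta[:, 0]

theta = rng.uniform(0, 1, size=(2000, 3))
conc = transport_model(theta)
X_tr, X_te, y_tr, y_te = train_test_split(theta, conc, random_state=0)
rfr = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Nash-Sutcliffe Efficiency on held-out runs of the "numerical model".
pred = rfr.predict(X_te)
nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - np.mean(y_te)) ** 2)

# Stochastic step: 6000 random parameter sets, probability of exceedance.
theta_mc = rng.uniform(0, 1, size=(6000, 3))
p_exceed = float(np.mean(rfr.predict(theta_mc) > 5.0))
```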


Subjects
Environmental Monitoring, Groundwater, Environmental Pollution/analysis, Phenol/analysis, Risk Assessment
8.
Eur J Pharm Biopharm ; 194: 159-169, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38110160

ABSTRACT

The identification of process Design Space (DS) is of high interest in highly regulated industrial sectors, such as pharmaceutical industry, where assurance of manufacturability and product quality is key for process development and decision-making. If the process can be controlled by a set of manipulated variables, the DS can be expanded in comparison to an open-loop scenario, where there are no controls in place. Determining the benefits of control strategies may be challenging, particularly when the available model is complex and computationally expensive - which is typically the case of pharmaceutical manufacturing. In this study, we exploit surrogate-based feasibility analysis to determine whether the process satisfies all process constraints by manipulating the process inputs and to reduce the effect of uncertainty. The proposed approach is successfully tested on two simulated pharmaceutical case studies of increasing complexity, i.e., considering (i) a single pharmaceutical unit operation, and (ii) a pharmaceutical manufacturing line comprised of a sequence of connected unit operations. Results demonstrate that different control actions can be effectively exploited to operate the process in a wider range of inputs and mitigate uncertainty.
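Surrogate-based feasibility analysis, as used above, can be sketched with a Gaussian-process surrogate of a constraint function g(x) ≤ 0: the surrogate classifies a dense grid of process inputs as feasible or infeasible far more cheaply than running the full process model everywhere. The constraint function below is a synthetic stand-in.

```python
# Hedged sketch: GP surrogate of a constraint -> design-space estimate.
# The constraint is an illustrative stand-in for an expensive process model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(5)

def constraint(X):
    """Stand-in critical-quality constraint: feasible (g <= 0) inside a disc."""
    return (X[:, 0] - 0.5) ** 2 + (X[:, 1] - 0.5) ** 2 - 0.1

X_train = rng.uniform(0, 1, size=(60, 2))       # 60 "expensive" evaluations
gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, constraint(X_train))

# Dense grid: predicted feasible region approximates the design space.
g1, g2 = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([g1.ravel(), g2.ravel()])
feasible = gp.predict(grid) <= 0.0
area = float(np.mean(feasible))                 # feasible fraction of input space
```

Adding a manipulated (control) variable would enlarge the predicted feasible set, which is the closed-loop DS expansion the study quantifies.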


Subjects
Pharmaceutical Industry, Pharmaceutical Technology, Pharmaceutical Technology/methods, Uncertainty, Quality Control, Pharmaceutical Industry/methods, Pharmaceutical Preparations
9.
Chemosphere ; 345: 140476, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37866497

ABSTRACT

The growing number of contaminated sites across the world pose a considerable threat to the environment and human health. Remediating such sites is a cumbersome process with the complexity originating from the need for extensive sampling and testing during site characterization. Selection and design of remediation technology is further complicated by the uncertainties surrounding contaminant attributes, concentration, as well as soil and groundwater properties, which influence the remediation efficiency. Additionally, challenges emerge in identifying contamination sources and monitoring the affected area. Often, these problems are overly simplified, and the data gathered is underutilized, rendering the remediation process inefficient. The potential of artificial intelligence (AI), machine-learning (ML), and deep-learning (DL) to address these issues is noteworthy, as their emergence revolutionized the process of data management/analysis. Researchers across the world are increasingly leveraging AI/ML/DL to address remediation challenges. The current study aims to perform a comprehensive literature review on the integration of AI/ML/DL tools into contaminated site remediation. A brief introduction to various emerging and existing AI/ML/DL technologies is presented, followed by a comprehensive literature review. In essence, ML/DL based predictive models can facilitate a thorough understanding of contamination patterns, reducing the need for extensive soil and groundwater sampling. Additionally, AI/ML/DL algorithms can play a pivotal role in identifying optimal remediation strategies by analyzing historical data, simulating scenarios through surrogate models, parameter-optimization using nature inspired algorithms, and enhancing decision-making with AI-based tools. Overall, with supportive measures like open-data policies and data integration, AI/ML/DL possess the potential to revolutionize the practice of contaminated site remediation.


Subjects
Artificial Intelligence, Deep Learning, Humans, Algorithms, Machine Learning, Soil
10.
Materials (Basel) ; 16(13)2023 Jun 28.
Article in English | MEDLINE | ID: mdl-37444985

ABSTRACT

Reinforced concrete (RC) is the result of a combination of steel reinforcing rods (which have high tensile strength) and concrete (which has high compressive strength). Additionally, the prediction of long-term deformations of RC flexural structures and the magnitude of the influence of the relevant material and geometric parameters are important for evaluating their serviceability and safety throughout their life cycles. Empirical methods for predicting the long-term deformation of RC structures are limited due to the difficulty of considering all the influencing factors. In this study, four popular surrogate models, i.e., polynomial chaos expansion (PCE), support vector regression (SVR), Kriging, and radial basis function (RBF), are used to predict the long-term deformation of RC structures. The surrogate models were developed and evaluated using RC simply supported beam examples, and experimental datasets were collected for comparison with common machine learning models (back propagation neural network (BP), multilayer perceptron (MLP), decision tree (DT) and linear regression (LR)). The models were tested using the statistical metrics R2, RAAE, RMAE, RMSE, VAF, PI, A10-index and U95. The results show that all four proposed models can effectively predict the deformation of RC structures, with PCE and SVR having the best accuracy, followed by the Kriging model and RBF. Moreover, in terms of RMSE, the prediction error of the surrogate models is much lower than that of the empirical method and the machine learning models. Furthermore, a global sensitivity analysis of the material and geometric parameters affecting structural deflection using PCE is proposed. It was found that the geometric parameters are more influential than the material parameters. Additionally, there is a coupling effect between material and geometric parameters that works together to influence the long-term deflection of RC structures.
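Three of the four surrogate families compared above (SVR, Kriging as a Gaussian process, and an RBF interpolant) can be benchmarked side by side on a synthetic stand-in for long-term deflection data; PCE is omitted here for brevity, and the inputs and response are illustrative assumptions.

```python
# Hedged sketch: compare SVR, Kriging (GP), and RBF surrogates by R^2.
# Inputs (e.g. span, section depth, creep coefficient) and the response
# surface are synthetic stand-ins, not the paper's beam examples.
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(6)
X = rng.uniform(0, 1, size=(400, 3))
y = 3 * X[:, 0] ** 2 + np.sin(4 * X[:, 1]) + 0.5 * X[:, 2]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
scores["SVR"] = r2_score(y_te, SVR(C=100).fit(X_tr, y_tr).predict(X_te))
scores["Kriging"] = r2_score(
    y_te, GaussianProcessRegressor(normalize_y=True).fit(X_tr, y_tr).predict(X_te))
scores["RBF"] = r2_score(
    y_te, RBFInterpolator(X_tr, y_tr, smoothing=1e-8)(X_te))
best = max(scores, key=scores.get)
```

On a smooth response like this all three fit well; which family wins in practice depends on the data, which is why the paper reports several metrics rather than R² alone.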

11.
J Cheminform ; 14(1): 70, 2022 Oct 17.
Article in English | MEDLINE | ID: mdl-36253845

ABSTRACT

Graph Convolutional Neural Network (GCNN) is a popular class of deep learning (DL) models in material science to predict material properties from the graph representation of molecular structures. Training an accurate and comprehensive GCNN surrogate for molecular design requires large-scale graph datasets and is usually a time-consuming process. Recent advances in GPUs and distributed computing open a path to reduce the computational cost for GCNN training effectively. However, efficient utilization of high performance computing (HPC) resources for training requires simultaneously optimizing large-scale data management and scalable stochastic batched optimization techniques. In this work, we focus on building GCNN models on HPC systems to predict material properties of millions of molecules. We use HydraGNN, our in-house library for large-scale GCNN training, leveraging distributed data parallelism in PyTorch. We use ADIOS, a high-performance data management framework for efficient storage and reading of large molecular graph data. We perform parallel training on two open-source large-scale graph datasets to build a GCNN predictor for an important quantum property known as the HOMO-LUMO gap. We measure the scalability, accuracy, and convergence of our approach on two DOE supercomputers: the Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) and the Perlmutter system at the National Energy Research Scientific Computing Center (NERSC). We present our experimental results with HydraGNN showing (i) reduction of data loading time up to 4.2 times compared with a conventional method and (ii) linear scaling performance for training up to 1024 GPUs on both Summit and Perlmutter.

12.
J Appl Stat ; 49(2): 485-497, 2022.
Article in English | MEDLINE | ID: mdl-35707207

ABSTRACT

In this paper, we present a surrogate model that estimates extreme tower loads on a wind turbine from a number of signals and a suitable simulation tool. Due to the requirements of the International Electrotechnical Commission (IEC) Standard 61400-1, assessing extreme tower loads on wind turbines constitutes a key component of the design phase. The proposed model imputes tower loads by matching observed signals with simulated quantities using proximities induced by random forests. In this way, the algorithm's adaptability to high-dimensional and sparse settings is exploited without using regression-based surrogate loads (which may display misleading probabilistic characteristics). Finally, the model is applied to estimate tower loads on an operating wind turbine from data on its operational statistics.
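Random-forest proximities, as used above, can be sketched directly: two cases are proximate when they often land in the same leaf across the forest's trees, and the load of the most proximate simulated case is imputed rather than a regression prediction. The signals, loads, and toy data model below are illustrative.

```python
# Hedged sketch: proximity-based load imputation via shared forest leaves.
# Signals, loads, and their relationship are illustrative stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
# Simulated database: operational signals -> extreme tower load.
signals = rng.uniform(0, 1, size=(500, 4))   # e.g. wind speed, turbulence, pitch, power
loads = 3 * signals[:, 0] + signals[:, 1] ** 2 + 0.05 * rng.standard_normal(500)

forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(signals, loads)
leaf_ids = forest.apply(signals)              # (500, 300): leaf index per tree

def impute_load(obs):
    """Impute a load for an observed signal vector via forest proximity."""
    obs_leaves = forest.apply(obs[None, :])[0]
    proximity = np.mean(leaf_ids == obs_leaves, axis=1)   # shared-leaf fraction
    return loads[np.argmax(proximity)]        # copy the closest simulated load

obs = np.array([0.8, 0.3, 0.5, 0.2])
imputed = impute_load(obs)
```

Because the imputed value is an actual simulated load rather than a regression output, its distributional properties are inherited from the simulations, which is the point the abstract makes against regression-based surrogate loads.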

13.
J Environ Radioact ; 247: 106849, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35294912

ABSTRACT

Predicting source or background radionuclide emissions is limited by the effort needed to run gas/aerosol atmospheric transport models (ATMs). A high-performance surrogate model is developed for the HYSPLIT4 (NOAA) ATM to accelerate transport simulation through model reduction, code optimization, and improved scaling on high performance computing systems. The surrogate model is parameterized by a grid of short-duration transport simulations stored offline. The surrogate model then predicts the path of a plume of radionuclide particles emitted from a source, or the field of sources which may have contributed to a detected signal, more efficiently than direct simulation by HYSPLIT4. Termed the Atmospheric Transport Model Surrogate (ATaMS), this suite of capabilities forms a basis to accelerate workflows for probabilistic source prediction and estimation of the radionuclide atmospheric background.


Subjects
Radiation Monitoring, Radioisotopes/isolation & purification, Aerosols, Computer Simulation, Retrospective Studies
14.
Int J Numer Method Biomed Eng ; 38(4): e3575, 2022 04.
Article in English | MEDLINE | ID: mdl-35094499

ABSTRACT

This work introduces a computational methodology to calibrate material models in biomechanical applications under uncertainty. We adopt a Bayesian approach, which estimates the probability distributions of hyperelastic material parameters, based on force-strain measurements. We approximate the parametric biomechanical model by combining a reduced order representation of the force response with a Polynomial Chaos expansion. The surrogate model allows us to employ sampling-intensive Markov chain Monte Carlo methods and provides an efficient way to estimate (generalized) Sobol coefficients. We use a Sobol sensitivity analysis to assess the influence of material parameters and present an iterative procedure to quantify the accuracy of the surrogate model as additional uncertainty during Bayesian updating. The methodology is illustrated with three cases, tensile experiments on heat-induced whey protein gel, indentation experiments for oocytes and a manufactured example. Real experimental data are used for the calibration.
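The calibration loop above can be sketched at its simplest: a cheap surrogate replaces the expensive finite-element model inside a random-walk Metropolis sampler, yielding a posterior for a material parameter from noisy force-strain data. The one-parameter "force response", prior, and noise level are illustrative stand-ins for the paper's ROM-plus-PCE surrogate.

```python
# Hedged sketch: surrogate-in-the-loop Bayesian calibration via Metropolis.
# The forward model, prior bounds, and noise level are illustrative.
import numpy as np

rng = np.random.default_rng(8)
strains = np.linspace(0.0, 0.3, 15)

def forward_model(mu):
    """Stand-in force response (cheap surrogate of an expensive FE model)."""
    return mu * strains + 5 * mu * strains ** 2

true_mu, sigma = 2.0, 0.02
data = forward_model(true_mu) + sigma * rng.standard_normal(strains.size)

def log_post(mu):
    if not 0.0 < mu < 10.0:                    # uniform prior on (0, 10)
        return -np.inf
    r = data - forward_model(mu)
    return -0.5 * np.sum(r ** 2) / sigma ** 2  # Gaussian likelihood

# Random-walk Metropolis; the surrogate is cheap enough for long chains.
chain, mu = [], 1.0
lp = log_post(mu)
for _ in range(20000):
    prop = mu + 0.05 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain.append(mu)
post = np.array(chain[5000:])                  # discard burn-in
```

The posterior concentrates near the true parameter; in the paper's procedure, the surrogate's own approximation error would additionally be folded into the likelihood as extra uncertainty.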


Subjects
Calibration, Bayes Theorem, Markov Chains, Monte Carlo Method, Uncertainty
15.
Sci Total Environ ; 820: 153349, 2022 May 10.
Article in English | MEDLINE | ID: mdl-35077794

ABSTRACT

Although combined ozonation with activated carbon (AC) adsorption is a promising technique for leachate treatment, little is known about how ozone-induced changes in leachate characteristics affect the organics adsorption, especially in view of emerging micropollutants (MPs) removal. Furthermore, the online monitoring of MPs is challenging but desirable for efficient treatment operation. This study investigates how preceding ozonation impacts the adsorption of bulk organics (expressed as chemical oxygen demand (COD)) and ozone-recalcitrant MPs, i.e., primidone, atrazine and alachlor, in leachate using batch and column adsorption tests. Additionally, a new surrogate-based model was evaluated for predicting MPs breakthrough. Batch tests revealed that ozonation results in a decreasing apparent affinity of COD towards AC, but the non-adsorbable part did not obviously change. The adsorption of MPs in ozonated leachate was 1-41% higher than that in non-ozonated leachate, especially for the more hydrophobic alachlor and atrazine, due to reduced site competition from bulk organics. Column adsorption showed that ozonation delayed COD and MPs breakthrough due to the reduced COD loading and site competition, respectively. An increased empty bed contact time (EBCT, 10-40 min) led to an increased COD uptake by a factor of 3.0-3.2 for ozonated and non-ozonated leachates, while MPs adsorption also increased, suggesting that pore blockage rather than site competition could be the dominant inhibitory effect. The data from column adsorption demonstrate the applicability of the developed surrogate-based model for predicting MPs breakthrough. Particularly, the fitting parameters were not affected by changes in leachate characteristics, while they were impacted by changes in EBCT.


Subjects
Ozone, Chemical Water Pollutants, Adsorption, Charcoal, Follow-Up Studies, Ozone/chemistry, Chemical Water Pollutants/analysis
16.
Microorganisms ; 9(12)2021 Dec 02.
Article in English | MEDLINE | ID: mdl-34946102

ABSTRACT

Leishmaniasis is a vector-borne parasitic disease caused by Leishmania species. The disease affects humans and animals, particularly dogs, provoking cutaneous, mucocutaneous, or visceral processes depending on the Leishmania sp. and the host immune response. No vaccine for humans is available, and the control relies mainly on chemotherapy. However, currently used drugs are old, some are toxic, and the safer presentations are largely unaffordable by the most severely affected human populations. Moreover, chemotherapy's efficacy has shortcomings, and it has been challenged by growing reports of resistance and therapeutic failure. This manuscript presents an overview of the currently used drugs, the prevailing model to develop new antileishmanial drugs and its low efficiency, and the impact of deconstruction of the drug pipeline on the high failure rate of potential drugs. To improve the predictive value of preclinical research in the chemotherapy of leishmaniasis, several proposals are presented to circumvent critical hurdles: namely, lack of common goals of collaborative research, particularly in public-private partnership; fragmented efforts; use of inadequate surrogate models, especially for in vivo trials; and shortcomings of target product profile (TPP) guides.

17.
Front Physiol ; 12: 662314, 2021.
Article in English | MEDLINE | ID: mdl-34113262

ABSTRACT

Purpose: Bayesian calibration is generally superior to standard direct-search algorithms in that it estimates the full joint posterior distribution of the calibrated parameters. However, there are many barriers to using Bayesian calibration in health decision sciences stemming from the need to program complex models in probabilistic programming languages and the associated computational burden of applying Bayesian calibration. In this paper, we propose to use artificial neural networks (ANN) as one practical solution to these challenges. Methods: Bayesian Calibration using Artificial Neural Networks (BayCANN) involves (1) training an ANN metamodel on a sample of model inputs and outputs, and (2) calibrating the trained ANN metamodel instead of the full model in a probabilistic programming language to obtain the posterior joint distribution of the calibrated parameters. We illustrate BayCANN using a colorectal cancer natural history model. We conduct a confirmatory simulation analysis by first obtaining parameter estimates from the literature and then using them to generate adenoma prevalence and cancer incidence targets. We compare the performance of BayCANN in recovering these "true" parameter values against performing a Bayesian calibration directly on the simulation model using an incremental mixture importance sampling (IMIS) algorithm. Results: We were able to apply BayCANN using only a dataset of the model inputs and outputs and minor modification of BayCANN's code. In this example, BayCANN was slightly more accurate in recovering the true posterior parameter estimates compared to IMIS. Obtaining the dataset of samples and running BayCANN took 15 min, compared to 80 min for IMIS. In applications involving computationally more expensive simulations (e.g., microsimulations), BayCANN may offer higher relative speed gains. Conclusions: BayCANN only uses a dataset of model inputs and outputs to obtain the calibrated joint parameter distributions. 
Thus, it can be adapted to models of various levels of complexity with minor or no change to its structure. In addition, BayCANN's efficiency can be especially useful in computationally expensive models. To facilitate BayCANN's wider adoption, we provide BayCANN's open-source implementation in R and Stan.
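The two BayCANN steps described above (train an ANN metamodel, then calibrate the metamodel against targets) can be sketched on a toy problem. A simple grid posterior stands in for the Stan calibration, and the one-parameter "natural history model" is an illustrative assumption.

```python
# Hedged sketch of the BayCANN idea: (1) ANN metamodel of a simulation,
# (2) calibrate the metamodel against an observed target. Toy stand-ins only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(9)

def simulation_model(theta):
    """Stand-in for the expensive simulation (e.g. adenoma prevalence)."""
    return 0.1 + 0.8 / (1 + np.exp(-5 * (theta - 0.5)))

# Step 1: metamodel trained on a sample of model inputs and outputs.
theta_train = rng.uniform(0, 1, size=(300, 1))
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                   random_state=0).fit(theta_train,
                                       simulation_model(theta_train[:, 0]))

# Step 2: calibrate the metamodel against a target generated at theta = 0.6
# (grid posterior here; BayCANN itself uses Stan for this step).
target, noise_sd = simulation_model(np.array([0.6]))[0], 0.01
grid = np.linspace(0, 1, 400)[:, None]
loglik = -0.5 * ((ann.predict(grid) - target) / noise_sd) ** 2
post = np.exp(loglik - loglik.max())
post /= post.sum()
theta_map = float(grid[np.argmax(post), 0])
```

Because every calibration step touches only the trained metamodel, the expensive simulation is run just once to build the training sample, which is where the reported speedup comes from.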

18.
Int J Numer Method Biomed Eng ; 37(6): e3453, 2021 06.
Article in English | MEDLINE | ID: mdl-33751821

ABSTRACT

The main objective of this study was to solve a multi-objective optimization on a representative coronary stent platform with the goal of finding new geometric designs with improved biomechanical performance. The following set of metrics, calculated via finite element models, was used to quantify stent performance: vessel injury, radial recoil, bending resistance, longitudinal resistance, radial strength and prolapse index. The multi-objective optimization problem was solved with the aid of surrogate-based algorithms; for comparison and validation purposes, four surrogate-based multi-objective optimization algorithms (EIhv-EGO, Phv-EGO, ParEGO and SMS-EGO) with a limited sample budget were employed and their results compared. The quality of the non-dominated solution sets outputted by each algorithm was assessed against four quality indicators: hypervolume, R2, epsilon and generational distance. Results showed that Phv-EGO was the algorithm that exhibited the best performance in overall terms. Afterwards, the highest quality Pareto front was chosen for an in-depth analysis of the optimization results. The amount of correlation and conflict was quantified for each pair of objective functions. Next, through cluster analysis, we were able to identify families of solutions with similar performance behavior and to discuss the nature of the existing trade-offs between objectives, and the trends between design parameters and solutions in a biomechanical perspective. In the end, a constrained-based design selection was performed with the goal of finding solutions in the Pareto front with equal or better performance in all objectives against a baseline design.
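One of the quality indicators named above, the hypervolume, can be computed exactly in two dimensions by sweeping the sorted front. The example front is illustrative, not a stent-optimization result.

```python
# Hedged sketch: 2-D hypervolume indicator for a minimization front.
# The front and reference point below are illustrative stand-ins.
import numpy as np

def hypervolume_2d(front, ref):
    """Area dominated by `front` and bounded above by `ref` (minimization)."""
    pts = front[np.argsort(front[:, 0])]          # sort by first objective
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)      # slab between successive points
        prev_f2 = f2
    return hv

front = np.array([[0.2, 0.8], [0.5, 0.4], [0.9, 0.1]])
hv = hypervolume_2d(front, ref=np.array([1.0, 1.0]))   # 0.16 + 0.20 + 0.03 = 0.39
```

A larger hypervolume means the front dominates more of objective space, which is why indicator-driven methods like SMS-EGO maximize it directly.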


Subjects
Algorithms, Coronary Vessels, Stents
19.
Hist Philos Life Sci; 43(1): 27, 2021 Feb 23.
Article in English | MEDLINE | ID: mdl-33620596

ABSTRACT

Patient-derived xenografts (PDXs) are currently promoted as new translational models in precision oncology. PDXs are immunodeficient mice bearing human tumors that are used as surrogate models to represent specific types of cancer. By accounting for the genetic heterogeneity of cancer tumors, PDXs are hoped to provide more clinically relevant results in preclinical research. Further, functioning as so-called "mouse avatars", PDXs are hoped to allow for patient-specific drug testing in real time (in parallel to treatment of the corresponding cancer patient). This paper examines the circulation of knowledge and bodily material across the species boundary between human patient and personalized mouse model, historically as well as in contemporary practices. PDXs raise interesting questions about the relation between animal model and human patient, and about the capacity of hybrid or interspecies models to close existing translational gaps. We highlight that the translational potential of PDXs depends not only on representational matching of model and target, but also on temporal alignment between model development and practical uses. Aside from the importance of ensuring temporal stability of human tumors in a murine body, the mouse avatar concept rests on the possibility of aligning the temporal horizons of the clinic and the lab. We examine strategies to address temporal challenges, including cryopreservation and biobanking, as well as attempts to speed up translation through modification and use of faster-developing organisms. We discuss how the model virtues emphasized change with precision oncology, and contend that temporality is a model feature that deserves more philosophical attention.


Subjects
Disease Models, Animal; Heterografts/statistics & numerical data; Medical Oncology/methods; Precision Medicine/methods; Translational Research, Biomedical/methods; Transplantation, Heterologous/statistics & numerical data; Animals; Biological Specimen Banks; Cryopreservation; Humans; Mice; Philosophy
20.
Environ Monit Assess; 192(10): 628, 2020 Sep 09.
Article in English | MEDLINE | ID: mdl-32902735

ABSTRACT

To provide a more precise understanding of water quality changes, continuous sampling is increasingly used in surface water quality monitoring networks. However, it remains unclear how much improvement continuous monitoring provides over spot sampling in identifying water quality changes over time. This study aims (1) to assess our ability to detect trends using water quality data of both high and low frequencies and (2) to assess the value of using high-frequency data as a surrogate to help detect trends in other constituents. Statistical regression models were used to identify temporal trends and then to assess the trend detection power of high-frequency (15 min) and low-frequency (monthly) data for turbidity and electrical conductivity (EC) data collected across Victoria, Australia. In addition, we developed surrogate models to simulate five sediment and nutrient constituents from runoff, turbidity and EC. A simulation-based statistical approach was then used to compare the trend detection power of the low- and high-frequency water quality records. Results show that high-frequency sampling offers clear benefits in trend detection power for turbidity and EC, as well as for simulated sediment and nutrients, especially over short data periods. For detecting a 1% annual trend with 5 years of data, high-frequency data improve the trend detection probability by up to 97% for turbidity and 94% for EC compared with monthly data. Our results highlight the benefits of upgrading monitoring networks with wider application of high-frequency sampling.
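The simulation-based power comparison described in this abstract can be sketched in miniature: simulate a record with a known linear trend plus white noise, test the OLS slope for significance, and estimate detection power by repetition. The trend size, noise level, significance threshold (|t| > 1.96) and the use of daily rather than 15-minute data are illustrative assumptions, not the study's actual settings.

```python
import numpy as np

rng = np.random.default_rng(2)

def trend_detected(n_per_year, years=5, slope=0.01, sigma=0.05):
    """Simulate one record with a linear trend plus white noise and test
    whether an OLS fit finds the slope significant (|t| > 1.96)."""
    t = np.linspace(0.0, years, int(n_per_year * years))
    y = slope * t + rng.normal(0.0, sigma, t.size)
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (t.size - 2)             # residual variance
    se_slope = np.sqrt(s2 / np.sum((t - t.mean()) ** 2))
    return abs(beta[1] / se_slope) > 1.96

def power(n_per_year, reps=200):
    """Fraction of simulated records in which the trend is detected."""
    return sum(trend_detected(n_per_year) for _ in range(reps)) / reps

power_monthly = power(12)    # monthly spot sampling
power_daily = power(365)     # daily, a stand-in for high-frequency sampling
print(power_monthly, power_daily)
```

Because the standard error of the fitted slope shrinks with the number of observations, the denser record detects the same trend far more reliably over a short monitoring period, which is the mechanism behind the study's reported gains.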


Subjects
Water Pollutants/analysis, Water Quality, Environmental Monitoring, Victoria, Water