ABSTRACT
Fluorinated organic compounds (FOCs) are a class of synthetic chemicals distinguished by resilient carbon-fluorine bonds that withstand environmental degradation over extended periods. The integration of FOCs into cutting-edge applications, including lithium-ion batteries (LiBs), carries considerable potential for environmental harm that has not yet been sufficiently addressed. This study focuses on the environmental fate of two fluorinated aromatics, tris(pentafluorophenyl)borane (TPFPB) and tris(pentafluorophenyl)phosphine (TPFPP), given their important role in improving the performance of LiBs. To this end, laboratory simulation methods including the total oxidizable precursor assay, electrochemistry (EC), the Fenton reaction, UV-C irradiation, and hydrolysis were employed. Liquid chromatography and gas chromatography coupled with high-resolution mass spectrometry were used to identify transformation products (TPs) and predict their molecular formulae. Despite the structural similarity between TPFPB and TPFPP, distinct differences in electrochemical behavior and degradation pathways were observed. TPFPB readily underwent hydroxylation and hydrolysis, yielding 49 TPs. Of these, 28 TPs were newly identified, including oligomers and highly toxic dioxins. In contrast, TPFPP degraded only under harsh conditions, requiring the development of innovative conditioning protocols for EC. In total, the simulation experiments yielded nine structurally different compounds, including seven previously undescribed, partially defluorinated TPs. This study highlights the potential risks associated with the use of FOCs in LiBs and provides insight into their complex environmental behavior.
ABSTRACT
We describe a new method for presenting and interpreting linear trends in health inequalities, and present a proof-of-concept analysis of inequalities in smoking among adolescents in Europe. We estimated the regression line of the assumed linear relationship between smoking prevalence in low- and high-socioeconomic status (SES) youth over time. Using simulation, we constructed a 95% confidence interval (CI) for the smoking prevalence in low-SES youth at the point where prevalence in high-SES youth reaches 0%, and we calculated the likelihood of eradicating smoking inequality (prevalence <5% for both low- and high-SES youth). This method was applied to data on adolescents aged 15-16 years (n = 250,326) from 23 European countries, derived from the 2003-2015 waves of the European School Survey Project on Alcohol and Other Drugs. Smoking prevalence decreased more slowly among low- than among high-SES adolescents. The estimated smoking prevalence was 9.4% (95% CI: 6.1, 12.7) for boys and 5.4% (95% CI: 1.4, 9.2) for girls with low SES at the point where it reaches 0% for those with high SES. The likelihood of eradicating smoking inequality was <1% for boys and 37% for girls. We conclude that this novel methodological approach to trends in health inequalities is feasible in practice. Applying it to trends in smoking inequalities among adolescents in Europe, we found that Europe is currently not on track to eradicate youth smoking across SES groups.
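The regression-plus-simulation machinery described above can be sketched in a few lines of R. The wave-level prevalence figures below are invented for illustration and are not the ESPAD estimates; parametric draws of the fitted coefficients stand in for the article's simulation step.

```r
## Sketch of the trend-extrapolation method, assuming illustrative data
set.seed(1)
prev <- data.frame(
  high = c(28, 25, 22, 18, 15),  # high-SES smoking prevalence (%) per wave
  low  = c(34, 32, 30, 27, 25)   # low-SES smoking prevalence (%) per wave
)

fit <- lm(low ~ high, data = prev)   # assumed linear low-vs-high relationship

## Simulate coefficient uncertainty with draws from the fitted model
library(MASS)
draws <- mvrnorm(10000, mu = coef(fit), Sigma = vcov(fit))

## Low-SES prevalence when high-SES prevalence reaches 0% (the intercept)
quantile(draws[, 1], c(0.025, 0.975))   # simulated 95% CI

## "Eradicating inequality": low-SES prevalence < 5% when high-SES is at 5%
mean(draws[, 1] + draws[, 2] * 5 < 5)
```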
Subjects
Smoking, Social Class, Female, Male, Humans, Adolescent, Socioeconomic Factors, Smoking/epidemiology, Europe/epidemiology, Tobacco Smoking/epidemiology
ABSTRACT
The paper contributes to the debate on natural interest rates and potential growth rates. We build a model-based projection for the world's most significant economies and areas to improve understanding of how these rates change over the long run and of the factors behind their decline. We use a general equilibrium overlapping generations model to understand the simultaneous roles of demographics, technology, and globalization. The novelty of the model lies in the way it constructs a human capital index based on UN population projections and an estimated increasing-returns production function for major economies worldwide. We find that the decline in interest rates is well explained by labor market dynamics and the increasing obsolescence of capital goods. We also find that the reduced share of labor income has pushed in the opposite direction, toward an increase in natural interest rates, which runs counter to the empirical evidence. Moreover, the dynamics of economic integration predict an endogenous adjustment of global imbalances over the long run, with an increasing weight of the Chinese economy and, consequently, a phase of weakness in United States growth between 2030 and 2040. The model is also used to perform shock scenario analysis. We find that demographic decline can adversely affect growth dynamics for European countries, while a change in the dynamics of globalization can have serious consequences, especially for the United States, with significant benefits for European countries and China.
ABSTRACT
The Future Elderly Model (FEM) is a microsimulation model designed to forecast health status, longevity, and a variety of economic outcomes. Compared to traditional actuarial models, microsimulation models provide greater opportunities for policy forecasting and richer detail, but they typically build upon smaller samples of data, which may limit forecasting accuracy. We perform validation analyses of the FEM's mortality and quality-of-life forecasts using a version of the FEM estimated exclusively on early waves of data from the Health and Retirement Study. First, we compare FEM mortality and longevity projections to the actual mortality and longevity experience observed over the same period. We also compare the FEM results to actuarial forecasts of mortality and longevity over that period. We find that FEM projections are generally in line with observed mortality rates and closely match longevity. We then assess the FEM's performance at predicting quality of life and longitudinal outcomes, two features missing from traditional actuarial models. Our analysis suggests the FEM performs at least as well as actuarial forecasts of mortality while providing policy simulation features that are not available in actuarial models.
Subjects
Longevity, Quality of Life, Aged, Forecasting, Health Status, Humans, Retirement
ABSTRACT
Accurate future projections of population health are imperative for planning for the future healthcare needs of a rapidly aging population. Multistate-transition microsimulation models, such as the U.S. Future Elderly Model, address this need but require high-quality panel data for calibration. We develop an alternative method that relaxes this data requirement, using repeated cross-sectional representative surveys to estimate multistate-transition contingency tables, applied to Japan's population. We calculate the birth cohort- and sex-specific prevalence of comorbidities using five waves of governmental health surveys. Combining the estimated comorbidity prevalence with death record information, we determine the transition probabilities between health statuses. We then construct a virtual Japanese population aged 60 and older as of 2013 and perform a microsimulation to project disease distributions to 2046. Our estimates replicate governmental projections of population pyramids and match the actual prevalence trends of comorbidities and the disease incidence rates reported in epidemiological studies over the past decade. Our future projections of cardiovascular diseases indicate lower prevalence than expected from static models, reflecting recent declining trends in disease incidence and fatality.
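The identification step, recovering transition probabilities from successive cross-sectional prevalences and death records, can be illustrated with a deliberately simplified two-state example in R. The figures and the timing convention (transition first, then state-specific survival) are assumptions made for this sketch, not the paper's calibrated quantities.

```r
## Two-state sketch: solve for the incidence probability consistent with
## the prevalence change observed between two survey waves
p_t  <- 0.20   # disease prevalence in the cohort at wave t
p_t1 <- 0.24   # prevalence among survivors at wave t + 1
s_h  <- 0.98   # inter-wave survival if healthy (from death records)
s_d  <- 0.90   # inter-wave survival if diseased (from death records)

## Prevalence among survivors implied by an incidence probability i,
## assuming no recovery
implied_prev <- function(i) {
  diseased <- (p_t + (1 - p_t) * i) * s_d
  healthy  <- (1 - p_t) * (1 - i) * s_h
  diseased / (diseased + healthy)
}

## Incidence probability that reproduces the observed prevalence change
uniroot(function(i) implied_prev(i) - p_t1, c(0, 1))$root
```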
Subjects
Birth Cohort, Functional Status, Aged, Cross-Sectional Studies, Female, Forecasting, Humans, Japan/epidemiology, Male, Middle Aged
ABSTRACT
Antithrombin (AT) is a critical regulator of the coagulation cascade, inhibiting multiple coagulation factors including thrombin and FXa. Binding of heparinoids to this serpin enhances the inhibition considerably. Mutations located in the heparin binding site of AT result in thrombophilia in affected individuals. Our aim was to study 10 antithrombin mutations known to affect heparin binding, in a heparin pentasaccharide-bound state, using two molecular dynamics (MD)-based methods providing enhanced sampling, GaMD and LiGaMD2. The latter provides an additional boost to the ligand and to the most important binding site residues. From our GaMD simulations we were able to identify four variants (three affecting amino acid Arg47 and one affecting Lys114) that have a particularly large effect on binding. The additional acceleration provided by LiGaMD2 allowed us to study the consequences of several other mutants, including those affecting Arg13 and Arg129. We were able to identify several conformational types by cluster analysis. Analysis of the simulation trajectories revealed the causes of the impaired pentasaccharide binding, including conformational changes of pentasaccharide subunits and altered allosteric pathways in the AT protein. Our results provide atomic-level insight into the effects of AT mutations that interfere with heparin binding and can facilitate the design and interpretation of in vitro experiments.
Subjects
Antithrombins, Heparin, Molecular Dynamics Simulation, Mutation, Heparin/metabolism, Heparin/chemistry, Binding Sites, Humans, Antithrombins/chemistry, Antithrombins/metabolism, Protein Binding, Oligosaccharides/chemistry, Oligosaccharides/metabolism
ABSTRACT
This paper is concerned with the growth-driven shape-programming problem, which involves determining a growth tensor that can produce a deformation on a hyperelastic body reaching a given target shape. We consider the two cases of globally compatible growth, where the growth tensor is a deformation gradient over the undeformed domain, and incompatible growth, which discards this hypothesis. We formulate the problem within the framework of optimal control theory in hyperelasticity. The Hausdorff distance is used to quantify dissimilarities between shapes, and the complexity of the actuation is incorporated in the cost functional as well. Boundary conditions and external loads are allowed in the state law, thus extending previous works where the stress-free hypothesis turns out to be essential. A rigorous mathematical analysis is then carried out to prove the well-posedness of the problem. The numerical approximation is performed using gradient-based optimisation algorithms. Our main goal in this part is to show that inverse techniques can be applied to the numerical approximation of this problem, which allows us to address more generic situations than those covered by analytical approaches. Several numerical experiments for beam-like and shell-type geometries illustrate the performance of the proposed numerical scheme.
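Schematically, and in notation introduced here rather than taken from the paper, the optimal control problem has the following structure:

```latex
\min_{G}\; J(G) \;=\; d_{H}\!\big(\varphi_{G}(\Omega),\,\Omega_{\mathrm{target}}\big)
\;+\; \alpha\,\mathcal{C}(G)
\qquad \text{s.t.} \qquad
\varphi_{G} \in \arg\min_{\varphi}\;
\int_{\Omega} W\!\big(\nabla\varphi\, G^{-1}\big)\,\mathrm{d}X \;-\; \ell(\varphi),
```

where the growth tensor G is the control, φ_G the equilibrium deformation under the multiplicative decomposition F = F_e G, W the hyperelastic stored-energy density, d_H the Hausdorff distance to the target shape, C(G) a penalty for actuation complexity, and ℓ the work of the admissible external loads and boundary terms.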
ABSTRACT
HIGHLIGHTS: The net value of reducing decision uncertainty by collecting additional data is quantified by the expected net benefit of sampling (ENBS). This tutorial presents a general-purpose algorithm for computing the ENBS for collecting survival data along with a step-by-step implementation in R. The algorithm is based on recently published methods for simulating survival data and computing expected value of sample information that do not rely on the survival data to follow any particular parametric distribution and that can take into account any arbitrary censoring process. We demonstrate in a case study based on a previous cancer technology appraisal that ENBS calculations are useful not only for designing new studies but also for optimizing reimbursement decisions for new health technologies based on immature evidence from ongoing trials.
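The final ENBS step itself is a simple trade-off once EVSI has been computed; a hedged sketch with placeholder numbers (not the tutorial's algorithm, which also covers the survival-data simulation):

```r
## Schematic ENBS calculation over candidate study sizes; all values are
## placeholders standing in for per-patient EVSI estimates and study costs
n          <- c(100, 200, 400, 800)   # candidate sample sizes
evsi_pp    <- c(120, 180, 220, 245)   # per-patient EVSI (monetary units)
population <- 20000                   # patients affected by the decision
fixed_cost <- 1e6                     # fixed cost of running the study
unit_cost  <- 2000                    # cost per enrolled patient

enbs <- population * evsi_pp - (fixed_cost + unit_cost * n)
data.frame(n, enbs)
n[which.max(enbs)]                    # design maximizing expected net benefit
```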
Subjects
Algorithms, Humans, Survival Analysis, Uncertainty, Neoplasms/mortality, Biomedical Technology Assessment/methods
ABSTRACT
BACKGROUND: Expected value of sample information (EVSI) quantifies the expected value to a decision maker of reducing uncertainty by collecting additional data. EVSI calculations require simulating plausible data sets, typically achieved by evaluating quantile functions at random uniform numbers using standard inverse transform sampling (ITS). This is straightforward when closed-form expressions for the quantile function are available, such as for standard parametric survival models, but these are often unavailable when assuming treatment effect waning and for flexible survival models. In these circumstances, the standard ITS method could be implemented by numerically evaluating the quantile functions at each iteration in a probabilistic analysis, but this greatly increases the computational burden. Thus, our study aims to develop general-purpose methods that standardize and reduce the computational burden of the EVSI data-simulation step for survival data. METHODS: We developed a discrete sampling method and an interpolated ITS method for simulating survival data from a probabilistic sample of survival probabilities over discrete time units. We compared the general-purpose and standard ITS methods using an illustrative partitioned survival model with and without adjustment for treatment effect waning. RESULTS: The discrete sampling and interpolated ITS methods agree closely with the standard ITS method, with the added benefit of a greatly reduced computational cost in the scenario with adjustment for treatment effect waning. CONCLUSIONS: We present general-purpose methods for simulating survival data from a probabilistic sample of survival probabilities that greatly reduce the computational burden of the EVSI data-simulation step when we assume treatment effect waning or use flexible survival models. The implementation of our data-simulation methods is identical across all possible survival models and can easily be automated from standard probabilistic decision analyses. HIGHLIGHTS: Expected value of sample information (EVSI) quantifies the expected value to a decision maker of reducing uncertainty through a given data collection exercise, such as a randomized clinical trial. In this article, we address the problem of computing EVSI when we assume treatment effect waning or use flexible survival models, by developing general-purpose methods that standardize and reduce the computational burden of the EVSI data-generation step for survival data. We developed 2 methods for simulating survival data from a probabilistic sample of survival probabilities over discrete time units, a discrete sampling method and an interpolated inverse transform sampling method, which can be combined with a recently proposed nonparametric EVSI method to accurately estimate EVSI for collecting survival data. Our general-purpose data-simulation methods greatly reduce the computational burden of the EVSI data-simulation step when we assume treatment effect waning or use flexible survival models. The implementation of our data-simulation methods is identical across all possible survival models and can therefore easily be automated from standard probabilistic decision analyses.
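Both data-simulation methods can be sketched compactly in R. The exponential curve below stands in for a single probabilistic draw of survival probabilities over discrete time units and is purely illustrative; the paper's methods apply to any such draw.

```r
## One probabilistic draw of survival probabilities over discrete times
times <- 0:60                     # e.g., months
S     <- exp(-0.04 * times)       # illustrative survival curve, S(0) = 1

## (1) Discrete sampling: draw the event interval from its probability mass
p_event <- -diff(S)               # P(event in (t - 1, t])
p_tail  <- S[length(S)]           # still event-free at the horizon
t_disc  <- sample(c(times[-1], Inf), 1000, replace = TRUE,
                  prob = c(p_event, p_tail))

## (2) Interpolated inverse transform sampling: invert S(t) at uniform draws
u     <- runif(1000)
t_its <- approx(x = S, y = times, xout = u, yleft = Inf)$y

## Both samples approximate the same event-time distribution
mean(t_disc[is.finite(t_disc)]); mean(t_its[is.finite(t_its)])
```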
Subjects
Probability, Humans, Uncertainty, Computer Simulation, Data Collection, Cost-Benefit Analysis
ABSTRACT
As global warming intensifies, there will be increased uncertainty as to the environmental behavior of, and risks from, heavy metals in industrial/legacy contaminated sites in permafrost regions. Bioavailability has been increasingly used for human health risk assessment of heavy metals in contaminated soils. Soil heavy metal bioavailability depends on soil physicochemical properties, and freeze-thaw affects soil physical, chemical, and biological processes. However, it is not clear whether freeze-thaw has an effect on the bioavailability of soil heavy metals. In this study, soils contaminated with Pb and As were collected from 10 industrial sites in northeast China. Extractability and bioavailability of soil Pb and As were determined by the Tessier sequential extraction method and four in vitro gastrointestinal simulation methods under control and freeze-thaw treatments. The aims were to compare the extraction and bioavailability results from laboratory experiments that artificially simulate freeze-thaw conditions against control soils, and to explore the correlation between Pb/As bioavailability and soil properties. Freeze-thaw significantly decreased soil pH and increased the soil specific surface area. Relative to control soils, freeze-thaw decreased the percentage of Pb and As in the residual fraction and increased the percentages in the exchangeable, carbonate-bound, Fe-Mn oxides-bound, and organic-bound fractions. Freeze-thaw significantly increased Pb and As bioavailability compared to the controls. Pb and As released in the gastric phase of the four methods was significantly higher than in the intestinal phase. Further analysis of correlations between Pb and As bioavailability and soil properties indicated that the total concentrations of Al, Fe, and Mn, particle size, and specific surface area correlated significantly with Pb and As bioavailability. Overall, this study demonstrated that freeze-thaw does influence the bioavailability of soil heavy metals, suggesting that freeze-thaw action should be comprehensively considered in human health risk assessment of soil pollutants in permafrost regions.
Subjects
Heavy Metals, Soil Pollutants, Humans, Lead/analysis, Biological Availability, Environmental Pollution/analysis, Heavy Metals/analysis, Soil/chemistry, Soil Pollutants/analysis
ABSTRACT
The translational challenge in biomedical research lies in the effective and efficient transfer of mechanistic knowledge from one biological context to another. Implicit in this process is the establishment of causality from correlation in the form of mechanistic hypotheses. Effectively addressing the translational challenge requires automated methods, including the ability to computationally capture the dynamic aspect of putative hypotheses such that they can be evaluated in a high-throughput fashion. Ontologies provide structure and organization to biomedical knowledge; converting these representations into executable models and simulations is the next necessary step. Researchers need the ability to map their conceptual models into a model specification that can be transformed into an executable simulation program. We suggest that this mapping process, which approximates certain steps in the development of a computational model, can be expressed as a set of logical rules, and that a semi-intelligent computational agent, the Computational Modeling Assistant (CMA), can perform reasoning over these rules to develop a plan for constructing an executable model. Presented herein is a description and implementation of a model construction reasoning process between biomedical and simulation ontologies, performed by the CMA to produce the specification of an executable model that can be used for dynamic knowledge representation.
ABSTRACT
Simulation-based and permutation-based inferential methods are commonplace in phylogenetic comparative methods, especially as evolutionary data have become more complex and parametric methods more limited for their analysis. Both approaches simulate many random outcomes from a null model to empirically generate sampling distributions of statistics. Although simulation-based and permutation-based methods seem commensurate in purpose, results from analysis of variance (ANOVA) based on the distributions of random F-statistics produced by these methods can be quite different in practice. Differences could stem either from the null-model process that generates variation across many simulations or random permutations of the data, or from different estimation methods for linear model coefficients and statistics. Unfortunately, because the null-model process and coefficient estimation are intrinsically linked in phylogenetic ANOVA methods, the precise reason for methodological differences has not been fully considered. Here we show that the null-model processes of phylogenetic simulation and randomizing residuals in a permutation procedure are indeed commensurate, and that both also produce results consistent with parametric ANOVA in cases where parametric ANOVA is possible. We also provide results that caution against using ordinary least-squares estimation along with phylogenetic simulation, a typical phylogenetic ANOVA implementation.
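The commensurability of the two null-model processes can be seen even in a deliberately simplified, non-phylogenetic setting, an assumption made here for brevity; the article's analysis involves a phylogenetic covariance structure.

```r
## Simulation-based vs. permutation-based null F distributions, ordinary ANOVA
set.seed(1)
g     <- gl(2, 20)                # two groups of 20
y     <- rnorm(40)                # trait with no true group effect
obs_F <- anova(lm(y ~ g))[1, "F value"]

## (1) Simulation: regenerate data from the null model many times
F_sim <- replicate(999, anova(lm(rnorm(40) ~ g))[1, "F value"])

## (2) Permutation: randomize residuals of the null (intercept-only) model
res    <- resid(lm(y ~ 1))
F_perm <- replicate(999, anova(lm(sample(res) ~ g))[1, "F value"])

## The two empirical p-values estimate the same quantity
mean(c(obs_F, F_sim) >= obs_F)
mean(c(obs_F, F_perm) >= obs_F)
```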
Subjects
Phylogeny, Analysis of Variance, Computer Simulation, Least-Squares Analysis, Linear Models
ABSTRACT
A number of papers by Young and collaborators have criticized epidemiological studies and meta-analyses of air pollution hazards using a graphical method that the authors call a P value plot, claiming to find zero effects, heterogeneity, and P hacking. However, the P value plot method has not been validated in a peer-reviewed publication. The aim of this study was to investigate the statistical and evidentiary properties of this method. Methods: A simulation was developed to create studies and meta-analyses with known real effects δ, integrating two quantifiable conceptions of evidence from the philosophy of science literature. The simulation and analysis are publicly available and automatically reproduced. Results: In this simulation, the plot did not provide evidence for heterogeneity or P hacking under any condition. Under the right conditions, the plot can provide evidence of zero effects, but these conditions are not satisfied in any actual use by Young and collaborators. Conclusion: The P value plot does not provide evidence to support the skeptical claims about air pollution hazards made by Young and collaborators.
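For readers unfamiliar with the method under scrutiny, the basic construction can be reproduced in a few lines of R; the study design and effect size δ below are illustrative assumptions, not the simulation conditions of this study or of Young and collaborators.

```r
## A "P value plot": ranked p-values from a simulated meta-analysis
set.seed(1)
delta <- 0.2                      # known real effect used in the simulation
pvals <- replicate(40, t.test(rnorm(50, mean = delta), rnorm(50))$p.value)

## Under delta = 0 the ranked p-values fall near a straight reference line;
## a real effect bends the curve toward small p-values
plot(sort(pvals), xlab = "Rank", ylab = "P value")
abline(0, 1 / length(pvals), lty = 2)   # zero-effect reference line
```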
ABSTRACT
PURPOSE: This study investigated the ability of two chewing simulation devices to emulate in vitro the clinical deterioration observed in anterior composite restorations in severe tooth-wear patients. MATERIALS AND METHODS: Advanced tooth wear was simulated in bovine incisors, which were restored with palatal and buccal direct composite veneer restorations. The incisal edges of the restorations were subjected to 960K cycles of either compressive loading (Biocycle-V2; 125 N at 2 Hz) or wear and mechanical loading (Rub and Roll; 30 N at 20 rpm). Surface degradation was rated using FDI scores to compare the chewing devices (Fisher's test, α = 0.05). Topography and deterioration of the restorations were analyzed using SEM. The ability to emulate the deterioration was investigated by comparing the surface degradation observed in vitro with the clinical degradation observed in restorations placed in severe tooth-wear patients after 3.5 years. RESULTS: Distinct degradation patterns were observed between the simulation devices: Biocycle-V2 generated deterioration that was not comparable to the clinical situation, including contact damage, minor wear, and localized roughening. The degradation caused by Rub and Roll was more similar to the in vivo situation, including wear facets, chipping, delamination, staining, and marginal ditching. The FDI scores differed between the chewing devices for surface/marginal staining, material/retention, and marginal adaptation (p = 0.003). SEM analysis showed microcracking at the interface between composite layers at the incisal edges. CONCLUSIONS: The Rub and Roll chewing device was able to emulate the clinical deterioration observed in anterior restorations in severe tooth-wear patients and thus may be used as an oral-cavity simulation method, contributing to translational research.
Subjects
Clinical Deterioration, Tooth Wear, Animals, Cattle, Composite Resins, Permanent Dental Restoration/methods, Humans, Mastication, Tooth Wear/therapy
ABSTRACT
The expected value of sample information (EVSI) can be used to prioritize avenues for future research and design studies that support medical decision making and offer value for money spent. EVSI is calculated based on 3 key elements. Two of these, a probabilistic model-based economic evaluation and updating model uncertainty based on simulated data, have been frequently discussed in the literature. By contrast, the third element, simulating data from the proposed studies, has received little attention. This tutorial contributes to bridging this gap by providing a step-by-step guide to simulating study data for EVSI calculations. We discuss a general-purpose algorithm for simulating data and demonstrate its use to simulate 3 different outcome types. We then discuss how to induce correlations in the generated data, how to adjust for common issues in study implementation such as missingness and censoring, and how individual patient data from previous studies can be leveraged to undertake EVSI calculations. For all examples, we provide comprehensive code written in the R language and, where possible, Excel spreadsheets in the supplementary materials. This tutorial facilitates practical EVSI calculations and allows EVSI to be used to prioritize research and design studies.
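At its core, the data-simulation step is two draws: parameters from their current distribution, then data given parameters. A minimal sketch for a binary outcome, with an assumed beta distribution standing in for the probabilistic analysis sample:

```r
## EVSI data-simulation step for a binary outcome (illustrative numbers)
set.seed(1)
n_psa <- 5000                     # size of the probabilistic analysis sample
n_new <- 200                      # sample size of the proposed study

## Step 1: draw the uncertain parameter from its current distribution
p_resp <- rbeta(n_psa, 12, 28)    # e.g., posterior from 12/40 responders

## Step 2: simulate one plausible future data set per parameter draw
x_new <- rbinom(n_psa, size = n_new, prob = p_resp)

## The pairs (p_resp, x_new) feed the EVSI estimation step, e.g., by
## regressing net benefit on summaries of the simulated data
head(data.frame(p_resp, x_new))
```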
Subjects
Algorithms, Statistical Models, Cost-Benefit Analysis, Humans, Uncertainty
ABSTRACT
Colloidal particles in nematic liquid crystals show a beautiful variety of complex phenomena with promising applications. Their dynamical behaviour is determined by topology and by interactions with the liquid crystal and external fields. Here, a magnetic nanocapsule in a nematic liquid crystal, reoriented periodically by time-varying magnetic fields, is studied using numerical simulations. The approach combines Molecular Dynamics to resolve solute-solvent interactions and Nematic Multiparticle Collision Dynamics to incorporate nematohydrodynamic fields and fluctuations. A Saturn ring defect resulting from homeotropic anchoring conditions surrounds the capsule and rotates together with it. Magnetically induced rotations of the capsule can produce transformations of this topological defect, which changes from a disclination curve to a defect structure extending over the surface of the capsule. Transformations occur for large magnetic fields. At moderate fields, elastic torques prevent changes of the topological defect by tilting the capsule out of the rotation plane of the magnetic field.
ABSTRACT
A comprehensive model for the theoretical simulation of luminescent solar concentrators (LSCs) has been developed and examined. It can simulate the interdependent effects of multiple dopants having two main electronic energy states, incorporated simultaneously into the fiber core, as well as the effect of the cladding. The available experimental results appear to confirm the accuracy of the model, which is a valuable tool for gaining insight into the behavior of LSC prototypes, since it may guide designers at the early stages of optimization processes.
ABSTRACT
At present, sustainable and stable partial nitrification has not been widely achieved in the mainstream PN/ANAMMOX process. Here, the feasibility of sustainable and stable partial nitrification was demonstrated in an automatic recycling PN/ANAMMOX reactor under mainstream conditions using both simulation and experimental methods. Stable nitrite accumulation in the aerobic zone could be achieved by regulating dissolved oxygen (DO) concentrations and sludge retention time (SRT). The DO concentrations required for the repression of nitrite-oxidizing bacteria (NOB) were lower at longer SRTs. The DO concentrations and SRTs required for NOB repression were lower at lower temperatures. However, NOB repression was diminished by persistently low DO and short SRT under mainstream conditions. With the introduction of automatic recycling, sustainable and stable partial nitrification was achieved. Effluent recycling could limit the nitrite-nitrogen required for NOB growth. Collectively, effluent recycling may serve as a feasible and useful strategy for NOB inhibition during the PN/ANAMMOX process.
Subjects
Bioreactors, Nitrites, Bacteria, Nitrification, Nitrogen, Oxidation-Reduction, Sewage, Wastewater
ABSTRACT
BACKGROUND: Implementation researchers have sought ways to use simulations to support the core components of implementation, which typically include assessing the need for change, designing implementation strategies, executing the strategies, and evaluating outcomes. The goal of this paper is to explain how agent-based modeling could fulfill this role. METHODS: We describe agent-based modeling with respect to other simulation methods that have been used in implementation science, using non-technical language that is broadly accessible. We then provide a stepwise procedure for developing agent-based models of implementation processes. As a case study to illustrate the procedure, we use the implementation of evidence-based smoking cessation practices for persons with serious mental illness (SMI) in community mental health clinics. RESULTS: For our case study, we present descriptions of the motivating research questions, the specific models used to answer these questions, and a summary of the insights that can be obtained from the models. In the first example, we use a simple form of agent-based modeling to simulate the observed smoking behaviors of persons with SMI in a recently completed trial (IDEAL, Comprehensive Cardiovascular Risk Reduction Trial in Persons with SMI). In the second example, we illustrate how a more complex agent-based approach that includes interactions between patients, providers, and site administrators can be used to provide guidance for an implementation intervention that includes training and organizational strategies. This example is based in part on an ongoing project focused on scaling up evidence-based tobacco smoking cessation practices in community mental health clinics in Maryland. CONCLUSION: In this paper we explain how agent-based models can be used to address implementation science research questions, and we provide a procedure for setting up simulation models. Through our examples, we show how what-if scenarios can be examined in the implementation process; such scenarios are particularly useful in implementation frameworks with adaptive components.
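A toy version of the first example might look as follows in R; the population size, adoption share, and monthly probabilities are invented for illustration and are not the IDEAL trial model.

```r
## Minimal agent-based sketch: smoking status under a training strategy
set.seed(1)
n_agents <- 500
smoking  <- rep(TRUE, n_agents)        # all agents smoke at baseline
trained  <- runif(n_agents) < 0.6      # agents at clinics that adopted training

for (month in 1:24) {                  # monthly time steps
  p_quit    <- ifelse(trained, 0.020, 0.008)   # training raises quit chance
  p_relapse <- 0.010
  quits     <- smoking  & (runif(n_agents) < p_quit)
  relapses  <- !smoking & (runif(n_agents) < p_relapse)
  smoking[quits]    <- FALSE
  smoking[relapses] <- TRUE
}

mean(smoking)   # smoking prevalence after the what-if scenario
```

Varying the adoption share or the transition probabilities then answers what-if questions of the kind described above.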
ABSTRACT
BACKGROUND: The current study utilizes system dynamics to model the determinants of youth smoking and simulate the effects of anti-smoking policies in the context of North Dakota, a state with one of the lowest cigarette tax rates in the USA. METHODS: An explanatory model was built to replicate historical trends in the youth smoking rate. Three different policies were simulated: 1) an increase in cigarette excise taxes; 2) increased funding for CDC-recommended comprehensive tobacco control programs; and 3) enforcement of increased retailer compliance with age restrictions on cigarette sales. RESULTS: The explanatory model successfully replicated historical trends in adolescent smoking behavior in North Dakota from 1992 to 2014. The policy model showed that increasing taxes to $2.20 per pack starting in 2015 was the most effective of the three policies, producing a 32.6% reduction in the youth smoking rate by 2032. The other policies reduced smoking to a much lesser degree (7.0% and 3.2% for comprehensive tobacco control program funding and retailer compliance, respectively). The effects of the policies were additive. CONCLUSIONS: System dynamics modeling suggests that increasing cigarette excise taxes is particularly effective at reducing adolescent smoking rates. More generally, system dynamics offers an important complement to conventional analysis of observational data.
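A stock-and-flow caricature of such a model, with invented rates rather than the calibrated North Dakota parameters, shows how a tax "switch" propagates through the smoking-rate stock:

```r
## System dynamics sketch: youth smoking rate with a 2015 tax policy switch
years   <- 1992:2032
rate    <- numeric(length(years))
rate[1] <- 0.30                                     # initial smoking-rate stock

for (k in 2:length(years)) {
  tax_effect <- if (years[k] >= 2015) 0.65 else 1   # tax reduces initiation
  inflow  <- 0.045 * tax_effect * (1 - rate[k - 1]) # initiation flow
  outflow <- 0.12 * rate[k - 1]                     # cessation/aging-out flow
  rate[k] <- rate[k - 1] + inflow - outflow
}

plot(years, rate, type = "l", ylab = "Youth smoking rate")
```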