Results 1 - 20 of 73
1.
Front Genet ; 15: 1401470, 2024.
Article in English | MEDLINE | ID: mdl-39050246

ABSTRACT

As genomic selection emerges as a promising breeding method for both plants and animals, numerous methods have been introduced and applied to various real and simulated data sets. Research suggests that no single method is universally better than the others; rather, performance is highly dependent on the characteristics of the data and the nature of the prediction task. This implies that each method has its strengths and weaknesses. In this study, we exploit this notion and propose a different approach. Rather than comparing multiple methods to determine the best one for a particular study, we advocate combining multiple methods to achieve better performance than any method in isolation. In pursuit of this goal, we introduce and develop a computational method based on stacked generalization, an ensemble technique in which a meta-model merges predictions from multiple base models to achieve improved performance. We applied this method to plant and animal data and compared its performance with currently available methods using standard performance metrics. We found that the proposed method yielded a lower or comparable mean squared error in predicting phenotypes compared to the current methods. In addition, the proposed method showed greater resistance to overfitting than the current methods. Further analysis included statistical hypothesis testing, which showed that the proposed method outperformed or matched the current methods. In summary, the proposed stacked generalization integrates currently available methods to achieve stable and better performance. In this context, our study provides general recommendations for effective practices in genomic selection.
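The stacking scheme described above can be sketched with scikit-learn's `StackingRegressor`; the base learners, meta-model, and synthetic data below are illustrative placeholders, not the models or datasets used in the study.

```python
# Sketch of stacked generalization for phenotype prediction.
# All models and data here are illustrative stand-ins.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for a marker-by-individual genotype matrix.
X, y = make_regression(n_samples=300, n_features=50, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base models produce out-of-fold predictions; a meta-model merges them.
stack = StackingRegressor(
    estimators=[("ridge", Ridge()),
                ("rf", RandomForestRegressor(n_estimators=50, random_state=0)),
                ("svr", SVR())],
    final_estimator=Ridge(),  # the meta-model
)
stack.fit(X_tr, y_tr)
mse = mean_squared_error(y_te, stack.predict(X_te))
```

The meta-model sees only the base models' cross-validated predictions, which is what gives stacking its resistance to overfitting relative to any single base learner.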

2.
Front Chem ; 12: 1398984, 2024.
Article in English | MEDLINE | ID: mdl-38894728

ABSTRACT

The component analysis of raw meal is critical to the quality of cement. In recent years, near-infrared (NIR) spectroscopy has emerged as an innovative and efficient analytical method to determine the oxide content of cement raw meal. This study aims to utilize NIR spectroscopy combined with machine learning (ML) and chemometrics to improve the prediction of oxide content in cement raw meal. The Savitzky-Golay convolution smoothing method is applied to eliminate noise interference in the analysis of calcium carbonate (CaCO3), silicon dioxide (SiO2), aluminum oxide (Al2O3), and ferric oxide (Fe2O3) in cement raw materials. Several wavelength selection techniques are applied and their performance compared in a comprehensive analysis of the model. A back-propagation neural network regression model based on the particle swarm optimization algorithm is also applied to optimize the extracted and screened feature wavelengths, and the model's prediction performance is evaluated using Rp and RMSE. In conclusion, the results indicate that NIR spectroscopy in combination with ML and chemometrics has great potential to effectively improve the prediction of oxide content in raw materials, and they highlight the importance of modeling and wavelength selection techniques. By enabling more accurate and efficient determination of oxide content in raw materials, NIR spectroscopy coupled with meta-modeling has the potential to revolutionize quality assurance practices in cement manufacturing.
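The Savitzky-Golay smoothing step can be sketched as follows; the synthetic spectrum, window length, and polynomial order are illustrative assumptions, not the study's settings.

```python
# Minimal sketch of Savitzky-Golay smoothing of an NIR-like spectrum.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
wavelengths = np.linspace(1000, 2500, 600)          # nm, synthetic axis
clean = np.exp(-((wavelengths - 1700) / 150) ** 2)  # one absorption band
noisy = clean + rng.normal(scale=0.05, size=clean.size)

# Convolution smoothing: fit a local polynomial in a sliding window.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

# Smoothing should pull the signal back toward the clean band.
err_noisy = np.mean((noisy - clean) ** 2)
err_smooth = np.mean((smoothed - clean) ** 2)
```

Unlike a plain moving average, the local polynomial fit preserves peak heights and widths, which matters when band shapes carry the chemical information.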

3.
Article in English | MEDLINE | ID: mdl-38856785

ABSTRACT

This paper deals with Emergency Department (ED) fast-tracks for low-acuity patients, a strategy often adopted to reduce ED overcrowding. We focus on optimizing resource allocation in minor injuries units, which are the ED units that can treat low-acuity patients, with the aim of minimizing patient waiting times and ED operating costs. We formulate this problem as a general multiobjective simulation-based optimization problem where some of the objectives are expensive black-box functions that can only be evaluated through a time-consuming simulation. To efficiently solve this problem, we propose a metamodeling approach that uses an artificial neural network to replace a black-box objective function with a suitable model. This approach allows us to obtain a set of Pareto optimal points for the multiobjective problem we consider, from which decision-makers can select the most appropriate solutions for different situations. We present the results of computational experiments conducted on a real case study involving the ED of a large hospital in Italy. The results show the reliability and effectiveness of our proposed approach, compared to the standard approach based on derivative-free optimization.

4.
Bioengineering (Basel) ; 11(5)2024 May 15.
Article in English | MEDLINE | ID: mdl-38790358

ABSTRACT

Cardiopulmonary resuscitation (CPR) is a life-saving technique used in emergencies when the heart stops beating, typically involving chest compressions and ventilation. Current adult CPR guidelines do not differentiate based on age beyond infancy and childhood. This oversight increases the risk of fatigue fractures in the elderly due to decreased bone density and changes in thoracic structure. Therefore, this study aimed to investigate the correlation and impact of factors influencing rib fatigue fractures for safer out-of-hospital manual cardiopulmonary resuscitation (OHMCPR) application. Using the finite element analysis (FEA) method, we performed fatigue analysis on rib cage models incorporating chest compression conditions and age-specific trabecular bone properties. Fatigue life analyses were conducted on three age-specific rib cage models, each differentiated by trabecular bone properties, to determine the influence of four explanatory variables on the fatigue life of the model: the trabecular bone properties (a surrogate for the subject's age), the site of application of the compression force on the breastbone, the magnitude of the applied compression force, and the rate of application of the compression force. Additionally, considering the complex interaction of chest compression conditions during actual CPR, we aimed to predict rib fatigue fractures under conditions simulating real-life scenarios by analyzing the sensitivity and interrelation of chest compression conditions on the model's fatigue life. Time constraints led to the selection of optimal analysis conditions through design of experiments (DOE), specifically orthogonal array testing, followed by the construction of a deep learning-based metamodel. The predicted fatigue life values of the rib cage model, obtained from the metamodel, showed the influence of the four explanatory variables on fatigue life.
These results may be used to devise safer CPR guidelines, particularly for the elderly at a high risk of acute cardiac arrest, safeguarding against potential complications like fatigue fractures.

5.
Bioengineering (Basel) ; 11(3)2024 Feb 23.
Article in English | MEDLINE | ID: mdl-38534486

ABSTRACT

Dynamic crossflow filtration (DCF) is the state-of-the-art technology for solid-liquid separation from viscous and sensitive feed streams in the food and biopharma industries. To date, the potential of industrial processes is often not fully exploited because fixed recipes are usually applied to run them. In order to take the varying properties of biological feed materials into account, we aim to develop a digital twin of an industrial brownfield DCF plant, allowing setpoint decisions to be optimized in almost real time. The core of the digital twin is a mechanistic-empirical process model combining fundamental filtration laws with process expert knowledge. The effect of variation in the selected process and model parameters on plant productivity was assessed using a model-based design-of-experiments approach, and a regression metamodel was trained with the data. A cyclic program that communicates bidirectionally with the DCF asset serves as the frame of the digital twin. It monitors the process dynamics (membrane torque and transmembrane pressure) and feeds the optimum permeate flow rate setpoint back to the physical asset in almost real time during process runs. We considered a total of 24 industrial production batches from the filtration of grape juice in 2022 and 2023. After implementation of the digital twin on site, the campaign mean productivity increased by 15% over the course of 2023. The presented digital twin framework is a simple example of how an established industrial process can be controlled by a hybrid model-based algorithm. With a digital process dynamics model at hand, the presented metamodel optimization approach can easily be transferred to other (bio)chemical processes.

6.
Integr Psychol Behav Sci ; 58(1): 149-159, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37256480

ABSTRACT

Scientific modeling is a syllogistic system of definitive premises, sound inference, and consistent explanation used to understand, define, quantify, visualize, or simulate features of a target. A single-model is defined as an informative representation identifying a property of a target object or phenomenon, while a meta-model integrates the relevant single-models to explain phenomenological realities. Human recognition-behavioral adaptation is an information-metabolism system that maintains the homeostasis of the human self, and it has been investigated from neurological, psychiatric, and psychological perspectives. I analyzed the human recognition-behavioral adaptation system via scientific modeling. A neurological meta-model of the human recognition-behavioral adaptation system was synthesized as a complex network of functional neuronal modules, and this meta-model was integrated with a Mentality-model in the psychiatric aspect and a Personality-model in the psychological aspect. The integrated meta-models successfully explained phenomenological realities in each aspect. From the above, I conclude that the meta-model of the human recognition-behavioral adaptation system has developed into a biopsychosocial model integrating biological, psychological, and socio-environmental factors.


Subjects
Models, Neurological; Humans
7.
Sensors (Basel) ; 23(21)2023 Oct 26.
Article in English | MEDLINE | ID: mdl-37960429

ABSTRACT

The rapid growth of the Internet of Things (IoT) and its integration into various industries has made it extremely challenging to guarantee IoT systems' dependability and quality, including scalability, dynamicity, and integration with existing IoT frameworks. However, the essential principles, approaches, and advantages of model-driven IoT testing indicate a promising strategy for overcoming these challenges. This paper proposes a metamodeling-based interoperability and integration testing approach for IoT systems that automates the creation of test cases and the assessment of system performance by utilizing formal models to reflect the behavior and interactions of IoT systems. The proposed model-based testing enables the systematic verification and validation of complex IoT systems by capturing the essential characteristics of IoT devices, networks, and interactions. This study describes the key elements of model-driven IoT testing, including the development of formal models, methods for generating test cases, and the execution and assessment of models. In addition, it examines various modeling formalisms and their use in IoT testing, including state-based, event-driven, and hybrid models. It also examines several methods for creating test cases to ensure thorough and effective testing, such as constraint-based strategies and model coverage requirements. Model-driven IoT testing improves defect detection, expands test coverage, decreases testing effort, and increases system reliability. It also offers an organized and automated method to confirm the efficiency and dependability of IoT systems.

8.
Article in English | MEDLINE | ID: mdl-37994534

ABSTRACT

Computational modelling was used to assess the capability of a deterministic and a probabilistic method to predict the incidence of AIS3+ injuries in passenger car occupants by comparing the predictions of the methods to the actual injuries observed in real-world crashes. The likelihood of sustaining an injury was first calculated using a computer model for a selected set of injury criteria in different impact conditions based on real-world crashes; AIS3+ injuries were then predicted using each method separately. Regardless of the method, the number of serious injuries was over-predicted. It was also noted that the injury criteria used suggested the occurrence of specific injuries that were not observed in the real world. Although both methods could be adapted to improve their predictions, the suitability of applying injury criteria commonly accepted for crash test dummies to injury assessment with human body models deserves further research.

9.
Toxicol Lett ; 389: 34-44, 2023 Nov 01.
Article in English | MEDLINE | ID: mdl-37890682

ABSTRACT

New Approach Methodologies (NAMs) have ushered in a new era in the field of toxicology, aiming to replace animal testing. However, despite these advancements, they are not exempt from the inherent complexities associated with the endpoint under study. In this review, we have identified three major groups of complexities: mechanistic, chemical space, and methodological. Mechanistic complexity arises from interconnected biological processes within a network that are challenging to model in a single step. Chemical space complexity arises when compounds in the training and test series are significantly dissimilar. The third group encompasses algorithmic and molecular descriptor limitations and typical class imbalance problems. To address these complexities, this work provides a guide to using combinations of predictive Quantitative Structure-Activity Relationship (QSAR) models, known as metamodels. Combining low-level models (LLMs) enables a more precise approach to the problem by focusing on different sub-mechanisms or sub-processes. For mechanistic complexity, multiple Molecular Initiating Events (MIEs) or levels of information are combined to form a mechanistic-based metamodel. Regarding the complexity arising from chemical space, two types of approaches to constructing a fragment-based chemical space metamodel were reviewed: those with and without structure sharing. Metamodels with structure sharing utilize unsupervised strategies to identify data patterns and build low-level models for each cluster, which are then combined. For situations without structure sharing, owing to pharmaceutical-industry intellectual property constraints, prediction-sharing and federated learning approaches are reviewed. Lastly, to tackle methodological complexity, various algorithms are combined to overcome their limitations, diverse descriptors are employed to enhance problem definition, and balanced dataset combinations are used to address class imbalance (methodological-based metamodels). Remarkably, metamodels consistently outperformed classical QSAR models across all cases, highlighting the importance of alternatives to classical QSAR models when faced with such complexities.


Subjects
Algorithms; Quantitative Structure-Activity Relationship; Animals
10.
Mar Pollut Bull ; 197: 115676, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37897965

ABSTRACT

This research presents a procedure for determining the origin of marine pollution through time-direct trajectory modeling combined with a Kriging metamodel and Monte Carlo random sampling. These methods were applied to a real case: the oil spill that affected the Brazilian coast in the second half of 2019 and early 2020. A total of 140 trajectories, each defined by the geographical coordinates of the origin and the spill date, were generated through Latin Hypercube Sampling and simulated using the PyGNOME model to construct the Kriging metamodel. The metamodel demonstrated cost-effectiveness by efficiently simulating numerous input data combinations, which were compared and optimized against available real data on the temporal and spatial distribution of the pollution.
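The sampling-plus-Kriging workflow can be sketched as below: Latin Hypercube samples of (longitude, latitude, spill day) would feed the trajectory simulator, and a Gaussian process (the usual Kriging formulation) learns the response. The `toy_simulator` stands in for PyGNOME; everything here is illustrative.

```python
# Sketch: space-filling design + Kriging surrogate of an expensive simulator.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def toy_simulator(x):
    # Placeholder for a trajectory-model mismatch score (not PyGNOME).
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 2]

# 140 candidate (lon, lat, date) triples in the unit cube, as in the study's
# design size; coordinates here are unscaled placeholders.
sampler = qmc.LatinHypercube(d=3, seed=0)
X = sampler.random(n=140)
y = toy_simulator(X)

# Gaussian process regression = Kriging; queries are now cheap.
krig = GaussianProcessRegressor().fit(X, y)
pred, std = krig.predict(X[:5], return_std=True)
```

Once fitted, the surrogate can be evaluated at thousands of candidate origins per second, which is what makes the Monte Carlo search over origins tractable.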


Subjects
Petroleum Pollution; Petroleum Pollution/analysis; Brazil; Environmental Pollution; Geography; Monte Carlo Method
11.
Sensors (Basel) ; 23(19)2023 Oct 07.
Article in English | MEDLINE | ID: mdl-37837126

ABSTRACT

The main aim of this paper is to explore new approaches to structural design and to solve the problem of lightweight structural design involving multiple variables and multiple objectives. An integrated optimization design methodology is proposed by combining intelligent optimization algorithms with generative design. Firstly, a meta-model is established to explore the relationship between the design variables, mass, strain energy, and inherent energy. Then, employing the Non-dominated Sorting Genetic Algorithm III (NSGA-III), the optimal frameworks of the structure are sought within the entire design space. Next, the structure is rebuilt based on the principle of cooperative equilibrium. Furthermore, the rebuilt structure is integrated into a generative design, enabling automatic iteration by controlling the initial parameter set. The mass and rigidity of the structure under different reconstructions are evaluated, resulting in solution generation for structural optimization. Finally, the optimal structure obtained is validated. Research outcomes indicate that the mass of structures generated through the comprehensive optimization method is reduced by 27%, and the inherent energy increases by 0.95 times. Moreover, the overall structural deformation is less than 0.003 mm, with a maximum stress of 3.2 MPa, significantly lower than the yield strength and meeting industrial usage standards. A qualitative study and analysis of the experimental results substantiate the superiority of the proposed methodology for optimized structural design.

12.
Sensors (Basel) ; 23(16)2023 Aug 15.
Article in English | MEDLINE | ID: mdl-37631719

ABSTRACT

In the structural design of serial robots, topology design and dimensional parameter design are independent, making it challenging to achieve synchronous optimization of the two. To address this issue, a topology-and-dimension-parameter integrated optimization method (TPOM) is proposed, which sets critical variables to connect the topology layout with the dimensional features. Firstly, the topology layout is extracted by an edge detection technique. Structural manufacturability reconstruction is conducted by programmatically measuring the dimensions of the layout. Additionally, for the reconstructed structural layout, critical variables are set using three-dimensional software (SOLIDWORKS 2021). The experiments primarily involve the critical variables, mass, and deformation. Then, the response surface methodology is selected to construct a stiffness-mass metamodel, and on this basis the structural deformation is analyzed. Lastly, the multi-objective genetic algorithm (MOGA) is employed to optimize the critical variables, and an optimized structure is established for validation. The results indicate that the proposed method (TPOM) reduces the mass of the structure by 15% while maintaining its stiffness. In addition, the deformation of the whole structure is less than 0.352 mm, which meets the requirements of industrial applications. Through quantitative analysis of the experimental results, the feasibility and superiority of the proposed method have been demonstrated.
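A response-surface metamodel of the kind described is typically a low-order polynomial fit to sampled data; the sketch below fits a quadratic surface to a synthetic three-variable response (data and function are placeholders, not the paper's stiffness-mass model).

```python
# Illustrative response-surface metamodel: quadratic polynomial regression
# over three "critical variables". All data here are synthetic.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(60, 3))  # sampled critical-variable settings
# Smooth synthetic response standing in for deformation measurements.
y = 2.0 + X[:, 0] - 0.5 * X[:, 1] ** 2 + X[:, 0] * X[:, 2]

# Degree-2 polynomial features + least squares = a response surface.
surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(X, y)
r2 = surface.score(X, y)  # quadratic surface recovers a quadratic response
```

The fitted surface is then cheap enough for a genetic algorithm such as MOGA to evaluate thousands of candidate variable settings during optimization.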

13.
J Exp Bot ; 74(21): 6722-6734, 2023 11 21.
Article in English | MEDLINE | ID: mdl-37632355

ABSTRACT

Functional-structural plant models are increasingly being used by plant scientists to address a wide variety of questions. However, the calibration of these complex models is often challenging, mainly because of their high computational cost, and, as a result, error propagation is usually ignored. Here we applied an automatic method to the calibration of WALTer: a functional-structural wheat model that simulates the plasticity of tillering in response to competition for light. We used a Bayesian calibration method to jointly estimate the values of five parameters and quantify their uncertainty by fitting the model outputs to tillering dynamics data. We made recourse to Gaussian process metamodels in order to alleviate the computational cost of WALTer. These metamodels are built from an adaptive design that consists of successive runs of WALTer chosen by an efficient global optimization algorithm specifically adapted to this particular calibration task. The method presented here performed well on both synthetic and experimental data. It is an efficient approach for the calibration of WALTer and should be of interest for the calibration of other functional-structural plant models.
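The metamodel-assisted calibration idea can be sketched as follows: a Gaussian process emulates an expensive simulator from a handful of design runs, and calibration then queries the cheap emulator instead. The `expensive_model` below is a toy function, not WALTer, and the single-parameter setup is an illustrative simplification of the five-parameter Bayesian calibration in the study.

```python
# Sketch: Gaussian process metamodel standing in for a costly simulator
# during calibration. Toy one-parameter example.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(theta):
    # Placeholder: simulator output summarized as a scalar (not WALTer).
    return np.sin(5 * theta) + theta ** 2

# A small design of costly runs trains the emulator.
theta_design = np.linspace(0, 1, 12).reshape(-1, 1)
y_design = expensive_model(theta_design).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(theta_design, y_design)

# Calibration via the cheap surrogate: pick theta whose emulated output
# best matches an "observed" value; sigma quantifies emulator uncertainty.
grid = np.linspace(0, 1, 1001).reshape(-1, 1)
mu, sigma = gp.predict(grid, return_std=True)
observed = expensive_model(np.array([[0.3]])).item()
theta_hat = grid[np.argmin((mu - observed) ** 2)].item()
```

The predictive standard deviation `sigma` is what lets the adaptive design in the study decide where the next expensive run is most informative.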


Subjects
Algorithms; Triticum; Triticum/physiology; Calibration; Bayes Theorem
14.
Diagnostics (Basel) ; 13(15)2023 Jul 31.
Article in English | MEDLINE | ID: mdl-37568902

ABSTRACT

Accurate prediction of heart failure can help prevent life-threatening situations. Several factors contribute to the risk of heart failure, including underlying heart diseases such as coronary artery disease or heart attack, diabetes, hypertension, obesity, certain medications, and lifestyle habits such as smoking and excessive alcohol intake. Machine learning approaches to predict and detect heart disease hold significant potential for clinical utility but face several challenges in their development and implementation. This research proposes a machine learning metamodel for predicting a patient's heart failure based on clinical test data. The proposed metamodel was developed based on Random Forest Classifier, Gaussian Naive Bayes, Decision Tree models, and k-Nearest Neighbor as the final estimator. The metamodel is trained and tested utilizing a combined dataset comprising five well-known heart datasets (Statlog Heart, Cleveland, Hungarian, Switzerland, and Long Beach), all sharing 11 standard features. The study shows that the proposed metamodel can predict heart failure more accurately than other machine learning models, with an accuracy of 87%.
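The described stack maps directly onto scikit-learn's `StackingClassifier`; the synthetic 11-feature data below stand in for the combined heart-disease dataset, so the accuracy obtained here is not comparable to the study's 87%.

```python
# Sketch of the described metamodel: Random Forest, Gaussian Naive Bayes,
# and Decision Tree base models with k-Nearest Neighbors as final estimator.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the combined dataset with 11 standard features.
X, y = make_classification(n_samples=400, n_features=11, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

meta = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
                ("gnb", GaussianNB()),
                ("dt", DecisionTreeClassifier(random_state=0))],
    final_estimator=KNeighborsClassifier(),
)
meta.fit(X_tr, y_tr)
acc = meta.score(X_te, y_te)
```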

15.
Water Res ; 242: 120264, 2023 Aug 15.
Article in English | MEDLINE | ID: mdl-37393807

ABSTRACT

Representing reality in a numerical model is complex. Conventionally, hydraulic models of water distribution networks are a tool for replicating water supply system behaviour through simulation by means of approximation of physical equations. A calibration process is mandatory to achieve plausible simulation results. However, calibration is affected by a set of intrinsic uncertainty sources, mainly related to the lack of system knowledge. This paper proposes a breakthrough approach for calibrating hydraulic models through a graph machine learning approach. The main idea is to create a graph neural network metamodel to estimate the network behaviour based on a limited number of monitoring sensors. Once the flows and pressures of the entire network have been estimated, a calibration is carried out to obtain the set of hydraulic parameters that best approximates the metamodel. Through this process, it is possible to estimate the uncertainty that is transferred from the few available measurements to the final hydraulic model. The paper sparks a discussion to assess under what circumstances a graph-based metamodel might be a solution for water network analysis.


Subjects
Neural Networks, Computer; Water Supply; Uncertainty; Calibration; Computer Simulation
16.
Heliyon ; 9(6): e16593, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37274681

ABSTRACT

An important open problem in building energy performance is carrying out multi-criteria optimization of real building designs. To solve this problem, a new meta-model-based method is proposed in this study. EnergyPlus™ is used as the simulation tool for building performance, and the multi-criteria Modified Coot Optimization Algorithm (MCOA) is dynamically combined with artificial neural network meta-models (ANN-MM). For generating the samples used to train and validate the ANN meta-models, the method provides an optimal scheme that minimizes the number of whole-building energy simulations required while still yielding precise, validated optimization results. The method is applied to the thermal comfort and energy efficiency optimization of a real house to achieve the optimum balance between the heating and cooling behavior of the case building; 12 effective design variables are selected for this case study. For further validation, the achieved results are compared with the "true" Pareto front found through a simulation-based optimization method. In this case study, 1280 points are found to be adequate to obtain precise results on the Pareto set; this sample size saves 75% of the physics-based simulations relative to the 5120 required by the simulation-based method. Consequently, the proposed method recovers the optimal Pareto set of a real multi-criteria building efficiency optimization problem with accurate results.

17.
Ann Oper Res ; : 1-36, 2023 Apr 07.
Article in English | MEDLINE | ID: mdl-37361056

ABSTRACT

Course timetables are the organizational foundation of a university's educational program. While students and lecturers perceive timetable quality individually according to their preferences, there are also collective criteria derived normatively such as balanced workloads or idle time avoidance. A recent challenge and opportunity in curriculum-based timetabling consists of customizing timetables with respect to individual student preferences and with respect to integrating online courses as part of modern course programs or in reaction to flexibility requirements as posed in pandemic situations. Curricula consisting of (large) lectures and (small) tutorials further open the possibility for optimizing not only the lecture and tutorial plan for all students but also the assignments of individual students to tutorial slots. In this paper, we develop a multi-level planning process for university timetabling: On the tactical level, a lecture and tutorial plan is determined for a set of study programs; on the operational level, individual timetables are generated for each student interlacing the lecture plan through a selection of tutorials from the tutorial plan favoring individual preferences. We utilize this mathematical-programming-based planning process as part of a matheuristic which implements a genetic algorithm in order to improve lecture plans, tutorial plans, and individual timetables so as to find an overall university program with well-balanced timetable performance criteria. Since the evaluation of the fitness function amounts to invoking the entire planning process, we additionally provide a proxy in the form of an artificial neural network metamodel. Computational results exhibit the procedure's capability of generating high quality schedules.

18.
Stud Health Technol Inform ; 302: 227-231, 2023 May 18.
Article in English | MEDLINE | ID: mdl-37203652

ABSTRACT

Nationwide implementation and adoption of the Prescription Centre and the Patient Data Repository services took 5.5 years from May 2010 in Finland. The Clinical Adoption Meta-Model (CAMM) was applied in the post-deployment assessment of the Kanta Services across its four dimensions (availability, use, behavior, clinical outcomes) over time. The national-level CAMM results in this study suggest 'Adoption with Benefits' as the most appropriate CAMM archetype.


Subjects
Prescriptions; Humans; Finland
19.
Heliyon ; 9(4): e15079, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37095922

ABSTRACT

Hydrological modeling, water accounting assessments, and land evaluations are well-known techniques for carrying out water resources carrying capacity (WRCC) assessments at multiple spatial levels. Using the results of an existing process-based model for assessing WRCC from very fine to national spatial scales, we propose a mathematical meta-model, i.e., a set of easily applicable simplified equations, to assess WRCC as a function of high-quality agricultural lands under optimistic to realistic scenarios. These equations are based on multi-scale spatial results. Scales include the national scale (L0), watersheds (L1), sub-watersheds (L2), and water management hydrological units (L3). Applying the meta-model at different scales could support spatial planning and water management. This method can quantify the effects of individual and collective behavior on self-sufficient WRCC and the level of dependency on external food resources in each area. Carrying capacity can be seen as the inverse of the ecological footprint. Hence, using publicly available data on the ecological footprint in Iran, the results of the proposed method are validated and provide lower and upper bounds for the biocapacity of the lands. Moreover, the results confirm the economic law of diminishing returns for carrying capacity assessment across spatial scales. The proposed meta-model can be considered a complex manifestation of land, water, plant, and human interaction for food production, and it could be used as a powerful tool in spatial planning studies.

20.
Integr Zool ; 18(6): 994-1008, 2023 Nov.
Article in English | MEDLINE | ID: mdl-36881515

ABSTRACT

The continuation of the isolated Amur tiger (Panthera tigris altaica) population living along the China-Russia border faces serious challenges due to factors such as its small size (38 individuals) and canine distemper virus (CDV). We use a population viability analysis metamodel, which consists of a traditional individual-based demographic model linked to an epidemiological model, to assess options for controlling the impact of negative factors through domestic dog management in protected areas, increasing connectivity to the neighboring large population (more than 400 individuals), and habitat expansion. Without intervention, under inbreeding depression of 3.14, 6.29, and 12.26 lethal equivalents, our metamodel predicted the probability of extinction within 100 years to be 64.4%, 90.6%, and 99.8%, respectively. In addition, the simulation results showed that dog management or habitat expansion alone will not ensure tiger population viability for the next 100 years, and connectivity to the neighboring population would only keep the population size from rapidly declining. However, when the above three conservation scenarios are combined, even at the highest level of inbreeding depression (12.26 lethal equivalents), the population size will not decline and the probability of extinction will be <5.8%. Our findings highlight that protecting the Amur tiger necessitates a multifaceted, synergistic effort. Our key management recommendations for this population underline the importance of reducing CDV threats and expanding tiger occupancy to its former range in China, but re-establishing habitat connectivity to the neighboring population is an important long-term objective.


Subjects
Distemper Virus, Canine; Distemper; Dog Diseases; Tigers; Animals; Dogs; Distemper/epidemiology; Population Density; Russian Federation