Results 1 - 20 of 49
1.
Sci Rep ; 14(1): 22189, 2024 Sep 27.
Article in English | MEDLINE | ID: mdl-39333634

ABSTRACT

In the domain of control engineering, effectively tuning the parameters of proportional-integral-derivative (PID) controllers has persistently posed a challenge. This study proposes a hybrid algorithm (HGJGSO) that combines golden jackal optimization (GJO) and the golden sine algorithm (Gold-SA) for tuning PID controllers. To accelerate the convergence of GJO, a nonlinear parameter adaptation strategy is incorporated. The improved GJO is then combined with Gold-SA, capitalizing on the expedited convergence of the improved GJO and the global optimization and precise search capabilities of Gold-SA. HGJGSO exploits the strengths of both algorithms, facilitating a comprehensive and balanced exploration and exploitation. The effectiveness of HGJGSO is assessed by tuning PID controllers for three typical systems. The results indicate that HGJGSO surpasses the compared tuning methods. To evaluate its applicability, HGJGSO is also used to tune the cascade PID controllers for trajectory tracking in a quadrotor UAV, where the results demonstrate its superiority in addressing practical challenges.
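The cost a tuner like HGJGSO minimizes can be sketched as an ITAE (integral of time-weighted absolute error) evaluation of candidate PID gains on a discretized plant. The second-order plant, the gains in the usage note, and the forward-Euler discretization are illustrative assumptions, not the paper's benchmark systems.

```python
def itae_cost(kp, ki, kd, dt=0.01, steps=2000):
    """ITAE cost of a PID loop around an illustrative plant x'' + 2x' + x = u,
    simulated with forward Euler under a unit step reference. A metaheuristic
    tuner would call this as its fitness function and minimize it."""
    x1 = x2 = 0.0            # plant states: position, velocity
    integ = prev_err = 0.0   # PID integrator state and previous error
    cost = 0.0
    for k in range(steps):
        err = 1.0 - x1                        # unit step reference
        integ += err * dt
        deriv = (err - prev_err) / dt if k else 0.0
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        x2 += (u - 2.0 * x2 - x1) * dt        # plant dynamics (Euler step)
        x1 += x2 * dt
        cost += (k * dt) * abs(err) * dt      # ITAE: integral of t * |e(t)|
    return cost
```

A reasonably tuned gain set (e.g. kp=10, ki=5, kd=2) yields a far lower ITAE than a nearly open loop, which is exactly the ranking signal the optimizer exploits.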

2.
Cancers (Basel) ; 16(18)2024 Sep 23.
Article in English | MEDLINE | ID: mdl-39335213

ABSTRACT

AIM OF THE STUDY: To investigate the incidence of non-mapped isolated metastatic pelvic lymph nodes at pre-defined anatomical positions. PATIENTS AND METHODS: Between June 2019 and January 2024, women with uterine-confined endometrial cancer (EC) deemed suitable for robotic surgery and the detection of pelvic sentinel lymph nodes (SLNs) were included. An anatomically based, published algorithm using indocyanine green (ICG) as a tracer was adhered to. In women with no ICG mapping in the proximal obturator and/or the interiliac positions, defined as "typical positions", those nodes were removed and designated as "SLN anatomy". Ultrastaging and immunohistochemistry were applied to all SLNs. The proportion of isolated metastases in "SLN anatomy" nodes was evaluated. RESULTS: Non-mapping of either the obturator or interiliac area occurred in 180 of the 620 women (29%). In total, 114 women (18.4%) were node-positive, and five of these women (4.3%) had isolated metastases in an "SLN anatomy" node, implying a correspondingly lower sensitivity of an ICG-only algorithm. CONCLUSION: In an optimized SLN algorithm for endometrial cancer, if mapping fails in either the proximal obturator or interiliac area, nodes should be removed from those defined anatomic positions despite mapping at other positions, to avoid undetected nodal metastases in 4.3% of node-positive women.

3.
J Contam Hydrol ; 265: 104385, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38878553

ABSTRACT

This study develops a multi-objective quantitative-qualitative reservoir operation model (MOQQROM) using a simulation-optimization approach. The main challenge of such models is their computational complexity. The simulation-optimization method used here couples CE-QUAL-W2, a hydrodynamic and water quality simulation model, with a multi-objective firefly algorithm-k-nearest-neighbor (MOFA-KNN) optimizer, which reduces the computational burden of simulation-optimization by decreasing the number of simulation model calls. MOFA-KNN was extended for this study, and its performance was evaluated within the MOQQROM. Three objectives were considered: (1) the sum of the squared mass of total dissolved solids (TDS) and (2) the sum of the squared temperature difference between reservoir inflow and outflow, as water quality objectives, and (3) the vulnerability index, as a water quantity objective. The Aidoghmoush reservoir was used as a case study, and the model was investigated under three scenarios: normal, wet, and dry years. Results showed the extended MOFA-KNN reduced the number of original simulation model calls by more than 99% compared to the total number of simulations in the MOQQROM, indicating a significant reduction in execution time. The three most desirable operating policies for each objective were selected for investigation. The operation policy with the best value for the second objective could be chosen as a compromise policy to balance the two conflicting goals of improving quality and supplying demand in the normal and wet scenarios. In terms of contamination mass, this policy was, on average, 16% worse than the first policy and 40% better than the third policy in the normal scenario; in the wet scenario, it was, on average, 55% worse than the first policy and 16% better than the third policy.
The outflow temperature of this policy was, on average, only 8.35% different from the inflow temperature in the normal scenario and 0.93% different in the wet scenario. The performance of the developed model is satisfactory.
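The call-saving mechanism behind a KNN-assisted optimizer like MOFA-KNN can be sketched as a surrogate that archives expensive simulator evaluations and answers nearby queries from its k nearest archived neighbors, invoking the real simulator only when no archived point is close enough. The class name, radius test, and quadratic toy simulator below are illustrative assumptions, not the CE-QUAL-W2 coupling itself.

```python
import math

class KNNSurrogate:
    """Answer objective queries from the k nearest archived (x, f(x)) pairs,
    falling back to the expensive simulator only when no archived point lies
    within `radius` -- the mechanism that cuts simulation-model calls."""
    def __init__(self, simulator, k=3, radius=0.5):
        self.simulator, self.k, self.radius = simulator, k, radius
        self.archive = []        # list of (x, f) pairs seen so far
        self.true_calls = 0      # how often the real simulator ran

    def evaluate(self, x):
        near = sorted(self.archive, key=lambda p: math.dist(x, p[0]))[:self.k]
        if near and math.dist(x, near[0][0]) <= self.radius:
            return sum(f for _, f in near) / len(near)   # surrogate answer
        f = self.simulator(x)                            # expensive call
        self.true_calls += 1
        self.archive.append((x, f))
        return f
```

Querying a point close to an archived one is then served from the archive, so `true_calls` grows far more slowly than the number of fitness evaluations.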


Subjects
Theoretical Models, Water Quality, Water Supply, Algorithms, Computer Simulation
4.
Sci Rep ; 14(1): 11259, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38755222

ABSTRACT

As the terminal of the power system, the distribution network is the main area where failures occur. Moreover, with the integration of distributed generation, the traditional distribution network becomes more complex, rendering conventional fault location algorithms based on a single power supply obsolete. A new algorithm is therefore needed to locate faults in distribution networks with distributed generation. In existing fault localization algorithms for distribution networks, since a line fault has only two states, usually represented by 0 and 1, most approaches use discrete algorithms exploiting this characteristic for iterative optimization. This paper instead combines the advantages of the particle swarm algorithm and the genetic algorithm and iterates on continuous real numbers to construct a successive particle swarm genetic algorithm (SPSO-GA). The accuracy, speed, and fault tolerance of SPSO-GA, the discrete particle swarm genetic algorithm, and the artificial fish swarm algorithm are compared on an IEEE 33-node distribution network with distributed power supplies. The simulation results show that SPSO-GA has high optimization accuracy and stability for single, double, or triple faults. Furthermore, SPSO-GA converges rapidly, requires fewer particles, and can accurately locate the faulted segment in a distribution network containing distorted information.
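The continuous-to-binary bridge that lets SPSO-GA iterate on real numbers for a 0/1 fault vector can be sketched by thresholding particle positions and scoring candidates against the observed overcurrent pattern. Both the 0.5 threshold and the Hamming-style fitness below are illustrative assumptions, not the paper's exact formulation.

```python
def to_fault_vector(position):
    """Threshold a continuous SPSO-GA particle into binary line states
    (0 = healthy, 1 = faulted)."""
    return [1 if p >= 0.5 else 0 for p in position]

def fault_fitness(candidate, observed_overcurrent):
    """Toy fitness: count mismatches between the overcurrent flags a candidate
    fault vector implies and those actually measured (lower is better)."""
    return sum(c != o for c, o in zip(candidate, observed_overcurrent))
```

The optimizer keeps moving particles in continuous space while fitness is always evaluated on the thresholded binary vector.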

5.
Heliyon ; 10(7): e29006, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38601575

ABSTRACT

The estimation of groundwater levels is a crucial step in ensuring sustainable management of water resources. In this paper, selected piezometers of the Hamedan-Bahar plain, located in western Iran, were studied. The main objective of this study is to compare the effect of various pre-processing methods applied to the input data of different artificial intelligence (AI) models for predicting groundwater levels (GWLs). The observed GWL, evaporation, precipitation, and temperature were used as input variables in the AI algorithms. First, 126 data pre-processing methods were implemented in Python, classified into three classes: (1) statistical methods, (2) wavelet transform methods, and (3) decomposition methods. The variously pre-processed data were then used by four widely used AI models with different kernels: Support Vector Regression (SVR), Artificial Neural Network (ANN), Long Short-Term Memory (LSTM), and Pelican Optimization Algorithm-Artificial Neural Network (POA-ANN), grouped into three classes: (1) machine learning (SVR and ANN), (2) deep learning (LSTM), and (3) hybrid ML (POA-ANN), to predict GWLs. The Akaike Information Criterion (AIC) was used to evaluate and validate the predictive accuracy of the algorithms. Summing the AIC values over the train and test phases of 1778 models, the average AIC for the ML, DL, and hybrid-ML classes decreased by 25.3%, 29.6%, and 57.8%, respectively. The results therefore show that not all data pre-processing methods improve prediction accuracy, and they should be selected carefully by trial and error. In conclusion, the wavelet-ANN model with Daubechies 13 and 25 neurons (db13_ANN_25) is the best model for predicting GWL, with an AIC of -204.9, a 5.23% improvement over the model without any pre-processing (ANN_Relu_25, AIC = -194.7).
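For readers unfamiliar with AIC-based model ranking, a common Gaussian-likelihood form computed from the residual sum of squares is sketched below. The abstract does not state which AIC variant was used, so this formula is an assumption; what matters is the trade-off it encodes between fit and model complexity.

```python
import math

def aic_from_rss(n, rss, k):
    """Gaussian-likelihood AIC from a residual sum of squares:
    AIC = n * ln(RSS / n) + 2k.  Lower is better; the 2k term
    penalizes models with more free parameters."""
    return n * math.log(rss / n) + 2 * k

def select_by_aic(candidates):
    """candidates: iterable of (name, n, rss, k); return the name of the
    candidate with the lowest AIC."""
    return min(candidates, key=lambda c: aic_from_rss(c[1], c[2], c[3]))[0]
```

A tiny reduction in residuals does not justify many extra parameters, which is exactly why AIC can rank a heavily pre-processed model below a simpler one.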

6.
Front Big Data ; 7: 1358486, 2024.
Article in English | MEDLINE | ID: mdl-38449564

ABSTRACT

As the volume and velocity of Big Data continue to grow, traditional cloud computing struggles to meet the demands of real-time processing and low latency. Fog computing, with its distributed network of edge devices, emerges as a compelling solution. However, efficient task scheduling in fog computing remains a challenge due to its inherently multi-objective nature, balancing factors such as execution time, response time, and resource utilization. This paper proposes a hybrid Genetic Algorithm (GA)-Particle Swarm Optimization (PSO) algorithm to optimize multi-objective task scheduling in fog computing environments. The hybrid approach combines the strengths of GA and PSO, achieving effective exploration and exploitation of the search space and improved performance over traditional single-algorithm approaches. With varying task inputs, the proposed hybrid algorithm improved execution time by 85.68%, 84%, and 51.03% compared with GA, hybrid PWOA, and PSO, respectively; response time by 67.28%, 54.24%, and 75.40%; and completion time by 68.69%, 98.91%, and 75.90%. With varying numbers of fog nodes, it improved execution time by 84.87%, 88.64%, and 85.07%; response time by 65.92%, 80.51%, and 85.26%; and completion time by 67.60%, 81.34%, and 85.23% compared with GA, hybrid PWOA, and PSO, respectively.
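A minimal sketch of the hybrid idea: evaluate task-to-node assignments by makespan, then apply one GA-style selection/crossover/mutation step to the population a PSO would otherwise update. The fitness, operators, and parameters below are illustrative assumptions, not the paper's exact scheduler.

```python
import random

def makespan(assign, task_len, node_speed):
    """Completion time of the slowest fog node under a task->node assignment."""
    load = [0.0] * len(node_speed)
    for t, n in enumerate(assign):
        load[n] += task_len[t] / node_speed[n]
    return max(load)

def hybrid_step(pop, task_len, node_speed, rng):
    """One illustrative GA-on-top-of-PSO step: keep the better half
    (selection), refill with uniform crossover of random surviving parents,
    then mutate one gene of each child."""
    pop.sort(key=lambda a: makespan(a, task_len, node_speed))
    half = pop[: len(pop) // 2]
    children = []
    while len(half) + len(children) < len(pop):
        p, q = rng.sample(half, 2)
        child = [p[i] if rng.random() < 0.5 else q[i] for i in range(len(p))]
        child[rng.randrange(len(child))] = rng.randrange(len(node_speed))
        children.append(child)
    return half + children
```

Because the better half is carried over unchanged (elitism), the best makespan in the population can only improve or stay equal across iterations.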

7.
Stud Health Technol Inform ; 305: 190-193, 2023 Jun 29.
Article in English | MEDLINE | ID: mdl-37386993

ABSTRACT

Process Mining is a technique for the analysis and mining of existing process flows. Machine Learning, on the other hand, is a data science field and a sub-branch of Artificial Intelligence whose main purpose is to replicate human behavior through algorithms. The separate application of Process Mining and Machine Learning for healthcare purposes has been widely explored, with numerous published works discussing their use. However, their simultaneous application is still a growing field with ongoing studies. This paper proposes a feasible framework in which Process Mining and Machine Learning can be used in combination within the healthcare environment.


Subjects
Artificial Intelligence, Chronic Renal Insufficiency, Humans, Machine Learning, Patients, Chronic Renal Insufficiency/diagnosis, Chronic Renal Insufficiency/therapy, Disease Progression
8.
Entropy (Basel) ; 25(3)2023 Mar 21.
Article in English | MEDLINE | ID: mdl-36981429

ABSTRACT

Recent advances in quantum hardware offer new approaches to solve various optimization problems that can be computationally expensive when classical algorithms are employed. We propose a hybrid quantum-classical algorithm to solve a dynamic asset allocation problem where a target return and a target risk metric (expected shortfall) are specified. We propose an iterative algorithm that treats the target return as a constraint in a Markowitz portfolio optimization model, and dynamically adjusts the target return to satisfy the targeted expected shortfall. The Markowitz optimization is formulated as a Quadratic Unconstrained Binary Optimization (QUBO) problem. The use of the expected shortfall risk metric enables the modeling of extreme market events. We compare the results from D-Wave's 2000Q and Advantage quantum annealers using real-world financial data. Both quantum annealers are able to generate portfolios with more than 80% of the return of the classical optimal solutions, while satisfying the expected shortfall. We observe that experiments on assets with higher correlations tend to perform better, which may help to design practical quantum applications in the near term.
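The Markowitz-to-QUBO step can be sketched for a binary pick-or-skip portfolio: the return target enters as a quadratic penalty, yielding the matrix form a quantum annealer accepts. The penalty weight and toy data below are illustrative assumptions; the paper's model (and its expected-shortfall adjustment loop) is richer.

```python
def markowitz_qubo(mu, cov, target_return, penalty=10.0):
    """QUBO matrix Q for binary asset picks x in {0,1}^n:
    minimize x' Cov x + penalty * (mu'x - target)^2.
    Expanding the penalty (using x_i^2 = x_i for binaries) adds
    penalty*mu_i*mu_j off-diagonal and
    penalty*(mu_i^2 - 2*target*mu_i) on the diagonal."""
    n = len(mu)
    Q = [[cov[i][j] for j in range(n)] for i in range(n)]
    for i in range(n):
        Q[i][i] += penalty * (mu[i] * mu[i] - 2.0 * target_return * mu[i])
        for j in range(n):
            if i != j:
                Q[i][j] += penalty * mu[i] * mu[j]
    return Q

def qubo_energy(Q, x):
    """Energy x'Qx of a binary vector x (the quantity an annealer minimizes)."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
```

For a small instance the minimum-energy bitstring can be checked by brute force; with equal per-asset risk, the combination whose total return hits the target wins.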

9.
Heliyon ; 9(1): e12802, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36704286

ABSTRACT

Despite their stochastic and uncertain nature, wind and solar resources are the most abundant energy resources used in the development of microgrid systems. In microgrid systems and distribution networks, the uncertainty of both solar and wind resources causes power quality and system stability issues. This randomness is controlled through the precise development of a power prediction model. Fuzzy-based solar PV and wind prediction models can manage this randomness and uncertainty efficiently. However, the fuzzy approach has several drawbacks: its performance is limited when the historical wind and solar data are very large, and it involves many membership functions for the fuzzy input and output variables as well as multiple fuzzy rules. The hybrid Fuzzy-PSO intelligent prediction approach mitigates these limitations and hence increases the prediction model's performance. The Fuzzy-PSO hybrid forecast model is developed in MATLAB using the particle swarm optimization (PSO) algorithm from the Global Optimization Toolbox. In this paper, an error correction factor (ECF), which depends on the validation and forecasted data of both the wind and solar prediction models, is introduced as a new fuzzy input variable to improve the accuracy of the prediction model. The impact of ECF is observed in the fuzzy, Fuzzy-PSO, and Fuzzy-GA wind and solar PV power forecasting models. The hybrid Fuzzy-PSO prediction model of wind and solar power generation achieves higher accuracy than the Fuzzy and Fuzzy-GA forecasting models. The rest of this paper is organized as follows: Section II analyzes the raw solar and wind resource data; Section III formulates the Fuzzy-PSO prediction model; Section IV presents the results and discussion; Section V concludes. The references and abbreviations are presented at the end of the paper.

10.
Neth Heart J ; 31(3): 117-123, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36445615

ABSTRACT

INTRODUCTION: In a Dutch heart centre, a dedicated chronic total occlusion (CTO) team was implemented in June 2017. The aim of this study was to evaluate treatment success and clinical outcomes before and after this implementation. METHODS: A total of 662 patients who underwent percutaneous coronary intervention (PCI) for a CTO between January 2013 and June 2020 were included and divided into pre- and post-CTO team groups. The primary endpoint was the angiographic success rate of CTO-PCI. Secondary endpoints included angiographic success stratified by complexity using the J­CTO score and the following clinical outcomes: in-hospital complications and myocardial infarction, target vessel revascularisation, all-cause mortality, quality of life (QoL) and major adverse cardiac events (MACE) at 30-day and 1­year follow-up. RESULTS: Compared with the pre-CTO team group, the success rate in the post-CTO team group was higher after the first attempt (81.4% vs 62.7%; p < 0.001) and final attempt (86.7% vs 73.8%; p = 0.001). This was mainly driven by higher success rates for difficult and very difficult CTO lesions according to the J­CTO score. The MACE rate at 1 year was lower in the post-CTO team group than in the pre-CTO team group (6.4% vs 16.0%; p < 0.01), while it was comparable at 30-day follow-up (0.1% vs 1.7%; p = 0.74). Angina symptoms were significantly reduced at 30-day and 1­year follow-up, and QoL scores were higher after 1 year. CONCLUSION: This study demonstrated higher success rates of CTO-PCI and improved clinical outcomes and QoL at 1­year follow-up after implementation of a dedicated CTO team using the hybrid algorithm.

11.
J Cardiovasc Dev Dis ; 11(1)2023 Dec 22.
Article in English | MEDLINE | ID: mdl-38248873

ABSTRACT

Whereas coronary computed tomography angiography (CCTA) exceeds invasive angiography for predicting the procedural outcome of chronic total occlusion (CTO) percutaneous coronary intervention (PCI), CCTA-derived scores have never been validated in the hybrid CTO PCI population. In this single-center, retrospective, observational study, we included 108 consecutive patients with 110 CTO lesions and preprocedural CCTA who underwent hybrid CTO PCI to assess the diagnostic accuracy of CCTA-derived scoring systems. Successful guidewire crossing within 30 min was set as the primary endpoint. The secondary endpoints were final procedural success and the need for using any non-antegrade wiring (AW) strategy within the hybrid algorithm. Time-efficient guidewire crossing and final procedural success were achieved in 53.6% and 89.1% of lesions, respectively, while in 36.4% of the procedures, any non-AW strategy was applied. The median J-CTO score was 1 (interquartile range (IQR): 0, 2), while the CT-RECTOR, KCCT, J-CTOCCTA, and RECHARGECCTA scores were 2 (IQR: 1, 3), 3 (IQR: 2, 5), 1 (IQR: 0, 3), and 2 (IQR: 1, 3), respectively. All scores were significantly higher in the lesions with failed versus successful time-efficient guidewire crossing. Although all of the CCTA-derived scores had numerically higher predictive values than the angiographic J-CTO score, no significant differences were noted between the scores in any of the analyzed study endpoints. High sensitivity of the CT-RECTOR and RECHARGECCTA scores (both 89.8%) for predicting successful guidewire crossing within 30 min, and high sensitivity (90.8%) of the KCCT score for predicting final procedural success, were noted. CCTA-derived scoring systems are accurate, noninvasive tools for the prediction of the procedural outcome of hybrid CTO PCI, and may aid in identifying the need for use of the hybrid algorithm.

12.
Comput Biol Med ; 151(Pt A): 106236, 2022 12.
Article in English | MEDLINE | ID: mdl-36370584

ABSTRACT

By taking a new perspective that combines a machine learning method with an evolutionary algorithm, a new hybrid algorithm is developed to predict cancer driver genes. First, inspired by the globally exploring search strategies of evolutionary algorithms, a gravitational kernel is proposed to act on the full range of gene features. Constructed by fusing PPI and mutation features, the gravitational kernel is capable of producing repulsion effects: candidate genes with greater mutation effects and PPI receive higher similarity scores. Under repulsion, the similarity scores of these promising genes exceed those of ordinary genes, which aids the search for them. Second, inspired by the idea of elite populations in evolutionary algorithms, the concept of the vital few is proposed. Acting at a local scale on the candidate genes associated with vital-few genes, these vital-few driver genes attract genes with similar mutational effects, leading to greater similarity scores. Lastly, the model and parameters are optimized using an evolutionary algorithm to obtain the optimal model and parameters for cancer driver gene prediction. A comparison is performed with six other advanced methods of cancer driver gene prediction. According to the experimental results, the proposed method outperforms these six state-of-the-art algorithms on the pan-oncogene dataset.


Subjects
Algorithms, Neoplasms, Humans, Oncogenes, Machine Learning, Mutation, Neoplasms/genetics
13.
Sensors (Basel) ; 22(21)2022 Oct 24.
Article in English | MEDLINE | ID: mdl-36365820

ABSTRACT

Impact force is the most common form of load which acts on engineering structures and presents a great hidden risk to the healthy operation of machinery. Therefore, the identification or monitoring of impact forces is a significant issue in structural health monitoring. The conventional optimisation scheme based on inversion techniques requires a significant amount of time to identify random impact forces (impact force localisation and time history reconstruction) and is not suitable for engineering applications. Recently, a pattern recognition method combined with the similarity metric, PRMCSM, has been proposed, which exhibits rapidity in practical engineering applications. This study proposes a novel scheme for identifying unknown random impact forces which hybridises two existing methods and combines the advantages of both. The experimental results indicate that the localisation accuracy of the proposed algorithm (100%) is higher than that of PRMCSM (92%), and the calculation time of the hybrid algorithm (179 s) for 25 validation cases is approximately one nineteenth of the traditional optimisation strategy (3446 s).


Subjects
Algorithms, Steel, Mechanical Phenomena
14.
Comput Biol Med ; 150: 106003, 2022 11.
Article in English | MEDLINE | ID: mdl-36228462

ABSTRACT

Medical image segmentation is a crucial step in Computer-Aided Diagnosis systems, where accurate segmentation is vital for perfect disease diagnoses. This paper proposes a multilevel thresholding technique for 2D and 3D medical image segmentation using Otsu and Kapur's entropy methods as fitness functions to determine the optimum threshold values. The proposed algorithm applies the hybridization concept between the recent Coronavirus Optimization Algorithm (COVIDOA) and Harris Hawks Optimization Algorithm (HHOA) to benefit from both algorithms' strengths and overcome their limitations. The improved performance of the proposed algorithm over COVIDOA and HHOA algorithms is demonstrated by solving 5 test problems from IEEE CEC 2019 benchmark problems. Medical image segmentation is tested using two groups of images, including 2D medical images and volumetric (3D) medical images, to demonstrate its superior performance. The utilized test images are from different modalities such as Magnetic Resonance Imaging (MRI), Computed Tomography (CT), and X-ray images. The proposed algorithm is compared with seven well-known metaheuristic algorithms, where the performance is evaluated using four different metrics, including the best fitness values, Peak Signal to Noise Ratio (PSNR), Structural Similarity Index (SSIM), and Normalized Correlation Coefficient (NCC). The experimental results demonstrate the superior performance of the proposed algorithm in terms of convergence to the global optimum and making a good balance between exploration and exploitation properties. Moreover, the quality of the segmented images using the proposed algorithm at different threshold levels is better than the other methods according to PSNR, SSIM, and NCC values. Additionally, the Wilcoxon rank-sum test is conducted to prove the statistical significance of the proposed algorithm.
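The Otsu fitness that metaheuristics such as the proposed hybrid maximize can be sketched directly from an image histogram: for a candidate threshold vector, score the between-class variance of the resulting gray-level classes. This is a generic sketch of the fitness function only, not of COVIDOA or HHOA themselves.

```python
def between_class_variance(hist, thresholds):
    """Otsu fitness for multilevel thresholding: sum over classes of
    w_c * (mu_c - mu_total)^2, where w_c and mu_c are the probability mass
    and mean gray level of class c.  `hist` is a gray-level histogram and
    `thresholds` the candidate cut points; higher is better."""
    total = sum(hist)
    probs = [h / total for h in hist]
    mu_total = sum(i * p for i, p in enumerate(probs))
    bounds = [0] + sorted(thresholds) + [len(hist)]
    fitness = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = sum(probs[lo:hi])          # class probability mass
        if w == 0:
            continue                   # empty class contributes nothing
        mu = sum(i * probs[i] for i in range(lo, hi)) / w
        fitness += w * (mu - mu_total) ** 2
    return fitness
```

On a bimodal histogram, a threshold placed between the two modes scores strictly higher than one that lumps both modes into a single class, which is the gradient-free signal the optimizer climbs.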


Subjects
Coronavirus, X-Ray Computed Tomography/methods, Three-Dimensional Imaging/methods, Magnetic Resonance Imaging, Algorithms, Computer-Assisted Image Processing/methods
15.
Math Biosci Eng ; 19(11): 10963-11017, 2022 08 01.
Article in English | MEDLINE | ID: mdl-36124577

ABSTRACT

Aquila Optimizer (AO) and African Vultures Optimization Algorithm (AVOA) are two newly developed meta-heuristic algorithms that simulate several intelligent hunting behaviors of Aquila and African vulture in nature, respectively. AO has powerful global exploration capability, whereas its local exploitation phase is not stable enough. On the other hand, AVOA possesses promising exploitation capability but insufficient exploration mechanisms. Based on the characteristics of both algorithms, in this paper, we propose an improved hybrid AO and AVOA optimizer called IHAOAVOA to overcome the deficiencies in the single algorithm and provide higher-quality solutions for solving global optimization problems. First, the exploration phase of AO and the exploitation phase of AVOA are combined to retain the valuable search competence of each. Then, a new composite opposition-based learning (COBL) is designed to increase the population diversity and help the hybrid algorithm escape from the local optima. In addition, to more effectively guide the search process and balance the exploration and exploitation, the fitness-distance balance (FDB) selection strategy is introduced to modify the core position update formula. The performance of the proposed IHAOAVOA is comprehensively investigated and analyzed by comparing against the basic AO, AVOA, and six state-of-the-art algorithms on 23 classical benchmark functions and the IEEE CEC2019 test suite. Experimental results demonstrate that IHAOAVOA achieves superior solution accuracy, convergence speed, and local optima avoidance than other comparison methods on most test functions. Furthermore, the practicality of IHAOAVOA is highlighted by solving five engineering design problems. Our findings reveal that the proposed technique is also highly competitive and promising when addressing real-world optimization tasks. The source code of the IHAOAVOA is publicly available at https://doi.org/10.24433/CO.2373662.v1.
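The opposition-based learning building block underlying COBL can be sketched as follows; the paper's composite variant adds further machinery, so this greedy elementary OBL step is only an illustrative assumption.

```python
def opposition(x, lower, upper):
    """Opposition-based learning point: x_i -> lower_i + upper_i - x_i,
    the mirror image of x within the search bounds."""
    return [lo + hi - v for v, lo, hi in zip(x, lower, upper)]

def obl_improve(pop, fitness, lower, upper):
    """Greedy OBL step: for each individual, keep the better of the point
    and its opposite (minimization), injecting diversity at no extra cost
    beyond one fitness call per individual."""
    out = []
    for x in pop:
        opp = opposition(x, lower, upper)
        out.append(x if fitness(x) <= fitness(opp) else opp)
    return out
```

The intuition: if the current point is far from the optimum, its opposite often lies closer, so evaluating both roughly doubles the chance of a good start and helps escape local optima.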


Subjects
Eagles, Algorithms, Animals, Engineering, Learning, Problem Solving
16.
Res Math Sci ; 9(3): 51, 2022.
Article in English | MEDLINE | ID: mdl-35915747

ABSTRACT

With the advent of the COVID-19 vaccine, shipping and distribution are crucial to controlling the pandemic. In this paper, we build a mean-field variational problem in a spatial domain, which controls the propagation of pandemics via an optimal-transportation strategy for vaccine distribution. Here, we integrate the vaccine distribution into the mean-field SIR model designed in Lee W, Liu S, Tembine H, Li W, Osher S (2020) Controlling propagation of epidemics via mean-field games, arXiv preprint arXiv:2006.01249. Numerical examples demonstrate that the proposed model provides practical strategies for vaccine distribution in a spatial domain.
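A schematic (non-spatial) form of the controlled SIR dynamics, with $v(t)$ the vaccination rate acting as the control, can be written as follows; the paper's mean-field model adds spatial transport terms and an optimal-control functional on top of this, so the system below is only an orienting sketch.

```latex
\begin{aligned}
\frac{dS}{dt} &= -\beta S I - v(t), \\
\frac{dI}{dt} &= \beta S I - \gamma I, \\
\frac{dR}{dt} &= \gamma I + v(t),
\end{aligned}
```

where $\beta$ is the transmission rate and $\gamma$ the recovery rate; vaccination moves mass directly from the susceptible to the removed compartment.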

17.
Materials (Basel) ; 15(9)2022 Apr 21.
Article in English | MEDLINE | ID: mdl-35591352

ABSTRACT

Fiber-reinforced polymer (FRP) has several benefits, in addition to excellent tensile strength and low self-weight, including corrosion resistance, high durability, and easy construction, making it among the most optimal options for concrete structure restoration. The bond behavior of the FRP-concrete (FRPC) interface, on the other hand, is extremely intricate, making the bond strength challenging to estimate. As a result, a robust modeling framework is necessary. In this paper, data-driven hybrid models are developed by combining state-of-the-art population-based algorithms (bald eagle search (BES), dynamic fitness distance balance-manta ray foraging optimization (dFDB-MRFO), RUNge Kutta optimizer (RUN)) and artificial neural networks (ANN), named "BES-ANN", "dFDB-MRFO-ANN", and "RUN-ANN", to estimate the FRPC interfacial bond strength accurately. The efficacy of these models in predicting bond strength is examined using an extensive database of 969 experimental samples. Compared to the BES-ANN and dFDB-MRFO-ANN models, the RUN-ANN model better estimates the interfacial bond strength. In addition, the SHapley Additive Explanations (SHAP) approach is used to help interpret the best model and examine how the features influence the model's outcome. Among the studied hybrid models, the RUN-ANN algorithm is the most accurate, with the highest coefficient of determination (R2 = 92%), the least mean absolute error (0.078), and the least coefficient of variation (18.6%). The RUN-ANN algorithm also outperformed mechanics-based models. Based on the SHAP and sensitivity analysis methods, the FRP bond length and width contribute most to the final prediction results.

18.
BMC Bioinformatics ; 23(1): 122, 2022 Apr 07.
Article in English | MEDLINE | ID: mdl-35392798

ABSTRACT

BACKGROUND: The assembly task is an indispensable step in sequencing genomes of new organisms and studying structural genomic changes. In recent years, the dynamic development of next-generation sequencing (NGS) methods raises hopes for making whole-genome sequencing a fast and reliable tool used, for example, in medical diagnostics. However, this is hampered by the slowness and computational requirements of the current processing algorithms, which raises the need to develop more efficient algorithms. One possible approach, still little explored, is the use of quantum computing. RESULTS: We present a proof of concept of de novo assembly algorithm, using the Genomic Signal Processing approach, detecting overlaps between DNA reads by calculating the Pearson correlation coefficient and formulating the assembly problem as an optimization task (Traveling Salesman Problem). Computations performed on a classic computer were compared with the results achieved by a hybrid method combining CPU and QPU calculations. For this purpose quantum annealer by D-Wave was used. The experiments were performed with artificially generated data and DNA reads coming from a simulator, with actual organism genomes used as input sequences. To our knowledge, this work is one of the few where actual sequences of organisms were used to study the de novo assembly task on quantum annealer. CONCLUSIONS: Proof of concept carried out by us showed that the use of quantum annealer (QA) for the de novo assembly task might be a promising alternative to the computations performed in the classical model. The current computing power of the available devices requires a hybrid approach (combining CPU and QPU computations). The next step may be developing a hybrid algorithm strictly dedicated to the de novo assembly task, using its specificity (e.g. the sparsity and bounded degree of the overlap-layout-consensus graph).
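The overlap test at the heart of the Genomic Signal Processing step can be sketched as a Pearson correlation between numerically encoded read ends: a high correlation between the suffix of one read and the prefix of another suggests an overlap, which then weights an edge of the Traveling Salesman instance. The base encoding and fixed window length k below are illustrative assumptions.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length numeric signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

ENC = {"A": 1, "C": 2, "G": 3, "T": 4}   # toy numeric encoding of bases

def overlap_score(read1, read2, k):
    """Correlate the k-base suffix of read1 with the k-base prefix of read2;
    scores near 1 indicate a likely overlap (read2 follows read1)."""
    s1 = [ENC[c] for c in read1[-k:]]
    s2 = [ENC[c] for c in read2[:k]]
    return pearson(s1, s2)
```

Ordering the reads to maximize consecutive overlap scores is then a TSP over these pairwise scores, which is the form handed to the quantum annealer.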


Subjects
Computational Methodologies, Quantum Theory, Algorithms, Base Sequence, DNA/genetics, High-Throughput Nucleotide Sequencing/methods, DNA Sequence Analysis/methods
19.
Sensors (Basel) ; 22(3)2022 Jan 24.
Article in English | MEDLINE | ID: mdl-35161630

ABSTRACT

Waste mine water is produced in the process of coal mining and is the main cause of mine flooding and environmental pollution. Economical treatment and efficient reuse of mine water are therefore among the main research directions in mining areas at present, and using an intelligent algorithm to realize optimal allocation and economical reuse of mine water is an urgent problem. To solve it, this paper first designs a reuse mathematical model based on the mine water treatment system, which includes the mine water reuse rate, the reuse cost at different stages, and the operational efficiency of the whole treatment system. Then, a hybrid optimization algorithm, GAPSO, is proposed by combining a genetic algorithm (GA) and particle swarm optimization (PSO), with adaptive improvements (TSA-GAPSO) applied in the two optimization stages. Finally, simulation analysis and tests on actual data of the mine water reuse model are carried out using the four algorithms. The results show that the hybrid improved algorithm has better convergence speed and precision in solving the mine water scheduling problem. The TSA-GAPSO algorithm performs best and is superior to the other three algorithms: the cost of mine water reuse is reduced by 9.09%, and the treatment efficiency of the whole system is improved by 5.81%, which proves the practicability and superiority of the algorithm.


Subjects
Algorithms, Theoretical Models, Computer Simulation, Environmental Pollution, Floods
20.
J Environ Manage ; 303: 114252, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-34894493

ABSTRACT

Many companies and organizations are pursuing "carbon footprint" projects to estimate their own contribution, owing to growing concerns about global climate change and carbon emissions. Measures such as carbon taxes are among the most powerful means of dealing with the threats of climate change, and in recent years researchers have shown particular interest in modelling supply chain networks under this scheme. The disorganized disposal of by-products from sugarcane mills is the inspiration for this research. To connect the problem with the real world, the proposed sustainable sugarcane supply chain network considers carbon taxes on emissions from industries and during the transportation of goods. The presented mixed-integer linear programming model is a location-allocation problem and, owing to its inherent complexity, is NP-hard. To solve the model, three established metaheuristics, the Genetic Algorithm (GA), Simulated Annealing (SA), and the Social Engineering Optimizer (SEO), together with hybrid methods based on them, namely Genetic-Simulated Annealing (GASA) and the Genetic-Social Engineering Optimizer (GASEO), are employed. The control parameters of the algorithms are tuned using the Taguchi approach. Subsequently, one-way ANOVA is used to elucidate the performance of the proposed algorithms, which corroborates the superior performance of the proposed GASEO.


Subjects
Theoretical Models, Saccharum, Algorithms, Carbon Footprint, Transportation