ABSTRACT
Several methods have been applied to structural optimum design problems since lightweight design of structures became a necessity, yet no single method in the structural engineering field is capable of providing efficient solutions to all structural optimum design problems. Many methods have therefore been proposed and utilized, sometimes giving promising results and sometimes producing quite unacceptable solutions. This behaviour is common among metaheuristic algorithms, which are suitable approaches for this class of problems, and is supported by the No Free Lunch theorem. Researchers are trying harder than in the past to propose methods capable of delivering robust, near-optimal solutions across a wider range of structural optimum design problems. Truss structures are one such problem, presenting extremely complex search spaces for the search procedures of metaheuristic algorithms. This paper proposes a method for the optimum design of truss sizing problems. The presented method is tested on six well-known benchmark truss structures (10-bar, 17-bar, 18-bar, 25-bar, 72-bar, and 120-bar), and its results are compared with studies available in the literature. The performance of the presented algorithm proves very acceptable.
ABSTRACT
Based on bee foraging behaviour, the Bees Algorithm (BA) is a metaheuristic optimisation algorithm that has found many applications in both the continuous and combinatorial domains. The original version of the Bees Algorithm has six user-selected parameters: the number of scout bees, the number of high-performing bees, the number of top-performing or "elite" bees, the number of forager bees following the elite bees, the number of forager bees recruited by the other high-performing bees, and the neighbourhood size. These parameters must be chosen with due care, as their values can impact the algorithm's performance, particularly when the problem is complex. However, determining the optimum values for these parameters can be time-consuming for users who are not familiar with the algorithm. This paper presents BA1, a Bees Algorithm with just one parameter. BA1 eliminates the need to specify the numbers of high-performing and elite bees and other associated parameters. Instead, it uses incremental k-means clustering to divide the scout bees into groups. By reducing the required number of parameters, BA1 simplifies the tuning process and increases efficiency. BA1 has been evaluated on 23 benchmark functions in the continuous domain, followed by 12 problems from the TSPLIB in the combinatorial domain. The results show good performance against popular nature-inspired optimisation algorithms on the problems tested.
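The grouping step that replaces BA's hand-tuned site parameters can be illustrated with a minimal one-dimensional k-means over scout-bee fitness values. This is only a sketch of the general idea, not the incremental variant or the authors' implementation; the function name and fitness values are illustrative:

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Cluster scalar fitness values into k groups (plain Lloyd's
    algorithm; BA1 uses an incremental variant)."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            # Assign each value to the nearest centroid.
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            groups[idx].append(v)
        # Recompute centroids; keep the old one if a group is empty.
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return groups

# Scout-bee fitness values: the resulting clusters decide which sites
# receive foragers, replacing the fixed elite/best-site counts of BA.
fitness = [0.1, 0.12, 0.15, 0.9, 0.95, 2.0, 2.1]
groups = kmeans_1d(fitness, k=3)
```

With well-separated fitness values like these, the clusters naturally take over the role of the "elite" and "high-performing" site distinctions.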
ABSTRACT
The integration of IoT systems into automotive vehicles has raised concerns about intrusion detection within these systems. Vehicles equipped with a controller area network (CAN) control several systems whose disruption can lead to significant malfunctions, injuries, and even loss of life. Detecting disruption is a primary concern as vehicles move to higher degrees of autonomy and the possibility of self-driving is explored. Tackling cyber-security challenges within CAN is essential to improve vehicle and road safety. Standard differences between manufacturers make the implementation of a universal system difficult; therefore, data-driven techniques are needed to tackle the ever-evolving landscape of cyber security within the automotive field. This paper examines the possibility of using machine learning classifiers to identify cyberattacks in CAN systems. To achieve applicability, we cover two classifiers: the extreme gradient boosting and k-nearest neighbor algorithms. However, as their performance hinges on proper parameter selection, a modified metaheuristic optimizer is introduced to tackle parameter optimization. The proposed approach is tested on a publicly available dataset, with the best-performing models exceeding 89% accuracy. Optimizer outcomes have undergone rigorous statistical analysis, and the best-performing models were analysed using explainable artificial intelligence techniques to determine feature impacts.
ABSTRACT
Electroencephalography (EEG) has emerged as a primary non-invasive and mobile modality for understanding the complex workings of the human brain, providing invaluable insights into cognitive processes, neurological disorders, and brain-computer interfaces. Nevertheless, the volume of EEG data, the presence of artifacts, the selection of optimal channels, and the need for feature extraction from EEG data present considerable challenges in achieving meaningful and distinguishing outcomes for machine learning algorithms utilized to process EEG data. Consequently, the demand for sophisticated optimization techniques has become imperative to overcome these hurdles effectively. Evolutionary algorithms (EAs) and other nature-inspired metaheuristics have been applied as powerful design and optimization tools in recent years, showcasing their significance in addressing various design and optimization problems relevant to brain EEG-based applications. This paper presents a comprehensive survey highlighting the importance of EAs and other metaheuristics in EEG-based applications. The survey is organized according to the main areas where EAs have been applied, namely artifact mitigation, channel selection, feature extraction, feature selection, and signal classification. Finally, the current challenges and future aspects of EAs in the context of EEG-based applications are discussed.
Subjects
Algorithms, Brain, Electroencephalography, Electroencephalography/methods, Humans, Brain/physiology, Artifacts, Brain-Computer Interfaces, Machine Learning
ABSTRACT
Atherosclerosis causes heart disease by forming plaques in arterial walls. IVUS imaging provides a high-resolution cross-sectional view of coronary arteries and plaque morphology. Healthcare professionals diagnose and quantify atherosclerosis manually or using VH-IVUS software. Since manual or VH-IVUS software-based diagnosis is time-consuming, automated plaque characterization tools are essential for accurate atherosclerosis detection and classification. Recently, deep learning (DL) and computer vision (CV) approaches have emerged as promising tools for automatically classifying plaques in IVUS images. With this motivation, this manuscript proposes an automated atherosclerotic plaque classification method using a hybrid Ant Lion Optimizer with Deep Learning (AAPC-HALODL) technique on IVUS images. The AAPC-HALODL technique uses a faster regional convolutional neural network (Faster RCNN)-based segmentation approach to identify diseased regions in the IVUS images. Next, the ShuffleNet-v2 model generates a useful set of feature vectors from the segmented IVUS images, with its hyperparameters optimally selected using the HALO technique. Finally, an average ensemble classification process comprising a stacked autoencoder (SAE) and a deep extreme learning machine (DELM) model is utilized. The MICCAI Challenge 2011 dataset was used for the AAPC-HALODL simulation analysis. A detailed comparative study showed that the AAPC-HALODL approach outperformed other DL models with a maximum accuracy of 98.33%, precision of 97.87%, sensitivity of 98.33%, and F-score of 98.10%.
ABSTRACT
Classification rule mining is a significant field of machine learning, facilitating informed decision-making through the extraction of meaningful rules from complex data. Many classification methods cannot optimize explainability and several performance metrics at the same time. Metaheuristic optimization-based solutions, inspired by natural phenomena, offer a potential paradigm shift in this field, enabling the development of interpretable and scalable classifiers. In contrast to classical methods, such rule extraction-based solutions can perform classification while taking multiple objectives into consideration simultaneously. To the best of our knowledge, although there are a few studies on metaheuristic-based classification, no existing method optimizes more than three objectives while increasing the explainability and interpretability of the classification task. In this study, data sets are treated as the search space and metaheuristics as the many-objective rule discovery strategy, and a metaheuristic many-objective optimization-based rule extraction approach is proposed for the first time in the literature. Chaos theory is also integrated into the optimization method to increase performance, and the proposed chaotic rule-based SPEA2 algorithm enables the simultaneous optimization of four success metrics and automatic rule extraction. Another distinctive feature of the proposed algorithm is that, in contrast to classical random search methods, it mitigates issues such as correlation and poor uniformity between candidate solutions through a chaotic random search mechanism in the exploration and exploitation phases. The efficacy of the proposed method is evaluated on three distinct data sets, and its performance is demonstrated in comparison with classical machine learning results.
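A chaotic random search typically replaces uniform random draws with a chaotic sequence. The logistic map is a common choice for this purpose; the abstract does not specify which map the authors use, so the sketch below is only an assumed, minimal example:

```python
def logistic_map(x0, n, r=4.0):
    """Generate a chaotic sequence in (0, 1) with the logistic map
    x_{k+1} = r * x_k * (1 - x_k). At r = 4 the map is fully chaotic."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

# Chaotic numbers can stand in for uniform draws when perturbing
# candidate rules, improving coverage of the search space.
chaos = logistic_map(x0=0.7, n=5)
```

Because successive values are deterministic yet non-repeating and weakly correlated, such sequences can mitigate the poor uniformity of plain pseudo-random sampling that the abstract mentions.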
ABSTRACT
Bladder cancer (BC) diagnosis presents a critical challenge in biomedical research, necessitating accurate tumor classification from diverse datasets for effective treatment planning. This paper introduces a novel wrapper feature selection (FS) method that leverages a hybrid optimization algorithm combining Orthogonal Learning (OL) with the rime optimization algorithm (RIME), termed mRIME. The mRIME algorithm is designed to avoid local optima, streamline the search process, and select the most relevant features without compromising classifier performance. It also introduces mRIME-SVM, a novel hybrid model integrating mRIME for FS with a Support Vector Machine (SVM) for classification. The mRIME algorithm is employed as an FS method and is also utilized to fine-tune the hyperparameters of the SVM, enhancing the overall classification accuracy. Specifically, mRIME navigates complex search spaces to optimize FS without compromising classifier performance. Evaluated on eight diverse BC datasets, mRIME-SVM outperforms popular metaheuristic algorithms, ensuring precise and reliable diagnostic outcomes. Moreover, the proposed mRIME was employed to tackle global optimization problems and has been thoroughly assessed using the IEEE Congress on Evolutionary Computation 2022 (CEC'2022) test suite. Comparative analyses with Grey Wolf Optimization (GWO), the Whale Optimization Algorithm (WOA), Harris Hawks Optimization (HHO), Golden Jackal Optimization (GJO), the Hunger Games Search (HGS), the Sinh Cosh Optimizer (SCHO), and the original RIME highlight mRIME's competitiveness and efficacy across diverse optimization tasks. Leveraging mRIME's success, mRIME-SVM achieves high classification accuracy on nine BC datasets, surpassing existing models. The results underscore mRIME's competitiveness and applicability across diverse optimization tasks, extending its utility to enhance BC classification.
This study contributes to advancing BC diagnostics with a robust computational framework, promising broader applications in bioinformatics and AI-driven medical research.
ABSTRACT
Enterprise risk management (ERM) frameworks convey vital principles that help create a consistent risk management culture, irrespective of employee turnover or industry standards. Enterprise Management Systems (EMS) are becoming a popular research area for assuring a company's long-term success. Statistical pattern recognition, federated learning, database administration, visualization technology, and social networking are all used in this field, which spans artificial intelligence (AI), data science, and statistics. Risk assessment in EMS is critical for effective enterprise decision-making. Recent advancements in AI, machine learning (ML), and deep learning (DL) have enabled the development of effective risk assessment models for EMS. This study offers an Improved Metaheuristics with Deep Learning Enabled Risk Assessment Model (IMDLRA-SES) for Smart Enterprise Systems. Using feature selection (FS) and DL models, the IMDLRA-SES technique estimates business risks. Preprocessing is used in the IMDLRA-SES technique to transform the original financial data into a usable format. In addition, an FS technique based on oppositional lion swarm optimization (OLSO) is utilized to find the best subset of features. The presence or absence of financial hazards in firms is then classified using the triple tree seed algorithm (TTSA) with a probabilistic neural network (PNN) model. The TTSA is used as a hyperparameter optimizer to improve the efficiency of the PNN-based categorization. An extensive set of experimental evaluations is performed on the German and Australian credit datasets to illustrate the improved performance of the IMDLRA-SES model. The performance validation of the IMDLRA-SES model showed superior accuracy values of 95.70% and 96.09% over existing techniques.
ABSTRACT
This paper presents two novel bio-inspired particle swarm optimisation (PSO) variants, namely biased eavesdropping PSO (BEPSO) and altruistic heterogeneous PSO (AHPSO). These algorithms are inspired by types of group behaviour found in nature that have not previously been exploited in search algorithms. The primary search behaviour of the BEPSO algorithm is inspired by eavesdropping behaviour observed in nature coupled with a cognitive bias mechanism that enables particles to make decisions on cooperation. The second algorithm, AHPSO, conceptualises particles in the swarm as energy-driven agents with bio-inspired altruistic behaviour, which allows for the formation of lending-borrowing relationships. The mechanisms underlying these algorithms provide new approaches to maintaining swarm diversity, which contributes to the prevention of premature convergence. The new algorithms were tested on the 30, 50 and 100-dimensional CEC'13, CEC'14 and CEC'17 test suites and various constrained real-world optimisation problems, as well as against 13 well-known PSO variants, the CEC competition winner, differential evolution algorithm L-SHADE and the recent bio-inspired I-CPA metaheuristic. The experimental results show that both the BEPSO and AHPSO algorithms provide very competitive performance on the unconstrained test suites and the constrained real-world problems. On the CEC'13 test suite, across all dimensions, both BEPSO and AHPSO performed statistically significantly better than 10 of the 15 comparator algorithms, while none of the remaining 5 algorithms performed significantly better than either BEPSO or AHPSO. On the CEC'17 test suite, on the 50D and 100D problems, both BEPSO and AHPSO performed statistically significantly better than 11 of the 15 comparator algorithms, while none of the remaining 4 algorithms performed significantly better than either BEPSO or AHPSO.
On the constrained problem set, in terms of mean rank across 30 runs on all problems, BEPSO was first, and AHPSO was third.
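Both variants build on the canonical PSO velocity and position update, which for reference can be sketched as follows. The coefficient values are common illustrative defaults, not those used in BEPSO or AHPSO:

```python
import random

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random):
    """One canonical PSO update for a single particle: inertia term,
    cognitive pull toward the personal best, social pull toward the
    global best. BEPSO and AHPSO layer their eavesdropping/altruism
    mechanisms on top of this baseline."""
    new_vel = [w * v
               + c1 * rng.random() * (pb - x)
               + c2 * rng.random() * (gb - x)
               for x, v, pb, gb in zip(pos, vel, pbest, gbest)]
    new_pos = [x + v for x, v in zip(pos, new_vel)]
    return new_pos, new_vel

# One update for a 1-D particle at 0.0 with personal best 2.0 and
# swarm best 4.0.
pos, vel = pso_step([0.0], [1.0], pbest=[2.0], gbest=[4.0])
```

The diversity-preserving mechanisms described in the abstract modify how (and from whom) the `pbest`/`gbest` attractors are chosen, rather than this core update itself.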
ABSTRACT
In multi-mass systems, torsional vibration is a common and annoying phenomenon. Effective vibration suppression and robustness to wide-range parameter variations are essential for a sound motion system. However, most control methods focus on the primary resonance mode, and the high-order resonance modes are not actively treated in the control design, resulting in the control bandwidth not being high enough and limiting the control performance. This paper proposes a novel two-stage design scheme to realize a wideband control to improve control performance. First, a hybrid uncertainty model is tailored for multi-mass systems, which uses an equivalent and uncertain spring constant to describe the variation of the primary mode and a dynamic uncertainty to cover the other resonance modes. This hybrid model strikes a better balance between the model conservatism and the feasibility of a less conservative design. Then, the passivity of the parameter uncertainty is utilized to conduct a phase compensation on the nominal system. After the phase compensation, all uncertainties are converted into norm-bounded ones, and the robust performance design is carried out. This method is applied to vehicle drivetrain benches, and its superiority is validated through simulation comparisons and experiments on two typical types of drivetrain benches.
ABSTRACT
Recent advancements in sensor, communication, and computing technologies facilitate smart grid use. The heavy reliance on developed data and communication technology increases the exposure of smart grids to cyberattacks. Existing mitigation in the electricity grid focuses on protecting primary or redundant measurements. These approaches make certain assumptions regarding false data injection (FDI) attacks, which are inadequate and restrictive for coping with cyberattacks. The reliance on communication technology has emphasized the exposure of power systems to FDI assaults that can bypass the current bad data detection (BDD) mechanism. Current studies on unobservable FDI attacks (FDIA) reveal a severe threat to secure system operation because these attacks can avoid the BDD method. Thus, a data-driven learning-based approach helps detect unobservable FDIAs in distribution systems to mitigate these risks. This study presents a new Hybrid Metaheuristics-based Dimensionality Reduction with Deep Learning for FDIA (HMDR-DLFDIA) detection technique for enhanced network security. The primary objective of the HMDR-DLFDIA technique is to recognize and classify FDIA attacks in distribution systems. In the HMDR-DLFDIA technique, the min-max scaler is first used for data normalization. Besides, a hybrid Harris Hawks optimizer with the sine cosine algorithm (hybrid HHO-SCA) is applied for feature selection. For FDIA detection, the HMDR-DLFDIA technique utilizes the stacked autoencoder (SAE) method. To improve the detection outcomes of the SAE model, the gazelle optimization algorithm (GOA) is exploited. A complete set of experiments was organized to highlight the supremacy of the HMDR-DLFDIA method. The comprehensive result analysis shows that the HMDR-DLFDIA technique performs better than existing DL models.
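The min-max normalization step mentioned above maps each feature column to [0, 1]. A minimal sketch of that step (the pipeline's actual preprocessing details are not given in the abstract):

```python
def min_max_scale(column):
    """Scale a feature column to [0, 1]:
    scaled = (v - min) / (max - min)."""
    lo, hi = min(column), max(column)
    if hi == lo:
        # Constant feature: no spread to normalize, map to 0.0.
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

# Example: raw measurement values from one feature column.
scaled = min_max_scale([10.0, 20.0, 30.0])
```

Normalizing each column this way keeps features on a common scale before the HHO-SCA feature selection and SAE detection stages.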
ABSTRACT
Healthcare processes are complex and involve uncertainties that influence service quality and patient health. Patient transportation takes place between hospitals or between departments within a hospital (inter- or intra-hospital transportation, respectively). The focus of our paper is route planning for transporting patients within the hospital. The route planning task is complex due to multiple factors such as regulations, fairness considerations (i.e., balanced workload among transporters), and other dynamic factors (i.e., transport delays and wait times). Transporters perform the physical transportation of patients within the hospital. In principle, each job allocation respects the transition time between subsequent jobs. The primary objective was to determine the feasible number of transporters and then generate the route plan for all determined transporters by distributing all transport jobs (i.e., from retrospective data) within each shift. Secondary objectives are to minimize the sum of total travel time and total idle time of all transporters and to minimize deviations in total travel time among transporters. Our method uses multi-staged local search metaheuristics to attain the primary objective. The metaheuristics incorporate mixed integer linear programming to fairly allocate the transport jobs by formulating optimization constraints with bounds that satisfy the secondary objectives. The obtained results using the formulated optimization constraints demonstrate better efficacy in multi-objective route planning for intra-hospital transportation of patients.
Subjects
Linear Programming, Transportation of Patients, Humans, Algorithms
ABSTRACT
Bladder Cancer (BC) is a common disease that carries a high risk of morbidity, death, and expense. Primary risk factors for BC include exposure to carcinogens in the workplace or the environment, particularly tobacco. There are several difficulties, such as the requirement for a qualified expert, in BC classification. The Parrot Optimizer (PO) is an optimization method inspired by key behaviors observed in trained Pyrrhura molinae parrots, but the PO algorithm can become stuck in sub-regions of the search space, with lower accuracy and a high error rate. Therefore, an Improved variant of the PO (IPO) algorithm was developed using a combination of two strategies: (1) Mirror Reflection Learning (MRL) and (2) Bernoulli Maps (BMs). Both strategies improve optimization performance by avoiding local optima and striking a compromise between convergence speed and solution diversity. The performance of the proposed IPO is evaluated against eight competitor algorithms in terms of statistical convergence and other metrics according to Friedman's test and the Bonferroni-Dunn test on the IEEE Congress on Evolutionary Computation 2022 (CEC 2022) test suite functions and nine BC datasets from official repositories. The IPO algorithm ranked first in best fitness and is more optimal than the other eight MH algorithms on the CEC 2022 functions. The proposed IPO algorithm was integrated with the Support Vector Machine (SVM) classifier, termed the IPO-SVM approach, for bladder cancer classification purposes. Nine BC datasets were then used to confirm the effectiveness of the proposed IPO algorithm. The experiments show that the IPO-SVM approach outperforms eight recently proposed MH algorithms. Using the nine BC datasets, IPO-SVM achieved an Accuracy (ACC) of 84.11%, Sensitivity (SE) of 98.10%, Precision (PPV) of 95.59%, Specificity (SP) of 95.98%, and F-score (F1) of 94.15%. This demonstrates how the proposed IPO approach can help classify BCs effectively.
The open-source codes are available at https://www.mathworks.com/matlabcentral/fileexchange/169846-an-efficient-improved-parrot-optimizer.
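One of the two strategies named above, the Bernoulli map, generates chaotic numbers that can seed or perturb the population. The sketch below uses one common piecewise-linear formulation of the map; the paper's exact parameters are not given in the abstract:

```python
def bernoulli_map(x0, n, lam=0.4):
    """Bernoulli (shift) map chaotic sequence:
    x_{k+1} = x_k / (1 - lam)            if x_k <= 1 - lam
            = (x_k - (1 - lam)) / lam    otherwise.
    One common formulation; lam and x0 here are illustrative."""
    seq, x = [], x0
    for _ in range(n):
        x = x / (1.0 - lam) if x <= 1.0 - lam else (x - (1.0 - lam)) / lam
        seq.append(x)
    return seq

# Chaotic values in (0, 1) that can replace uniform random draws in
# the optimizer's initialization or perturbation steps.
seq = bernoulli_map(x0=0.3, n=5)
```

Compared with plain pseudo-random draws, such chaotic sequences sweep the unit interval more evenly, which is the diversity benefit the abstract attributes to the BM strategy.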
Subjects
Algorithms, Urinary Bladder Neoplasms, Urinary Bladder Neoplasms/classification, Humans
ABSTRACT
The nurse scheduling problem (NSP) has been a crucial and challenging research issue for hospitals, especially given the serious deterioration of nursing shortages in recent years owing to long working hours, considerable work pressure, and irregular lifestyles in the service industry. This study investigates the NSP, which aims to maximize nurse satisfaction with the generated schedule subject to government laws, internal regulations of hospitals, doctor-nurse pairing rules, and the shift and day-off preferences of nurses. The computational experiment results show that our proposed hybrid metaheuristic outperforms other metaheuristics and manual scheduling in terms of both computation time and solution quality. The presented solution procedure is implemented in a real-world clinic, which is used as a case study. The developed scheduling technique reduced the time spent on scheduling by 93% and increased satisfaction with the schedule by 21%, further enhancing operating efficiency and service quality.
Subjects
Job Satisfaction, Personnel Staffing and Scheduling, Humans, Personnel Staffing and Scheduling/organization & administration, Hospital Nursing Staff/organization & administration, Hospital Nursing Staff/psychology, Organizational Efficiency, Physicians
ABSTRACT
Laryngeal cancer (LC) represents a substantial world health problem, with diminished survival rates attributed to late-stage diagnoses. Correct treatment for LC is complex, particularly in the final stages, as this cancer is a complex malignancy of the head and neck region. Recently, researchers have developed different analysis methods and tools to help medical consultants recognize LC efficiently. However, existing tools and techniques suffer from performance constraints, such as lower accuracy in detecting LC at early stages, additional computational complexity, and long patient-screening times. Deep learning (DL) approaches have been established as effective in the recognition of LC. Therefore, this study develops an efficient LC Detection using Chaotic Metaheuristics Integration with DL (LCD-CMDL) technique. The LCD-CMDL technique mainly focuses on detecting and classifying LC using throat region images. In the LCD-CMDL technique, the contrast enhancement process uses the CLAHE approach. For feature extraction, the LCD-CMDL technique applies the Squeeze-and-Excitation ResNet (SE-ResNet) model to learn complex and intrinsic features from the preprocessed images. Moreover, the hyperparameter tuning of the SE-ResNet approach is performed using a chaotic adaptive sparrow search algorithm (CSSA). Finally, the extreme learning machine (ELM) model is applied to detect and classify the LC. The performance evaluation of the LCD-CMDL approach is conducted on a benchmark throat-region image database. The experimental values imply the superior performance of the LCD-CMDL approach over recent state-of-the-art approaches.
ABSTRACT
The RIME optimization algorithm is a newly developed physics-based algorithm for solving optimization problems. RIME has proved high-performing in various fields and domains. Nevertheless, like many swarm-based optimization algorithms, RIME suffers from several limitations: the exploration-exploitation trade-off is not well balanced, the likelihood of falling into local optima is high, and the convergence speed still needs improvement. Hence, there is room for enhancing the search mechanism so that the search agents can discover new solutions. The authors suggest an adaptive chaotic version of the RIME algorithm, named ACRIME, which incorporates four main improvements: intelligent population initialization using chaotic maps, a novel adaptive modified Symbiotic Organism Search (SOS) mutualism phase, a novel mixed mutation strategy, and a restart strategy. The main goal of these improvements is to improve population variety, achieve a better balance between exploration and exploitation, and improve RIME's local and global search abilities. The study assesses the effectiveness of ACRIME using the standard benchmark functions of the CEC2005 and CEC2019 benchmarks. The proposed ACRIME is also applied as a feature selector to fourteen datasets to test its applicability to real-world problems. Besides, the ACRIME algorithm is applied to the real-world COVID-19 classification problem to further test its applicability and performance. The suggested algorithm is compared to other sophisticated classical and advanced metaheuristics, and its performance is assessed using statistical tests such as the Wilcoxon rank-sum and Friedman rank tests. The study demonstrates that ACRIME exhibits a high level of competitiveness and often outperforms competing algorithms.
It discovers the optimal subset of features, enhancing the accuracy of classification and minimizing the number of features employed. This study primarily focuses on enhancing the equilibrium between exploration and exploitation, extending the scope of local search.
Subjects
Algorithms, Humans, COVID-19
ABSTRACT
Despite the ability of Low-Power Wide-Area Networks (LPWANs) to offer extended range, they encounter challenges with coverage blind spots in the network. This article proposes an innovative energy-efficient and nature-inspired relay selection algorithm for LoRa-based LPWAN networks, serving as a solution for challenges related to poor signal range in areas with limited coverage. A swarm behavior-inspired approach is utilized to select the relays' localization in the network, providing network energy efficiency and radio signal extension. These relays help to bridge communication gaps, significantly reducing the impact of coverage blind spots by forwarding signals from devices with poor direct connectivity with the gateway. The proposed algorithm considers critical factors for the LoRa standard, such as the Spreading Factor and device energy budget analysis. Simulation experiments validate the proposed scheme's effectiveness in terms of energy efficiency under diverse multi-gateway (up to six gateways) network topology scenarios involving thousands of devices (1000-1500). Specifically, it is verified that the proposed approach outperforms a reference method in preventing battery depletion of the relays, which is vital for battery-powered IoT devices. Furthermore, the proposed heuristic method achieves over twice the speed of the exact method for some large-scale problems, with a negligible accuracy loss of less than 2%.
ABSTRACT
This research suggests a robust integration of artificial neural networks (ANN) for predicting swell pressure and the unconfined compression strength of expansive soils (PsUCS-ES). Four novel ANN-based models, namely ANN-PSO (i.e., particle swarm optimization), ANN-GWO (i.e., grey wolf optimization), ANN-SMA (i.e., slime mould algorithm) alongside ANN-MPA (i.e., marine predators' algorithm) were deployed to assess the PsUCS-ES. The models were trained using the nine most influential parameters affecting PsUCS-ES, collected from a broader range of 145 published papers. The observed results were compared with the predictions made by the ANN-based metaheuristics models. The efficacy of all these formulated models was evaluated by utilizing mean absolute error (MAE), Nash-Sutcliffe (NS) efficiency, performance index ρ, regression coefficient (R2), root mean square error (RMSE), ratio of RMSE to standard deviation of actual observations (RSR), variance account for (VAF), Willmott's index of agreement (WI), and weighted mean absolute percentage error (WMAPE). All the developed models for Ps-ES had an R significantly > 0.8 for the overall dataset. However, ANN-MPA excelled in yielding high R values for the training dataset (TrD), testing dataset (TsD), and validation dataset (VdD). This model also exhibited the lowest MAE of 5.63%, 5.68%, and 5.48% for TrD, TsD, and VdD, respectively. The results of the UCS model's performance revealed that R exceeded 0.9 in the TrD. However, R decreased for TsD and VdD. Also, the ANN-MPA model yielded higher R values (0.89, 0.93, and 0.94) and comparatively low MAE values (5.11%, 5.67%, and 3.61%) in the case of PSO, GWO, and SMA, respectively. The UCS models witnessed an overfitting problem because the aforementioned R values of the metaheuristics were 0.62, 0.56, and 0.58 (TsD), respectively. On the contrary, no significant observation was recorded in the VdD of UCS models. All the ANN-based models were also tested using the a-20 index.
For all the formulated models, most points were found to lie within the ±20% error range. The results of the sensitivity and monotonicity analyses depicted trends that corroborate the existing literature. Therefore, it can be inferred that the recently built swarm-based ANN models, particularly ANN-MPA, can resolve the complexity of tuning the hyperparameters of the ANN-predicted PsUCS-ES and can be replicated in practical scenarios of geoenvironmental engineering.
ABSTRACT
This paper applies six metaheuristic algorithms, namely Fast Cuckoo Search (FCS), the Salp Swarm Algorithm (SSA), Dynamic Control Cuckoo Search (DCCS), the Gradient-Based Optimizer (GBO), Northern Goshawk Optimization (NGO), and the Opposition Flow Direction Algorithm (OFDA), to efficiently solve the optimal power flow (OPF) problem. Under standard and conservative operating settings, the OPF problem is modeled using a range of objectives, constraints, and formulations. Five case studies have been conducted using the IEEE 30-bus and IEEE 118-bus standard test systems to evaluate the effectiveness and robustness of the proposed algorithms. A performance evaluation procedure is suggested to compare the strength and resilience of the optimization techniques. A new comparison methodology is created to compare the proposed methodologies with other well-known ones. Compared to previously reported optimization algorithms in the literature, the obtained results show the potential of GBO to solve various OPF problems efficiently.
ABSTRACT
As a preprocessing step for machine learning and data mining, feature selection plays an important role. Feature selection aims to streamline high-dimensional data by eliminating irrelevant and redundant features, reducing the potential curse of dimensionality in a given large dataset. When working with datasets containing many features, algorithms that aim to identify the most valuable features to improve dataset accuracy may encounter difficulties because of local optima. Many studies have been conducted to solve this problem, and one solution is to use metaheuristic techniques. This paper presents a combination of the differential evolution and sailfish optimizer algorithms (DESFO) to tackle the feature selection problem. To assess the effectiveness of the proposed algorithm, a comparison between differential evolution, the sailfish optimizer, and nine other modern algorithms is presented. The evaluation used random forest and k-nearest neighbors classifiers as quality measures. The experimental results show that the proposed algorithm is superior to the others: it achieved the highest classification accuracy on 85.7% of the 14 multi-scale benchmarks with the random forest classifier and on 100% with the k-nearest neighbors classifier. According to fitness values, it achieved the best results on 71% of the benchmarks with the random forest classifier and on 85.7% with the k-nearest neighbors classifier.
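The differential evolution half of such a hybrid typically follows the classic DE/rand/1/bin scheme, sketched below as an assumed baseline; the sailfish optimizer phase and the paper's exact DE variant are omitted:

```python
import random

def de_rand_1_bin(pop, i, f=0.8, cr=0.9, rng=random):
    """Build a DE/rand/1/bin trial vector for individual i:
    mutate three distinct random individuals (a + F * (b - c)) and
    binomially cross the mutant with the target vector."""
    a, b, c = rng.sample([j for j in range(len(pop)) if j != i], 3)
    dims = len(pop[i])
    j_rand = rng.randrange(dims)  # guarantees at least one mutated gene
    trial = []
    for j in range(dims):
        if rng.random() < cr or j == j_rand:
            trial.append(pop[a][j] + f * (pop[b][j] - pop[c][j]))
        else:
            trial.append(pop[i][j])
    return trial

# A tiny population of 2-D candidates; for feature selection the
# continuous trial vector would then be thresholded into a 0/1 mask.
pop = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
trial = de_rand_1_bin(pop, i=0)
```

In a wrapper FS setting, each trial vector is scored by the classifier's accuracy (here, random forest or k-nearest neighbors) and replaces the target vector only if it scores better.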