Results 1 - 20 of 16,054
1.
J Neural Eng ; 21(4)2024 Jul 26.
Article in English | MEDLINE | ID: mdl-38981500

ABSTRACT

Objective. To evaluate the inter- and intra-rater reliability of bad-channel identification among neurologists, EEG technologists, and naïve research personnel, and to compare their performance with the automated bad channel detection (ABCD) algorithm. Approach. Six neurologists, ten EEG technologists, and six naïve research personnel (22 raters in total) were asked to rate 1440 real intracranial EEG channels as good or bad. Intra- and inter-rater kappa statistics were calculated for each group. We then compared each group to the ABCD algorithm, which uses spectral- and temporal-domain features to classify channels as good or bad. Main results. Analysis of channel ratings from our participants revealed variable intra-rater reliability within each group, with no significant differences across groups. Inter-rater reliability was moderate among neurologists and EEG technologists but minimal among naïve participants. Neurologists demonstrated slightly higher consistency in ratings than EEG technologists. Both groups occasionally misclassified flat channels as good, and participants generally focused on low-frequency content for their assessments. The ABCD algorithm, in contrast, relied more on high-frequency content. A logistic regression model showed a linear relationship between the algorithm's ratings and user responses for predominantly good channels, but less so for channels rated as bad. Sensitivity and specificity analyses further highlighted differences in rating patterns among the groups, with neurologists showing higher sensitivity and naïve personnel higher specificity. Significance. Our study reveals the bias in human assessments of intracranial electroencephalography (iEEG) data quality and the tendency of even experienced professionals to overlook certain bad channels, highlighting the need for standardized, unbiased methods. The ABCD algorithm, outperforming human raters, suggests the potential of automated solutions for more reliable iEEG interpretation and seizure characterization, free from human biases.
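The intra- and inter-rater kappa statistics reported above are standard chance-corrected agreement measures; a minimal Cohen's kappa sketch for two raters labelling channels (the labels below are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same channels good/bad."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of channels both raters labelled the same.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independent marginal label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

a = ["good", "good", "bad", "good", "bad", "good"]
b = ["good", "bad", "bad", "good", "bad", "good"]
print(round(cohens_kappa(a, b), 3))
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance, which is why it is preferred over raw percent agreement for this kind of rating study.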


Subjects
Algorithms; Humans; Reproducibility of Results; Observer Variation; Electrocorticography/methods; Electrocorticography/standards; Electroencephalography/methods; Electroencephalography/standards; Neurologists/statistics & numerical data; Neurologists/standards
2.
ACS Appl Mater Interfaces ; 16(29): 38466-38477, 2024 Jul 24.
Article in English | MEDLINE | ID: mdl-38995996

ABSTRACT

Prolonged sitting can easily result in pressure injury (PI) for people who have had strokes or spinal cord injuries. Few methods are available for tracking contact-surface pressure and shear force to evaluate PI risk. Here, we propose a smart cushion that uses two-dimensional force sensors (2D-FSs) to measure the pressure and shear force on the buttocks; a machine learning algorithm then computes the shear stresses in the gluteal muscles, which helps determine the PI risk. The 2D-FS consists of a ferroelectret coaxial sensor (FCS) unit placed atop a ferroelectret film sensor (FFS) unit, allowing it to detect vertical and horizontal forces simultaneously. Two experimental approaches are applied for characterization and calibration: one applies two perpendicular forces simultaneously, and the other applies a single force. To separate the two forces, the 2D-FS is decoupled using a deep neural network. Multiple FCSs are embedded to form a smart cushion, and a genetic-algorithm-optimized backpropagation neural network is proposed and trained to predict the shear strain in the buttocks to prevent PI. By tracking PI risk, the smart cushion based on 2D-FSs may be further connected with home-based intelligent care platforms to improve care quality for spinal cord injury patients and lower the cost of nursing or rehabilitation care.
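The paper decouples the two force components with a deep neural network; as a simplified illustration of the same decoupling idea, here is a purely linear sketch that inverts an assumed 2x2 cross-talk matrix (the matrix values are hypothetical calibration results, not the paper's):

```python
# Hypothetical linear cross-talk model: each sensor output mixes the true
# vertical (Fv) and horizontal (Fh) forces; decoupling inverts the 2x2 matrix.
def decouple(v_out, h_out, m):
    """Recover (Fv, Fh) from two sensor readings given mixing matrix m."""
    (a, b), (c, d) = m
    det = a * d - b * c
    fv = (d * v_out - b * h_out) / det
    fh = (a * h_out - c * v_out) / det
    return fv, fh

# Assumed calibration: v_out = 1.0*Fv + 0.1*Fh, h_out = 0.2*Fv + 0.9*Fh
M = ((1.0, 0.1), (0.2, 0.9))
fv, fh = decouple(5.1, 1.9, M)  # readings produced by Fv=5, Fh=1
print(fv, fh)
```

A real sensor pair is nonlinear, which is why the paper trains a network instead; the linear inverse only shows why two coupled readings suffice to recover two forces.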


Subjects
Machine Learning; Pressure Ulcer; Pressure Ulcer/prevention & control; Humans; Buttocks; Risk Assessment; Pressure; Neural Networks, Computer; Algorithms
3.
Biosystems ; 243: 105271, 2024 Jul 20.
Article in English | MEDLINE | ID: mdl-39038529

ABSTRACT

At any moment in time, evolution is faced with a formidable challenge: refining the already highly optimised design of biological species, a feat accomplished through all preceding generations. In such a scenario, the impact of random changes (the method employed by evolution) is much more likely to be harmful than advantageous, potentially lowering the reproductive fitness of the affected individuals. Our hypothesis is that ageing is, at least in part, caused by the cumulative effect of all the experiments carried out by evolution to improve a species' design. These experiments are almost always unsuccessful, as expected given their pseudorandom nature, cause harm to the body and ultimately lead to death. This hypothesis is consistent with the concept of "terminal addition", by which nature is biased towards adding innovations at the end of development. From the perspective of evolution as an optimisation algorithm, ageing is advantageous as it allows innovations to be tested during a phase when their impact on fitness is present but less pronounced. Our inference suggests that ageing has a key biological role, as it contributes to the system's evolvability by exerting a regularisation effect on the fitness landscape of evolution.

4.
Stat Med ; 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39044448

ABSTRACT

Logistic regression models are widely used in case-control data analysis, and testing the goodness-of-fit of their parametric model assumption is a fundamental research problem. In this article, we propose to enhance the power of the goodness-of-fit test by exploiting a monotonic density ratio model, in which the ratio of case and control densities is assumed to be a monotone function. We show that such a monotonic density ratio model is naturally induced by the retrospective case-control sampling design under the alternative hypothesis. The pool-adjacent-violator algorithm is adapted to solve for the constrained nonparametric maximum likelihood estimator under the alternative hypothesis. By measuring the discrepancy between this estimator and the semiparametric maximum likelihood estimator under the null hypothesis, we develop a new Kolmogorov-Smirnov-type statistic to test the goodness-of-fit for logistic regression models with case-control data. A bootstrap resampling procedure is suggested to approximate the p-value of the proposed test. Simulation results show that the type I error of the proposed test is well controlled and the power improvement is substantial in many cases. Three real data applications are also included for illustration.
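The pool-adjacent-violators algorithm mentioned above computes the monotone (here non-decreasing) least-squares fit; a minimal sketch:

```python
def pava(y, w=None):
    """Pool-adjacent-violators: least-squares fit that is non-decreasing."""
    w = w or [1.0] * len(y)
    # Each block holds [weighted mean, total weight, count of pooled points].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge backwards while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    fit = []
    for mean, _, count in blocks:
        fit.extend([mean] * count)
    return fit

print(pava([1.0, 3.0, 2.0, 4.0]))  # -> [1.0, 2.5, 2.5, 4.0]
```

In the paper's setting the monotone quantity is the estimated case/control density ratio; here plain numbers stand in for it.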

5.
Scand J Gastroenterol ; : 1-6, 2024 Jul 26.
Article in English | MEDLINE | ID: mdl-39061129

ABSTRACT

OBJECTIVES: In patients evaluated for hepatocellular carcinoma (HCC), magnetic resonance imaging (MRI) is often used secondarily when multiphase contrast-enhanced computed tomography (ceCT) is inconclusive. We investigated the clinical impact of adding MRI. MATERIALS AND METHODS: This single-institution retrospective study included 48 MRI scans (44 patients) conducted from May 2016 to July 2023 due to suspicion of HCC on a multiphase ceCT scan. Data included medical history, preceding and subsequent imaging, histology when available, and decisions made at multidisciplinary team meetings. RESULTS: In cases of possible HCC recurrence, 63% of the MRI scans were diagnostic of HCC. For 80% of the negative MRI scans, the patients were diagnosed with HCC in the suspicious area of the liver within a median of 165 days. In cases of possible de-novo HCC in patients with cirrhosis, 22% of the scans were diagnostic of HCC, and 33% of the negative MRI scans were of patients diagnosed with HCC within a median of 109 days. None of the non-cirrhotic patients with possible de-novo HCC and negative MRI scans (64%) were later diagnosed with HCC, but 3/5 of the indeterminate scans were of patients diagnosed with HCC on biopsy. CONCLUSIONS: MRI secondary to a multiphase ceCT scan suspicious of HCC is highly valuable for ruling out HCC in non-cirrhotic patients and for diagnosing HCC non-invasively in cirrhotic patients and patients with prior HCC. Patients with cirrhosis or prior HCC remain at high risk of HCC if MRI results are inconclusive or negative.

6.
Sci Rep ; 14(1): 16776, 2024 Jul 22.
Article in English | MEDLINE | ID: mdl-39039187

ABSTRACT

There is a complex, high-dimensional nonlinear mapping between the compressive strength of High-Performance Concrete (HPC) and its components, which greatly affects accurate prediction of compressive strength. In this paper, an efficient and robust computational strategy combining a BP Neural Network (BPNN), Support Vector Machine (SVM), and Genetic Algorithm (GA) is proposed for predicting the compressive strength of HPC. Eight features were extracted from the previous literature, and a compressive strength database containing 454 sets of data was constructed. The models were trained and tested, and the performance of four Machine Learning (ML) models, namely BPNN, SVM, GA-BPNN, and GA-SVM, was compared. The results show that the coupled models are superior to the single models. Moreover, because GA-SVM has better generalization ability and theoretical grounding, its convergence speed and prediction accuracy are better than those of GA-BPNN. Grey Relational Analysis (GRA) and Shapley analysis were then used to verify the interpretability of the GA-SVM model, showing that the water-binder ratio had the most significant influence on compressive strength. Finally, evaluating compressive strength from combinations of multiple input variables supplemented this research and again verified the significant influence of the water-binder ratio, providing a reference for subsequent research.
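A real-coded genetic algorithm of the kind used here to tune BPNN/SVM hyperparameters can be sketched as follows; the quadratic objective stands in for a model's cross-validation error and is purely illustrative:

```python
import random

def genetic_search(objective, bounds, pop_size=30, generations=60, seed=1):
    """Minimise `objective` over box `bounds` with a tiny real-coded GA."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = (a + b) / 2                      # arithmetic crossover
            child += rng.gauss(0, (hi - lo) * 0.02)  # Gaussian mutation
            children.append(min(hi, max(lo, child)))
        pop = elite + children
    return min(pop, key=objective)

# Illustrative stand-in for, e.g., an SVM penalty-parameter search.
best = genetic_search(lambda c: (c - 3.7) ** 2, bounds=(0.0, 10.0))
print(best)
```

In a GA-SVM pipeline the objective would wrap an SVM training/validation run, and the chromosome would carry several hyperparameters rather than one scalar.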

7.
J Chromatogr A ; 1730: 465133, 2024 Aug 16.
Article in English | MEDLINE | ID: mdl-38996515

ABSTRACT

The use of a ternary mobile-phase system comprising ammonium sulphate, sodium chloride, and phosphate buffer was explored to tune retention and enhance selectivity in hydrophobic interaction chromatography. The accuracy of the linear solvent-strength model to predict protein retention with the ternary mobile-phase system based on isocratic scouting runs is limited, as the extrapolated retention factor at aqueous buffer conditions (k0) cannot be reliably established. The Jandera retention model utilizing a salt concentration averaged retention factor (k¯0) in aqueous buffer for ternary systems overcomes this bottleneck. Gradient retention factors were derived based on isocratic scouting runs after numerical integration of the isocratic Jandera model, leading to retention-time prediction errors below 11 % for linear gradients. Furthermore, an analytical expression was formulated to predict HIC retention for both linear and segmented linear gradients, considering the linear solvent-strength (LSS) model within ternary salt systems, relying on a fixed k0. The approach involved conducting two gradient scouting runs for each of the two binary salt systems to determine model parameters. Retention-time prediction errors for linear gradients were below 12 % for lysozyme and 3 % for trypsinogen and α-chymotrypsinogen A. Finally, the analytical expression for a ternary mobile-phase system was used in combination with a genetic algorithm to tune the HIC selectivity. With an optimized segmented ternary gradient, a critical-pair separation for a mixture of 7 proteins was achieved within 15 min with retention-time prediction errors ranging between 0.7 and 15.7 %.
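Gradient retention prediction by numerical integration of an isocratic model, as described above, can be sketched generically with the LSS form k(phi) = k0 * exp(-S * phi); the parameters and gradient profile below are illustrative, not the paper's fitted Jandera model:

```python
import math

def gradient_elution_time(k0, S, phi_of_t, t0, dt=1e-3, t_max=1000.0):
    """Integrate dt / k(phi(t)) until the solute elutes.

    LSS-style model: k(phi) = k0 * exp(-S * phi); the band elutes when
    the accumulated integral of dt / k reaches the column dead time t0.
    """
    acc, t = 0.0, 0.0
    while acc < t0 and t < t_max:
        k = k0 * math.exp(-S * phi_of_t(t))
        acc += dt / k
        t += dt
    return t

# Linear gradient from phi=0 to phi=1 over 20 min (illustrative values).
grad = lambda t: min(t / 20.0, 1.0)
tr = gradient_elution_time(k0=50.0, S=8.0, phi_of_t=grad, t0=1.0)
print(round(tr, 2))
```

The same integration loop handles segmented gradients by making `phi_of_t` piecewise linear, which is essentially how gradient retention factors are derived from isocratic scouting-run parameters.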


Subjects
Ammonium Sulfate; Hydrophobic and Hydrophilic Interactions; Muramidase; Muramidase/chemistry; Muramidase/analysis; Ammonium Sulfate/chemistry; Sodium Chloride/chemistry; Chromatography, Liquid/methods; Algorithms; Buffers; Phosphates/chemistry; Phosphates/analysis; Chymotrypsinogen/chemistry; Models, Chemical
8.
Technol Health Care ; 2024 Jul 13.
Article in English | MEDLINE | ID: mdl-39058460

ABSTRACT

BACKGROUND: Healthcare is crucial to patient care because it provides vital services for maintaining and restoring health. As healthcare technology evolves, cutting-edge tools facilitate faster diagnosis and more effective patient treatment. In the present age of pandemics, the Internet of Things (IoT) offers a potential solution to the problem of patient safety monitoring by creating a massive quantity of data about the patient through the linked devices around them and then analyzing it to estimate the patient's current status. Utilizing an IoT-based meta-heuristic algorithm allows patients to be monitored remotely, resulting in timely diagnosis and improved care. Meta-heuristic algorithms are successful, resilient, and effective at solving real-world optimization, clustering, prediction, and classification problems. Healthcare organizations need an efficient method for dealing with big data, since the prevalence of such data makes analysis for diagnosis challenging. Current techniques used in medical diagnostics are limited by imbalanced data and overfitting. OBJECTIVE: This study introduces particle swarm optimization and a convolutional neural network as a meta-heuristic optimization method for extensive data analysis in the IoT to monitor patients' health conditions. METHOD: Particle Swarm Optimization is used to optimize the data used in the study. Information for a diabetes diagnosis model that includes cardiac risk forecasting is collected. The combined Particle Swarm Optimization and Convolutional Neural Network (PSO-CNN) model effectively predicts illness. A Support Vector Machine is used to predict the possibility of a heart attack by classifying the collected data into projected abnormal and normal diabetes ranges. RESULTS: The simulations reveal that the PSO-CNN model used to predict diabetic disease achieved an accuracy of 92.6%, a precision of 92.5%, a recall of 93.2%, an F1-score of 94.2%, and a quantization error of 4.1%. CONCLUSION: The suggested approach could also be applied to identify cancer cells.
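The PSO half of the PSO-CNN pipeline can be sketched on a toy objective (the CNN and SVM stages, and the medical data, are not reproduced here):

```python
import random

def pso(objective, bounds, n_particles=20, iters=80, seed=7):
    """Minimal particle swarm optimisation over a 1-D box."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                        # personal bests
    gbest = min(xs, key=objective)       # global best
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if objective(xs[i]) < objective(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=objective)
    return gbest

best = pso(lambda x: (x - 2.0) ** 2 + 1.0, bounds=(-10.0, 10.0))
print(best)
```

In the paper's pipeline the particles would encode preprocessing or network parameters and the objective would be the classifier's validation error.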

9.
Cancer Treat Rev ; 129: 102803, 2024 Jul 14.
Article in English | MEDLINE | ID: mdl-39029154

ABSTRACT

This review presents a comprehensive comparative analysis of international guidelines for managing advanced, non-functioning, well-differentiated pancreatic neuroendocrine tumors (panNETs). PanNETs, which represent a significant proportion of pancreatic neuroendocrine neoplasms, exhibit diverse clinical behaviors and prognoses based on differentiation, grading, and other molecular markers. The varying therapeutic strategies proposed by different guidelines reflect their distinct emphases and regional considerations, such as the ESMO guideline's focus on advanced disease management and the ENETS guidance paper's multidisciplinary approach. This review examines the most recent guidelines from ESMO, NCCN, ASCO, ENETS, and NANETS, analyzing the recommendations for first-line therapies and subsequent treatment pathways in different clinical scenarios. Significant variations are observed in the recommendations, particularly concerning the choice and sequence of systemic therapies, the role of tumor grading and the Ki-67 index in therapeutic decisions, and the integration of regional regulatory and clinical practices. The analysis highlights the need for a tailored approach to managing advanced NF panNETs, advocating for flexibility in applying guidelines to account for individual patient circumstances and the evolving evidence base. This work underscores the complexities of managing this patient population and the critical role of a multidisciplinary team in optimizing treatment outcomes.

10.
Sci Rep ; 14(1): 16892, 2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39043713

ABSTRACT

In this paper, a novel Moth-Flame Optimization (MFO) algorithm, namely an MFO algorithm enhanced by Multiple Improvement Strategies (MISMFO), is proposed for parameter optimization in the Multi-Kernel Support Vector Regressor (MKSVR), and the MISMFO-MKSVR model is further employed for software effort estimation problems. In MISMFO, logistic chaotic mapping is applied to increase initial population diversity, mutation and phased flame-number reduction mechanisms are carried out to improve search efficiency, and an adaptive weight adjustment mechanism is used to accelerate convergence and balance exploration and exploitation. The MISMFO model is verified on fifteen benchmark functions and the CEC 2020 test set. The results show that MISMFO has advantages over other meta-heuristic algorithms and MFO variants in terms of convergence speed and accuracy. Additionally, the MISMFO-MKSVR model is tested by simulations on five software effort datasets, and the results demonstrate that the proposed model performs better on the software effort estimation problem. The Matlab code of MISMFO can be found at https://github.com/loadstar1997/MISMFO .
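The logistic chaotic mapping used above to diversify the initial population iterates x <- mu * x * (1 - x); a minimal initialization sketch with illustrative bounds and population size:

```python
def logistic_chaotic_population(pop_size, dim, lo, hi, x0=0.7, mu=4.0):
    """Initialise a population from logistic-map iterates x <- mu*x*(1-x).

    With mu = 4 the map is chaotic on (0, 1), so successive iterates
    spread over the interval instead of clustering like a fixed seed.
    """
    x = x0
    population = []
    for _ in range(pop_size):
        individual = []
        for _ in range(dim):
            x = mu * x * (1.0 - x)
            individual.append(lo + (hi - lo) * x)  # rescale to [lo, hi]
        population.append(individual)
    return population

pop = logistic_chaotic_population(pop_size=5, dim=3, lo=-2.0, hi=2.0)
for ind in pop:
    print([round(v, 3) for v in ind])
```

The starting value x0 must avoid the map's fixed points (0, 0.5, 0.75) for the sequence to stay chaotic; 0.7 is an arbitrary safe choice.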

11.
Biomimetics (Basel) ; 9(7)2024 Jul 01.
Article in English | MEDLINE | ID: mdl-39056840

ABSTRACT

The recently introduced coati optimization algorithm (COA) suffers from drawbacks such as slow search velocity and weak optimization precision. An enhanced coati optimization algorithm called CMRLCCOA is proposed. First, the sine chaotic mapping function is used to initialize the CMRLCCOA to obtain better-quality coati populations and increase population diversity. Second, the generated candidate solutions are updated again using the convex-lens-imaging reverse learning strategy to expand the search range. Third, the Lévy flight strategy increases the search step size, expands the search range, and avoids premature convergence. Finally, the crossover strategy effectively reduces search blind spots, keeping the search particles close to the global optimum. The four strategies work together to enhance the efficiency of COA and to boost its precision and stability. The performance of CMRLCCOA is evaluated on CEC2017 and CEC2019. The superiority of CMRLCCOA is comprehensively demonstrated by comparing its output with previously published algorithms. Besides iterative convergence curves, boxplots and a nonparametric statistical analysis illustrate that CMRLCCOA is competitive, significantly improves convergence accuracy, and avoids local optima well. Finally, the performance and usefulness of CMRLCCOA are demonstrated through three engineering application problems. A mathematical model of the hypersonic vehicle cruise trajectory optimization problem is developed; CMRLCCOA yields a smaller objective value than the comparison algorithms and obtains the shortest path length for this problem.
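The Lévy flight strategy draws heavy-tailed step sizes; a standard Mantegna-style sketch (beta = 1.5 is the usual textbook choice, not necessarily the paper's setting):

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """Draw one Levy-distributed step via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
               ) ** (1 / beta)
    u = rng.gauss(0, sigma_u)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

rng = random.Random(0)
steps = [levy_step(rng=rng) for _ in range(10000)]
# Heavy tail: most steps are small, with occasional very large jumps.
print(max(abs(s) for s in steps))
```

The rare long jumps are what let a Lévy-flight update escape local optima, while the many short steps still refine the current region.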

12.
Biomimetics (Basel) ; 9(7)2024 Jul 06.
Article in English | MEDLINE | ID: mdl-39056853

ABSTRACT

In complex traffic environments, targets in 3D tracking and detection are often occluded by various stationary and moving objects. When a target is occluded, its apparent characteristics change, reducing tracking and detection accuracy. To solve this problem, we propose to learn vehicle behavior from driving data, predict and calibrate the vehicle trajectory, and finally use the artificial fish swarm algorithm to optimize the tracking results. Experiments show that, compared with the CenterTrack method, the proposed method improves the key MOTA (Multi-Object Tracking Accuracy) metric for 3D object detection and tracking on the nuScenes dataset, and runs at 26 fps.

13.
Biomimetics (Basel) ; 9(7)2024 Jul 07.
Article in English | MEDLINE | ID: mdl-39056858

ABSTRACT

The flying foxes optimization (FFO) algorithm, inspired by the strategy flying foxes use to survive heat-wave environments, has shown good performance in the single-objective domain. To explore the effectiveness and benefits of this subsistence strategy for optimization challenges involving multiple objectives, this research proposes a decomposition-based multi-objective flying foxes optimization algorithm (MOEA/D-FFO). It exhibits a strong population management strategy, with the following main features. (1) To improve exploration effectiveness, a new offspring generation mechanism is introduced to increase the efficiency with which flying fox populations explore peripheral space. (2) A new population updating approach is proposed that applies the new offspring to the neighbor matrices of the corresponding flying fox individuals, with the aim of enhancing the population's convergence rate. In comparison experiments with classical algorithms (MOEA/D, NSGA-II, IBEA) and cutting-edge algorithms (MOEA/D-DYTS, MOEA/D-UR), MOEA/D-FFO achieves the best result in more than 11 cases. In addition, experimental results under different population sizes show that the proposed algorithm is highly adaptable and has good application prospects in optimization problems for engineering applications.
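In MOEA/D-style decomposition, each weight vector turns the multi-objective problem into one scalar subproblem; a minimal Tchebycheff scalarisation sketch (the numbers are illustrative):

```python
def tchebycheff(objectives, weights, ideal):
    """Tchebycheff scalarisation used by MOEA/D-style decomposition.

    Each weight vector defines one single-objective subproblem:
    minimise the worst weighted distance to the ideal point.
    """
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

ideal = (0.0, 0.0)
# Two candidate solutions scored on the subproblem with weights (0.5, 0.5):
a = tchebycheff((1.0, 3.0), (0.5, 0.5), ideal)  # -> 1.5
b = tchebycheff((2.0, 2.0), (0.5, 0.5), ideal)  # -> 1.0
print(a, b)
```

Neighbouring weight vectors define similar subproblems, which is why MOEA/D variants (including MOEA/D-FFO's update) share offspring among neighbour matrices.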

14.
Biomimetics (Basel) ; 9(7)2024 Jul 08.
Article in English | MEDLINE | ID: mdl-39056860

ABSTRACT

The football team training algorithm (FTTA) is a new metaheuristic algorithm proposed in 2024. The FTTA performs well but faces challenges such as poor convergence accuracy and ease of falling into local optima, owing to limitations such as relying too heavily on the optimal individual for updating and insufficiently perturbing the optimal agent. To address these concerns, this paper presents an improved football team training algorithm called IFTTA. To enhance exploration in the collective training phase, this paper proposes a fitness-distance-balanced collective training strategy. This enables the players to train more rationally in the collective training phase and balances the exploration and exploitation capabilities of the algorithm. To further perturb the optimal agent in FTTA, a non-monopoly extra training strategy is designed to enhance the ability to escape local optima. In addition, a population restart strategy is designed to boost the convergence accuracy and population diversity of the algorithm. In this paper, we validate the performance of IFTTA and FTTA as well as six comparison algorithms on the CEC2017 test suite. The experimental results show that IFTTA has strong optimization performance. Moreover, several engineering-constrained optimization problems confirm the potential of IFTTA to solve real-world optimization problems.
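A fitness-distance-balanced strategy scores candidates by combining normalised fitness with distance from the current best; the sketch below uses a common FDB-style formulation, which may differ in detail from the paper's:

```python
import math

def fdb_select(population, fitnesses):
    """Pick the individual with the best fitness-distance balance.

    Score = normalised fitness quality + normalised distance from the
    current best, so good-but-distant individuals guide exploration.
    """
    best = population[fitnesses.index(min(fitnesses))]  # minimisation
    dists = [math.dist(ind, best) for ind in population]
    f_lo, f_hi = min(fitnesses), max(fitnesses)
    d_hi = max(dists) or 1.0
    scores = [
        (f_hi - f) / (f_hi - f_lo or 1.0) + d / d_hi
        for f, d in zip(fitnesses, dists)
    ]
    return population[scores.index(max(scores))]

pop = [(0.0, 0.0), (0.1, 0.1), (3.0, 3.0), (5.0, 5.0)]
fit = [1.0, 1.2, 1.3, 9.0]  # lower is better
print(fdb_select(pop, fit))
```

Here the selected guide is (3.0, 3.0): nearly as fit as the best but far from it, which is exactly the trade-off FDB selection is designed to reward.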

15.
Biomimetics (Basel) ; 9(7)2024 Jul 17.
Article in English | MEDLINE | ID: mdl-39056879

ABSTRACT

This paper aims to solve the multi-objective operating planning problem in a radioactive environment. First, a more complicated radiation dose model is constructed, considering the difficulty level at each operating point. Based on this model, the multi-objective operating planning problem is converted to a variant traveling salesman problem (VTSP). Second, to address this issue, a novel combinatorial algorithm framework, the hyper-parameter adaptive genetic algorithm (HPAGA), integrating bio-inspired optimization with reinforcement learning, is proposed; it adaptively adjusts the hyperparameters of the GA so as to obtain optimal solutions efficiently. Third, comparative studies demonstrate the superior performance of the proposed HPAGA against classical evolutionary algorithms for various TSP instances. Additionally, a case study in a simulated radioactive environment suggests the potential application of HPAGA in the future.
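A dose-weighted tour cost of the kind a VTSP solver would minimise, together with a greedy baseline tour, can be sketched as follows (the dose model and coordinates are illustrative, not the paper's):

```python
import math

def tour_cost(tour, points, dose_rate):
    """Cost of visiting all operating points: travel length plus an
    illustrative dose term accumulated at each visited point."""
    travel = sum(math.dist(points[tour[i]], points[tour[i + 1]])
                 for i in range(len(tour) - 1))
    dose = sum(dose_rate[p] for p in tour)
    return travel + dose

def nearest_neighbour(points, start=0):
    """Greedy baseline tour that a GA would then try to improve."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(points[i], last))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

pts = [(0, 0), (0, 1), (1, 1), (2, 0)]
doses = [0.1, 0.2, 0.1, 0.3]
tour = nearest_neighbour(pts)
print(tour, round(tour_cost(tour, pts, doses), 3))
```

A GA (or HPAGA-style adaptive GA) would evolve permutations against `tour_cost`; the greedy tour merely provides a starting point and a sanity check.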

16.
Entropy (Basel) ; 26(7)2024 Jul 07.
Article in English | MEDLINE | ID: mdl-39056941

ABSTRACT

The rapid evolution of computer technology and social networks has led to massive data generation through interpersonal communications, necessitating improved methods for information mining and relational analysis in areas such as criminal activity. This paper introduces a Social Network Forensic Analysis model that employs network representation learning to identify and analyze key figures within criminal networks, including leadership structures. The model incorporates traditional web forensics and community algorithms, utilizing concepts such as centrality and similarity measures and integrating the DeepWalk, LINE, and node2vec algorithms to map criminal networks into vector spaces. This maintains node features and structural information that are crucial for the relational analysis. The model refines node relationships through modified random walk sampling, using BFS and DFS, and employs a Continuous Bag-of-Words with Hierarchical Softmax for node vectorization, optimizing the value distribution via the Huffman tree. Hierarchical clustering and distance measures (cosine and Euclidean) were used to identify the key nodes and establish a hierarchy of influence. The findings demonstrate the effectiveness of the model in accurately vectorizing nodes, enhancing inter-node relationship precision, and optimizing clustering, thereby advancing the tools for combating complex criminal networks.
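The BFS/DFS-biased random walk sampling mentioned above is the node2vec second-order walk; a minimal sketch (graph and bias parameters are illustrative):

```python
import random

def biased_walk(graph, start, length, p=1.0, q=1.0, rng=random):
    """node2vec-style second-order random walk.

    Small q biases toward DFS-like outward exploration; small p biases
    toward BFS-like returns to the previous node.
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = graph[cur]
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for n in nbrs:
            if n == prev:            # step back to the previous node
                weights.append(1.0 / p)
            elif n in graph[prev]:   # stays at distance 1 from prev
                weights.append(1.0)
            else:                    # moves outward, away from prev
                weights.append(1.0 / q)
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

g = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
rng = random.Random(42)
print(biased_walk(g, 0, 8, p=0.5, q=2.0, rng=rng))
```

The walks generated this way are the "sentences" fed to the Continuous Bag-of-Words / Hierarchical Softmax stage for node vectorization.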

17.
J Imaging ; 10(7)2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39057739

ABSTRACT

Accurate prognosis and diagnosis are crucial for selecting and planning lung cancer treatments. As a result of the rapid development of medical imaging technology, the use of computed tomography (CT) scans in pathology is becoming standard practice. An intricate interplay of requirements and obstacles characterizes computer-assisted diagnosis, which relies on the precise and effective analysis of pathology images. In recent years, pathology image analysis tasks such as tumor region identification, prognosis prediction, tumor microenvironment characterization, and metastasis detection have witnessed the considerable potential of artificial intelligence, especially deep learning techniques. In this context, an artificial intelligence (AI)-based methodology for lung cancer diagnosis is proposed in this research work. As a first processing step, the input images from the LUNA16 lung cancer dataset were filtered with a Butterworth smoothing filter to remove noise without significantly degrading image quality. We then performed bi-level feature selection using the Chaotic Crow Search Algorithm and Random Forest (CCSA-RF) approach to select features such as diameter, margin, spiculation, lobulation, subtlety, and malignancy. Feature extraction was performed using the Multi-space Image Reconstruction (MIR) method with the Grey Level Co-occurrence Matrix (GLCM). Lung Tumor Severity Classification (LTSC) was then implemented using a Sparse Convolutional Neural Network (SCNN) with a Probabilistic Neural Network (PNN). The developed method can detect benign, normal, and malignant lung cancer images using the PNN algorithm, which reduces complexity and efficiently provides classification results. Performance parameters, namely accuracy, precision, F-score, sensitivity, and specificity, were determined to evaluate the effectiveness of the implemented hybrid method and compare it with other solutions already present in the literature.
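The Grey Level Co-occurrence Matrix used in the feature-extraction step counts how often pairs of grey levels co-occur at a fixed pixel offset; a minimal sketch on a tiny quantised image:

```python
def glcm(image, levels, dx=1, dy=0):
    """Grey Level Co-occurrence Matrix for one pixel offset (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for y in range(rows):
        for x in range(cols):
            ny, nx = y + dy, x + dx
            if 0 <= ny < rows and 0 <= nx < cols:
                # Count the (current level, neighbour level) pair.
                m[image[y][x]][image[ny][nx]] += 1
    return m

img = [
    [0, 0, 1],
    [1, 2, 2],
    [2, 2, 0],
]
for row in glcm(img, levels=3):
    print(row)
```

Texture features such as contrast, energy, and homogeneity are then computed from this matrix, typically over several offsets and directions.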

18.
Sci Total Environ ; : 174724, 2024 Jul 25.
Article in English | MEDLINE | ID: mdl-39059649

ABSTRACT

Sustained deep emission reduction in road transportation is encountering a bottleneck. The Intelligent Transportation-Speed Guidance System (ITSGS) is anticipated to overcome this challenge and facilitate low-carbon, clean transportation. Here, we compiled vehicle emission datasets collected from real-world road experiments and identified the mapping relationships between four pollutants (CO2, CO, NOx, and THC) and their influencing factors through machine learning. We developed random forest models for each pollutant and achieved strong predictive performance, with an R2 exceeding 0.85 on the test dataset for all models. The environmental benefits of ITSGS at the urban scale were quantified by combining the emission models with large-scale real trajectory data from Zibo, Shandong Province. Based on temporal and spatial analyses, we found that ITSGS has varying degrees of emission reduction potential during the morning peak, flat peak, and evening peak hours: 5.71 %-8.16 % for CO2, 13.63 %-16.25 % for NOx, 13.69 %-16.45 % for CO, and 4.84 %-7.07 % for THC emissions. Additionally, ITSGS can significantly expand the area of low-transient-emission zones. The best time for achieving maximum environmental benefits from ITSGS is during the workday flat peak. ITSGS limits high-speed and aggressive driving behavior, thereby smoothing driving trajectories, reducing the frequency of speed switches, and lowering road traffic emissions. The results of the ITSGS environmental benefit evaluation will provide new insights and solutions for sustainable road traffic emission reduction. SYNOPSIS: Large-scale deployment of the Intelligent Transportation-Speed Guidance System is a sustainable solution to help achieve low-carbon and clean transportation.

19.
Sci Rep ; 14(1): 17245, 2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39060295

ABSTRACT

This paper proposes a knowledge-based decision-making system for energy bill assessment and competitive energy consumption analysis for energy savings. Because humans tend to compare themselves with peers and self-groups, this competitive behavior is used to design the knowledge-based decision-making system. Monthly energy consumption datasets from 225 houses were collected in Maharashtra state, along with a questionnaire-based survey covering socio-demographic information, household appliances, family size, and other parameters. After data collection, a pre-processing technique is applied for data normalization, and key features are extracted based on correlation analysis. These features are used to classify house categories by consumption. A knowledge-based system is designed from historical datasets to predict future energy consumption and compare it with actual usage. These comparative studies provide a path for knowledge-base system design that generates monthly energy utilization reports to drive behavior changes for energy savings. Further, Linear Programming and Genetic Algorithms are used to optimize energy consumption for different household categories under socio-demographic constraints. This will also benefit consumers with an electricity bill evaluation range (i.e., normal, high, or very high), estimate the energy conservation potential (kWh), and offer a cost-saving solution to a complex real-world electricity conservation problem.

20.
Sci Rep ; 14(1): 17148, 2024 Jul 26.
Article in English | MEDLINE | ID: mdl-39060369

ABSTRACT

The Internet of Things (IoT) permeates various sectors, including healthcare, smart cities, and agriculture, alongside critical infrastructure management. However, its susceptibility to malware due to limited processing power and security protocols poses significant challenges. Traditional antimalware solutions fall short in combating evolving threats. To address this, this research work developed a feature-selection-based classification model. In the first stage, preprocessing enhances dataset quality through data smoothing and consistency improvement. Feature selection via the Zebra Optimization Algorithm (ZOA) reduces dimensionality, while the classification phase integrates a Graph Attention Network (GAN), specifically the Dual-channel GAN (DGAN). The DGAN incorporates Node Attention Networks and Semantic Attention Networks to capture intricate IoT device interactions and detect anomalous behaviors such as botnet activity. The model's accuracy is further boosted by leveraging both structural and semantic data and by using the Sooty Tern Optimization Algorithm (STOA) for hyperparameter tuning. The proposed STOA-DGAN model achieves an impressive 99.87% accuracy in botnet activity classification, showcasing robustness and reliability compared with existing approaches.
