1.
Health Inf Sci Syst ; 12(1): 35, 2024 Dec.
Article in English | MEDLINE | ID: mdl-38764569

ABSTRACT

Gastrointestinal (GI) cancer detection involves identifying cancerous or potentially cancerous lesions within the GI tract. Early diagnosis is critical for increasing the success of treatment and improving patient outcomes. Medical imaging plays a major role in diagnosing and detecting GI cancer: CT scans, endoscopy, MRI, ultrasound, and positron emission tomography (PET) scans can help detect lesions, abnormal masses, and changes in tissue structure. Artificial intelligence (AI) and machine learning (ML) methods are increasingly being applied to medical imaging for cancer diagnosis. ML algorithms, including deep learning methodologies such as convolutional neural networks (CNNs), are frequently applied for this task. These models learn features and patterns from labelled datasets to discriminate between normal and abnormal areas in medical images. This article presents a new Harbor Seal Whiskers Optimization Algorithm with Deep Learning based Medical Imaging Analysis for Gastrointestinal Cancer Detection (HSWOA-DLGCD) technique. The goal of the HSWOA-DLGCD algorithm is to examine GI images for cancer diagnosis. To accomplish this, the HSWOA-DLGCD system applies a bilateral filtering (BF) approach for noise removal. In addition, the HSWOA-DLGCD technique makes use of the HSWOA with the Xception model for feature extraction. For cancer recognition, the HSWOA-DLGCD technique applies an extreme gradient boosting (XGBoost) model. Finally, the parameters of the XGBoost model are selected by the moth flame optimization (MFO) algorithm. The experimental results of the HSWOA-DLGCD technique were validated on the Kvasir database, and the simulation outcomes demonstrated that the HSWOA-DLGCD method outperforms other recent methods.
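
A minimal sketch of this pipeline is given below, assuming standard OpenCV, Keras, and XGBoost APIs: bilateral filtering for denoising, a pretrained Xception backbone for feature extraction, and an XGBoost classifier. The HSWOA and MFO metaheuristics are not reproduced; the fixed hyperparameters and filter settings are illustrative assumptions standing in for the values those optimizers would select.

```python
# Sketch: BF denoising -> Xception features -> XGBoost classifier.
# Filter and booster parameters are illustrative, not the optimizer-selected values.
import cv2
import numpy as np
from tensorflow.keras.applications import Xception
from tensorflow.keras.applications.xception import preprocess_input
from xgboost import XGBClassifier

def extract_features(images, backbone):
    """Denoise each GI image with a bilateral filter, then pool Xception features."""
    feats = []
    for img in images:  # img: HxWx3 uint8 array
        denoised = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
        resized = cv2.resize(denoised, (299, 299))          # Xception input size
        x = preprocess_input(resized.astype(np.float32))[None, ...]
        feats.append(backbone.predict(x, verbose=0).ravel())
    return np.vstack(feats)

backbone = Xception(weights="imagenet", include_top=False, pooling="avg")
# X_train / y_train would come from a labelled GI image set such as Kvasir:
# features = extract_features(X_train, backbone)
# clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
# clf.fit(features, y_train)
```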

2.
PeerJ Comput Sci ; 9: e1663, 2023.
Article in English | MEDLINE | ID: mdl-38077610

ABSTRACT

The neurological ailment known as Parkinson's disease (PD) affects people throughout the globe. This neurodegenerative disorder primarily affects people in middle to late life. Motor symptoms such as tremor, muscle rigidity, and slow, clumsy movement are common in patients with this disorder. Genetic and environmental factors play significant roles in the development of PD, yet despite much investigation, the root cause of this neurodegenerative disease remains unidentified. Clinical diagnostics rely heavily on promptly detecting such irregularities to slow or stop the progression of the illness. Because of its direct correlation with brain activity, electroencephalography (EEG) is an essential PD diagnostic technique, and EEG data serve as biomarkers of changes in brain activity. However, these signals are non-linear, non-stationary, and complex, making analysis difficult. Traditional machine-learning approaches typically require a lengthy manual pipeline of stages such as signal decomposition, feature extraction, and classification. To overcome these obstacles, we present a novel deep-learning model for the automated identification of PD. The Gabor transform, a standard method in EEG signal processing, was used to convert the raw EEG recordings into spectrograms. We propose a densely linked bidirectional long short-term memory (DLBLSTM) network, which first represents each layer as the sum of its hidden state and the hidden states of all layers above it, then recursively transmits that representation to all layers below it. The proposed deep learning model was trained using these spectrograms as input data. Using a robust sixfold cross-validation method, the proposed model achieved excellent performance, with a classification accuracy of 99.6%. The results indicate that the suggested algorithm can automatically identify PD.
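
The sketch below illustrates the front end and a simplified classifier under stated assumptions: a Gabor-style spectrogram (an STFT with a Gaussian window) computed with SciPy, fed to a plain stacked bidirectional LSTM built with Keras. It is not the densely linked DLBLSTM described above; the layer sizes, sampling rate, and window length are illustrative assumptions.

```python
# Sketch: Gabor-style time-frequency representation of an EEG channel,
# classified by a plain stacked BiLSTM (a stand-in for the DLBLSTM).
import numpy as np
from scipy.signal import stft, windows
from tensorflow.keras import layers, models

def gabor_spectrogram(signal, fs=256, win_len=128):
    """Magnitude STFT with a Gaussian window (a discrete Gabor transform)."""
    f, t, Z = stft(signal, fs=fs,
                   window=windows.gaussian(win_len, std=win_len / 6),
                   nperseg=win_len, noverlap=win_len // 2)
    return np.abs(Z).T  # shape: (time_steps, freq_bins)

def build_bilstm(time_steps, freq_bins, n_classes=2):
    """Two stacked bidirectional LSTM layers over spectrogram frames."""
    model = models.Sequential([
        layers.Input(shape=(time_steps, freq_bins)),
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
        layers.Bidirectional(layers.LSTM(32)),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```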

3.
Biomimetics (Basel) ; 8(7)2023 Nov 10.
Article in English | MEDLINE | ID: mdl-37999176

ABSTRACT

Recently, the use of remote sensing (RS) data acquired from unmanned aerial vehicles (UAVs) or satellite imagery has become increasingly popular for crop classification tasks such as soil classification, crop mapping, and yield prediction. Food crop classification using RS images (RSI) is a significant application of RS technology in agriculture: satellite or aerial imagery is used to identify and classify the different types of food crops grown in a specific area. This information is valuable for crop monitoring, yield estimation, and land management. Analyzing these data requires increasingly sophisticated methods, and artificial intelligence (AI) technologies provide the necessary support. Due to the heterogeneity and fragmentation of crop planting, typical classification approaches achieve lower classification performance, whereas deep learning (DL) techniques can detect and categorize crop types effectively thanks to their stronger feature extraction capability. This study therefore designs a new remote sensing imagery data analysis using the marine predators algorithm with deep learning for food crop classification (RSMPA-DLFCC) technique. The RSMPA-DLFCC technique investigates the RS data and determines the variety of food crops. In the RSMPA-DLFCC technique, the SimAM-EfficientNet model is utilized for feature extraction. The MPA, inspired by the foraging behavior of marine predators, is applied to select the optimal hyperparameters of the SimAM-EfficientNet architecture, thereby improving classification accuracy and generalization capability. For crop type detection and classification, an extreme learning machine (ELM) model is used. The simulation analysis of the RSMPA-DLFCC technique was performed on two benchmark datasets, and the extensive analysis of the results showed the superior performance of the RSMPA-DLFCC approach over existing DL techniques.
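
The ELM classification stage can be sketched as follows, assuming features already extracted by a pretrained EfficientNet backbone (a plain EfficientNetB0 rather than the SimAM variant). The MPA hyperparameter search is omitted, and the hidden-layer size is an illustrative assumption.

```python
# Sketch: a single-hidden-layer extreme learning machine (ELM) classifying
# backbone features. Random input weights, closed-form output weights.
import numpy as np

class ELM:
    def __init__(self, n_hidden=512, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)        # hidden-layer activations
        T = np.eye(n_classes)[y]                # one-hot targets
        self.beta = np.linalg.pinv(H) @ T       # least-squares output weights
        return self

    def predict(self, X):
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)

# features = efficientnet_backbone.predict(crop_patches)   # (n_samples, feat_dim)
# elm = ELM(n_hidden=512).fit(features_train, labels_train)
# preds = elm.predict(features_test)
```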

4.
Sensors (Basel) ; 23(8)2023 Apr 18.
Article in English | MEDLINE | ID: mdl-37112414

ABSTRACT

An Internet of Things (IoT)-assisted wireless sensor network (WSN) is a system in which WSN nodes and IoT devices work together to collect, share, and process data. This integration aims to enhance the effectiveness and efficiency of data collection and analysis, enabling automation and improved decision-making. Security in WSN-assisted IoT refers to the measures taken to protect the WSNs linked to the IoT. This article presents a Binary Chimp Optimization Algorithm with Machine Learning based Intrusion Detection (BCOA-MLID) technique for secure IoT-WSN. The presented BCOA-MLID technique aims to effectively discriminate between different types of attacks to secure the IoT-WSN. In the presented BCOA-MLID technique, data normalization is carried out first. The BCOA is designed for the optimal selection of features to improve intrusion detection efficacy. To detect intrusions in the IoT-WSN, the BCOA-MLID technique employs a class-specific cost regulation extreme learning machine classification model with the sine cosine algorithm as a parameter optimization approach. The BCOA-MLID technique was tested on the Kaggle intrusion dataset, and the results showcase its significant outcomes, with a maximum accuracy of 99.36%, whereas the XGBoost and KNN-AOA models obtained reduced accuracies of 96.83% and 97.20%, respectively.
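
As a hedged illustration of the wrapper-style feature selection described here, the sketch below scores candidate binary feature masks by cross-validated classifier accuracy and keeps the best one. A random-search stand-in replaces the binary chimp optimization algorithm, and a plain scikit-learn logistic regression replaces the cost-regulated ELM; both substitutions are assumptions for illustration only.

```python
# Sketch: wrapper feature selection over binary masks of the feature columns.
# Random search stands in for BCOA; logistic regression stands in for the ELM.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def select_features(X, y, n_candidates=50, seed=0):
    rng = np.random.default_rng(seed)
    best_mask, best_score = None, -np.inf
    for _ in range(n_candidates):
        mask = rng.random(X.shape[1]) < 0.5          # candidate feature subset
        if not mask.any():
            continue
        score = cross_val_score(LogisticRegression(max_iter=1000),
                                X[:, mask], y, cv=3).mean()
        if score > best_score:
            best_mask, best_score = mask, score
    return best_mask, best_score
```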

5.
Sensors (Basel) ; 23(5)2023 Feb 27.
Article in English | MEDLINE | ID: mdl-36904839

ABSTRACT

Wireless sensor networks (WSNs) are becoming a significant technology for ubiquitous living and remain an active research area because of their varied applications. Energy awareness is a critical design problem in WSNs. Clustering is a widespread energy-efficient method that grants several benefits, such as scalability, energy efficiency, lower delay, and longer network lifetime, but it results in the hotspot problem. To solve this, unequal clustering (UC) has been presented. In UC, the cluster size varies with the distance to the base station (BS). This paper devises an improved tuna-swarm-algorithm-based unequal clustering for hotspot elimination (ITSA-UCHSE) technique in an energy-aware WSN. The ITSA-UCHSE technique aims to resolve the hotspot problem and uneven energy dissipation in the WSN. In this study, the ITSA is derived by combining a tent chaotic map with the traditional tuna swarm algorithm (TSA). In addition, the ITSA-UCHSE technique computes a fitness value based on energy and distance metrics. Moreover, determining the cluster size via the ITSA-UCHSE technique helps to address the hotspot issue. To demonstrate the enhanced performance of the ITSA-UCHSE approach, a series of simulation analyses were conducted. The simulation results show that the ITSA-UCHSE algorithm achieves improved results over other models.
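
A minimal sketch of the unequal-clustering idea follows: a candidate cluster head's competition radius shrinks as it approaches the BS, so clusters near the BS stay small and their heads retain energy for relaying, while the head's fitness mixes normalized residual energy with proximity to the BS. The weighting constants and formulas are generic unequal-clustering assumptions, not the exact ITSA-UCHSE formulation.

```python
# Sketch: unequal cluster radius and an energy/distance fitness for head selection.
# The constants c and w_energy are illustrative assumptions.

def competition_radius(d_to_bs, d_min, d_max, r_max=90.0, c=0.5):
    """Cluster radius: smaller near the BS, growing to r_max at the far edge."""
    return (1.0 - c * (d_max - d_to_bs) / (d_max - d_min)) * r_max

def fitness(residual_energy, initial_energy, d_to_bs, d_max, w_energy=0.6):
    """Candidate-head fitness mixing normalized residual energy and BS proximity."""
    return w_energy * (residual_energy / initial_energy) \
         + (1 - w_energy) * (1.0 - d_to_bs / d_max)

# Example: a node 40 m from the BS in a field where distances span 20-120 m.
print(competition_radius(d_to_bs=40.0, d_min=20.0, d_max=120.0))  # ~54 m radius
```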
