Results 1 - 4 of 4
1.
Sensors (Basel) ; 23(19)2023 Sep 23.
Article in English | MEDLINE | ID: mdl-37836874

ABSTRACT

The Internet of Things (IoT) has brought significant benefits to many businesses, but the volume and complexity of IoT systems also introduce new security issues. Intrusion detection systems (IDSs) help maintain the security posture of IoT devices and defend them against intrusions. Machine learning (ML) techniques have recently been widely adopted for IDSs in IoT systems. The primary deficiencies of existing IoT security frameworks are their inadequate intrusion detection capabilities, significant latency, and prolonged processing time, leading to undesirable delays. To address these issues, this work proposes a novel range-optimized attention convolutional scattered technique (ROAST-IoT) to protect IoT networks from modern threats and intrusions. This system uses the scattered range feature selection (SRFS) model to choose the most crucial and trustworthy properties from the supplied intrusion data. After that, the attention-based convolutional feed-forward network (ACFN) technique is used to recognize the intrusion class. In addition, the loss function is estimated using the modified dingo optimization (MDO) algorithm to ensure the maximum accuracy of the classifier. To evaluate and compare the performance of the proposed ROAST-IoT system, we used popular intrusion datasets such as ToN-IoT, IoT-23, UNSW-NB 15, and Edge-IIoT. The analysis of the results shows that the proposed ROAST technique outperformed existing state-of-the-art intrusion detection systems, with an accuracy of 99.15% on the IoT-23 dataset, 99.78% on the ToN-IoT dataset, 99.88% on the UNSW-NB 15 dataset, and 99.45% on the Edge-IIoT dataset. On average, the ROAST-IoT system achieved a high AUC-ROC of 0.998, demonstrating its capacity to distinguish between legitimate data and attack traffic. These results indicate that the ROAST-IoT system effectively and reliably detects cyberattacks on IoT systems.
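
The abstract does not specify the SRFS, ACFN, or MDO components in implementable detail. The following is only a minimal sketch of a generic attention-weighted 1D-convolutional classifier over pre-selected intrusion features, roughly in the spirit of the ACFN step; the layer sizes, feature count, and class count are assumptions.

import torch
import torch.nn as nn

class AttentionConvClassifier(nn.Module):
    """Generic attention-weighted 1D-CNN over selected tabular intrusion features."""
    def __init__(self, num_features: int, num_classes: int):
        super().__init__()
        # Treat the selected features as a 1D sequence of length num_features.
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.attention = nn.Linear(64, 1)  # per-position attention score
        self.head = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, num_classes))

    def forward(self, x):                              # x: (batch, num_features)
        h = self.conv(x.unsqueeze(1)).transpose(1, 2)  # (batch, num_features, 64)
        w = torch.softmax(self.attention(h), dim=1)    # attention weights over feature positions
        ctx = (w * h).sum(dim=1)                       # attention-pooled representation
        return self.head(ctx)                          # class logits

model = AttentionConvClassifier(num_features=20, num_classes=10)  # assumed sizes
logits = model(torch.randn(8, 20))                                # e.g. 8 flows, 20 selected features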

2.
Sensors (Basel) ; 23(20)2023 Oct 13.
Article in English | MEDLINE | ID: mdl-37896541

ABSTRACT

Cloud organizations now face a challenge in managing the enormous volume of data and the variety of resources in the cloud, owing to the rapid growth of virtualized environments serving many users, from small business owners to large corporations. Ineffective resource management can degrade the performance of cloud computing. As a result, resources must be distributed fairly among the various stakeholders without sacrificing the organization's profitability or the satisfaction of its customers. A customer's request cannot be put on hold indefinitely simply because the necessary resources are not immediately available. Therefore, a novel cloud resource allocation model incorporating security management is developed in this paper. Here, the Deep Linear Transition Network (DLTN) mechanism is developed for effectively allocating resources to cloud systems. Then, an Adaptive Mongoose Optimization Algorithm (AMOA) is deployed to compute the beamforming solution for reward prediction, which supports the process of resource allocation. Moreover, the Logic Overhead Security Protocol (LOSP) is implemented to ensure secure resource management in the cloud system, where Burrows-Abadi-Needham (BAN) logic is used to establish the agreement logic. During the results analysis, the performance of the proposed DLTN-LOSP model is validated and compared using metrics such as makespan, processing time, and utilization rate. For system validation and testing, 100 to 500 resources are used in this study, and the results achieved a makespan of 2.3% and a utilization rate of 13%. The obtained results confirm the superiority of the proposed framework, with better performance outcomes than the compared approaches.
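
The DLTN, AMOA, and LOSP components are only named, not specified, in this abstract. The sketch below merely illustrates the general reward-driven allocation pattern the abstract describes, in which pending requests are scored by a predicted reward and matched to resources with enough remaining capacity; the class names, scoring rule, and greedy matching are all assumptions for illustration, not the authors' method.

from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    capacity: float      # remaining capacity, e.g. normalized CPU units

@dataclass
class Request:
    name: str
    demand: float
    priority: float

def predicted_reward(req: Request, res: Resource) -> float:
    # Placeholder reward model: prefer high-priority requests on lightly loaded resources.
    return req.priority * (res.capacity - req.demand)

def allocate(requests, resources):
    plan = {}
    for req in sorted(requests, key=lambda r: r.priority, reverse=True):
        feasible = [r for r in resources if r.capacity >= req.demand]
        if not feasible:
            continue                      # request stays queued rather than being rejected
        best = max(feasible, key=lambda r: predicted_reward(req, r))
        best.capacity -= req.demand
        plan[req.name] = best.name
    return plan

print(allocate([Request("job-a", 0.4, 2.0), Request("job-b", 0.3, 1.0)],
               [Resource("vm-1", 0.5), Resource("vm-2", 1.0)]))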

3.
Diagnostics (Basel) ; 13(16)2023 Aug 10.
Article in English | MEDLINE | ID: mdl-37627904

ABSTRACT

Diabetes is a widespread disease that significantly affects people's lives. The leading cause of its complications is uncontrolled blood glucose, which over time leads to eye defects, including Diabetic Retinopathy (DR), which results in severe visual loss. DR is considered the primary cause of blindness in diabetic patients. Because DR is irreversible, treatment aims to control the disease's severity. The primary goal of this effort is to create a reliable method for automatically detecting the severity of DR. This paper proposes a new automated system (DR-NASNet) to detect and classify DR severity using an improved pretrained NASNet model. To develop the DR-NASNet system, we first apply a preprocessing stage based on Ben Graham preprocessing and contrast-limited adaptive histogram equalization (CLAHE) to reduce noise, emphasize lesions, and ultimately improve DR classification performance. Taking into account the class imbalance in the dataset, data augmentation was applied to control overfitting. Next, dense blocks are integrated into the NASNet architecture to improve classification across the five DR severity levels. In practice, the DR-NASNet model achieves state-of-the-art results with a smaller model size and lower complexity. To test the performance of the DR-NASNet system, a combination of several datasets is used in this paper. To learn effective features from DR images, a pretrained model was used on the dataset. The last step assigns each image to one of five categories: No DR, Mild, Moderate, Proliferate, or Severe. To carry this out, a linear SVM classifier layer with a linear activation function is added. The DR-NASNet system was tested in six different experiments and achieves 96.05% accuracy on the challenging DR dataset. The results and comparisons demonstrate that the DR-NASNet system improves the model's performance and learning ability. As a result, the DR-NASNet system assists ophthalmologists by providing an effective system for classifying early-stage levels of DR.
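
The abstract names Ben Graham preprocessing and CLAHE but gives no parameters. The sketch below shows a commonly used form of that preprocessing stage with OpenCV; the image size, blur sigma, clip limit, and tile size are assumed values rather than the paper's settings.

import cv2
import numpy as np

def preprocess_fundus(path: str, size: int = 512) -> np.ndarray:
    img = cv2.resize(cv2.imread(path), (size, size))

    # Ben Graham preprocessing: subtract a heavy Gaussian blur to remove
    # illumination differences and emphasize lesions.
    blur = cv2.GaussianBlur(img, (0, 0), sigmaX=size / 30)
    img = cv2.addWeighted(img, 4, blur, -4, 128)

    # CLAHE on the lightness channel of LAB space to boost local contrast.
    l, a, b = cv2.split(cv2.cvtColor(img, cv2.COLOR_BGR2LAB))
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)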

4.
Sensors (Basel) ; 23(16)2023 Aug 16.
Article in English | MEDLINE | ID: mdl-37631741

ABSTRACT

Cardiovascular disorders are often diagnosed using an electrocardiogram (ECG), a painless method that records the electrical activity accompanying the cyclical contraction and relaxation of the heart muscle. By monitoring the heart's electrical activity, an ECG can be used to identify irregular heartbeats, heart attacks, cardiac illnesses, or an enlarged heart. Numerous studies analyzing ECG signals to identify cardiac problems have been conducted over the past few years. Although ECG heartbeat classification methods have been presented in the literature, especially for unbalanced datasets, they have not succeeded in recognizing some heartbeat categories with high performance. This study uses a convolutional neural network (CNN) model that combines dense and residual blocks. The objective is to leverage residual and dense connections to enhance information flow, gradient propagation, and feature reuse, ultimately improving the model's performance. The proposed model consists of a series of residual-dense blocks interleaved with optional pooling layers for downsampling, which makes it easier to learn and represent features from the ECG signals. A linear support vector machine (LSVM) then classifies heartbeats into five classes. We first denoised the gathered ECG data to correct issues such as baseline drift, power-line interference, and motion noise. The impact of class imbalance is then offset by resampling the denoised ECG signals. The RD-CNN model is then used to categorize the ECG data into the various cardiac illnesses using the extracted features. On two benchmark datasets, we conducted extensive simulations and assessed several performance measures. On average, we achieved an accuracy of 98.5%, a sensitivity of 97.6%, a specificity of 96.8%, and an area under the receiver operating characteristic curve (AUC) of 0.99. The effectiveness of our suggested method for detecting heart disease from ECG data was compared with several recently proposed algorithms. The results demonstrate that our method is lightweight and practical, qualifying it for continuous-monitoring applications in clinical settings to support cardiologists with automated ECG interpretation.
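
The exact RD-CNN configuration is not given in the abstract. The following is a minimal sketch of a 1D residual-dense block of the kind described, in which the convolutions inside the block are densely connected and a residual connection is added around the block; channel counts, growth rate, and depth are assumptions.

import torch
import torch.nn as nn

class ResidualDenseBlock1d(nn.Module):
    def __init__(self, channels: int = 32, growth: int = 16, layers: int = 3):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            # Each conv sees the concatenation of the block input and all earlier outputs.
            self.convs.append(nn.Sequential(
                nn.Conv1d(in_ch, growth, kernel_size=3, padding=1), nn.ReLU()))
            in_ch += growth
        # 1x1 conv fuses the dense features back to the input width for the residual sum.
        self.fuse = nn.Conv1d(in_ch, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        return x + self.fuse(torch.cat(feats, dim=1))    # residual connection around the block

block = ResidualDenseBlock1d()
beats = torch.randn(4, 32, 280)   # e.g. 4 heartbeat segments, 32 channels, 280 samples
out = block(beats)                # same shape; ready for pooling or the next block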


Subjects
Heart Diseases; Myocardial Infarction; Humans; Heart; Electrocardiography; Neural Networks, Computer; Heart Diseases/diagnosis