Results 1 - 20 of 2,744
1.
Comput Biol Med ; 181: 109034, 2024 Aug 31.
Article in English | MEDLINE | ID: mdl-39217966

ABSTRACT

We propose a biodynamic model for managing waterborne diseases over an Internet of Things (IoT) network, leveraging the scalability of LoRa IoT technology to accommodate a growing human population. The model, based on fractional order derivatives (FOD), enables smart prediction and control of pathogens that cause waterborne diseases using IoT infrastructure. The human-pathogen-based biodynamic FOD model utilises epidemic parameters (SVIRT: susceptibility, vaccination, infection, recovery, and treatment) transmitted over the IoT network to predict pathogenic contamination in water reservoirs and dumpsites in Iji-Nike, Enugu, the study community in Nigeria. These pathogens contribute to person-to-person, water-to-person, and dumpsite-to-person transmission of disease vectors. Five control measures are proposed: potable water supply, treatment, vaccination, adequate sanitation, and health education campaigns. A stable disease-free equilibrium point exists when the effective reproduction number of the pathogens satisfies R0eff < 1; the equilibrium is unstable if R0eff > 1. While other studies showed a 98.2% reduction in infections when using IoT alone, this paper demonstrates that combining the SVIRT epidemic control parameters (such as potable water supply and health education campaigns) with IoT achieves a 99.89% reduction in infected human populations and a 99.56% reduction in pathogen populations in water reservoirs. Furthermore, integrating treatment with sanitation results in a 99.97% reduction in infected populations. Finally, combining these five control strategies nearly eliminates infection and pathogen populations, demonstrating the effectiveness of multifaceted approaches in public health and environmental management. This study provides a blueprint for governments to plan sustainable smart cities for a growing population, ensuring potable water free from pathogenic contamination, in line with the United Nations Sustainable Development Goals #6 (Clean Water and Sanitation) and #11 (Sustainable Cities and Communities).
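The threshold behaviour around R0eff described above can be illustrated numerically. The sketch below is not the paper's SVIRT model: it integrates a reduced two-compartment susceptible-infected system with an explicit Grünwald-Letnikov fractional-order scheme, and all parameter values (alpha, beta, gamma) are hypothetical.

```python
import numpy as np

def gl_weights(alpha, n):
    # Grunwald-Letnikov binomial weights w_k = (-1)^k * C(alpha, k),
    # computed with the standard recurrence.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def simulate_si_fractional(alpha=1.0, beta=0.4, gamma=0.1, h=0.1, steps=400):
    # Explicit Grunwald-Letnikov scheme for a two-compartment
    # susceptible-infected model of fractional order alpha.
    # alpha = 1 recovers the classical (forward Euler) integer-order model.
    w = gl_weights(alpha, steps + 1)
    S = np.zeros(steps + 1)
    I = np.zeros(steps + 1)
    S[0], I[0] = 0.99, 0.01
    ha = h ** alpha
    for n in range(1, steps + 1):
        mem_s = np.dot(w[1:n + 1], S[n - 1::-1])   # memory of past states
        mem_i = np.dot(w[1:n + 1], I[n - 1::-1])
        S[n] = -beta * S[n - 1] * I[n - 1] * ha - mem_s
        I[n] = (beta * S[n - 1] * I[n - 1] - gamma * I[n - 1]) * ha - mem_i
    return S, I

# beta/gamma = 4 > 1 plays the role of R0eff > 1: the infection grows.
S, I = simulate_si_fractional(alpha=1.0)
```

With alpha < 1 the memory terms give the model the long-tailed dynamics that motivate fractional-order epidemic modelling.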

2.
Heliyon ; 10(16): e36269, 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39224301

ABSTRACT

The Internet of Medical Things (IoMT) has transformed healthcare by connecting medical devices, sensors, and patients, significantly improving patient care. However, the sensitive data exchanged through IoMT is vulnerable to security attacks, raising serious privacy concerns. Traditional key sharing mechanisms are susceptible to compromise, posing risks to data integrity. This paper proposes a Timestamp-based Secret Key Generation (T-SKG) scheme for resource-constrained devices, generating a secret key at the patient's device and regenerating it at the doctor's device, thus eliminating direct key sharing and minimizing key compromise risks. Simulation results using MATLAB and Java demonstrate the T-SKG scheme's resilience against guessing, birthday, and brute force attacks. Specifically, there is only a 9 % chance of key compromise in a guessing attack if the attacker knows the key sequence pattern, while the scheme remains secure against brute force and birthday attacks within a specified timeframe. The T-SKG scheme is integrated into a healthcare framework to securely transmit health vitals collected using the MySignals sensor kit. For confidentiality, the Data Encryption Standard (DES) with various Cipher Block modes (ECB, CBC, CTR) is employed.
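The core idea of eliminating direct key sharing can be sketched as follows. This is a hedged illustration, not the published T-SKG algorithm: it stands in HMAC-SHA256 over an agreed timestamp for the paper's key-generation procedure, and the seed and timestamp values are hypothetical.

```python
import hashlib
import hmac

def derive_key(shared_seed: bytes, timestamp: int, length: int = 16) -> bytes:
    # Both sides MAC an agreed timestamp with a pre-provisioned seed, so the
    # session key can be regenerated independently and never travels over
    # the network.
    msg = timestamp.to_bytes(8, "big")
    return hmac.new(shared_seed, msg, hashlib.sha256).digest()[:length]

seed = b"device-provisioned-seed"   # hypothetical provisioning secret
t = 1_700_000_000                   # timestamp both sides agree on (epoch seconds)

patient_key = derive_key(seed, t)   # generated at the patient's device
doctor_key = derive_key(seed, t)    # regenerated at the doctor's device
```

Because a fresh timestamp yields a fresh key, a compromised session key does not expose past or future sessions.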

3.
Sci Rep ; 14(1): 20269, 2024 Aug 31.
Article in English | MEDLINE | ID: mdl-39217214

ABSTRACT

Implicit poisoning in federated learning is a significant threat, with malicious nodes subtly altering gradient parameters each round, making detection difficult. This study investigates this problem, revealing that temporal analysis alone struggles to identify such covert attacks, which can bypass online methods like cosine similarity and clustering. Common detection methods rely on offline analysis, resulting in delayed responses. However, recalculating gradient updates reveals distinct characteristics of malicious clients. Based on this finding, we designed a privacy-preserving detection algorithm using trajectory anomaly detection. Singular values of matrices are used as features, and an improved Isolation Forest algorithm processes these to detect malicious behavior. Experiments on MNIST, FashionMNIST, and CIFAR-10 datasets show our method achieves 94.3% detection accuracy and a false positive rate below 1.2%, indicating its high accuracy and effectiveness in detecting implicit model poisoning attacks.
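The detection pipeline described above (singular values of gradient matrices as features, fed to an Isolation Forest) can be sketched as follows. This uses scikit-learn's standard IsolationForest as a stand-in for the paper's improved variant, and the gradient data are synthetic.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic per-client gradient updates (e.g., a flattened 16x8 weight layer):
# nine benign clients plus one implicitly poisoned client with scaled updates.
updates = [rng.normal(0, 0.01, (16, 8)) for _ in range(9)]
updates.append(rng.normal(0, 0.01, (16, 8)) * 50)      # malicious client

# Feature extraction: the top-3 singular values summarise each update matrix.
features = np.array([np.linalg.svd(u, compute_uv=False)[:3] for u in updates])

detector = IsolationForest(contamination=0.1, random_state=0)
labels = detector.fit_predict(features)                # -1 flags an anomaly
```

Singular values are invariant to permutations of the flattened parameters, which is what makes them a compact trajectory feature here.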

4.
Sensors (Basel) ; 24(16)2024 Aug 09.
Article in English | MEDLINE | ID: mdl-39204846

ABSTRACT

Machine learning (ML) represents one of the main pillars of the current digital era, specifically in modern real-world applications. The Internet of Things (IoT) technology is foundational in developing advanced intelligent systems. The convergence of ML and IoT drives significant advancements across various domains, such as making IoT-based security systems smarter and more efficient. However, ML-based IoT systems are vulnerable to lurking attacks during the training and testing phases. An adversarial attack aims to corrupt the ML model's functionality by introducing perturbed inputs. Consequently, it can pose significant risks leading to devices' malfunction, services' interruption, and personal data misuse. This article examines the severity of adversarial attacks and accentuates the importance of designing secure and robust ML models in the IoT context. A comprehensive classification of adversarial machine learning (AML) is provided. Moreover, a systematic literature review of the latest research trends (from 2020 to 2024) of the intersection of AML and IoT-based security systems is presented. The results revealed the availability of various AML attack techniques, where the Fast Gradient Signed Method (FGSM) is the most employed. Several studies recommend the adversarial training technique to defend against such attacks. Finally, potential open issues and main research directions are highlighted for future consideration and enhancement.
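Since FGSM is named as the most employed attack technique, a minimal worked instance may help. The sketch below applies FGSM to a toy logistic-regression victim (weights and the sample are made up); the principle — perturb the input by epsilon times the sign of the input gradient of the loss — is the same one used against deep models.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm(x, y, w, b, eps):
    # Fast Gradient Sign Method: step in the direction that increases the
    # loss. For logistic regression, grad_x L = (sigmoid(w.x + b) - y) * w.
    grad_x = (sigmoid(w @ x + b) - y) * w
    return x + eps * np.sign(grad_x)

# Hypothetical victim model (weights assumed already trained).
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([1.0, 0.2])               # clean sample, true label y = 1
p_clean = sigmoid(w @ x + b)           # confidently classified as class 1
x_adv = fgsm(x, 1.0, w, b, eps=0.8)
p_adv = sigmoid(w @ x_adv + b)         # pushed across the 0.5 boundary
```

Adversarial training, the defence most studies recommend, simply mixes such `x_adv` samples (with the correct labels) into the training set.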

5.
Sensors (Basel) ; 24(16)2024 Aug 10.
Article in English | MEDLINE | ID: mdl-39204866

ABSTRACT

Intentional electromagnetic interference attacks (e.g., jamming) against wireless connected devices such as the Internet of Things (IoT) remain a serious challenge, especially as such attacks evolve in complexity. Similarly, eavesdropping on wireless communication channels persists as an inherent vulnerability that is often exploited by adversaries. This article investigates a novel approach to enhancing information security for IoT systems via collaborative strategies that can effectively mitigate attacks targeting availability via interference and confidentiality via eavesdropping. We examine the proposed approach for two use cases. First, we consider an IoT device that experiences an interference attack, causing wireless channel outages and hindering access to transmitted IoT data. A physical-layer-based security (PLS) transmission strategy is proposed in this article to maintain target levels of information availability for devices targeted by adversarial interference. In the proposed strategy, select IoT devices leverage a cooperative transmission approach to mitigate the IoT signal outages under active interference attacks. Second, we consider the case of information confidentiality for IoT devices as they communicate over wireless channels with possible eavesdroppers. In this case, we propose a collaborative transmission strategy where IoT devices create a signal outage for the eavesdropper, preventing it from decoding the signal of the targeted devices. The analytical and numerical results of this article illustrate the effectiveness of the proposed transmission strategy in achieving desired IoT security levels with respect to availability and confidentiality for both use cases.
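The availability benefit of cooperative transmission can be illustrated with a small Monte Carlo experiment. This is not the article's analysis: it assumes Rayleigh fading (exponential instantaneous SNR) and models cooperation as selection combining over independent branches, with made-up SNR and threshold values.

```python
import numpy as np

def outage_probability(mean_snr_db, threshold_db, branches, trials=200_000, seed=1):
    # Rayleigh fading -> instantaneous SNR is exponentially distributed.
    # With selection combining over independent branches, an outage requires
    # every branch to fall below the threshold simultaneously.
    rng = np.random.default_rng(seed)
    mean_snr = 10 ** (mean_snr_db / 10)
    thr = 10 ** (threshold_db / 10)
    snr = rng.exponential(mean_snr, size=(trials, branches))
    return np.mean(snr.max(axis=1) < thr)

p_single = outage_probability(10, 5, branches=1)   # device alone
p_coop = outage_probability(10, 5, branches=2)     # one cooperating helper
```

For independent branches the outage probability is raised to the power of the branch count, which is why even a single helper sharply improves availability under interference.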

6.
Sensors (Basel) ; 24(16)2024 Aug 12.
Article in English | MEDLINE | ID: mdl-39204906

ABSTRACT

Internet of Things (IoT) forensics is a specialised field within digital forensics that focuses on the identification of security incidents, as well as the collection and analysis of evidence with the aim of preventing future attacks on IoT networks. IoT forensics differs from other digital forensic fields due to the unique characteristics of IoT devices, such as limited processing power and connectivity. Although numerous studies are available on IoT forensics, the field is rapidly evolving, and comprehensive surveys are needed to keep up with new developments, emerging threats, and evolving best practices. In this respect, this paper aims to review the state of the art in IoT forensics and discuss the challenges in current investigation techniques. A qualitative analysis of related reviews in the field of IoT forensics has been conducted, identifying key issues and assessing primary obstacles. Despite the variety of topics and approaches, common issues emerge. The majority of these issues are related to the collection and pre-processing of evidence because of counter-analysis techniques and the challenges associated with gathering data from devices and the cloud. Our analysis extends beyond technological problems; it further identifies procedural problems with preparedness, reporting, and presentation, as well as ethical issues. In particular, it provides insights into emerging threats and challenges in IoT forensics, increases awareness and understanding of the importance of IoT forensics in preventing cybercrimes, and ensures the security and privacy of IoT devices and networks. Our findings make a substantial contribution to the field of IoT forensics, as they not only critically analyse the challenges presented in existing works but also identify open problems that remain unaddressed. These insights will greatly assist researchers in identifying appropriate directions for their future research.

7.
Sensors (Basel) ; 24(16)2024 Aug 14.
Article in English | MEDLINE | ID: mdl-39204950

ABSTRACT

To establish ubiquitous and energy-efficient wireless sensor networks (WSNs), short-range Internet of Things (IoT) devices require Bluetooth low energy (BLE) technology, which functions at 2.4 GHz. This study presents a novel approach: a fully integrated all-digital phase-locked loop (ADPLL)-based Gaussian frequency shift keying (GFSK) modulator incorporating two-point modulation (TPM). The modulator aims to enhance the efficiency of BLE communication in these networks. The design includes a time-to-digital converter (TDC) with the following three key features to improve linearity and time resolution: fast settling time, low dropout regulators (LDOs) that adapt to process, voltage, and temperature (PVT) variations, and interpolation assisted by an analog-to-digital converter (ADC). It features a digitally controlled oscillator (DCO) with two key enhancements: ΔΣ modulator dithering and hierarchical capacitive banks, which expand the frequency tuning range and improve linearity, and an integrated, fast-converging least-mean-square (LMS) algorithm for DCO gain calibration, which ensures compliance with BLE 5.0 stable modulation index (SMI) requirements. Implemented in a 28 nm CMOS process, occupying an active area of 0.33 mm2, the modulator demonstrates a wide frequency tuning range from 2.21 to 2.58 GHz, in-band phase noise of -102.1 dBc/Hz, and FSK error of 1.42% while consuming 1.6 mW.
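The GFSK signal chain the modulator implements in hardware (NRZ bits, Gaussian pulse shaping, frequency-to-phase integration) can be sketched in a few lines of baseband simulation. This is an idealised software model, not the ADPLL design; the BT product and modulation index follow typical BLE values, and the bit pattern is arbitrary.

```python
import numpy as np

def gfsk_modulate(bits, sps=8, bt=0.5, h_index=0.5):
    # GFSK: NRZ bits -> Gaussian low-pass filter -> frequency modulation.
    # bt is the bandwidth-time product (BLE specifies BT = 0.5, h ~ 0.5).
    nrz = np.repeat(2 * np.asarray(bits) - 1, sps).astype(float)
    t = np.arange(-2 * sps, 2 * sps + 1) / sps            # +/- 2 symbols
    sigma = np.sqrt(np.log(2)) / (2 * np.pi * bt)         # Gaussian width
    g = np.exp(-t ** 2 / (2 * sigma ** 2))
    g /= g.sum()                                          # unit-DC-gain kernel
    freq = np.convolve(nrz, g, mode="same")               # smoothed inst. frequency
    phase = np.pi * h_index * np.cumsum(freq) / sps       # integrate freq -> phase
    return np.exp(1j * phase)                             # constant-envelope baseband

iq = gfsk_modulate([1, 0, 1, 1, 0, 0, 1, 0])
```

The constant envelope of the output is exactly what lets the real modulator drive a nonlinear, power-efficient PA.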

8.
Sensors (Basel) ; 24(16)2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39204979

ABSTRACT

In the era of ubiquitous computing, the challenges imposed by the increasing demand for real-time data processing, security, and energy efficiency call for innovative solutions. The emergence of fog computing has provided a promising paradigm to address these challenges by bringing computational resources closer to data sources. Despite its advantages, the fog computing characteristics pose challenges in heterogeneous environments in terms of resource allocation and management, provisioning, security, and connectivity, among others. This paper introduces COGNIFOG, a novel cognitive fog framework currently under development, which was designed to leverage intelligent, decentralized decision-making processes, machine learning algorithms, and distributed computing principles to enable the autonomous operation, adaptability, and scalability across the IoT-edge-cloud continuum. By integrating cognitive capabilities, COGNIFOG is expected to increase the efficiency and reliability of next-generation computing environments, potentially providing a seamless bridge between the physical and digital worlds. Preliminary experimental results with a limited set of connectivity-related COGNIFOG building blocks show promising improvements in network resource utilization in a real-world-based IoT scenario. Overall, this work paves the way for further developments on the framework, which are aimed at making it more intelligent, resilient, and aligned with the ever-evolving demands of next-generation computing environments.

9.
Sensors (Basel) ; 24(16)2024 Aug 16.
Article in English | MEDLINE | ID: mdl-39205003

ABSTRACT

The Industrial Internet of Things has enabled the integration and analysis of vast volumes of data across various industries, with the maritime sector being no exception. Advances in cloud computing and deep learning (DL) are continuously reshaping the industry, particularly in optimizing maritime operations such as Predictive Maintenance (PdM). In this study, we propose a novel DL-based framework focusing on the fault detection task of PdM in marine operations, leveraging time-series data from sensors installed on shipboard machinery. The framework is designed as a scalable and cost-efficient software solution, encompassing all stages from data collection and pre-processing at the edge to the deployment and lifecycle management of DL models. The proposed DL architecture utilizes Graph Attention Networks (GATs) to extract spatio-temporal information from the time-series data and provides explainable predictions through a feature-wise scoring mechanism. Additionally, a custom evaluation metric with real-world applicability is employed, prioritizing both prediction accuracy and the timeliness of fault identification. To demonstrate the effectiveness of our framework, we conduct experiments on three types of open-source datasets relevant to PdM: electrical data, bearing datasets, and data from water circulation experiments.
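The attention mechanism at the heart of the framework's GAT layers can be sketched compactly. Below is a generic single-head graph attention layer in the style of Veličković et al., not the paper's architecture; the graph, features, and weights are random stand-ins for sensor nodes on shipboard machinery.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(X, A, W, a_src, a_dst):
    # Single-head graph attention: score each edge from projected features,
    # mask non-edges, normalise over neighbours, then aggregate. The attention
    # matrix doubles as an importance map, which supports explainability.
    H = X @ W                                        # (N, F') projections
    e = (H @ a_src)[:, None] + (H @ a_dst)[None, :]  # additive edge scores
    e = np.where(e > 0, e, 0.2 * e)                  # LeakyReLU(0.2)
    e = np.where(A > 0, e, -1e9)                     # attend only along edges
    attn = softmax(e, axis=1)
    return attn @ H, attn

rng = np.random.default_rng(0)
N, F, Fp = 5, 4, 3                                   # sensors, in/out feature dims
X = rng.normal(size=(N, F))
A = np.eye(N) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)  # chain graph
W = rng.normal(size=(F, Fp))
a_src, a_dst = rng.normal(size=Fp), rng.normal(size=Fp)
out, attn = gat_layer(X, A, W, a_src, a_dst)
```

Inspecting `attn` row by row shows which neighbouring sensors drove each node's representation, a simple analogue of the feature-wise scoring the abstract mentions.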

10.
Sensors (Basel) ; 24(16)2024 Aug 17.
Article in English | MEDLINE | ID: mdl-39205014

ABSTRACT

The proliferation of the IoT has led to the development of diverse application architectures to optimize IoT systems' deployment, operation, and maintenance. This survey provides a comprehensive overview of the existing IoT application architectures, highlighting their key features, strengths, and limitations. The architectures are categorized based on their deployment models, such as cloud, edge, and fog computing approaches, each offering distinct advantages regarding scalability, latency, and resource efficiency. Cloud architectures leverage centralized data processing and storage capabilities to support large-scale IoT applications but often suffer from high latency and bandwidth constraints. Edge architectures mitigate these issues by bringing computation closer to the data source, enhancing real-time processing, and reducing network congestion. Fog architectures combine the strengths of both cloud and edge paradigms, offering a balanced solution for complex IoT environments. This survey also examines emerging trends and technologies in IoT application management, such as the solutions provided by the major IoT service providers like Intel, AWS, Microsoft Azure, and GCP. Through this study, the survey identifies latency, privacy, and deployment difficulties as key areas for future research. It highlights the need to advance IoT Edge architectures to reduce network traffic, improve data privacy, and enhance interoperability by developing multi-application and multi-protocol edge gateways for efficient IoT application management.

11.
Sensors (Basel) ; 24(16)2024 Aug 17.
Article in English | MEDLINE | ID: mdl-39205018

ABSTRACT

Orthogonal time frequency space (OTFS) modulation has recently found its place in the literature as a much more effective waveform in time-varying channels. It is anticipated that OTFS will be widely used in the communications of smart vehicles, especially those considered within the scope of the Internet of Things (IoT). Traditional point-to-point single-input single-output (SISO)-OTFS schemes have been studied in the literature, but their BER performance remains limited. Cooperative communications can be used to improve BER performance, yet very few OTFS studies exist in this area. To the best of the authors' knowledge, this study is the first in the literature to show that better performance is achieved for OTFS waveform transmission in a selective decode-and-forward (SDF) cooperative communication scenario. In this context, by establishing a cooperative communication model consisting of a base station/source, a traffic sign/relay, and a smart vehicle/destination moving at a constant speed, an end-to-end BER expression is derived. SNR-BER analysis is performed with this SDF-OTFS scheme, and it is shown that superior BER performance is achieved compared to the traditional point-to-point SISO-OTFS structure.
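The selective decode-and-forward gain can be demonstrated with a simplified Monte Carlo experiment. This is not the paper's OTFS analysis: it models each hop as BPSK over independent Rayleigh fading, lets the relay forward only when its own detection succeeded, and approximates destination combining by requiring both copies to fail; all SNR values are hypothetical.

```python
import numpy as np

def link_errors(rng, snr, trials):
    # One Rayleigh-faded BPSK hop: channel amplitude sqrt(g), g ~ Exp(1),
    # AWGN with variance 1/(2*snr). Sending +1 w.l.o.g., an error occurs
    # when the coherently detected sample goes negative.
    g = rng.exponential(1.0, trials)
    n = rng.normal(0.0, np.sqrt(1 / (2 * snr)), trials)
    return np.sqrt(g) + n < 0

def ber_sdf(snr_db, trials=200_000, seed=7):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    e_direct = link_errors(rng, snr, trials)   # source -> destination
    e_sr = link_errors(rng, snr, trials)       # source -> relay
    e_rd = link_errors(rng, snr, trials)       # relay -> destination
    # Selective DF: the relay forwards only when it decoded correctly; the
    # destination then errs only if both copies err (idealised combining).
    e_coop = np.where(e_sr, e_direct, e_direct & e_rd)
    return e_direct.mean(), e_coop.mean()

ber_p2p, ber_coop = ber_sdf(10)
```

Even this crude model reproduces the qualitative result: the cooperative scheme gains diversity order and sits well below the point-to-point BER curve.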

12.
Sensors (Basel) ; 24(16)2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39205047

ABSTRACT

The Internet of Things (IoT) is a promising technology for sensing and monitoring the environment to reduce disaster impact. Energy is one of the major concerns for IoT devices, as sensors used in IoT devices are battery-operated. Thus, it is important to reduce energy consumption, especially during data transmission in disaster-prone situations. Clustering-based communication helps reduce a node's energy decay during data transmission and enhances network lifetime. Many hybrid combination algorithms have been proposed for clustering and routing protocols to improve network lifetime in disaster scenarios. However, the performance of these protocols varies widely based on the underlying network configuration and the optimisation parameters considered. In this research, we used the clustering parameters most relevant to disaster scenarios, such as the node's residual energy, distance to sink, and network coverage. We then proposed the bio-inspired hybrid BOA-PSO algorithm, where the Butterfly Optimisation Algorithm (BOA) is used for clustering and Particle Swarm Optimisation (PSO) is used for the routing protocol. The performance of the proposed algorithm was compared with that of various benchmark protocols: LEACH, DEEC, PSO, PSO-GA, and PSO-HAS. Residual energy, network throughput, and network lifetime were considered performance metrics. The simulation results demonstrate that the proposed algorithm effectively conserves residual energy, achieving more than a 17% improvement for short-range scenarios and a 10% improvement for long-range scenarios. In terms of throughput, the proposed method delivers a 60% performance enhancement compared to LEACH, a 53% enhancement compared to DEEC, and a 37% enhancement compared to PSO. Additionally, the proposed method results in a 60% reduction in packet drops compared to LEACH and DEEC, and a 30% reduction compared to PSO. It increases network lifetime by 10-20% compared to the benchmark algorithms.
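The PSO half of the hybrid can be sketched generically. This is canonical particle swarm optimisation, not the paper's BOA-PSO protocol; the cost function is a hypothetical stand-in for a routing fitness that would combine residual energy, distance to sink, and coverage.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=3):
    # Canonical PSO: each particle tracks its personal best, the swarm shares
    # a global best, and velocities blend inertia with both attractions.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Hypothetical routing cost surrogate; a real fitness would penalise low
# residual energy and long distance to the sink.
cost = lambda p: np.sum(p ** 2)
best, best_val = pso_minimize(cost, dim=4)
```

In the hybrid scheme described above, BOA would supply the cluster-head assignment while a loop like this searches the routing parameters.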

13.
Sensors (Basel) ; 24(16)2024 Aug 20.
Article in English | MEDLINE | ID: mdl-39205063

ABSTRACT

Energy harvesting combined with spectrum sharing offers a promising solution to the growing demand for spectrum while keeping energy costs low. New Radio Unlicensed (NR-U) technology enables telecom operators to utilize unlicensed spectrum in addition to the licensed spectrum already in use. Along with this, the energy demands for the Internet of Things (IoT) can be met through energy harvesting. In this regard, the ubiquity and ease of implementation make the RF-powered NR-U network a sustainable solution for cellular IoT. Using a Markov chain, we model the NR-U network with nodes powered by the base station (BS). We derive closed-form expressions for the normalized saturated throughput of nodes and the BS, along with the mean packet delay at the node. Additionally, we compute the transmit outage probability of the node. These quality of service (QoS) parameters are analyzed for different values of congestion window size, TXOP parameter, maximum energy level, and energy threshold of the node. Additionally, the effect of network density on collision, transmission, and energy harvesting probabilities is observed. We validate our model through simulations.
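The Markov-chain modelling step can be illustrated on a miniature version of the problem. The chain below is a hypothetical 4-level battery (harvest up, transmit down), not the paper's NR-U model; it shows the generic recipe of building the transition matrix and reading QoS quantities off the stationary distribution.

```python
import numpy as np

def stationary_distribution(P):
    # Solve pi @ P = pi with sum(pi) = 1 via the eigenvector of P.T
    # associated with eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

# Hypothetical energy chain: harvest one unit w.p. 0.6, spend one unit on a
# transmission w.p. 0.3, otherwise idle; transmitting needs stored energy.
p_h, p_t = 0.6, 0.3
n = 4
P = np.zeros((n, n))
for i in range(n):
    up = p_h if i < n - 1 else 0.0
    down = p_t if i > 0 else 0.0
    P[i, min(i + 1, n - 1)] += up
    P[i, max(i - 1, 0)] += down
    P[i, i] += 1.0 - up - down

pi = stationary_distribution(P)
tx_prob = p_t * (1 - pi[0])   # node transmits only when it holds energy
```

`1 - pi[0]` plays the role of one minus the transmit outage probability: the long-run fraction of slots in which the node has energy available.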

14.
Sensors (Basel) ; 24(16)2024 Aug 20.
Article in English | MEDLINE | ID: mdl-39205072

ABSTRACT

The excessive use of electronic devices for prolonged periods has led to problems such as neck pain and pressure injury in sedentary people. If not detected and corrected early, these issues can cause serious risks to physical health. Detectors for generic objects cannot adequately capture such subtle neck behaviors, resulting in missed detections. In this paper, we explore a deep learning-based solution for detecting abnormal behavior of the neck and propose a model called NABNet that combines object detection based on YOLOv5s with pose estimation based on Lightweight OpenPose. NABNet extracts the detailed behavior characteristics of the neck from global to local and detects abnormal behavior by analyzing the extracted neck-angle data. We deployed NABNet on the cloud and edge devices to achieve remote monitoring and abnormal behavior alarms. Finally, we applied the resulting NABNet-based IoT system for abnormal behavior detection in order to evaluate its effectiveness. The experimental results show that our system can effectively detect abnormal neck behavior and raise alarms on the cloud platform, with the highest accuracy reaching 94.13%.
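The angle-analysis step downstream of pose estimation is straightforward to sketch. This is a hedged illustration, not NABNet's rule: it measures the ear-shoulder segment against the vertical axis from two hypothetical keypoints, with a made-up 30-degree alarm threshold.

```python
import numpy as np

def neck_angle_deg(ear, shoulder):
    # Angle between the ear-shoulder segment and the vertical axis;
    # larger angles suggest forward-head ("text neck") posture.
    v = np.asarray(ear, float) - np.asarray(shoulder, float)
    vertical = np.array([0.0, -1.0])            # image y grows downward
    cos = v @ vertical / np.linalg.norm(v)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def is_abnormal(ear, shoulder, threshold=30.0):
    # Flag postures whose neck angle exceeds the (hypothetical) threshold.
    return neck_angle_deg(ear, shoulder) > threshold

upright = ((100, 40), (100, 100))    # ear directly above the shoulder
slouched = ((150, 70), (100, 100))   # ear far forward of the shoulder
```

In a deployment like the one described, keypoints would come from Lightweight OpenPose and the boolean result would drive the cloud alarm.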


Subject(s)
Deep Learning , Neck , Humans , Neck Pain/diagnosis , Internet of Things , Algorithms
15.
Sensors (Basel) ; 24(16)2024 Aug 22.
Article in English | MEDLINE | ID: mdl-39205138

ABSTRACT

This paper presents a new edge detection process implemented in an embedded IoT device called Bee Smart Detection node to detect catastrophic apiary events. Such events include swarming, queen loss, and the detection of Colony Collapse Disorder (CCD) conditions. Two deep learning sub-processes are used for this purpose. The first uses a fuzzy multi-layered neural network of variable depths called fuzzy-stranded-NN to detect CCD conditions based on temperature and humidity measurements inside the beehive. The second utilizes a deep learning CNN model to detect swarming and queen loss cases based on sound recordings. The proposed processes have been implemented into autonomous Bee Smart Detection IoT devices that transmit their measurements and the detection results to the cloud over Wi-Fi. The BeeSD devices have been tested for easy-to-use functionality, autonomous operation, deep learning model inference accuracy, and inference execution speeds. The author presents the experimental results of the fuzzy-stranded-NN model for detecting critical conditions and deep learning CNN models for detecting swarming and queen loss. In the presented experiments, the stranded-NN achieved accuracy of up to 95%, while the ResNet-50 model achieved accuracy of up to 99% for detecting swarming or queen loss events. The ResNet-18 model offers the fastest inference speed as a replacement for ResNet-50, achieving accuracy of up to 93%. Finally, cross-comparison of the deep learning models with machine learning ones shows that deep learning models can provide at least 3-5% better accuracy.

16.
IEEE Sens J ; 24(3): 3863-3873, 2024 Feb.
Article in English | MEDLINE | ID: mdl-39131729

ABSTRACT

Ultra high frequency (UHF) passive radio frequency identification (RFID) tag-based sensors are proposed for intravenous (IV) fluid level monitoring in medical Internet of Things (IoT) applications. Two versions of the sensor are proposed: a binary sensor (i.e., full vs. empty state sensing) and a real-time (i.e., continuous level) sensor. The operating principle is demonstrated using full-wave electromagnetic simulation at 910 MHz and validated with experimental results. Generalized Additive Model (GAM) and random forest algorithms are employed for each interrogation dataset. Real-time sensing is accomplished with small deviations across the models. A minimum of 72% and a maximum of 97% of cases are within a 20% error for the GAM model and 62% to 98% for the random forest model. The proposed sensor is battery-free, lightweight, low-cost, and highly reliable. The read range of the proposed sensor is 4.6 m.
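The regression stage of the real-time sensor can be sketched with synthetic calibration data. This is not the article's dataset or GAM pipeline: it fits scikit-learn's RandomForestRegressor to a made-up monotonic RSSI/phase-vs-level model and reports the fraction of predictions within a 20% error band, mirroring the metric quoted above.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical calibration data: the backscattered signal shifts monotonically
# as the fluid column detunes the tag antenna, plus measurement noise.
level = rng.uniform(0, 100, 400)                       # fill level, %
rssi = -40 - 0.15 * level + rng.normal(0, 0.5, 400)    # dBm, synthetic model
phase = 1.0 + 0.01 * level + rng.normal(0, 0.05, 400)  # rad, synthetic model

X = np.column_stack([rssi, phase])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, level)
pred = model.predict(X)
within_20 = np.mean(np.abs(pred - level) <= 20)        # fraction within 20% error
```

A held-out split (rather than in-sample predictions, used here only for brevity) would be needed to reproduce the 62-98% figures the abstract reports.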

17.
J Med Internet Res ; 26: e57258, 2024 Aug 07.
Article in English | MEDLINE | ID: mdl-39110963

ABSTRACT

BACKGROUND: The integration of smart technologies, including wearables and voice-activated devices, is increasingly recognized for enhancing the independence and well-being of older adults. However, the long-term dynamics of their use and the coadaptation process with older adults remain poorly understood. This scoping review explores how interactions between older adults and smart technologies evolve over time to improve both user experience and technology utility. OBJECTIVE: This review synthesizes existing research on the coadaptation between older adults and smart technologies, focusing on longitudinal changes in use patterns, the effectiveness of technological adaptations, and the implications for future technology development and deployment to improve user experiences. METHODS: Following the Joanna Briggs Institute Reviewer's Manual and PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) guidelines, this scoping review examined peer-reviewed papers from databases including Ovid MEDLINE, Ovid Embase, PEDro, Ovid PsycINFO, and EBSCO CINAHL from the year 2000 to August 28, 2023, and included forward and backward searches. The search was updated on March 1, 2024. Empirical studies were included if they involved (1) individuals aged 55 years or older living independently and (2) focused on interactions and adaptations between older adults and wearables and voice-activated virtual assistants in interventions for a minimum period of 8 weeks. Data extraction was informed by the selection and optimization with compensation framework and the sex- and gender-based analysis plus theoretical framework and used a directed content analysis approach. RESULTS: The search yielded 16,143 papers. Following title and abstract screening and a full-text review, 5 papers met the inclusion criteria. 
Study populations were mostly female, aged 73-83 years, from the United States, and engaged with voice-activated virtual assistants accessed through smart speakers and wearables. Users frequently used simple commands related to music and weather, integrating devices into daily routines. However, communication barriers often led to frustration due to devices' inability to recognize cues or provide personalized responses. The findings suggest that while older adults can integrate smart technologies into their lives, a lack of customization and user-friendly interfaces hinders long-term adoption and satisfaction. The studies highlight the need for technology to be further developed so that it can better meet this demographic's evolving needs and call for research addressing small sample sizes and limited diversity. CONCLUSIONS: Our findings highlight a critical need for continued research into the dynamic and reciprocal relationship between smart technologies and older adults over time. Future studies should focus on more diverse populations and extend monitoring periods to provide deeper insights into the coadaptation process. Insights gained from this review are vital for informing the development of more intuitive, user-centric smart technology solutions to better support the aging population in maintaining independence and enhancing their quality of life. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): RR2-10.2196/51129.


Subject(s)
Wearable Electronic Devices , Humans , Aged , Middle Aged , Female , Male , Aged, 80 and over , Voice , Longitudinal Studies
18.
Heliyon ; 10(15): e34429, 2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39145001

ABSTRACT

IoT (Internet of Things)-based devices now help monitor different human behavioral aspects, including sleeping patterns, activity patterns, heart rate variability (HRV) patterns, location-based moving patterns, and blood oxygen levels. A correlative study of these patterns can be used to find linkages between behavioral patterns and human health conditions. To perform this task, a wide variety of models has been proposed by researchers, but most of them vary in terms of the parameters used, which limits their analysis accuracy. Moreover, most of these models are highly complex and have lower parameter flexibility, and thus cannot be scaled for real-time use cases. To overcome these issues, this paper proposes the design of a behavior modeling method that assists in future health predictions via multimodal feature correlations using medical IoT devices and deep transfer learning analysis. The proposed model initially collects large-scale sensor data about the subjects and correlates it with existing medical conditions. This correlation is done via the extraction of multidomain feature sets that assist in spectral analysis, entropy evaluations, scaling estimation, and window-based analysis. These multidomain feature sets are selected by a Firefly Optimizer (FFO) and used to train a Recurrent Neural Network (RNN) model that assists in the prediction of different diseases. These predictions are used to train a recommendation engine that uses Apriori and Fuzzy C-Means (FCM) to suggest corrective behavioral measures for a healthier lifestyle under real-time conditions. Due to these operations, the proposed model is able to improve behavior prediction accuracy by 16.4%, prediction precision by 8.3%, prediction AUC (area under the curve) by 9.5%, and corrective behavior recommendation accuracy by 3.9% when compared with existing methods under similar evaluation conditions.
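The Fuzzy C-Means step used in the recommendation engine can be written out in a few lines. This is the textbook FCM algorithm, not the paper's engine; the 2-D "behaviour features" are synthetic stand-ins for the multimodal features described above.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    # FCM alternates soft-membership and centroid updates, minimising
    # sum_ik u_ik^m * ||x_i - v_k||^2 with memberships summing to 1 per point.
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(1)
# Hypothetical 2-D behaviour features (say, mean HRV vs. activity level)
# forming two well-separated behavioural profiles.
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
centers, u = fuzzy_c_means(X)
```

Unlike hard k-means, the soft memberships in `u` let a recommendation engine grade how strongly a subject belongs to each behavioural profile.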

19.
PeerJ Comput Sci ; 10: e2231, 2024.
Article in English | MEDLINE | ID: mdl-39145209

ABSTRACT

In a modern digital landscape flooded with nearly endless cyber-security hazards, sophisticated intrusion detection systems (IDS) can become invaluable in defending against intricate security threats. The Sybil-Free Metric-based routing protocol for low power and lossy network (RPL) Trustworthiness Scheme (SF-MRTS) addresses the biggest threat to the routing protocol for low-power and lossy networks, the Sybil attack. Sybil attacks pose a significant security challenge for RPL networks, where an attacker can distort at least two-hop paths and disrupt network processes. Using a new way of calculating node reliability, we introduce an approach that evaluates parameters beyond routing metrics, such as energy conservation and actuality. SF-MRTS works towards achieving a trusted network by introducing such trust metrics on secure paths, making the network more likely to withstand attacks because of these security improvements. Simulations of SF-MRTS show that it satisfies the security risk management features necessary for maintaining the network's performance and stability. These mechanisms are based on the principles of game theory: they reward nodes that cooperate and penalise nodes that do not, preventing damage to the network and encouraging collaboration between the nodes. SF-MRTS is a security technique for defending emerging industrial Internet of Things (IoT) networks against attacks; it effectively guarantees reliability and improves the network's resilience in different scenarios.
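The reward-and-penalty mechanism can be made concrete with a tiny trust model. This is a hedged sketch, not SF-MRTS itself: the update constants and threshold are hypothetical, with the defection penalty deliberately larger than the cooperation reward so trust is slow to earn and quick to lose.

```python
def update_trust(trust, cooperated, reward=0.1, penalty=0.3):
    # Game-theoretic flavour: cooperation earns a small reward, defection
    # costs a larger penalty; trust is clamped to [0, 1].
    t = trust + reward if cooperated else trust - penalty
    return min(1.0, max(0.0, t))

def route_candidates(neighbours, threshold=0.5):
    # Only sufficiently trusted neighbours are eligible as next hops,
    # which starves low-trust (potentially Sybil) nodes of traffic.
    return [n for n, t in neighbours.items() if t >= threshold]

neighbours = {"a": 0.9, "b": 0.2, "c": 0.6}   # hypothetical trust table
eligible = route_candidates(neighbours)
```

Excluding low-trust nodes from routing is itself the penalty: a node that defects loses the traffic (and any associated payoff) it would otherwise relay.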

20.
PeerJ Comput Sci ; 10: e2145, 2024.
Article in English | MEDLINE | ID: mdl-39145228

ABSTRACT

The Internet of Things (IoT) is becoming more prevalent in our daily lives. A recent industry report projected the global IoT market to be worth more than USD 4 trillion by 2032. To cope with the ever-increasing IoT devices in use, identifying and securing IoT devices has become highly crucial for network administrators. In that regard, network traffic classification offers a promising solution by precisely identifying IoT devices to enhance network visibility, allowing better network security. Currently, most IoT device identification solutions revolve around machine learning, outperforming prior solutions such as port-based and behaviour-based identification. Although performant, these solutions often experience performance degradation over time due to statistical changes in the data. As a result, they require frequent retraining, which is computationally expensive. Therefore, this article aims to improve the model performance through a robust alternative feature set. The improved feature set leverages payload lengths to model the unique characteristics of IoT devices and remains stable over time. Besides that, this article utilizes the proposed feature set with Random Forest and OneVSRest to optimize the learning process, particularly concerning the easier addition of new IoT devices. In addition, this article introduces weekly dataset segmentation to ensure fair evaluation over different time frames. Evaluation on two datasets, a public dataset, IoT Traffic Traces, and a self-collected dataset, IoT-FSCIT, shows that the proposed feature set maintained above 80% accuracy throughout all weeks on the IoT Traffic Traces dataset, outperforming selected benchmark studies while improving accuracy over time by +10.13% on the IoT-FSCIT dataset.
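The Random Forest + one-vs-rest combination on payload-length features can be sketched as follows. The device classes, payload-length distributions, and window features are all invented for illustration; only the pipeline shape (per-window payload statistics into `OneVsRestClassifier(RandomForestClassifier)`) mirrors the approach described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)

def device_windows(mean_len, n):
    # Hypothetical device signature: payload lengths cluster around a
    # device-specific mean; each window of 20 packets is summarised by
    # simple payload-length statistics.
    lens = rng.normal(mean_len, 5, (n, 20))
    return np.column_stack([lens.mean(1), lens.std(1), lens.min(1), lens.max(1)])

X = np.vstack([device_windows(m, 60) for m in (40, 120, 300)])
y = np.repeat(["camera", "plug", "speaker"], 60)

clf = OneVsRestClassifier(RandomForestClassifier(n_estimators=50, random_state=0))
clf.fit(X, y)
acc = clf.score(X, y)
```

The one-vs-rest wrapper is what makes adding a new device cheap: a new class only requires training one additional binary forest rather than refitting a single multiclass model.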
