Results 1 - 20 of 655
1.
BMC Med Imaging ; 24(1): 123, 2024 May 27.
Article in English | MEDLINE | ID: mdl-38797827

ABSTRACT

The rapid spread of pandemic diseases has placed a growing burden on international health infrastructure. To combat pandemic diseases in smart cities, Artificial Intelligence of Things (AIoT) technology, based on the integration of artificial intelligence (AI) with the Internet of Things (IoT), is commonly used to promote efficient control and diagnosis during an outbreak, thereby minimizing possible losses. However, the presence of multi-source institutional data remains one of the major challenges hindering the practical usage of AIoT solutions for pandemic disease diagnosis. This paper presents a novel framework that utilizes multi-site data fusion to boost the accuracy of pandemic disease diagnosis. In particular, we focus on a case study of COVID-19 lesion segmentation, a crucial task for understanding disease progression and optimizing treatment strategies. In this study, we propose a novel multi-decoder segmentation network for efficient segmentation of infections from cross-domain CT scans in smart cities. The multi-decoder segmentation network leverages data from heterogeneous domains and utilizes strong learning representations to accurately segment infections. Performance evaluation of the multi-decoder segmentation network was conducted on three publicly accessible datasets, demonstrating robust results with an average dice score of 89.9% and an average surface dice of 86.87%. To address scalability and latency issues associated with centralized cloud systems, fog computing (FC) emerges as a viable solution. FC brings resources closer to the operator, offering low latency and energy-efficient data management and processing. In this context, we propose a unique FC technique called PANDFOG to deploy the multi-decoder segmentation network on edge nodes for practical and clinical applications of automated COVID-19 pneumonia analysis. The results of this study highlight the efficacy of the multi-decoder segmentation network in accurately segmenting infections from cross-domain CT scans. Moreover, the proposed PANDFOG system demonstrates the practical deployment of the multi-decoder segmentation network on edge nodes, providing real-time access to COVID-19 segmentation findings for improved patient monitoring and clinical decision-making.
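For readers unfamiliar with the evaluation metric quoted above, the short sketch below shows how a per-scan Dice coefficient is commonly computed from binary lesion masks. It is a generic illustration with made-up arrays, not the authors' evaluation code.

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Soerensen-Dice coefficient between two binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Example: two small hypothetical lesion masks
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[0, 1, 1, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(f"Dice: {dice_score(pred, target):.3f}")  # ~0.857
```

Averaging this value over all scans in a dataset yields figures comparable to the 89.9% reported above.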


Subject(s)
COVID-19, Deep Learning, Pandemics, X-Ray Computed Tomography, Humans, X-Ray Computed Tomography/methods, SARS-CoV-2, Cities, Internet of Things
2.
Sensors (Basel) ; 24(11)2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38894363

ABSTRACT

The inability to see makes moving around very difficult for visually impaired persons, and their limited mobility also makes it harder for them to protect themselves against both moving and stationary objects. Given the substantial rise in the population of those with vision impairments in recent years, an increasing amount of research has been devoted to the development of assistive technologies. This review paper highlights the state-of-the-art assistive technology, tools, and systems for improving the daily lives of visually impaired people. Multi-modal mobility assistance solutions are also evaluated for both indoor and outdoor environments. Lastly, an analysis of several approaches is provided, along with recommendations for the future.


Subject(s)
Self-Help Devices, Visually Impaired Persons, Humans, Visually Impaired Persons/rehabilitation
3.
Sensors (Basel) ; 24(7)2024 Mar 29.
Article in English | MEDLINE | ID: mdl-38610425

ABSTRACT

The Internet of Things (IoT) has revolutionized the world, connecting billions of devices that offer assistance in various aspects of users' daily lives. Context-aware IoT applications exploit real-time environmental, user-specific, or situational data to dynamically adapt to users' needs, offering tailored experiences. In particular, Location-Based Services (LBS) exploit geographical information to adapt to environmental settings or provide recommendations based on users' and nodes' positions, thus delivering efficient and personalized services. To this end, there is growing interest in developing IoT localization systems within the scientific community. In addition, due to the sensitivity and privacy inherent to precise location information, LBS introduce new security challenges. To ensure a more secure and trustworthy system, researchers are studying how to prevent vulnerabilities and mitigate risks from the early design stages of LBS-empowered IoT applications. The goal of this study is to carry out an in-depth examination of localization techniques for IoT, with an emphasis on both the signal-processing design and security aspects. The investigation focuses primarily on active radio localization techniques, classifying them into range-based and range-free algorithms, while also exploring hybrid approaches. Next, security considerations are explored in depth, examining the main attacks for each localization technique and linking them to the most interesting solutions proposed in the literature. By highlighting advances, analyzing challenges, and providing solutions, the survey aims to guide researchers in navigating the complex IoT localization landscape.
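As a concrete instance of the range-based class of techniques this survey covers, the sketch below estimates distance from a received signal strength reading using the standard log-distance path-loss model. The calibration power and path-loss exponent are illustrative assumptions, not values from the paper.

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Estimate distance (m) from RSSI with the log-distance path-loss model:
    RSSI = tx_power - 10 * n * log10(d), where tx_power is the RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a beacon measured at -75 dBm with an assumed -59 dBm reference at 1 m
print(f"Estimated range: {rssi_to_distance(-75.0):.1f} m")  # ~6.3 m
```

Range-based localization then combines several such distance estimates (e.g., by trilateration), whereas range-free schemes rely only on connectivity or hop counts.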

4.
Sensors (Basel) ; 24(8)2024 Apr 14.
Article in English | MEDLINE | ID: mdl-38676127

ABSTRACT

The Internet of Things (IoT) will bring about the next industrial revolution in Industry 4.0. The communication capabilities of IoT devices are among the most critical factors in selecting a device that is suitable for a given use. Thus far, the physical-layer communication challenges of IoT have been met with various communication protocols, each offering different strengths and weaknesses. This paper summarizes the network architectures of some of the most popular IoT wireless communication protocols. It also presents a comparative analysis of some of their critical features, including power consumption, coverage, data rate, security, cost, and quality of service (QoS). This comparative study shows that low-power wide area network (LPWAN)-based IoT protocols (LoRa, Sigfox, NB-IoT, LTE-M) are more suitable for future industrial applications because of their energy efficiency, high coverage, and cost efficiency. In addition, the study presents an Industrial Internet of Things (IIoT) application perspective on the suitability of LPWAN protocols in a particular scenario and addresses some open issues that need to be researched. Thus, this study can assist in selecting the most suitable IoT communication protocol for industrial and production environments.

5.
Sensors (Basel) ; 24(9)2024 Apr 24.
Article in English | MEDLINE | ID: mdl-38732793

ABSTRACT

During the implementation of the Internet of Things (IoT), the performance of communication and sensing antennas that are embedded in smart surfaces or smart devices can be affected by objects in their reactive near field due to detuning and antenna mismatch. Matching networks have been proposed to re-establish impedance matching when antennas become detuned due to environmental factors. In this work, the change in the reflection coefficient at the antenna, due to the presence of objects, is first characterized as a function of the frequency and object distance by applying Gaussian process regression on experimental data. Based on this characterization, for random object positions, it is shown through simulation that a dynamic environment can lower the reliability of a matching network by up to 90%, depending on the type of object, the probability distribution of the object distance, and the required bandwidth. As an alternative to complex and power-consuming real-time adaptive matching, a new, resilient network tuning strategy is proposed that takes into account these random variations. This new approach increases the reliability of the system by 10% to 40% in these dynamic environment scenarios.
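A minimal sketch of the characterization step described above, assuming scikit-learn's Gaussian process regressor and a handful of invented (frequency, object distance) measurements of the reflection coefficient. The kernel choice and data values are placeholders, not the paper's experimental setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical training data: (frequency [GHz], object distance [mm]) -> |S11| in dB
X = np.array([[2.40, 5], [2.40, 20], [2.45, 5], [2.45, 20], [2.50, 5], [2.50, 20]])
y = np.array([-6.0, -14.0, -8.5, -18.0, -5.5, -12.0])

# Anisotropic RBF kernel: separate length scales for frequency and distance
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.05, 10.0])
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict the reflection coefficient for an unseen (frequency, distance) pair
mean, std = gpr.predict(np.array([[2.44, 12]]), return_std=True)
print(f"Predicted |S11|: {mean[0]:.1f} dB (+/- {std[0]:.1f} dB)")
```

Once such a surrogate model is available, the reliability of a fixed matching network can be estimated by sampling object distances from an assumed probability distribution and checking how often the predicted mismatch stays within the required bandwidth.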

6.
Sensors (Basel) ; 24(9)2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38732961

ABSTRACT

Wireless Sensor Networks (WSNs) are crucial in various fields, including health care monitoring, battlefield surveillance, and smart agriculture. However, WSNs are susceptible to malicious attacks due to the massive number of sensors within them. Hence, a trust evaluation framework is needed within WSNs to keep the system secure by identifying and isolating malicious or faulty sensor nodes, so that neighboring nodes can avoid collaborating with them in tasks like data aggregation and forwarding. While numerous trust frameworks have been suggested in the literature to assess trust scores and examine the reliability of sensors through direct and indirect communications, implementing these trust evaluation criteria is challenging due to the intricate nature of the trust evaluation process and the limited availability of datasets. This research conducts a novel comparative analysis of three trust management models: "Lightweight Trust Management based on Bayesian and Entropy (LTMBE)", "Beta-based Trust and Reputation Evaluation System (BTRES)", and "Lightweight and Dependable Trust System (LDTS)". To assess the practicality of these trust management models, we compare and examine their performance in multiple scenarios. Additionally, we assess and compare how well the trust management approaches perform in response to two significant cyber-attacks. Based on the experimental comparative analysis, it can be inferred that the LTMBE model is optimal for WSN applications emphasizing high energy efficiency, while the BTRES model is most suitable for WSN applications prioritizing critical security measures. The conducted empirical comparative analysis can act as a benchmark for upcoming research on trust evaluation frameworks for WSNs.
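To make the beta-based reputation idea behind BTRES concrete, here is a minimal sketch of an expected-trust computation from counts of cooperative and uncooperative interactions. The counts and the 0.5 isolation threshold are invented for illustration and do not reproduce any of the three compared models.

```python
def beta_trust(successful: int, failed: int) -> float:
    """Expected trust value under a Beta(alpha, beta) reputation model,
    with alpha = successful interactions + 1 and beta = failed interactions + 1."""
    alpha = successful + 1
    beta = failed + 1
    return alpha / (alpha + beta)

# A neighbor observed over 50 interactions: 46 correct forwards, 4 dropped packets
print(f"Trust score: {beta_trust(46, 4):.3f}")  # ~0.904

# A node could be flagged once its trust falls below a chosen threshold, e.g. 0.5
print("malicious" if beta_trust(10, 35) < 0.5 else "trusted")  # malicious
```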

7.
Sensors (Basel) ; 24(9)2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38732972

ABSTRACT

The escalating demand for versatile wireless devices has fostered the need to reduce the antenna footprint to support the integration of multiple new functionalities. This poses a significant challenge for Internet of Things (IoT) antenna designers tasked with creating antennas capable of supporting multiband operation within physical constraints. This work aims to address this challenge by focusing on the optimization of an antenna booster element to achieve multiband performance, accomplished through the design of a band-reject filter. The proposal entails a printed circuit board (PCB) measuring 142 mm × 60 mm, with a clearance area of 12 mm × 40 mm, incorporating an antenna booster element of 30 mm × 3 mm × 1 mm (0.07 λ). This configuration covers frequencies in the LFR (low-frequency range) from 698 MHz to 960 MHz and the HFR (high-frequency range) from 1710 MHz to 2690 MHz. A theoretical analysis is conducted to optimize bandwidth in both frequency regions. Finally, a prototype validates the analytical results.

8.
Sensors (Basel) ; 24(11)2024 May 22.
Article in English | MEDLINE | ID: mdl-38894095

ABSTRACT

The revolution of the Internet of Things (IoT) and the Web of Things (WoT) has brought new opportunities and challenges for the information retrieval (IR) field. The exponential growth in the number of interconnected physical objects and real-time data acquisition requires new approaches and architectures for IR systems. Research and prototypes can be crucial in designing and developing new systems and refining architectures for IR in the WoT. This paper proposes a unified and holistic approach for IR in the WoT, called IR.WoT. The proposed system contemplates the critical indexing, scoring, and presentation stages applied to selected smart-city use cases and scenarios. Overall, this paper describes the research, architecture, and vision for advancing the field of IR in the WoT and addresses some of the remaining challenges and opportunities in this exciting area. The article also describes the design considerations, cloud implementation, and experimentation based on a simulated collection of synthetic XML documents with technical efficiency measures. The experimental results show promising outcomes, although further studies are required to improve the effectiveness of IR.WoT, considering the dynamic characteristics of the WoT and, more importantly, the heterogeneity and divergence of WoT modeling proposals in the IR domain.
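The abstract does not specify how IR.WoT implements its indexing and scoring stages, so the sketch below only illustrates the generic pipeline it names (index, score, present) with a TF-IDF/cosine ranking over a tiny invented corpus of WoT resource descriptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Tiny stand-in corpus for WoT device/service descriptions
docs = [
    "temperature sensor living room smart home",
    "air quality sensor downtown smart city",
    "parking occupancy sensor city center",
]
query = ["available parking city center"]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)                           # indexing stage
scores = cosine_similarity(vectorizer.transform(query), doc_matrix)   # scoring stage
ranking = scores.argsort()[0][::-1]
print("ranked documents:", ranking.tolist())                          # presentation stage (best first)
```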

9.
Sensors (Basel) ; 24(2)2024 Jan 12.
Article in English | MEDLINE | ID: mdl-38257587

ABSTRACT

Traditional aquaculture systems are challenged by the high levels of total ammoniacal nitrogen (TAN) produced, which can harm aquatic life. As demand for global fish production continues to increase, farmers should adopt recirculating aquaculture systems (RAS) equipped with biofilters to improve the water quality of the culture. The biofilter plays a crucial role in ammonia removal; therefore, a biofilter such as a moving bed biofilm reactor (MBBR) is usually used in the RAS to reduce ammonia. However, the disadvantage of biofilter operation is that it requires an automatic system with water quality monitoring and control to ensure optimal performance. This study therefore focuses on developing an Internet of Things (IoT) system to monitor and control water quality to achieve optimal biofilm performance in a laboratory-scale MBBR. From 35 days into the experiment, water quality was maintained by on/off control of an aerator to provide oxygen levels suitable for the aquatic environment while monitoring the pH, temperature, and total dissolved solids (TDS). When the amount of dissolved oxygen (DO) in the MBBR was optimal, the highest TAN removal efficiency was 50%, with the biofilm thickness reaching 119.88 µm. Forthcoming applications of the IoT water quality monitoring and control system in the MBBR will enable farmers to set up a system in the RAS that can perform real-time measurements, alerts, and adjustments of critical water quality parameters such as TAN levels.
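The abstract mentions on/off control of the aerator based on dissolved oxygen. A common way to implement such a rule is simple hysteresis control, sketched below with invented DO thresholds rather than the study's actual set points.

```python
def aerator_command(dissolved_oxygen_mg_l: float, aerator_on: bool,
                    low_threshold: float = 4.0, high_threshold: float = 6.0) -> bool:
    """Hysteresis (bang-bang) control for the aerator: switch on when DO drops
    below the low threshold, switch off above the high threshold."""
    if dissolved_oxygen_mg_l < low_threshold:
        return True
    if dissolved_oxygen_mg_l > high_threshold:
        return False
    return aerator_on  # inside the band: keep the current state

state = False
for do in [6.5, 5.2, 3.8, 4.5, 6.3]:   # a hypothetical stream of DO readings
    state = aerator_command(do, state)
    print(f"DO={do} mg/L -> aerator {'ON' if state else 'OFF'}")
```

The dead band between the two thresholds prevents the aerator from chattering on and off around a single set point.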


Subject(s)
Ammonia, Internet of Things, Animals, Biofilms, Bioreactors, Water Quality, Oxygen
10.
Sensors (Basel) ; 24(4)2024 Feb 10.
Article in English | MEDLINE | ID: mdl-38400323

ABSTRACT

In the era of continuous development in Internet of Things (IoT) technology, smart services are penetrating various facets of societal life, leading to a growing demand for interconnected devices. Many contemporary devices are no longer mere data producers but also consumers of data. As a result, massive amounts of data are transmitted to the cloud, but the latency generated in edge-to-cloud communication is unacceptable for many tasks. In response, this paper introduces a novel contribution: a layered computing network built on the principles of fog computing, accompanied by a newly devised algorithm designed to optimize user tasks and allocate computing resources within rechargeable networks. The proposed algorithm, a synergy of Lyapunov-based control, dynamic Long Short-Term Memory (LSTM) networks, and Particle Swarm Optimization (PSO), allows for predictive task allocation. The fog servers dynamically train LSTM networks to effectively forecast the data features of user tasks, facilitating proper offloading decisions based on task priorities. In response to the challenge of edge-device hardware upgrading more slowly than user demands grow, the algorithm optimizes the utilization of low-power devices and addresses performance limitations. Additionally, this paper considers the unique characteristics of rechargeable networks, where computing nodes acquire energy through charging. Utilizing Lyapunov functions for dynamic resource control enables nodes with abundant resources to maximize their potential, significantly reducing energy consumption and enhancing overall performance. The simulation results demonstrate that our algorithm surpasses traditional methods in terms of energy efficiency and resource allocation optimization. Despite the limited prediction accuracy of the fog servers (FS), the proposed approach still significantly improves overall performance, enhancing the efficiency and user experience of Internet of Things systems in terms of latency and energy consumption.
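The full algorithm couples Lyapunov-based control, LSTM forecasting, and PSO; the sketch below only illustrates the PSO ingredient on a toy task-allocation cost function. The swarm coefficients and the cost itself are invented for demonstration and are not the paper's formulation.

```python
import numpy as np

def pso_minimize(cost, dim, n_particles=30, iters=100, bounds=(0.0, 1.0), seed=0):
    """Minimal particle swarm optimization with a global-best topology."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Toy stand-in for a task-allocation cost: weighted latency plus energy of a split vector
cost = lambda x: 0.6 * np.sum((x - 0.3) ** 2) + 0.4 * np.sum(x)
best, best_cost = pso_minimize(cost, dim=4)
print(best, best_cost)
```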

11.
Sensors (Basel) ; 24(4)2024 Feb 11.
Article in English | MEDLINE | ID: mdl-38400338

ABSTRACT

In order to achieve the Sustainable Development Goals (SDGs), it is imperative to ensure the safety of drinking water. Each source of drinking water has unique characteristics, encompassing taste, aroma, and appearance. Inadequate water infrastructure and treatment can affect these features and may also threaten public health. This study utilizes the Internet of Things (IoT) to develop a monitoring system, particularly for water quality, to reduce the risk of contracting diseases. Water quality data, such as water temperature, alkalinity or acidity, and contaminants, were obtained through a series of linked sensors. An Arduino microcontroller board acquired the data, and Narrowband IoT (NB-IoT) transmitted them to the web server. Because human resources to observe the water quality physically are limited, the monitoring was complemented by real-time notification alerts via a text messaging application. The water quality data were monitored using Grafana in web mode, and binary machine learning classifiers were applied to predict whether the water was drinkable based on the data collected, which were stored in a database. Both decision tree and non-decision-tree models were evaluated within the artificial intelligence framework. With 60% of the data used for training, 20% for validation, and 10% for testing, the decision tree (DT) model outperformed the Gradient Boosting (GB), Random Forest (RF), Neural Network (NN), and Support Vector Machine (SVM) approaches. Through this monitoring and prediction of results, the authorities can sample the water sources every two weeks.
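As a rough illustration of the classification step, the sketch below trains a decision tree on synthetic temperature/pH/TDS readings with a train/validation/test split. The features, labeling rule, and 60/20/20 partition are invented stand-ins for the study's sensor data and its reported 60/20/10 split.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical sensor readings: [temperature (C), pH, TDS (ppm)] and potability labels
rng = np.random.default_rng(42)
X = rng.uniform([20.0, 5.5, 50.0], [35.0, 9.0, 1200.0], size=(500, 3))
y = ((X[:, 1] > 6.5) & (X[:, 1] < 8.5) & (X[:, 2] < 500)).astype(int)

# Split once for training, then split the remainder into validation and test sets
X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=0.6, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```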


Subject(s)
Drinking Water, Internet of Things, Humans, Artificial Intelligence, Cloud Computing, Data Accuracy
12.
Sensors (Basel) ; 24(5)2024 Feb 22.
Article in English | MEDLINE | ID: mdl-38474942

ABSTRACT

It is well known that buildings have a sizeable energy and environmental footprint. In environments like university campuses in particular, the occupants, as well as occupancy in shared spaces, vary over time. Centrally controlled cooling systems in such environments are typically threshold driven, do not account for occupant feedback, and thus often rely on a reactive approach (fixing problems after they are identified). Therefore, a fixed thermal operating set point may not be optimal in such cases, both from an occupant comfort and well-being perspective and from an energy efficiency perspective. To address this issue, a study was conducted involving the development and deployment of an experimental Internet of Things (IoT) prototype system and an Android application that facilitated occupant engagement on a university campus located in the UAE, which typically experiences hot climatic conditions. This paper showcases data-driven insights obtained from this study and, in particular, how to achieve a balance between the conflicting goals of improving occupant comfort and energy efficiency. Findings from this study underscore the need for regular reassessments and adaptation. The proposed solution is low cost and easy to deploy and has the potential to reap significant savings through a reduction in energy consumption, with estimates indicating around 50-100 kWh/day of savings per building, and a corresponding reduction in environmental impact. These findings will appeal to stakeholders who are keen to improve energy efficiency and reduce their operating expenses and environmental footprint under such climatic conditions. Furthermore, collective action from a large number of entities could produce significant impact through this cumulative effect.

13.
Sensors (Basel) ; 24(8)2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38676156

ABSTRACT

The Internet of Things (IoT) includes billions of sensors and actuators (which we refer to as IoT devices) that harvest data from the physical world and send it via the Internet to IoT applications to provide smart IoT services and products. Deploying, managing, and maintaining IoT devices for the exclusive use of an individual IoT application is inefficient and involves significant costs and effort that often outweigh the benefits. On the other hand, enabling large numbers of IoT applications to share available third-party IoT devices, which are deployed and maintained independently by a variety of IoT device providers, reduces IoT application development costs, time, and effort. To achieve a positive cost/benefit ratio, there is a need to support the sharing of third-party IoT devices globally by providing effective IoT device discovery, use, and pay between IoT applications and third-party IoT devices. A solution for global IoT device sharing must be the following: (1) scalable to support a vast number of third-party IoT devices, (2) interoperable to deal with the heterogeneity of IoT devices and their data, and (3) IoT-owned, i.e., not owned by a specific individual or organization. This paper surveys existing techniques that support discovering, using, and paying for third-party IoT devices. To ensure that this survey is comprehensive, this paper presents our methodology, which is inspired by Systematic Literature Network Analysis (SLNA), combining the Systematic Literature Review (SLR) methodology with Citation Network Analysis (CNA). Finally, this paper outlines the research gaps and directions for novel research to realize global IoT device sharing.

14.
Sensors (Basel) ; 24(10)2024 May 18.
Article in English | MEDLINE | ID: mdl-38794074

ABSTRACT

Stress is a natural yet potentially harmful aspect of human life, necessitating effective management, particularly during overwhelming experiences. This paper presents a scoping review of personalized stress detection models using wearable technology. Employing the PRISMA-ScR framework for rigorous methodological structuring, we systematically analyzed literature from key databases including Scopus, IEEE Xplore, and PubMed. Our focus was on biosignals, AI methodologies, datasets, wearable devices, and real-world implementation challenges. The review presents an overview of stress and its biological mechanisms, details the methodology for the literature search, and synthesizes the findings. It shows that biosignals, especially EDA and PPG, are frequently utilized for stress detection and demonstrate potential reliability in multimodal settings. Evidence for a trend towards deep learning models was found, although the limited comparison with traditional methods calls for further research. Concerns arise regarding the representativeness of datasets and practical challenges in deploying wearable technologies, which include issues related to data quality and privacy. Future research should aim to develop comprehensive datasets and explore AI techniques that are not only accurate but also computationally efficient and user-centric, thereby closing the gap between theoretical models and practical applications to improve the effectiveness of stress detection systems in real scenarios.


Subject(s)
Wearable Electronic Devices, Humans, Psychological Stress/diagnosis, Biosensing Techniques/methods
15.
Sensors (Basel) ; 24(11)2024 Jun 06.
Article in English | MEDLINE | ID: mdl-38894471

ABSTRACT

The integration of cutting-edge technologies such as the Internet of Things (IoT), robotics, and machine learning (ML) has the potential to significantly enhance the productivity and profitability of traditional fish farming. Farmers using traditional fish farming methods incur enormous economic costs owing to labor-intensive schedule monitoring and care, illnesses, and sudden fish deaths. Another ongoing issue is automated fish species recommendation based on water quality. Effective monitoring of abrupt changes in water quality may minimize daily operating costs and boost fish productivity, while an accurate automatic fish recommender may aid the farmer in selecting profitable fish species for farming. In this paper, we present AquaBot, an IoT-based system that can automatically collect, monitor, and evaluate water quality and recommend appropriate fish to farm depending on the values of various water quality indicators. A mobile robot has been designed to collect parameter values such as pH, temperature, and turbidity from all around the pond. To facilitate monitoring, we have developed web and mobile interfaces. For the analysis and recommendation of suitable fish based on water quality, we have trained and tested several ML algorithms, including the proposed custom ensemble model, random forest (RF), support vector machine (SVM), decision tree (DT), K-nearest neighbor (KNN), logistic regression (LR), bagging, boosting, and stacking, on a real-time pond water dataset. The dataset has been preprocessed with feature scaling and dataset balancing, and the algorithms have been evaluated using several performance metrics. In our experiment, the proposed ensemble model delivered the best result, with 94% accuracy, 94% precision, 94% recall, a 94% F1-score, a 93% MCC, and the best AUC score for multi-class classification. Finally, we have deployed the best-performing model in a web interface to provide cultivators with recommendations for suitable fish farming. The proposed system is projected not only to boost production and save money but also to reduce the time and intensity of the producer's manual labor.
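The custom ensemble itself is not detailed in the abstract. As a hedged stand-in, the sketch below builds a generic soft-voting ensemble over a few of the base learners mentioned, trained on synthetic data in place of the pond-water dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the pond-water dataset: 4 water-quality features, 3 fish classes
X, y = make_classification(n_samples=600, n_features=4, n_informative=3,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1,
                           random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)

# A simple soft-voting ensemble over heterogeneous base learners
ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=7)),
                ("knn", KNeighborsClassifier(n_neighbors=5)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",
).fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```

Soft voting averages the class probabilities of the base learners; stacking, also mentioned above, would instead feed those probabilities into a second-level meta-model.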


Subject(s)
Machine Learning, Ponds, Water Quality, Animals, Fishes, Algorithms, Environmental Monitoring/methods, Support Vector Machine, Aquaculture/methods, Internet of Things, Fisheries
16.
Sensors (Basel) ; 24(6)2024 Mar 21.
Article in English | MEDLINE | ID: mdl-38544266

ABSTRACT

With the development of IoT technology and 5G massive machine-type communication, the 3GPP standardization body has deemed the integration of Narrowband Internet of Things (NB-IoT) into low Earth orbit (LEO) satellite-based architectures viable. However, the LEO satellite channel introduces new challenges for the NB-IoT random access procedures and coverage enhancement mechanism. In this paper, an Adaptive Coverage Enhancement (ACE) method is proposed to meet the random access parameter configuration requirements of diverse applications. Based on stochastic geometry theory, an expression for the random access channel (RACH) success probability is derived for LEO satellite-based NB-IoT networks. On the basis of a power consumption model of the NB-IoT terminal, a multi-objective optimization problem is formulated to trade off RACH success probability against power consumption. To solve this multi-objective optimization problem, we employ the Non-dominated Sorting Genetic Algorithm II (NSGA-II) to obtain the Pareto-front solution set. According to different application requirements, we also design a random access parameter configuration method that minimizes power consumption under the constraint of the required RACH success probability. Simulation results show that the maximum number of repetitions and the back-off window size have a great influence on system performance, and their value ranges should be set within [4, 18] and [0, 2048], respectively. The power consumption of coverage enhancement with ACE is about 58% lower than that of the 3GPP proposed model. Together, these results provide a useful reference for the large-scale deployment of NB-IoT in LEO satellite networks.
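The paper solves the trade-off with NSGA-II over stochastic-geometry-derived expressions; the sketch below only illustrates the underlying Pareto-dominance idea with a brute-force filter over a handful of invented (repetitions, back-off window) candidates and made-up objective values.

```python
import numpy as np

def pareto_front(objs: np.ndarray) -> np.ndarray:
    """Indices of non-dominated rows, assuming both objectives are minimized."""
    non_dominated = []
    for i in range(objs.shape[0]):
        dominated = any(
            j != i and np.all(objs[j] <= objs[i]) and np.any(objs[j] < objs[i])
            for j in range(objs.shape[0])
        )
        if not dominated:
            non_dominated.append(i)
    return np.array(non_dominated)

# Candidate configurations: (max repetitions, back-off window), with hypothetical objectives
# objective 1: 1 - RACH success probability (minimize), objective 2: energy per access (mJ)
configs = np.array([[4, 0], [8, 256], [10, 768], [12, 512], [16, 1024], [18, 2048]])
objs = np.array([[0.20, 3.1], [0.12, 4.0], [0.08, 8.5], [0.07, 5.6], [0.05, 7.9], [0.04, 10.2]])

for idx in pareto_front(objs):
    print(f"reps={configs[idx, 0]:>2}, backoff={configs[idx, 1]:>4} -> "
          f"miss prob {objs[idx, 0]:.2f}, energy {objs[idx, 1]:.1f} mJ")
```

NSGA-II searches this kind of front with evolutionary operators instead of enumerating all candidates, which matters once the configuration space becomes large.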

17.
Sensors (Basel) ; 24(4)2024 Feb 19.
Article in English | MEDLINE | ID: mdl-38400504

ABSTRACT

Addressing the increasing demand for remote patient monitoring, especially among the elderly and mobility-impaired, this study proposes the "ScalableDigitalHealth" (SDH) framework. The framework integrates smart digital health solutions with latency-aware edge computing autoscaling, providing a novel approach to remote patient monitoring. By leveraging IoT technology and application autoscaling, SDH enables the real-time tracking of critical health parameters, such as ECG, body temperature, blood pressure, and oxygen saturation. These vital metrics are efficiently transmitted in real time to AWS cloud storage through a layered networking architecture. The contributions are two-fold: (1) establishing real-time remote patient monitoring and (2) developing a scalable architecture that features latency-aware horizontal pod autoscaling for containerized healthcare applications. The architecture incorporates a scalable IoT-based design and an innovative microservice autoscaling strategy in edge computing, driven by dynamic latency thresholds and enhanced by the integration of custom metrics. This work ensures heightened accessibility, cost-efficiency, and rapid responsiveness to patient needs, marking a significant leap forward in the field. By dynamically adjusting pod numbers based on latency, the system optimizes responsiveness, particularly in edge computing's proximity-based processing. This innovative fusion of technologies not only revolutionizes remote healthcare delivery but also enhances Kubernetes performance, preventing unresponsiveness during high usage.
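The abstract describes latency-aware horizontal pod autoscaling driven by custom metrics. The sketch below mirrors the standard Kubernetes HPA scaling rule applied to an observed latency metric; it is a simplified illustration, not the SDH implementation, and the target latency and replica bounds are assumptions.

```python
import math

def desired_replicas(current_replicas: int, observed_latency_ms: float,
                     target_latency_ms: float, max_replicas: int = 10,
                     min_replicas: int = 1) -> int:
    """HPA-style scaling rule driven by a custom latency metric:
    desired = ceil(current * observed / target), clamped to [min, max]."""
    desired = math.ceil(current_replicas * observed_latency_ms / target_latency_ms)
    return max(min_replicas, min(max_replicas, desired))

# Example: 3 pods, observed p95 latency 240 ms against a 150 ms target -> scale out
print(desired_replicas(3, 240.0, 150.0))  # 5
```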


Subject(s)
Awareness, Benchmarking, Aged, Humans, Blood Pressure, Body Temperature, Physiological Monitoring
18.
Sensors (Basel) ; 24(4)2024 Feb 06.
Article in English | MEDLINE | ID: mdl-38400208

ABSTRACT

In today's competitive landscape, achieving customer-centricity is paramount for the sustainable growth and success of organisations. This research is dedicated to understanding customer preferences in the context of the Internet of Things (IoT) and employs a two-part modeling approach tailored to this digital era. In the first phase, we leverage the power of the self-organizing map (SOM) algorithm to segment IoT customers based on their connected device usage patterns. This segmentation approach reveals three distinct customer clusters, with the second cluster demonstrating the highest propensity for IoT device adoption and usage. In the second phase, we introduce a robust decision tree methodology designed to prioritize various factors influencing customer satisfaction in the IoT ecosystem. We employ the classification and regression tree (CART) technique to analyze 17 key questions that assess the significance of factors impacting IoT device purchase decisions. By aligning these factors with the identified IoT customer clusters, we gain profound insights into customer behaviour and preferences in the rapidly evolving world of connected devices. This comprehensive analysis delves into the factors contributing to customer retention in the IoT space, with a strong emphasis on crafting logical marketing strategies, enhancing customer satisfaction, and fostering customer loyalty in the digital realm. Our research methodology involves surveys and questionnaires distributed to 207 IoT users, categorizing them into three distinct IoT customer groups. Leveraging analytical statistical methods, regression analysis, and IoT-specific tools and software, this study rigorously evaluates the factors influencing IoT device purchases. Importantly, this approach not only effectively clusters the IoT customer relationship management (IoT-CRM) dataset but also provides valuable visualisations that are essential for understanding the complex dynamics of the IoT customer landscape. Our findings underscore the critical role of logical marketing strategies, customer satisfaction, and customer loyalty in enhancing customer retention in the IoT era. This research offers a significant contribution to businesses seeking to optimize their IoT-CRM strategies and capitalize on the opportunities presented by the IoT ecosystem.
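A rough sketch of the two-phase approach described above, using synthetic respondent data. K-means is shown here as a simple stand-in for the SOM clustering step, and the 17 survey items and usage features are invented placeholders rather than the study's questionnaire.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Hypothetical usage features for 207 respondents: devices owned, daily hours, apps used
usage = rng.normal(loc=[3.0, 2.5, 6.0], scale=[1.5, 1.0, 3.0], size=(207, 3))

# Phase 1: segment customers (k-means shown as a stand-in for the SOM step)
clusters = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(usage)

# Phase 2: a CART tree relating 17 survey items to the segments; its feature
# importances indicate which factors best separate the customer groups
survey = rng.integers(1, 6, size=(207, 17)).astype(float)
cart = DecisionTreeClassifier(criterion="gini", max_depth=4, random_state=1)
cart.fit(survey, clusters)
top = np.argsort(cart.feature_importances_)[::-1][:5]
print("most discriminative survey questions (0-indexed):", top)
```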


Subject(s)
Consumer Behavior, Internet of Things, Commerce, Software, Surveys and Questionnaires, Humans
19.
Sensors (Basel) ; 24(18)2024 Sep 21.
Article in English | MEDLINE | ID: mdl-39338858

ABSTRACT

In this paper, we present the development and evaluation of a contextually relevant, cost-effective, multihop cluster-based agricultural Internet of Things (MCA-IoT) network. This network utilizes commercial off-the-shelf (COTS) Bluetooth Low-Energy (BLE) and LoRa communication technologies, along with the Raspberry Pi 3 Model B+ (RPi 3 B+), to address the challenges of climate change-induced global food insecurity in smart farming applications. Employing the lean engineering design approach, we initially implemented a centralized cluster-based agricultural IoT (CA-IoT) hardware testbed incorporating BLE, RPi 3 B+, STEMMA soil moisture sensors, UM25 m, and LoPy low-power Wi-Fi modules. This system was subsequently adapted and refined to assess the performance of the MCA-IoT network. This study offers a comprehensive reference on the novel, location-independent MCA-IoT technology, including detailed design and deployment insights for the agricultural IoT (Agri-IoT) community. The proposed solution demonstrated favorable performance in indoor and outdoor environments, particularly in water-stressed regions of Northern Ghana. Performance evaluations revealed that the MCA-IoT technology is easy to deploy and manage by users with limited expertise, is location-independent, robust, energy-efficient for battery operation, and scalable in terms of task and size, thereby providing a versatile range of measurements for future applications. Our results further demonstrated that the most effective approach to utilizing existing IoT-based communication technologies within a typical farming context in sub-Saharan Africa is to integrate them.

20.
Sensors (Basel) ; 24(15)2024 Jul 23.
Article in English | MEDLINE | ID: mdl-39123812

ABSTRACT

Maintaining security in communication networks has long been a major concern. This issue has become increasingly crucial due to the emergence of new communication architectures like the Internet of Things (IoT) and the growing sophistication of infiltration techniques. Previous intrusion detection systems (IDSs), which often use a centralized design to identify threats, are now ineffective for networks based on the Internet of Things. To resolve these issues, this study presents a novel and cooperative approach to IoT intrusion detection that may be useful in addressing certain current security issues. The suggested approach uses Black Hole Optimization (BHO) to choose the most important attributes that best describe the communication between objects. Additionally, a novel matrix-based method for describing the network's communication properties is put forward. These two feature sets form the inputs of the suggested intrusion detection model. The suggested technique splits the network into a number of subnets using a software-defined network (SDN). Each subnet is monitored by a controller node, which uses a parallel combination of convolutional neural networks (PCNN) to determine the presence of security threats in the traffic passing through its subnet. The proposed method also uses majority voting for the cooperation of controller nodes in order to detect attacks more accurately. The findings demonstrate that, in comparison to prior approaches, the suggested cooperative strategy can detect attacks in the NSL-KDD and UNSW-NB15 datasets with an accuracy of 99.89% and 97.72%, respectively, an improvement of at least 0.6 percent.
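The per-subnet PCNN classifiers are beyond a short sketch, but the cooperation step described above is simple majority voting across controller verdicts, illustrated below with hypothetical labels.

```python
from collections import Counter

def majority_vote(controller_decisions):
    """Combine per-subnet controller verdicts ('attack' / 'normal') by majority vote."""
    counts = Counter(controller_decisions)
    return counts.most_common(1)[0][0]

# Hypothetical verdicts from five SDN controller nodes for one traffic flow
print(majority_vote(["attack", "normal", "attack", "attack", "normal"]))  # attack
```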
