Results 1 - 20 of 247
1.
Article in English | MEDLINE | ID: mdl-39249173

ABSTRACT

PURPOSE: Healthcare systems around the world are increasingly facing severe challenges due to problems such as staff shortages, changing demographics and the reliance on an often strongly human-dependent environment. One approach aiming to address these issues is the development of new telemedicine applications. The currently researched network standard 6G promises to deliver many new features which could help leverage the full potential of emerging telemedical solutions and overcome the limitations of current network standards. METHODS: We developed a telerobotic examination system with a distributed robot control infrastructure to investigate the benefits and challenges of distributed computing scenarios, such as fog computing, in medical applications. We investigate different software configurations, for which we characterize the network traffic and computational loads, and subsequently establish network allocation strategies for different types of modular application functions (MAFs). RESULTS: The results indicate a high variability in the usage profiles of these MAFs, both in terms of computational load and networking behavior, which in turn allows the development of allocation strategies for different types of MAFs according to their requirements. Furthermore, the results provide a strong basis for further exploration of distributed computing scenarios in medical robotics. CONCLUSION: This work lays the foundation for the development of medical robotic applications using 6G network architectures and distributed computing scenarios, such as fog computing. In the future, we plan to investigate the capability to dynamically shift MAFs within the network based on current situational demand, which could help to further optimize the performance of network-based medical applications and play a role in addressing the increasingly critical challenges in healthcare.
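The allocation idea the abstract describes, matching each MAF type to a network tier according to its characterized profile, can be sketched as follows; the function name, thresholds, and MAF profiles here are hypothetical illustrations, not values from the study:

```python
# Minimal sketch (hypothetical profiles): assigning modular application
# functions (MAFs) to network tiers by their measured usage profiles.

def allocate_maf(cpu_load, bandwidth_mbps, latency_sensitive):
    """Pick a tier for one MAF from its characterized profile."""
    if latency_sensitive and bandwidth_mbps > 50:
        return "edge"    # heavy, time-critical traffic stays near the robot
    if cpu_load > 0.7:
        return "cloud"   # compute-heavy, delay-tolerant functions go central
    return "fog"         # moderate functions run on intermediate nodes

# Example MAF profiles: (cpu_load, bandwidth_mbps, latency_sensitive)
mafs = {
    "video_stream": (0.2, 120.0, True),
    "pose_planner": (0.9, 1.0, False),
    "state_logger": (0.1, 0.5, False),
}
plan = {name: allocate_maf(*profile) for name, profile in mafs.items()}
```

A rule table like this is only the static case; the dynamic MAF shifting the authors plan would re-evaluate these decisions at runtime.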

2.
Heliyon ; 10(17): e37453, 2024 Sep 15.
Article in English | MEDLINE | ID: mdl-39296026

ABSTRACT

Distributed control is an effective method to coordinate a microgrid with various components. In a smart microgrid, communication graph layouts are essential, since unexpectedly changing the topology could disrupt the operation of the distributed controllers and create an imbalance between production and load. Hence, reducing the data exchanged between units and the system operator is essential in order to reduce the transmitted data volume and computational burden. For this purpose, an islanded microgrid with multiple agents using cloud-fog computing is proposed here, in order to reduce the computing burden on the central control unit as well as the data exchange among units. To balance production power and loads in a smart island with stable voltage/frequency, a hybrid backstepping sliding mode controller (BSMC) with a disturbance observer (DO) is suggested to control voltage/frequency and current in the MG-based master-slave organization. This paper therefore proposes a DO-driven BSMC for controlling voltage/frequency and the power of energy sources within a master-slave organization; in addition, the study proposes cloud-fog computing for enhancing performance, reducing transferred data volume, and processing information on time. In extensive simulations, the suggested controller shows a reduction in steady-state error, a fast response, and a total harmonic distortion (THD) below 0.33 % for both nonlinear and linear loads. The fog layer serves as a local processing level, so it reduces the data exchanged between cloud and fog nodes.

3.
Heliyon ; 10(18): e37490, 2024 Sep 30.
Article in English | MEDLINE | ID: mdl-39309787

ABSTRACT

Society is becoming increasingly interconnected and hyper-connected. Communication networks are advancing, as are logistics networks and networks for the transportation and distribution of natural resources. One of the key benefits of the evolution of these networks is to bring consumers closer to the source of a resource or service. However, this is not a straightforward task, particularly since networks near final users are usually shaped by heterogeneous nodes, sometimes in very dense scenarios, which may demand or offer a resource at any given moment. In this paper, we present DEN2NE, a novel algorithm designed for the automatic distribution and reallocation of resources in distributed environments. The algorithm has been implemented with six different criteria in order to adapt it to the specific use case under consideration. The results obtained from DEN2NE are promising, owing to its adaptability and its average execution time, which grows linearly with topology size.
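One plausible distribution criterion of the kind such an algorithm could apply is greedy nearest-source assignment. The sketch below illustrates the problem setting only, not DEN2NE itself; the node names, capacities, and distances are invented:

```python
# Illustrative sketch (not the DEN2NE algorithm): greedily assign each
# consumer's demand to the closest node that still has capacity to offer.

def assign_nearest(offers, demands, dist):
    """offers: {node: capacity}; demands: {consumer: amount};
    dist: {consumer: {node: distance}}. Returns {consumer: node}."""
    offers = dict(offers)                      # work on a copy
    assignment = {}
    for consumer, amount in demands.items():
        for node in sorted(offers, key=lambda n: dist[consumer][n]):
            if offers[node] >= amount:         # nearest node with capacity
                offers[node] -= amount
                assignment[consumer] = node
                break
    return assignment

offers = {"A": 5, "B": 2}
demands = {"c1": 2, "c2": 4}
dist = {"c1": {"A": 3, "B": 1}, "c2": {"A": 2, "B": 5}}
result = assign_nearest(offers, demands, dist)
```

Each pass over a consumer scans the sorted candidate list once, which is consistent with the roughly linear scaling in topology size that the abstract reports.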

4.
Cureus ; 16(8): e66779, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39268273

ABSTRACT

The integration of fog computing into healthcare promises significant advancements in real-time data analytics and patient care by decentralizing data processing closer to the source. This shift, however, introduces complex regulatory, privacy, and security challenges that are not adequately addressed by existing frameworks designed for centralized systems. The distributed nature of fog computing complicates the uniform application of security measures and compliance with diverse international regulations, raising concerns about data privacy, security vulnerabilities, and legal accountability. This review explores these challenges in depth, discussing the implications of fog computing's decentralized architecture for data privacy, the difficulties in achieving consistent security across dispersed nodes, and the complexities of ensuring compliance in multi-jurisdictional environments. It also examines specific regulatory frameworks, including the Health Insurance Portability and Accountability Act (HIPAA) in the United States, the General Data Protection Regulation (GDPR) in the European Union, and emerging laws in Asia and Brazil, highlighting the gaps and the need for regulatory evolution to better accommodate the nuances of fog computing. The review advocates for a proactive regulatory approach, emphasizing the development of specific guidelines, international collaboration, and public-private partnerships to enhance compliance and support innovation. By embedding privacy and security by design and leveraging advanced technologies, healthcare providers can navigate the regulatory landscape effectively, ensuring that fog computing realizes its full potential as a transformative healthcare technology without compromising patient trust or data integrity.

5.
Sensors (Basel) ; 24(16)2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39204979

ABSTRACT

In the era of ubiquitous computing, the challenges imposed by the increasing demand for real-time data processing, security, and energy efficiency call for innovative solutions. The emergence of fog computing has provided a promising paradigm to address these challenges by bringing computational resources closer to data sources. Despite its advantages, the characteristics of fog computing pose challenges in heterogeneous environments in terms of resource allocation and management, provisioning, security, and connectivity, among others. This paper introduces COGNIFOG, a novel cognitive fog framework currently under development, which was designed to leverage intelligent, decentralized decision-making processes, machine learning algorithms, and distributed computing principles to enable autonomous operation, adaptability, and scalability across the IoT-edge-cloud continuum. By integrating cognitive capabilities, COGNIFOG is expected to increase the efficiency and reliability of next-generation computing environments, potentially providing a seamless bridge between the physical and digital worlds. Preliminary experimental results with a limited set of connectivity-related COGNIFOG building blocks show promising improvements in network resource utilization in a real-world-based IoT scenario. Overall, this work paves the way for further developments on the framework, which are aimed at making it more intelligent, resilient, and aligned with the ever-evolving demands of next-generation computing environments.

6.
Sensors (Basel) ; 24(15)2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39124116

ABSTRACT

Effective air quality monitoring and forecasting are essential for safeguarding public health, protecting the environment, and promoting sustainable development in smart cities. Conventional systems are cloud-based, incur high costs, lack accurate Deep Learning (DL) models for multi-step forecasting, and fail to optimize DL models for fog nodes. To address these challenges, this paper proposes a Fog-enabled Air Quality Monitoring and Prediction (FAQMP) system by integrating the Internet of Things (IoT), Fog Computing (FC), Low-Power Wide-Area Networks (LPWANs), and DL for improved accuracy and efficiency in monitoring and forecasting air quality levels. The three-layered FAQMP system includes a low-cost Air Quality Monitoring (AQM) node transmitting data via LoRa to the Fog Computing layer and then the cloud layer for complex processing. The Smart Fog Environmental Gateway (SFEG) in the FC layer introduces efficient Fog Intelligence by employing an optimized lightweight DL-based Sequence-to-Sequence (Seq2Seq) Gated Recurrent Unit (GRU) attention model, enabling real-time processing, accurate forecasting, and timely warnings of dangerous AQI levels while optimizing fog resource usage. Initially, the Seq2Seq GRU Attention model, validated for multi-step forecasting, outperformed the state-of-the-art DL methods with an average RMSE of 5.5576, MAE of 3.4975, MAPE of 19.1991%, R2 of 0.6926, and Theil's U1 of 0.1325. This model was then made lightweight and optimized using post-training quantization (PTQ), specifically dynamic range quantization, which reduced the model size to less than a quarter of the original and improved execution time by 81.53% while maintaining forecast accuracy. This optimization enables efficient deployment on resource-constrained fog nodes like SFEG by balancing performance and computational efficiency, thereby enhancing the effectiveness of the FAQMP system through efficient Fog Intelligence.
The FAQMP system, supported by the EnviroWeb application, provides real-time AQI updates, forecasts, and alerts, aiding the government in proactively addressing pollution concerns, maintaining air quality standards, and fostering a healthier and more sustainable environment.
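The error metrics this abstract reports (RMSE, MAE, MAPE) follow from standard formulas; a minimal sketch, using toy actual/forecast values rather than the study's data:

```python
# Standard forecast-error metrics on a toy multi-step AQI forecast.
import math

def rmse(y, p):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, p)) / len(y))

def mae(y, p):
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def mape(y, p):
    # Mean absolute percentage error; assumes no zero actual values.
    return 100 * sum(abs((a - b) / a) for a, b in zip(y, p)) / len(y)

actual = [50.0, 60.0, 80.0]      # illustrative AQI readings
forecast = [48.0, 63.0, 76.0]    # illustrative model outputs
scores = (rmse(actual, forecast), mae(actual, forecast), mape(actual, forecast))
```

Lower is better for all three, which is how the Seq2Seq GRU attention model's figures compare against the baselines.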

7.
Cureus ; 16(7): e64263, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39130982

ABSTRACT

Fog computing is a decentralized computing infrastructure that processes data at or near its source, reducing latency and bandwidth usage. This technology is gaining traction in healthcare due to its potential to enhance real-time data processing and decision-making capabilities in critical medical scenarios. A systematic review of existing literature on fog computing in healthcare was conducted. The review included searches in major databases such as PubMed, IEEE Xplore, Scopus, and Google Scholar. The search terms used were "fog computing in healthcare," "real-time diagnostics and fog computing," "continuous patient monitoring fog computing," "predictive analytics fog computing," "interoperability in fog computing healthcare," "scalability issues fog computing healthcare," and "security challenges fog computing healthcare." Articles published between 2010 and 2023 were considered. Inclusion criteria encompassed peer-reviewed articles, conference papers, and review articles focusing on the applications of fog computing in healthcare. Exclusion criteria were articles not available in English, those not related to healthcare applications, and those lacking empirical data. Data extraction focused on the applications of fog computing in real-time diagnostics, continuous monitoring, predictive analytics, and the identified challenges of interoperability, scalability, and security. Fog computing significantly enhances diagnostic capabilities by facilitating real-time data analysis, crucial for urgent diagnostics such as stroke detection, by processing data closer to its source. It also improves monitoring during surgeries by enabling real-time processing of vital signs and physiological parameters, thereby enhancing patient safety. In chronic disease management, continuous data collection and analysis through wearable devices allow for proactive disease management and timely adjustments to treatment plans. 
Additionally, fog computing supports telemedicine by enabling real-time communication between remote specialists and patients, thereby improving access to specialist care in underserved regions. Fog computing offers transformative potential in healthcare, improving diagnostic precision, patient monitoring, and personalized treatment. Addressing the challenges of interoperability, scalability, and security will be crucial for fully realizing the benefits of fog computing in healthcare, leading to a more connected and efficient healthcare environment.

8.
PeerJ Comput Sci ; 10: e2128, 2024.
Article in English | MEDLINE | ID: mdl-38983206

ABSTRACT

Fog computing has emerged as a prospective paradigm to address the computational requirements of IoT applications, extending the capabilities of cloud computing to the network edge. Task scheduling is pivotal in enhancing energy efficiency, optimizing resource utilization and ensuring the timely execution of tasks within fog computing environments. This article presents a comprehensive review of the advancements in task scheduling methodologies for fog computing systems, covering priority-based, greedy heuristics, metaheuristics, learning-based, hybrid heuristics, and nature-inspired heuristic approaches. Through a systematic analysis of relevant literature, we highlight the strengths and limitations of each approach and identify key challenges facing fog computing task scheduling, including dynamic environments, heterogeneity, scalability, resource constraints, security concerns, and algorithm transparency. Furthermore, we propose future research directions to address these challenges, including the integration of machine learning techniques for real-time adaptation, leveraging federated learning for collaborative scheduling, developing resource-aware and energy-efficient algorithms, incorporating security-aware techniques, and advancing explainable AI methodologies. By addressing these challenges and pursuing these research directions, we aim to facilitate the development of more robust, adaptable, and efficient task-scheduling solutions for fog computing environments, ultimately fostering trust, security, and sustainability in fog computing systems and facilitating their widespread adoption across diverse applications and domains.
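A greedy heuristic of the kind this review surveys can be sketched in a few lines; the tasks, node speeds, and the largest-task-first priority rule below are illustrative assumptions, not a method from any one surveyed paper:

```python
# Greedy task-scheduling sketch: each task goes to the fog node that can
# finish it earliest, with larger tasks placed first (a priority heuristic).

def schedule(tasks, node_speeds):
    """tasks: {name: work units}; node_speeds: {node: units/sec}.
    Returns ({task: node}, makespan)."""
    ready_at = {n: 0.0 for n in node_speeds}   # when each node frees up
    placement = {}
    for name, work in sorted(tasks.items(), key=lambda t: -t[1]):
        finish = {n: ready_at[n] + work / s for n, s in node_speeds.items()}
        best = min(finish, key=finish.get)     # earliest-finish node
        ready_at[best] = finish[best]
        placement[name] = best
    return placement, max(ready_at.values())

tasks = {"t1": 4.0, "t2": 2.0, "t3": 2.0}
placement, makespan = schedule(tasks, {"fog1": 2.0, "fog2": 1.0})
```

Metaheuristic and learning-based schedulers typically start from, or are benchmarked against, simple greedy baselines like this one.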

9.
Sensors (Basel) ; 24(13)2024 Jul 02.
Article in English | MEDLINE | ID: mdl-39001087

ABSTRACT

The growing importance of edge and fog computing in the modern IT infrastructure is driven by the rise of decentralized applications. However, resource allocation within these frameworks is challenging due to varying device capabilities and dynamic network conditions. Conventional approaches often result in poor resource use and slowed advancements. This study presents a novel strategy for enhancing resource allocation in edge and fog computing by integrating machine learning with the blockchain for reliable trust management. Our proposed framework, called CyberGuard, leverages the blockchain's inherent immutability and decentralization to establish a trustworthy and transparent network for monitoring and verifying edge and fog computing transactions. CyberGuard combines the Trust2Vec model with conventional machine-learning models like SVM, KNN, and random forests, creating a robust mechanism for assessing trust and security risks. Through detailed optimization and case studies, CyberGuard demonstrates significant improvements in resource allocation efficiency and overall system performance in real-world scenarios. Our results highlight CyberGuard's effectiveness, evidenced by a remarkable accuracy, precision, recall, and F1-score of 98.18%, showcasing the transformative potential of our comprehensive approach in edge and fog computing environments.

10.
Sensors (Basel) ; 24(11)2024 May 24.
Article in English | MEDLINE | ID: mdl-38894160

ABSTRACT

Satellite fog computing (SFC) achieves computation, caching, and other functionalities through collaboration among fog nodes. Satellites can provide real-time and reliable satellite-to-ground fusion services by pre-caching content that users may request in advance. However, due to the high-speed mobility of satellites, the complexity of user-access conditions poses a new challenge in selecting optimal caching locations and improving caching efficiency. Motivated by this, in this paper, we propose a real-time caching scheme based on a Double Deep Q-Network (Double DQN). The overarching objective is to enhance the cache hit rate. The simulation results demonstrate that the algorithm proposed in this paper improves the data hit rate by approximately 13% compared to methods without reinforcement learning assistance.
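The Double DQN target underlying such a caching scheme decouples action selection (online network) from action evaluation (target network); the Q-values, reward, and discount factor below are toy numbers, not the paper's simulation settings:

```python
# Double DQN TD target: r + gamma * Q_target(s', argmax_a Q_online(s', a)).
# The online net picks the next action; the target net scores it, which
# reduces the overestimation bias of plain DQN.

GAMMA = 0.9  # illustrative discount factor

def double_dqn_target(reward, q_online_next, q_target_next):
    best_action = max(range(len(q_online_next)), key=q_online_next.__getitem__)
    return reward + GAMMA * q_target_next[best_action]

# Two candidate actions, e.g. "cache this content on the node" vs "skip".
target = double_dqn_target(reward=1.0,
                           q_online_next=[0.2, 0.8],  # online net estimates
                           q_target_next=[0.5, 0.6])  # target net estimates
```

In a caching setting, the reward would typically encode a cache hit, so maximizing the return maximizes the hit rate the abstract reports.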

11.
Front Artif Intell ; 7: 1397480, 2024.
Article in English | MEDLINE | ID: mdl-38845684

ABSTRACT

The rapid proliferation of Internet of Things (IoT) devices across various industries has revolutionized the way we interact with technology. However, this widespread adoption has also brought about significant security challenges that must be addressed to ensure the integrity and confidentiality of data transmitted and processed by IoT systems. This survey paper delves into the diverse array of security threats faced by IoT devices and networks, ranging from data breaches and unauthorized access to physical tampering and denial-of-service attacks. By examining the vulnerabilities inherent in IoT ecosystems, we highlight the importance of implementing robust security measures to safeguard sensitive information and ensure the reliable operation of connected devices. Furthermore, we explore cutting-edge technologies such as blockchain, edge computing, and machine learning as potential solutions to enhance the security posture of IoT deployments. Through a comprehensive analysis of existing security frameworks and best practices, this paper aims to provide valuable insights for researchers, practitioners, and policymakers seeking to fortify the resilience of IoT systems in an increasingly interconnected world.

12.
JMIR Biomed Eng ; 9: e50175, 2024 Mar 06.
Article in English | MEDLINE | ID: mdl-38875671

ABSTRACT

BACKGROUND: The increasing adoption of telehealth Internet of Things (IoT) devices in health care informatics has led to concerns about energy use and data processing efficiency. OBJECTIVE: This paper introduces an innovative model that integrates telehealth IoT devices with a fog and cloud computing-based platform, aiming to enhance energy efficiency in telehealth IoT systems. METHODS: The proposed model incorporates adaptive energy-saving strategies, localized fog nodes, and a hybrid cloud infrastructure. Simulation analyses were conducted to assess the model's effectiveness in reducing energy consumption and enhancing data processing efficiency. RESULTS: Simulation results demonstrated significant energy savings, with a 2% reduction in energy consumption achieved through adaptive energy-saving strategies. The sample size for the simulation was 10-40, providing statistical robustness to the findings. CONCLUSIONS: The proposed model successfully addresses energy and data processing challenges in telehealth IoT scenarios. By integrating fog computing for local processing and a hybrid cloud infrastructure, substantial energy savings are achieved. Ongoing research will focus on refining the energy conservation model and exploring additional functional enhancements for broader applicability in health care and industrial contexts.

13.
BMC Med Imaging ; 24(1): 123, 2024 May 27.
Article in English | MEDLINE | ID: mdl-38797827

ABSTRACT

The rapid proliferation of pandemic diseases has raised many concerns for the international health infrastructure. To combat pandemic diseases in smart cities, Artificial Intelligence of Things (AIoT) technology, based on the integration of artificial intelligence (AI) with the Internet of Things (IoT), is commonly used to promote efficient control and diagnosis during the outbreak, thereby minimizing possible losses. However, the presence of multi-source institutional data remains one of the major challenges hindering the practical usage of AIoT solutions for pandemic disease diagnosis. This paper presents a novel framework that utilizes multi-site data fusion to boost the accuracy of pandemic disease diagnosis. In particular, we focus on a case study of COVID-19 lesion segmentation, a crucial task for understanding disease progression and optimizing treatment strategies. In this study, we propose a novel multi-decoder segmentation network for efficient segmentation of infections from cross-domain CT scans in smart cities. The multi-decoder segmentation network leverages data from heterogeneous domains and utilizes strong learning representations to accurately segment infections. Performance evaluation of the multi-decoder segmentation network was conducted on three publicly accessible datasets, demonstrating robust results with an average dice score of 89.9% and an average surface dice of 86.87%. To address scalability and latency issues associated with centralized cloud systems, fog computing (FC) emerges as a viable solution. FC brings resources closer to the operator, offering low latency and energy-efficient data management and processing. In this context, we propose a unique FC technique called PANDFOG to deploy the multi-decoder segmentation network on edge nodes for practical and clinical applications of automated COVID-19 pneumonia analysis.
The results of this study highlight the efficacy of the multi-decoder segmentation network in accurately segmenting infections from cross-domain CT scans. Moreover, the proposed PANDFOG system demonstrates the practical deployment of the multi-decoder segmentation network on edge nodes, providing real-time access to COVID-19 segmentation findings for improved patient monitoring and clinical decision-making.
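The dice score reported above is the standard overlap measure Dice = 2|A∩B| / (|A| + |B|) between predicted and ground-truth lesion pixels; a minimal sketch on tiny illustrative masks:

```python
# Dice coefficient over flattened binary segmentation masks.

def dice(pred, truth):
    """pred, truth: equal-length sequences of 0/1 pixel labels."""
    intersection = sum(p and t for p, t in zip(pred, truth))
    return 2 * intersection / (sum(pred) + sum(truth))

predicted = [1, 1, 0, 1, 0]      # toy model output
ground_truth = [1, 1, 1, 0, 0]   # toy annotation
score = dice(predicted, ground_truth)
```

A score of 1.0 means perfect overlap; the study's reported 89.9% average indicates close agreement with the annotations across the three datasets.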


Subject(s)
COVID-19 , Deep Learning , Pandemics , Tomography, X-Ray Computed , Humans , Tomography, X-Ray Computed/methods , SARS-CoV-2 , Cities , Internet of Things
14.
Sensors (Basel) ; 24(8)2024 Apr 22.
Article in English | MEDLINE | ID: mdl-38676279

ABSTRACT

This study uses a wind turbine case study, a subdomain of the Industrial Internet of Things (IIoT), to showcase an architecture for implementing a distributed digital twin (DT) in which all important aspects of a predictive maintenance solution use a fog computing paradigm. The typical predictive maintenance DT is improved to offer better asset utilization and management through real-time condition monitoring, predictive analytics, and health management of selected components of wind turbines in a wind farm. A digital twin is a technology that sits at the intersection of the Internet of Things, Cloud Computing, and Software Engineering to provide a suitable tool for replicating physical objects in the digital space. This can facilitate the implementation of asset management in manufacturing systems through predictive maintenance solutions leveraged by machine learning (ML). With DTs, a solution architecture can easily use data and software to implement asset management solutions such as condition monitoring and predictive maintenance using acquired sensor data from physical objects and computing capabilities in the digital space. While DT offers a good solution, it is an emerging technology that could be improved with better standards, architectural frameworks, and implementation methodologies. Researchers in both academia and industry have showcased DT implementations with different levels of success. However, DTs remain limited in standards and architectures that offer efficient predictive maintenance solutions with real-time sensor data and intelligent DT capabilities. An appropriate feedback mechanism is also needed to improve asset management operations.

15.
PeerJ Comput Sci ; 10: e1933, 2024.
Article in English | MEDLINE | ID: mdl-38660154

ABSTRACT

The robust development of the blockchain distributed ledger, the Internet of Things (IoT), and fog computing-enabled connected devices and nodes has changed our lifestyle. As a result, the increased rate of device sales and utilization increases the demand for edge computing technology with collaborative procedures. In response, a well-established paradigm has been designed to optimize various distinct quality-of-service requirements (bandwidth, latency, transmission power, delay, duty cycle, throughput, response, and edge sense) and to bring computation and data storage closer to the devices and edges, along with ledger security and privacy during transmission. In this article, we present a systematic review of blockchain Hyperledger enabling fog and edge computing, which integrates outsourcing computation over the serverless consortium network environment. The main objective of this article is to classify recently published articles and survey reports on the current status of the domain of edge distributed computing and outsourcing computation, such as fog and edge. In addition, we propose a blockchain-Hyperledger Sawtooth-enabled serverless edge-based distributed outsourcing computation architecture. This theoretical architecture-based solution delivers robust data security in terms of integrity, transparency, provenance, and privacy-protected preservation in immutable storage for the outsourcing computational ledgers. This article also highlights the differences between the proposed taxonomy and the current system based on distinct parameters, such as system security and privacy. Finally, a few open research issues and limitations with promising future directions are listed for future research work.

16.
PeerJ Comput Sci ; 10: e1986, 2024.
Article in English | MEDLINE | ID: mdl-38660156

ABSTRACT

The execution of delay-aware applications can be effectively handled by various computing paradigms, including fog computing, edge computing, and cloudlets. Cloud computing offers services in a centralized way through a cloud server. On the contrary, the fog computing paradigm offers services in a dispersed manner, providing services and computational facilities near the end devices. Due to the distributed provision of resources by the fog paradigm, this architecture is suitable for large-scale implementation of applications. Furthermore, fog computing offers a reduction in delay and network load as compared to cloud architecture. Resource distribution and load balancing are always important tasks in deploying efficient systems. In this research, we propose a heuristic-based approach that achieves a reduction in network consumption and delays by efficiently utilizing fog resources according to the load generated by the clusters of edge nodes. The proposed algorithm considers the magnitude of data produced at the edge clusters while allocating the fog resources. The results of the evaluations performed on different scales confirm the efficacy of the proposed approach in achieving optimal performance.

17.
PeerJ Comput Sci ; 10: e1833, 2024.
Article in English | MEDLINE | ID: mdl-38660213

ABSTRACT

With the emergence of Internet of Things (IoT) technology, a huge amount of data is generated, which is costly to transfer to cloud data centers in terms of security, bandwidth, and latency. Fog computing is an efficient paradigm for locally processing and manipulating IoT-generated data. It is difficult to configure fog nodes to provide all of the services required by the end devices because of their static configuration and limited processing and storage capacities. To enhance fog nodes' capabilities, it is essential to reconfigure them to accommodate a broader range and variety of hosted services. In this study, we focus on the placement of fog services and their dynamic reconfiguration in response to end-device requests. Due to its growing successes and popularity in the IoT era, the Decision Tree (DT) machine learning model is implemented to predict the occurrence of requests and events in advance. The DT model enables the fog nodes to predict requests for a specific service in advance and reconfigure the fog node accordingly. The performance of the proposed model is evaluated in terms of high throughput, minimized energy consumption, and dynamic fog node smart switching. The simulation results demonstrate a notable increase in the fog node hit ratios, scaling up to 99% for the majority of services, concurrently with a substantial reduction in miss ratios. Furthermore, energy consumption is reduced by over 50% as compared to a static node.

18.
Sensors (Basel) ; 24(7)2024 Mar 31.
Article in English | MEDLINE | ID: mdl-38610447

ABSTRACT

In Portugal, more than 98% of domestic cooking oil is disposed of improperly every day. This prevents it from being recycled or converted into another form of energy, and it may become a harmful contaminant of soil and water. Driven by the utility of recycled cooking oil, and leveraging the exponential growth of ubiquitous computing approaches, we propose an IoT smart solution for domestic used cooking oil (UCO) collection bins. We call this approach SWAN, which stands for Smart Waste Accumulation Network; it is deployed and evaluated in Portugal and consists of a countrywide network of collection bin units available in public areas. Two metrics are considered to evaluate the system's success: (i) user engagement, and (ii) used cooking oil collection efficiency. The presented system should (i) perform under scenarios of temporary communication network failures, and (ii) be scalable to accommodate an ever-growing number of installed collection units. Thus, we chose an approach that departs from the traditional cloud computing paradigm: it relies on edge node infrastructure to process, store, and act upon the locally collected data, with communication handled as a delay-tolerant task, i.e., an edge computing solution. We conduct a comparative analysis revealing the benefits of the edge computing-enabled collection bin vs. a cloud computing solution. The studied period covers four years of collected data. An exponential increase in the amount of used cooking oil collected is identified, with the developed solution responsible for surpassing the national collection totals of previous years. During the same period, we also improved the collection process, as we were able to more accurately estimate the optimal collection and system maintenance intervals.

19.
Diagnostics (Basel) ; 14(6)2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38535044

ABSTRACT

Dengue is a distinctive and potentially fatal infectious disease spread by female Aedes aegypti mosquitoes. It is a notable concern for developing countries due to its low diagnosis rate. Dengue has a strikingly high mortality level compared to other diseases due to severe platelet depletion, and it can thus be categorized as life-threatening among fevers of its class. Additionally, it has been shown that dengue fever shares many of the same symptoms as other flu-based fevers. Meanwhile, the research community is closely following work on IoT, fog, and cloud computing for the diagnosis and prediction of diseases, and IoT-, fog-, and cloud-based technologies are used to construct a number of healthcare systems. Accordingly, in this study, a DengueFog monitoring system was created based on fog computing for the prediction and detection of dengue illness. The proposed DengueFog system includes a weighted random forest (WRF) classifier to monitor and predict dengue infection. The proposed system's efficacy was evaluated using data on dengue infection gathered between 2016 and 2018 from several hospitals in the Delhi-NCR region. The accuracy, F-value, recall, precision, error rate, and specificity metrics were used to assess the simulation results of the suggested monitoring system. The proposed DengueFog monitoring system with WRF was demonstrated to outperform the traditional classifiers.
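The weighted-voting idea behind a WRF classifier can be sketched as follows; the stub tree predictions and weights are invented for illustration, since a real WRF learns its trees and weights from the training data:

```python
# Weighted majority vote: each tree's class vote is scaled by a per-tree
# weight, unlike a plain random forest where every tree counts equally.

def weighted_vote(tree_predictions, weights):
    """tree_predictions: list of class labels; weights: same-length floats."""
    tally = {}
    for label, w in zip(tree_predictions, weights):
        tally[label] = tally.get(label, 0.0) + w
    return max(tally, key=tally.get)

# Three stub trees voting on one patient record: "dengue" vs "other".
prediction = weighted_vote(["dengue", "other", "dengue"], [0.5, 0.9, 0.7])
```

Here the two "dengue" votes (total weight 1.2) outweigh the single higher-weight "other" vote (0.9), which a simple majority vote would also have produced but for a different reason.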

20.
Front Big Data ; 7: 1358486, 2024.
Article in English | MEDLINE | ID: mdl-38449564

ABSTRACT

As the volume and velocity of Big Data continue to grow, traditional cloud computing approaches struggle to meet the demands of real-time processing and low latency. Fog computing, with its distributed network of edge devices, emerges as a compelling solution. However, efficient task scheduling in fog computing remains a challenge due to its inherently multi-objective nature, balancing factors like execution time, response time, and resource utilization. This paper proposes a hybrid Genetic Algorithm (GA)-Particle Swarm Optimization (PSO) algorithm to optimize multi-objective task scheduling in fog computing environments. The hybrid approach combines the strengths of GA and PSO, achieving effective exploration and exploitation of the search space and improved performance compared to traditional single-algorithm approaches. With varying task inputs, the proposed hybrid algorithm improved execution time by 85.68% compared with the GA algorithm, 84% compared with Hybrid PWOA, and 51.03% compared with the PSO algorithm; it improved response time by 67.28% compared with GA, 54.24% compared with Hybrid PWOA, and 75.40% compared with PSO; and it improved completion time by 68.69% compared with GA, 98.91% compared with Hybrid PWOA, and 75.90% compared with PSO. With varying numbers of fog nodes, it improved execution time by 84.87% compared with GA, 88.64% compared with Hybrid PWOA, and 85.07% compared with PSO; response time by 65.92% compared with GA, 80.51% compared with Hybrid PWOA, and 85.26% compared with PSO; and completion time by 67.60% compared with GA, 81.34% compared with Hybrid PWOA, and 85.23% compared with PSO.
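The PSO half of such a hybrid can be illustrated with the standard velocity/position update on a one-dimensional toy problem; the coefficients below are common textbook defaults, not the paper's tuned values:

```python
# One PSO update step: velocity is a blend of inertia, a pull toward the
# particle's personal best, and a pull toward the swarm's global best.
import random

random.seed(0)                      # reproducible toy run
W, C1, C2 = 0.7, 1.5, 1.5           # inertia, cognitive, social coefficients

def pso_step(pos, vel, pbest, gbest):
    r1, r2 = random.random(), random.random()
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    return pos + vel, vel

# One particle moving toward the swarm's best-known scheduling parameter.
new_pos, new_vel = pso_step(pos=5.0, vel=0.0, pbest=4.0, gbest=2.0)
```

In the hybrid scheme, GA operators (crossover, mutation) would diversify the population while updates like this one refine promising candidates.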
