Results 1 - 20 of 666
1.
Cogn Neurodyn ; 18(4): 1799-1810, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39104679

ABSTRACT

Facial expression recognition has made significant progress with the advent of convolutional neural networks (CNNs). However, as CNNs improve, the models keep getting deeper and larger, focusing increasingly on high-level image features while low-level features tend to be lost. As a consequence, the dependencies between low-level features in different regions of the face often cannot be captured. To address this problem, we propose a novel network based on the CNN model. Multiple attention mechanisms are introduced into the network to extract long-range dependencies among low-level features. In this paper, a patch attention mechanism is first designed to capture the dependencies between low-level facial-expression features. After fusion, the feature maps are fed into a backbone network incorporating the convolutional block attention module (CBAM) to enhance feature extraction and improve the accuracy of facial expression recognition, achieving competitive results on three datasets: CK+ (98.10%), JAFFE (95.12%), and FER2013 (73.50%). Furthermore, based on the PA Net designed in this paper, a hardware-friendly implementation scheme is designed using memristor crossbars, which is expected to provide a software-hardware co-design scheme for edge computing in personal and wearable electronic products.
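As a concrete illustration of the attention block named in the abstract, the following is a minimal PyTorch sketch of a convolutional block attention module (CBAM): channel attention from pooled features followed by spatial attention. It is not the authors' PA Net; the reduction ratio, kernel size, and tensor shapes are illustrative assumptions.

```python
# Minimal CBAM sketch: channel attention, then spatial attention (assumed sizes).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CBAM(nn.Module):
    def __init__(self, channels: int, reduction: int = 16, spatial_kernel: int = 7):
        super().__init__()
        # Channel attention: shared MLP over global average- and max-pooled features.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        # Spatial attention: 2-channel (avg, max) map passed through a single conv.
        self.spatial = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        x = x * torch.sigmoid(avg + mx)                       # channel reweighting
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.max(dim=1, keepdim=True).values], dim=1)
        return x * torch.sigmoid(self.spatial(s))             # spatial reweighting

# Example: refine a batch of low-level feature maps before the backbone.
feats = torch.randn(4, 64, 48, 48)
print(CBAM(64)(feats).shape)  # torch.Size([4, 64, 48, 48])
```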

2.
Sci Rep ; 14(1): 18506, 2024 Aug 09.
Article in English | MEDLINE | ID: mdl-39122773

ABSTRACT

This paper aims to improve the target-tracking capability of unmanned aerial vehicles (UAVs). First, a control model based on fuzzy logic is created that adjusts the UAV's flight attitude in response to the target's motion status and changes in the surrounding environment. Then, an edge computing-based target tracking framework is built: by deploying edge devices around the UAV, the computation for target recognition and position prediction is transferred from the central processing unit to the edge nodes. Finally, the latest Vision Transformer model is adopted for target recognition; the image is divided into uniform blocks, and an attention mechanism captures the relationships between blocks to enable real-time image analysis. For position prediction, a particle filter algorithm combines historical data and sensor inputs to produce a high-precision estimate of the target position. Experimental results in different scenes show that the fuzzy-logic-based algorithm shortens the average target capture time by 20% compared with the traditional proportional-integral-derivative (PID) method, from 5.2 s to 4.2 s, and reduces the average tracking error by 15%, from 0.8 m to 0.68 m. Under environmental changes and changes in target motion, the algorithm also shows better robustness, with a tracking-error fluctuation range only half that of traditional PID. These results show that fuzzy logic control can be successfully applied to UAV target tracking and demonstrate the method's effectiveness in improving tracking performance.
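A minimal sketch of the position-prediction component described above: a particle filter fusing a motion model with noisy position measurements. The constant-velocity model, noise levels, and resampling rule are illustrative assumptions, not the paper's exact design.

```python
# Particle filter sketch for target position estimation (assumed motion/noise model).
import numpy as np

rng = np.random.default_rng(0)
N = 500                                   # number of particles
particles = rng.normal(0.0, 1.0, (N, 4))  # state: [x, y, vx, vy]
weights = np.full(N, 1.0 / N)

def step(particles, weights, z, dt=0.1, motion_std=0.2, meas_std=0.5):
    """One predict/update/resample cycle given a position measurement z = [x, y]."""
    # Predict: constant-velocity motion plus process noise.
    particles[:, 0] += particles[:, 2] * dt
    particles[:, 1] += particles[:, 3] * dt
    particles += rng.normal(0.0, motion_std, particles.shape)
    # Update: weight particles by Gaussian likelihood of the measurement.
    d2 = np.sum((particles[:, :2] - z) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std ** 2)
    weights /= weights.sum()
    # Resample in proportion to the weights (multinomial resampling).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Example: fuse a few noisy observations of a target drifting along x.
for t in range(20):
    z = np.array([0.5 * t, 0.0]) + rng.normal(0.0, 0.5, 2)
    particles, weights = step(particles, weights, z)
print("estimated position:", particles[:, :2].mean(axis=0))
```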

3.
Sensors (Basel) ; 24(15)2024 Jul 25.
Article in English | MEDLINE | ID: mdl-39123877

ABSTRACT

Computer Vision (CV) has become increasingly important for Single-Board Computers (SBCs) due to their widespread deployment in addressing real-world problems. Specifically, in the context of smart cities, there is an emerging trend of developing end-to-end video analytics solutions designed to address urban challenges such as traffic management, disaster response, and waste management. However, deploying CV solutions on SBCs presents several pressing challenges (e.g., limited computational power, inefficient energy management, and real-time processing needs) that hinder their use at scale. Graphical Processing Units (GPUs) and software-level developments have recently emerged to address these challenges and enable higher SBC performance; however, this remains an active area of research, and the literature lacks a comprehensive review of these recent and rapidly evolving advancements on both the software and hardware fronts. The presented review provides a detailed overview of existing GPU-accelerated edge-computing SBCs and of software advancements, including algorithm optimization techniques, packages, development frameworks, and hardware-deployment-specific packages. The review also provides a subjective comparative analysis based on critical factors to help applied Artificial Intelligence (AI) researchers understand the existing state of the art and select the combinations best suited to their specific use cases. Finally, the paper discusses potential limitations of existing SBCs and highlights future research directions in this domain.

4.
Sensors (Basel) ; 24(15)2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39123922

ABSTRACT

Interest in deploying deep reinforcement learning (DRL) models on low-power edge devices, such as Autonomous Mobile Robots (AMRs) and Internet of Things (IoT) devices, has risen significantly, driven by the potential for real-time inference that eliminates the latency and reliability issues of wireless communication and by the privacy benefits of processing data locally. Deploying such energy-intensive models on power-constrained devices is not always feasible, however, which has led to the development of model compression techniques that reduce the size and computational complexity of DRL policies. Policy distillation, the most popular of these methods, can be used to reduce the number of network parameters by transferring the behavior of a large teacher network to a smaller student model before deploying the student at the edge. This works well for deterministic policies that operate over discrete actions; however, many power-constrained real-world tasks, such as those in robotics, are formulated with continuous action spaces, which standard policy distillation does not support. In this work, we improve the policy distillation method to support the compression of DRL models designed to solve these continuous control tasks, with an emphasis on maintaining the stochastic nature of continuous DRL algorithms. Experiments show that our methods can effectively compress such policies by up to 750% while maintaining or even exceeding their teacher's performance by up to 41% on two popular continuous control tasks.
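The following sketch illustrates the core idea of distilling a stochastic continuous-control policy: a small student is trained to match a large teacher's Gaussian action distribution with a KL-divergence loss. The network sizes, state source, and loss form are illustrative assumptions rather than the authors' exact method.

```python
# Distillation sketch: match the teacher's Gaussian policy with a KL loss (assumed setup).
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence

def gaussian_policy(obs_dim, act_dim, hidden):
    # Outputs mean and log-std for each action dimension.
    return nn.Sequential(nn.Linear(obs_dim, hidden), nn.Tanh(),
                         nn.Linear(hidden, 2 * act_dim))

def dist(net, obs, act_dim):
    out = net(obs)
    mean, log_std = out[..., :act_dim], out[..., act_dim:]
    return Normal(mean, log_std.clamp(-5, 2).exp())

obs_dim, act_dim = 8, 2
teacher = gaussian_policy(obs_dim, act_dim, hidden=256)  # large; pretrained in practice
student = gaussian_policy(obs_dim, act_dim, hidden=32)   # small; deployed at the edge
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for _ in range(200):
    obs = torch.randn(64, obs_dim)                       # stand-in for teacher rollout states
    with torch.no_grad():
        teacher_dist = dist(teacher, obs, act_dim)
    student_dist = dist(student, obs, act_dim)
    loss = kl_divergence(teacher_dist, student_dist).sum(-1).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```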

5.
Sensors (Basel) ; 24(15)2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39124122

ABSTRACT

The rapid advancement of technology has greatly expanded the capabilities of unmanned aerial vehicles (UAVs) in the wireless communication and edge computing domains. The primary objective of the UAVs is the seamless transfer of video data streams to emergency responders. However, live video streaming is inherently latency sensitive: the value of the video frames diminishes with any delay in the stream. This becomes particularly critical during emergencies, where live video provides vital information about the current conditions. Edge computing seeks to address this latency issue by bringing computing resources closer to users. Nonetheless, the mobile nature of UAVs necessitates trajectory supervision alongside the management of computation and networking resources. Consequently, efficient system optimization is required to maximize the overall effectiveness of the collaborative system with limited UAV resources. This study explores a scenario in which multiple UAVs collaborate with end users and edge servers to establish an emergency response system. The proposed idea takes a comprehensive approach by considering the entire emergency response system, from the incident site to video distribution at the user level. It includes an adaptive resource management strategy that leverages deep reinforcement learning to simultaneously address video streaming latency, UAV and user mobility, and varying bandwidth resources.

6.
Sensors (Basel) ; 24(15)2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39124124

ABSTRACT

A complete low-power, low-cost, wireless solution for bridge structural health monitoring is presented. The work includes monitoring nodes with a modular hardware design and low power consumption, built around a control and resource management board called CoreBoard and a dedicated sensing board called SensorBoard. The firmware is designed as a set of parallelised FreeRTOS tasks that manage the hardware resources and implement the Random Decrement Technique to minimize the amount of data transmitted securely over the NB-IoT network. The solution is validated through the characterization of its energy consumption, which guarantees an autonomy of more than 10 years with an 8 min daily monitoring period, and through two deployments: a pilot laboratory structure and the Eduardo Torroja bridge in Posadas (Córdoba, Spain). The results are compared with two calibrated commercial systems, yielding an error lower than 1.72% in modal analysis frequencies. The architecture and the results obtained position the presented design as a new solution in the state of the art and, thanks to its autonomy, low cost, and the graphical device management interface presented, enable its deployment and integration within the current IoT paradigm.
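A minimal sketch of the Random Decrement Technique mentioned above: segments of an ambient-vibration record starting at trigger-level crossings are averaged into a short free-decay signature, which is what gets transmitted instead of the raw record. The trigger level and segment length here are illustrative assumptions.

```python
# Random Decrement Technique sketch (assumed trigger level and segment length).
import numpy as np

def random_decrement(signal, trigger, segment_len):
    """Average all segments of length segment_len starting where signal crosses trigger upward."""
    starts = np.where((signal[:-1] < trigger) & (signal[1:] >= trigger))[0] + 1
    starts = starts[starts + segment_len <= len(signal)]
    if len(starts) == 0:
        raise ValueError("no trigger crossings found")
    segments = np.stack([signal[s:s + segment_len] for s in starts])
    return segments.mean(axis=0)   # much shorter than the raw record

# Example: a noisy decaying oscillation standing in for an acceleration record.
rng = np.random.default_rng(1)
t = np.arange(0, 60.0, 0.01)
accel = np.sin(2 * np.pi * 2.5 * t) * np.exp(-0.02 * t) + rng.normal(0.0, 0.3, t.size)
signature = random_decrement(accel, trigger=accel.std(), segment_len=500)
print(signature.shape)  # (500,) -- the data actually sent over the network
```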

7.
Sensors (Basel) ; 24(14)2024 Jul 14.
Article in English | MEDLINE | ID: mdl-39065960

ABSTRACT

End-to-end disparity estimation algorithms based on cost volumes face structural adaptation problems when deployed on edge neural network accelerators and must maintain accuracy under the constraint of the supported operators. This paper therefore proposes a novel disparity calculation algorithm that uses a low-rank approximation to replace 3D convolution and transposed 3D convolution, WReLU to reduce the data compression caused by the activation function, and unimodal cost volume filtering together with a confidence estimation network to regularize the cost volume. The approach alleviates the problem of the disparity-matching cost distribution deviating from the true distribution and greatly reduces the computational complexity and parameter count of the algorithm while improving accuracy. Experimental results show that, compared with a typical disparity estimation network, the absolute error of the proposed algorithm is reduced by 38.3%, the three-pixel error is reduced to 1.41%, and the number of parameters is reduced by 67.3%. The calculation accuracy is better than that of other algorithms, the model is easier to deploy, and it has strong structural adaptability and better practicability.
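To make the low-rank substitution concrete, the following PyTorch sketch replaces a full 3D convolution with a factorised pair of convolutions passing through a reduced-rank intermediate. The specific factorisation and rank are illustrative assumptions, not the paper's exact decomposition.

```python
# Low-rank stand-in for a 3D convolution: spatial (1,k,k) then depth (k,1,1) conv.
import torch
import torch.nn as nn

class LowRankConv3d(nn.Module):
    def __init__(self, in_ch, out_ch, k=3, rank=8):
        super().__init__()
        self.spatial = nn.Conv3d(in_ch, rank, kernel_size=(1, k, k),
                                 padding=(0, k // 2, k // 2), bias=False)
        self.depth = nn.Conv3d(rank, out_ch, kernel_size=(k, 1, 1),
                               padding=(k // 2, 0, 0), bias=False)

    def forward(self, x):
        return self.depth(self.spatial(x))

full = nn.Conv3d(32, 32, 3, padding=1, bias=False)
lowrank = LowRankConv3d(32, 32, k=3, rank=8)
params = lambda m: sum(p.numel() for p in m.parameters())
print(params(full), params(lowrank))             # the factorised block has far fewer parameters
cost_volume = torch.randn(1, 32, 16, 64, 64)     # (batch, channels, disparity, H, W)
print(lowrank(cost_volume).shape)
```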

8.
Sensors (Basel) ; 24(14)2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39066008

ABSTRACT

Unmanned aerial vehicles (UAVs) have increasingly become integral to multi-access edge computing (MEC) due to their flexibility and cost-effectiveness, especially in the B5G and 6G eras. This paper aims to enhance the quality of experience (QoE) in large-scale UAV-MEC networks by minimizing the shrinkage ratio through optimal decision-making in computation mode selection for each user device (UD), UAV flight trajectory, bandwidth allocation, and computing resource allocation at edge servers. However, the interdependencies among UAV trajectory, binary task offloading mode, and computing/network resource allocation across numerous IoT nodes pose significant challenges. To address these challenges, we formulate the shrinkage ratio minimization problem as a mixed-integer nonlinear programming (MINLP) problem and propose a two-tier optimization strategy. To reduce the scale of the optimization problem, we first design a low-complexity UAV partition coverage algorithm based on the Welzl method and determine the UAV flight trajectory by solving a traveling salesman problem (TSP). Subsequently, we develop a coordinate descent (CD)-based method and an alternating direction method of multipliers (ADMM)-based method for network bandwidth and computing resource allocation in the MEC system. Extensive simulations demonstrate that the CD-based method is simple to implement and highly efficient in large-scale UAV-MEC networks, reducing the time complexity by three orders of magnitude compared to convex optimization methods. Meanwhile, the ADMM-based joint optimization method achieves approximately an 8% reduction in shrinkage ratio optimization compared to baseline methods.
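The following sketch illustrates the two geometric building blocks named above: Welzl-style minimum-enclosing-circle computation for a device group (giving a hover point and coverage radius) and a nearest-neighbour tour as a cheap stand-in for the TSP step. Clustering the devices into groups is assumed to have been done separately.

```python
# Minimum enclosing circle (randomised incremental) and a greedy tour (assumed inputs).
import math, random

def _circle_two(a, b):
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2, math.dist(a, b) / 2)

def _circle_three(a, b, c):
    ax, ay, bx, by, cx, cy = *a, *b, *c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay) + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx) + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy, math.dist((ux, uy), a))

def _inside(p, c):
    return c is not None and math.dist(p, c[:2]) <= c[2] + 1e-9

def min_enclosing_circle(points):
    """Smallest circle (cx, cy, r) covering all points; expected O(n)."""
    pts = points[:]
    random.shuffle(pts)
    c = None
    for i, p in enumerate(pts):
        if _inside(p, c):
            continue
        c = (p[0], p[1], 0.0)
        for j, q in enumerate(pts[:i]):
            if _inside(q, c):
                continue
            c = _circle_two(p, q)
            for r in pts[:j]:
                if not _inside(r, c):
                    c = _circle_three(p, q, r)
    return c

def nearest_neighbour_tour(centers, start=0):
    """Greedy visiting order over group centers, a cheap TSP approximation."""
    unvisited, tour = set(range(len(centers))), [start]
    unvisited.remove(start)
    while unvisited:
        last = centers[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, centers[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

devices = [(1, 1), (2, 4), (4, 2), (3, 3)]
print(min_enclosing_circle(devices))              # hover point and coverage radius for one group
print(nearest_neighbour_tour([(0, 0), (5, 1), (2, 6)]))
```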

9.
Sensors (Basel) ; 24(14)2024 Jul 18.
Article in English | MEDLINE | ID: mdl-39066074

ABSTRACT

Edge servers frequently manage their own offline digital twin (DT) services, in addition to caching online digital twin services. However, current research often overlooks the impact of offline caching services on memory and computation resources, which can hinder the efficiency of online service task processing on edge servers. In this study, we concentrated on service caching and task offloading within a collaborative edge computing system by emphasizing the integrated quality of service (QoS) for both online and offline edge services. We considered the resource usage of both online and offline services, along with incoming online requests. To maximize the overall QoS utility, we established an optimization objective that rewards the throughput of online services while penalizing offline services that miss their soft deadlines. We formulated this as a utility maximization problem, which was proven to be NP-hard. To tackle this complexity, we reframed the optimization problem as a Markov decision process (MDP) and introduced a joint optimization algorithm for service caching and task offloading by leveraging the deep Q-network (DQN). Comprehensive experiments revealed that our algorithm enhanced the utility by at least 14.01% compared with the baseline algorithms.
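A minimal DQN sketch of the kind of learner used for the caching/offloading MDP described above: the state would encode cached services and queued requests, the action a caching or offloading decision, and the reward the utility (throughput reward minus deadline-miss penalties). State and action sizes, network width, and hyperparameters are illustrative assumptions.

```python
# DQN sketch for a caching/offloading MDP (assumed state/action encoding and reward).
import random
from collections import deque
import torch
import torch.nn as nn

state_dim, n_actions = 16, 8
q_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net.load_state_dict(q_net.state_dict())
opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma = 0.99

def act(state, eps=0.1):
    # Epsilon-greedy caching/offloading decision.
    if random.random() < eps:
        return random.randrange(n_actions)
    with torch.no_grad():
        return int(q_net(state).argmax())

def train_step(batch_size=32):
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2, done = map(torch.stack, zip(*batch))
    q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + gamma * (1 - done) * target_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    opt.zero_grad(); loss.backward(); opt.step()

# Example transition pushed from a simulated decision epoch (placeholder values).
s = torch.randn(state_dim)
a = torch.tensor(float(act(s)))
replay.append((s, a, torch.tensor(1.0), torch.randn(state_dim), torch.tensor(0.0)))
train_step()
```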

10.
Sci Rep ; 14(1): 16383, 2024 Jul 16.
Article in English | MEDLINE | ID: mdl-39013972

ABSTRACT

Resource optimization, timely data capture, and efficient unmanned aerial vehicle (UAV) operations are of utmost importance for mission success. Conventional centralized processing architectures suffer from latency, bandwidth constraints, and scalability problems. In addition, ensuring robust communication between ground stations and UAVs while protecting data privacy and security is a daunting task in itself. Employing edge computing infrastructure, AI-driven decision-making, and dynamic task offloading mechanisms, this research proposes the dynamic task offloading edge-aware optimization framework (DTOE-AOF) for optimizing UAV operations. Edge computing and artificial intelligence (AI) algorithms are integrated to decrease latency, increase mission efficiency, and conserve onboard resources. The system dynamically assigns computing tasks to edge nodes and UAVs according to proximity, available resources, and task urgency. DTOE-AOF is useful in many fields, such as precision agriculture, emergency management, infrastructure inspection, and monitoring. UAVs powered by AI and equipped with DTOE-AOF can swiftly survey damage, find survivors, and launch rescue missions. Thorough simulation research comparing DTOE-AOF with conventional centralized methods confirms that it improves mission efficiency, response time, and resource utilization.

11.
Heliyon ; 10(12): e32660, 2024 Jun 30.
Article in English | MEDLINE | ID: mdl-38994112

ABSTRACT

The article explores the potential of 5G-enabled Unmanned Aerial Vehicles (UAVs) in establishing opportunistic networks to improve network resource management, reduce energy use, and boost operational efficiency. The proposed framework utilizes 5G-enabled drones and edge command-and-control software to provide energy-efficient network topologies, with the UAVs performing edge computing for efficient data collection and processing. The approach enhances network performance with modern Artificial Intelligence (AI) algorithms that improve UAV networking capabilities while conserving energy. An empirical investigation shows a significant improvement in network performance measures when using 5G technology compared with older 2.4 GHz systems. The communication failure rate decreased by 50%, from 12% to 6%. The round-trip time was reduced by 58.3%, from 120 ms to 50 ms. The payload efficiency improved by 13.3%, the corresponding figure dropping from 15% to 13%. The data transmission rate increased from 1 Gbps to 5 Gbps, a 400% boost. These numerical findings highlight the significant impact that 5G technology can have on UAV operations. Testing on a 5G-enabled UAV confirms the effectiveness of the technique in several domains, including precision agriculture, disaster response, and environmental monitoring. The solution substantially improves UAV network performance by reducing energy consumption and using peripheral network command-and-control software. The results emphasize the versatile networking capabilities of 5G-enabled drones, which open new opportunities for UAV applications.

12.
Sensors (Basel) ; 24(13)2024 Jun 27.
Article in English | MEDLINE | ID: mdl-39000960

ABSTRACT

With the maturity of artificial intelligence (AI) technology, applications of AI in edge computing will greatly promote the development of industrial technology. However, the existing studies on the edge computing framework for the Industrial Internet of Things (IIoT) still face several challenges, such as deep hardware and software coupling, diverse protocols, difficult deployment of AI models, insufficient computing capabilities of edge devices, and sensitivity to delay and energy consumption. To solve the above problems, this paper proposes a software-defined AI-oriented three-layer IIoT edge computing framework and presents the design and implementation of an AI-oriented edge computing system, aiming to support device access, enable the acceptance and deployment of AI models from the cloud, and allow the whole process from data acquisition to model training to be completed at the edge. In addition, this paper proposes a time series-based method for device selection and computation offloading in the federated learning process, which selectively offloads the tasks of inefficient nodes to the edge computing center to reduce the training delay and energy consumption. Finally, experiments carried out to verify the feasibility and effectiveness of the proposed method are reported. The model training time with the proposed method is generally 30% to 50% less than that with the random device selection method, and the training energy consumption under the proposed method is generally 35% to 55% less.
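A minimal sketch of the device-selection idea described above: devices whose recent per-round training times are consistently high are treated as inefficient, and their tasks are offloaded to the edge computing center. The moving-average rule and threshold are illustrative assumptions standing in for the paper's time-series method.

```python
# Time-series-based device selection sketch (assumed moving-average rule and threshold).
import numpy as np

def select_devices(history, window=5, slow_factor=1.5):
    """history: dict of device -> list of past per-round training times (seconds)."""
    recent = {d: np.mean(t[-window:]) for d, t in history.items()}
    median = np.median(list(recent.values()))
    local, offload = [], []
    for device, avg in recent.items():
        (offload if avg > slow_factor * median else local).append(device)
    return local, offload

history = {
    "sensor-gw-1": [3.1, 3.0, 2.9, 3.2, 3.1],
    "sensor-gw-2": [9.8, 10.5, 11.0, 10.2, 10.9],   # consistently slow node
    "sensor-gw-3": [3.4, 3.6, 3.3, 3.5, 3.4],
}
local, offload = select_devices(history)
print("train locally:", local)      # fast devices keep training on-device
print("offload to edge:", offload)  # slow devices' tasks go to the edge centre
```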

13.
Sensors (Basel) ; 24(13)2024 Jun 28.
Article in English | MEDLINE | ID: mdl-39000987

ABSTRACT

This paper introduces the novel design and implementation of a low-power wireless monitoring system designed for nuclear power plants, aiming to enhance safety and operational efficiency. By utilizing advanced signal-processing techniques and energy-efficient technologies, the system supports real-time, continuous monitoring without the need for frequent battery replacements. This addresses the high costs and risks associated with traditional wired monitoring methods. The system focuses on acoustic and ultrasonic analysis, capturing sound using microphones and processing these signals through heterodyne frequency conversion for effective signal management, accommodating low-power consumption through down-conversion. Integrated with edge computing, the system processes data locally at the sensor level, optimizing response times to anomalies and reducing network load. Practical implementation shows significant reductions in maintenance overheads and environmental impact, thereby enhancing the reliability and safety of nuclear power plant operations. The study also sets the groundwork for future integration of sophisticated machine learning algorithms to advance predictive maintenance capabilities in nuclear energy management.
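A minimal numpy/scipy sketch of the heterodyne down-conversion step described above: an ultrasonic component is mixed with a local oscillator and low-pass filtered so that only the low-frequency difference tone remains, allowing decimation and low-power processing. The frequencies and filter order are illustrative assumptions.

```python
# Heterodyne down-conversion sketch (assumed frequencies and filter order).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200_000                                   # ADC sample rate (Hz)
t = np.arange(0, 0.05, 1 / fs)
signal = np.sin(2 * np.pi * 40_000 * t)        # 40 kHz ultrasonic component
lo = np.sin(2 * np.pi * 38_000 * t)            # local oscillator at 38 kHz

mixed = signal * lo                            # mixing yields 2 kHz and 78 kHz products
b, a = butter(4, 5_000 / (fs / 2))             # low-pass keeps only the 2 kHz difference tone
baseband = filtfilt(b, a, mixed)

# The down-converted signal can now be decimated and analysed at the edge node.
decimated = baseband[::20]                     # effective 10 kHz rate
print(len(t), len(decimated))
```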

14.
Sensors (Basel) ; 24(13)2024 Jul 02.
Article in English | MEDLINE | ID: mdl-39001087

ABSTRACT

The growing importance of edge and fog computing in the modern IT infrastructure is driven by the rise of decentralized applications. However, resource allocation within these frameworks is challenging due to varying device capabilities and dynamic network conditions. Conventional approaches often result in poor resource use and slowed advancements. This study presents a novel strategy for enhancing resource allocation in edge and fog computing by integrating machine learning with the blockchain for reliable trust management. Our proposed framework, called CyberGuard, leverages the blockchain's inherent immutability and decentralization to establish a trustworthy and transparent network for monitoring and verifying edge and fog computing transactions. CyberGuard combines the Trust2Vec model with conventional machine-learning models like SVM, KNN, and random forests, creating a robust mechanism for assessing trust and security risks. Through detailed optimization and case studies, CyberGuard demonstrates significant improvements in resource allocation efficiency and overall system performance in real-world scenarios. Our results highlight CyberGuard's effectiveness, evidenced by a remarkable accuracy, precision, recall, and F1-score of 98.18%, showcasing the transformative potential of our comprehensive approach in edge and fog computing environments.
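A minimal scikit-learn sketch of the classical-ML side of the trust assessment: feature vectors (random placeholders standing in for Trust2Vec-style embeddings) are scored by SVM, k-NN, and random forest models combined by soft voting. The feature dimensions, labels, and voting scheme are illustrative assumptions, not CyberGuard's exact pipeline.

```python
# Trust classification sketch with an SVM/k-NN/random-forest ensemble (assumed data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))            # stand-in for node trust embeddings
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # 1 = trustworthy, 0 = risky (toy rule)

ensemble = VotingClassifier(
    estimators=[("svm", SVC(probability=True)),
                ("knn", KNeighborsClassifier(n_neighbors=5)),
                ("rf", RandomForestClassifier(n_estimators=100))],
    voting="soft",
)
ensemble.fit(X[:240], y[:240])
print("held-out accuracy on toy data:", ensemble.score(X[240:], y[240:]))
```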

15.
Sensors (Basel) ; 24(13)2024 Jul 04.
Article in English | MEDLINE | ID: mdl-39001116

ABSTRACT

This study investigates the dynamic deployment of unmanned aerial vehicles (UAVs) using edge computing in a forest fire scenario. We consider the dynamically changing characteristics of forest fires and the corresponding varying resource requirements, and accordingly model a two-timescale UAV dynamic deployment scheme that accounts for changes in both the number and positions of the UAVs. On the slow timescale, a gated recurrent unit (GRU) predicts the number of future users, the number of UAVs is determined from the resource requirements, and UAVs with low energy are replaced accordingly. On the fast timescale, a deep-reinforcement-learning-based UAV position deployment algorithm enables low-latency processing of computational tasks by adjusting the UAV positions in real time to meet the ground devices' computational demands. Simulation results demonstrate that the proposed scheme achieves better prediction accuracy and that the number and positions of UAVs adapt to changes in resource demand, reducing task execution delays.
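A minimal PyTorch sketch of the slow-timescale step described above: a GRU predicts the user count for the next interval from a short history, and the UAV fleet size is derived from the prediction. The network size, history length, and per-UAV capacity are illustrative assumptions.

```python
# GRU-based user-count prediction and fleet sizing sketch (assumed sizes/capacity).
import torch
import torch.nn as nn

class UserCountPredictor(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, history_len, 1)
        out, _ = self.gru(x)
        return self.head(out[:, -1])   # predicted user count for the next slot

model = UserCountPredictor()           # untrained here; would be fitted on historical logs
history = torch.tensor([[[42.], [55.], [61.], [70.], [84.]]])   # users in the last 5 slots
predicted_users = model(history).item()
users_per_uav = 25                     # assumed per-UAV serving capacity
n_uavs = max(1, round(predicted_users / users_per_uav))
print(predicted_users, n_uavs)
```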

16.
Sensors (Basel) ; 24(13)2024 Jul 05.
Article in English | MEDLINE | ID: mdl-39001165

ABSTRACT

The development of contactless methods to assess the degree of personal hygiene in elderly people is crucial for detecting frailty and providing early intervention to prevent complete loss of autonomy, cognitive impairment, and hospitalisation. The unobtrusive nature of the technology is essential for maintaining good quality of life. The use of cameras and edge computing with sensors provides a way of monitoring subjects without interrupting their normal routines, with the advantages of local data processing and improved privacy. This work describes the development of an intelligent system that takes the RGB frames of a video as input to classify the occurrence of brushing teeth, washing hands, and fixing hair; a "no action" activity is also considered. The RGB frames are first processed by two Mediapipe algorithms to extract body keypoints related to the pose and hands, which represent the features to be classified. The optimal feature extractor results from the most complex Mediapipe pose estimator combined with the most complex hand keypoint regressor, which achieves the best performance even when operating at one frame per second. The final classifier is a Light Gradient Boosting Machine, which achieves a weighted F1-score of more than 94% at one frame per second with observation times of seven seconds or more. When the observation window is enlarged to ten seconds, the per-class F1-scores range between 94.66% and 96.35%.
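A minimal sketch of the classification stage described above: keypoint features (random placeholders standing in for Mediapipe pose and hand outputs) aggregated per observation window are classified with a LightGBM model. The feature sizing and the four-class labelling are illustrative assumptions.

```python
# LightGBM classification sketch over placeholder keypoint features (assumed sizing).
import numpy as np
from lightgbm import LGBMClassifier

rng = np.random.default_rng(0)
n_windows = 200
n_features = 33 * 4 + 2 * 21 * 3          # placeholder: pose landmarks + two hands
X = rng.normal(size=(n_windows, n_features))   # stand-in for extracted keypoints per window
y = rng.integers(0, 4, n_windows)  # 0: no action, 1: brushing teeth, 2: washing hands, 3: fixing hair

clf = LGBMClassifier(n_estimators=100)
clf.fit(X[:150], y[:150])
print("held-out accuracy on placeholder data:", clf.score(X[150:], y[150:]))
```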


Subject(s)
Algorithms , Frailty , Humans , Frailty/diagnosis , Aged , Monitoring, Physiologic/methods , Monitoring, Physiologic/instrumentation , Female , Male , Video Recording/methods , Machine Learning
17.
Front Artif Intell ; 7: 1354742, 2024.
Article in English | MEDLINE | ID: mdl-39006803

ABSTRACT

Cardiac disease is considered one of the deadliest diseases and continually increases global mortality. Since considerable expertise is required to predict heart disease accurately, designing an intelligent predictive system for cardiac diseases remains complex and challenging. Internet of Things-based health regulation systems are a relatively recent technology, and novel edge and fog device concepts have been introduced to improve prediction results. However, the main problem with current systems is that they cannot meet the demands of effective diagnosis systems because of their poor prediction capabilities. To overcome this problem, this research proposes a novel framework called HAWKFOGS, which integrates deep learning for practical diagnosis of cardiac problems using edge and fog computing devices. The datasets were gathered from different subjects using IoT devices interfaced with electrocardiography and blood pressure sensors. The data are then classified as normal or abnormal using Logistic Chaos-based Harris Hawk Optimized Enhanced Gated Recurrent Neural Networks. Ablation experiments are carried out using IoT nodes interfaced with medical sensors and fog gateways based on embedded Jetson Nano devices, and the suggested algorithm's performance is measured. Additionally, the model building time is computed to validate the suggested model's responsiveness. Compared with the other algorithms, the suggested model yielded the best results in terms of accuracy (99.7%), precision (99.65%), recall (99.7%), specificity (99.7%), and F1-score (99.69%), and required the least model building time (1.16 s) to predict cardiac diseases.

18.
Water Environ Res ; 96(7): e11074, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39015947

ABSTRACT

Digital twins have attracted immense interest in various fields over the last decade. Bringing conventional process simulation models into (near) real time is thought to provide valuable insights for operators, decision makers, and stakeholders in many industries. The objective of this paper is to describe two methods for implementing digital twins at water resource recovery facilities and to highlight and discuss their differences and the situations where each is preferable, with a focus on automated data transfer from the real process. Case 1 uses a tailor-made infrastructure for automated data transfer between the facility and the digital twin. Case 2 uses edge computing for rapid automated data transfer. In both systems, the data transfer lag from the process to the digital twin is low compared with the simulation frequency. The presented digital twin objectives can be achieved using either method. The method of Case 1 is better suited for automatic recalibration of model parameters, although workarounds exist for the method of Case 2. The method of Case 2 is well suited for objectives such as soft sensors because of its integration with the SCADA system and low latency. The objective of the digital twin, and the required latency of the system, should guide the choice of method. PRACTITIONER POINTS: Various methods can be used for automated data transfer between the physical system and a digital twin. Delays in the data transfer differ depending on the implementation method. The digital twin objective determines the required simulation frequency. The implementation method should be chosen based on the required simulation frequency.


Subject(s)
Automation , Models, Theoretical , Computer Simulation
19.
Heliyon ; 10(13): e33792, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39040324

ABSTRACT

A smart healthcare system (SHS) is a health service system that employs advanced technologies such as wearable devices, the Internet of Things (IoT), and the mobile internet to dynamically access information and connect people and institutions related to healthcare, thereby actively managing and responding to medical ecosystem needs. Edge computing (EC) plays a significant role in SHS because it enables real-time data processing and analysis at the data source, which reduces latency and improves the speed of medical intervention. However, integrating patient information, including electronic health records (EHRs), into the SHS framework raises security and privacy concerns. To address these issues, an intelligent EC framework is proposed in this study, with the objective of accurately identifying security threats and ensuring secure data transmission in the SHS environment. The proposed EC framework leverages Salp Swarm Optimization and a Radial Basis Function Neural Network (SS-RBFN) to enhance security and data privacy. The methodology commences with the collection of healthcare information, which is pre-processed to ensure the consistency and quality of the database for further analysis. Subsequently, the SS-RBFN algorithm is trained on the pre-processed database to distinguish accurately between normal and malicious data streams, offering continuous monitoring in the SHS environment. Additionally, a Rivest-Shamir-Adleman (RSA) approach is applied to safeguard data against security threats during transmission to cloud storage. The proposed model was trained and validated using the IoT-based healthcare database available on Kaggle, and the experimental results demonstrate that it achieved 99.87% accuracy, 99.76% precision, 99.49% F-measure, 98.99% recall, 97.37% throughput, and 1.2 s latency. Furthermore, the results achieved by the proposed model were compared with those of existing models to validate its effectiveness in enhancing security.
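A minimal numpy sketch of a radial basis function network classifier of the kind SS-RBFN builds on: Gaussian activations around fixed centres followed by a linear output layer fitted by least squares. In the paper the centres and widths would be tuned by Salp Swarm Optimization; sampling them from the data here is an illustrative simplification.

```python
# RBF network classifier sketch (centres sampled from data as a stand-in for optimisation).
import numpy as np

rng = np.random.default_rng(0)
# Toy traffic features: benign around 0, malicious shifted by +2.
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.hstack([np.zeros(100), np.ones(100)])

def rbf_features(X, centres, width):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

centres = X[rng.choice(len(X), 20, replace=False)]   # stand-in for optimised centres
width = 1.5
Phi = rbf_features(X, centres, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # linear output weights

preds = (rbf_features(X, centres, width) @ w > 0.5).astype(int)
print("training accuracy on toy data:", (preds == y).mean())
```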

20.
MethodsX ; 13: 102834, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39071997

ABSTRACT

The use of technology in healthcare is one of the most critical application areas today. With the development of medical applications, people's quality of life has improved. However, it is impractical and unnecessary for medium-risk people to receive specialized daily hospital monitoring; yet, given their health status, they face a high risk of severe or even life-threatening health events if they are not monitored at all. Remote, real-time, low-cost, wearable, and effective monitoring is therefore ideal for this group. Many studies have noted that electrocardiogram (ECG) analysis can be used to detect emergencies; however, how to respond to detected emergencies in household settings remains a research gap in this field.
•This paper proposes real-time monitoring of ECG signals, which are sent to the cloud for Sudden Cardiac Death (SCD) prediction.
•Unlike previous studies, the proposed system includes an additional emergency response mechanism that alerts nearby community healthcare workers when an SCD event is predicted.
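A minimal sketch of the emergency-response mechanism highlighted above: when the cloud model flags a predicted SCD event, the nearest on-duty community healthcare workers are notified. The data classes, distance rule, and notify() stub are illustrative assumptions, not the paper's implementation.

```python
# Emergency-alert dispatch sketch (assumed worker registry, distance rule, and notify stub).
import math
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    lat: float
    lon: float
    on_duty: bool

def nearest_workers(patient_lat, patient_lon, workers, k=2):
    on_duty = [w for w in workers if w.on_duty]
    return sorted(on_duty, key=lambda w: math.dist((patient_lat, patient_lon), (w.lat, w.lon)))[:k]

def notify(worker, message):
    # Placeholder: in a deployment this would be a push notification or SMS gateway call.
    print(f"ALERT to {worker.name}: {message}")

workers = [Worker("CHW-Alice", 31.20, 121.45, True),
           Worker("CHW-Bob", 31.26, 121.50, True),
           Worker("CHW-Cara", 31.10, 121.30, False)]

scd_predicted = True          # stand-in for the cloud model's output on the streamed ECG
if scd_predicted:
    for w in nearest_workers(31.21, 121.46, workers):
        notify(w, "Predicted SCD risk for monitored resident; immediate check requested.")
```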
