Biosensor-Driven IoT Wearables for Accurate Body Motion Tracking and Localization.
Almujally, Nouf Abdullah; Khan, Danyal; Al Mudawi, Naif; Alonazi, Mohammed; Alazeb, Abdulwahab; Algarni, Asaad; Jalal, Ahmad; Liu, Hui.
Affiliations
  • Almujally NA; Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia.
  • Khan D; Faculty of Computing and AI, Air University, E-9, Islamabad 44000, Pakistan.
  • Al Mudawi N; Department of Computer Science, College of Computer Science and Information System, Najran University, Najran 55461, Saudi Arabia.
  • Alonazi M; Department of Information Systems, College of Computer Engineering and Sciences, Prince Sattam bin Abdulaziz University, Al-Kharj 16273, Saudi Arabia.
  • Alazeb A; Department of Computer Science, College of Computer Science and Information System, Najran University, Najran 55461, Saudi Arabia.
  • Algarni A; Department of Computer Sciences, Faculty of Computing and Information Technology, Northern Border University, Rafha 91911, Saudi Arabia.
  • Jalal A; Faculty of Computing and AI, Air University, E-9, Islamabad 44000, Pakistan.
  • Liu H; Cognitive Systems Lab, University of Bremen, 28359 Bremen, Germany.
Sensors (Basel); 24(10); 2024 May 10.
Article in English | MEDLINE | ID: mdl-38793886
ABSTRACT
Human locomotion identification through smartphone sensors is a rapidly expanding research area, with significant potential across sectors including healthcare, sports, security systems, home automation, and real-time location tracking. Despite the considerable volume of existing research, most of it has concentrated on locomotion activities, and comparatively little emphasis has been placed on recognizing human localization patterns. In this study, we introduce a system that recognizes both human physical and location-based activity patterns using the capabilities of smartphone sensors. Our goal is to accurately identify physical and localization activities such as walking, running, jumping, and indoor and outdoor activity. To achieve this, we preprocess the raw sensor data with a Butterworth filter for inertial sensors and a median filter for Global Positioning System (GPS) data, and then apply Hamming windowing to segment the filtered data. We then extract features from the inertial and GPS signals and select the relevant ones using the variance-threshold feature selection method. The Extrasensory dataset contains an imbalanced number of samples for certain activities; to address this, we employ a permutation-based data augmentation technique. The augmented features are optimized using the Yeo-Johnson power transformation before being passed to a multi-layer perceptron for classification. We evaluate the system using K-fold cross-validation. The datasets used in this study, Extrasensory and Sussex-Huawei Locomotion (SHL), contain both physical and localization activities.
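The pipeline described in the abstract can be sketched end-to-end with standard scipy/scikit-learn tools. This is a minimal illustration on synthetic data, not the authors' implementation: the sampling rate, filter order and cutoff, window and hop sizes, feature choices, slice count for permutation augmentation, and network architecture are all assumptions, and the slice-and-shuffle augmentation shown here operates on signal windows, whereas the paper applies permutation-based augmentation to balance under-represented classes.

```python
import numpy as np
from scipy.signal import butter, filtfilt, medfilt
from sklearn.feature_selection import VarianceThreshold
from sklearn.preprocessing import PowerTransformer
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for raw sensor streams (50 Hz sampling rate assumed)
fs = 50.0
inertial = rng.normal(size=2000)       # e.g. one accelerometer axis
gps_speed = rng.normal(size=2000)      # e.g. GPS-derived speed

# 1) Denoising: low-pass Butterworth for inertial data, median filter for GPS
b, a = butter(N=4, Wn=10.0 / (fs / 2), btype="low")  # order/cutoff assumed
inertial_f = filtfilt(b, a, inertial)
gps_f = medfilt(gps_speed, kernel_size=5)            # kernel size assumed

# 2) Segmentation into overlapping Hamming-windowed frames
win, hop = 128, 64
frames = np.stack([inertial_f[i:i + win] * np.hamming(win)
                   for i in range(0, len(inertial_f) - win + 1, hop)])

# 3) Permutation-based augmentation: slice each window, shuffle the slices
def permute_augment(x, n_slices=4):
    parts = np.array_split(x, n_slices)
    rng.shuffle(parts)
    return np.concatenate(parts)

frames = np.vstack([frames, np.apply_along_axis(permute_augment, 1, frames)])
labels = rng.integers(0, 3, size=len(frames))        # dummy activity labels

# 4) Simple per-window features (mean, std, energy -- illustrative choices)
feats = np.column_stack([frames.mean(1), frames.std(1), (frames ** 2).sum(1)])

# 5) Variance-threshold feature selection
feats = VarianceThreshold(threshold=1e-6).fit_transform(feats)

# 6) Yeo-Johnson power transform, then an MLP scored with K-fold CV
X = PowerTransformer(method="yeo-johnson").fit_transform(feats)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
scores = cross_val_score(clf, X, labels, cv=5)
print("fold accuracies:", scores)
```

With real data, each windowed segment would carry an activity label from the dataset, and the feature set would be far richer than the three statistics used here; the structure of the pipeline, however, follows the sequence of steps the abstract lists.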
Our experiments demonstrate that the system achieves high accuracy, reaching 96% and 94% for physical activities and 94% and 91% for location-based activities on the Extrasensory and SHL datasets, respectively, outperforming previous state-of-the-art methods on both types of activities.

Full text: 1 | Collections: 01-international | Database: MEDLINE | Main subjects: Algorithms / Biosensing Techniques / Geographic Information Systems / Wearable Electronic Devices | Limit: Humans | Language: English | Publication year: 2024 | Document type: Article