Automatic Timed Up-and-Go Sub-Task Segmentation for Parkinson's Disease Patients Using Video-Based Activity Classification.
IEEE Trans Neural Syst Rehabil Eng; 26(11): 2189-2199, Nov 2018.
Article
| MEDLINE
| ID: mdl-30334764
ABSTRACT
The timed up-and-go (TUG) test is widely accepted as a standard assessment of basic functional mobility in patients with Parkinson's disease. A TUG test comprises several basic mobility sub-tasks: "Sit," "Sit-to-Stand," "Walk," "Turn," "Walk-Back," and "Sit-Back." The time costs of these sub-tasks have been shown to be useful clinical parameters for the assessment of Parkinson's disease. Several automatic methods have been proposed to segment and time these sub-tasks in a TUG test. However, these methods usually require either well-controlled environments for the TUG video recording or information from special devices, such as wearable inertial sensors, ambient sensors, or depth cameras. In this paper, an automatic TUG sub-task segmentation method using video-based activity classification is proposed and validated in a study with 24 Parkinson's disease patients. The videos used in this paper are recorded in semi-controlled environments with various backgrounds. State-of-the-art deep learning-based 2-D human pose estimation technologies are used for feature extraction. A support vector machine and a long short-term memory network are then used for activity classification and sub-task segmentation. Our method can automatically acquire clinical parameters for the assessment of Parkinson's disease using only TUG videos, opening the possibility of remote monitoring of patients' condition.
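The final step described in the abstract, turning per-frame activity predictions into timed sub-task segments, can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the authors' implementation: it assumes per-frame labels have already been produced by a classifier (e.g. the SVM/LSTM stage) and simply groups consecutive identical labels into segments, reporting each segment's duration from the video frame rate.

```python
from itertools import groupby

# The six TUG sub-tasks named in the abstract.
TUG_SUBTASKS = ["Sit", "Sit-to-Stand", "Walk", "Turn", "Walk-Back", "Sit-Back"]

def segment_subtasks(frame_labels, fps=30.0):
    """Group consecutive per-frame activity labels into (label, duration_s) pairs.

    frame_labels: sequence of sub-task labels, one per video frame,
                  as output by an upstream activity classifier (assumed).
    fps: video frame rate used to convert frame counts to seconds.
    """
    segments = []
    for label, run in groupby(frame_labels):
        n_frames = sum(1 for _ in run)  # length of this run of identical labels
        segments.append((label, n_frames / fps))
    return segments

# Example: 30 frames of "Sit" then 60 frames of "Walk" at 30 fps
# yields one 1.0 s segment and one 2.0 s segment.
print(segment_subtasks(["Sit"] * 30 + ["Walk"] * 60, fps=30.0))
```

In practice the raw per-frame predictions would likely be smoothed (e.g. by majority voting over a sliding window) before segmentation, since isolated misclassified frames would otherwise create spurious short segments.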
Full text: 1
Collection: 01-international
Database: MEDLINE
Main subject: Parkinson Disease / Mobility Limitation
Study type: Prognostic_studies
Limits: Aged / Female / Humans / Male / Middle Aged
Language: English
Journal: IEEE Trans Neural Syst Rehabil Eng
Journal subject: Biomedical Engineering / Rehabilitation
Year: 2018
Document type: Article