Event-Based Gesture Recognition With Dynamic Background Suppression Using Smartphone Computational Capabilities.
Maro, Jean-Matthieu; Ieng, Sio-Hoi; Benosman, Ryad.
Affiliation
  • Maro JM; Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France.
  • Ieng SH; Sorbonne Université, INSERM, CNRS, Institut de la Vision, Paris, France.
  • Benosman R; CHNO des Quinze-Vingts, INSERM-DGOS CIC 1423, Paris, France.
Front Neurosci ; 14: 275, 2020.
Article in En | MEDLINE | ID: mdl-32327968
ABSTRACT
In this paper, we introduce a framework for dynamic gesture recognition with background suppression that operates on the output of a moving event-based camera. The system is designed to run in real time using only the computational capabilities of a mobile phone. It builds a new development around the concept of time-surfaces and presents a novel event-based methodology for dynamically removing backgrounds that exploits the high temporal resolution of event-based cameras. To our knowledge, this is the first Android event-based framework for vision-based recognition of dynamic gestures running on a smartphone without off-board processing. We assess performance across several indoor and outdoor scenarios, under static and dynamic conditions and uncontrolled lighting. We also introduce a new event-based dataset for gesture recognition with static and dynamic backgrounds, which is made publicly available. The set of gestures was selected following a clinical trial to enable human-machine interaction for the visually impaired and older adults. Finally, we report comparisons with prior work on event-based gesture recognition, achieving comparable results without the use of advanced classification techniques or power-hungry hardware.
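The time-surface representation mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the exponential decay kernel, the parameter tau, and the microsecond timestamps are assumptions based on the standard time-surface formulation, shown here only to convey the idea of turning recent event timestamps into a local spatio-temporal descriptor.

```python
import numpy as np

def time_surface(last_ts, event_xy, t_now, tau=50e3, radius=3):
    """Sketch of a time-surface around an incoming event (illustrative only).

    last_ts  : 2D array of the most recent event timestamp at each pixel (microseconds).
    event_xy : (x, y) coordinates of the incoming event.
    t_now    : timestamp of the incoming event (microseconds).
    tau      : assumed exponential decay constant.
    radius   : half-size of the spatial neighborhood.
    """
    x, y = event_xy
    last_ts[y, x] = t_now  # record the newest event at this pixel

    # Extract the spatial neighborhood centered on the event.
    patch = last_ts[max(0, y - radius): y + radius + 1,
                    max(0, x - radius): x + radius + 1]

    # Exponentially decay the elapsed time: pixels with recent activity map
    # close to 1, stale pixels decay toward 0.
    return np.exp(-(t_now - patch) / tau)

# Example usage on a small sensor array (values are arbitrary):
ts_map = np.zeros((64, 64))
descriptor = time_surface(ts_map, event_xy=(10, 20), t_now=1_000.0)
```

The descriptor produced this way is what a lightweight classifier can consume on-device; the paper's contribution additionally includes an event-based background-suppression step, whose details are given in the full text.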
Full text: 1 Database: MEDLINE Language: En Year of publication: 2020 Document type: Article