Model-Based 3D Gaze Estimation Using a TOF Camera.
Shen, Kuanxin; Li, Yingshun; Guo, Zhannan; Gao, Jintao; Wu, Yingjian.
Affiliations
  • Shen K; School of Chemical Process Automation, Shenyang University of Technology, Liaoyang 111003, China.
  • Li Y; School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China.
  • Guo Z; School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China.
  • Gao J; School of Chemical Process Automation, Shenyang University of Technology, Liaoyang 111003, China.
  • Wu Y; School of Chemical Process Automation, Shenyang University of Technology, Liaoyang 111003, China.
Sensors (Basel) ; 24(4)2024 Feb 06.
Article in En | MEDLINE | ID: mdl-38400227
ABSTRACT
Among the numerous gaze-estimation methods currently available, appearance-based methods predominantly take RGB images as input and employ convolutional neural networks (CNNs) to regress gaze angles or gaze points from detected facial images, while model-based methods require high-resolution images to recover a clear geometric eyeball model. Both approaches face significant challenges in outdoor environments and other practical application scenarios. This paper proposes a model-based gaze-estimation algorithm that uses a low-resolution 3D TOF camera. We use infrared images instead of RGB images as input to overcome the effect of varying environmental illumination on gaze estimation. A trained YOLOv8 neural network model detects eye landmarks in the captured facial images; combined with the depth map from the time-of-flight (TOF) camera, these landmarks give the 3D coordinates of the canthus points of a single eye of the subject. From these coordinates, we fit a 3D geometric model of the eyeball to determine the subject's gaze angle. Experimental validation showed that our method achieved root mean square errors of 6.03° and 4.83° in the horizontal and vertical directions, respectively, for estimating the subject's gaze angle. We also tested the proposed method in a real car-driving environment, achieving stable driver gaze detection at various locations inside the car, such as the dashboard, driver mirror, and in-vehicle screen.
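
The following Python sketch illustrates the geometric step the abstract describes: back-projecting eye-landmark pixels and their TOF depth values into 3D camera coordinates and deriving horizontal/vertical gaze angles. It is not the authors' implementation; the camera intrinsics, the depth_at lookup, the landmark inputs, and the fixed-radius eyeball approximation are all illustrative assumptions standing in for the paper's fitted 3D eyeball model.

    # Illustrative sketch, not the published method: pinhole back-projection of
    # TOF pixel/depth measurements and a simplified eyeball-centre approximation.
    import numpy as np

    # Hypothetical TOF camera intrinsics (focal lengths and principal point, pixels).
    FX, FY, CX, CY = 525.0, 525.0, 160.0, 120.0

    def backproject(u, v, depth_m):
        """Back-project pixel (u, v) with depth in metres to a 3D camera-frame point."""
        x = (u - CX) * depth_m / FX
        y = (v - CY) * depth_m / FY
        return np.array([x, y, depth_m])

    def gaze_angles(inner_canthus_px, outer_canthus_px, pupil_px, depth_at):
        """Estimate horizontal/vertical gaze angles (degrees) from eye landmarks.

        depth_at(u, v) is assumed to return the TOF depth (metres) at a pixel.
        The eyeball centre is approximated as lying one nominal eyeball radius
        behind the canthus midpoint along the camera axis -- a simplification
        of the fitted 3D eyeball model described in the paper.
        """
        EYEBALL_RADIUS = 0.012  # ~12 mm nominal radius (assumption)

        p_inner = backproject(*inner_canthus_px, depth_at(*inner_canthus_px))
        p_outer = backproject(*outer_canthus_px, depth_at(*outer_canthus_px))
        p_pupil = backproject(*pupil_px, depth_at(*pupil_px))

        eye_mid = 0.5 * (p_inner + p_outer)
        eye_centre = eye_mid + np.array([0.0, 0.0, EYEBALL_RADIUS])

        # Gaze direction: from eyeball centre through the pupil point.
        gaze = p_pupil - eye_centre
        gaze /= np.linalg.norm(gaze)

        yaw = np.degrees(np.arctan2(gaze[0], -gaze[2]))     # horizontal angle
        pitch = np.degrees(np.arctan2(-gaze[1], -gaze[2]))  # vertical angle
        return yaw, pitch

In practice the landmark pixel coordinates would come from the YOLOv8 keypoint detections on the infrared image, and the eyeball-model fit would replace the fixed-offset approximation used above.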
Full text: 1 Database: MEDLINE Language: En Year of publication: 2024 Document type: Article