A Multilayer Perceptron-Based Spherical Visual Compass Using Global Features.
Du, Yao; Mateo, Carlos; Tahri, Omar.
Affiliation
  • Du Y; Université Bourgogne, 21000 Dijon, France.
  • Mateo C; ICB UMR CNRS 6303, Université Bourgogne, 21000 Dijon, France.
  • Tahri O; ICB UMR CNRS 6303, Université Bourgogne, 21000 Dijon, France.
Sensors (Basel); 24(7), 2024 Mar 31.
Article in En | MEDLINE | ID: mdl-38610457
ABSTRACT
This paper presents a visual compass method utilizing global features, specifically spherical moments. One of the primary challenges faced by photometric methods employing global features is the variation in the image caused by the appearance and disappearance of regions within the camera's field of view as it moves. Additionally, modeling the impact of translational motion on the values of global features poses a significant challenge, as it depends on scene depths, particularly for non-planar scenes. To address these issues, this paper combines image masks, which mitigate abrupt changes in global feature values, with neural networks, which tackle the modeling challenge posed by translational motion. By employing masks at various locations within the image, multiple rotation estimates corresponding to the motion of each selected region can be obtained. Our contribution lies in offering a rapid method for applying numerous masks to the image with real-time inference speed, rendering it suitable for embedded robot applications. Extensive experiments have been conducted on both real-world and synthetic datasets generated using Blender. The results validate the accuracy, robustness, and real-time performance of the proposed method compared to a state-of-the-art method.
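The abstract does not include an implementation, so the following is only a minimal, hypothetical Python sketch of the general idea it describes: first-order spherical moments are computed over several image masks, and a small multilayer perceptron maps the stacked moments of two frames to a rotation estimate. The equirectangular image model, the grid mask layout, the moment order, the network shape, and all names (spherical_moments, grid_masks, CompassMLP) are assumptions made for illustration, not the authors' code.

```python
# Hypothetical sketch of a mask-based spherical visual compass.
# Not the authors' implementation; all design choices are assumptions.

import numpy as np
import torch
import torch.nn as nn


def spherical_moments(img, mask):
    """First-order spherical moment of a masked equirectangular image.

    img  : (H, W) grayscale intensities in [0, 1]
    mask : (H, W) binary mask selecting the region used for the moment
    Returns m = sum_s I(s) * s * dOmega(s), where s is the unit viewing
    direction of each pixel and dOmega = sin(theta) dtheta dphi.
    """
    H, W = img.shape
    theta = (np.arange(H) + 0.5) / H * np.pi           # polar angle
    phi = (np.arange(W) + 0.5) / W * 2.0 * np.pi       # azimuth
    th, ph = np.meshgrid(theta, phi, indexing="ij")
    # Unit viewing direction of every pixel on the sphere.
    s = np.stack([np.sin(th) * np.cos(ph),
                  np.sin(th) * np.sin(ph),
                  np.cos(th)], axis=-1)                # (H, W, 3)
    d_omega = np.sin(th) * (np.pi / H) * (2.0 * np.pi / W)
    w = img * mask * d_omega                           # per-pixel weight
    return (w[..., None] * s).sum(axis=(0, 1))         # (3,)


def grid_masks(H, W, n_rows=2, n_cols=4):
    """Hypothetical mask layout: a regular grid of rectangular regions."""
    masks = []
    for i in range(n_rows):
        for j in range(n_cols):
            m = np.zeros((H, W), dtype=np.float64)
            m[i * H // n_rows:(i + 1) * H // n_rows,
              j * W // n_cols:(j + 1) * W // n_cols] = 1.0
            masks.append(m)
    return masks


class CompassMLP(nn.Module):
    """Small MLP mapping stacked masked moments of two frames to a rotation."""

    def __init__(self, n_masks, hidden=128):
        super().__init__()
        # Input: 3 moment components per mask, for each of the two images.
        self.net = nn.Sequential(
            nn.Linear(2 * 3 * n_masks, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),   # axis-angle rotation estimate
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    H, W = 64, 128
    rng = np.random.default_rng(0)
    img_t, img_t1 = rng.random((H, W)), rng.random((H, W))  # stand-in frames

    masks = grid_masks(H, W)
    feats = np.concatenate(
        [spherical_moments(im, m) for im in (img_t, img_t1) for m in masks])

    model = CompassMLP(n_masks=len(masks))
    with torch.no_grad():
        rot = model(torch.as_tensor(feats, dtype=torch.float32))
    print("predicted axis-angle rotation:", rot.numpy())
```

In the method described by the abstract, each masked region yields its own rotation estimate; the sketch above simply concatenates the moments of all masks into a single network input, which is a deliberate simplification rather than a faithful reproduction of that multi-estimate scheme.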
Keywords

Full text: 1 | Collections: 01-international | Database: MEDLINE | Language: En | Year of publication: 2024 | Document type: Article