Results 1 - 2 of 2
1.
IEEE Trans Cybern; PP, 2022 Nov 29.
Article in English | MEDLINE | ID: mdl-36446005

ABSTRACT

When humans interact with each other, eye-gaze movements have to support motor control as well as communication. On the one hand, we need to fixate the task goal to retrieve the visual information required for safe and precise action execution. On the other hand, gaze movements serve communication, both for reading the intentions of our interaction partners and for signaling our own action intentions to others. We study this Gaze Dialogue between two participants working on a collaborative task involving two types of actions: 1) individual action and 2) action-in-interaction. We recorded the eye-gaze data of both participants during the interaction sessions in order to build a computational model, the Gaze Dialogue, encoding the interplay of eye movements during the dyadic interaction. The model also captures the correlation between the different gaze fixation points and the nature of the action. This knowledge is used to infer the type of action performed by an individual. We validated the model against the recorded eye-gaze behavior of one subject, taking the eye-gaze behavior of the other subject as the input. Finally, we used the model to design a humanoid robot controller that provides interpersonal gaze coordination in human-robot interaction scenarios. During the interaction, the robot is able to: 1) adequately infer the human action from gaze cues; 2) adjust its gaze fixation according to the human eye-gaze behavior; and 3) signal nonverbal cues that correlate with its own action intentions.
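The inference step described above, predicting the action type from a partner's gaze fixations, can be sketched as a simple Bayesian update over fixation targets. This is an illustrative sketch only: the fixation categories and likelihood values below are our assumptions, not the fitted Gaze Dialogue model from the paper.

```python
# Illustrative sketch: naive Bayes inference of action type from a
# sequence of observed gaze-fixation targets. The probability tables
# are hypothetical, not the paper's learned model.

FIXATION_LIKELIHOOD = {
    # P(fixation target | action type), illustrative values
    "individual":  {"own_workspace": 0.7, "shared_object": 0.2, "partner_face": 0.1},
    "interaction": {"own_workspace": 0.2, "shared_object": 0.4, "partner_face": 0.4},
}

def infer_action(fixations, prior=None):
    """Return the posterior P(action type | fixation sequence)."""
    posterior = dict(prior or {"individual": 0.5, "interaction": 0.5})
    for fx in fixations:
        # Multiply in the likelihood of this fixation under each action type
        for action in posterior:
            posterior[action] *= FIXATION_LIKELIHOOD[action].get(fx, 1e-6)
        # Renormalize so the posterior sums to 1
        total = sum(posterior.values())
        posterior = {a: p / total for a, p in posterior.items()}
    return posterior

# Frequent fixations on the partner's face suggest action-in-interaction
post = infer_action(["partner_face", "shared_object", "partner_face"])
```

Observing mostly partner-directed fixations shifts the posterior toward the "interaction" hypothesis, mirroring how the robot controller infers human action from gaze cues.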

2.
Front Neurorobot; 13: 36, 2019.
Article in English | MEDLINE | ID: mdl-31214011

ABSTRACT

The focus of research in biped locomotion has moved toward real-life applications such as walking on uneven terrain, passing through doors, and climbing stairs and ladders. As a result, we are witnessing significant advances in the locomotion of biped robots, enabling them to move in hazardous environments while simultaneously accomplishing complex manipulation tasks. Yet, when walking in an unknown environment, the efficiency of humanoid robots is still far from comparable with that of humans. Currently, bipeds are very sensitive to external changes and face severe constraints when adapting their walk to the conditions of such a complex environment. Promising approaches for the efficient generation and realization of walking in a complex environment are based on biological solutions refined over many years of evolution. This work presents one such human-inspired methodology for path planning and the realization of biped walking suitable for motion in a complex, unfamiliar environment. Path planning produces clothoid curves, which represent human-like walking paths well. The robot walk is realized by composing parametric motion primitives. This approach enables on-line modification of the planned path and walk parameters at any moment. To link the high-level path planner with the low-level joint motion realization, we had to extract parameters of the clothoid paths that can be mapped to the walk parameters and, consequently, to the motion primitive parameters. This enabled the robot to adapt its walking to avoid obstacles and to transition smoothly between different paths.
In this paper we provide a complete framework that integrates the following components: (i) bio-inspired online path planning, (ii) path-dependent automatic calculation of high-level gait parameters (step length, walking speed, direction, and the height of the foot sole), and (iii) automatic calculation of low-level joint movements and corresponding control terms (driving motor voltage) through the adaptation of motion primitives that realize the walking pattern and preserve the dynamic balance of the robot.
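The clothoid paths central to the planner above can be sketched numerically: a clothoid is a curve whose curvature grows linearly with arc length, so the heading angle grows quadratically, which is why it approximates the smooth turn-in of human walking paths. The sketch below is illustrative only; the `sharpness` parameter and the midpoint-rule integration are our assumptions, not the paper's implementation.

```python
import math

def clothoid_points(sharpness, length, n=200):
    """Sample (x, y, heading) along a clothoid.

    Curvature grows linearly with arc length s: kappa(s) = sharpness * s,
    so the heading is theta(s) = 0.5 * sharpness * s**2. The position is
    obtained by numerically integrating (cos(theta), sin(theta)) along s
    with a simple midpoint rule.
    """
    ds = length / n
    x = y = 0.0
    pts = [(x, y, 0.0)]
    for i in range(n):
        s_mid = (i + 0.5) * ds                    # arc length at segment midpoint
        theta_mid = 0.5 * sharpness * s_mid ** 2  # heading at the midpoint
        x += math.cos(theta_mid) * ds
        y += math.sin(theta_mid) * ds
        theta_end = 0.5 * sharpness * ((i + 1) * ds) ** 2
        pts.append((x, y, theta_end))
    return pts

# sharpness = 0 degenerates to a straight segment; a positive sharpness
# bends the path progressively, as when a walker eases into a turn.
straight = clothoid_points(0.0, 1.0)
turning = clothoid_points(1.0, 2.0)
```

In a planner like the one described, parameters such as `sharpness` and `length` would be the quantities extracted from the planned path and mapped onto step length, walking speed, and direction.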
