Machine learning-driven self-discovery of the robot body morphology.
Sci Robot; 8(85): eadh0972, 2023 Dec 13.
Article in English | MEDLINE | ID: mdl-38091427
The morphology of a robot is typically assumed to be known, and data from external measuring devices are used mainly for its kinematic calibration. In contrast, we take an agent-centric perspective and consider the largely unexplored question of whether a robot could learn elements of its morphology by itself, relying on minimal prior knowledge and only on unorganized proprioceptive signals. To answer this question, we propose a mutual information-based representation of the relationships between the proprioceptive signals of a robot, which we call proprioceptive information graphs (π-graphs). Leveraging the fact that the information structure of the sensorimotor apparatus depends on the embodiment of the robot, we use the π-graph to look for pairwise signal relationships that reflect the first-order kinematic principles governing the robot's structure. We show that analysis of the π-graph leads to the inference of two fundamental elements of the robot's morphology: its mechanical topology and the corresponding kinematic description, that is, the location and orientation of the robot's joints. Results from a robot manipulator, a hexapod, and a humanoid robot show that the correct topology and kinematic description can be effectively inferred from their π-graphs either offline or online, regardless of the number of links and body configuration.
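To make the core idea concrete, the following is a minimal sketch (not the authors' published implementation) of how a π-graph-style analysis could be set up: estimate pairwise mutual information between proprioceptive signal streams and extract a tree topology from the resulting weighted graph. The histogram-based MI estimator, the Prim-style maximum spanning tree, and all function names and parameters are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    # Histogram-based estimate of I(x; y) in nats (a simple, assumed estimator).
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def pi_graph(signals, bins=32):
    # Pairwise mutual-information matrix over a (T, n) array of proprioceptive signals.
    n = signals.shape[1]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mi[j, i] = mutual_information(signals[:, i], signals[:, j], bins)
    return mi

def maximum_spanning_tree(mi):
    # Prim's algorithm on the MI matrix; the tree edges serve as a proxy
    # for the mechanical topology (strongly related signals share a link).
    n = mi.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or mi[i, j] > mi[best[0], best[1]]):
                    best = (i, j)
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Toy usage with synthetic joint-angle signals of a serial chain, where
# neighbouring joints share more information than distant ones.
rng = np.random.default_rng(0)
q1 = rng.normal(size=5000)
q2 = q1 + 0.3 * rng.normal(size=5000)
q3 = q2 + 0.3 * rng.normal(size=5000)
signals = np.stack([q1, q2, q3], axis=1)
print(maximum_spanning_tree(pi_graph(signals)))  # expected to recover the chain (0,1), (1,2)
```

The spanning-tree step is only one plausible way to turn pairwise information values into a topology; the paper's full method additionally recovers the kinematic description (joint locations and orientations), which this sketch does not attempt.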