Bumble bees display cross-modal object recognition between visual and tactile senses.
Solvi, Cwyn; Gutierrez Al-Khudhairy, Selene; Chittka, Lars.
Affiliations
  • Solvi C; School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK. cwyn.solvi@mq.edu.au.
  • Gutierrez Al-Khudhairy S; Department of Biological Sciences, Macquarie University, North Ryde, NSW 2109, Australia.
  • Chittka L; School of Biological and Chemical Sciences, Queen Mary University of London, London E1 4NS, UK.
Science; 367(6480): 910-912, 2020 Feb 21.
Article in En | MEDLINE | ID: mdl-32079771
Many animals can associate object shapes with incentives. However, such behavior is possible without storing images of shapes in memory that are accessible to more than one sensory modality. One way to explore whether there are modality-independent internal representations of object shapes is to investigate cross-modal recognition: experiencing an object in one sensory modality and later recognizing it in another. We show that bumble bees trained to discriminate two differently shaped objects (cubes and spheres) using only touch (in darkness) or vision (in light, but barred from touching the objects) could subsequently discriminate those same objects using only the other sensory information. Our experiments demonstrate that bumble bees possess the ability to integrate sensory information in a way that requires modality-independent internal representations.
Subjects

Full text: 1 Collections: 01-international Database: MEDLINE Main subjects: Pattern Recognition, Visual / Bees / Pattern Recognition, Physiological / Touch Perception Limits: Animals Language: En Journal: Science Publication year: 2020 Document type: Article