Humans ignore motion and stereo cues in favor of a fictional stable world.
Curr Biol; 16(4): 428-32, 2006 Feb 21.
Article in English | MEDLINE | ID: mdl-16488879
As we move through the world, our eyes acquire a sequence of images. The information from this sequence is sufficient to determine the structure of a three-dimensional scene, up to a scale factor determined by the distance that the eyes have moved. Previous evidence shows that the human visual system accounts for the distance the observer has walked and the separation of the eyes when judging the scale, shape, and distance of objects. However, in an immersive virtual-reality environment, observers failed to notice when a scene expanded or contracted, despite having consistent information about scale from both distance walked and binocular vision. This failure led to large errors in judging the size of objects. The pattern of errors cannot be explained by assuming a visual reconstruction of the scene with an incorrect estimate of interocular separation or distance walked. Instead, it is consistent with a Bayesian model of cue integration in which the efficacy of motion and disparity cues is greater at near viewing distances. Our results imply that observers are more willing to adjust their estimate of interocular separation or distance walked than to accept that the scene has changed in size.
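The Bayesian cue-integration account described above rests on precision-weighted averaging: each cue contributes in proportion to its reliability, so when motion and disparity cues lose efficacy at far viewing distances, other information (such as an assumption of a stable scene) carries relatively more weight. A minimal sketch of this standard fusion rule for independent Gaussian cues, with hypothetical numbers rather than the paper's fitted model:

```python
# Minimal sketch of precision-weighted (Bayesian) cue combination for
# independent Gaussian cues. This is the textbook fusion rule, not the
# authors' specific model; all numbers below are hypothetical.

def combine_cues(estimates, variances):
    """Fuse cue estimates by precision weighting.

    Each cue's weight is the inverse of its variance, so a more
    reliable cue (lower variance) dominates the fused estimate.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    return mean, 1.0 / total  # fused estimate and its (reduced) variance

# Hypothetical depth estimates from two cues (e.g. disparity and motion):
# at near distances both cues are precise, so the fused estimate tracks
# the more reliable cue closely.
near_mean, near_var = combine_cues([1.0, 2.0], [0.1, 0.4])
# At far distances the same cues are noisier; the fused estimate has the
# same mean here but far less precision, leaving room for a prior (such
# as scene stability) to dominate in a fuller model.
far_mean, far_var = combine_cues([1.0, 2.0], [1.0, 4.0])
```

Note that the fused variance is always smaller than either input variance, which is why combining cues helps at all; the paper's point is about what happens when those input variances grow with viewing distance.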
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Main subject: Space Perception
Study type: Prognostic studies
Limits: Humans
Language: English
Journal: Curr Biol
Journal subject: Biology
Publication year: 2006
Document type: Article
Affiliation country: United Kingdom