Predictive processing of scenes and objects.
Peelen, Marius V; Berlot, Eva; de Lange, Floris P.
Affiliation
  • Peelen MV; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands.
  • Berlot E; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands.
  • de Lange FP; Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands.
Nat Rev Psychol ; 3: 13-26, 2024 Jan.
Article in En | MEDLINE | ID: mdl-38989004
ABSTRACT
Real-world visual input consists of rich scenes that are meaningfully composed of multiple objects which interact in complex, but predictable, ways. Despite this complexity, we recognize scenes, and objects within these scenes, from a brief glance at an image. In this review, we synthesize recent behavioral and neural findings that elucidate the mechanisms underlying this impressive ability. First, we review evidence that visual object and scene processing is partly implemented in parallel, allowing for a rapid initial gist of both objects and scenes concurrently. Next, we discuss recent evidence for bidirectional interactions between object and scene processing, with scene information modulating the visual processing of objects, and object information modulating the visual processing of scenes. Finally, we review evidence that objects also combine with each other to form object constellations, modulating the processing of individual objects within the object pathway. Altogether, these findings can be understood by conceptualizing object and scene perception as the outcome of a joint probabilistic inference, in which "best guesses" about objects act as priors for scene perception and vice versa, in order to concurrently optimize visual inference of objects and scenes.
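The joint probabilistic inference described above can be illustrated with a minimal toy model. The sketch below is not from the review itself; the scene and object categories, prior co-occurrence values, and likelihoods are all hypothetical, chosen only to show how a joint prior lets strong scene evidence disambiguate an ambiguous object percept (and, symmetrically, how object evidence would constrain the scene estimate).

```python
import numpy as np

# Toy joint inference over two scenes and two objects, illustrating the idea
# that "best guesses" about scenes act as priors for object perception.
# All numbers are hypothetical, for illustration only.
scenes = ["kitchen", "office"]
objects = ["mixer", "printer"]

# Hypothetical joint prior: mixers co-occur with kitchens, printers with offices.
joint_prior = np.array([[0.40, 0.10],   # kitchen: P(kitchen & mixer), P(kitchen & printer)
                        [0.10, 0.40]])  # office:  P(office & mixer),  P(office & printer)

# Ambiguous object evidence, slightly favouring "printer" in isolation.
p_obj_evidence = np.array([0.45, 0.55])    # P(e_obj | mixer), P(e_obj | printer)
# Clear scene evidence favouring "kitchen".
p_scene_evidence = np.array([0.90, 0.10])  # P(e_scene | kitchen), P(e_scene | office)

# Joint posterior ∝ joint prior × scene likelihood × object likelihood.
posterior = joint_prior * p_scene_evidence[:, None] * p_obj_evidence[None, :]
posterior /= posterior.sum()

# Marginal object posterior: scene context overrides the ambiguous object evidence,
# so "mixer" wins even though the object evidence alone favoured "printer".
p_object = posterior.sum(axis=0)
print(dict(zip(objects, p_object.round(3))))
```

Because the prior is defined over scene-object pairs rather than over each variable separately, conditioning on evidence for either variable automatically reshapes the posterior over the other, which is the bidirectional modulation the review describes.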

Full text: 1 Database: MEDLINE Language: En Publication year: 2024 Document type: Article