A unified computational framework for visual attention dynamics.
Zanca, Dario; Gori, Marco; Rufa, Alessandra.
Affiliation
  • Zanca D; University of Florence, Florence, Italy; University of Siena, Siena, Italy. Electronic address: dario.zanca@unifi.it.
  • Gori M; University of Siena, Siena, Italy.
  • Rufa A; University of Siena, Siena, Italy.
Prog Brain Res; 249: 183-188, 2019.
Article in English | MEDLINE | ID: mdl-31325977
ABSTRACT
Eye movements are an essential part of human vision, as they drive the fovea and, consequently, selective visual attention toward a region of interest in space. Free visual exploration is an inherently stochastic process that depends not only on image statistics but also on individual variability in cognitive and attentional state. We propose a theory of free visual exploration formulated entirely within the framework of physics and based on the general Principle of Least Action. Within this framework, differential laws describing eye movements emerge in accordance with bottom-up functional principles. In addition, we integrate top-down semantic information captured by deep convolutional neural networks pre-trained for the classification of common objects. To stress-test the model, we used a wide collection of images containing basic features as well as high-level semantic content. Results on a saliency-prediction task validate the theory.
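The variational formulation sketched in the abstract can be illustrated generically. As a minimal sketch (the toy Lagrangian below, with placeholder mass \(m\) and potential \(V\), is an assumed illustration, not the paper's actual functional), a gaze trajectory \(x(t)\) is taken to make an action functional stationary, and the differential laws of motion follow from the Euler-Lagrange equations:

```latex
% Action functional over a gaze trajectory x(t) on the image plane
S[x] = \int_{0}^{T} L\big(x(t), \dot{x}(t), t\big)\, dt,
\qquad \delta S = 0
% Stationarity yields the Euler-Lagrange equations
\;\Longrightarrow\;
\frac{d}{dt}\,\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x} = 0.

% Toy example (hypothetical, for illustration only): a kinetic term
% plus a saliency-derived potential V(x, t)
L = \frac{m}{2}\,\lVert \dot{x} \rVert^{2} - V(x, t)
\;\Longrightarrow\;
m\,\ddot{x} = -\nabla V(x, t),
```

so that, under this illustrative choice, the gaze point accelerates toward regions where the potential \(V\) (e.g., one derived from low-level image features) is low, mimicking attraction toward salient locations.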
Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Attention / Visual Perception / Neural Networks, Computer / Eye Movements / Models, Theoretical Study type: Prognostic_studies Limits: Humans Language: English Journal: Prog Brain Res Year of publication: 2019 Document type: Article