Habituation Reflects Optimal Exploration Over Noisy Perceptual Samples.
Cao, Anjie; Raz, Gal; Saxe, Rebecca; Frank, Michael C.
Affiliation
  • Cao A; Department of Psychology, Stanford University.
  • Raz G; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology.
  • Saxe R; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology.
  • Frank MC; Department of Psychology, Stanford University.
Top Cogn Sci ; 15(2): 290-302, 2023 04.
Article in En | MEDLINE | ID: mdl-36322897
ABSTRACT
From birth, humans constantly make decisions about what to look at and for how long. Yet the mechanism behind such decision-making remains poorly understood. Here, we present the rational action, noisy choice for habituation (RANCH) model. RANCH is a rational learning model that takes noisy perceptual samples from stimuli and makes sampling decisions based on expected information gain (EIG). The model captures key patterns of looking time documented in developmental research: habituation and dishabituation. We evaluated the model with adult looking times collected from a paradigm analogous to the infant habituation paradigm. We compared RANCH with baseline models (a no-learning model and a no-perceptual-noise model) and with models using alternative linking hypotheses (surprisal and KL divergence). We showed that (1) learning and perceptual noise are critical assumptions of the model, and (2) surprisal and KL divergence are good proxies for EIG in the current learning context.
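The core loop described in the abstract can be sketched in code. The following is a minimal illustrative model, not the authors' implementation: an observer holds a belief over a discretized Bernoulli stimulus parameter, draws noisy perceptual samples (each sample flips with some probability), updates the belief, and keeps looking while the expected information gain of one more sample exceeds a threshold. All names, the grid discretization, and the noise and threshold values are assumptions chosen for illustration.

```python
import math
import random

GRID = [i / 100 for i in range(1, 100)]  # discretized stimulus parameter theta
NOISE = 0.1                              # perceptual noise: a sample flips w.p. NOISE

def likelihood(y, theta):
    """P(noisy sample y | theta): a Bernoulli(theta) draw, flipped w.p. NOISE."""
    p1 = theta * (1 - NOISE) + (1 - theta) * NOISE
    return p1 if y == 1 else 1.0 - p1

def update(belief, y):
    """Bayesian update of the belief over GRID after observing sample y."""
    post = [b * likelihood(y, t) for b, t in zip(belief, GRID)]
    z = sum(post)
    return [p / z for p in post]

def eig(belief):
    """Expected information gain of one more sample:
    E_y[ KL(posterior_after_y || current belief) ], averaged over the
    predictive distribution of the next sample."""
    total = 0.0
    for y in (0, 1):
        py = sum(b * likelihood(y, t) for b, t in zip(belief, GRID))
        post = update(belief, y)
        kl = sum(p * math.log(p / b) for p, b in zip(post, belief) if p > 0)
        total += py * kl
    return total

def look(belief, theta_true, threshold=0.01, rng=random, max_samples=200):
    """Sample the stimulus until EIG drops below threshold.
    Returns (number of samples taken, updated belief); the sample count
    is the model's analogue of looking time."""
    n = 0
    while n < max_samples:
        y_true = 1 if rng.random() < theta_true else 0
        y = 1 - y_true if rng.random() < NOISE else y_true  # perceptual noise
        belief = update(belief, y)
        n += 1
        if eig(belief) < threshold:
            break
    return n, belief
```

Under this sketch, habituation falls out of the dynamics: repeated exposure concentrates the belief, which lowers EIG and shortens looking, while a novel stimulus produces surprising samples that shift the posterior, raise EIG, and prolong looking (dishabituation).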
Full text: 1 Collection: 01-internacional Database: MEDLINE Main subject: Habituation, Psychophysiologic / Learning Type of study: Prognostic_studies Limits: Adult / Humans / Infant Language: En Journal: Top Cogn Sci Year: 2023 Document type: Article