Drop, Swap, and Generate: A Self-Supervised Approach for Generating Neural Activity.
Liu, Ran; Azabou, Mehdi; Dabagia, Max; Lin, Chi-Heng; Azar, Mohammad Gheshlaghi; Hengen, Keith B; Valko, Michal; Dyer, Eva L.
Affiliation
  • Liu R; Georgia Tech.
  • Azabou M; Georgia Tech.
  • Dabagia M; Georgia Tech.
  • Lin CH; Georgia Tech.
  • Azar MG; DeepMind.
  • Hengen KB; Washington Univ. in St. Louis.
  • Valko M; DeepMind.
  • Dyer EL; Georgia Tech.
Adv Neural Inf Process Syst; 34: 10587-10599, 2021 Dec.
Article in En | MEDLINE | ID: mdl-36467015
Meaningful and simplified representations of neural activity can yield insights into how and what information is being processed within a neural circuit. However, without labels, finding representations that reveal the link between the brain and behavior can be challenging. Here, we introduce a novel unsupervised approach for learning disentangled representations of neural activity called Swap-VAE. Our approach combines a generative modeling framework with an instance-specific alignment loss that tries to maximize the representational similarity between transformed views of the input (brain state). These transformed (or augmented) views are created by dropping out neurons and jittering samples in time, which intuitively should lead the network to a representation that maintains both temporal consistency and invariance to the specific neurons used to represent the neural state. Through evaluations on both synthetic data and neural recordings from hundreds of neurons in different primate brains, we show that it is possible to build representations that disentangle neural datasets along relevant latent dimensions linked to behavior.
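The abstract describes two ingredients: augmented views of a neural population state built by dropping out neurons and jittering in time, and a VAE-style generative objective combined with an instance-specific alignment loss that pulls the two views' representations together. The sketch below illustrates that idea only; it is not the authors' released implementation, and the architecture, loss weights, and helper names (augment_view, VAESketch, loss_fn) are illustrative assumptions.

```python
# Minimal sketch of the approach described in the abstract: two augmented views
# (neuron dropout + temporal jitter), a VAE-style encoder/decoder, and an
# alignment term between the views' latents. All names, architectures, and
# weights here are assumptions for illustration, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def augment_view(x_window, drop_prob=0.2):
    """One augmented view from a window of shape (batch, n_bins, n_neurons):
    temporal jitter = sample a random time bin; then zero a random neuron subset."""
    batch, n_bins, _ = x_window.shape
    idx = torch.randint(0, n_bins, (batch,))
    view = x_window[torch.arange(batch), idx]          # jittered sample in time
    mask = (torch.rand_like(view) > drop_prob).float() # neuron dropout
    return view * mask


class VAESketch(nn.Module):
    def __init__(self, n_neurons, latent_dim=16, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_neurons, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, latent_dim)
        self.to_logvar = nn.Linear(hidden, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_neurons)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.decoder(z), mu, logvar


def loss_fn(model, view1, view2, beta=1e-3, alpha=1.0):
    """Generative loss (reconstruction + KL) on each view, plus an alignment
    term that maximizes similarity between the two views' latent means."""
    total, mus = 0.0, []
    for v in (view1, view2):
        recon, mu, logvar = model(v)
        rec = F.mse_loss(recon, v)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        total = total + rec + beta * kl
        mus.append(mu)
    align = 1.0 - F.cosine_similarity(mus[0], mus[1], dim=-1).mean()
    return total + alpha * align
```

In practice the two views would be drawn from the same window of spiking activity and the weights on the KL and alignment terms tuned per dataset; the cosine-similarity alignment used here is one plausible choice for "maximizing representational similarity between transformed views," not necessarily the exact loss used in the paper.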

Full text: 1 Database: MEDLINE Type of study: Prognostic_studies Language: En Journal: Adv Neural Inf Process Syst Year: 2021 Type: Article
