1.
Nat Neurosci; 27(5): 988-999, 2024 May.
Article in English | MEDLINE | ID: mdl-38499855

ABSTRACT

A fundamental human cognitive feat is to interpret linguistic instructions in order to perform novel tasks without explicit task experience. Yet the neural computations that might be used to accomplish this remain poorly understood. We use advances in natural language processing to create a neural model of generalization based on linguistic instructions. Models are trained on a set of common psychophysical tasks and receive instructions embedded by a pretrained language model. Our best models can perform a previously unseen task with an average performance of 83% correct based solely on linguistic instructions (that is, zero-shot learning). We find that language scaffolds sensorimotor representations such that activity for interrelated tasks shares a common geometry with the semantic representations of instructions, allowing language to cue the proper composition of practiced skills in unseen settings. We show how this model generates a linguistic description of a novel task it has identified using only motor feedback, which can subsequently guide a partner model to perform the task. Our models offer several experimentally testable predictions outlining how linguistic information must be represented to facilitate flexible and general cognition in the human brain.
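The abstract describes an architecture in which a pretrained language model embeds each task instruction and that embedding conditions a recurrent sensorimotor network. The sketch below (Python/PyTorch) is a minimal illustration of that pattern, not the authors' implementation; the class name, all dimensions, and the use of a random vector in place of a real language-model embedding are assumptions made for the example.

import torch
import torch.nn as nn

# Minimal sketch (not the paper's code): a frozen language-model embedding
# of the instruction conditions a recurrent network that maps time-varying
# sensory input to motor output. Names and sizes are illustrative.
class InstructedRNN(nn.Module):
    def __init__(self, sensory_dim=32, embed_dim=768, hidden_dim=256, motor_dim=8):
        super().__init__()
        # Project the instruction embedding into the RNN's input space.
        self.instr_proj = nn.Linear(embed_dim, hidden_dim)
        self.rnn = nn.GRU(sensory_dim + hidden_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, motor_dim)

    def forward(self, sensory, instr_embedding):
        # sensory: (batch, time, sensory_dim); instr_embedding: (batch, embed_dim)
        cond = self.instr_proj(instr_embedding)                   # (batch, hidden_dim)
        cond = cond.unsqueeze(1).expand(-1, sensory.size(1), -1)  # repeat per time step
        h, _ = self.rnn(torch.cat([sensory, cond], dim=-1))       # condition every step
        return self.readout(h)                                    # motor output per step

# A random vector stands in here for a pretrained language model's embedding
# of an unseen instruction (the zero-shot setting the abstract reports on).
model = InstructedRNN()
sensory = torch.randn(1, 100, 32)   # one trial, 100 time steps
instr = torch.randn(1, 768)         # placeholder instruction embedding
motor = model(sensory, instr)       # shape: (1, 100, 8)

Concatenating the projected embedding to the input at every time step is one simple way to let the instruction steer the network's dynamics throughout a trial; in the zero-shot case, only the instruction embedding changes between a practiced task and an unseen one.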


Subjects
Neurons; Humans; Neurons/physiology; Models, Neurological; Language; Generalization, Psychological/physiology; Natural Language Processing; Learning/physiology; Neural Networks, Computer; Brain/physiology; Nerve Net/physiology