ABSTRACT
Finding sought visual targets requires our brains to flexibly combine working memory information about what we are looking for with visual information about what we are looking at. To investigate the neural computations involved in finding visual targets, we recorded neural responses in inferotemporal cortex (IT) and perirhinal cortex (PRH) as macaque monkeys performed a task that required them to find targets in sequences of distractors. We found similar amounts of total task-specific information in both areas; however, information about whether a target was in view was more accessible using a linear read-out or, equivalently, was more untangled in PRH. Consistent with the flow of information from IT to PRH, we also found that task-relevant information arrived earlier in IT. PRH responses were well-described by a functional model in which computations in PRH untangle input from IT by combining neurons with asymmetric tuning correlations for target matches and distractors.
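The "linear read-out" analysis described above amounts to asking how well a single hyperplane through population response space separates target matches from distractors. The following is a minimal sketch of that idea on simulated data; the neuron count, trial count, and firing-rate model are assumptions for illustration, not the study's recordings.

```python
import numpy as np

# Hypothetical illustration of a linear read-out of "target match vs. distractor"
# from a simulated neural population. All parameters below are assumptions.
rng = np.random.default_rng(0)
n_neurons, n_trials = 50, 200

# Simulated firing rates: target matches add a small, neuron-specific signal.
signal = rng.normal(0, 1, n_neurons)
labels = rng.integers(0, 2, n_trials)                 # 1 = target match, 0 = distractor
rates = rng.normal(0, 1, (n_trials, n_neurons)) + np.outer(labels, signal)

# Least-squares linear decoder: one weight per neuron plus a bias term.
X = np.column_stack([rates, np.ones(n_trials)])
w, *_ = np.linalg.lstsq(X, labels * 2.0 - 1.0, rcond=None)

# Decoding accuracy: how linearly separable ("untangled") the signal is.
accuracy = float(np.mean((X @ w > 0) == labels.astype(bool)))
```

In this framing, "more untangled in PRH" means a decoder like this one reaches higher accuracy on PRH populations than on IT populations carrying the same total information.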
Subjects
Appetitive Behavior/physiology , Attention/physiology , Memory, Short-Term/physiology , Pattern Recognition, Visual/physiology , Temporal Lobe/physiology , Visual Perception/physiology , Action Potentials , Animals , Cognition/physiology , Macaca mulatta , Male , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Psychomotor Performance/physiology , Temporal Lobe/cytology , Visual Cortex/physiology
ABSTRACT
OBJECTIVE: To describe a highly quantitative facial function-measuring tool that yields accurate, objective measures of facial position in significantly less time than existing methods. METHODS: Facial Assessment by Computer Evaluation (FACE) software was designed for facial analysis. Outputs report the static facial landmark positions and dynamic facial movements relevant in facial reanimation. Fifty individuals underwent facial movement analysis using Photoshop-based measurements and the new software; comparisons of agreement and efficiency were made. Comparisons were also made between individuals with normal facial animation and patients with paralysis to gauge sensitivity to abnormal movements. RESULTS: Facial measurements obtained with FACE software agreed with Photoshop-based measures at rest and during expressions. The automated assessments required significantly less time than Photoshop-based assessments. FACE measurements readily revealed differences between individuals with normal facial animation and patients with facial paralysis. CONCLUSIONS: FACE software produces accurate measurements of facial landmarks and facial movements and is sensitive to paralysis. Given its efficiency, it serves as a useful tool in the clinical setting for zonal facial movement analysis in comprehensive facial nerve rehabilitation programs.
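The core measurement behind this kind of zonal analysis is landmark excursion: the displacement of a facial landmark between a rest frame and an expression frame, normalized so values are comparable across photographs. The sketch below illustrates that computation; the landmark names, coordinates, and interpupillary-distance normalization are assumptions for illustration, not the FACE software's actual algorithm.

```python
import numpy as np

# Hypothetical landmark positions (pixel coordinates) at rest and during a smile.
# Values and the normalization choice are illustrative, not from the paper.
rest = {"oral_commissure_R": (120.0, 200.0), "oral_commissure_L": (180.0, 200.0)}
smile = {"oral_commissure_R": (112.0, 190.0), "oral_commissure_L": (174.0, 189.0)}
interpupillary_px = 60.0  # scale reference measured in the rest frame

def excursion(name):
    """Normalized Euclidean displacement of one landmark, rest -> expression."""
    dx, dy = np.subtract(smile[name], rest[name])
    return float(np.hypot(dx, dy)) / interpupillary_px

right = excursion("oral_commissure_R")
left = excursion("oral_commissure_L")
asymmetry = abs(right - left)  # a large gap suggests weaker excursion on one side
```

Comparing per-zone excursions between sides, or against normative values, is one plausible way such a tool could flag paralysis-related movement deficits.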