Results 1 - 20 of 1,161
1.
J Neurosci ; 44(25)2024 Jun 19.
Article in English | MEDLINE | ID: mdl-38729759

ABSTRACT

Attentional control over sensory processing has been linked to neural alpha oscillations and related inhibition of cerebral cortex. Despite the wide consensus on the functional relevance of alpha oscillations for attention, precise neural mechanisms of how alpha oscillations shape perception and how this top-down modulation is implemented in cortical networks remain unclear. Here, we tested the hypothesis that alpha oscillations in frontal eye fields (FEFs) are causally involved in the top-down regulation of visual processing in humans (male and female). We applied sham-controlled, intermittent transcranial alternating current stimulation (tACS) over bilateral FEF at either 10 Hz (alpha) or 40 Hz (gamma) to manipulate attentional preparation in a visual discrimination task. Under each stimulation condition, we measured psychometric functions for contrast perception and introduced a novel linear mixed modeling approach for statistical control of neurosensory side effects of the electric stimulation. tACS at alpha frequency reduced the slope of the psychometric function, resulting in improved subthreshold and impaired superthreshold contrast perception. Side effects on the psychometric functions were complex and showed large interindividual variability. Controlling for the impact of side effects on the psychometric parameters by using covariates in the linear mixed model analysis reduced this variability and strengthened the perceptual effect. We propose that alpha tACS over FEF mimicked a state of endogenous attention by strengthening a fronto-occipitoparietal network in the alpha band. We speculate that this network modulation enhanced phasic gating in occipitoparietal cortex leading to increased variability of single-trial psychometric thresholds, measurable as a reduction of psychometric slope.
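
A minimal sketch (not the authors' analysis code) of how a psychometric function's threshold and slope might be estimated from contrast-discrimination data; the logistic parameterization, guess/lapse rates, and data values are illustrative assumptions, and the paper's linear-mixed-model covariates are not shown.

```python
# Sketch: fit a logistic psychometric function to proportion-correct data and
# read off threshold (alpha) and slope (beta). Illustrative values only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(c, alpha, beta, gamma=0.5, lam=0.02):
    """Psychometric function with assumed guess rate gamma and lapse rate lam."""
    return gamma + (1 - gamma - lam) / (1 + np.exp(-beta * (c - alpha)))

# Hypothetical single-condition data: stimulus contrasts and proportion correct
contrast = np.array([0.02, 0.04, 0.08, 0.16, 0.32, 0.64])
p_correct = np.array([0.52, 0.55, 0.68, 0.86, 0.95, 0.97])

(alpha_hat, beta_hat), _ = curve_fit(logistic, contrast, p_correct, p0=[0.1, 20.0])
print(f"threshold ~ {alpha_hat:.3f}, slope ~ {beta_hat:.1f}")
# A flatter slope (smaller beta), as reported under alpha tACS, corresponds to
# better subthreshold but worse suprathreshold performance around threshold.
```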


Subject(s)
Alpha Rhythm, Attention, Transcranial Direct Current Stimulation, Visual Perception, Humans, Female, Male, Attention/physiology, Transcranial Direct Current Stimulation/methods, Adult, Visual Perception/physiology, Young Adult, Alpha Rhythm/physiology, Frontal Lobe/physiology, Photic Stimulation/methods, Visual Fields/physiology
2.
Proc Natl Acad Sci U S A ; 119(49): e2205515119, 2022 12 06.
Article in English | MEDLINE | ID: mdl-36442123

ABSTRACT

Attention describes the ability to selectively process a particular aspect of the environment at the expense of others. Despite the significance of selective processing, the types and scopes of attentional mechanisms in nonprimate species remain underexplored. We trained four carrion crows in Posner spatial cueing tasks using two separate protocols where the attention-capturing cues are shown at different times before target onset at either the same or a different location as the impending target. To probe automatic bottom-up, or exogenous, attention, two naïve crows were tested with a cue that had no predictive value concerning the location of the subsequent target. To examine volitional top-down, or endogenous, attention, the other two crows were tested with the previously learned cues that predicted the impending target location. Comparing the performance for valid (cue and target at same location) and invalid (cue and target at opposing locations) cues in the nonpredictive cue condition showed a transient, mild reaction time advantage signifying exogenous attention. In contrast, there was a strong and long-lasting performance advantage for the valid conditions with predictive cues indicating endogenous attention. Together, these results demonstrate that crows possess two different attention mechanisms (exogenous and endogenous). These findings signify that crows possess a substantial attentional capacity and robust cognitive control over attention allocation.
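
A minimal sketch of how a Posner cueing (validity) effect can be quantified as the reaction-time difference between invalid and valid trials, separately for nonpredictive and predictive cues; the data-frame columns and reaction times are hypothetical placeholders, not the crows' data.

```python
# Sketch: compute the validity effect (invalid minus valid RT) per cue type.
import pandas as pd

trials = pd.DataFrame({
    "cue_type": ["nonpredictive"] * 4 + ["predictive"] * 4,
    "validity": ["valid", "invalid"] * 4,
    "rt_ms":    [412, 428, 405, 431, 390, 455, 384, 462],  # made-up example values
})

effect = (trials.groupby(["cue_type", "validity"])["rt_ms"].median().unstack())
effect["validity_effect_ms"] = effect["invalid"] - effect["valid"]
print(effect)
# A small, transient advantage with nonpredictive cues is the signature of
# exogenous capture; a large, sustained advantage with predictive cues is the
# signature of endogenous attention.
```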


Subject(s)
Crows, Animals, Cues, Learning, Reaction Time
3.
J Neurosci ; 43(21): 3876-3894, 2023 05 24.
Article in English | MEDLINE | ID: mdl-37185101

ABSTRACT

Natural sounds contain rich patterns of amplitude modulation (AM), which is one of the essential sound dimensions for auditory perception. The sensitivity of human hearing to AM measured by psychophysics takes diverse forms depending on the experimental conditions. Here, we address with a single framework the questions of why such patterns of AM sensitivity have emerged in the human auditory system and how they are realized by our neural mechanisms. Assuming that optimization for natural sound recognition has taken place during human evolution and development, we examined its effect on the formation of AM sensitivity by optimizing a computational model, specifically, a multilayer neural network, for natural sound (namely, everyday sounds and speech sounds) recognition and simulating psychophysical experiments in which the AM sensitivity of the model was assessed. Relatively higher layers in the model optimized to sounds with natural AM statistics exhibited AM sensitivity similar to that of humans, although the model was not designed to reproduce human-like AM sensitivity. Moreover, simulated neurophysiological experiments on the model revealed a correspondence between the model layers and the auditory brain regions. The layers in which human-like psychophysical AM sensitivity emerged exhibited substantial neurophysiological similarity with the auditory midbrain and higher regions. These results suggest that human behavioral AM sensitivity has emerged as a result of optimization for natural sound recognition in the course of our evolution and/or development and that it is based on a stimulus representation encoded in the neural firing rates in the auditory midbrain and higher regions.

SIGNIFICANCE STATEMENT: This study provides a computational paradigm to bridge the gap between the behavioral properties of human sensory systems as measured in psychophysics and neural representations as measured in nonhuman neurophysiology. This was accomplished by combining the knowledge and techniques in psychophysics, neurophysiology, and machine learning. As a specific target modality, we focused on the auditory sensitivity to sound AM. We built an artificial neural network model that performs natural sound recognition and simulated psychophysical and neurophysiological experiments in the model. Quantitative comparison of a machine learning model with human and nonhuman data made it possible to integrate the knowledge of behavioral AM sensitivity and neural AM tunings from the perspective of optimization to natural sound recognition.
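
A minimal sketch of the kind of sinusoidally amplitude-modulated noise stimulus used in AM-detection psychophysics (modulation depth m, modulation rate fm); sampling rate and parameter values are illustrative assumptions, not taken from the paper.

```python
# Sketch: generate a sinusoidally amplitude-modulated broadband noise stimulus.
import numpy as np

def am_noise(duration_s=1.0, fs=16000, fm=8.0, m=0.5, seed=0):
    rng = np.random.default_rng(seed)
    t = np.arange(int(duration_s * fs)) / fs
    carrier = rng.standard_normal(t.size)            # broadband noise carrier
    envelope = 1.0 + m * np.sin(2 * np.pi * fm * t)  # sinusoidal AM envelope
    x = envelope * carrier
    return x / np.max(np.abs(x))                     # normalize peak amplitude

stimulus = am_noise(fm=8.0, m=0.25)
# Detection thresholds for m as a function of fm trace out the temporal
# modulation transfer function that the model-based psychophysics simulated.
```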


Subject(s)
Auditory Cortex, Sound, Humans, Auditory Perception/physiology, Brain/physiology, Hearing, Mesencephalon/physiology, Acoustic Stimulation, Auditory Cortex/physiology
4.
Neuroimage ; 293: 120626, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38677632

ABSTRACT

Spatio-temporal patterns of evoked brain activity contain information that can be used to decode and categorize the semantic content of visual stimuli. However, this procedure can be biased by low-level image features independently of the semantic content present in the stimuli, prompting the need to understand the robustness of different models regarding these confounding factors. In this study, we trained machine learning models to distinguish between concepts included in the publicly available THINGS-EEG dataset using electroencephalography (EEG) data acquired during a rapid serial visual presentation paradigm. We investigated the contribution of low-level image features to decoding accuracy in a multivariate model, utilizing broadband data from all EEG channels. Additionally, we explored a univariate model obtained through data-driven feature selection applied to the spatial and frequency domains. While the univariate models exhibited better decoding accuracy, their predictions were less robust to the confounding effect of low-level image statistics. Notably, some of the models maintained their accuracy even after random replacement of the training dataset with semantically unrelated samples that presented similar low-level content. In conclusion, our findings suggest that model optimization impacts sensitivity to confounding factors, regardless of the resulting classification performance. Therefore, the choice of EEG features for semantic decoding should ideally be informed by criteria beyond classifier performance, such as the neurobiological mechanisms under study.
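
A minimal sketch of cross-validated decoding of a stimulus category from EEG features, analogous in spirit to the multivariate analysis described above; the array shapes, labels, and classifier choice are synthetic placeholders, not the THINGS-EEG data or the authors' pipeline.

```python
# Sketch: cross-validated binary decoding from flattened EEG epochs.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 100
X = rng.standard_normal((n_trials, n_channels * n_times))  # flattened epochs
y = rng.integers(0, 2, n_trials)                           # binary concept labels

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=5000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
# Controlling for low-level image statistics would additionally require matching
# them across classes or regressing them out before decoding.
```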


Subject(s)
Electroencephalography, Semantics, Humans, Electroencephalography/methods, Female, Male, Adult, Young Adult, Machine Learning, Brain/physiology
5.
J Neurophysiol ; 132(3): 643-652, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-39015076

ABSTRACT

We frequently interact with textured surfaces with both our feet and hands. Like texture's importance for grasping, texture perception via the foot sole might provide important signals about the stability of a surface, aiding in maintaining balance. However, how textures are perceived by the foot, and especially under the high forces experienced during walking, is unknown. The current study builds on extensive research investigating texture perception at the hand by presenting everyday textures to the foot while stepping onto them, exploring them with the foot while sitting, and exploring them with the hand. Participants rated each texture along three perceptual dimensions: roughness, hardness, and stickiness. Participants also rated how stable their posture felt when standing upon each texture. Results show that perceptual ratings of each textural dimension were highly correlated across conditions. Hardness exhibited the greatest consistency and stickiness the weakest. Moreover, correlations between stepping and exploration with the foot were lower than those between exploration with the foot and exploration with the hand, suggesting that mode of interaction (high vs. low force) impacts perception more than body region used (foot vs. hand). On an individual level, correlations between conditions were higher than those between participants, suggesting that differences are greater between individuals than between mode of interaction or body region. When investigating the relationship to perceived stability, only hardness contributed significantly, with harder surfaces rated as more stable. Overall, tactile perception appears consistent across body regions and interaction modes, although differences in perception are greater during walking.

NEW & NOTEWORTHY We frequently interact with textured surfaces using our feet, but little is known about how textures on the foot sole are perceived as compared with the hand. Here, we show that roughness, hardness, and stickiness ratings are broadly consistent when stepping on textures, exploring them with the foot sole, or with the hand. Hardness also contributes to perceived stability.
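
A minimal sketch of how cross-condition consistency of texture ratings can be quantified with a rank correlation; the rating values and condition names are hypothetical, standing in for the study's roughness, hardness, and stickiness ratings.

```python
# Sketch: correlate ratings of the same textures across two interaction conditions.
from scipy.stats import spearmanr

hardness_stepping = [6.2, 3.1, 7.8, 2.4, 5.0, 8.5, 1.9, 4.4]   # hypothetical ratings
hardness_hand     = [5.9, 3.4, 7.5, 2.8, 5.3, 8.1, 2.2, 4.0]

rho, p = spearmanr(hardness_stepping, hardness_hand)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# High cross-condition correlations, as reported, indicate that texture
# perception is largely consistent across body region and interaction force.
```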


Subject(s)
Foot, Hand, Touch Perception, Walking, Humans, Walking/physiology, Male, Female, Foot/physiology, Touch Perception/physiology, Adult, Hand/physiology, Young Adult, Sitting Position
6.
J Neurophysiol ; 132(3): 666-677, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-39015072

ABSTRACT

Nonhuman primates (NHPs), especially rhesus macaques, have significantly contributed to our understanding of the neural computations underlying human vision. Besides the established homologies in the visual brain areas between these species and our ability to probe detailed neural mechanisms in monkeys at multiple scales, NHPs' ability to perform human-like visual behavior makes them an extremely appealing animal model of human vision. Traditionally, such behavioral studies have been conducted in controlled laboratory settings, offering experimenters tight control over variables like luminance, eye movements, and auditory interference. However, in-lab experiments have several constraints, including limited experimental time, the need for dedicated human experimenters, additional lab space requirements, invasive surgeries for headpost implants, and extra time and training for chairing and head restraints. To overcome these limitations, we propose adopting home-cage behavioral training and testing of NHPs, enabling the administration of many vision-based behavioral tasks simultaneously across multiple monkeys with reduced human personnel requirements, no NHP head restraint, and monkeys' unrestricted access to experiments. In this article, we present a portable, low-cost, easy-to-use kiosk system developed to conduct home-cage vision-based behavioral tasks in NHPs. We provide details of its operation and build to enable more open-source development of this technology. Furthermore, we present validation results using behavioral measurements performed in the lab and in NHP home cages, demonstrating the system's reliability and potential to enhance the efficiency and flexibility of NHP behavioral research.

NEW & NOTEWORTHY Training nonhuman primates (NHPs) for vision-based behavioral tasks in a laboratory setting is a time-consuming process and comes with many limitations. To overcome these challenges, we have developed an affordable, open-source, wireless, touchscreen training system that can be placed in the NHPs' housing environment. This system enables NHPs to work at their own pace. It provides a platform to implement continuous behavioral training protocols without major experimenter intervention and eliminates the need for other standard practices like NHP chair training, collar placement, and head restraints. Hence, these kiosks ultimately contribute to animal welfare and therefore better-quality neuroscience in the long run. In addition, NHPs quickly learn complex behavioral tasks using this system, making it a promising tool for wireless electrophysiological research in naturalistic, unrestricted environments to probe the relation between brain and behavior.


Subject(s)
Animal Behavior, Macaca mulatta, Animals, Animal Behavior/physiology, Male, Visual Perception/physiology, Ocular Vision/physiology
7.
J Neurophysiol ; 132(2): 389-402, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38863427

ABSTRACT

Everyday actions like moving the head, walking around, and grasping objects are typically self-controlled. This presents a problem when studying the signals encoding such actions because active self-movement is difficult to control experimentally. Available techniques demand repeatable trials, but each action is unique, making it difficult to measure fundamental properties like psychophysical thresholds. We present a novel paradigm that recovers both precision and bias of self-movement signals with minimal constraint on the participant. The paradigm relies on linking image motion to previous self-movement, and two experimental phases to extract the signal encoding the latter. The paradigm takes care of a hidden source of external noise not previously accounted for in techniques that link display motion to self-movement in real time (e.g., virtual reality). We use head rotations as an example of self-movement, and show that the precision of the signals encoding head movement depends on whether they are being used to judge visual motion or auditory motion. We find that perceived motion is slowed during head movement in both cases. The "nonimage" signals encoding active head rotation (motor commands, proprioception, and vestibular cues) are therefore biased toward lower speeds and/or displacements. In a second experiment, we trained participants to rotate their heads at different rates and found that the imprecision of the head rotation signal rises proportionally with head speed (Weber's law). We discuss the findings in terms of the different motion cues used by vision and hearing, and the implications they have for Bayesian models of motion perception.

NEW & NOTEWORTHY We present a psychophysical technique for measuring the precision of signals encoding active self-movements. Using head movements, we show that 1) precision is greater when active head rotation is performed using visual comparison stimuli versus auditory; 2) precision decreases with head speed (Weber's law); 3) perceived speed is lower during head rotation. The findings may reflect the steps needed to convert different cues into common units, and challenge standard Bayesian models of motion perception.
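
A minimal sketch of how a Weber's-law relation can be checked by relating discrimination thresholds to head speed; the speeds and thresholds below are illustrative, not the study's data.

```python
# Sketch: test whether threshold rises proportionally with head speed (Weber's law).
import numpy as np

head_speed = np.array([20.0, 40.0, 80.0, 160.0])   # deg/s, hypothetical trained rates
threshold  = np.array([2.1, 4.3, 8.0, 16.5])       # hypothetical JNDs, deg/s

k, intercept = np.polyfit(head_speed, threshold, 1) # linear fit: threshold vs speed
weber_fraction = threshold / head_speed
print(f"slope k = {k:.3f}, Weber fractions = {np.round(weber_fraction, 3)}")
# An approximately constant Weber fraction (threshold proportional to speed)
# is the signature of Weber's law reported for the head-rotation signal.
```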


Subject(s)
Head Movements, Motion Perception, Humans, Head Movements/physiology, Adult, Male, Female, Motion Perception/physiology, Proprioception/physiology, Young Adult, Rotation, Auditory Perception/physiology
8.
J Neurophysiol ; 131(1): 38-63, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-37965933

ABSTRACT

Human speech and vocalizations in animals are rich in joint spectrotemporal (S-T) modulations, wherein acoustic changes in both frequency and time are functionally related. In principle, the primate auditory system could process these complex dynamic sounds based on either an inseparable representation of S-T features or, alternatively, a separable representation. The separability hypothesis implies an independent processing of spectral and temporal modulations. We collected comparative data on the S-T hearing sensitivity in humans and macaque monkeys to a wide range of broadband dynamic spectrotemporal ripple stimuli employing a yes-no signal-detection task. Ripples were systematically varied, as a function of density (spectral modulation frequency), velocity (temporal modulation frequency), or modulation depth, to cover a listener's full S-T modulation sensitivity, derived from a total of 87 psychometric ripple detection curves. Audiograms were measured to control for normal hearing. Determined were hearing thresholds, reaction time distributions, and S-T modulation transfer functions (MTFs), both at the ripple detection thresholds and at suprathreshold modulation depths. Our psychophysically derived MTFs are consistent with the hypothesis that both monkeys and humans employ analogous perceptual strategies: S-T acoustic information is primarily processed separably. Singular value decomposition (SVD), however, revealed a small, but consistent, inseparable spectral-temporal interaction. Finally, SVD analysis of the known visual spatiotemporal contrast sensitivity function (CSF) highlights that human vision is space-time inseparable to a much larger extent than is the case for S-T sensitivity in hearing. Thus, the specificity with which the primate brain encodes natural sounds appears to be less strict than is required to adequately deal with natural images.

NEW & NOTEWORTHY We provide comparative data on primate audition of naturalistic sounds comprising hearing thresholds, reaction time distributions, and spectral-temporal modulation transfer functions. Our psychophysical experiments demonstrate that auditory information is primarily processed in a spectral-temporal-independent manner by both monkeys and humans. Singular value decomposition of known visual spatiotemporal contrast sensitivity, in comparison to our auditory spectral-temporal sensitivity, revealed a striking contrast in how the brain encodes natural sounds as opposed to natural images, as vision appears to be space-time inseparable.
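
A minimal sketch of how separability of a modulation transfer function can be quantified by SVD: a fully separable MTF is rank 1 (an outer product of a spectral and a temporal profile), so the fraction of variance in the first singular value serves as a separability index. The toy matrix below is made up.

```python
# Sketch: SVD-based separability index of a spectral-temporal MTF.
import numpy as np

# Toy MTF sampled over spectral density (rows) and temporal velocity (columns)
mtf = np.array([[1.00, 0.85, 0.55, 0.20],
                [0.80, 0.70, 0.45, 0.15],
                [0.40, 0.35, 0.25, 0.08],
                [0.10, 0.09, 0.06, 0.02]])

s = np.linalg.svd(mtf, compute_uv=False)
separability_index = s[0] ** 2 / np.sum(s ** 2)
print(f"separability index = {separability_index:.3f}")  # 1.0 = fully separable
# Residual variance in higher singular values reflects the small inseparable
# spectral-temporal interaction reported in the paper.
```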


Subject(s)
Speech Perception, Time Perception, Animals, Humans, Haplorhini, Auditory Perception, Hearing, Acoustic Stimulation/methods
9.
Article in English | MEDLINE | ID: mdl-39256251

ABSTRACT

Stochastic resonance (SR) is the phenomenon wherein the introduction of a suitable level of noise enhances the detection of subthreshold signals in nonlinear systems. It manifests across various physical and biological systems, including the human brain. Psychophysical experiments have confirmed the behavioural impact of stochastic resonance on auditory, somatic, and visual perception. Aging renders the brain more susceptible to noise, possibly causing differences in the SR phenomenon between young and elderly individuals. This study investigates the impact of noise on motion detection accuracy throughout the lifespan, with 214 participants ranging in age from 18 to 82. Our objective was to determine the optimal noise level to induce an SR-like response in both young and old populations. Consistent with existing literature, our findings reveal that the benefit of added noise diminishes progressively with age. Additionally, as individuals age, peak performance is achieved with lower levels of noise. This study provides the first insight into how SR changes across the lifespan of healthy adults and establishes a foundation for understanding the pathological alterations in perceptual processes associated with aging.
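
A minimal toy demonstration of stochastic resonance in a yes/no detection setting: a subthreshold signal plus noise is compared against a hard threshold, and percent correct peaks at an intermediate noise level. All parameters are illustrative; this is not the study's motion-detection task.

```python
# Sketch: percent correct as a function of added noise for a subthreshold signal.
import numpy as np

rng = np.random.default_rng(1)
signal, threshold, n = 0.8, 1.0, 50_000   # signal amplitude is below threshold

for sigma in [0.05, 0.2, 0.4, 0.7, 1.2, 2.5]:
    hits = np.mean(signal + sigma * rng.standard_normal(n) > threshold)
    false_alarms = np.mean(sigma * rng.standard_normal(n) > threshold)
    pc = 0.5 * (hits + (1.0 - false_alarms))   # percent correct, equal priors
    print(f"noise sd = {sigma:4.2f}  ->  percent correct = {pc:.2f}")
# Performance rises from chance, peaks at a moderate noise level, then falls
# again: the SR-like response whose optimal noise level shifts with age.
```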

10.
Cogn Affect Behav Neurosci ; 24(1): 100-110, 2024 02.
Article in English | MEDLINE | ID: mdl-38263367

ABSTRACT

The sense of body ownership is the feeling that one's body belongs to oneself. To study body ownership, researchers use bodily illusions, such as the rubber hand illusion (RHI), which involves experiencing a visible rubber hand as part of one's body when the rubber hand is stroked simultaneously with the hidden real hand. The RHI is based on a combination of vision, touch, and proprioceptive information following the principles of multisensory integration. It has been posited that texture incongruence between rubber hand and real hand weakens the RHI, but the underlying mechanisms remain poorly understood. To investigate this, we recently developed a novel psychophysical RHI paradigm. Based on fitting psychometric functions, we discovered the RHI resulted in shifts in the point of subjective equality when the rubber hand and the real hand were stroked with matching materials. We analysed these datasets further by using signal detection theory analysis, which distinguishes between the participants' sensitivity to visuotactile stimulation and the associated perceptual bias. We found that texture incongruence influences the RHI's perceptual bias but not its sensitivity to visuotactile stimulation. We observed that the texture congruence bias effect was the strongest in shorter visuotactile asynchronies (50-100 ms) and weaker in longer asynchronies (200 ms). These results suggest texture-related perceptual bias is most prominent when the illusion's sensitivity is at its lowest. Our findings shed light on the intricate interactions between top-down and bottom-up processes in body ownership, the links between body ownership and multisensory integration, and the impact of texture congruence on the RHI.
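
A minimal sketch of the signal detection theory quantities used in the analysis above: sensitivity (d') and criterion (c) computed from hit and false-alarm rates. The counts are hypothetical, and the log-linear correction is an assumed choice for avoiding infinite z-scores.

```python
# Sketch: d' and criterion from hypothetical yes/no response counts.
from scipy.stats import norm

hits, misses = 42, 8             # "synchronous" responses on synchronous trials
false_alarms, corr_rej = 15, 35  # "synchronous" responses on asynchronous trials

# Log-linear correction guards against hit/FA rates of exactly 0 or 1
hit_rate = (hits + 0.5) / (hits + misses + 1)
fa_rate = (false_alarms + 0.5) / (false_alarms + corr_rej + 1)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
# The study's key result: texture incongruence shifted the bias term (c)
# without changing sensitivity (d') to visuotactile asynchrony.
```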


Subject(s)
Illusions, Touch Perception, Humans, Illusions/physiology, Hand/physiology, Touch Perception/physiology, Touch, Proprioception/physiology, Body Image, Visual Perception/physiology
11.
Proc Biol Sci ; 291(2018): 20232867, 2024 Mar 13.
Article in English | MEDLINE | ID: mdl-38471562

ABSTRACT

A delayed foveal mask affects perception of peripheral stimuli. The effect is determined by the timing of the mask and by its similarity to the peripheral stimulus: a congruent mask enhances performance, while an incongruent one impairs it. It is hypothesized that foveal masks disrupt a feedback mechanism reaching the foveal cortex. This mechanism could be part of a broader circuit associated with mental imagery, but this hypothesis has not yet been tested. We investigated the link between mental imagery and foveal feedback. We tested the relationship between performance fluctuations caused by the foveal mask, measured in terms of discriminability (d') and criterion (C), and the scores from two questionnaires, one designed to assess mental imagery vividness (VVIQ) and another exploring object imagery, spatial imagery, and verbal cognitive styles (OSIVQ). Contrary to our hypotheses, no significant correlations were found between VVIQ and the mask's impact on d' and C. Neither the object nor the spatial subscale of the OSIVQ correlated with the mask's impact. In conclusion, our findings do not substantiate a link between foveal feedback and mental imagery. Further investigation is needed to determine whether mask interference might occur with more implicit measures of imagery.


Subject(s)
Imagination, Visual Perception, Fovea Centralis, Surveys and Questionnaires, Personality
12.
J Exp Biol ; 227(12)2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38752337

ABSTRACT

'Biological motion' refers to the distinctive kinematics observed in many living organisms, where visually perceivable points on the animal move at fixed distances from each other. Across the animal kingdom, many species have developed specialized visual circuitry to recognize such biological motion and to discriminate it from other patterns. Recently, this ability has been observed in the distributed visual system of jumping spiders. These eight-eyed animals use six eyes to perceive motion, while the remaining two (the principal anterior medial eyes) are shifted across the visual scene to further inspect detected objects. When presented with a biologically moving stimulus and a random one, jumping spiders turn to face the latter, clearly demonstrating the ability to discriminate between them. However, it remains unclear whether the principal eyes are necessary for this behavior, whether all secondary eyes can perform this discrimination, or whether a single eye-pair is specialized for this task. Here, we systematically tested the ability of jumping spiders to discriminate between biological and random visual stimuli by testing each eye-pair alone. Spiders were able to discriminate stimuli only when the anterior lateral eyes were unblocked, and performed at chance levels in other configurations. Interestingly, spiders showed a preference for biological motion over random stimuli - unlike in past work. We therefore propose a new model describing how specialization of the anterior lateral eyes for detecting biological motion contributes to multi-eye integration in this system. This integration generates more complex behavior through the combination of simple, single-eye responses. We posit that this in-built modularity may be a solution to the limited resources of these invertebrates' brains, constituting a novel approach to visual processing.


Subject(s)
Motion Perception, Spiders, Animals, Motion Perception/physiology, Spiders/physiology, Eye, Female
13.
J Exp Biol ; 227(7)2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38586934

ABSTRACT

In many animals, ultraviolet (UV) vision guides navigation, foraging, and communication, but few studies have addressed the contribution of UV signals to colour vision, or measured UV discrimination thresholds using behavioural experiments. Here, we tested UV colour vision in an anemonefish (Amphiprion ocellaris) using a five-channel (RGB-V-UV) LED display. We first determined that the maximal sensitivity of the A. ocellaris UV cone was ∼386 nm using microspectrophotometry. Three additional cone spectral sensitivities had maxima at ∼497, 515 and ∼535 nm. We then behaviourally measured colour discrimination thresholds by training anemonefish to distinguish a coloured target pixel from grey distractor pixels of varying intensity. Thresholds were calculated for nine sets of colours with and without UV signals. Using a tetrachromatic vision model, we found that anemonefish were better (i.e. discrimination thresholds were lower) at discriminating colours when target pixels had higher UV chromatic contrast. These colours caused a greater stimulation of the UV cone relative to other cone types. These findings imply that a UV component of colour signals and cues improves their detectability, which likely increases the prominence of anemonefish body patterns for communication and the silhouette of zooplankton prey.
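
A minimal sketch of how cone quantum catches and a simple UV Weber contrast could be computed for a target versus a grey background. The Gaussian cone sensitivities peaking near 386, 497, 515, and 535 nm stand in for the measured A. ocellaris cones; the spectra, flat illuminant, and contrast definition are simplified assumptions, not the paper's full tetrachromatic discrimination model.

```python
# Sketch: cone quantum catches and UV Weber contrast under simplifying assumptions.
import numpy as np

wl = np.arange(300, 701, 1.0)                      # wavelength axis, nm

def gaussian_sens(peak, width=40.0):
    return np.exp(-0.5 * ((wl - peak) / width) ** 2)

cones = {name: gaussian_sens(peak) for name, peak in
         [("UV", 386), ("SWS", 497), ("MWS", 515), ("LWS", 535)]}

illuminant = np.ones_like(wl)                      # flat illuminant (assumption)
target = 0.2 + 0.5 * gaussian_sens(380, 30)        # UV-reflective target (made up)
background = np.full_like(wl, 0.35)                # spectrally flat grey

def catch(reflectance, sens):
    # Quantum catch as a discrete sum over 1-nm steps
    return np.sum(reflectance * illuminant * sens)

uv_contrast = (catch(target, cones["UV"]) - catch(background, cones["UV"])) \
              / catch(background, cones["UV"])
print(f"UV Weber contrast = {uv_contrast:.2f}")
# Higher UV chromatic contrast corresponded to lower behavioural discrimination
# thresholds in the anemonefish.
```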


Subject(s)
Color Vision, Perciformes, Animals, Color, Retinal Cone Photoreceptor Cells/physiology, Color Perception/physiology, Ultraviolet Rays
14.
Exp Brain Res ; 242(2): 385-402, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38135820

ABSTRACT

Vestibular contributions to linear motion (i.e., translation) perception mediated by the otoliths have yet to be fully characterized. To quantify the maximal extent that non-vestibular cues can contribute to translation perception, we assessed vestibular perceptual thresholds in two patients with complete bilateral vestibular ablation to compare to our data in 12 young (< 40 years), healthy controls. Vestibular thresholds were assessed for naso-occipital ("x-translation"), inter-aural ("y-translation"), and superior-inferior ("z-translation") translations in three body orientations (upright, supine, side-lying). Overall, in our patients with bilateral complete vestibular loss, thresholds were elevated ~ 2-45 times relative to healthy controls. No systematic differences in vestibular perceptual thresholds were noted between motions that differed only with respect to their orientation relative to the head (i.e., otoliths) in patients with bilateral vestibular loss. In addition, bilateral loss patients tended to show a larger impairment in the perception of earth-vertical translations (i.e., motion parallel to gravity) relative to earth-horizontal translations, which suggests increased contribution of the vestibular system for earth-vertical motions. However, differences were also noted between the two patients. Finally, with the exception of side-lying x-translations, no consistent effects of body orientation in our bilateral loss patients were seen independent from those resulting from changes in the plane of translation relative to gravity. Overall, our data confirm predominant vestibular contributions to whole-body direction-recognition translation tasks and provide fundamental insights into vestibular contributions to translation motion perception.
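
A minimal sketch of how a direction-recognition threshold can be estimated by fitting a cumulative Gaussian to the proportion of "rightward" responses as a function of signed motion magnitude; the stimulus values and responses are hypothetical, not the patients' data.

```python
# Sketch: cumulative-Gaussian fit yielding bias (mu) and threshold (sigma).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psi(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

displacement = np.array([-4.0, -2.0, -1.0, -0.5, 0.5, 1.0, 2.0, 4.0])  # cm, hypothetical
p_right = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75, 0.90, 0.97])

(mu_hat, sigma_hat), _ = curve_fit(psi, displacement, p_right, p0=[0.0, 1.0])
print(f"bias = {mu_hat:.2f} cm, threshold (sigma) = {sigma_hat:.2f} cm")
# The elevated thresholds in bilateral vestibular-loss patients correspond to a
# much larger fitted sigma than in healthy controls.
```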


Subject(s)
Motion Perception, Vestibule of the Labyrinth, Humans, Motion (Physics), Gravitation
15.
Audiol Neurootol ; 29(5): 374-381, 2024.
Article in English | MEDLINE | ID: mdl-38471471

ABSTRACT

INTRODUCTION: Gap detection tests are crucial clinical tools for identifying auditory processing disorders that result from abnormalities in the central auditory nervous system. These tests assess the ability to resolve temporal information in sounds, which aids in the diagnosis of auditory temporal processing issues. This study explores the directional effects of marker frequencies on gap detection tasks with respect to the conditions of long and short frequency disparity (separation). METHODS: We measured the gap detection thresholds (GDTs) using four across-channel narrowband noise conditions (1-2, 2-1, 1-4, and 4-1 kHz). A within-subject study design involved 29 healthy individuals with normal hearing. Stimuli were presented monaurally using headphones routed via a calibrated audiometer. RESULTS: The condition with long frequency disparity and a low leading frequency (1-4 kHz) exhibited higher GDTs compared to the other across-channel conditions. However, we did not observe this effect in the other condition with long frequency disparity and a high leading frequency (4-1 kHz), which did not show significant differences from the two conditions with short frequency disparity. CONCLUSION: The study findings suggest that the combined effects of the spectral characteristics of the gap markers, including frequency disparity and order of presentation, influence the temporal resolution ability of auditory gap detection. Clinicians evaluating a patient suspected to have central auditory disorders should recognize that the across-channel GDTs may not consistently increase as the frequency separation between the markers increases.


Subject(s)
Auditory Threshold, Humans, Male, Female, Adult, Young Adult, Auditory Perception/physiology, Acoustic Stimulation
16.
Cereb Cortex ; 33(4): 1361-1382, 2023 02 07.
Article in English | MEDLINE | ID: mdl-35417918

ABSTRACT

To address the question of which neocortical layers and cell types are important for the perception of a sensory stimulus, we performed multielectrode recordings in the barrel cortex of head-fixed mice performing a single-whisker go/no-go detection task with vibrotactile stimuli of differing intensities. We found that behavioral detection probability decreased gradually over the course of each session, which was well explained by a signal detection theory-based model that posits stable psychometric sensitivity and a variable decision criterion updated after each reinforcement, reflecting decreasing motivation. Analysis of multiunit activity demonstrated the highest neurometric sensitivity in layer 4, which was achieved within only 30 ms after stimulus onset. At the level of single neurons, we observed substantial heterogeneity of neurometric sensitivity within and across layers, ranging from nonresponsiveness to approaching or even exceeding psychometric sensitivity. In all cortical layers, putative inhibitory interneurons on average proffered higher neurometric sensitivity than putative excitatory neurons. In infragranular layers, neurons increasing firing rate in response to stimulation featured higher sensitivities than neurons decreasing firing rate. Offline machine-learning-based analysis of videos of behavioral sessions showed that mice performed better when not moving, which, at the neuronal level, was reflected by increased stimulus-evoked firing rates.
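
A minimal sketch of a signal-detection model with fixed sensitivity and a criterion that drifts after each reinforcement, which is the qualitative structure described above; the specific update rule and parameter values are assumptions, not the authors' fitted model.

```python
# Sketch: fixed d', drifting criterion -> within-session decline in hit rate.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
d_prime, c, delta = 2.0, 0.0, 0.02   # stable d', starting criterion, drift step

hits = []
for trial in range(300):
    p_hit = norm.cdf(d_prime / 2 - c)   # go-trial detection probability
    hit = rng.random() < p_hit
    hits.append(hit)
    if hit:
        c += delta                      # each reward nudges the criterion upward
                                        # (decreasing motivation -> more conservative)

print("early hit rate:", np.mean(hits[:50]))
print("late hit rate: ", np.mean(hits[-50:]))
# Detection probability falls over the session even though d' never changes.
```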


Subject(s)
Neurons, Vibrissae, Animals, Vibrissae/physiology, Neurons/physiology, Interneurons, Somatosensory Cortex/physiology
17.
Somatosens Mot Res ; : 1-8, 2024 May 29.
Article in English | MEDLINE | ID: mdl-38812257

ABSTRACT

AIM OF THE STUDY: Brain-computer interfaces (BCIs) may help patients with severe neurological deficits communicate with the external world. Based on microelectrocorticography (µECoG) data recorded from the primary somatosensory cortex (S1) of unrestrained behaving rats, this study attempts to decode lever presses in a psychophysical detection task by using machine learning algorithms. MATERIALS AND METHODS: 16-channel Pt-Ir microelectrode arrays were implanted on the S1 of two rats, and µECoG was recorded during a vibrotactile yes/no detection task. For this task, the rats were trained to press the right lever when they detected the vibrotactile stimulus and the left lever when they did not. The multichannel µECoG data were analysed offline by time-frequency methods, and the resulting features were used for binary classification of the lever press on each trial. Several machine learning algorithms were tested for this purpose. RESULTS: The psychophysical sensitivities (A') were similar and low for both rats (0.58). Rat 2 (B'': -0.11) had a higher bias for the right lever than Rat 1 (B'': -0.01). The lever presses could be predicted with accuracies over 66% with all the tested algorithms, and the highest average accuracy (78%) was achieved with the support vector machine. CONCLUSION: According to recent studies, sensory feedback increases the benefit of BCIs. The current proof-of-concept study shows that lever presses can be decoded from S1; therefore, this area may be utilised for a bidirectional BCI in the future.
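
A minimal sketch of the nonparametric behavioural indices reported for the two rats, using one common form of A' and Grier's B'' for the case where the hit rate exceeds the false-alarm rate; the hit and false-alarm rates below are hypothetical.

```python
# Sketch: nonparametric sensitivity (A') and bias (B'') from hit/false-alarm rates.
def a_prime(h, f):
    # Valid for h >= f
    return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

def b_double_prime(h, f):
    # Grier's B''; negative values indicate a liberal ("yes"/right-lever) bias
    return (h * (1 - h) - f * (1 - f)) / (h * (1 - h) + f * (1 - f))

hit_rate, fa_rate = 0.60, 0.52   # hypothetical rates
print(f"A'  = {a_prime(hit_rate, fa_rate):.2f}")
print(f"B'' = {b_double_prime(hit_rate, fa_rate):.2f}")
# Values of A' near 0.5 indicate low psychophysical sensitivity, consistent with
# the A' of 0.58 reported for both rats.
```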

18.
Article in English | MEDLINE | ID: mdl-39294391

ABSTRACT

PURPOSE: Visual acuity is a psychophysical threshold that we want to determine as precisely and efficiently as possible. The Freiburg Vision Test FrACT employs the automated Bayesian "Best PEST" algorithm for this purpose: the next optotype size is always selected to be at threshold based on the information acquired so far, thereby maximizing information gain. METHODS: We assessed the test-retest Limits of Agreement (LoA, Bland & Altman 1986) across 6 to 48 trials in 2 × 78 runs involving 26 participants; visual acuity (in part artificially reduced) ranged from 1.22 to -0.59 LogMAR. RESULTS: LoA exhibited a steep decline from ± 0.46 LogMAR at six trials to ± 0.17 at 18 trials; with more trials, LoA showed less change, reaching ± 0.12 LogMAR at 48 trials. LoA did not significantly change over the wide acuity range assessed here. CONCLUSION: These findings suggest that 18 trials represent an efficient balance between precision and burden on the participant and examiner. This observation holds for the eight response alternatives used in this study (8 Landolt C orientations) and is anticipated to apply to the ten Sloan letters as well. With only four choices (e.g., tumbling E), more trials will be necessary. KEY MESSAGES: What is known: when assessing visual acuity, a tradeoff between precision and effort is necessary. What is new: a run length of 18 trials is a good compromise between effort and precision for an 8-alternative task (the Landolt C); with 18 trials, a 95% confidence interval of ± 0.17 LogMAR for test-retest is found; the test-retest precision is independent of the acuity level over the 1.5 LogMAR range studied here.
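
A minimal sketch of the test-retest limits-of-agreement computation (Bland & Altman, 1986) applied to acuity estimates from two runs; the LogMAR values are hypothetical.

```python
# Sketch: Bland-Altman bias and 95% limits of agreement for test-retest acuity.
import numpy as np

test   = np.array([0.10, -0.05, 0.32, 0.80, -0.12, 0.55])   # LogMAR, run 1 (made up)
retest = np.array([0.14,  0.02, 0.25, 0.71, -0.20, 0.60])   # LogMAR, run 2 (made up)

diff = test - retest
bias = diff.mean()
loa_half_width = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement
print(f"bias = {bias:+.3f} LogMAR, LoA = +/- {loa_half_width:.3f} LogMAR")
# In the study, this half-width shrank from about +/-0.46 LogMAR at 6 trials to
# +/-0.17 at 18 trials, motivating 18 trials as the recommended run length.
```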

19.
Learn Behav ; 2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38468107

ABSTRACT

Anglada-Tort et al. (Current Biology, 33, 1472-1486.e12, 2023) conducted a large-scale iterative learning study with cross-cultural human participants to understand how musical structure emerges. Together with archaeological, developmental, and historical cross-cultural music data, as well as cross-species studies, we can begin to elucidate the origins of music.

20.
Perception ; 53(2): 75-92, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37946509

ABSTRACT

During coordinated flight and centrifugation, pilots show interindividual variability in perceived roll tilt. This study explored how that variability relates to perceptual and cognitive functions. Twelve pilots underwent three 6-min centrifugations on two occasions (G levels: 1.1G, 1.8G, and 2.5G; gondola tilts: 25°, 56°, and 66°). The subjective visual horizontal (SVH) was measured with an adjustable luminous line, and the pilots gave estimates of the experienced G level. Afterward, they were questioned about the relationship between G level and roll tilt and adjusted the line to numerically specified angles. Generally, the roll tilt during centrifugation was underestimated, and there was large interindividual variability. Both knowledge of the relationship between G level and bank angle and the ability to adjust the line to given angles contributed to the prediction of SVH in a multiple regression model. However, in most cases, SVH was substantially smaller than predictions based on these specific abilities.
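
A minimal sketch of a multiple regression predicting SVH from two ability measures, mirroring the model described above; all scores and tilt values are hypothetical placeholders for the twelve pilots' data.

```python
# Sketch: ordinary least-squares regression of SVH on two predictors.
import numpy as np

knowledge  = np.array([3, 5, 2, 4, 6, 1, 5, 3, 4, 2, 6, 5], dtype=float)  # hypothetical G-vs-tilt knowledge score
line_skill = np.array([4, 5, 3, 6, 7, 2, 5, 4, 6, 3, 7, 5], dtype=float)  # hypothetical line-adjustment accuracy
svh_deg    = np.array([30, 42, 22, 35, 48, 18, 44, 28, 36, 24, 50, 41], dtype=float)

X = np.column_stack([np.ones_like(knowledge), knowledge, line_skill])
beta, *_ = np.linalg.lstsq(X, svh_deg, rcond=None)
print(f"intercept = {beta[0]:.1f}, b_knowledge = {beta[1]:.1f}, b_line = {beta[2]:.1f}")
# Positive weights for both predictors would indicate that each ability helps
# explain the interindividual variability in perceived roll tilt.
```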


Subject(s)
Pilots, Humans, Centrifugation