1.
bioRxiv; 2023 Aug 30.
Article in English | MEDLINE | ID: mdl-36711722

ABSTRACT

The potential negative impact of head movement during fMRI has long been appreciated. Although a variety of prospective and retrospective approaches have been developed to help mitigate these effects, reducing head movement in the first place remains the most appealing strategy for optimizing data quality. Real-time interventions, in which participants are provided feedback regarding their scan-to-scan motion, have recently shown promise in reducing motion during resting state fMRI. However, whether feedback might similarly reduce motion during task-based fMRI is an open question. In particular, it is unclear whether participants can effectively monitor motion feedback while attending to task-related demands. Here we assessed whether a combination of real-time and between-run feedback could reduce head motion during task-based fMRI. During an auditory word repetition task, 78 adult participants (aged 19-81) were pseudorandomly assigned to receive feedback or not. Feedback was provided by FIRMM software, which used real-time calculation of realignment parameters to estimate participant motion. We quantified movement using framewise displacement (FD). We found that motion feedback resulted in a statistically significant reduction in participant head motion, with a small-to-moderate effect size (reducing average FD from 0.347 to 0.282). Reductions were most apparent in high-motion events. We conclude that under some circumstances real-time feedback may reduce head motion during task-based fMRI, although its effectiveness may depend on the specific participant population and task demands of a given study.
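The framewise displacement measure used above has a widely used formulation: the sum of absolute frame-to-frame changes across the six realignment parameters, with the three rotations converted to millimeters of arc on an assumed 50 mm head radius. A minimal sketch of that calculation (an illustration, not the paper's actual pipeline):

```python
import numpy as np

def framewise_displacement(params, head_radius_mm=50.0):
    """Framewise displacement from a (T, 6) array of realignment
    parameters: 3 translations (mm) and 3 rotations (radians).
    Rotations become arc length on a sphere of head_radius_mm."""
    motion = np.asarray(params, dtype=float).copy()
    motion[:, 3:] *= head_radius_mm          # radians -> mm of arc
    diffs = np.abs(np.diff(motion, axis=0))  # frame-to-frame change
    fd = diffs.sum(axis=1)
    return np.concatenate([[0.0], fd])       # FD is 0 for the first frame

# Toy example: a small translation, then a small rotation
rp = np.array([
    [0.0, 0.0, 0.0, 0.0,   0.0, 0.0],
    [0.1, 0.0, 0.0, 0.0,   0.0, 0.0],   # 0.1 mm translation in x
    [0.1, 0.0, 0.0, 0.002, 0.0, 0.0],   # 0.002 rad pitch
])
fd = framewise_displacement(rp)
# fd -> [0.0, 0.1, 0.1]  (0.002 rad * 50 mm = 0.1 mm)
```

The average-FD values reported in the abstract (0.347 vs. 0.282 mm) would be the mean of such per-frame values across each participant's run.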

2.
J Neurosci; 42(3): 435-442, 2022 Jan 19.
Article in English | MEDLINE | ID: mdl-34815317

ABSTRACT

In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is degraded. Here, we used fMRI to monitor brain activity while adult humans (n = 60) were presented with visual-only, auditory-only, and audiovisual words. The audiovisual words were presented in quiet and in several signal-to-noise ratios. As expected, audiovisual speech perception recruited both auditory and visual cortex, with some evidence for increased recruitment of premotor cortex in some conditions (including in substantial background noise). We then investigated neural connectivity using psychophysiological interaction analysis with seed regions in both primary auditory cortex and primary visual cortex. Connectivity between auditory and visual cortices was stronger in audiovisual conditions than in unimodal conditions, including a wide network of regions in posterior temporal cortex and prefrontal cortex. In addition to whole-brain analyses, we also conducted a region-of-interest analysis on the left posterior superior temporal sulcus (pSTS), implicated in many previous studies of audiovisual speech perception. We found evidence for both activity and effective connectivity in pSTS for visual-only and audiovisual speech, although these were not significant in whole-brain analyses. Together, our results suggest a prominent role for cross-region synchronization in understanding both visual-only and audiovisual speech that complements activity in integrative brain regions like pSTS.

SIGNIFICANCE STATEMENT: In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is hard to understand (e.g., background noise).
Prior work has suggested that specialized regions of the brain may play a critical role in integrating information from visual and auditory speech. Here, we show that a complementary mechanism, relying on synchronized brain activity among sensory and motor regions, may also play a critical role. These findings encourage reconceptualizing audiovisual integration in the context of coordinated network activity.
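The psychophysiological interaction (PPI) analysis mentioned above rests on an interaction regressor: the seed region's timeseries multiplied, timepoint by timepoint, by the (centered) task regressor, entered into a GLM alongside the two main effects. A simplified sketch under stated assumptions (all names illustrative; a full PPI would typically deconvolve the BOLD signal to the neural level before forming the interaction):

```python
import numpy as np

def ppi_regressors(seed_ts, task_reg):
    """Build simplified PPI design columns.
    seed_ts : (T,) BOLD timeseries from the seed (e.g., A1 or V1)
    task_reg: (T,) task regressor (the psychological variable)
    Returns an (T, 3) array: seed, task, and their interaction."""
    seed = seed_ts - seed_ts.mean()      # center the physiological term
    task = task_reg - task_reg.mean()    # center the psychological term
    interaction = seed * task            # the PPI term
    return np.column_stack([seed, task, interaction])

# Toy example with a block design and a task-coupled seed
rng = np.random.default_rng(0)
T = 200
task = np.tile([0.0] * 10 + [1.0] * 10, T // 20)   # alternating blocks
seed = 0.5 * task + rng.normal(scale=0.3, size=T)  # noisy, task-coupled
X = ppi_regressors(seed, task)
# X has shape (200, 3): seed, task, interaction
```

A significant weight on the interaction column at some target voxel is then interpreted as the seed-target coupling changing with task condition, which is the sense in which the abstract reports stronger auditory-visual connectivity in audiovisual conditions.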


Subject(s)
Auditory Cortex/physiology, Language, Lipreading, Nerve Net/physiology, Speech Perception/physiology, Visual Cortex/physiology, Visual Perception/physiology, Adult, Aged, Aged 80 and over, Auditory Cortex/diagnostic imaging, Female, Humans, Magnetic Resonance Imaging, Male, Middle Aged, Nerve Net/diagnostic imaging, Visual Cortex/diagnostic imaging, Young Adult
3.
Neurobiol Lang (Camb); 1(4): 452-473, 2020.
Article in English | MEDLINE | ID: mdl-34327333

ABSTRACT

Understanding spoken words requires the rapid matching of a complex acoustic stimulus with stored lexical representations. The degree to which brain networks supporting spoken word recognition are affected by adult aging remains poorly understood. In the current study, we used fMRI to measure the brain responses to spoken words in two conditions: an attentive listening condition, in which no response was required, and a repetition task. Listeners were 29 young adults (aged 19-30 years) and 32 older adults (aged 65-81 years) without self-reported hearing difficulty. We found largely similar patterns of activity during word perception for both young and older adults, centered on the bilateral superior temporal gyrus. As expected, the repetition condition resulted in significantly more activity in areas related to motor planning and execution (including the premotor cortex and supplementary motor area) compared to the attentive listening condition. Importantly, however, older adults showed significantly less activity in probabilistically defined auditory cortex than young adults when listening to individual words in both the attentive listening and repetition tasks. Age differences in auditory cortex activity were seen selectively for words (no age differences were present for 1-channel vocoded speech, used as a control condition), and could not be easily explained by accuracy on the task, movement in the scanner, or hearing sensitivity (available for a subset of participants). These findings indicate largely similar patterns of brain activity for young and older adults when listening to words in quiet, but suggest less recruitment of auditory cortex by the older adults.

4.
Am J Physiol Endocrinol Metab; 300(1): E202-10, 2011 Jan.
Article in English | MEDLINE | ID: mdl-21045176

ABSTRACT

Neurokinin B (NKB) and its cognate receptor neurokinin 3 (NK3R) play a critical role in reproduction. NKB and NK3R are coexpressed with dynorphin (Dyn) and kisspeptin (Kiss1) genes in neurons of the arcuate nucleus (Arc). However, the mechanisms of action of NKB as a cotransmitter with kisspeptin and dynorphin remain poorly understood. We explored the role of NKB in the control of luteinizing hormone (LH) secretion in the female rat as follows. 1) We examined the effect of an NKB agonist (senktide, 600 pmol, administered into the lateral cerebral ventricle) on LH secretion. In the presence of physiological levels of estradiol (E2), senktide induced a profound increase in serum levels of LH and a 10-fold increase in the number of Kiss1 neurons expressing c-fos in the Arc (P < 0.01 for both). 2) We mapped the distribution of NKB and NK3R mRNAs in the central forebrain and found that both are widely expressed, with intense expression in several hypothalamic nuclei that control reproduction, including the Arc. 3) We studied the effect of E2 on the expression of NKB and NK3R mRNAs in the Arc and found that E2 inhibits the expression of both genes (P < 0.01) and that the expression of NKB and NK3R reaches its nadir on the afternoon of proestrus (when circulating levels of E2 are high). These observations suggest that NKB/NK3R signaling in Kiss1/NKB/Dyn-producing neurons in the Arc has a pivotal role in the control of gonadotropin-releasing hormone (GnRH)/LH secretion and its regulation by E2-dependent negative feedback in the rat.


Subject(s)
Arcuate Nucleus of Hypothalamus/metabolism, Gonadotropin-Releasing Hormone/metabolism, Neurokinin B/metabolism, Neurons/metabolism, Proteins/metabolism, Receptors, Neurokinin-3/metabolism, Signal Transduction, Animals, Arcuate Nucleus of Hypothalamus/cytology, Arcuate Nucleus of Hypothalamus/drug effects, Estradiol/metabolism, Estrous Cycle/metabolism, Feedback, Physiological, Female, Gene Expression Regulation, Kisspeptins, Luteinizing Hormone/blood, Neurokinin B/agonists, Neurokinin B/genetics, Neurons/drug effects, Organ Specificity, Peptide Fragments/pharmacology, Prosencephalon/cytology, Prosencephalon/metabolism, Proteins/genetics, Proto-Oncogene Proteins c-fos/genetics, Proto-Oncogene Proteins c-fos/metabolism, RNA, Messenger/metabolism, Rats, Rats, Wistar, Receptors, Neurokinin-3/agonists, Receptors, Neurokinin-3/genetics, Signal Transduction/drug effects, Substance P/analogs & derivatives, Substance P/pharmacology