ABSTRACT
BACKGROUND: People living with chronic obstructive pulmonary disease (COPD) have an increased risk of experiencing cardiovascular (CV) events, particularly after an exacerbation. Such CV burden is not yet known for incident COPD patients. We examined the risk of severe CV events in incident COPD patients in the periods following moderate or severe exacerbations. METHODS: Persons aged ≥ 40 years with an incident COPD diagnosis from the PHARMO Data Network were included. Exposed time periods were days 1-7, 8-14, 15-30, 31-180 and 181-365 following an exacerbation. Moderate exacerbations were defined as those managed in outpatient settings; severe exacerbations as those requiring hospitalisation. The outcome was a composite of time to first severe CV event (acute coronary syndrome, heart failure decompensation, cerebral ischaemia, or arrhythmia) or death. Hazard ratios (HR) were estimated for the association between each exposed period and the outcome. RESULTS: 8020 patients with newly diagnosed COPD were identified. 2234 patients (28%) had ≥ 1 exacerbation, 631 patients (8%) had a non-fatal CV event, and 461 patients (5%) died during a median follow-up of 36 months. The risk of experiencing the composite outcome was increased following a moderate or severe exacerbation compared with periods of stable disease [HR ranging from 15.3 (95% confidence interval 11.8-20.0) in days 1-7 to 1.3 (1.0-1.8) in days 181-365]. After a moderate exacerbation, the risk was increased over the first 180 days [HR 2.5 (1.3-4.8) in days 1-7 to 1.6 (1.3-2.1) in days 31-180]. After a severe exacerbation, the risk increased substantially and remained elevated over the year following the exacerbation [HR 48.6 (36.9-64.0) in days 1-7 down to 1.6 (1.0-2.6) in days 181-365]. The increase in risk concerned all categories of severe CV events.
CONCLUSIONS: Among incident COPD patients, we observed a substantial increase in the risk of severe CV events or all-cause death following either a moderate or a severe exacerbation of COPD. The increase in risk was highest in the initial period following an exacerbation. These findings highlight the significant cardiopulmonary burden among people living with COPD, even those with a new diagnosis.
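The exposed risk windows used in this design can be sketched as a small helper that splits post-exacerbation follow-up into the study's periods, truncated at the end of follow-up (an illustrative sketch; the function and variable names are ours, not from the study):

```python
def exposure_windows(exac_day, follow_up_end):
    """Split follow-up after an exacerbation occurring on day `exac_day`
    into the study's exposed risk windows (days 1-7, 8-14, 15-30,
    31-180, 181-365), truncated at `follow_up_end`.
    Returns (label, first_day, last_day) tuples in study-day units."""
    bounds = [(1, 7), (8, 14), (15, 30), (31, 180), (181, 365)]
    windows = []
    for lo, hi in bounds:
        start = exac_day + lo
        stop = min(exac_day + hi, follow_up_end)
        if start <= stop:  # drop windows that lie entirely past end of follow-up
            windows.append((f"days {lo}-{hi}", start, stop))
    return windows
```

Person-time falling outside all windows counts as stable-disease (reference) time in a time-varying Cox model.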
Subject(s)
Cardiovascular Diseases , Chronic Obstructive Pulmonary Disease , Humans , Cohort Studies , Netherlands/epidemiology , Chronic Obstructive Pulmonary Disease/diagnosis , Chronic Obstructive Pulmonary Disease/epidemiology , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/epidemiology , Disease Progression
ABSTRACT
BACKGROUND AND OBJECTIVES: Frequent blood donation depletes the iron stores of blood donors. Iron depletion may lead to anaemia, but the health effects of iron depletion without anaemia in healthy blood donors are not well understood. Using the FinDonor cohort, we studied whether worsening of blood donors' self-rated health during the study period was associated with biomarkers of iron status or with self-reported changes in lifestyle. MATERIALS AND METHODS: We included 1416 participants from the cohort who answered an 89-item questionnaire on their health and lifestyle during their enrolment visit and again at the end of the study. We performed multivariate logistic regression to test whether blood donation-related factors affected the probability of reporting worsened health. To set these findings into a more holistic context of health, we subsequently analysed all other questionnaire items in a data-driven exploratory analysis. RESULTS: We found that donation frequency in men and post-menopausal women, and ferritin level in men only, were negatively associated with worsened health between questionnaires. In the exploratory analysis, stable physical condition was the only questionnaire item negatively associated with worsened health in both women and men. CONCLUSION: Our results suggest that a low ferritin level is associated with worsened health even in non-anaemic repeat donors, although when health is analysed more holistically, ferritin and other factors primarily related to blood donation lose their importance.
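The kind of model described — logistic regression of a binary "reported worsened health" outcome on donation-related covariates — can be sketched minimally in numpy (illustrative only; the covariate choice and coding here are our assumptions, not the study's):

```python
import numpy as np

def fit_logistic(X, y, n_iter=500, lr=0.5):
    """Fit logistic regression P(worsened health = 1 | X) by gradient
    ascent on the average log-likelihood.
    Returns weights: w[0] is the intercept, w[1:] the coefficients."""
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w += lr * X.T @ (y - p) / len(y)       # average gradient step
    return w
```

A fitted coefficient is the change in log-odds of reporting worsened health per unit of the covariate (e.g. standardized ferritin); a negative sign corresponds to the direction of association reported above.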
Subject(s)
Iron-Deficiency Anemia , Blood Donors , Cohort Studies , Female , Ferritins , Humans , Iron , Male
ABSTRACT
BACKGROUND AND OBJECTIVES: There is increasing evidence that frequent blood donation depletes the iron stores of some blood donors. The FinDonor 10 000 study was set up to study iron status and the factors affecting iron stores in Finnish blood donors. In Finland, iron supplementation for at-risk groups has been in place since the 1980s. MATERIAL AND METHODS: A total of 2584 blood donors (N = 8003 samples) were recruited into the study alongside standard donation at three donation sites in the capital region of Finland between 5/2015 and 12/2017. All participants were asked to fill out a questionnaire about their health and lifestyle. Blood samples were collected from the sample pouch of the whole blood collection set, kept cool and processed centrally. Whole blood count, CRP, ferritin and sTfR were measured from the samples, and DNA was isolated for genome-wide association studies. RESULTS: Participant demographics, albeit in general similar to those of the general blood donor population in Finland, indicated some bias towards older and more frequent donors. Participation in the study increased the median donation frequency of the donors. Analysis of the effects of the time lag from sampling to analysis and of the time of day when the sample was drawn revealed small but significant time-dependent changes. CONCLUSION: The FinDonor cohort now provides us with tools to identify potential donor groups at increased risk of iron deficiency and the factors explaining this risk. The increase in donation frequency during the study suggests that scientific projects can be used to increase the commitment of blood donors.
Subject(s)
Blood Donors/statistics & numerical data , Ferritins/blood , Iron/blood , Adult , Cohort Studies , Female , Finland , Humans , Iron Deficiencies , Male , Middle Aged
ABSTRACT
Stimuli may induce only partial consciousness-an intermediate between null and full consciousness-in which the presence but not the identity of an object can be reported. The differences in the neuronal bases of full and partial consciousness are poorly understood. We investigated whether evoked and oscillatory activity could dissociate full from partial conscious perception. We recorded human cortical activity with magnetoencephalography (MEG) during a visual perception task in which a stimulus could be either partially or fully perceived. Partial consciousness was associated with an early increase in evoked activity and theta/low-alpha-band oscillations, while full consciousness was additionally associated with late evoked activity and beta-band oscillations. Full consciousness was dissociated from partial consciousness by stronger evoked activity and a late increase in theta oscillations that were localized to higher-order visual regions and to posterior parietal and prefrontal cortices. Our results reveal that both evoked activity and theta oscillations dissociate partial and full consciousness.
Subject(s)
Brain Waves/physiology , Cerebral Cortex/physiology , Consciousness/physiology , Evoked Potentials/physiology , Visual Perception/physiology , Adult , Brain Mapping , Female , Humans , Magnetoencephalography , Male , Young Adult
ABSTRACT
HLA matching is a prerequisite for successful allogeneic hematopoietic stem cell transplantation (HSCT) because it lowers the occurrence and severity of graft-versus-host disease (GVHD). However, matching only a few alleles of the classical HLA genes may not ensure matching of the entire MHC region. HLA haplotype matching has been reported to be beneficial in HSCT because of variation relevant to GVHD risk in the non-HLA region. Because polymorphism in the MHC is highly population specific, we hypothesized that donors from the Finnish registry are more likely to be matched at a higher level with Finnish patients than donors from other registries. In the present study we determined 25 single nucleotide polymorphisms (SNPs) of the complement component 4 (C4) gene in the γ-block segment of the MHC from 115 Finnish HSCT patients and their Finnish (n = 201) and non-Finnish (n = 280) donor candidates. Full matching of HLA alleles and C4 SNPs, independently or additively, occurred more often in the Finnish-Finnish group than in the Finnish-non-Finnish group (P < .003). This was most striking in cases with HLA haplotypes typical of the Finnish population. Patients with ancestral HLA haplotypes (AH) were more likely to find a fully HLA- and C4-matched donor, regardless of donor origin, than patients without AH (P < .0001). Despite the clear differences at the population level, we could not find a statistical association between C4 matching and clinical outcome. The results suggest that screening C4 SNPs can be advantageous when extended MHC matching or HLA haplotype matching in HSCT is required. This study also supports the need for small population-specific stem cell registries.
Subject(s)
Complement C4/genetics , Hematopoietic Stem Cell Transplantation/methods , Histocompatibility/immunology , Unrelated Donors , Adult , Complement C4/immunology , Finland , Haplotypes/genetics , Haplotypes/immunology , Humans , Single Nucleotide Polymorphism , Registries
ABSTRACT
Visuospatial attention prioritizes the processing of attended visual stimuli. It is characterized by lateralized alpha-band (8-14 Hz) amplitude suppression in visual cortex and increased neuronal activity in a network of frontal and parietal areas. It has remained unknown which mechanisms coordinate neuronal processing among the frontoparietal network and visual cortices and implement the attention-related modulations of alpha-band amplitudes and behavior. We investigated whether large-scale network synchronization could be such a mechanism. We recorded human cortical activity with magnetoencephalography (MEG) during a visuospatial attention task. We then identified the frequencies and anatomical networks of inter-areal phase synchronization from source-localized MEG data. We found that visuospatial attention is associated with robust and sustained long-range synchronization of cortical oscillations exclusively in the high-alpha (10-14 Hz) frequency band. This synchronization connected frontal, parietal and visual regions and was observed concurrently with amplitude suppression of low-alpha (6-9 Hz) band oscillations in visual cortex. Furthermore, stronger high-alpha phase synchronization was associated with decreased reaction times to attended stimuli and larger suppression of alpha-band amplitudes. These results thus show that high-alpha band phase synchronization is functionally significant and could coordinate the neuronal communication underlying the implementation of visuospatial attention.
Subject(s)
Attention/physiology , Cerebral Cortex/physiology , Cortical Synchronization/physiology , Adult , Female , Humans , Magnetic Resonance Imaging , Magnetoencephalography , Male , Photic Stimulation , Visual Perception/physiology , Young Adult
ABSTRACT
Inter-areal functional connectivity (FC), neuronal synchronization in particular, is thought to constitute a key systems-level mechanism for coordination of neuronal processing and communication between brain regions. Evidence to support this hypothesis has been gained largely using invasive electrophysiological approaches. In humans, neuronal activity can be non-invasively recorded only with magneto- and electroencephalography (MEG/EEG), which have been used to assess FC networks with high temporal resolution and whole-scalp coverage. However, even in source-reconstructed MEG/EEG data, signal mixing, or "source leakage", is a significant confounder for FC analyses and network localization. Signal mixing leads to two distinct kinds of false-positive observations: artificial interactions (AI) caused directly by mixing and spurious interactions (SI) arising indirectly from the spread of signals from true interacting sources to nearby false loci. To date, several interaction metrics have been developed to solve the AI problem, but the SI problem has remained largely intractable in MEG/EEG all-to-all source connectivity studies. Here, we advance a novel approach for correcting SI in FC analyses using source-reconstructed MEG/EEG data. Our approach is to bundle observed FC connections into hyperedges by their adjacency in signal mixing. Using realistic simulations, we show that bundling yields hyperedges with good separability of true positives and little loss in the true positive rate. Hyperedge bundling thus significantly decreases graph noise by minimizing the false-positive to true-positive ratio. Finally, we demonstrate the advantage of hyperedge bundling in the visualization of large-scale cortical networks with real MEG data. We propose that hypergraphs yielded by bundling represent well the set of true cortical interactions that are detectable and dissociable in MEG/EEG connectivity analysis.
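To make the bundling idea concrete, here is a minimal sketch (our own illustrative code, not the authors' implementation): edges whose endpoints are mutually adjacent in a source-to-source mixing matrix are merged into one hyperedge with a union-find pass.

```python
def bundle_hyperedges(edges, mixing, thr=0.5):
    """Bundle observed FC edges into hyperedges: two edges join the same
    bundle when both endpoint pairs are close in signal mixing
    (mixing[u][v] >= thr), so that the cloud of spurious edges spread
    around a true edge collapses into a single hyperedge.
    `edges` is a list of (source_i, source_j) index pairs."""
    parent = list(range(len(edges)))

    def find(a):  # union-find root with path compression
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def close(u, v):  # same source, or strong mutual mixing
        return u == v or mixing[u][v] >= thr

    for a in range(len(edges)):
        for b in range(a + 1, len(edges)):
            (i1, j1), (i2, j2) = edges[a], edges[b]
            if (close(i1, i2) and close(j1, j2)) or \
               (close(i1, j2) and close(j1, i2)):
                parent[find(a)] = find(b)  # merge the two edges

    bundles = {}
    for a, e in enumerate(edges):
        bundles.setdefault(find(a), []).append(e)
    return list(bundles.values())
```

In real use the mixing matrix would be derived from cross-talk between the source model's point-spread functions; the threshold governs how coarse the resulting hypergraph is.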
Subject(s)
Brain/physiology , Electroencephalography/methods , Magnetoencephalography/methods , Nerve Net/physiology , Computer-Assisted Signal Processing , Brain Mapping/methods , Computer Simulation , Humans , Neurological Models
ABSTRACT
We introduce here phase transfer entropy (Phase TE) as a measure of directed connectivity among neuronal oscillations. Phase TE quantifies the transfer entropy between phase time series extracted from neuronal signals, for instance by band-pass filtering. To validate the measure, we used coupled neuronal mass models both to evaluate the characteristics of Phase TE and to compare its performance with that of a real-valued TE implementation. We showed that Phase TE detects the strength and direction of connectivity even in the presence of the amounts of noise and linear mixing that typically characterize MEG and EEG recordings. Phase TE performed well across a wide range of analysis lags and sample sizes. Comparisons between Phase TE and real-valued TE estimates showed that Phase TE is more robust to nuisance parameters and considerably more efficient computationally. In addition, Phase TE accurately untangled bidirectional frequency-band-specific interaction patterns that confounded real-valued TE. Finally, we found that surrogate data can be used to construct appropriate null-hypothesis distributions and to estimate the statistical significance of Phase TE. These results hence suggest that Phase TE is well suited for the estimation of directed phase-based connectivity in large-scale investigations of the human functional connectome.
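As an illustration of the quantity being computed, the sketch below estimates TE between Hilbert-transform phases with a simple binned-histogram estimator (the phase extraction, binning scheme and parameter names are our illustrative choices, not the authors' implementation):

```python
import numpy as np
from scipy.signal import hilbert

def phase_te(x, y, lag=1, n_bins=8):
    """Binned-histogram estimate (in bits) of phase transfer entropy
    X -> Y: TE = sum p(y_f, y_p, x_p) * log2[ p(y_f|y_p,x_p) / p(y_f|y_p) ].
    Signals are assumed to be already band-pass filtered."""
    # instantaneous phases in [0, 2*pi), discretised into n_bins bins
    px = np.angle(hilbert(x)) % (2 * np.pi)
    py = np.angle(hilbert(y)) % (2 * np.pi)
    bx = np.minimum((px / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
    by = np.minimum((py / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
    yf, yp, xp = by[lag:], by[:-lag], bx[:-lag]  # future / past samples
    # joint histogram p(y_future, y_past, x_past) and its marginals
    joint = np.zeros((n_bins,) * 3)
    np.add.at(joint, (yf, yp, xp), 1)
    joint /= joint.sum()
    p_yp_xp = joint.sum(axis=0)    # p(y_past, x_past)
    p_yf_yp = joint.sum(axis=2)    # p(y_future, y_past)
    p_yp = joint.sum(axis=(0, 2))  # p(y_past)
    te = 0.0
    for i in range(n_bins):
        for j in range(n_bins):
            for k in range(n_bins):
                pj = joint[i, j, k]
                if pj > 0:
                    te += pj * np.log2(pj * p_yp[j]
                                       / (p_yp_xp[j, k] * p_yf_yp[i, j]))
    return te
```

For a pair of signals where x drives y with a delay, phase_te(x, y) should exceed phase_te(y, x); in practice the finite-sample bias of such estimates is handled with surrogate-based null distributions, as the abstract notes.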
Subject(s)
Brain Waves/physiology , Information Theory , Neurological Models , Nerve Net/physiology , Statistical Data Interpretation , Electroencephalography , Humans , Magnetoencephalography
ABSTRACT
Purpose: This study evaluated the long-term safety of roflumilast in patients with chronic obstructive pulmonary disease or chronic bronchitis using electronic healthcare databases from Germany, Norway, Sweden, and the United States (US). Patients and Methods: The study population consisted of patients aged ≥40 years who had been exposed to roflumilast and a matched cohort unexposed to roflumilast. The matching was based on sex, age, calendar year of cohort entry date (2010-2011, 2012, or 2013), and a propensity score that included variables such as demographics, markers of chronic obstructive pulmonary disease (COPD) severity and morbidity, and comorbidities. In comparison to the unexposed matched cohort (never use), three exposure definitions were used for the exposed matched cohort: ever use, use status (current, recent, past use), and cumulative duration of use. The main outcome was 5-year all-cause mortality. Cox regression models were used to estimate crude and adjusted hazard ratios (HRs) and 95% confidence intervals (CI). Results: 112,541 unexposed and 23,239 exposed patients across countries were included. Some variables remained unbalanced after matching, indicating higher COPD disease severity among the exposed patients. Adjusted HRs of 5-year all-cause mortality for "ever use" of roflumilast, compared with "never use", were 1.12 (95% CI, 1.08-1.17) in Germany, 1.00 (95% CI, 0.92-1.08) in Norway, 0.98 (95% CI, 0.92-1.04) in Sweden, and 1.16 (95% CI, 1.12-1.20) in the US. Compared with never users, a decrease in 5-year mortality risk was observed among "current users" in Germany (HR: 0.93, 95% CI: 0.88-0.98), Norway (HR: 0.77, 95% CI: 0.67-0.87), and Sweden (HR: 0.80, 95% CI: 0.73-0.88). Conclusion: There was no observed increase in 5-year mortality risk with the use of roflumilast in Sweden or Norway. A small increase in 5-year mortality risk was observed in Germany and the US in the ever versus never comparison, likely due to residual confounding by indication.
Subject(s)
Aminopyridines , Benzamides , Cyclopropanes , Factual Databases , Phosphodiesterase 4 Inhibitors , Chronic Obstructive Pulmonary Disease , Humans , Cyclopropanes/adverse effects , Cyclopropanes/therapeutic use , Chronic Obstructive Pulmonary Disease/drug therapy , Chronic Obstructive Pulmonary Disease/mortality , Chronic Obstructive Pulmonary Disease/diagnosis , Male , Female , Phosphodiesterase 4 Inhibitors/adverse effects , Phosphodiesterase 4 Inhibitors/therapeutic use , Benzamides/adverse effects , Benzamides/therapeutic use , Middle Aged , Aged , Aminopyridines/therapeutic use , Aminopyridines/adverse effects , Time Factors , Treatment Outcome , Risk Factors , United States/epidemiology , Chronic Bronchitis/drug therapy , Chronic Bronchitis/mortality , Chronic Bronchitis/epidemiology , Risk Assessment , Germany , Adult , Sweden/epidemiology , Aged 80 and over
ABSTRACT
Motor responses to visual stimuli have shorter latencies for controlling than for initiating movement. The shorter latencies observed for movement control are notably believed to reflect the involvement of forward models when controlling moving limbs. We assessed whether controlling a moving limb is a "requisite" for observing shortened response latencies. The latency of button-press responses to a visual stimulus was compared between conditions that did or did not involve the control of a moving object, but never involved any actual control of a body segment. When the motor response controlled a moving object, response latencies were significantly shorter and less variable, probably reflecting faster sensorimotor processing (as assessed by fitting a LATER model to our data). These results suggest that when the task at hand entails a control component, the sensorimotor processing of visual information is hastened, even if the task does not require actually controlling a moving limb.
ABSTRACT
Poor parallel letter-string processing in developmental dyslexia has been taken as evidence of a poor visual attention (VA) span, that is, a limitation of visual attentional resources that affects multi-character processing. However, the use of letter stimuli in oral report tasks has been challenged as to its capacity to reveal a VA span disorder. In particular, reports of poor letter/digit-string processing but preserved symbol-string processing were viewed as evidence of poor visual-to-phonology code mapping, in line with the phonological theory of developmental dyslexia. We assessed here the visual-to-phonological-code mapping disorder hypothesis. In Experiment 1, letter-string, digit-string and colour-string processing was assessed to disentangle a phonological versus visual-familiarity account of the letter/digit versus symbol dissociation. Against a visual-to-phonological-code mapping disorder but in support of a familiarity account, results showed poor letter/digit-string processing but preserved colour-string processing in dyslexic children. In Experiment 2, two letter-string report tasks were used, one of which was performed simultaneously with a highly taxing phonological task. Results show that dyslexic children are similarly impaired in letter-string report whether or not a concurrent phonological task is performed at the same time. Taken together, these results provide strong evidence against a phonological account of poor letter-string processing in developmental dyslexia.
Subject(s)
Attention Deficit Disorder with Hyperactivity/etiology , Dyslexia/complications , Phonetics , Reading , Visual Perception/physiology , Adolescent , Analysis of Variance , Attention Deficit Disorder with Hyperactivity/diagnosis , Case-Control Studies , Child , Preschool Child , Color Perception , Female , Humans , Male , Neuropsychological Tests , Visual Pattern Recognition , Statistics as Topic
ABSTRACT
The iron status of blood donors is a subject of concern for blood establishments. The Finnish Red Cross Blood Service (FRCBS) addresses iron loss in blood donors by offering systematic iron supplementation to demographic at-risk donor groups. We measured blood count, ferritin and soluble transferrin receptor (sTfR) and acquired lifestyle and health information from 2200 blood donors of the FinDonor 10000 cohort. We used modern data analysis methods to estimate iron status and the factors affecting it, with a special focus on the effects of the blood service's iron supplementation policy. Low ferritin (< 15 µg/L), an indicator of low iron stores, was present in 20.6% of pre-menopausal women, 10.6% of post-menopausal women and 6% of men. Anemia co-occurred with iron deficiency more frequently in pre-menopausal women (21 out of 25 cases) than in men (3/6) or post-menopausal women (1/2). In multivariable regression analyses, lifestyle, dietary and blood donation factors explained up to 38% of the variance in ferritin levels but only ~10% of the variance in sTfR levels. Days since the previous donation were positively associated with ferritin levels in all groups, while the number of donations during the past 2 years was negatively associated with ferritin levels in pre-menopausal women and men. FRCBS-provided iron supplementation was negatively associated with ferritin levels in men only. Relative importance analyses showed that donation activity accounted for most of the explained variance in ferritin levels, while iron supplementation explained less than 1%. Variation in ferritin levels was not significantly associated with variation in self-reported health. Donation activity was the most important factor affecting blood donor iron levels, far ahead of e.g. red-meat consumption or iron supplementation. Importantly, the self-reported health of donors with lower iron stores was not worse than that of donors with higher iron stores.
Subject(s)
Blood Donors/statistics & numerical data , Diet , Dietary Supplements , Ferritins/blood , Iron Compounds/therapeutic use , Transferrin Receptors/blood , Adolescent , Adult , Age Factors , Aged , Iron-Deficiency Anemia/blood , Female , Health Status , Humans , Life Style , Male , Middle Aged , Sex Factors , Young Adult
ABSTRACT
It has recently been documented that MEG/EEG functional connectivity graphs estimated with zero-lag-free interaction metrics are severely confounded by a multitude of spurious interactions (SI), i.e., the false-positive "ghosts" of true interactions [1], [2]. These SI are caused by the multivariate linear mixing between sources, and thus they pose a severe challenge to the validity of connectivity analysis. Because of the complex nature of signal mixing and the SI problem, there is a need to demonstrate intuitively how the SI arise and how they can be attenuated using a novel approach that we have termed hyperedge bundling. Here we provide a dataset and software with which readers can perform simulations in order to better understand the theory and the solution to SI. We include the supplementary material of [1] that is not directly relevant to hyperedge bundling per se but reflects important properties of the MEG source model and the functional connectivity graphs. For example, the gyri of the dorsolateral cortices are the most accurately modeled areas, whereas the sulci of the inferior temporal and frontal cortices and the insula have the least modeling accuracy. Importantly, we found that the interaction estimates are heavily biased by the modeling accuracy between regions, which means the estimates cannot be straightforwardly interpreted as coupling between brain regions. This raises a red flag for the conventional method of thresholding graphs by estimate values, which is rather suboptimal: the measured topology of the graph reflects the geometric properties of the source model instead of the cortical interactions under investigation.
ABSTRACT
The visual attention (VA) span deficit hypothesis of developmental dyslexia posits that impaired multiple-element processing can be responsible for poor reading outcomes. In VA span impaired dyslexic children, poor performance on letter report tasks is associated with reduced parietal activations for multiple letter processing. While this hints towards a non-specific, attention-based dysfunction, it is still unclear whether reduced parietal activity generalizes to other types of stimuli. Furthermore, putative links between reduced parietal activity and reduced ventral occipito-temporal (vOT) activity in dyslexia have yet to be explored. Using functional magnetic resonance imaging, we measured brain activity in 12 VA span impaired dyslexic adults and 12 adult skilled readers while they carried out a categorization task on single or multiple alphanumeric or non-alphanumeric characters. While healthy readers activated parietal areas more strongly for multiple than for single element processing (right-sided for alphanumeric and bilateral for non-alphanumeric characters), such stronger right parietal activations for multiple elements were absent in dyslexic participants. Contrasts between skilled and dyslexic readers revealed significantly reduced right superior parietal lobule (SPL) activity for dyslexic readers regardless of stimulus type. Using a priori anatomically defined regions of interest, we showed that neural activity was reduced for dyslexic participants in both SPL and vOT bilaterally. Finally, we used multiple regressions to test whether SPL activity was related to vOT activity in each group. In the left hemisphere, SPL activity covaried with vOT activity for both normal and dyslexic readers. In contrast, in the right hemisphere, SPL activity covaried with vOT activity only for dyslexic readers. These results bring critical support to the VA interpretation of the VA span deficit.
In addition, they offer new insight into how deficits in automatic vOT-based word recognition could arise in developmental dyslexia.
ABSTRACT
A steady increase in reading speed is the hallmark of normal reading acquisition. However, little is known about the influence of visual attention capacity on children's reading speed. The number of distinct visual elements that can be simultaneously processed at a glance (dubbed the visual attention span) predicts single-word reading speed in both normally reading and dyslexic children. However, the exact processes that account for the relationship between the visual attention span and reading speed remain to be specified. We used the Theory of Visual Attention to estimate visual processing speed and visual short-term memory capacity from a multiple-letter report task in eight- and nine-year-old children. The visual attention span and text reading speed were also assessed. Results showed that visual processing speed and visual short-term memory capacity predicted the visual attention span. Furthermore, visual processing speed predicted reading speed, but visual short-term memory capacity did not. Finally, the visual attention span mediated the effect of visual processing speed on reading speed. These results suggest that visual attention capacity could constrain reading speed in elementary school children.
Subject(s)
Attention , Reading , Ocular Vision , Child , Preschool Child , Humans , Biological Models , Time Factors
ABSTRACT
The visual attention (VA) span deficit hypothesis of dyslexia posits that letter-string deficits are a consequence of impaired visual processing. Alternatively, some have interpreted this deficit as resulting from a visual-to-phonology code mapping impairment. This study aims to adjudicate between the two interpretations by investigating performance in a character-string visual categorization task with verbal and non-verbal stimuli. Results show that VA span ability predicts performance on the non-verbal visual processing task in normally reading children. Furthermore, VA span impaired dyslexic children are also impaired on the categorization task independently of stimulus type. This supports the hypothesis that the underlying impairment responsible for the VA span deficit is visual, not verbal.
Subject(s)
Attention/physiology , Dyslexia/physiopathology , Dyslexia/psychology , Verbal Behavior/physiology , Visual Perception/physiology , Adolescent , Child , Female , Humans , Intelligence Tests , International Classification of Diseases , Male , Neuropsychological Tests , Psychomotor Performance/physiology , Reading
ABSTRACT
The visual front end of reading is most often associated with orthographic processing. The left ventral occipito-temporal cortex seems to be preferentially tuned for letter-string and word processing. In contrast, little is known about the mechanisms responsible for pre-orthographic processing: the processing of character strings regardless of character type. While the superior parietal lobule (SPL) has been shown to be involved in multiple-letter processing, further data are necessary to extend these results to non-letter characters. The purpose of this study was to identify the neural correlates of pre-orthographic character-string processing independently of character type. Fourteen skilled adult readers carried out multiple- and single-element visual categorization tasks with alphanumeric (AN) and non-alphanumeric (nAN) characters during fMRI. The role of parietal cortex in multiple-element processing was further probed with a priori defined anatomical regions of interest (ROIs). Participants activated posterior parietal cortex more strongly for multiple- than for single-element processing. ROI analyses showed that bilateral SPL/BA7 was more strongly activated for multiple- than for single-element processing, regardless of character type. In contrast, no multiple-element-specific activity was found in the inferior parietal lobules. These results suggest that parietal mechanisms are involved in pre-orthographic character-string processing. We argue that attentional mechanisms are involved in visual word recognition in general, as an early step of the visual analysis of words.