Results 1 - 17 of 17
1.
Respir Res ; 24(1): 293, 2023 Nov 21.
Article in English | MEDLINE | ID: mdl-37990197

ABSTRACT

BACKGROUND: People living with chronic obstructive pulmonary disease (COPD) have an increased risk of experiencing cardiovascular (CV) events, particularly after an exacerbation. This CV burden is not yet known for incident COPD patients. We examined the risk of severe CV events in incident COPD patients in the periods following moderate and/or severe exacerbations. METHODS: Persons aged ≥ 40 years with an incident COPD diagnosis in the PHARMO Data Network were included. Exposed time periods were 1-7, 8-14, 15-30, 31-180 and 181-365 days following an exacerbation. Moderate exacerbations were defined as those managed in outpatient settings; severe exacerbations as those requiring hospitalisation. The outcome was a composite of time to first severe CV event (acute coronary syndrome, heart failure decompensation, cerebral ischaemia, or arrhythmia) or death. Hazard ratios (HR) were estimated for the association between each exposed period and the outcome. RESULTS: 8020 patients with newly diagnosed COPD were identified. 2234 patients (28%) had ≥ 1 exacerbation, 631 patients (8%) had a non-fatal CV event, and 461 patients (5%) died during a median follow-up of 36 months. The risk of experiencing the composite outcome was increased following a moderate/severe exacerbation compared with periods of stable disease [range of HR: from 15.3 (95% confidence interval 11.8-20.0) in days 1-7 to 1.3 (1.0-1.8) in days 181-365]. After a moderate exacerbation, the risk was increased over the first 180 days [HR 2.5 (1.3-4.8) in days 1-7 to 1.6 (1.3-2.1) in days 31-180]. After a severe exacerbation, the risk increased substantially and remained elevated over the year following the exacerbation [HR 48.6 (36.9-64.0) in days 1-7 down to 1.6 (1.0-2.6) in days 181-365]. The increase in risk applied to all categories of severe CV events.
CONCLUSIONS: Among incident COPD patients, we observed a substantially increased risk of severe CV events or all-cause death following either a moderate or a severe exacerbation of COPD. The increase in risk was highest in the initial period following an exacerbation. These findings highlight the significant cardiopulmonary burden in people living with COPD, even those with a new diagnosis.
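The study's post-exacerbation windows can be illustrated with a small helper; the function name and "stable" label are hypothetical, and only the day ranges come from the abstract:

```python
def exposure_window(days_since_exacerbation: int) -> str:
    """Map days since the last exacerbation to the study's exposed time
    periods; days outside 1-365 count as stable disease.
    (Hypothetical helper; only the windows come from the abstract.)"""
    windows = [(1, 7), (8, 14), (15, 30), (31, 180), (181, 365)]
    for lo, hi in windows:
        if lo <= days_since_exacerbation <= hi:
            return f"{lo}-{hi}"
    return "stable"
```

Each person-day of follow-up can then be assigned to one window, yielding the time-varying exposure strata against which the hazard ratios are estimated.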


Subject(s)
Cardiovascular Diseases , Pulmonary Disease, Chronic Obstructive , Humans , Cohort Studies , Netherlands/epidemiology , Pulmonary Disease, Chronic Obstructive/diagnosis , Pulmonary Disease, Chronic Obstructive/epidemiology , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/epidemiology , Disease Progression
2.
iScience ; 26(6): 106838, 2023 Jun 16.
Article in English | MEDLINE | ID: mdl-37250785

ABSTRACT

Motor responses to visual stimuli have shorter latencies for controlling than for initiating movement. The shorter latencies observed for movement control are notably believed to reflect the involvement of forward models when controlling moving limbs. We assessed whether controlling a moving limb is a prerequisite for observing shortened response latencies. The latency of button-press responses to a visual stimulus was compared between conditions that did or did not involve control of a moving object, but never involved actual control of a body segment. When the motor response controlled a moving object, response latencies were significantly shorter and less variable, probably reflecting faster sensorimotor processing (as assessed by fitting a LATER model to our data). These results suggest that when the task at hand entails a control component, the sensorimotor processing of visual information is hastened, even when the task does not require actually controlling a moving limb.
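The LATER (Linear Approach to Threshold with Ergodic Rate) model referenced above assumes that the reciprocal of response latency is normally distributed, so a minimal fit amounts to estimating the mean and standard deviation of the rate of rise. A sketch under that assumption (the function name is hypothetical, not from the study):

```python
import numpy as np

def fit_later(latencies):
    """Fit a minimal LATER model: reciprocal latencies 1/RT are treated as
    samples from a normal distribution of the decision signal's rate of
    rise. Returns (mu, sigma) of that rate distribution."""
    rates = 1.0 / np.asarray(latencies, dtype=float)
    return rates.mean(), rates.std(ddof=1)
```

In this framing, a condition with hastened sensorimotor processing shows up as a larger mu (steeper mean rate of rise), shifting the latency distribution leftward.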

3.
Vox Sang ; 116(10): 1042-1050, 2021 Nov.
Article in English | MEDLINE | ID: mdl-33853204

ABSTRACT

BACKGROUND AND OBJECTIVES: Frequent blood donation depletes the iron stores of blood donors. Iron depletion may lead to anaemia, but the health effects of iron depletion without anaemia in healthy blood donors are not well understood. Using the FinDonor cohort, we studied whether worsening of blood donors' self-rated health during the study period was associated with biomarkers of iron status or with other self-reported changes in lifestyle. MATERIALS AND METHODS: We included 1416 participants from the cohort who answered an 89-item questionnaire on their health and lifestyle at their enrolment visit and again at the end of the study. We performed multivariate logistic regression to test whether blood donation-related factors affected the probability of reporting worsened health. To set these findings in a more holistic context of health, we subsequently analysed all other questionnaire items in a data-driven exploratory analysis. RESULTS: We found that donation frequency (in men and post-menopausal women) and ferritin level (in men only) were negatively associated with worsened health between questionnaires. In the exploratory analysis, stable physical condition was the only questionnaire item negatively associated with worsened health in both women and men. CONCLUSION: Our results suggest that low ferritin level is associated with worsened health even in non-anaemic repeat donors, although when health is analysed more holistically, ferritin and other factors primarily related to blood donation lose their importance.


Subject(s)
Anemia, Iron-Deficiency , Blood Donors , Cohort Studies , Female , Ferritins , Humans , Iron , Male
4.
Conscious Cogn ; 78: 102863, 2020 02.
Article in English | MEDLINE | ID: mdl-31887533

ABSTRACT

Stimuli may induce only partial consciousness, an intermediate between null and full consciousness, in which the presence but not the identity of an object can be reported. The differences in the neuronal bases of full and partial consciousness are poorly understood. We investigated whether evoked and oscillatory activity could dissociate full from partial conscious perception. We recorded human cortical activity with magnetoencephalography (MEG) during a visual perception task in which the stimulus could be either partially or fully perceived. Partial consciousness was associated with an early increase in evoked activity and theta/low-alpha-band oscillations, while full consciousness was also associated with late evoked activity and beta-band oscillations. Full consciousness was dissociated from partial consciousness by stronger evoked activity and a late increase in theta oscillations that were localized to higher-order visual regions and posterior parietal and prefrontal cortices. Our results reveal that both evoked activity and theta oscillations dissociate partial and full consciousness.


Subject(s)
Brain Waves/physiology , Cerebral Cortex/physiology , Consciousness/physiology , Evoked Potentials/physiology , Visual Perception/physiology , Adult , Brain Mapping , Female , Humans , Magnetoencephalography , Male , Young Adult
5.
Vox Sang ; 115(1): 36-46, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31657023

ABSTRACT

BACKGROUND AND OBJECTIVES: There is increasing evidence that frequent blood donation depletes the iron stores of some blood donors. The FinDonor 10 000 study was set up to study iron status and factors affecting iron stores in Finnish blood donors. In Finland, iron supplementation for at-risk groups has been in place since the 1980s. MATERIALS AND METHODS: A total of 2584 blood donors (N = 8003 samples) were recruited into the study alongside standard donation at three donation sites in the capital region of Finland between 5/2015 and 12/2017. All participants were asked to fill out a questionnaire about their health and lifestyle. Blood samples were collected from the sample pouch of the whole-blood collection set, kept cool and processed centrally. Whole blood count, CRP, ferritin and sTfR were measured from the samples, and DNA was isolated for GWAS studies. RESULTS: Participant demographics, although in general similar to those of the general blood donor population in Finland, indicated some bias towards older and more frequent donors. Participation in the study increased the median donation frequency of the donors. Analysis of the effects of the time lag from sampling to analysis and of the time of day when the sample was drawn revealed small but significant time-dependent changes. CONCLUSION: The FinDonor cohort now provides us with tools to identify potential donor groups at increased risk of iron deficiency and the factors explaining this risk. The increase in donation frequency during the study suggests that scientific projects can be used to increase the commitment of blood donors.


Subject(s)
Blood Donors/statistics & numerical data , Ferritins/blood , Iron/blood , Adult , Cohort Studies , Female , Finland , Humans , Iron Deficiencies , Male , Middle Aged
6.
PLoS One ; 14(8): e0220862, 2019.
Article in English | MEDLINE | ID: mdl-31408501

ABSTRACT

The iron status of blood donors is a subject of concern for blood establishments. The Finnish Red Cross Blood Service (FRCBS) addresses iron loss in blood donors by offering systematic iron supplementation to demographic at-risk donor groups. We measured blood count, ferritin and soluble transferrin receptor (sTfR) and acquired lifestyle and health information from 2200 blood donors of the FinDonor 10000 cohort. We used modern data analysis methods to estimate iron status and the factors affecting it, with a special focus on the effects of the blood service's iron supplementation policy. Low ferritin (< 15 µg/L), an indicator of low iron stores, was present in 20.6% of pre-menopausal women, 10.6% of post-menopausal women and 6% of men. Anemia co-occurred with iron deficiency more frequently in pre-menopausal women (21 of 25 cases) than in men (3/6) or post-menopausal women (1/2). In multivariable regression analyses, lifestyle, dietary and blood donation factors explained up to 38% of the variance in ferritin levels but only ~10% of the variance in sTfR levels. Days since previous donation were positively associated with ferritin levels in all groups, while the number of donations during the past 2 years was negatively associated with ferritin levels in pre-menopausal women and men. FRCBS-provided iron supplementation was negatively associated with ferritin levels in men only. Relative importance analyses showed that donation activity accounted for most of the explained variance in ferritin levels, while iron supplementation explained less than 1%. Variation in ferritin levels was not significantly associated with variation in self-reported health. Donation activity was the most important factor affecting blood donor iron levels, far ahead of, e.g., red-meat consumption or iron supplementation. Importantly, the self-reported health of donors with lower iron stores was not lower than that of donors with higher iron stores.


Subject(s)
Blood Donors/statistics & numerical data , Diet , Dietary Supplements , Ferritins/blood , Iron Compounds/therapeutic use , Receptors, Transferrin/blood , Adolescent , Adult , Age Factors , Aged , Anemia, Iron-Deficiency/blood , Female , Health Status , Humans , Life Style , Male , Middle Aged , Sex Factors , Young Adult
7.
Biol Blood Marrow Transplant ; 25(5): 891-898, 2019 05.
Article in English | MEDLINE | ID: mdl-30592985

ABSTRACT

HLA matching is a prerequisite for successful allogeneic hematopoietic stem cell transplantation (HSCT) because it lowers the occurrence and severity of graft-versus-host disease (GVHD). However, matching only a few alleles of the classical HLA genes may not ensure matching of the entire MHC region. HLA haplotype matching has been reported to be beneficial in HSCT because of GVHD-relevant variation in the non-HLA region. Because polymorphism in the MHC is highly population specific, we hypothesized that donors from the Finnish registry are more likely to be matched at a higher level with Finnish patients than donors from other registries. In the present study we determined 25 single nucleotide polymorphisms (SNPs) of the complement component 4 (C4) gene in the γ-block segment of the MHC from 115 Finnish HSCT patients and their Finnish (n = 201) and non-Finnish (n = 280) donor candidates. Full matching of HLA alleles and C4 SNPs, independently or additively, was more likely in the Finnish-Finnish group than in the Finnish-non-Finnish group (P < .003). This was most striking in cases with HLA haplotypes typical of the Finnish population. Patients with ancestral HLA haplotypes (AH) were more likely to find a fully HLA- and C4-matched donor, regardless of donor origin, than patients without AH (P < .0001). Despite the clear differences at the population level, we could not find a statistical association between C4 matching and clinical outcome. The results suggest that screening C4 SNPs can be advantageous when extended MHC matching or HLA haplotype matching in HSCT is required. This study also supports the need for small population-specific stem cell registries.


Subject(s)
Complement C4/genetics , Hematopoietic Stem Cell Transplantation/methods , Histocompatibility/immunology , Unrelated Donors , Adult , Complement C4/immunology , Finland , Haplotypes/genetics , Haplotypes/immunology , Humans , Polymorphism, Single Nucleotide , Registries
8.
Data Brief ; 18: 262-275, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29896515

ABSTRACT

It has not been well documented that MEG/EEG functional connectivity graphs estimated with zero-lag-free interaction metrics are severely confounded by a multitude of spurious interactions (SI), i.e., the false-positive "ghosts" of true interactions [1], [2]. These SI are caused by the multivariate linear mixing between sources, and thus they pose a severe challenge to the validity of connectivity analysis. Owing to the complex nature of signal mixing and the SI problem, there is a need to demonstrate intuitively how SI arise and how they can be attenuated using a novel approach that we have termed hyperedge bundling. Here we provide a dataset and software with which readers can perform simulations in order to better understand the theory and the solution to SI. We include the supplementary material of [1] that is not directly relevant to hyperedge bundling per se but reflects important properties of the MEG source model and the functional connectivity graphs. For example, the gyri of the dorsolateral cortices are the most accurately modeled areas, whereas the sulci of the inferior temporal and frontal cortices and the insula have the least modeling accuracy. Importantly, we found that interaction estimates are heavily biased by the modeling accuracy between regions, which means the estimates cannot be straightforwardly interpreted as the coupling between brain regions. This raises a red flag for the conventional method of thresholding graphs by estimate values: the measured topology of the graph reflects the geometric properties of the source model rather than the cortical interactions under investigation.

9.
Neuroimage ; 173: 610-622, 2018 06.
Article in English | MEDLINE | ID: mdl-29378318

ABSTRACT

Inter-areal functional connectivity (FC), neuronal synchronization in particular, is thought to constitute a key systems-level mechanism for coordination of neuronal processing and communication between brain regions. Evidence to support this hypothesis has been gained largely using invasive electrophysiological approaches. In humans, neuronal activity can be non-invasively recorded only with magneto- and electroencephalography (MEG/EEG), which have been used to assess FC networks with high temporal resolution and whole-scalp coverage. However, even in source-reconstructed MEG/EEG data, signal mixing, or "source leakage", is a significant confounder for FC analyses and network localization. Signal mixing leads to two distinct kinds of false-positive observations: artificial interactions (AI) caused directly by mixing and spurious interactions (SI) arising indirectly from the spread of signals from true interacting sources to nearby false loci. To date, several interaction metrics have been developed to solve the AI problem, but the SI problem has remained largely intractable in MEG/EEG all-to-all source connectivity studies. Here, we advance a novel approach for correcting SIs in FC analyses using source-reconstructed MEG/EEG data. Our approach is to bundle observed FC connections into hyperedges by their adjacency in signal mixing. Using realistic simulations, we show here that bundling yields hyperedges with good separability of true positives and little loss in the true positive rate. Hyperedge bundling thus significantly decreases graph noise by minimizing the false-positive to true-positive ratio. Finally, we demonstrate the advantage of edge bundling in the visualization of large-scale cortical networks with real MEG data. We propose that hypergraphs yielded by bundling represent well the set of true cortical interactions that are detectable and dissociable in MEG/EEG connectivity analysis.


Subject(s)
Brain/physiology , Electroencephalography/methods , Magnetoencephalography/methods , Nerve Net/physiology , Signal Processing, Computer-Assisted , Brain Mapping/methods , Computer Simulation , Humans , Models, Neurological
10.
Neuroimage ; 165: 222-237, 2018 01 15.
Article in English | MEDLINE | ID: mdl-29074278

ABSTRACT

Visuospatial attention prioritizes the processing of attended visual stimuli. It is characterized by lateralized alpha-band (8-14 Hz) amplitude suppression in visual cortex and increased neuronal activity in a network of frontal and parietal areas. It has remained unknown what mechanisms coordinate neuronal processing among the frontoparietal network and visual cortices and implement the attention-related modulations of alpha-band amplitudes and behavior. We investigated whether large-scale network synchronization could be such a mechanism. We recorded human cortical activity with magnetoencephalography (MEG) during a visuospatial attention task. We then identified the frequencies and anatomical networks of inter-areal phase synchronization from source-localized MEG data. We found that visuospatial attention is associated with robust and sustained long-range synchronization of cortical oscillations exclusively in the high-alpha (10-14 Hz) frequency band. This synchronization connected frontal, parietal and visual regions and was observed concurrently with amplitude suppression of low-alpha (6-9 Hz) band oscillations in visual cortex. Furthermore, stronger high-alpha phase synchronization was associated with decreased reaction times to attended stimuli and larger suppression of alpha-band amplitudes. These results show that high-alpha-band phase synchronization is functionally significant and could coordinate the neuronal communication underlying the implementation of visuospatial attention.


Subject(s)
Attention/physiology , Cerebral Cortex/physiology , Cortical Synchronization/physiology , Adult , Female , Humans , Magnetic Resonance Imaging , Magnetoencephalography , Male , Photic Stimulation , Visual Perception/physiology , Young Adult
12.
Front Hum Neurosci ; 8: 479, 2014.
Article in English | MEDLINE | ID: mdl-25071509

ABSTRACT

The visual attention (VA) span deficit hypothesis of developmental dyslexia posits that impaired multiple-element processing can be responsible for poor reading outcomes. In VA span impaired dyslexic children, poor performance on letter report tasks is associated with reduced parietal activations for multiple-letter processing. While this hints towards a non-specific, attention-based dysfunction, it is still unclear whether reduced parietal activity generalizes to other types of stimuli. Furthermore, putative links between reduced parietal activity and reduced ventral occipito-temporal (vOT) activity in dyslexia have yet to be explored. Using functional magnetic resonance imaging, we measured brain activity in 12 VA span impaired dyslexic adults and 12 adult skilled readers while they carried out a categorization task on single or multiple alphanumeric or non-alphanumeric characters. While healthy readers activated parietal areas more strongly for multiple- than single-element processing (right-sided for alphanumeric and bilateral for non-alphanumeric characters), such stronger multiple-element right parietal activations were absent in dyslexic participants. Contrasts between skilled and dyslexic readers revealed significantly reduced right superior parietal lobule (SPL) activity for dyslexic readers regardless of stimulus type. Using a priori anatomically defined regions of interest, we showed that neural activity was reduced for dyslexic participants in both SPL and vOT bilaterally. Finally, we used multiple regressions to test whether SPL activity was related to vOT activity in each group. In the left hemisphere, SPL activity covaried with vOT activity for both normal and dyslexic readers. In contrast, in the right hemisphere, SPL activity covaried with vOT activity only for dyslexic readers. These results provide critical support for the VA interpretation of the VA span deficit. In addition, they offer new insight into how deficits in automatic vOT-based word recognition could arise in developmental dyslexia.

13.
Neuroimage ; 85 Pt 2: 853-72, 2014 Jan 15.
Article in English | MEDLINE | ID: mdl-24007803

ABSTRACT

We introduce here phase transfer entropy (Phase TE) as a measure of directed connectivity among neuronal oscillations. Phase TE quantifies the transfer entropy between phase time series extracted from neuronal signals, for instance by filtering. To validate the measure, we used coupled Neuronal Mass Models both to evaluate the characteristics of Phase TE and to compare its performance with that of a real-valued TE implementation. We showed that Phase TE detects the strength and direction of connectivity even in the presence of the amounts of noise and linear mixing that typically characterize MEG and EEG recordings. Phase TE performed well across a wide range of analysis lags and sample sizes. Comparisons between Phase TE and real-valued TE estimates showed that Phase TE is more robust to nuisance parameters and considerably more efficient computationally. In addition, Phase TE accurately untangled bidirectional frequency-band-specific interaction patterns that confounded real-valued TE. Finally, we found that surrogate data can be used to construct appropriate null-hypothesis distributions and to estimate the statistical significance of Phase TE. These results suggest that Phase TE is well suited for the estimation of directed phase-based connectivity in large-scale investigations of the human functional connectome.
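In its simplest binned form, transfer entropy between phase series reduces to the conditional mutual information between the target's future phase, the target's own past phase, and the source's past phase. The numpy-only estimator below is a hedged sketch of that idea, not the authors' implementation; the bin count and lag are illustrative:

```python
import numpy as np

def phase_te(phi_x, phi_y, lag=1, n_bins=8):
    """Binned transfer-entropy estimate from phase series phi_x to phi_y
    (radians in [-pi, pi]). Sketch only -- not the paper's implementation."""
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)[1:-1]   # interior bin edges
    bx, by = np.digitize(phi_x, edges), np.digitize(phi_y, edges)
    yf, yp, xp = by[lag:], by[:-lag], bx[:-lag]            # Y future, Y past, X past

    def entropy(*vs):
        # joint Shannon entropy (bits) of discretised variables
        keys = np.ravel_multi_index(vs, (n_bins,) * len(vs))
        p = np.bincount(keys) / keys.size
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # TE_{X->Y} = H(Yf,Yp) - H(Yp) + H(Yp,Xp) - H(Yf,Yp,Xp)
    return entropy(yf, yp) - entropy(yp) + entropy(yp, xp) - entropy(yf, yp, xp)
```

With a target whose phase lag-copies the source's, the X-to-Y estimate exceeds the Y-to-X estimate, recovering the direction of coupling; in practice, significance would be assessed against surrogate data, as the abstract notes.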


Subject(s)
Brain Waves/physiology , Information Theory , Models, Neurological , Nerve Net/physiology , Data Interpretation, Statistical , Electroencephalography , Humans , Magnetoencephalography
14.
PLoS One ; 8(4): e58097, 2013.
Article in English | MEDLINE | ID: mdl-23593117

ABSTRACT

A steady increase in reading speed is the hallmark of normal reading acquisition. However, little is known about the influence of visual attention capacity on children's reading speed. The number of distinct visual elements that can be simultaneously processed at a glance (dubbed the visual attention span) predicts single-word reading speed in both normally reading and dyslexic children. However, the exact processes that account for the relationship between the visual attention span and reading speed remain to be specified. We used the Theory of Visual Attention to estimate visual processing speed and visual short-term memory capacity from a multiple-letter report task in eight- and nine-year-old children. The visual attention span and text reading speed were also assessed. Results showed that visual processing speed and visual short-term memory capacity predicted the visual attention span. Furthermore, visual processing speed predicted reading speed, but visual short-term memory capacity did not. Finally, the visual attention span mediated the effect of visual processing speed on reading speed. These results suggest that visual attention capacity could constrain reading speed in elementary school children.


Subject(s)
Attention , Reading , Vision, Ocular , Child , Child, Preschool , Humans , Models, Biological , Time Factors
15.
Neuropsychologia ; 50(9): 2195-204, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22659111

ABSTRACT

The visual front-end of reading is most often associated with orthographic processing. The left ventral occipito-temporal cortex seems to be preferentially tuned for letter-string and word processing. In contrast, little is known about the mechanisms responsible for pre-orthographic processing: the processing of character strings regardless of character type. While the superior parietal lobule has been shown to be involved in multiple-letter processing, further data are necessary to extend these results to non-letter characters. The purpose of this study was to identify the neural correlates of pre-orthographic character-string processing independently of character type. Fourteen skilled adult readers carried out multiple- and single-element visual categorization tasks with alphanumeric (AN) and non-alphanumeric (nAN) characters during fMRI. The role of parietal cortex in multiple-element processing was further probed with a priori defined anatomical regions of interest (ROIs). Participants activated posterior parietal cortex more strongly for multiple- than single-element processing. ROI analyses showed that bilateral SPL/BA7 was more strongly activated for multiple- than single-element processing, regardless of character type. In contrast, no multiple-element-specific activity was found in the inferior parietal lobules. These results suggest that parietal mechanisms are involved in pre-orthographic character-string processing. We argue that, in general, attentional mechanisms are involved in visual word recognition as an early step of the visual analysis of words.


Subject(s)
Attention/physiology , Parietal Lobe/physiology , Reading , Visual Perception/physiology , Adult , Data Interpretation, Statistical , Evoked Potentials/physiology , Female , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Photic Stimulation , Psychomotor Performance/physiology , Reaction Time/physiology , Recognition, Psychology/physiology , Young Adult
16.
Dyslexia ; 18(2): 77-93, 2012 May.
Article in English | MEDLINE | ID: mdl-22434589

ABSTRACT

Poor parallel letter-string processing in developmental dyslexia has been taken as evidence of a poor visual attention (VA) span, that is, a limitation of visual attentional resources that affects multi-character processing. However, the use of letter stimuli in oral report tasks has been challenged regarding its capacity to reveal a VA span disorder. In particular, reports of poor letter/digit-string processing but preserved symbol-string processing were viewed as evidence of poor visual-to-phonology code mapping, in line with the phonological theory of developmental dyslexia. Here we assessed the visual-to-phonological-code mapping disorder hypothesis. In Experiment 1, letter-string, digit-string and colour-string processing was assessed to disentangle a phonological versus a visual-familiarity account of the letter/digit versus symbol dissociation. Against a visual-to-phonological-code mapping disorder but in support of a familiarity account, results showed poor letter/digit-string processing but preserved colour-string processing in dyslexic children. In Experiment 2, two letter-string report tasks were used, one of which was performed simultaneously with a highly taxing phonological task. Results show that dyslexic children are similarly impaired in letter-string report whether or not a concurrent phonological task is performed. Taken together, these results provide strong evidence against a phonological account of poor letter-string processing in developmental dyslexia.


Subject(s)
Attention Deficit Disorder with Hyperactivity/etiology , Dyslexia/complications , Phonetics , Reading , Visual Perception/physiology , Adolescent , Analysis of Variance , Attention Deficit Disorder with Hyperactivity/diagnosis , Case-Control Studies , Child , Child, Preschool , Color Perception , Female , Humans , Male , Neuropsychological Tests , Pattern Recognition, Visual , Statistics as Topic
17.
Cortex ; 48(6): 768-73, 2012 Jun.
Article in English | MEDLINE | ID: mdl-21982580

ABSTRACT

The visual attention (VA) span deficit hypothesis of dyslexia posits that letter-string deficits are a consequence of impaired visual processing. Alternatively, some have interpreted this deficit as resulting from an impairment of visual-to-phonology code mapping. This study aims to adjudicate between the two interpretations by investigating performance in a non-verbal character-string visual categorization task with verbal and non-verbal stimuli. Results show that VA span ability predicts performance on the non-verbal visual processing task in normally reading children. Furthermore, VA span impaired dyslexic children are also impaired on the categorization task independently of stimulus type. This supports the hypothesis that the underlying impairment responsible for the VA span deficit is visual, not verbal.


Subject(s)
Attention/physiology , Dyslexia/physiopathology , Dyslexia/psychology , Verbal Behavior/physiology , Visual Perception/physiology , Adolescent , Child , Female , Humans , Intelligence Tests , International Classification of Diseases , Male , Neuropsychological Tests , Psychomotor Performance/physiology , Reading