Canonical maximization of coherence: A novel tool for investigation of neuronal interactions between two datasets.
Neuroimage; 201: 116009, 2019 Nov 01.
Article in English | MEDLINE | ID: mdl-31302256
Synchronization between oscillatory signals is considered to be one of the main mechanisms through which neuronal populations interact with each other. It is conventionally studied with mass-bivariate measures utilizing either sensor-to-sensor or voxel-to-voxel signals. However, none of these approaches aims at maximizing synchronization, especially when two multichannel datasets are present. Examples include cortico-muscular coherence (CMC), cortico-subcortical interactions, or hyperscanning (where electroencephalographic (EEG) / magnetoencephalographic (MEG) activity is recorded simultaneously from two or more subjects). For all of these cases, a method that could find two spatial projections maximizing the strength of synchronization would be desirable. Here we present such a method for the maximization of coherence between two sets of EEG/MEG/EMG (electromyographic)/LFP (local field potential) recordings. We refer to it as canonical Coherence (caCOH). caCOH maximizes the absolute value of the coherence between the two multivariate spaces in the frequency domain. This allows very fast optimization for many frequency bins. Apart from presenting details of the caCOH algorithm, we test its efficacy with simulations using realistic head modelling and focus on the application of caCOH to the detection of cortico-muscular coherence. For this, we use diverse multichannel EEG and EMG recordings and demonstrate the ability of caCOH to extract complex patterns of CMC distributed across the spatial and frequency domains. Finally, we indicate other scenarios where caCOH can be used for the extraction of neuronal interactions.
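The record does not include the authors' implementation. As a rough illustration of the kind of computation the abstract describes, the Python sketch below maximizes the absolute coherence between two multichannel recordings at a single frequency via a CCA-style whitening followed by an SVD. All names (`cacoh_sketch`, `f_target`, etc.) are hypothetical, and the optimization details of the published caCOH algorithm may differ from this sketch.

```python
import numpy as np
from scipy.signal import csd

def cacoh_sketch(x, y, fs, f_target, nperseg=256):
    """Hypothetical sketch of canonical coherence at one frequency.

    x : (n_x, n_samples) array, first multichannel recording (e.g. EEG)
    y : (n_y, n_samples) array, second multichannel recording (e.g. EMG)
    Returns the maximal |coherence| and the two spatial projections.
    """
    n_x, n_y = x.shape[0], y.shape[0]

    def csd_at(a, b):
        # Welch cross-spectral density, evaluated at the bin nearest f_target.
        f, pxy = csd(a, b, fs=fs, nperseg=nperseg)
        return pxy[np.argmin(np.abs(f - f_target))]

    # Spectral (Cxx, Cyy) and cross-spectral (Cxy) matrices at f_target.
    Cxx = np.array([[csd_at(x[i], x[j]) for j in range(n_x)] for i in range(n_x)])
    Cyy = np.array([[csd_at(y[i], y[j]) for j in range(n_y)] for i in range(n_y)])
    Cxy = np.array([[csd_at(x[i], y[j]) for j in range(n_y)] for i in range(n_x)])

    # Whiten each space (assumes enough data segments for well-conditioned
    # spectral matrices); the leading singular value of the whitened
    # cross-spectrum is then the maximal coherence, as in frequency-domain CCA.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    U, s, Vh = np.linalg.svd(Wx @ Cxy @ Wy.conj().T)

    a = Wx.conj().T @ U[:, 0]       # spatial filter for dataset 1
    b = Wy.conj().T @ Vh[0].conj()  # spatial filter for dataset 2
    return s[0], a, b
```

Sweeping `f_target` over the bins of the Welch frequency grid yields the frequency-resolved picture the abstract mentions; since the spectral matrices are estimated once per bin and each maximization reduces to a single SVD, this is consistent with the stated fast optimization over many frequency bins.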
Full text: 1
Database: MEDLINE
Main subject: Algorithms / Brain / Muscle, Skeletal / Models, Neurological / Neurons
Study type: Prognostic_studies
Limits: Animals / Humans
Language: En
Publication year: 2019
Document type: Article