1.
bioRxiv; 2024 Jan 16.
Article in English | MEDLINE | ID: mdl-38293188

ABSTRACT

Functional magnetic resonance imaging (fMRI) data are dominated by noise and artifacts, with only a small fraction of the variance relating to neural activity. Temporal independent component analysis (tICA) is a recently developed method that enables selective denoising of fMRI artifacts related to physiology, such as respiration. However, an automated and easy-to-use pipeline for tICA has not previously been available; instead, two manual steps have been necessary: 1) setting the group spatial ICA (sICA) dimensionality after MELODIC's Incremental Group-PCA (MIGP) and 2) labeling tICA components as artifact versus signal. Moreover, guidance has been lacking on how many subjects and timepoints are needed to adequately re-estimate the temporal ICA decomposition and on what alternatives are available for smaller groups or even individual subjects. Here, we introduce a nine-step, fully automated tICA pipeline that removes global artifacts from fMRI dense timeseries after sICA+FIX cleaning and MSMAll alignment driven by functionally relevant areal features. Additionally, we have developed an automated "reclean" pipeline for improved sICA artifact removal. Two major automated components of the pipeline are 1) automatic group sICA dimensionality selection for MIGP data, enabled by fitting multiple Wishart distributions, and 2) a hierarchical classifier that distinguishes group tICA signal components from artifactual components using a combination of handcrafted features from domain expert knowledge and latent features obtained via self-supervised learning on spatial maps. We demonstrate that the dimensionality estimated for the MIGP data from the HCP Young Adult 3T and 7T datasets is comparable to previous manual tICA estimates, and that the group sICA decomposition is highly reproducible. We also show that the tICA classifier achieves a Precision-Recall Area Under the Curve (PR-AUC) above 0.98 and that the correctly classified components account for over 95% of the tICA-represented variance on multiple held-out evaluation datasets, including the HCP Young Adult, HCP Aging, and HCP Development datasets, under various settings. Our automated tICA pipeline is now available as part of the HCP pipelines, providing a powerful and user-friendly tool for the neuroimaging community.
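
The abstract reports classifier performance as precision-recall area under the curve. As a rough illustration of that metric only (not the HCP classifier itself, which is hierarchical and combines handcrafted with self-supervised spatial-map features), the minimal sketch below trains a generic scikit-learn model on synthetic per-component features and reports PR-AUC; all data and feature choices here are invented for the demo.

```python
# Illustrative sketch: PR-AUC evaluation of a signal-vs-artifact component
# classifier. Synthetic features stand in for real per-component measures.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_recall_curve, auc
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic per-component feature matrix and signal/artifact labels.
n_components = 1000
X = rng.normal(size=(n_components, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_components) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# A generic (non-hierarchical) classifier, used here only to demonstrate the metric.
clf = GradientBoostingClassifier().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]

precision, recall, _ = precision_recall_curve(y_test, scores)
print(f"PR-AUC on held-out components: {auc(recall, precision):.3f}")
```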

2.
Neuroimage; 124(Pt B): 1102-1107, 2016 Jan 01.
Article in English | MEDLINE | ID: mdl-25934470

ABSTRACT

ConnectomeDB is a database for housing and disseminating data about human brain structure, function, and connectivity, along with associated behavioral and demographic data. It is the main archive and dissemination platform for data collected under the WU-Minn consortium Human Connectome Project. Additional connectome-style study data is and will be made available in the database under current and future projects, including the Connectome Coordination Facility. The database currently includes multiple modalities of magnetic resonance imaging (MRI) and magnetoencephalography (MEG) data along with associated behavioral data. MRI modalities include structural, task, resting state, and diffusion. MEG modalities include resting state and task. Imaging data includes unprocessed, minimally preprocessed, and analysis data. Imaging data and much of the behavioral data are publicly available, subject to acceptance of data use terms, while access to some sensitive behavioral data is restricted to qualified investigators under a more stringent set of terms. ConnectomeDB is the public side of the WU-Minn HCP database platform. As such, it is geared towards public distribution, with a web-based user interface designed to guide users to the optimal set of data for their needs and a robust backend mechanism based on the commercial Aspera fasp service to enable high-speed downloads. HCP data is also available via direct shipment of hard drives and via Amazon S3.
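
For the Amazon S3 access route mentioned above, a minimal sketch of pulling a single file with boto3 follows. The bucket name, object key, and credential profile are assumptions for illustration; actual access requires accepting the data use terms in ConnectomeDB and using the AWS credentials issued there.

```python
# Illustrative sketch: downloading one object from the HCP's S3 release.
# Bucket name, key, and profile are assumed examples, not authoritative.
import boto3

session = boto3.Session(profile_name="hcp")  # assumed AWS profile holding HCP-issued keys
s3 = session.client("s3")

bucket = "hcp-openaccess"  # assumed bucket name; confirm against HCP documentation
key = "HCP_1200/100307/MNINonLinear/T1w_restore.nii.gz"  # hypothetical example key

# Fetch the object into the current working directory.
s3.download_file(bucket, key, "T1w_restore.nii.gz")
print("downloaded", key)
```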


Subjects
Brain/anatomy & histology, Brain/physiology, Connectome, Databases, Factual, Information Dissemination/methods, Access to Information, Behavior, Brain Mapping, Humans, Internet, Magnetic Resonance Imaging, Magnetoencephalography, Neuroimaging, Quality Control
3.
Neuroimage; 80: 202-219, 2013 Oct 15.
Article in English | MEDLINE | ID: mdl-23707591

ABSTRACT

The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high-throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open-access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study.
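
One of the quality control measures mentioned above is head motion. The sketch below shows a common generic motion summary, framewise displacement computed from six rigid-body realignment parameters (Power-style), under assumed conventions; it is an illustrative implementation, not the HCP's specific QC code.

```python
# Illustrative sketch: framewise displacement (FD) from realignment parameters.
# The 50 mm head-radius constant and the parameter ordering are assumptions.
import numpy as np

def framewise_displacement(motion_params: np.ndarray, head_radius_mm: float = 50.0) -> np.ndarray:
    """motion_params: (T, 6) array of [trans_x, trans_y, trans_z, rot_x, rot_y, rot_z],
    translations in mm and rotations in radians."""
    deltas = np.abs(np.diff(motion_params, axis=0))
    # Convert rotational differences (radians) to arc length on an assumed sphere.
    deltas[:, 3:] *= head_radius_mm
    return deltas.sum(axis=1)

# Toy usage with random-walk motion traces.
rng = np.random.default_rng(0)
params = np.cumsum(rng.normal(scale=0.01, size=(200, 6)), axis=0)
fd = framewise_displacement(params)
print(f"mean FD = {fd.mean():.3f} mm, frames with FD > 0.5 mm: {(fd > 0.5).sum()}")
```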


Subjects
Brain/anatomy & histology, Brain/physiology, Computational Biology/methods, Connectome/methods, Data Mining/methods, Databases, Factual, User-Computer Interface, Computational Biology/standards, Connectome/standards, Data Mining/standards, Database Management Systems/standards, Humans, Information Storage and Retrieval/methods, Information Storage and Retrieval/standards, Models, Anatomic, Models, Neurological, Nerve Net/anatomy & histology, Nerve Net/physiology, Quality Control