Results 1 - 2 of 2
1.
Front Neuroinform; 16: 883223, 2022.
Article in English | MEDLINE | ID: mdl-35784190

ABSTRACT

TheVirtualBrain, an open-source platform for large-scale network modeling, can be personalized to an individual using a wide range of neuroimaging modalities. With the growing number and scale of neuroimaging data sharing initiatives of both healthy and clinical populations comes an opportunity to create large and heterogeneous sets of dynamic network models to better understand individual differences in network dynamics and their impact on brain health. Here we present the TheVirtualBrain-UK Biobank pipeline, a robust, automated, and open-source brain image processing solution that addresses the expanding scope of the TheVirtualBrain project. Our pipeline generates connectome-based modeling inputs compatible with TheVirtualBrain. We leverage the existing multimodal MRI processing pipeline from the UK Biobank, built to handle a variety of brain imaging modalities. We add various features and changes to the original UK Biobank implementation specifically for informing large-scale network models, including user-defined parcellations for the construction of matching whole-brain functional and structural connectomes. Changes also include detailed reports for quality control of all modalities, a streamlined installation process, modular software packaging, updated software versions, and support for various publicly available datasets. The pipeline has been tested on various datasets from both healthy and clinical populations and is robust to the morphological changes observed in aging and dementia. In this paper, we describe these and other pipeline additions and modifications in detail, as well as how this pipeline fits into the TheVirtualBrain ecosystem.
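The key modeling inputs the abstract describes are a pair of "matching" connectomes: a functional and a structural connectivity matrix indexed by the same user-defined parcellation. The following is a schematic numpy sketch of that idea only, not the pipeline's actual code; the parcel count, BOLD signals, and streamline endpoints are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

n_parcels = 4        # toy user-defined parcellation size
n_timepoints = 200   # toy fMRI series length

# --- Functional connectome: correlate parcel-averaged BOLD signals ---
bold = rng.standard_normal((n_parcels, n_timepoints))  # synthetic parcel time series
fc = np.corrcoef(bold)                                 # n_parcels x n_parcels

# --- Structural connectome: count tractography streamlines per parcel pair ---
# Each streamline contributes one count to its (start, end) parcel pair.
endpoints = rng.integers(0, n_parcels, size=(1000, 2))  # synthetic endpoints
sc = np.zeros((n_parcels, n_parcels))
for a, b in endpoints:
    if a != b:
        sc[a, b] += 1
        sc[b, a] += 1

# "Matching" means the same parcellation indexes both matrices, so row i of
# fc and row i of sc refer to the same region -- the form a whole-brain
# network model consumes.
assert fc.shape == sc.shape
```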

2.
Neural Comput; 32(5): 1018-1032, 2020 May.
Article in English | MEDLINE | ID: mdl-32187001

ABSTRACT

Multilayer neural networks have led to remarkable performance on many kinds of benchmark tasks in text, speech, and image processing. Nonlinear parameter estimation in hierarchical models is known to be subject to overfitting and misspecification. One approach to these estimation and related problems (e.g., saddle points, collinearity, feature discovery) is called Dropout. The Dropout algorithm removes hidden units according to a binomial random variable with probability p prior to each update, creating random "shocks" to the network that are averaged over updates (thus creating weight sharing). In this letter, we reestablish an older parameter search method and show that Dropout is a special case of this more general model, the stochastic delta rule (SDR), published originally in 1990. Unlike Dropout, SDR redefines each weight in the network as a random variable with mean μ_wij and standard deviation σ_wij. Each weight random variable is sampled on each forward activation, consequently creating an exponential number of potential networks with shared weights (accumulated in the mean values). Both parameters are updated according to prediction error, thus resulting in weight noise injections that reflect a local history of prediction error and local model averaging. SDR therefore implements a more sensitive, local gradient-dependent simulated annealing per weight, converging in the limit to a Bayes optimal network. We run tests on standard benchmarks (CIFAR and ImageNet) using a modified version of DenseNet and show that SDR outperforms standard Dropout in top-5 validation error by approximately 13% with DenseNet-BC 121 on ImageNet, and find various validation error improvements in smaller networks. We also show that SDR reaches the same accuracy that Dropout attains in 100 epochs in as few as 40 epochs, as well as improvements in training error by as much as 80%.
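The mechanism the abstract describes can be sketched in a few lines: each weight is a Gaussian N(μ_wij, σ_wij), a concrete weight matrix is sampled on every forward pass, and both μ and σ are updated from the prediction-error gradient so that σ grows where error persists and is otherwise annealed toward zero. This is a toy single-layer illustration of that idea, not the paper's implementation; the update coefficients (alpha, beta, zeta) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class SDRLinear:
    """Toy linear layer trained with a stochastic-delta-rule-style update.

    Each weight is a random variable N(mu_ij, sigma_ij); a fresh weight
    matrix is sampled per forward pass, and both parameters are updated
    from the prediction-error gradient.
    """

    def __init__(self, n_in, n_out, alpha=0.05, beta=0.005, zeta=0.95):
        self.mu = rng.normal(0.0, 0.1, (n_in, n_out))       # weight means
        self.sigma = np.full((n_in, n_out), 0.1)            # weight std devs
        self.alpha, self.beta, self.zeta = alpha, beta, zeta

    def forward(self, x):
        # Sample concrete weights w ~ N(mu, sigma) for this pass only.
        self.w = self.mu + self.sigma * rng.standard_normal(self.mu.shape)
        self.x = x
        return x @ self.w

    def backward(self, grad_out):
        grad_w = self.x.T @ grad_out          # error gradient w.r.t. sampled w
        self.mu -= self.alpha * grad_w        # means follow the gradient
        self.sigma += self.beta * np.abs(grad_w)  # noise grows with local error...
        self.sigma *= self.zeta                   # ...and is annealed toward zero
        return grad_out @ self.w.T

# Fit a linear target y = X @ W_true to see the means converge.
X = rng.standard_normal((64, 3))
W_true = np.array([[1.0], [-2.0], [0.5]])
y = X @ W_true

layer = SDRLinear(3, 1)
for _ in range(500):
    pred = layer.forward(X)
    layer.backward((pred - y) / len(X))   # mu approaches W_true as sigma anneals
```

Averaging over the sampled networks happens implicitly: the noise injections are zero-mean around μ, so the means accumulate the shared-weight average while σ records a local history of prediction error.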


Subject(s)
Algorithms; Deep Learning; Machine Learning; Neural Networks, Computer; Bayes Theorem; Humans; Image Processing, Computer-Assisted/methods