Results 1 - 5 of 5
1.
Neuroimage; 255: 119171, 2022 Jul 15.
Article in English | MEDLINE | ID: mdl-35413445

ABSTRACT

MRI has been extensively used to identify anatomical and functional differences in Autism Spectrum Disorder (ASD). Yet many of these findings have proven difficult to replicate because studies rely on small cohorts and are built on many complex, undisclosed analytic choices. We conducted an international challenge to predict ASD diagnosis from MRI data, in which we provided preprocessed anatomical and functional MRI data from > 2,000 individuals. Evaluation of the predictions was rigorously blinded. 146 challengers submitted prediction algorithms, which were evaluated at the end of the challenge using unseen data and an additional acquisition site. Using the best algorithms, we studied the importance of MRI modalities, brain regions, and sample size. We found evidence that MRI could predict ASD diagnosis: the 10 best algorithms reliably predicted diagnosis with AUC ≈ 0.80, far superior to what can currently be obtained using genotyping data in cohorts 20 times larger. We observed that functional MRI was more important for prediction than anatomical MRI, and that increasing sample size steadily increased prediction accuracy, providing an efficient strategy to improve biomarkers. We also observed that despite a strong incentive to generalise to unseen data, model development on a given dataset faces the risk of overfitting: performing well in cross-validation on the data at hand, but not generalising. Finally, we were able to predict ASD diagnosis on an external sample added after the end of the challenge (EU-AIMS), although with a lower prediction accuracy (AUC = 0.72). This indicates that despite being based on a large multisite cohort, our challenge still produced biomarkers fragile in the face of dataset shifts.


Subjects
Autism Spectrum Disorder, Autistic Disorder, Autism Spectrum Disorder/diagnostic imaging, Autistic Disorder/diagnostic imaging, Biomarkers, Brain/diagnostic imaging, Humans, Magnetic Resonance Imaging/methods
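As a rough illustration of the kind of evaluation described in the abstract above (not the challenge's actual pipeline or any submitted algorithm), the sketch below scores a simple classifier on simulated MRI-derived features with ROC AUC; the feature matrix, labels, and model are placeholder assumptions.

    # Minimal sketch: predict a binary diagnosis label from precomputed MRI features
    # and score with ROC AUC, as in the challenge evaluation. The features and labels
    # below are simulated stand-ins, not the challenge data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 300))      # e.g. functional connectivity features
    y = rng.integers(0, 2, size=2000)     # diagnosis labels (0 = control, 1 = ASD)

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    # Cross-validated AUC estimates generalisation on the data at hand; the abstract
    # stresses that this can still overestimate performance on truly unseen sites.
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")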
3.
Annu Int Conf IEEE Eng Med Biol Soc; 2019: 556-561, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31945960

ABSTRACT

Much attention has been given to automatic sleep staging algorithms in recent years, but the detection of discrete events in sleep studies is also crucial for precise characterization of sleep patterns and possible diagnosis of sleep disorders. We propose here a deep learning model for automatic detection and annotation of arousals and leg movements. Both are commonly seen during normal sleep, while an excessive amount of either is linked to disrupted sleep patterns, excessive daytime sleepiness impacting quality of life, and various sleep disorders. Our model was trained on 1,485 subjects and tested on 1,000 separate recordings of sleep. We tested two different experimental setups and found that optimal arousal detection was attained by including a recurrent neural network module in our default model with a dynamic default event window (F1 = 0.75), while optimal leg movement detection was attained using a static event window (F1 = 0.65). Our work shows promise while still leaving room for improvement. Specifically, future research will explore the proposed model as a general-purpose sleep analysis model.


Subjects
Deep Learning, Polysomnography, Quality of Life, Sleep
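The F1 scores in the abstract above are event-level metrics. As a conceptual sketch only, the snippet below matches predicted event intervals to reference intervals by temporal overlap and computes an F1 score; the IoU-based matching rule and its threshold are illustrative assumptions, not the paper's exact scoring procedure.

    # Minimal sketch of event-level F1 scoring for detected sleep events (e.g. arousals),
    # matching predicted intervals to reference intervals by overlap.
    def iou(a, b):
        """Intersection-over-union of two (start, end) intervals in seconds."""
        inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
        union = (a[1] - a[0]) + (b[1] - b[0]) - inter
        return inter / union if union > 0 else 0.0

    def event_f1(pred, ref, iou_threshold=0.3):
        """F1 over events: a prediction is a true positive if it overlaps an
        unmatched reference event with IoU above the threshold."""
        matched = set()
        tp = 0
        for p in pred:
            for i, r in enumerate(ref):
                if i not in matched and iou(p, r) >= iou_threshold:
                    matched.add(i)
                    tp += 1
                    break
        fp = len(pred) - tp
        fn = len(ref) - tp
        return 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0

    # toy example: two reference arousals, one correct detection and one spurious one
    print(event_f1(pred=[(10, 25), (300, 310)], ref=[(12, 27), (120, 140)]))  # 0.5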
4.
IEEE Trans Neural Syst Rehabil Eng; 26(4): 758-769, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29641380

ABSTRACT

Sleep stage classification constitutes an important preliminary exam in the diagnosis of sleep disorders. It is traditionally performed by a sleep expert who assigns a sleep stage to each 30 s window of the signal, based on visual inspection of signals such as electroencephalograms (EEGs), electrooculograms (EOGs), electrocardiograms, and electromyograms (EMGs). We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting handcrafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG, and EOG), and that can exploit the temporal context of each 30-s window of data. For each modality, the first layer learns linear spatial filters that exploit the array of sensors to increase the signal-to-noise ratio, and the last layer feeds the learnt representation to a softmax classifier. Our model is compared to alternative automatic approaches based on convolutional networks or decision trees. Results obtained on 61 publicly available PSG records with up to 20 EEG channels demonstrate that our network architecture yields state-of-the-art performance. Our study reveals a number of insights on the spatiotemporal distribution of the signal of interest: a good tradeoff for optimal classification performance measured with balanced accuracy is to use 6 EEG channels with 2 EOG (left and right) and 3 chin EMG channels. Also, exploiting 1 min of data before and after each data segment offers the strongest improvement when a limited number of channels are available. Like sleep experts, our system exploits the multivariate and multimodal nature of PSG signals in order to deliver state-of-the-art classification performance with a small computational cost.


Subjects
Computer Systems, Deep Learning, Polysomnography/classification, Sleep Stages, Algorithms, Decision Trees, Electroencephalography/classification, Electroencephalography/statistics & numerical data, Electromyography/classification, Electromyography/statistics & numerical data, Electrooculography/classification, Electrooculography/statistics & numerical data, Expert Systems, Humans, Multivariate Analysis, Polysomnography/statistics & numerical data, Signal Processing, Computer-Assisted
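Below is a minimal PyTorch sketch of the architectural idea described in the abstract above: a learnt linear spatial filter per modality feeding temporal convolutions, with a softmax classifier over sleep stages. Layer sizes, kernel widths, and the 128 Hz sampling rate are placeholder assumptions, not the authors' exact configuration.

    # Minimal sketch: one branch per modality (EEG/EOG/EMG); the first layer is a
    # 1x1 convolution acting as a linear spatial filter over the sensor array.
    import torch
    import torch.nn as nn

    class ModalityBranch(nn.Module):
        def __init__(self, n_channels, n_virtual=2):
            super().__init__()
            # linear spatial filter: re-mixes sensors into a few virtual channels
            self.spatial = nn.Conv1d(n_channels, n_virtual, kernel_size=1, bias=False)
            self.temporal = nn.Sequential(
                nn.Conv1d(n_virtual, 8, kernel_size=64, stride=16), nn.ReLU(),
                nn.AdaptiveAvgPool1d(16),
            )

        def forward(self, x):                  # x: (batch, n_channels, n_samples)
            return self.temporal(self.spatial(x)).flatten(1)

    class SleepStager(nn.Module):
        def __init__(self, n_eeg=6, n_eog=2, n_emg=3, n_stages=5):
            super().__init__()
            self.branches = nn.ModuleList(
                [ModalityBranch(n) for n in (n_eeg, n_eog, n_emg)])
            self.classifier = nn.Linear(3 * 8 * 16, n_stages)  # softmax applied in the loss

        def forward(self, eeg, eog, emg):
            feats = torch.cat(
                [b(x) for b, x in zip(self.branches, (eeg, eog, emg))], dim=1)
            return self.classifier(feats)

    # one 30 s window sampled at 128 Hz (3840 samples per channel)
    model = SleepStager()
    logits = model(torch.randn(1, 6, 3840), torch.randn(1, 2, 3840), torch.randn(1, 3, 3840))
    print(logits.shape)                        # torch.Size([1, 5])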
5.
Front Hum Neurosci; 12: 88, 2018.
Article in English | MEDLINE | ID: mdl-29568267

ABSTRACT

Recent research has shown that auditory closed-loop stimulation can enhance sleep slow oscillations (SO) to improve N3 sleep quality and cognition. Previous studies have been conducted in lab environments. The present study aimed to validate and assess the performance of a novel ambulatory wireless dry-EEG device (WDD) for auditory closed-loop stimulation of SO during N3 sleep at home. The ability of the WDD to detect N3 sleep automatically and to deliver auditory closed-loop stimulation on SO was tested on 20 young healthy subjects who slept with both the WDD and a miniaturized polysomnograph (part 1), in both stimulated and sham nights, within a double-blind, randomized, crossover design. The effects of auditory closed-loop stimulation on delta power increase were assessed after one and 10 nights of stimulation in an observational pilot study in the home environment including 90 middle-aged subjects (part 2). The first part, aimed at assessing the quality of the WDD compared to a polysomnograph, showed that the sensitivity and specificity for automatically detecting N3 sleep in real time were 0.70 and 0.90, respectively. The stimulation accuracy of the SO ascending-phase targeting was 45 ± 52°. The second part of the study, conducted in the home environment, showed that the stimulation protocol induced an increase of 43.9% in delta power in the 4 s window following the first stimulation (including evoked potentials and the SO entrainment effect). The increase of the SO response to auditory stimulation remained at the same level after 10 consecutive nights. The WDD shows good performance in automatically detecting N3 sleep in real time and in accurately delivering auditory closed-loop stimulation on SO. This stimulation increased SO amplitude during N3 sleep without any adaptation effect after 10 consecutive nights. This tool opens new perspectives for identifying novel sleep EEG biomarkers in longitudinal studies and may enable broad studies on the effects of auditory stimulation during sleep.
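As an offline, conceptual illustration of the ascending-phase targeting mentioned above (not the WDD's real-time algorithm), the sketch below band-passes a toy EEG trace in the slow-oscillation range, estimates instantaneous phase with the Hilbert transform, and marks late-ascending-phase samples as candidate stimulation onsets; the sampling rate, filter band, and phase window are assumptions.

    # Offline sketch of SO phase targeting: band-pass around the slow-oscillation band,
    # take the Hilbert phase, and flag the late ascending phase (trough-to-peak rise).
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 250                                    # sampling rate (Hz), assumed
    t = np.arange(0, 30, 1 / fs)
    eeg = np.sin(2 * np.pi * 0.8 * t) + 0.3 * np.random.randn(t.size)  # toy N3-like trace

    # band-pass 0.5-1.5 Hz to isolate slow oscillations
    b, a = butter(3, [0.5, 1.5], btype="bandpass", fs=fs)
    so = filtfilt(b, a, eeg)

    phase = np.angle(hilbert(so))               # -pi..pi, with 0 at the positive peak
    # late ascending phase: between the upward zero crossing (-pi/2) and the peak (0)
    ascending = (phase > -np.pi / 2) & (phase < 0)
    candidate_times = t[np.flatnonzero(np.diff(ascending.astype(int)) == 1)]
    print(f"{candidate_times.size} candidate stimulation onsets in 30 s")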
