1.
J Neural Eng ; 15(3): 031005, 2018 06.
Article in English | MEDLINE | ID: mdl-29488902

ABSTRACT

OBJECTIVE: Most current electroencephalography (EEG)-based brain-computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types used in this field, as described in our 2007 review paper. Now, approximately ten years after that review was published, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. APPROACH: We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs and with what outcomes, and to identify their pros and cons. MAIN RESULTS: We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning, and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful, although its benefits remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performance on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training-sample settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. SIGNIFICANCE: This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods, and gives guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
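The observation above that shrinkage linear discriminant analysis suits small training-sample settings can be illustrated with a minimal numpy sketch (not the authors' implementation; the identity shrinkage target and the fixed shrinkage weight `lam` are assumptions for illustration):

```python
import numpy as np

def shrinkage_lda_fit(X0, X1, lam=0.5):
    """Fit a binary LDA classifier, shrinking the pooled covariance
    toward a scaled identity so it stays invertible when trials < features."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc = np.vstack([X0 - mu0, X1 - mu1])
    cov = Xc.T @ Xc / (len(Xc) - 1)          # pooled within-class covariance
    d = cov.shape[0]
    cov_shrunk = (1 - lam) * cov + lam * (np.trace(cov) / d) * np.eye(d)
    w = np.linalg.solve(cov_shrunk, mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2
    return w, b

def shrinkage_lda_predict(X, w, b):
    return (X @ w + b > 0).astype(int)

rng = np.random.default_rng(0)
# Few trials, many features: the regime where shrinkage matters
# (the raw 50x50 covariance from 40 trials is singular).
X0 = rng.normal(0.0, 1.0, size=(20, 50))
X1 = rng.normal(0.8, 1.0, size=(20, 50))
w, b = shrinkage_lda_fit(X0, X1, lam=0.5)
acc = (np.mean(shrinkage_lda_predict(X0, w, b) == 0)
       + np.mean(shrinkage_lda_predict(X1, w, b) == 1)) / 2
```

Without the shrinkage term, `np.linalg.solve` would fail or be wildly unstable here, since 40 trials cannot determine a full-rank 50-dimensional covariance.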


Subject(s)
Algorithms , Brain-Computer Interfaces/trends , Brain/physiology , Electroencephalography/trends , Signal Processing, Computer-Assisted , Animals , Deep Learning/trends , Electroencephalography/methods , Humans , Time Factors
2.
Methods Inf Med ; 54(3): 256-61, 2015.
Article in English | MEDLINE | ID: mdl-25762456

ABSTRACT

INTRODUCTION: We present a software framework that enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. BACKGROUND: Screening for cognitive impairment is becoming more important as the world's population grows older. Current methods could be enhanced by the use of computers, but introducing new methods to clinics requires basic tools for the collection and communication of the collected data. OBJECTIVES: To develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. METHODS: We suggest methods, and discuss the process, by which established cognitive tests can be adapted for data collection through digitization on pen-enabled tablets. We discuss a number of methods for the evaluation of collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. RESULTS: The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application, and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform: it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is open-source and free to download and use. CONCLUSION: We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. They provide a first step in a possible transition toward standardized mental state examination using computers.


Subject(s)
Cognition Disorders/diagnosis , Diagnosis, Computer-Assisted , Software Design , Humans
4.
Curr Alzheimer Res ; 7(6): 487-505, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20455865

ABSTRACT

This paper reviews recent progress in the diagnosis of Alzheimer's disease (AD) from electroencephalograms (EEG). Three major effects of AD on EEG have been observed: slowing of the EEG, reduced complexity of the EEG signals, and perturbations in EEG synchrony. In recent years, a variety of sophisticated computational approaches has been proposed to detect those subtle perturbations in the EEG of AD patients. The paper first describes methods that try to detect slowing of the EEG. Next, the paper deals with several measures of EEG complexity and explains how those measures have been used to study fluctuations in EEG complexity in AD patients. Then, various measures of EEG synchrony are considered in the context of AD diagnosis. The issue of EEG pre-processing is also briefly addressed: before one can analyze EEG, it is necessary to remove artifacts due to, for example, head and eye movements or interference from electronic equipment. Pre-processing of EEG has received much attention in recent years, and several state-of-the-art pre-processing techniques are outlined, based, for example, on blind source separation and other non-linear filtering paradigms. In addition, the paper outlines opportunities and limitations of computational approaches for diagnosing AD based on EEG. Finally, future challenges and open problems are discussed.
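The "slowing" effect mentioned above is commonly quantified by relative band power, i.e. the fraction of spectral power falling in the lower EEG bands. A toy FFT-based sketch (illustrative, not from the paper; the band edges follow common EEG conventions):

```python
import numpy as np

def relative_band_power(x, fs, band):
    """Fraction of total (non-DC) spectral power inside a frequency band."""
    freqs = np.fft.rfftfreq(len(x), d=1/fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].sum() / psd[freqs > 0].sum()

fs = 256
t = np.arange(0, 4, 1/fs)
# Toy "slowed" EEG: dominant 4 Hz (theta) component plus a weak 10 Hz alpha.
x = 2.0 * np.sin(2*np.pi*4*t) + 0.5 * np.sin(2*np.pi*10*t)
theta = relative_band_power(x, fs, (4, 8))    # theta band
alpha = relative_band_power(x, fs, (8, 13))   # alpha band
```

In a slowed EEG, power shifts from the alpha band into theta/delta, so the theta fraction dominates here; in practice one would use Welch averaging over epochs rather than a single raw FFT.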


Subject(s)
Alzheimer Disease/diagnosis , Alzheimer Disease/physiopathology , Electroencephalography/methods , Signal Processing, Computer-Assisted , Artifacts , Electromyography/methods , Humans , Regression Analysis
5.
Neuroimage ; 49(1): 668-93, 2010 Jan 01.
Article in English | MEDLINE | ID: mdl-19573607

ABSTRACT

It is well known that the EEG signals of Alzheimer's disease (AD) patients are generally less synchronous than those of age-matched control subjects. However, this effect is not always easily detectable. This is especially the case for patients in the pre-symptomatic phase, commonly referred to as mild cognitive impairment (MCI), during which neuronal degeneration occurs prior to the appearance of clinical symptoms. In this paper, various synchrony measures are studied in the context of AD diagnosis, including the correlation coefficient, mean-square and phase coherence, Granger causality, phase synchrony indices, information-theoretic divergence measures, state-space-based measures, and the recently proposed stochastic event synchrony measures. Experiments with EEG data show that many of those measures are strongly correlated (or anti-correlated) with the correlation coefficient and hence provide little complementary information about EEG synchrony. Measures that are only weakly correlated with the correlation coefficient include the phase synchrony indices, Granger causality measures, and stochastic event synchrony measures. In addition, those three families of synchrony measures are mutually uncorrelated, and therefore each seems to capture a specific kind of interdependence. For the data set at hand, only two synchrony measures are able to convincingly distinguish MCI patients from age-matched control subjects: Granger causality (in particular, the full-frequency directed transfer function) and stochastic event synchrony. Those two measures are used as features to distinguish MCI patients from age-matched control subjects, yielding a leave-one-out classification rate of 83%. The classification performance may be further improved by adding complementary features from EEG; this approach may eventually lead to a reliable EEG-based diagnostic tool for MCI and AD.
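The leave-one-out protocol used above can be sketched generically. This toy version uses synthetic two-feature data (standing in for the two synchrony features) and a nearest-class-mean rule (an assumption; the paper does not specify this classifier):

```python
import numpy as np

def loo_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-class-mean classifier:
    each subject is held out in turn, class means are re-estimated
    from the rest, and the held-out subject is assigned to the nearer mean."""
    correct = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        Xtr, ytr = X[mask], y[mask]
        means = np.array([Xtr[ytr == c].mean(axis=0) for c in (0, 1)])
        pred = np.argmin(np.linalg.norm(X[i] - means, axis=1))
        correct += (pred == y[i])
    return correct / len(X)

rng = np.random.default_rng(1)
# Two synthetic features per subject, e.g. a Granger-style and an SES-style score.
X = np.vstack([rng.normal(0.0, 0.5, (15, 2)),   # controls
               rng.normal(1.5, 0.5, (15, 2))])  # MCI-like group
y = np.array([0]*15 + [1]*15)
acc = loo_accuracy(X, y)
```

Leave-one-out is the natural protocol for cohorts this small (tens of subjects), since it uses all but one subject for training at every fold.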


Subject(s)
Alzheimer Disease/diagnosis , Electroencephalography/statistics & numerical data , Algorithms , Alzheimer Disease/physiopathology , Artifacts , Cognition Disorders/diagnosis , Cognition Disorders/physiopathology , Cortical Synchronization , Entropy , Humans , Information Theory , Models, Statistical , Nonlinear Dynamics , Stochastic Processes
6.
Neural Comput ; 21(8): 2152-202, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19670479

ABSTRACT

We present a novel approach to quantify the statistical interdependence of two time series, referred to as stochastic event synchrony (SES). The first step is to extract events from the two given time series. The next step is to try to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. More precisely, the similarity is quantified by the following parameters: time delay, variance of the time jitter, fraction of noncoincident events, and average similarity of the aligned events. The pairwise alignment and SES parameters are determined by statistical inference. In particular, the SES parameters are computed by maximum a posteriori (MAP) estimation, and the pairwise alignment is obtained by applying the max-product algorithm. This letter deals with one-dimensional point processes; the extension to multidimensional point processes is considered in a companion letter in this issue. By analyzing surrogate data, we demonstrate that SES is able to quantify both timing precision and event reliability more robustly than classical measures can. As an illustration, neuronal spike data generated by the Morris-Lecar neuron model are considered.
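A toy version of the alignment idea makes the SES parameters concrete. This sketch uses greedy nearest-event matching with a hard tolerance rather than the MAP/max-product inference of the letter, so it only illustrates what the parameters mean:

```python
import numpy as np

def ses_sketch(t1, t2, tol=0.1):
    """Greedily match each event of t1 to its nearest unmatched event of t2
    within `tol`; return estimated delay, jitter std, and the fraction of
    non-coincident events (a drastic simplification of SES inference)."""
    t2 = list(t2)
    offsets = []
    for a in t1:
        if t2:
            j = int(np.argmin([abs(b - a) for b in t2]))
            if abs(t2[j] - a) <= tol:
                offsets.append(t2.pop(j) - a)
    n = max(len(t1), len(t2) + len(offsets))
    rho = 1 - len(offsets) / n                       # fraction of non-coincident events
    delay = float(np.mean(offsets)) if offsets else 0.0
    jitter = float(np.std(offsets)) if offsets else 0.0
    return delay, jitter, rho

t1 = [0.0, 1.0, 2.0, 3.0, 4.0]
t2 = [0.02, 1.03, 2.01, 3.02]        # one event of t1 has no counterpart
delay, jitter, rho = ses_sketch(t1, t2, tol=0.1)
```

Here the event at t = 4.0 stays unmatched, so one event in five is non-coincident; the matched pairs yield a small positive delay and sub-tolerance jitter.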


Subject(s)
Action Potentials/physiology , Models, Neurological , Models, Statistical , Neurons/physiology , Algorithms , Time Factors
7.
Neural Comput ; 21(8): 2203-68, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19409054

ABSTRACT

Stochastic event synchrony is a technique to quantify the similarity of pairs of signals. First, events are extracted from the two given time series. Next, one tries to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. In Part I, the companion letter in this issue, one-dimensional events are considered; this letter concerns multidimensional events. Although the basic idea is similar, the extension to multidimensional point processes involves a significantly more difficult combinatorial problem and therefore is nontrivial. Also in the multidimensional case, the problem of jointly computing the pairwise alignment and SES parameters is cast as a statistical inference problem. This problem is solved by coordinate descent, more specifically, by alternating the following two steps: (1) estimate the SES parameters from a given pairwise alignment; (2) with the resulting estimates, refine the pairwise alignment. The SES parameters are computed by maximum a posteriori (MAP) estimation (step 1), in analogy to the one-dimensional case. The pairwise alignment (step 2) can no longer be obtained through dynamic programming, since the state space becomes too large. Instead it is determined by applying the max-product algorithm on a cyclic graphical model. In order to test the robustness and reliability of the SES method, it is first applied to surrogate data. Next, it is applied to detect anomalies in EEG synchrony of mild cognitive impairment (MCI) patients. Numerical results suggest that SES is significantly more sensitive to perturbations in EEG synchrony than a large variety of classical synchrony measures.


Subject(s)
Algorithms , Models, Neurological , Models, Statistical , Signal Processing, Computer-Assisted , Stochastic Processes , Animals , Electroencephalography , Humans , Neural Networks, Computer , Nonlinear Dynamics , Time Factors
9.
Horm Metab Res ; 40(5): 338-41, 2008 May.
Article in English | MEDLINE | ID: mdl-18491253

ABSTRACT

This study was aimed at summarizing our experience in the management of 1,444 patients with incidentally found adrenal tumors observed at a single endocrinological centre. Hormonal determinations were performed in all patients at the beginning of the observation period to detect subclinical adrenal hyperfunction. The imaging phenotype on CT and MRI was analyzed to define the malignant potential of the tumors. Based on the results of these examinations, we diagnosed probably benign masses in 87% of our cohort, malignant tumors in 10% (adrenal carcinoma in 9%), and metastases in 3%. Subclinical hyperfunction was diagnosed in 8%, the most frequent form being pre-Cushing's syndrome. A subgroup of 480 patients (33%) underwent surgery because of oncological or endocrinological indications. The patients not qualified for surgery were carefully monitored by imaging and hormonal examinations. Malignancy is the most serious risk in the group of patients with incidentally discovered adrenal tumors.


Subject(s)
Adrenal Gland Neoplasms/diagnostic imaging , Adrenal Gland Neoplasms/surgery , Magnetic Resonance Imaging , Neoplasms/diagnostic imaging , Neoplasms/surgery , Adolescent , Adrenal Gland Neoplasms/blood , Adult , Aged , Aged, 80 and over , Child , Female , Humans , Male , Middle Aged , Neoplasms/blood , Retrospective Studies , Tomography, X-Ray Computed
10.
Langenbecks Arch Surg ; 393(2): 121-6, 2008 Mar.
Article in English | MEDLINE | ID: mdl-17994250

ABSTRACT

BACKGROUND AND AIMS: The aim of this study is to analyze the clinical data and criteria for surgery in a group of over 1,100 patients with adrenal incidentalomas (AI) observed at the Department of Endocrinology. PATIENTS AND METHODS: The material consisted of 1,161 patients (842 women and 319 men, 10-87 years old) with AI ranging in size from 1.0 to 23.0 cm. The methods included clinical examination, imaging studies, hormonal determinations in the blood and urine, as well as histological and immunocytochemical investigations in the 390 patients treated by surgery. RESULTS: Based on these studies, we diagnosed 112 patients with primary malignant adrenal tumors (100 with carcinoma), 45 with metastatic infiltrations, and 1,004 with probably benign AI. Imaging phenotypes (especially high density on computed tomography, CT) were characteristic of malignant and chromaffin tumors. Subclinical adrenal hyperactivity was found in 8% of the patients, with pre-Cushing's syndrome as the most frequent form (6.5%). Chromaffin tumors were detected in 3%. CONCLUSIONS: (1) Indications for surgery include malignant tumors (both primary and metastatic), tumors with subclinical hyperfunction, and chromaffin tumors. High density on CT (>20 HU) appeared to be an important indication for surgery. (2) A slight prevalence of oncological over endocrinological indications (14 vs. 11%) was found.


Subject(s)
Adrenal Gland Neoplasms/diagnosis , Adrenal Gland Neoplasms/surgery , Incidental Findings , Adolescent , Adrenal Gland Neoplasms/pathology , Adrenal Gland Neoplasms/secondary , Adrenal Glands/pathology , Adrenalectomy , Adult , Aged , Aged, 80 and over , Child , Diagnosis, Differential , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Pituitary ACTH Hypersecretion/diagnosis , Pituitary ACTH Hypersecretion/pathology , Pituitary ACTH Hypersecretion/surgery , Tomography, X-Ray Computed
11.
Physiol Meas ; 28(4): 335-47, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17395990

ABSTRACT

Alzheimer's disease (AD) is a degenerative disease which causes serious cognitive decline. Studies suggest that effective treatment of AD may be aided by detection of the disease in its early stages, prior to extensive neuronal degeneration. In this paper, we propose a set of novel techniques that could help to perform this task, and present the results of experiments conducted to evaluate these approaches. The challenge is to discriminate between spontaneous EEG recordings from two groups of subjects: one afflicted with mild cognitive impairment and eventual AD, and an age-matched control group. The classification results obtained indicate that the proposed methods are promising additions to the existing tools for the detection of AD, though further research and experimentation with larger datasets are required to verify their effectiveness.


Subject(s)
Algorithms , Alzheimer Disease/diagnosis , Artificial Intelligence , Cognition Disorders/diagnosis , Diagnosis, Computer-Assisted/methods , Electroencephalography/methods , Pattern Recognition, Automated/methods , Aged , Alzheimer Disease/complications , Cognition Disorders/complications , Evoked Potentials , Female , Humans , Male , Reproducibility of Results , Sensitivity and Specificity
12.
Comput Intell Neurosci ; : 82827, 2007.
Article in English | MEDLINE | ID: mdl-18364991

ABSTRACT

While conventional approaches of BCI feature extraction are based on the power spectrum, we have tried using nonlinear features for classifying BCI data. In this paper, we report our test results and findings, which indicate that the proposed method is a potentially useful addition to current feature extraction techniques.

14.
Comput Intell Neurosci ; : 97026, 2007.
Article in English | MEDLINE | ID: mdl-18301719
15.
IEEE Trans Neural Netw ; 14(3): 631-45, 2003.
Article in English | MEDLINE | ID: mdl-18238044

ABSTRACT

We propose a robust approach for independent component analysis (ICA) of signals where observations are contaminated with high-level additive noise and/or outliers. The source signals may contain mixtures of both sub-Gaussian and super-Gaussian components, and the number of sources is unknown. Our robust approach includes two procedures. In the first procedure, a robust prewhitening technique is used to reduce the power of the additive noise, the dimensionality, and the correlation among sources. A cross-validation technique is introduced to estimate the number of sources in this first procedure. In the second procedure, a nonlinear function is derived using the parameterized t-distribution density model. This nonlinear function is fundamentally robust against the undue influence of outliers. Moreover, the stability of the proposed algorithm and its robustness to misestimation of the parameters (kurtosis) have been studied. By combining the t-distribution model with a family of light-tailed (sub-Gaussian) distribution models, we can separate mixtures of sub-Gaussian and super-Gaussian source components. Through the analysis of artificially synthesized data and real-world magnetoencephalographic (MEG) data, we illustrate the efficacy of this robust approach.
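The role of the prewhitening step can be shown with plain PCA whitening (the paper uses a robust variant; this ordinary eigendecomposition sketch only demonstrates the decorrelation and optional dimension-reduction it performs before ICA proper):

```python
import numpy as np

def prewhiten(X, n_components=None):
    """PCA-based prewhitening: center, decorrelate the channels, equalize
    their variances, and optionally keep only the strongest components."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]              # strongest components first
    evals, evecs = evals[order], evecs[:, order]
    if n_components is not None:
        evals, evecs = evals[:n_components], evecs[:, :n_components]
    W = evecs / np.sqrt(evals)                   # whitening matrix (scaled columns)
    return W.T @ Xc

rng = np.random.default_rng(2)
S = rng.laplace(size=(2, 5000))                  # super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # mixing matrix
Z = prewhiten(A @ S)
C = Z @ Z.T / Z.shape[1]                         # covariance of whitened data
```

After whitening the channel covariance is the identity, so the remaining ICA stage only needs to find a rotation; the robust variant in the paper additionally suppresses the influence of outliers on the covariance estimate.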

17.
Neural Comput ; 13(9): 1995-2003, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11516354

ABSTRACT

In this work we develop a very simple batch learning algorithm for semiblind extraction of a desired source signal with temporal structure from linear mixtures. Although we use the concept of sequential blind extraction of sources and independent component analysis, we do not carry out the extraction in a completely blind manner; neither do we assume that sources are statistically independent. In fact, we show that the a priori information about the autocorrelation function of primary sources can be used to extract the desired signals (sources of interest) from their linear mixtures. Extensive computer simulations and real data application experiments confirm the validity and high performance of the proposed algorithm.


Subject(s)
Algorithms , Computer Simulation , Electrocardiography , Female , Fetal Heart/physiology , Heart/physiology , Humans , Models, Biological , Normal Distribution , Pregnancy , Reproducibility of Results
18.
Phys Rev Lett ; 86(24): 5446-9, 2001 Jun 11.
Article in English | MEDLINE | ID: mdl-11415272

ABSTRACT

New electron scattering measurements have been made that extend data on the (3)He elastic magnetic form factor up to Q(2) = 42.6 fm(-2). These new data test theoretical conjectures regarding non-nucleonic effects in the three-body system. The very small cross sections, as low as 10(-40) cm(2)/sr, required the use of a high-pressure cryogenic gas target and a detector system with excellent background rejection capability. No existing theoretical calculation satisfactorily accounts for all the available data.

19.
IEEE Trans Biomed Eng ; 48(5): 501-12, 2001 May.
Article in English | MEDLINE | ID: mdl-11341524

ABSTRACT

In this paper, we use third-order correlations (TOC) to develop a filtering technique for the recovery of brain evoked potentials (EPs). The main idea behind the presented technique is to pass the noisy signal through a finite impulse response filter whose impulse response is matched to the shape of the noise-free signal. It is shown that the filter impulse response can be estimated on the basis of a selected third-order correlation slice (TOCS) of the noisy input signal. This is justified by two facts. The first is that noise-free EPs can be modeled as a sum of damped sinusoidal signals, and the selected TOCS preserves the signal structure. The second is that the TOCS is insensitive to both Gaussian noise and other symmetrically distributed non-Gaussian noise, whether white or colored. Furthermore, the approach can be applied to either non-averaged or averaged EP observation data; in the non-averaged case, it therefore preserves information about amplitude and latency changes. Both fixed and adaptive versions of the proposed filtering technique are described. Extensive simulation results are provided to show the validity and effectiveness of the proposed cumulant-based filtering technique in comparison with its conventional correlation-based counterpart.
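The noise-insensitivity claim rests on a general property: third-order statistics of symmetrically distributed noise vanish. A quick numerical check of one TOC slice (here the τ1 = 0 slice; the skewed toy signal stands in for a signal with nonzero third-order statistics, not for the paper's damped-sinusoid EP model):

```python
import numpy as np

def toc_slice(x, max_lag):
    """One-dimensional slice C(0, k) = E[x(n)^2 x(n+k)] of the
    third-order correlation of a zero-mean signal."""
    x = np.asarray(x, float)
    x = x - x.mean()
    n = len(x)
    return np.array([np.mean(x[:n-k]**2 * x[k:]) for k in range(max_lag + 1)])

rng = np.random.default_rng(3)
skewed = rng.exponential(1.0, 100_000) - 1.0   # toy signal, 3rd central moment = 2
gauss = rng.normal(0.0, 1.0, 100_000)          # symmetric (Gaussian) noise

clean = toc_slice(skewed, 5)
noisy = toc_slice(skewed + gauss, 5)           # TOCS barely changes
noise_only = toc_slice(gauss, 5)               # TOCS of symmetric noise ~ 0
```

The slice of the Gaussian noise is near zero at every lag, and adding that noise to the skewed signal leaves its slice essentially unchanged, which is exactly why a TOCS-based estimate of the filter impulse response is robust to symmetric noise.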


Subject(s)
Evoked Potentials, Somatosensory , Models, Neurological , Signal Processing, Computer-Assisted , Models, Statistical , Reproducibility of Results
20.
Med Biol Eng Comput ; 39(2): 237-48, 2001 Mar.
Article in English | MEDLINE | ID: mdl-11361251

ABSTRACT

An adaptive filtering approach for the segmentation and tracking of electroencephalogram (EEG) signal waves is described. In this approach, an adaptive recursive bandpass filter is employed to estimate and track the centre frequency associated with each EEG wave. The main advantage of the approach is that the adaptive filter has only one unknown coefficient to be updated. This coefficient, whose absolute value is less than 1, represents an efficient distinguishing feature for each specific EEG wave, and its time course reflects the non-stationary behaviour of the EEG signal. The proposed approach is therefore simple and accurate in comparison with existing multivariate adaptive approaches. The approach is examined using extensive computer simulations. Applied to computer-generated EEG signals composed of different waves, the adaptive filter coefficient (i.e. the segmentation parameter) is -0.492 for the delta wave, -0.360 for the theta wave, -0.191 for the alpha wave, -0.027 for the sigma wave, 0.138 for the beta wave, and 0.605 for the gamma wave. The segmentation parameter thus increases with the centre frequency of the EEG waves, providing fast on-line information about the behaviour of the EEG signal. The approach is also applied to real-world EEG data for the detection of sleep spindles.
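The key property above, a single bounded coefficient that grows monotonically with the wave's centre frequency, can be illustrated with a simplified stand-in: the negated normalized lag-1 autocorrelation per segment. This is not the paper's recursive filter update, and the resulting values will not match the quoted coefficients; only the monotone behaviour is being shown:

```python
import numpy as np

def segment_parameter(x):
    """Single scalar feature per EEG segment: a = -r(1)/r(0), the negated
    normalized lag-1 autocorrelation. For a narrowband wave at centre
    frequency f it approaches -cos(2*pi*f/fs), so it lies in (-1, 1) and
    increases monotonically with the wave's centre frequency."""
    x = x - x.mean()
    return -np.dot(x[:-1], x[1:]) / np.dot(x, x)

fs = 100
t = np.arange(0, 2, 1/fs)
a_delta = segment_parameter(np.sin(2*np.pi*2*t))    # delta-range wave (2 Hz)
a_alpha = segment_parameter(np.sin(2*np.pi*10*t))   # alpha-range wave (10 Hz)
a_gamma = segment_parameter(np.sin(2*np.pi*40*t))   # gamma-range wave (40 Hz)
```

As in the paper's table, the parameter is strongly negative for slow waves and rises toward positive values for fast waves, which is what makes a single tracked coefficient usable as a segmentation feature.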


Subject(s)
Electroencephalography/methods , Signal Processing, Computer-Assisted , Algorithms , Computer Simulation , Humans , Models, Neurological , Sleep/physiology