Results 1 - 20 of 31
1.
Neuroimage ; 49(1): 668-93, 2010 Jan 01.
Article in English | MEDLINE | ID: mdl-19573607

ABSTRACT

It is well known that the EEG signals of Alzheimer's disease (AD) patients are generally less synchronous than those of age-matched control subjects. However, this effect is not always easy to detect, especially for patients in the pre-symptomatic phase, commonly referred to as mild cognitive impairment (MCI), during which neuronal degeneration occurs before clinical symptoms appear. In this paper, various synchrony measures are studied in the context of AD diagnosis, including the correlation coefficient, mean-square and phase coherence, Granger causality, phase synchrony indices, information-theoretic divergence measures, state-space-based measures, and the recently proposed stochastic event synchrony measures. Experiments with EEG data show that many of these measures are strongly correlated (or anti-correlated) with the correlation coefficient and hence provide little complementary information about EEG synchrony. Measures that are only weakly correlated with the correlation coefficient include the phase synchrony indices, Granger causality measures, and stochastic event synchrony measures. In addition, those three families of synchrony measures are mutually uncorrelated, so each seems to capture a specific kind of interdependence. For the data set at hand, only two synchrony measures are able to convincingly distinguish MCI patients from age-matched control subjects: Granger causality (in particular, the full-frequency directed transfer function) and stochastic event synchrony. Those two measures are used as features to distinguish MCI patients from age-matched control subjects, yielding a leave-one-out classification rate of 83%. The classification performance may be further improved by adding complementary features from the EEG; this approach may eventually lead to a reliable EEG-based diagnostic tool for MCI and AD.
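The correlation coefficient serves as the reference measure in this comparison. As a minimal illustration (our own sketch, not code from the paper), it can be computed between two EEG channels as follows:

```python
import numpy as np

def correlation_synchrony(x, y):
    """Absolute Pearson correlation between two equal-length channels.

    Values near 1 indicate strong (anti-)synchrony; values near 0
    indicate weak linear interdependence between the channels.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = x - x.mean()
    y = y - y.mean()
    return abs(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

# Two toy "channels": a 10 Hz oscillation and a noisy, scaled copy.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 10 * t)
y = 0.8 * x + 0.1 * rng.standard_normal(500)
print(correlation_synchrony(x, y))  # close to 1
```

Synchrony measures that track this quantity closely add little new information, which is the redundancy the abstract describes.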


Subjects
Alzheimer Disease/diagnosis , Electroencephalography/statistics & numerical data , Algorithms , Alzheimer Disease/physiopathology , Artifacts , Cognition Disorders/diagnosis , Cognition Disorders/physiopathology , Cortical Synchronization , Entropy , Humans , Information Theory , Models, Statistical , Nonlinear Dynamics , Stochastic Processes
2.
Langenbecks Arch Surg ; 393(2): 121-6, 2008 Mar.
Article in English | MEDLINE | ID: mdl-17994250

ABSTRACT

BACKGROUND AND AIMS: The aim of this study was to analyze the clinical data and criteria for surgery in a group of over 1,100 patients with adrenal incidentalomas (AI) observed at the Department of Endocrinology. PATIENTS AND METHODS: The material consisted of 1,161 patients (842 women and 319 men, 10-87 years old) with AI ranging in size from 1.0 to 23.0 cm. The methods included clinical examination, imaging studies, hormonal determinations in the blood and urine, as well as histological and immunocytochemical investigations in the 390 patients treated by surgery. RESULTS: Based on these studies, we diagnosed 112 patients with primary malignant adrenal tumors (100 with carcinoma), 45 with metastatic infiltrations, and 1,004 with probably benign AI. Imaging phenotypes (especially high density on computed tomography, CT) were characteristic of malignant and chromaffin tumors. Subclinical adrenal hyperactivity was found in 8% of the patients, with pre-Cushing's syndrome as the most frequent form (6.5%). Chromaffin tumors were detected in 3%. CONCLUSIONS: (1) Indications for surgery include malignant tumors (both primary and metastatic), tumors with subclinical hyperfunction, and chromaffin tumors. High density on CT (>20 HU) appeared to be an important indication for surgery. (2) A slight prevalence of oncological over endocrinological indications (14 vs. 11%) was found.


Subjects
Adrenal Gland Neoplasms/diagnosis , Adrenal Gland Neoplasms/surgery , Incidental Findings , Adolescent , Adrenal Gland Neoplasms/pathology , Adrenal Gland Neoplasms/secondary , Adrenal Glands/pathology , Adrenalectomy , Adult , Aged , Aged, 80 and over , Child , Diagnosis, Differential , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Pituitary ACTH Hypersecretion/diagnosis , Pituitary ACTH Hypersecretion/pathology , Pituitary ACTH Hypersecretion/surgery , Tomography, X-Ray Computed
3.
J Neural Eng ; 15(3): 031005, 2018 06.
Article in English | MEDLINE | ID: mdl-29488902

ABSTRACT

OBJECTIVE: Most current electroencephalography (EEG)-based brain-computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types used in this field, as described in our 2007 review paper. Now, approximately ten years after that review was published, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. APPROACH: We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs and what the outcomes were, and to identify their pros and cons. MAIN RESULTS: We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning, and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful, although its benefits remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performance on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful in small training sample settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. SIGNIFICANCE: This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods, and gives guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
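Shrinkage linear discriminant analysis, which the review singles out for small training sample settings, can be sketched as follows (an illustrative implementation on assumed toy data, not code from the review; the shrinkage parameter is arbitrary here):

```python
import numpy as np

def shrinkage_lda_fit(X, y, lam=0.1):
    """Fit a two-class LDA with the pooled covariance shrunk toward a
    scaled identity: C_lam = (1 - lam) * C + lam * nu * I, where
    nu = trace(C) / d. Shrinkage regularizes the covariance estimate
    when trials are scarce relative to the number of EEG features.
    """
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    C = 0.5 * (np.cov(X0.T) + np.cov(X1.T))      # pooled covariance
    d = X.shape[1]
    C_lam = (1 - lam) * C + lam * (np.trace(C) / d) * np.eye(d)
    w = np.linalg.solve(C_lam, m1 - m0)          # projection weights
    b = -w @ (m0 + m1) / 2                       # threshold at midpoint
    return w, b

def shrinkage_lda_predict(X, w, b):
    return (X @ w + b > 0).astype(int)

# Toy data: two Gaussian classes in 20 dimensions, few trials each.
rng = np.random.default_rng(1)
X0 = rng.standard_normal((30, 20))
X1 = rng.standard_normal((30, 20)) + 1.0         # shifted class mean
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)
w, b = shrinkage_lda_fit(X, y)
print((shrinkage_lda_predict(X, w, b) == y).mean())  # training accuracy
```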


Subjects
Algorithms , Brain-Computer Interfaces/trends , Brain/physiology , Electroencephalography/trends , Signal Processing, Computer-Assisted , Animals , Deep Learning/trends , Electroencephalography/methods , Humans , Time Factors
4.
Physiol Meas ; 28(4): 335-47, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17395990

ABSTRACT

Alzheimer's disease (AD) is a degenerative disease which causes serious cognitive decline. Studies suggest that effective treatment of AD may be aided by detecting the disease in its early stages, prior to extensive neuronal degeneration. In this paper, we propose a set of novel techniques which could help to perform this task, and we present the results of experiments conducted to evaluate these approaches. The challenge is to discriminate between spontaneous EEG recordings from two groups of subjects: one afflicted with mild cognitive impairment and eventual AD, and the other an age-matched control group. The classification results obtained indicate that the proposed methods are promising additions to the existing tools for the detection of AD, though further research and experimentation with larger data sets are required to verify their effectiveness.


Subjects
Algorithms , Alzheimer Disease/diagnosis , Artificial Intelligence , Cognition Disorders/diagnosis , Diagnosis, Computer-Assisted/methods , Electroencephalography/methods , Pattern Recognition, Automated/methods , Aged , Alzheimer Disease/complications , Cognition Disorders/complications , Evoked Potentials , Female , Humans , Male , Reproducibility of Results , Sensitivity and Specificity
5.
Methods Inf Med ; 54(3): 256-61, 2015.
Article in English | MEDLINE | ID: mdl-25762456

ABSTRACT

INTRODUCTION: We present a software framework that enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. BACKGROUND: Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introducing new methods to clinics requires basic tools for the collection and communication of the collected data. OBJECTIVES: To develop tools that, with minimal interference, offer new opportunities for enhancing current interview-based cognitive examinations. METHODS: We suggest methods, and discuss the process, by which established cognitive tests can be adapted for data collection through digitization on pen-enabled tablets. We discuss a number of methods for evaluating the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. RESULTS: The tools, provided in the Python framework CogExTools available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform: it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is open-source and free to download and use. CONCLUSION: We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. They are a first step in a possible transition toward standardized, computer-based mental state examination.


Subjects
Cognition Disorders/diagnosis , Diagnosis, Computer-Assisted , Software Design , Humans
6.
IEEE Trans Biomed Eng ; 48(5): 501-12, 2001 May.
Article in English | MEDLINE | ID: mdl-11341524

ABSTRACT

In this paper, we use third-order correlations (TOC) to develop a filtering technique for the recovery of brain evoked potentials (EPs). The main idea behind the presented technique is to pass the noisy signal through a finite impulse response (FIR) filter whose impulse response is matched to the shape of the noise-free signal. It is shown that the filter impulse response can be estimated on the basis of a selected third-order correlation slice (TOCS) of the noisy input signal. This is justified by two facts. First, the noise-free EPs can be modeled as a sum of damped sinusoidal signals, and the selected TOCS preserves the signal structure. Second, the TOCS is insensitive both to Gaussian noise and to other symmetrically distributed non-Gaussian noise (white or colored). Furthermore, the approach can be applied to either non-averaged or averaged EP observation data. In the non-averaged case, the approach therefore preserves information about amplitude and latency changes. Both fixed and adaptive versions of the proposed filtering technique are described. Extensive simulation results are provided to show the validity and effectiveness of the proposed cumulant-based filtering technique in comparison with its conventional correlation-based counterpart.
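A rough sketch of the idea (our illustration; the paper's exact slice selection and its adaptive version are not reproduced here) is to estimate a diagonal TOC slice from the noisy observation and use it, normalized, as the FIR impulse response:

```python
import numpy as np

def toc_diagonal_slice(x, max_lag):
    """Estimate the diagonal third-order correlation slice
    c3(tau, tau) = E[x(t) * x(t + tau)^2] for tau = 0..max_lag.
    Symmetrically distributed noise has vanishing third-order
    cumulants, which is the rationale for working with TOC slices.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([
        np.mean(x[: n - tau] * x[tau:] ** 2) for tau in range(max_lag + 1)
    ])

def toc_matched_filter(noisy, max_lag):
    """Filter the noisy observation with an FIR filter whose impulse
    response is the normalized estimated TOC slice."""
    h = toc_diagonal_slice(noisy, max_lag)
    h = h / (np.linalg.norm(h) + 1e-12)
    return np.convolve(noisy, h, mode="same")

# Toy evoked potential: a damped sinusoid plus Gaussian noise.
rng = np.random.default_rng(2)
t = np.arange(300)
ep = np.exp(-t / 80.0) * np.sin(2 * np.pi * t / 40.0)
noisy = ep + 0.5 * rng.standard_normal(300)
denoised = toc_matched_filter(noisy, max_lag=60)
```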


Subjects
Evoked Potentials, Somatosensory , Models, Neurological , Signal Processing, Computer-Assisted , Models, Statistical , Reproducibility of Results
7.
IEEE Trans Neural Netw ; 5(6): 910-23, 1994.
Article in English | MEDLINE | ID: mdl-18267865

ABSTRACT

In this paper, a new class of simplified low-cost analog artificial neural networks with on-chip adaptive learning algorithms is proposed for solving systems of linear algebraic equations in real time. The proposed learning algorithms for the linear least squares (LS), total least squares (TLS) and data least squares (DLS) problems can be considered modifications and extensions of well-known algorithms: the row-action projection (Kaczmarz) algorithm and/or the LMS (Adaline) Widrow-Hoff algorithm. The algorithms can be applied to any problem that can be formulated as a linear regression problem. The correctness and high performance of the proposed neural networks are illustrated by extensive computer simulation results.
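The row-action projection (Kaczmarz) update that these learning rules build on can be sketched in a few lines (a plain digital implementation for illustration, not the analog on-chip version):

```python
import numpy as np

def kaczmarz(A, b, sweeps=100, lam=1.0):
    """Row-action projection (Kaczmarz) solver for A x = b.

    Each update projects the current estimate onto the hyperplane
    defined by one row:
        x <- x + lam * (b_i - a_i . x) / ||a_i||^2 * a_i
    For a consistent system the iterates converge to a solution.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x = x + lam * (b[i] - a @ x) / (a @ a) * a
    return x

# Consistent, well-conditioned 3x3 system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
print(kaczmarz(A, b))  # approx [ 1.  -2.   0.5]
```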

8.
IEEE Trans Neural Netw ; 9(6): 1495-501, 1998.
Article in English | MEDLINE | ID: mdl-18255826

ABSTRACT

This paper presents the derivation of an unsupervised learning algorithm that enables the identification and visualization of latent structure within ensembles of high-dimensional data. It provides a linear projection of the data onto a lower-dimensional subspace to identify the characteristic structure of the observations' independent latent causes. The algorithm is shown to be a very promising tool for unsupervised exploratory data analysis and data visualization. Experimental results confirm the attractiveness of this technique for exploratory data analysis, and an empirical comparison is made with the recently proposed generative topographic mapping (GTM) and standard principal component analysis (PCA). Based on standard probability density models, a generic nonlinearity is developed which allows both 1) identification and visualization of dichotomised clusters inherent in the observed data and 2) separation of sources with arbitrary distributions from mixtures whose dimensionality may be greater than the number of sources. The resulting algorithm is therefore also a generalized neural approach to independent component analysis (ICA), and it is considered a promising method for the analysis of real-world data consisting of sub- and super-Gaussian components, such as biomedical signals.

9.
IEEE Trans Neural Netw ; 11(6): 1423-37, 2000.
Article in English | MEDLINE | ID: mdl-18249866

ABSTRACT

In this paper we present an iterative inversion (II) approach to blind source separation (BSS). It consists of a quasi-Newton method for the resolution of an estimating equation obtained from the implicit inversion of a robust estimate of the mixing system. The resulting learning rule includes several existing BSS algorithms as particular cases, giving them a novel and unified interpretation. It also provides a justification of the Cardoso-Laheld step-size normalization. The II method is first presented for instantaneous mixtures and then extended to the blind separation of convolutive mixtures. Finally, we derive the necessary and sufficient asymptotic stability conditions for both the instantaneous and convolutive methods to converge.

10.
IEEE Trans Neural Netw ; 14(3): 631-45, 2003.
Article in English | MEDLINE | ID: mdl-18238044

ABSTRACT

We propose a robust approach for independent component analysis (ICA) of signals whose observations are contaminated with high-level additive noise and/or outliers. The source signals may contain mixtures of both sub-Gaussian and super-Gaussian components, and the number of sources is unknown. Our robust approach comprises two procedures. In the first, a robust prewhitening technique is used to reduce the power of the additive noise, the dimensionality, and the correlation among sources. A cross-validation technique is introduced to estimate the number of sources in this first procedure. In the second procedure, a nonlinear function is derived using the parameterized t-distribution density model. This nonlinear function is fundamentally robust against the undue influence of outliers. Moreover, the stability of the proposed algorithm and its robustness to misestimation of the parameters (kurtosis) have been studied. By combining the t-distribution model with a family of light-tailed (sub-Gaussian) distribution models, we can separate mixtures of sub-Gaussian and super-Gaussian source components. Through the analysis of artificially synthesized data and real-world magnetoencephalographic (MEG) data, we illustrate the efficacy of this robust approach.

11.
Med Biol Eng Comput ; 39(2): 237-48, 2001 Mar.
Article in English | MEDLINE | ID: mdl-11361251

ABSTRACT

An adaptive filtering approach for the segmentation and tracking of electroencephalogram (EEG) signal waves is described. In this approach, an adaptive recursive bandpass filter is employed to estimate and track the centre frequency associated with each EEG wave. The main advantage of the approach is that the adaptive filter has only one unknown coefficient to be updated. This coefficient, whose absolute value is less than 1, represents an efficient, distinct feature of each specific EEG wave, and its time course reflects the non-stationary behaviour of the EEG signal. The proposed approach is therefore simple and accurate in comparison with existing multivariate adaptive approaches. The approach is examined using extensive computer simulations and is applied to computer-generated EEG signals composed of different waves. The adaptive filter coefficient (i.e. the segmentation parameter) is -0.492 for the delta wave, -0.360 for the theta wave, -0.191 for the alpha wave, -0.027 for the sigma wave, 0.138 for the beta wave and 0.605 for the gamma wave. The segmentation parameter thus increases with the centre frequency of the EEG waves, which provides fast on-line information about the behaviour of the EEG signal. The approach is also applied to real-world EEG data for the detection of sleep spindles.


Subjects
Electroencephalography/methods , Signal Processing, Computer-Assisted , Algorithms , Computer Simulation , Humans , Models, Neurological , Sleep/physiology
12.
Int J Neural Syst ; 8(2): 219-37, 1997 Apr.
Article in English | MEDLINE | ID: mdl-9327277

ABSTRACT

Noise is an unavoidable factor in real sensor signals. We study how additive and convolutive noise can be reduced or even eliminated in the blind source separation (BSS) problem. Particular attention is paid to cases in which the number of sensors is larger than the number of sources. We propose various methods and associated adaptive learning algorithms for such an extended BSS problem. Performance and validity of the proposed approaches are demonstrated by extensive computer simulations.
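One generic way to exploit extra sensors (a standard PCA prewhitening sketch, not the specific algorithms proposed in the paper) is to project the recording onto the strongest principal directions before applying any separation rule; discarding the weakest directions removes part of the additive sensor noise:

```python
import numpy as np

def pca_prewhiten(X, n_sources):
    """Reduce an (n_sensors, n_samples) recording to n_sources whitened
    components. When sensors outnumber sources, the discarded weak
    principal directions carry mostly additive sensor noise.
    """
    X = X - X.mean(axis=1, keepdims=True)
    C = X @ X.T / X.shape[1]                     # sensor covariance
    eigvals, eigvecs = np.linalg.eigh(C)         # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:n_sources]  # strongest directions
    W = eigvecs[:, top].T / np.sqrt(eigvals[top])[:, None]
    return W @ X                                 # whitened, reduced data

# Toy case: 2 sources observed through 6 noisy sensors.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 2000)
S = np.vstack([np.sin(2 * np.pi * 7 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])
A = rng.standard_normal((6, 2))                  # unknown mixing matrix
X = A @ S + 0.05 * rng.standard_normal((6, 2000))
Z = pca_prewhiten(X, n_sources=2)
print(Z.shape)                                   # (2, 2000)
print(np.round(Z @ Z.T / Z.shape[1], 2))         # approx identity
```

A BSS algorithm then only has to find a rotation of the two whitened components, which is a much smaller problem than unmixing six noisy channels.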


Subjects
Neural Networks, Computer , Signal Processing, Computer-Assisted , Algorithms , Artificial Intelligence , Computer Simulation
13.
Ginekol Pol ; 72(9): 728-32, 2001 Sep.
Article in Polish | MEDLINE | ID: mdl-11757485

ABSTRACT

Our experience with the use of plastic mesh for abdominal wall reconstruction after surgery for recurrent carcinoma of the cervix is presented. The method proved effective, providing excellent mechanical support for the viscera with good early and late morbidity outcomes. The technique allowed subsequent radiotherapy with good tolerance. The presence of the abdominal mesh also proved not to be a serious obstacle when secondary surgical intervention was required.


Subjects
Neoplasm Recurrence, Local/surgery , Reconstructive Surgical Procedures , Surgical Mesh , Uterine Cervical Neoplasms/surgery , Abdominal Muscles/surgery , Adult , Female , Humans , Middle Aged , Reconstructive Surgical Procedures/methods , Treatment Outcome
14.
Curr Alzheimer Res ; 7(6): 487-505, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20455865

ABSTRACT

This paper reviews recent progress in the diagnosis of Alzheimer's disease (AD) from electroencephalograms (EEG). Three major effects of AD on the EEG have been observed: slowing of the EEG, reduced complexity of the EEG signals, and perturbations in EEG synchrony. In recent years, a variety of sophisticated computational approaches has been proposed to detect those subtle perturbations in the EEG of AD patients. The paper first describes methods that try to detect slowing of the EEG. Next, it deals with several measures of EEG complexity and explains how those measures have been used to study fluctuations in EEG complexity in AD patients. Various measures of EEG synchrony are then considered in the context of AD diagnosis. The issue of EEG pre-processing is also briefly addressed: before one can analyze the EEG, it is necessary to remove artifacts due to, for example, head and eye movements or interference from electronic equipment. Pre-processing of EEG has received much attention in recent years, and several state-of-the-art pre-processing techniques are outlined, for example those based on blind source separation and other non-linear filtering paradigms. In addition, the paper outlines opportunities and limitations of computational approaches for diagnosing AD on the basis of EEG. Finally, future challenges and open problems are discussed.


Subjects
Alzheimer Disease/diagnosis , Alzheimer Disease/physiopathology , Electroencephalography/methods , Signal Processing, Computer-Assisted , Artifacts , Electromyography/methods , Humans , Regression Analysis
15.
Neural Comput ; 21(8): 2152-202, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19670479

ABSTRACT

We present a novel approach to quantify the statistical interdependence of two time series, referred to as stochastic event synchrony (SES). The first step is to extract events from the two given time series. The next step is to try to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. More precisely, the similarity is quantified by the following parameters: time delay, variance of the time jitter, fraction of non-coincident events, and average similarity of the aligned events. The pairwise alignment and the SES parameters are determined by statistical inference. In particular, the SES parameters are computed by maximum a posteriori (MAP) estimation, and the pairwise alignment is obtained by applying the max-product algorithm. This letter deals with one-dimensional point processes; the extension to multidimensional point processes is considered in a companion letter in this issue. By analyzing surrogate data, we demonstrate that SES is able to quantify both timing precision and event reliability more robustly than classical measures can. As an illustration, neuronal spike data generated by the Morris-Lecar neuron model are considered.


Subjects
Action Potentials/physiology , Models, Neurological , Models, Statistical , Neurons/physiology , Algorithms , Time Factors
16.
Neural Comput ; 21(8): 2203-68, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19409054

ABSTRACT

Stochastic event synchrony is a technique to quantify the similarity of pairs of signals. First, events are extracted from the two given time series. Next, one tries to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. In Part I, the companion letter in this issue, one-dimensional events are considered; this letter concerns multidimensional events. Although the basic idea is similar, the extension to multidimensional point processes involves a significantly more difficult combinatorial problem and therefore is nontrivial. Also in the multidimensional case, the problem of jointly computing the pairwise alignment and SES parameters is cast as a statistical inference problem. This problem is solved by coordinate descent, more specifically, by alternating the following two steps: (1) estimate the SES parameters from a given pairwise alignment; (2) with the resulting estimates, refine the pairwise alignment. The SES parameters are computed by maximum a posteriori (MAP) estimation (step 1), in analogy to the one-dimensional case. The pairwise alignment (step 2) can no longer be obtained through dynamic programming, since the state space becomes too large. Instead it is determined by applying the max-product algorithm on a cyclic graphical model. In order to test the robustness and reliability of the SES method, it is first applied to surrogate data. Next, it is applied to detect anomalies in EEG synchrony of mild cognitive impairment (MCI) patients. Numerical results suggest that SES is significantly more sensitive to perturbations in EEG synchrony than a large variety of classical synchrony measures.


Subjects
Algorithms , Models, Neurological , Models, Statistical , Signal Processing, Computer-Assisted , Stochastic Processes , Animals , Electroencephalography , Humans , Neural Networks, Computer , Nonlinear Dynamics , Time Factors
17.
Horm Metab Res ; 40(5): 338-41, 2008 May.
Article in English | MEDLINE | ID: mdl-18491253

ABSTRACT

This study aimed to summarize our experience in the management of 1,444 patients with incidentally found adrenal tumors observed at a single endocrinological centre. Hormonal determinations were performed in all patients at the beginning of the observation period to detect subclinical adrenal hyperfunction. The imaging phenotype on CT and MRI was analyzed to define the malignant potential of the tumors. Based on the results of these examinations, we diagnosed probably benign masses in 87% of our cohort, malignant tumors in 10% (adrenal carcinoma, 9%), and metastases in 3%. Subclinical hyperfunction was diagnosed in 8%, with pre-Cushing's syndrome the most frequent form. A subgroup of 480 patients (33%) underwent surgery because of oncological or endocrinological indications. The patients not qualified for surgery were carefully monitored by imaging and hormonal examinations. Malignancy is the most serious risk in patients with incidentally discovered adrenal tumors.


Subjects
Adrenal Gland Neoplasms/diagnostic imaging , Adrenal Gland Neoplasms/surgery , Magnetic Resonance Imaging , Neoplasms/diagnostic imaging , Neoplasms/surgery , Adolescent , Adrenal Gland Neoplasms/blood , Adult , Aged , Aged, 80 and over , Child , Female , Humans , Male , Middle Aged , Neoplasms/blood , Retrospective Studies , Tomography, X-Ray Computed
19.
Comput Intell Neurosci ; : 82827, 2007.
Article in English | MEDLINE | ID: mdl-18364991

ABSTRACT

While conventional approaches to BCI feature extraction are based on the power spectrum, we have tried using nonlinear features to classify BCI data. In this paper, we report our test results and findings, which indicate that the proposed method is a potentially useful addition to current feature extraction techniques.

20.
Neural Comput ; 13(9): 1995-2003, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11516354

ABSTRACT

In this work we develop a very simple batch learning algorithm for the semiblind extraction of a desired source signal with temporal structure from linear mixtures. Although we use concepts from sequential blind source extraction and independent component analysis, we do not carry out the extraction in a completely blind manner, nor do we assume that the sources are statistically independent. In fact, we show that a priori information about the autocorrelation functions of the primary sources can be used to extract the desired signals (sources of interest) from their linear mixtures. Extensive computer simulations and experiments with real data confirm the validity and high performance of the proposed algorithm.
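The use of a priori autocorrelation information can be illustrated with an AMUSE-style sketch (our generic example, not the authors' batch algorithm): whiten the mixtures, then take the dominant eigenvector of the symmetrized lagged covariance at a lag where the desired source is known to be strongly autocorrelated:

```python
import numpy as np

def extract_by_lag(X, lag):
    """Semiblind extraction sketch: recover the mixture component whose
    autocorrelation at a known lag is largest. Uses only second-order
    temporal structure, for illustration of the idea in the abstract.
    """
    X = X - X.mean(axis=1, keepdims=True)
    n = X.shape[1]
    C0 = X @ X.T / n                              # zero-lag covariance
    d, E = np.linalg.eigh(C0)
    W = (E / np.sqrt(d)) @ E.T                    # whitening matrix
    Z = W @ X
    R = Z[:, :-lag] @ Z[:, lag:].T / (n - lag)    # lagged covariance
    _, V = np.linalg.eigh((R + R.T) / 2)          # symmetrize, decompose
    return V[:, -1] @ Z                           # top-eigenvector output

# Two sources: a 20-sample-period sine (strong autocorrelation at
# lag 20) and white noise, mixed into two sensors.
rng = np.random.default_rng(4)
t = np.arange(4000)
s1 = np.sin(2 * np.pi * t / 20.0)
s2 = rng.standard_normal(4000)
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([s1, s2])
y = extract_by_lag(X, lag=20)
print(abs(np.corrcoef(y, s1)[0, 1]))  # close to 1
```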


Subjects
Algorithms , Computer Simulation , Electrocardiography , Female , Fetal Heart/physiology , Heart/physiology , Humans , Models, Biological , Normal Distribution , Pregnancy , Reproducibility of Results