Results 1 - 5 of 5
1.
IEEE Trans Neural Netw Learn Syst ; 23(12): 1896-904, 2012 Dec.
Article in English | MEDLINE | ID: mdl-24808145

ABSTRACT

The minimization of the empirical risk based on an arbitrary Bregman divergence is known to provide posterior class probability estimates in classification problems, but the accuracy of the estimate for a given value of the true posterior depends on the specific choice of the divergence. Ad hoc Bregman divergences can be designed to achieve higher estimation accuracy for the posterior probability values that are most critical in a particular cost-sensitive classification scenario. Moreover, some sequences of Bregman loss functions can be constructed so that their minimization guarantees, asymptotically, the minimum number of errors in nonseparable cases and maximum-margin classifiers in separable problems. In this paper, we analyze general conditions on the Bregman generator that satisfy this property, and we generalize the result to cost-sensitive classification.
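The link between Bregman divergences and posterior estimation can be illustrated with a minimal sketch (not the paper's code): a binary Bregman loss is built from a convex generator phi on [0, 1], and with the negative-entropy generator it reduces to the familiar log loss, whose minimizer is the true posterior P(y=1|x). The function names below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Bregman loss built from a convex generator phi:
#   L(y, q) = phi(y) - phi(q) - phi'(q) * (y - q)
# Its conditional expectation over y is minimized at q = E[y|x],
# i.e. the posterior probability, for ANY strictly convex phi.

def make_bregman_loss(phi, dphi):
    def loss(y, q):
        return phi(y) - phi(q) - dphi(q) * (y - q)
    return loss

def neg_entropy(q):
    # phi(q) = q log q + (1-q) log(1-q), with 0 log 0 := 0 via clipping
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return q * np.log(q) + (1 - q) * np.log1p(-q)

def neg_entropy_grad(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return np.log(q) - np.log1p(-q)   # the logit function

# With the negative-entropy generator, the Bregman loss reduces to
# the cross-entropy: L(1, q) = -log q and L(0, q) = -log(1 - q).
log_loss = make_bregman_loss(neg_entropy, neg_entropy_grad)
```

Choosing a different generator reweights where on [0, 1] the estimate is most accurate, which is the degree of freedom the paper exploits for cost-sensitive problems.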

2.
Magn Reson Med ; 63(3): 592-600, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20187173

ABSTRACT

Spatial suppression of peripheral regions (outer volume suppression) is used in MR spectroscopic imaging to reduce contamination from strong lipid and water signals. The manual placement of outer volume suppression slices requires significant operator interaction, which is time-consuming and introduces variability in volume coverage. Placing a large number of outer volume saturation bands for volumetric MR spectroscopic imaging studies is particularly challenging and becomes unmanageable as the number of suppression bands increases. In this study, a method is presented that automatically segments a high-resolution MR image in order to identify the peripheral lipid-containing regions. This method computes an optimized placement of suppression bands in three dimensions and is based on the maximization of a criterion function. This criterion function maximizes coverage of peripheral lipid-containing areas and minimizes suppression of cortical brain regions and regions outside of the head. Computer simulation demonstrates automatic placement of 16 suppression slices to form a convex hull that covers peripheral lipid-containing regions above the base of the brain. In vivo metabolite mapping obtained with short echo time proton-echo-planar spectroscopic imaging shows that the automatic method yields a placement of suppression slices that is very similar to that of a skilled human operator in terms of lipid suppression and usable brain voxels.
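The criterion-maximization idea can be sketched in 2D (the published method works in 3D on segmented MR images; the greedy strategy, the weights, and all function names below are illustrative assumptions, not the authors' implementation): each candidate band is scored by rewarding covered lipid pixels and penalizing suppressed brain pixels, and bands are selected one at a time.

```python
import numpy as np

# Toy 2D sketch of criterion-driven suppression-band placement.
# Segmentation labels: 1 = brain, 2 = peripheral lipid.
# Weights w_lipid / w_brain and the greedy search are assumptions.

def band_mask(shape, angle_deg, offset, width=3):
    """Boolean mask of pixels inside a straight band at angle/offset."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(angle_deg)
    # signed distance of each pixel from the band's central line
    d = (xx - w / 2) * np.cos(theta) + (yy - h / 2) * np.sin(theta) - offset
    return np.abs(d) <= width / 2

def criterion(seg, band, w_lipid=1.0, w_brain=4.0):
    """Reward covered lipid pixels, penalize suppressed brain pixels."""
    return w_lipid * np.sum(band & (seg == 2)) - w_brain * np.sum(band & (seg == 1))

def place_bands(seg, n_bands=8):
    """Greedily add the band that best improves coverage of new pixels."""
    candidates = [(a, o) for a in range(0, 180, 15) for o in range(-20, 21, 2)]
    chosen, covered = [], np.zeros(seg.shape, bool)
    for _ in range(n_bands):
        best = max(candidates,
                   key=lambda c: criterion(seg, band_mask(seg.shape, *c) & ~covered))
        chosen.append(best)
        covered |= band_mask(seg.shape, *best)
    return chosen
```

On a toy phantom with lipid strips at the top and bottom edges, the greedy search selects horizontal bands over the lipid while leaving the brain interior untouched, which is the qualitative behavior the criterion function is designed to produce.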


Subjects
Algorithms, Brain/anatomy & histology, Brain/metabolism, Imaging, Three-Dimensional/methods, Lipids/analysis, Magnetic Resonance Spectroscopy/methods, Humans
3.
IEEE Trans Neural Netw ; 16(4): 799-809, 2005 Jul.
Article in English | MEDLINE | ID: mdl-16121722

ABSTRACT

This paper proposes a novel algorithm that jointly determines the structure and the parameters of an a posteriori probability model based on neural networks (NNs). It makes use of well-known ideas of pruning, splitting, and merging neural components and takes advantage of the probabilistic interpretation of these components. The algorithm, called a posteriori probability model selection (PPMS), is applied to an NN architecture called the generalized softmax perceptron (GSP), whose outputs can be understood as probabilities, although the results can be extended to more general network architectures. Learning rules are derived by applying the expectation-maximization algorithm to the GSP-PPMS structure. Simulation results show the advantages of the proposed algorithm over other schemes.
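The GSP forward pass can be sketched as follows, under the common reading of the architecture: each class owns several "subclass" linear units, a softmax is taken over all units jointly, and the class posterior is the sum of its subclass outputs. Shapes, initialization, and names here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Sketch of a generalized-softmax-perceptron-style forward pass:
# softmax over all subclass units, then sum subclass probabilities
# per class to obtain class posteriors.

rng = np.random.default_rng(0)

def gsp_posteriors(x, W, class_of_unit):
    """x: (d,) input; W: (n_units, d) weights;
    class_of_unit: (n_units,) int class label of each subclass unit."""
    z = W @ x
    p_unit = np.exp(z - z.max())       # numerically stable softmax
    p_unit /= p_unit.sum()
    n_classes = class_of_unit.max() + 1
    p_class = np.zeros(n_classes)
    np.add.at(p_class, class_of_unit, p_unit)  # sum subclass probs per class
    return p_class

W = rng.normal(size=(4, 3))            # 4 subclass units, 3 input features
class_of_unit = np.array([0, 0, 1, 1]) # two subclasses per class
p = gsp_posteriors(rng.normal(size=3), W, class_of_unit)
```

Because the outputs are a partition of a softmax, they are nonnegative and sum to one, which is what makes the pruning/splitting/merging operations of PPMS interpretable as adding or removing mixture components.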


Subjects
Algorithms, Breast Neoplasms/diagnosis, Diagnosis, Computer-Assisted/methods, Models, Biological, Models, Statistical, Neural Networks, Computer, Pattern Recognition, Automated/methods, Breast Neoplasms/classification, Cluster Analysis, Computer Simulation, Computing Methodologies, Decision Support Techniques, Humans, Numerical Analysis, Computer-Assisted, Stochastic Processes
4.
IEEE Trans Neural Netw ; 16(4): 810-20, 2005 Jul.
Article in English | MEDLINE | ID: mdl-16121723

ABSTRACT

In this paper, we analyze stochastic gradient learning rules for posterior probability estimation using networks with a single layer of weights and a general nonlinear activation function. We provide necessary and sufficient conditions on the learning rules and the activation function to obtain probability estimates. We also extend the concept of a well-formed cost function, proposed by Wittner and Denker, to multiclass problems, and we provide theoretical results showing the advantages of this kind of objective function.
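One well-known pairing that satisfies such conditions is the sigmoid activation with the cross-entropy objective, whose stochastic rule is w ← w + mu (y − sigmoid(w·x)) x. The sketch below (an illustration of the general principle, not the paper's experiments; the data model and step size are assumptions) trains a single-layer network on data with a known posterior and recovers its parameters.

```python
import numpy as np

# Single-layer network, sigmoid activation, stochastic cross-entropy rule.
# When activation and cost are matched this way, the output converges to
# an estimate of P(y=1|x).

rng = np.random.default_rng(1)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# 1-D data whose true posterior is P(y=1|x) = sigmoid(2*x + 1)
n = 20000
x = rng.normal(size=n)
y = (rng.random(n) < sigmoid(2 * x + 1)).astype(float)

w, b, mu = 0.0, 0.0, 0.05
for xi, yi in zip(x, y):
    err = yi - sigmoid(w * xi + b)   # (target - output): the matched rule
    w += mu * err * xi
    b += mu * err
```

After training, (w, b) hovers near the generating parameters (2, 1), so the network output is a consistent posterior estimate rather than just a discriminant. An unmatched activation/cost pair would not have this property in general.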


Subjects
Algorithms, Decision Support Techniques, Models, Biological, Models, Statistical, Neural Networks, Computer, Pattern Recognition, Automated/methods, Cluster Analysis, Computer Simulation, Computing Methodologies, Numerical Analysis, Computer-Assisted, Stochastic Processes
5.
IEEE Trans Neural Netw ; 15(2): 309-17, 2004 Mar.
Article in English | MEDLINE | ID: mdl-15384524

ABSTRACT

Decision theory shows that the optimal decision is a function of the posterior class probabilities. More specifically, in binary classification, the optimal decision is based on the comparison of the posterior probabilities with some threshold. Therefore, the most accurate estimates of the posterior probabilities are required near these decision thresholds. This paper discusses the design of objective functions that provide more accurate estimates of the probability values, taking into account the characteristics of each decision problem. We propose learning algorithms based on the stochastic gradient minimization of these loss functions. We show that the performance of the classifier is improved when these algorithms behave like sample selectors: samples near the decision boundary are the most relevant during learning.
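The sample-selector behavior can be sketched by scaling each stochastic update with a window that peaks where the model's output is near the decision threshold, so boundary samples dominate learning. The Gaussian window, its width, and the toy data model below are assumptions for illustration, not the paper's exact loss family.

```python
import numpy as np

# Sketch: SGD whose per-sample step is weighted by proximity of the
# current output q to the decision threshold, emphasizing boundary samples.

rng = np.random.default_rng(2)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def emphasis(q, threshold=0.5, width=0.15):
    # weight near 1 at the threshold, decaying away from it (assumed window)
    return np.exp(-((q - threshold) ** 2) / (2 * width ** 2))

# toy 1-D problem with true posterior sigmoid(3*x); threshold 0.5
x = rng.normal(size=10000)
y = (rng.random(10000) < sigmoid(3 * x)).astype(float)

w, mu = 0.0, 0.1
for xi, yi in zip(x, y):
    q = sigmoid(w * xi)
    w += mu * emphasis(q) * (yi - q) * xi   # boundary samples weighted most
```

Because the emphasis factor only rescales (never flips) the matched gradient, the fixed point is unchanged while estimation effort concentrates near the threshold, which is where accuracy matters for the final decision.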


Subjects
Probability, Research Design/standards