Results 1 - 5 of 5
1.
Sensors (Basel) ; 21(7)2021 Mar 26.
Article in English | MEDLINE | ID: mdl-33810513

ABSTRACT

Wireless sensor networks are used in many location-dependent applications. Localization of sensor nodes is commonly carried out in a distributed way for the sake of energy savings and network robustness, yet handling these constraints remains a great challenge. Ideally, a distributed algorithm should reach the highest possible accuracy in its position estimates while spending as few iterations as possible. This research proposes a range-based, robust localization method, derived from the Newton scheme, that can be applied to both isotropic and anisotropic networks in the presence of outliers in the pairwise distance measurements. The algorithm minimizes the error of the position estimates using a hop-weighted function and a scaling factor, yielding a significant improvement in position accuracy in only a few iterations. Simulations demonstrate that the proposed algorithm outperforms similar algorithms on anisotropic networks.
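As a rough illustration of the kind of iterative, weighted range-error minimization described above (this is not the paper's actual Newton-scheme update; the function name, the simple gradient step, and the parameter `alpha` are all assumptions):

```python
import numpy as np

def refine_position(p, anchors, dists, weights, alpha=0.5):
    """One weighted gradient-style refinement of a node position estimate.

    p       : current 2-D position estimate
    anchors : (m, 2) known reference positions
    dists   : measured (possibly noisy) distances to each anchor
    weights : hop-based weights down-weighting unreliable measurements
    alpha   : step scaling factor
    """
    grad = np.zeros(2)
    for a, d, w in zip(anchors, dists, weights):
        diff = p - a
        r = np.linalg.norm(diff)
        if r > 1e-9:
            # gradient of the weighted squared range error w * (r - d)^2 / 2
            grad += w * (r - d) * diff / r
    return p - alpha * grad
```

Iterating this update from a coarse initial guess drives the position estimate toward the point whose anchor distances best match the measurements.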

2.
Entropy (Basel) ; 21(1)2019 Jan 18.
Article in English | MEDLINE | ID: mdl-33266799

ABSTRACT

The reconstruction of positron emission tomography (PET) data is a difficult task, particularly at low count rates, because Poisson noise has a significant influence on the statistical uncertainty of PET measurements. Prior information is frequently used to improve image quality. In this paper, we propose using a field of experts to model the a priori structure and capture the anatomical spatial dependencies of PET images, addressing the problems of noise and low-count data that make image reconstruction difficult. We reconstruct PET images with a modified MXE algorithm, which minimizes an objective function with the cross-entropy as the fidelity term, while the field-of-experts model is incorporated as a regularizing term. Comparisons with the expectation-maximization algorithm and an iterative method with a prior penalizing relative differences showed that the proposed method can lead to accurate estimation of the image, especially for acquisitions at low count rates.
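A minimal sketch of regularized iterative PET reconstruction, built on the classical MLEM update rather than the paper's MXE/cross-entropy algorithm; `prior_grad` is a hypothetical stand-in for the gradient of the field-of-experts regularizer:

```python
import numpy as np

def regularized_em_step(x, A, y, beta, prior_grad):
    """One EM-style image update for Poisson data y ~ Poisson(A @ x),
    followed by a crude gradient step on a prior term.

    x          : current nonnegative image estimate
    A          : system (projection) matrix
    y          : measured sinogram counts
    beta       : regularization strength
    prior_grad : callable returning the gradient of the prior at x
    """
    Ax = np.maximum(A @ x, 1e-12)                  # forward projection
    x_em = x * (A.T @ (y / Ax)) / A.sum(axis=0)    # multiplicative MLEM update
    return np.maximum(x_em - beta * prior_grad(x_em), 0.0)
```

With `beta = 0` this reduces to plain MLEM; the regularizer damps noise at the cost of a small bias, which is the trade-off the abstract's prior is designed to manage.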

3.
Sci Rep ; 13(1): 9014, 2023 Jun 02.
Article in English | MEDLINE | ID: mdl-37268706

ABSTRACT

Adaptive filtering theory has been extensively developed, and most of the proposed algorithms assume the data live in a Euclidean space. In many applications, however, the data to be processed come from a non-linear manifold. In this article, we propose an alternative adaptive filter that works on a manifold, generalizing the filtering task to non-Euclidean spaces. To this end, we generalize the least-mean-squares (LMS) algorithm to operate on a manifold via an exponential map. Our experiments show that the proposed method outperforms other state-of-the-art algorithms on several filtering tasks.
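A minimal sketch of an LMS-style update constrained to a manifold, here the unit sphere, whose exponential map has a simple closed form; this illustrates the general idea only and is not the paper's algorithm:

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return x
    return np.cos(n) * x + np.sin(n) * (v / n)

def manifold_lms_step(w, x, d, mu):
    """One LMS-style update for a weight vector w constrained to the sphere.

    w : current weight (a point on the unit sphere)
    x : input regressor, d : desired output, mu : step size
    """
    e = d - w @ x                 # prediction error
    g = mu * e * x                # Euclidean LMS step direction
    g_tan = g - (g @ w) * w       # project onto the tangent space at w
    return sphere_exp(w, g_tan)   # retract back onto the manifold
```

The two manifold-specific ingredients are the tangent-space projection and the exponential-map retraction; swapping in another manifold's exponential map generalizes the same scheme.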

4.
J Healthc Eng ; 2018: 4706165, 2018.
Article in English | MEDLINE | ID: mdl-30581548

ABSTRACT

Positron emission tomography (PET) provides images of metabolic activity in the body and is used in the research, monitoring, and diagnosis of several diseases. However, the raw data produced by the scanner are severely corrupted by noise, degrading the quality of the reconstructed images. In this paper, we propose a reconstruction algorithm that improves the image reconstruction process by approaching the problem from a variational geometric perspective. We use the weighted Gaussian curvature (WGC) as a regularization term to better handle noise while preserving the original geometry of the image, such as lesion structure. In other contexts, the WGC term has shown excellent capabilities for preserving borders and structures of low gradient magnitude, such as ramp-like structures, while effectively removing noise. We present several experiments evaluating contrast and lesion detectability in the reconstructed images. The results on simulated images and real data show that the proposed algorithm effectively preserves lesions while removing noise.
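The Gaussian curvature of an image viewed as a surface z = f(x, y) is K = (f_xx * f_yy - f_xy^2) / (1 + f_x^2 + f_y^2)^2. A finite-difference sketch of that quantity follows; the paper's *weighted* variant and its use inside the reconstruction objective are omitted:

```python
import numpy as np

def gaussian_curvature(img):
    """Gaussian curvature of the surface z = img(y, x), approximated with
    finite differences via np.gradient."""
    fy = np.gradient(img, axis=0)      # first derivatives
    fx = np.gradient(img, axis=1)
    fxx = np.gradient(fx, axis=1)      # second derivatives
    fyy = np.gradient(fy, axis=0)
    fxy = np.gradient(fx, axis=0)
    return (fxx * fyy - fxy ** 2) / (1.0 + fx ** 2 + fy ** 2) ** 2
```

Planar and ramp-like patches have zero Gaussian curvature, which is why a curvature-based penalty leaves them untouched while still suppressing oscillatory noise.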


Subjects
Image Processing, Computer-Assisted/methods; Positron-Emission Tomography/methods; Algorithms; Humans; Liver/diagnostic imaging; Liver Diseases/diagnostic imaging; Models, Biological; Normal Distribution; Phantoms, Imaging
5.
IEEE Trans Med Imaging ; 33(10): 2010-9, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24951682

ABSTRACT

In this paper, we address the problem of denoising reconstructed small-animal positron emission tomography (PET) images using a multiresolution approach that can be implemented with any transform, such as the contourlet, shearlet, curvelet, or wavelet transform. The PET images are analyzed and processed in the transform domain by modeling each subband as a set of regions separated by boundaries, distinguishing homogeneous from heterogeneous regions. Each region is processed independently with a different filter: a linear estimator for homogeneous regions and a surface-polynomial estimator for heterogeneous regions. The boundaries between regions are estimated using a modified edge-focusing filter. The proposed approach was validated in a series of experiments. Our method achieved an overall reduction of up to 26% in the %STD of the reconstructed image of a small-animal NEMA phantom. Additionally, a test on a simulated lesion showed that our method yields better contrast preservation than other state-of-the-art noise-reduction techniques. Thus, the proposed method provides a significant reduction of noise while preserving contrast and important structures such as lesions.
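A toy sketch of the multiresolution idea using a one-level 1-D Haar transform with uniform hard thresholding of the detail band; the paper's region-dependent filters and edge-focusing boundary estimation are far more elaborate than this:

```python
import numpy as np

def haar_decompose(x):
    """One level of a 1-D Haar transform: approximation and detail bands.
    Assumes len(x) is even."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_reconstruct(a, d):
    """Invert haar_decompose exactly."""
    out = np.empty(2 * len(a))
    out[0::2] = (a + d) / np.sqrt(2.0)
    out[1::2] = (a - d) / np.sqrt(2.0)
    return out

def threshold_detail(d, thresh):
    """Hard-threshold detail coefficients: a much simpler stand-in for the
    paper's per-region subband filtering."""
    return np.where(np.abs(d) > thresh, d, 0.0)
```

Denoising then amounts to decompose, filter the detail band, and reconstruct; the paper's contribution is in making the subband filtering adapt to homogeneous versus heterogeneous regions rather than applying one global rule.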


Subjects
Image Processing, Computer-Assisted/methods; Positron-Emission Tomography/methods; Algorithms; Animals; Computer Simulation; Phantoms, Imaging; Rats; Reproducibility of Results