Results 1 - 15 of 15
1.
Med Phys ; 49(7): 4404-4418, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35588288

ABSTRACT

PURPOSE: Standard four-dimensional computed tomography (4DCT) cardiac reconstructions typically include spiraling artifacts that depend not only on the motion of the heart but also on the gantry angle range over which the data were acquired. We seek to reduce these motion artifacts and, thereby, improve the accuracy of left ventricular wall positions in 4DCT image series. METHODS: We use a motion artifact reduction approach (ResyncCT) that is based largely on conjugate pairs of partial angle reconstruction (PAR) images. After identifying the key locations where motion artifacts exist in the uncorrected images, paired subvolumes within the PAR images are analyzed with a modified cross-correlation function in order to estimate 3D velocity and acceleration vectors at these locations. A subsequent motion compensation process (also based on PAR images) includes the creation of a dense motion field, followed by a backproject-and-warp style compensation. The algorithm was tested on a 3D-printed phantom that represents the left ventricle (LV) and on challenging clinical cases corrupted by severe artifacts. RESULTS: The results from our preliminary phantom test as well as from clinical cardiac scans show crisp endocardial edges and resolved double-wall artifacts. When viewed as a temporal series, the corrected images exhibit much smoother motion of the LV endocardial boundary than the uncorrected images. In addition, quantitative results from our phantom studies show that ResyncCT processing reduces endocardial surface distance errors from 0.9 ± 0.8 to 0.2 ± 0.1 mm. CONCLUSIONS: The ResyncCT algorithm was shown to be effective in reducing motion artifacts and restoring accurate wall positions. Some perspectives on the use of conjugate-PAR images, and on techniques for CT motion artifact reduction more generally, are also given.
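
As an aside on the mechanics: the velocity estimate at an artifact location comes from comparing a subvolume in one PAR image with the corresponding subvolume in its conjugate partner. The Python sketch below uses hypothetical names and a plain FFT-based cross-correlation (the paper uses a modified cross-correlation function); it only illustrates how a subvolume shift between two PAR images separated by a known time offset translates into a 3D velocity vector.

import numpy as np

def estimate_velocity(par_a, par_b, dt_s, voxel_mm):
    """Estimate a 3D velocity (mm/s) from two partial-angle subvolumes
    acquired dt_s seconds apart, via the peak of their circular
    cross-correlation (computed with FFTs)."""
    spec = np.fft.fftn(par_b) * np.conj(np.fft.fftn(par_a))
    xcorr = np.fft.ifftn(spec).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Convert the peak index into a signed shift about the origin.
    shift = np.array([p if p <= n // 2 else p - n for p, n in zip(peak, xcorr.shape)])
    return shift * np.asarray(voxel_mm) / dt_s

# Toy check: a subvolume displaced by 2 voxels along the first axis.
rng = np.random.default_rng(0)
a = rng.standard_normal((32, 32, 32))
b = np.roll(a, 2, axis=0)
print(estimate_velocity(a, b, dt_s=0.05, voxel_mm=(0.5, 0.5, 0.5)))  # ~[20, 0, 0] mm/s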


Subjects
Artifacts; Four-Dimensional Computed Tomography; Algorithms; Four-Dimensional Computed Tomography/methods; Heart Ventricles/diagnostic imaging; Motion (Physics); Phantoms, Imaging
2.
IEEE Trans Med Imaging ; 34(1): 167-78, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25163058

ABSTRACT

Statistical X-ray computed tomography (CT) reconstruction can improve image quality from reduced dose scans, but requires very long computation time. Ordered subsets (OS) methods have been widely used for research in X-ray CT statistical image reconstruction (and are used in clinical PET and SPECT reconstruction). In particular, OS methods based on separable quadratic surrogates (OS-SQS) are massively parallelizable and are well suited to modern computing architectures, but the number of iterations required for convergence should be reduced for better practical use. This paper introduces OS-SQS-momentum algorithms that combine Nesterov's momentum techniques with OS-SQS methods, greatly improving convergence speed in early iterations. If the number of subsets is too large, the OS-SQS-momentum methods can be unstable, so we propose diminishing step sizes that stabilize the method while preserving the very fast convergence behavior. Experiments with simulated and real 3D CT scan data illustrate the performance of the proposed algorithms.
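
For readers who have not seen the OS + momentum combination, the following toy sketch (my own notation and schedule, not the authors' exact algorithm) applies ordered-subsets steps with an SQS diagonal majorizer to a plain least-squares problem, adds Nesterov-style extrapolation after every subset update, and damps the step with a diminishing factor of the kind the abstract describes, to keep the iteration stable when many subsets are used.

import numpy as np

rng = np.random.default_rng(1)
n, p, n_subsets = 400, 50, 8
A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
b = A @ x_true + 0.01 * rng.standard_normal(n)

D = np.abs(A).T @ (np.abs(A) @ np.ones(p))       # SQS diagonal majorizer of A^T A
subsets = np.array_split(np.arange(n), n_subsets)

x = np.zeros(p)          # current iterate
z = x.copy()             # momentum ("extrapolated") iterate
t = 1.0                  # Nesterov momentum coefficient
k = 0                    # subset-update counter
for it in range(20):
    for idx in subsets:
        k += 1
        grad = n_subsets * A[idx].T @ (A[idx] @ z - b[idx])   # scaled subset gradient
        relax = 1.0 / (1.0 + 0.05 * k)                        # diminishing step factor
        x_new = z - relax * grad / D                          # SQS (diagonal) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)         # momentum extrapolation
        x, t = x_new, t_new

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))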


Subjects
Image Processing, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; Algorithms; Humans; Phantoms, Imaging; Radiography, Thoracic
3.
IEEE Trans Image Process ; 23(6): 2423-35, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24710832

ABSTRACT

We introduce a family of novel image regularization penalties called generalized higher degree total variation (HDTV). These penalties further extend our previously introduced HDTV penalties, which generalize the popular total variation (TV) penalty to incorporate higher degree image derivatives. We show that many of the proposed second degree extensions of TV are special cases of, or are closely approximated by, a generalized HDTV penalty. Additionally, we propose a novel fast alternating minimization algorithm for solving image recovery problems with HDTV and generalized HDTV regularization. The new algorithm enjoys a tenfold speedup compared with the iteratively reweighted majorize-minimize algorithm proposed in a previous paper. Numerical experiments on 3D magnetic resonance images and 3D microscopy images show that HDTV and generalized HDTV improve image quality significantly compared with TV.
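
For orientation, the anisotropic degree-n member of this family has the schematic form below (notation mine; the degrees, weights, and operator family used in the generalized penalties differ in detail):

\[
  \mathcal{R}_{\mathrm{HDTV},n}(f) \;=\; \int_{\Omega} \int_{S^{d-1}} \bigl| \partial_{\mathbf u}^{\,n} f(\mathbf r) \bigr| \, d\mathbf u \, d\mathbf r,
  \qquad
  \partial_{\mathbf u}^{\,n} f(\mathbf r) \;=\; \frac{\partial^{n}}{\partial t^{n}} f(\mathbf r + t\,\mathbf u)\Big|_{t=0},
\]

i.e., the L1 semi-norm of the n-th degree directional derivative, integrated over all directions u on the unit sphere and over the image domain Ω. For n = 1 this reduces to a rotation-invariant, TV-like first-degree penalty, while n = 2 penalizes directional second derivatives and so avoids the patchy, staircase-like artifacts associated with classical TV.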


Subjects
Algorithms; Artifacts; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Analysis of Variance; Computer Simulation; Data Interpretation, Statistical; Models, Statistical; Reproducibility of Results; Sensitivity and Specificity
4.
Magn Reson Med ; 71(5): 1760-70, 2014 May.
Article in English | MEDLINE | ID: mdl-23821331

ABSTRACT

PURPOSE: Regularizing parallel magnetic resonance imaging (MRI) reconstruction significantly improves image quality but requires tuning parameter selection. We propose a Monte Carlo method for automatic parameter selection based on Stein's unbiased risk estimate that minimizes the multichannel k-space mean squared error (MSE). We automatically tune parameters for image reconstruction methods that preserve the undersampled acquired data, which cannot be accomplished using existing techniques. THEORY: We derive a weighted MSE criterion appropriate for data-preserving regularized parallel imaging reconstruction and the corresponding weighted Stein's unbiased risk estimate. We describe a Monte Carlo approximation of the weighted Stein's unbiased risk estimate that uses two evaluations of the reconstruction method per candidate parameter value. METHODS: We reconstruct images using the denoising sparse images from GRAPPA using the nullspace method (DESIGN) and L1 iterative self-consistent parallel imaging (L1-SPIRiT). We validate Monte Carlo Stein's unbiased risk estimate against the weighted MSE. We select the regularization parameter using these methods for various noise levels and undersampling factors and compare the results to those using MSE-optimal parameters. RESULTS: Our method selects nearly MSE-optimal regularization parameters for both DESIGN and L1-SPIRiT over a range of noise levels and undersampling factors. CONCLUSION: The proposed method automatically provides nearly MSE-optimal choices of regularization parameters for data-preserving nonlinear parallel MRI reconstruction methods.
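
To make the "two evaluations per candidate parameter" concrete, here is the generic (unweighted, real-valued) shape of the Monte Carlo SURE machinery the abstract builds on; the paper's weighted, complex-valued k-space criterion inserts a weighting into both terms and adjusts the constants, so treat this only as a schematic:

\[
  \widehat{\mathrm{MSE}}(\lambda) \;=\; \frac{1}{N}\,\|f_{\lambda}(\mathbf y) - \mathbf y\|^{2} \;-\; \sigma^{2} \;+\; \frac{2\sigma^{2}}{N}\,\mathrm{div}_{\mathbf y} f_{\lambda}(\mathbf y),
  \qquad
  \mathrm{div}_{\mathbf y} f_{\lambda}(\mathbf y) \;\approx\; \frac{1}{\epsilon}\,\mathbf b^{\mathsf T}\bigl(f_{\lambda}(\mathbf y + \epsilon\,\mathbf b) - f_{\lambda}(\mathbf y)\bigr),
  \quad \mathbf b \sim \mathcal N(\mathbf 0, \mathbf I).
\]

The two reconstructions needed per candidate value are f_lambda(y) and f_lambda(y + eps*b); sweeping lambda over a grid and keeping the minimizer gives the automatic selection described above.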


Subjects
Algorithms; Brain/anatomy & histology; Image Interpretation, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Models, Statistical; Monte Carlo Method; Computer Simulation; Humans; Image Enhancement/methods; Reproducibility of Results; Sensitivity and Specificity
5.
IEEE Trans Med Imaging ; 33(2): 351-61, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24122551

ABSTRACT

SPIRiT (iterative self-consistent parallel imaging reconstruction) and its sparsity-regularized variant, L1-SPIRiT, are compatible with both Cartesian and non-Cartesian magnetic resonance imaging sampling trajectories. However, the non-Cartesian framework is more expensive computationally, involving a nonuniform Fourier transform with a nontrivial Gram matrix. We propose a novel implementation of the regularized reconstruction problem using variable splitting, alternating minimization of the augmented Lagrangian, and careful preconditioning. Our new method, based on the alternating direction method of multipliers, converges much faster than existing methods because of the preconditioners' heightened effectiveness. We demonstrate that such rapid convergence substantially improves image quality for a fixed computation time. Our framework is a step toward rapid non-Cartesian L1-SPIRiT reconstructions.


Subjects
Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Algorithms; Brain/anatomy & histology; Humans; Phantoms, Imaging
6.
IEEE Trans Med Imaging ; 32(8): 1411-22, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23591478

ABSTRACT

Magnetic resonance image (MRI) reconstruction from undersampled k-space data requires regularization to reduce noise and aliasing artifacts. Proper application of regularization, however, requires appropriate selection of the associated regularization parameters. In this work, we develop a data-driven regularization parameter adjustment scheme that minimizes an estimate [based on the principle of Stein's unbiased risk estimate (SURE)] of a suitable weighted squared-error measure in k-space. To compute this SURE-type estimate, we propose a Monte-Carlo scheme that extends our previous approach to inverse problems (e.g., MRI reconstruction) involving complex-valued images. Our approach depends only on the output of a given reconstruction algorithm and does not require knowledge of its internal workings, so it is capable of tackling a wide variety of reconstruction algorithms and nonquadratic regularizers, including total variation and those based on the l1-norm. Experiments with simulated and real MR data indicate that the proposed approach is capable of providing near mean-squared-error-optimal regularization parameters for single-coil undersampled non-Cartesian MRI reconstruction.


Subjects
Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Monte Carlo Method; Algorithms; Computer Simulation; Humans; Phantoms, Imaging
7.
IEEE Trans Image Process ; 22(5): 2019-29, 2013 May.
Article in English | MEDLINE | ID: mdl-23372080

ABSTRACT

To reduce blur in noisy images, regularized image restoration methods have been proposed that use nonquadratic regularizers (like l1 regularization or total-variation) that suppress noise while preserving edges in the image. Most of these methods assume a circulant blur (periodic convolution with a blurring kernel) that can lead to wraparound artifacts along the boundaries of the image due to the implied periodicity of the circulant model. Using a noncirculant model could prevent these artifacts at the cost of increased computational complexity. In this paper, we propose to use a circulant blur model combined with a masking operator that prevents wraparound artifacts. The resulting model is noncirculant, so we propose an efficient algorithm using variable splitting and augmented Lagrangian (AL) strategies. Our variable splitting scheme, when combined with the AL framework and alternating minimization, leads to simple linear systems that can be solved noniteratively using fast Fourier transforms (FFTs), eliminating the need for more expensive conjugate gradient-type solvers. The proposed method can also efficiently tackle a variety of convex regularizers, including edge-preserving (e.g., total-variation) and sparsity promoting (e.g., l1-norm) regularizers. Simulation results show fast convergence of the proposed method, along with improved image quality at the boundaries where the circulant model is inaccurate.
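
A compact sketch of the idea follows (not the authors' exact splitting or regularizer: here the 0/1 mask M keeps only the valid, non-wrapping pixels, the blur C is circulant, and an l1 penalty on the pixels stands in for the general convex regularizer, with all names hypothetical). It shows how every subproblem becomes an FFT solve, an elementwise division, or a soft-threshold.

import numpy as np

def embed_psf(psf, shape):
    """Place a small PSF in a full-size array with its centre at index (0, 0)."""
    big = np.zeros(shape)
    big[:psf.shape[0], :psf.shape[1]] = psf
    return np.roll(big, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def masked_deblur(y, mask, psf, lam=0.02, mu1=1.0, mu2=1.0, iters=100):
    """ADMM sketch for  min_x 0.5*||M(h*x) - y||^2 + lam*||x||_1  with a
    circulant blur h and a 0/1 mask M that drops the wrap-around border.
    `y` holds the measured values inside the mask and zeros elsewhere."""
    H = np.fft.fft2(embed_psf(psf, y.shape))          # eigenvalues of the circulant blur
    x = np.zeros_like(y); u = np.zeros_like(y); v = np.zeros_like(y)
    d1 = np.zeros_like(y); d2 = np.zeros_like(y)
    for _ in range(iters):
        # x-update: (mu1*C^H C + mu2*I) x = rhs, solved exactly with FFTs.
        rhs = mu1 * np.conj(H) * np.fft.fft2(u - d1) + mu2 * np.fft.fft2(v - d2)
        x = np.real(np.fft.ifft2(rhs / (mu1 * np.abs(H) ** 2 + mu2)))
        Cx = np.real(np.fft.ifft2(H * np.fft.fft2(x)))
        # u-update: (M^T M + mu1*I) is diagonal, so this is an elementwise divide.
        u = (mask * y + mu1 * (Cx + d1)) / (mask + mu1)
        # v-update: soft-thresholding handles the l1 term.
        v = soft(x + d2, lam / mu2)
        d1 += Cx - u                                   # dual updates (scaled form)
        d2 += x - v
    return x

The point the abstract makes is visible here: nothing inside the loop requires an inner conjugate-gradient solve, because the only non-diagonal system is circulant.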

8.
IEEE Trans Med Imaging ; 32(3): 556-64, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23192524

ABSTRACT

Several magnetic resonance parallel imaging techniques require explicit estimates of the receive coil sensitivity profiles. These estimates must be accurate over both the object and its surrounding regions to avoid generating artifacts in the reconstructed images. Regularized estimation methods that involve minimizing a cost function containing both a data-fit term and a regularization term provide robust sensitivity estimates. However, these methods can be computationally expensive when dealing with large problems. In this paper, we propose an iterative algorithm based on variable splitting and the augmented Lagrangian method that estimates the coil sensitivity profile by minimizing a quadratic cost function. Our method, ADMM-Circ, reformulates the finite differencing matrix in the regularization term to enable exact alternating minimization steps. We also present a faster variant of this algorithm using intermediate updating of the associated Lagrange multipliers. Numerical experiments with simulated and real data sets indicate that our proposed method converges approximately twice as fast as the preconditioned conjugate gradient method over the entire field-of-view. These concepts may accelerate other quadratic optimization problems.


Subjects
Algorithms; Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Brain/anatomy & histology; Computer Simulation; Finite Element Analysis; Humans; Phantoms, Imaging; Sensitivity and Specificity
9.
Article in English | MEDLINE | ID: mdl-24663389

ABSTRACT

The main focus of this paper is to introduce a computationally efficient algorithm for solving image recovery problems regularized by the recently introduced higher degree total variation (HDTV) penalties. The anisotropic HDTV penalty is the fully separable L1 semi-norm of the directional image derivatives; the use of this penalty is seen to considerably improve image quality in biomedical inverse problems. We introduce a novel majorize-minimize algorithm to solve the HDTV optimization problem, considerably speeding it up over the previous implementation. Specifically, comparisons with the previous iteratively reweighted algorithm show an approximately tenfold speedup. The new algorithm enables us to obtain reconstructions that are free of the patchy artifacts exhibited by classical TV schemes, while being comparable in run time to state-of-the-art total variation regularization schemes.

10.
IEEE Trans Image Process ; 21(8): 3659-72, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22531764

ABSTRACT

Regularized iterative reconstruction algorithms for imaging inverse problems require selection of appropriate regularization parameter values. We focus on the challenging problem of tuning regularization parameters for nonlinear algorithms in the case of additive (possibly complex) Gaussian noise. Generalized cross-validation (GCV) and (weighted) mean-squared error (MSE) approaches [based on Stein's Unbiased Risk Estimate (SURE)] need the Jacobian matrix of the nonlinear reconstruction operator (representative of the iterative algorithm) with respect to the data. We derive the desired Jacobian matrix for two types of nonlinear iterative algorithms: a fast variant of the standard iteratively reweighted least-squares method and the contemporary split-Bregman algorithm, both of which can accommodate a wide variety of analysis- and synthesis-type regularizers. The proposed approach iteratively computes two weighted SURE-type measures, Predicted-SURE and Projected-SURE (which require knowledge of the noise variance σ²), and GCV (which does not need σ²) for these algorithms. We apply the methods to image restoration and to magnetic resonance image (MRI) reconstruction using total variation (TV) and analysis-type ℓ1-regularization. We demonstrate through simulations and experiments with real data that minimizing Predicted-SURE and Projected-SURE consistently leads to near-MSE-optimal reconstructions. We also observe that minimizing GCV yields reconstruction results that are near-MSE-optimal for image restoration and slightly suboptimal for MRI. The theoretical derivations in this work related to Jacobian matrix evaluations can be extended, in principle, to other types of regularizers and reconstruction algorithms.
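
For reference, the unweighted textbook form of GCV, which underlies the weighted measures above, is shown below, with f_lambda the reconstruction operator, J_lambda = ∂f_lambda(y)/∂y its Jacobian, and N the number of data samples; the Predicted-/Projected-SURE variants instead combine tr{J_lambda} with σ² and problem-specific weightings:

\[
  \mathrm{GCV}(\lambda) \;=\; \frac{\frac{1}{N}\,\|\mathbf y - f_{\lambda}(\mathbf y)\|^{2}}{\bigl(1 - \mathrm{tr}\{\mathbf J_{\lambda}\}/N\bigr)^{2}}.
\]

In all three measures the practical burden is the trace of the Jacobian, which this paper obtains by differentiating the iterative algorithm itself rather than treating the reconstruction as a black box.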


Subjects
Algorithms; Brain/anatomy & histology; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Data Interpretation, Statistical; Humans; Nonlinear Dynamics; Normal Distribution; Reproducibility of Results; Sensitivity and Specificity
11.
IEEE Trans Med Imaging ; 31(3): 677-88, 2012 Mar.
Article in English | MEDLINE | ID: mdl-21095861

ABSTRACT

Statistical image reconstruction using penalized weighted least-squares (PWLS) criteria can improve image quality in X-ray computed tomography (CT). However, the huge dynamic range of the statistical weights leads to a highly shift-variant inverse problem, making it difficult to precondition and accelerate existing iterative algorithms that attack the statistical model directly. We propose to alleviate the problem by using a variable-splitting scheme that separates the shift-variant and ("nearly") invariant components of the statistical data model and also decouples the regularization term. This leads to an equivalent constrained problem that we tackle using the classical method-of-multipliers framework with alternating minimization. The specific form of our splitting yields an alternating direction method of multipliers (ADMM) algorithm with an inner step involving a "nearly" shift-invariant linear system that is suitable for FFT-based preconditioning using cone-type filters. The proposed method can efficiently handle a variety of convex regularization criteria, including smooth edge-preserving regularizers and nonsmooth sparsity-promoting ones based on the l1-norm and total variation. Numerical experiments with synthetic and real in vivo human data illustrate that cone-filter preconditioners accelerate the proposed ADMM, resulting in fast convergence compared to conventional (nonlinear conjugate gradient, ordered subsets) and state-of-the-art (MFISTA, split-Bregman) algorithms that are applicable to CT.
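
Schematically (my notation; the exact penalty parameters and splitting in the paper may differ in detail), the PWLS problem and the split that isolates the troublesome statistical weights W look like:

\[
  \hat{\mathbf x} = \arg\min_{\mathbf x}\; \tfrac12 (\mathbf y - \mathbf A \mathbf x)^{\mathsf T} \mathbf W (\mathbf y - \mathbf A \mathbf x) + \mathsf R(\mathbf x)
  \;\;\Longleftrightarrow\;\;
  \min_{\mathbf x,\,\mathbf u,\,\mathbf v}\; \tfrac12 \|\mathbf y - \mathbf u\|_{\mathbf W}^{2} + \mathsf R(\mathbf v)
  \quad \text{s.t.}\quad \mathbf u = \mathbf A \mathbf x,\;\; \mathbf v = \mathbf x.
\]

In the resulting ADMM iteration the W-weighted term appears only in a decoupled, diagonal u-update, while the x-update involves a system of the form (μ₁ AᵀA + μ₂ I) x = ⋯, which is the "nearly" shift-invariant piece that the cone-filter FFT preconditioning targets.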


Subjects
Algorithms; Radiographic Image Enhancement/methods; Tomography, X-Ray Computed/methods; Computer Simulation; Head/diagnostic imaging; Humans; Least-Squares Analysis; Models, Biological; Phantoms, Imaging
12.
IEEE Trans Med Imaging ; 30(3): 694-706, 2011 Mar.
Article in English | MEDLINE | ID: mdl-22084046

ABSTRACT

Magnetic resonance image (MRI) reconstruction using SENSitivity Encoding (SENSE) requires regularization to suppress noise and aliasing effects. Edge-preserving and sparsity-based regularization criteria can improve image quality, but they demand computation-intensive nonlinear optimization. In this paper, we present novel methods for regularized MRI reconstruction from undersampled sensitivity-encoded data (SENSE reconstruction) using the augmented Lagrangian (AL) framework for solving large-scale constrained optimization problems. We first formulate regularized SENSE reconstruction as an unconstrained optimization task and then convert it to a set of (equivalent) constrained problems using variable splitting. We then attack these constrained versions in an AL framework using an alternating minimization method, leading to algorithms that can be implemented easily. The proposed methods are applicable to a general class of regularizers that includes popular edge-preserving (e.g., total-variation) and sparsity-promoting (e.g., l1-norm of wavelet coefficients) criteria and combinations thereof. Numerical experiments with synthetic and in vivo human data illustrate that the proposed AL algorithms converge faster than both general-purpose optimization algorithms such as nonlinear conjugate gradient (NCG) and state-of-the-art MFISTA.


Subjects
Algorithms; Artifacts; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Pattern Recognition, Automated/methods; Brain; Humans; Reproducibility of Results; Sensitivity and Specificity
13.
IEEE Trans Med Imaging ; 29(2): 543-58, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20129854

ABSTRACT

Interpolation is the means by which a continuously defined model is fit to discrete data samples. When the data samples are free of noise, it seems desirable to build the model by fitting them exactly. In medical imaging, where quality is of paramount importance, this ideal situation unfortunately does not occur. In this paper, we propose a scheme that improves quality by specifying a tradeoff between fidelity to the data and robustness to the noise. We resort to variational principles, which allow us to impose smoothness constraints on the model for tackling noisy data. Based on shift-, rotation-, and scale-invariance requirements on the model, we show that the Lp-norm of an appropriate vector derivative is the most suitable choice of regularization for this purpose. In addition to Tikhonov-like quadratic regularization, this includes edge-preserving total-variation-like (TV) regularization. We give algorithms to recover the continuously defined model from noisy samples and also provide a data-driven scheme to determine the optimal amount of regularization. We validate our method with numerical examples in which we demonstrate its superiority over an exact fit, as well as the benefit of TV-like nonquadratic regularization over Tikhonov-like quadratic regularization.
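
The tradeoff the abstract describes can be written schematically as a discrepancy-plus-roughness functional (notation mine; the paper derives the specific shift-, rotation-, and scale-invariant derivative operator, written here simply as D):

\[
  \hat f \;=\; \arg\min_{f}\; \sum_{k}\bigl|f(\mathbf x_k) - g_k\bigr|^{2} \;+\; \lambda \int \bigl\|\mathrm{D} f(\mathbf x)\bigr\|^{p}\, d\mathbf x,
\]

where g_k are the noisy samples and λ sets the amount of regularization (chosen by the data-driven scheme mentioned above); p = 2 gives Tikhonov-like quadratic smoothing, p = 1 gives the edge-preserving TV-like behaviour, and λ → 0 recovers exact interpolation of the samples.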


Subjects
Image Processing, Computer-Assisted/methods; Models, Statistical; Algorithms; Head/anatomy & histology; Humans; Magnetic Resonance Imaging/methods; Poisson Distribution; Reproducibility of Results; Tomography, X-Ray Computed/methods
14.
IEEE Trans Image Process ; 17(9): 1540-54, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18701393

ABSTRACT

We consider the problem of optimizing the parameters of a given denoising algorithm for the restoration of a signal corrupted by white Gaussian noise. To achieve this, we propose to minimize Stein's unbiased risk estimate (SURE), which provides a means of assessing the true mean-squared error (MSE) purely from the measured data, without the need for any knowledge about the noise-free signal. Specifically, we present a novel Monte-Carlo technique that enables the user to calculate SURE for an arbitrary denoising algorithm characterized by some specific parameter setting. Our method is a black-box approach that solely uses the response of the denoising operator to additional input noise and does not ask for any information about its functional form. This, therefore, permits the use of SURE for the optimization of a wide variety of denoising algorithms. We justify our claims by presenting experimental results for SURE-based optimization of a series of popular image-denoising algorithms such as total-variation denoising, wavelet soft-thresholding, and Wiener filtering/smoothing splines. In the process, we also compare the performance of these methods. We demonstrate numerically that SURE computed using the new approach accurately predicts the true MSE for all the considered algorithms. We also show that SURE uncovers the optimal values of the parameters in all cases.
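
The black-box recipe is simple enough to state in a few lines. The sketch below uses hypothetical names, and a plain soft-threshold denoiser on a synthetic sparse signal stands in for the image-denoising algorithms tested in the paper; it estimates SURE from two calls to the denoiser, using a random probe to approximate the divergence term.

import numpy as np

def monte_carlo_sure(denoise, y, sigma, eps=1e-3, seed=0):
    """Estimate the per-sample MSE of `denoise` applied to y = x + n,
    n ~ N(0, sigma^2 I), using only two evaluations of the denoiser."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(y.shape)                               # random probe
    f_y = denoise(y)
    div = b.ravel() @ (denoise(y + eps * b) - f_y).ravel() / eps   # ~ divergence
    return np.mean((y - f_y) ** 2) - sigma ** 2 + 2 * sigma ** 2 * div / y.size

# Toy usage: pick the soft-threshold level without looking at the clean signal.
rng = np.random.default_rng(1)
x = np.zeros(4096); x[rng.integers(0, 4096, 60)] = 5.0             # sparse ground truth
sigma = 1.0
y = x + sigma * rng.standard_normal(x.shape)
for t in (0.5, 1.0, 2.0, 3.0):
    f = lambda z, t=t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
    print(f"t={t}: SURE={monte_carlo_sure(f, y, sigma):.3f}  "
          f"true MSE={np.mean((f(y) - x) ** 2):.3f}")

In a run like this the SURE values should track the true MSE across thresholds (up to Monte Carlo error), which is the behaviour the paper demonstrates for the much richer denoisers listed above.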


Subjects
Algorithms; Artifacts; Image Enhancement; Image Interpretation, Computer-Assisted/methods; Computer Simulation; Data Interpretation, Statistical; Image Enhancement/methods; Models, Statistical; Monte Carlo Method; Reproducibility of Results; Sensitivity and Specificity
15.
J Opt Soc Am A Opt Image Sci Vis ; 24(3): 794-813, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17301868

ABSTRACT

The introduction of high-resolution phase-shifting interferometry methods such as the annihilation filter, state space, multiple-signal classification, minimum norm, estimation of signal parameters via rotational invariance, and maximum-likelihood estimator has enabled the estimation of phase in an interferogram in the presence of harmonics and noise. These methods are also effective in holographic moiré, where incorporating two piezoelectric transducers (PZTs) yields two orthogonal displacement components simultaneously. Typically, when these methods are used, the first step involves estimating the phase steps pixelwise; the interference phase distribution is then computed by designing a Vandermonde system of equations. In this context, we present a statistical study of these methods for the cases of single and dual PZTs. The performance of these methods is also compared with that of other conventional benchmarking algorithms involving the single PZT. The paper also discusses the significant issue of an allowable pair of phase steps in the presence of noise, using a robust statistical tool, the Cramér-Rao bound. Furthermore, experimental validations of these high-resolution methods are presented for the estimation of a single phase in holographic interferometry and for the estimation of multiple phases in holographic moiré.
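
For context, the single-PZT signal model that such high-resolution methods address can be written generically as (notation mine, not the paper's):

\[
  I_n(\mathbf r) \;=\; I_0(\mathbf r)\Bigl[1 + \sum_{k=1}^{K} a_k(\mathbf r)\,\cos\bigl(k\,[\varphi(\mathbf r) + n\,\alpha]\bigr)\Bigr] + \eta_n(\mathbf r),
  \qquad n = 0, \dots, N-1,
\]

where φ is the interference phase of interest, α the phase step introduced by the PZT, a_k the harmonic amplitudes, and η_n additive noise; in holographic moiré with two PZTs, the cosine argument carries two phases stepped by two independent increments, which is what allows the two orthogonal displacement components to be recovered simultaneously. The methods listed above first estimate the phase step(s) at each pixel from the recorded frames, and the Vandermonde system mentioned in the abstract then yields φ.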
