Results 1 - 18 of 18
1.
J Nucl Med ; 58(2): 332-338, 2017 02.
Article in English | MEDLINE | ID: mdl-27587706

ABSTRACT

The goal of this paper was to evaluate the in vivo kinetics of the novel tau-specific PET radioligand 18F-AV-1451 in cognitively healthy control (HC) and Alzheimer disease (AD) subjects, using reference region analyses. METHODS: 18F-AV-1451 PET imaging was performed on 43 subjects (5 young HCs, 23 older HCs, and 15 AD subjects). Data were collected from 0 to 150 min after injection, with a break from 100 to 120 min. T1-weighted MR images were segmented using FreeSurfer to create 14 bilateral regions of interest (ROIs). In all analyses, cerebellar gray matter was used as the reference region. Nondisplaceable binding potentials (BPNDs) were calculated using the simplified reference tissue model (SRTM) and SRTM2; the Logan graphical analysis distribution volume ratio (DVR) was calculated for 30-150 min (DVR30-150). These measurements were compared with each other and used as reference standards for defining an appropriate 20-min window for the SUV ratio (SUVR). Pearson correlations were used to compare the reference standards to 20-min SUVRs (start times varied from 30 to 130 min), for all values, for ROIs with low 18F-AV-1451 binding (lROIs, mean of BPND + 1 and DVR30-150 < 1.5), and for ROIs with high 18F-AV-1451 binding (hROIs, mean of BPND + 1 and DVR30-150 > 1.5). RESULTS: SRTM2 BPND + 1 and DVR30-150 were in good agreement. Both were in agreement with SRTM BPND + 1 for lROIs but were greater than SRTM BPND + 1 for hROIs, resulting in a nonlinear relationship. hROI SUVRs increased from 80-100 to 120-140 min by 0.24 ± 0.15. The SUVR time interval resulting in the highest correlation and slope closest to 1 relative to the reference standards for all values was 120-140 min for hROIs, 60-80 min for lROIs, and 80-100 min for lROIs and hROIs. There was minimal difference between methods when statistical significance between ADs and HCs was calculated. CONCLUSION: Despite later time periods providing better agreement between reference standards and SUVRs for hROIs, a good compromise for studying lROIs and hROIs is SUVR80-100. The lack of SUVR plateau for hROIs highlights the importance of precise acquisition time for longitudinal assessment.
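A minimal sketch of the windowed SUV ratio (SUVR) computation evaluated above, assuming regional time-activity curves have already been extracted from the dynamic series; the curves, frame timing, and decay constants below are hypothetical stand-ins, not study data.

```python
import numpy as np

def suvr(target_tac, reference_tac, frame_mid_times, start, stop):
    """Mean target/reference activity ratio over [start, stop] minutes."""
    window = (frame_mid_times >= start) & (frame_mid_times < stop)
    return target_tac[window].mean() / reference_tac[window].mean()

# Hypothetical 5-min frames from 0 to 150 min post-injection.
t = np.arange(2.5, 150, 5.0)                     # frame mid-times (min)
cerebellum = 10.0 * np.exp(-0.02 * t)            # reference region TAC
entorhinal = 14.0 * np.exp(-0.012 * t)           # high-binding ROI TAC

print(suvr(entorhinal, cerebellum, t, 80, 100))  # SUVR for the 80-100 min window
```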


Subjects
Alzheimer Disease/metabolism; Brain/metabolism; Carbolines/pharmacokinetics; Carbolines/standards; Positron-Emission Tomography/methods; tau Proteins/metabolism; Aged; Alzheimer Disease/diagnostic imaging; Biomarkers/metabolism; Brain/diagnostic imaging; Computer Simulation; Female; Humans; Image Enhancement/standards; Kinetics; Male; Metabolic Clearance Rate; Middle Aged; Models, Biological; Molecular Imaging/standards; Protein Interaction Mapping/standards; Radiopharmaceuticals/pharmacokinetics; Radiopharmaceuticals/standards; Reference Values; United States
2.
Int J Mol Imaging ; 2011: 893129, 2011.
Article in English | MEDLINE | ID: mdl-21490736

ABSTRACT

The goal of this project is to develop radionuclide molecular imaging technologies using a clinical pinhole SPECT/CT scanner to quantify changes in cardiac metabolism, using the spontaneously hypertensive rat (SHR) as a model of hypertension-related pathophysiology. This paper quantitatively compares fatty acid metabolism in hearts of SHR and Wistar-Kyoto normal rats as a function of age and thereby tracks physiological changes associated with the onset and progression of heart failure in the SHR model. The fatty acid analog, (123)I-labeled BMIPP, was used in longitudinal metabolic pinhole SPECT imaging studies performed every seven months for 21 months. The unique contributions of this project are techniques for estimating the blood input function from projection data acquired by a slowly rotating camera imaging a fast circulation, and quantification of (123)I-BMIPP kinetics by fitting compartmental models to the blood and tissue time-activity curves.
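A minimal sketch of the kind of compartmental fitting described above: a one-tissue compartment model fitted to a tissue time-activity curve given a blood input function. The input function, rate constants, and noise level are synthetic placeholders, not the (123)I-BMIPP analysis itself.

```python
import numpy as np
from scipy.optimize import curve_fit

dt = 0.1                                   # min
t = np.arange(0, 30, dt)
cp = t * np.exp(-t)                        # hypothetical blood input function

def one_tissue(t, K1, k2):
    """C_T(t) = K1 * exp(-k2 t) convolved with the input function Cp."""
    return K1 * np.convolve(cp, np.exp(-k2 * t))[: t.size] * dt

# Synthetic "measured" tissue curve with noise, then fit K1 and k2.
rng = np.random.default_rng(0)
measured = one_tissue(t, 0.6, 0.15) + rng.normal(0, 0.002, t.size)
(K1, k2), _ = curve_fit(one_tissue, t, measured, p0=(0.5, 0.1))
print(f"K1 = {K1:.3f} /min, k2 = {k2:.3f} /min")
```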

3.
Phys Med Biol ; 55(22): 6931-50, 2010 Nov 21.
Article in English | MEDLINE | ID: mdl-21048292

ABSTRACT

In this paper, we investigate the performance of time-of-flight (TOF) positron emission tomography (PET) in improving lesion detectability. We present a theoretical approach to compare lesion detectability of TOF versus non-TOF systems and perform computer simulations to validate the theoretical prediction. A single-ring TOF PET tomograph is simulated using SimSET software, and images are reconstructed in 2D from list-mode data using a maximum a posteriori method. We use a channelized Hotelling observer to assess the detection performance. Both the receiver operating characteristic (ROC) and localization ROC curves are compared for the TOF and non-TOF PET systems. We first studied the SNR gains for TOF PET with different scatter and random fractions, system timing resolutions, and object sizes. We found that TOF information improves lesion detectability and that the improvement is greater with larger fractions of randoms, better timing resolution, and larger objects. Scatter by itself has little impact on the SNR gain after correction. Since the true system timing resolution may not be known precisely in practice, we investigated the effect of mismatched timing kernels and showed that using a mismatched kernel during reconstruction always degrades the detection performance, whether it is narrower or wider than the true value. Using the proposed theoretical framework, we also studied the effect of lumpy backgrounds on the detection performance. Our results indicate that with lumpy backgrounds, TOF PET still outperforms non-TOF PET, but the improvement is smaller than in the uniform background case. More specifically, with the same correlation length, the SNR gain decreases with a larger number of lumps and greater lump amplitudes. With the same variance, the SNR gain reaches its minimum when the width of the Gaussian lumps is close to the size of the tumor.
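A minimal sketch of the channelized Hotelling observer figure of merit used above, with difference-of-Gaussian channels (a common CHO choice); the image stacks are white-noise stand-ins rather than reconstructed TOF/non-TOF images.

```python
import numpy as np

def cho_snr(present, absent, channels):
    """CHO SNR from image stacks (n, npix) and a channel matrix (npix, nc)."""
    v1, v0 = present @ channels, absent @ channels    # channel outputs
    dv = v1.mean(axis=0) - v0.mean(axis=0)            # mean signal in channel space
    k = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(k, dv)))

rng = np.random.default_rng(1)
x, y = np.meshgrid(np.arange(32) - 15.5, np.arange(32) - 15.5)
r2 = x**2 + y**2
# Four difference-of-Gaussian channels, flattened to (1024, 4).
channels = np.stack([np.exp(-r2 / (2 * s**2)) - np.exp(-r2 / (2 * (2 * s) ** 2))
                     for s in (1, 2, 4, 8)]).reshape(4, -1).T

lesion = 0.5 * np.exp(-r2 / 8).ravel()            # Gaussian lesion profile
absent = rng.normal(0, 1, (200, 1024))            # noise-only images
present = rng.normal(0, 1, (200, 1024)) + lesion  # lesion-plus-noise images
print(cho_snr(present, absent, channels))
```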


Subjects
Positron-Emission Tomography/methods; Computer Simulation; Humans; Image Processing, Computer-Assisted; Monte Carlo Method; Reproducibility of Results; Scattering, Radiation; Time Factors
4.
Proc IEEE Int Symp Biomed Imaging ; June 28 2009-July 1 2009: 402-405, 2009 Aug 07.
Article in English | MEDLINE | ID: mdl-20936051

ABSTRACT

In this paper, we investigate the performance of time-of-flight (TOF) PET in improving lesion detectability. We present a theoretical approach to compare lesion detectability of TOF versus non-TOF systems. Computer simulations are performed to validate the theoretical predictions. A TOF PET tomograph is simulated using the SimSET software. Images are reconstructed from list-mode data using a maximum a posteriori (MAP) method. We use a channelized Hotelling observer (CHO) to assess the detection performance. Both the receiver operating characteristic (ROC) and localization ROC (LROC) curves are compared for the TOF and non-TOF PET systems. We also study the SNR gains for TOF PET with different scatter and random fractions. Simulation results match the theoretical predictions very well. Both show that TOF information improves lesion detectability and that the improvement is greater with larger fractions of randoms and scatters.
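To make the role of timing resolution concrete, here is a minimal sketch of how a TOF measurement localizes an event along its line of response with a Gaussian kernel; the LOR length, time difference, and timing resolution are illustrative values, not those of the simulated tomograph.

```python
import numpy as np

C = 299.792458            # speed of light, mm/ns

def tof_kernel(lor_positions_mm, dt_ns, timing_fwhm_ns):
    """Event-position weights along the LOR for arrival-time difference dt."""
    center = 0.5 * C * dt_ns                      # displacement from LOR midpoint
    sigma = 0.5 * C * timing_fwhm_ns / 2.355      # timing FWHM -> spatial sigma
    w = np.exp(-0.5 * ((lor_positions_mm - center) / sigma) ** 2)
    return w / w.sum()

s = np.linspace(-300, 300, 601)                   # points along a 600 mm LOR
print(tof_kernel(s, dt_ns=0.2, timing_fwhm_ns=0.6).argmax())  # peaks near +30 mm
```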

5.
Ann Biomed Eng ; 36(7): 1104-17, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18437574

ABSTRACT

The objective of this research was to assess the applicability of a technique known as hyperelastic warping for the measurement of local strains in the left ventricle (LV) directly from microPET image data sets. The technique uses differences in image intensities between template (reference) and target (loaded) image data sets to generate a body force that deforms a finite element (FE) representation of the template so that it registers with the target images. For validation, the template image was defined as the end-systolic microPET image data set from a Wistar Kyoto (WKY) rat. The target image was created by mapping the template image using the deformation results obtained from a FE model of diastolic filling. Regression analysis revealed highly significant correlations between the simulated forward FE solution and image-derived warping predictions for fiber stretch (R² = 0.96), circumferential strain (R² = 0.96), radial strain (R² = 0.93), and longitudinal strain (R² = 0.76) (p < 0.001 for all cases). The technology was applied to microPET image data of two spontaneously hypertensive rats (SHR) and a WKY control. Regional analysis revealed that the lateral free wall in the SHR subjects showed the greatest deformation compared with the other wall segments. This work indicates that warping can accurately predict the strain distributions during diastole from the analysis of microPET data sets.
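A minimal sketch of converting an estimated displacement field into Green-Lagrange strains, the class of strain measures validated above; the 2-D displacement field is a synthetic uniform stretch, not warping output.

```python
import numpy as np

def green_lagrange(ux, uy, dx=1.0):
    """Green-Lagrange strain components from 2-D displacement fields."""
    duy_dy, duy_dx = np.gradient(uy, dx)
    dux_dy, dux_dx = np.gradient(ux, dx)
    # Deformation gradient F = I + grad(u);  E = 0.5 (F^T F - I)
    Exx = dux_dx + 0.5 * (dux_dx**2 + duy_dx**2)
    Eyy = duy_dy + 0.5 * (dux_dy**2 + duy_dy**2)
    Exy = 0.5 * (dux_dy + duy_dx + dux_dx * dux_dy + duy_dx * duy_dy)
    return Exx, Eyy, Exy

# Hypothetical 5% uniform stretch in x: u_x = 0.05 * X.
X, Y = np.meshgrid(np.arange(64.0), np.arange(64.0))
Exx, Eyy, Exy = green_lagrange(0.05 * X, np.zeros_like(Y))
print(Exx.mean())   # ~0.05 + 0.5 * 0.05**2 = 0.05125
```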


Subjects
Diastole/physiology; Elasticity Imaging Techniques/methods; Hypertension/diagnostic imaging; Hypertension/physiopathology; Positron-Emission Tomography/methods; Ventricular Dysfunction, Left/diagnostic imaging; Ventricular Dysfunction, Left/physiopathology; Algorithms; Animals; Anisotropy; Biotechnology/methods; Computer Simulation; Elasticity; Hypertension/complications; Image Interpretation, Computer-Assisted/methods; Models, Cardiovascular; Rats; Rats, Inbred SHR; Stress, Mechanical; Subtraction Technique; Ventricular Dysfunction, Left/etiology
6.
IEEE Trans Med Imaging ; 25(9): 1172-9, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16967802

ABSTRACT

Medical images in nuclear medicine are commonly represented in three dimensions as a stack of two-dimensional images that are reconstructed from tomographic projections. Although natural and straightforward, this may not be an optimal visual representation for performing various diagnostic tasks. A method for three-dimensional (3-D) tomographic reconstruction is developed using a point cloud image representation. A point cloud is a set of points (nodes) in space, where each node of the point cloud is characterized by its position and intensity. The density of the nodes determines the local resolution allowing for the modeling of different parts of the image with different resolution. The reconstructed volume, which in general could be of any resolution, size, shape, and topology, is represented by a set of nonoverlapping tetrahedra defined by the nodes. The intensity at any point within the volume is defined by linearly interpolating inside a tetrahedron from the values at the four nodes that define the tetrahedron. This approach creates a continuous piecewise linear intensity over the reconstruction domain. The reconstruction provides a distinct multiresolution representation, which is designed to accurately and efficiently represent the 3-D image. The method is applicable to the acquisition of any tomographic geometry, such as parallel-, fan-, and cone-beam; and the reconstruction procedure can also model the physics of the image detection process. An efficient method for evaluating the system projection matrix is presented. The system matrix is used in an iterative algorithm to reconstruct both the intensity and location of the distribution of points in the point cloud. Examples of the reconstruction of projection data generated by computer simulations and projection data experimentally acquired using a Jaszczak cardiac torso phantom are presented. This work creates a framework for voxel-less multiresolution representation of images in nuclear medicine.
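A minimal sketch of the interpolation step described above: the intensity at a point inside a tetrahedron as a barycentric-weighted combination of the four node intensities. The tetrahedron and node values are hypothetical.

```python
import numpy as np

def interp_in_tet(p, verts, node_vals):
    """Linearly interpolate node values at point p inside a tetrahedron."""
    # Solve for barycentric coordinates: [v1-v0, v2-v0, v3-v0] b = p - v0
    T = (verts[1:] - verts[0]).T
    b123 = np.linalg.solve(T, p - verts[0])
    bary = np.concatenate(([1.0 - b123.sum()], b123))
    return bary @ node_vals

verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
vals = np.array([1.0, 2.0, 3.0, 4.0])        # intensities at the 4 nodes
print(interp_in_tet(np.array([0.25, 0.25, 0.25]), verts, vals))  # -> 2.5
```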


Subjects
Algorithms; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Tomography, Emission-Computed/methods; Information Storage and Retrieval/methods; Numerical Analysis, Computer-Assisted; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity; Signal Processing, Computer-Assisted
7.
Phys Med Biol ; 51(16): 4017-29, 2006 Aug 21.
Article in English | MEDLINE | ID: mdl-16885621

ABSTRACT

Detecting cancerous lesions is one major application in emission tomography. In this paper, we study penalized maximum-likelihood image reconstruction for this important clinical task. Compared to analytical reconstruction methods, statistical approaches can improve the image quality by accurately modelling the photon detection process and measurement noise in imaging systems. To explore the full potential of penalized maximum-likelihood image reconstruction for lesion detection, we derived simplified theoretical expressions that allow fast evaluation of the detectability of a random lesion. The theoretical results are used to design the regularization parameters to improve lesion detectability. We conducted computer-based Monte Carlo simulations to compare the proposed penalty function, conventional penalty function, and a penalty function for isotropic point spread function. The lesion detectability is measured by a channelized Hotelling observer. The results show that the proposed penalty function outperforms the other penalty functions for lesion detection. The relative improvement is dependent on the size of the lesion. However, we found that the penalty function optimized for a 5 mm lesion still outperforms the other two penalty functions for detecting a 14 mm lesion. Therefore, it is feasible to use the penalty function designed for small lesions in image reconstruction, because detection of large lesions is relatively easy.
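A generic illustration of penalized maximum-likelihood reconstruction, here a one-step-late (OSL) update with a simple quadratic smoothness penalty; this is not the detectability-optimized penalty derived in the paper, and the system matrix and phantom are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((200, 64))                  # hypothetical system matrix
x_true = np.ones(64); x_true[30:34] = 4.0  # background plus a small "lesion"
y = rng.poisson(A @ x_true)                # noisy projection data

beta, x = 0.1, np.ones(64)
for _ in range(100):                       # OSL penalized-ML iterations
    ratio = y / np.maximum(A @ x, 1e-12)
    grad_pen = np.convolve(x, [-1, 2, -1], mode="same")  # quadratic-penalty gradient
    x *= (A.T @ ratio) / np.maximum(A.sum(axis=0) + beta * grad_pen, 1e-12)

print(x[30:34])                            # recovered lesion intensities
```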


Subjects
Algorithms; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Neoplasms/diagnostic imaging; Tomography, Emission-Computed/methods; Humans; Information Storage and Retrieval/methods; Likelihood Functions; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity; Tomography, Emission-Computed/instrumentation
8.
J Nucl Med ; 47(7): 1187-92, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16818954

ABSTRACT

Currently, 2 types of phantoms (physical and computer generated) are used for testing and comparing tomographic reconstruction methods. Data from physical phantoms include all physical effects associated with the detection of radiation. However, with physical phantoms it is difficult to control the number of detected counts, simulate the dynamics of uptake and washout, or create multiple noise realizations of an acquisition. Computer-generated phantoms can overcome some of the disadvantages of physical phantoms, but simulation of all factors affecting the detection of radiation is extremely complex and in some cases impossible. To overcome the problems with both types of phantoms, we developed a physical and computer-generated hybrid phantom that allows the creation of multiple noise realizations of tomographic datasets of the dynamic uptake governed by kinetic models. METHODS: The method is phantom and camera specific. We applied it to an anthropomorphic torso phantom with a cardiac insert, using a SPECT system with attenuation correction. First, real data were acquired. For each compartment (heart, blood pool, liver, and background) of the physical phantom, large numbers of short tomographic projections were acquired separately for each angle. Sinograms were built from a database of projections by summing the projections of each compartment of the phantom. The amount of activity in each phantom compartment was regulated by the number of added projections. Sinograms corresponding to various projection times, configurations and numbers of detector heads, numbers of noise realizations, numbers of phantom compartments, and compartment-specific time-activity curves in MBq/cm³ were assembled from the database. RESULTS: The acquisition produced a database of 120 projection angles ranging over 360°. For each angle, 300 projections of 0.5 s each were stored in 128 × 128 matrices for easy access. The acquired database was successful in the generation of static and dynamic sinograms for which the myocardial uptake and washout was governed by a compartment kinetic model. CONCLUSION: A method has been developed that allows creation of sinograms of physical phantoms with the capacity to control the number of noise realizations, the level of noise, the dynamics of uptake in the phantom compartments, and the acquisition parameters and acquisition modes.
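A minimal sketch of the sinogram-assembly idea: sum stored short projections per compartment, with the count level set by how many are added. The database here is a random stand-in, reduced to 60 angles of 64 bins to keep the example small (the real database stores 120 angles of 128 × 128 matrices).

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical database: 4 compartments x 300 stored 0.5 s projections
# x 60 angles x 64 bins (all Poisson-count stand-ins).
db = rng.poisson(2.0, size=(4, 300, 60, 64))

def assemble(db, n_per_compartment):
    """Sum the first n stored short projections of each compartment."""
    return sum(db[c, :n].sum(axis=0) for c, n in enumerate(n_per_compartment))

# E.g. heart activity worth 60 x 0.5 s projections, blood pool 20,
# liver 200, background 10 -- the counts set the compartment activities.
sino = assemble(db, n_per_compartment=[60, 20, 200, 10])
print(sino.shape, sino.sum())
```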


Subjects
Phantoms, Imaging; Radiometry/methods; Algorithms; Computers; Data Interpretation, Statistical; Humans; Kinetics; Software; Time Factors; Tomography, Emission-Computed, Single-Photon/methods
9.
IEEE Trans Med Imaging ; 25(5): 640-8, 2006 May.
Article in English | MEDLINE | ID: mdl-16689267

ABSTRACT

Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well.
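A minimal sketch of the final selection step: given bias and variance of the ROI estimate as functions of the regularization parameter, choose the value minimizing the ensemble MSE. The bias and variance models below are illustrative stand-ins, not the paper's theoretical expressions.

```python
import numpy as np

betas = np.logspace(-3, 1, 200)          # candidate regularization parameters
bias = 0.8 * betas / (betas + 0.05)      # smoothing bias grows with beta (stand-in)
var = 0.02 / (betas + 0.01)              # noise variance shrinks with beta (stand-in)
emse = bias**2 + var                     # ensemble mean-squared error

beta_opt = betas[np.argmin(emse)]
print(f"optimal beta ~ {beta_opt:.3g}, minimum EMSE = {emse.min():.4f}")
```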


Subjects
Algorithms; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Pattern Recognition, Automated/methods; Tomography, Emission-Computed/methods; Computer Simulation; Imaging, Three-Dimensional/methods; Information Storage and Retrieval/methods; Likelihood Functions; Models, Biological; Models, Statistical; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity; Tomography, Emission-Computed/instrumentation
10.
Phys Med ; 21 Suppl 1: 60-3, 2006.
Article in English | MEDLINE | ID: mdl-17645996

ABSTRACT

We present a retrospective on the LBNL Positron Emission Mammography (PEM) project, looking back on our design and experiences. The LBNL PEM camera uses detector modules capable of measuring depth of interaction (DOI), placed in 4 detector banks in a rectangular geometry. To build this camera, we had to develop the DOI detector module, LSO etching, a Lumirror-epoxy reflector for the LSO array (to achieve optimal DOI), the photodiode array, a custom IC, a rigid-flex readout board, packaging, DOI calibration, and reconstruction algorithms for the rectangular camera geometry. We discuss the highlights (good and bad) of these developments.

11.
Phys Med Biol ; 50(14): 3297-312, 2005 Jul 21.
Article in English | MEDLINE | ID: mdl-16177510

ABSTRACT

Statistically based iterative image reconstruction methods have been developed for emission tomography. One important component in iterative image reconstruction is the system matrix, which defines the mapping from the image space to the data space. Several groups have demonstrated that an accurate system matrix can improve image quality in both single photon emission computed tomography (SPECT) and positron emission tomography (PET). While iterative methods are amenable to arbitrary and complicated system models, the true system response is never known exactly. In practice, one also has to sacrifice the accuracy of the system model because of limited computing and imaging resources. This paper analyses the effect of errors in the system matrix on iterative image reconstruction methods that are based on the maximum a posteriori principle. We derived an analytical expression for calculating artefacts in a reconstructed image that are caused by errors in the system matrix using the first-order Taylor series approximation. The theoretical expression is used to determine the required minimum accuracy of the system matrix in emission tomography. Computer simulations show that the theoretical results work reasonably well in low-noise situations.
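A minimal numerical illustration of the question studied above: reconstruct noise-free data with a slightly perturbed system matrix and inspect the resulting artefact. The toy matrix, 2% error level, and MLEM iteration are stand-ins for the paper's analytical treatment.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.random((256, 64))                 # "true" system matrix (hypothetical)
x_true = np.ones(64); x_true[20:28] = 3.0
y = A @ x_true                            # noise-free data

A_err = A * (1 + 0.02 * rng.standard_normal(A.shape))  # 2% model error
x = np.ones(64)
for _ in range(200):                      # MLEM with the wrong matrix
    x *= (A_err.T @ (y / np.maximum(A_err @ x, 1e-12))) / A_err.sum(axis=0)

print(np.abs(x - x_true).max())           # artefact magnitude
```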


Subjects
Image Processing, Computer-Assisted; Models, Theoretical; Phantoms, Imaging; Algorithms; Brain/diagnostic imaging; Computer Simulation; Humans; Positron-Emission Tomography; Software
12.
J Nucl Med ; 45(11): 1950-9, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15534068

ABSTRACT

The goals of this investigation were to assess the accuracy of (18)F-fluorodihydrorotenone ((18)F-FDHR) as a new deposited myocardial flow tracer and to compare the results to those for (201)Tl. METHODS: The kinetics of these flow tracers in 22 isolated, erythrocyte- and albumin-perfused rabbit hearts were evaluated over a flow range encountered in patients. The 2 flow tracers plus a vascular reference tracer ((131)I-albumin) were introduced as a bolus through a port just above the aortic cannula. Myocardial extraction, retention, washout, and uptake parameters were computed from the venous outflow curves with the multiple-indicator dilution technique and spectral analysis. RESULTS: The mean ± SD initial extraction fractions for (18)F-FDHR (0.85 ± 0.07) and (201)Tl (0.87 ± 0.05) were not significantly different, although the initial extraction fraction for (18)F-FDHR declined with flow (P < 0.0001), whereas that for (201)Tl did not. The washout of (201)Tl was faster (P < 0.001) and more affected by flow (P < 0.05) than was the washout of (18)F-FDHR. Except for the initial extraction fraction, (18)F-FDHR retention was higher (P < 0.001) and less affected by flow (P < 0.05) than was (201)Tl retention. Reflecting its superior retention, the net uptake of (18)F-FDHR was better correlated with flow than was that of (201)Tl at both 1 and 15 min after tracer introduction (P < 0.0001 for both comparisons). CONCLUSION: The superior correlation of (18)F-FDHR uptake with flow indicates that it is a better flow tracer than (201)Tl in the isolated rabbit heart. Compared with the other currently available positron-emitting flow tracers ((82)Rb, (13)N-ammonia, and (15)O-water), (18)F-FDHR has the potential of providing excellent image resolution without the need for an on-site cyclotron.
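A minimal sketch of an initial extraction fraction computed in the multiple-indicator-dilution spirit: compare the early venous outflow of the diffusible tracer with that of the intravascular reference. The outflow curves below are synthetic, not study data.

```python
import numpy as np

dt = 0.1                                          # s
t = np.arange(0, 60, dt)
h_ref = np.exp(-0.5 * ((t - 10) / 2.5) ** 2)      # reference (albumin) outflow
h_tracer = 0.15 * h_ref                           # diffusible tracer outflow:
                                                  # 85% first-pass extraction (made up)
up = slice(0, int(np.argmax(h_ref)) + 1)          # up-slope through the peak
E = 1.0 - h_tracer[up].sum() / h_ref[up].sum()    # Crone-Renkin style estimate
print(f"initial extraction fraction ~ {E:.2f}")   # ~0.85
```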


Subjects
Coronary Circulation; Coronary Vessels/diagnostic imaging; Coronary Vessels/metabolism; Rotenone/analogs & derivatives; Rotenone/pharmacokinetics; Thallium/pharmacokinetics; Animals; Image Interpretation, Computer-Assisted; In Vitro Techniques; Kinetics; Male; Metabolic Clearance Rate; Positron-Emission Tomography; Rabbits; Radioisotope Dilution Technique
13.
IEEE Trans Med Imaging ; 23(9): 1094-9, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15377118

ABSTRACT

List-mode image reconstruction is attracting renewed attention because it eliminates the storage of empty sinogram bins. However, a single back projection of all lines of response (LORs) is still necessary for the pre-calculation of a sensitivity image. Since the detection sensitivity depends on the object attenuation and detector efficiency, it must be computed for each study. Exact computation of the sensitivity image can be a daunting task for modern scanners with huge numbers of LORs, so a fast approximate calculation may be desirable. In this paper, we analyze the error propagation from the sensitivity image into the reconstructed image. The theoretical analysis is based on the fixed-point condition of the list-mode reconstruction. The nonnegativity constraint is modeled using the Kuhn-Tucker condition. With certain assumptions and the first-order Taylor series approximation, we derive a closed-form expression for the error in the reconstructed image as a function of the error in the sensitivity image. The result shows that the error response is frequency-dependent and provides a simple expression for determining the required accuracy of the sensitivity image calculation. Computer simulations show that the theoretical results are in good agreement with the measured results.
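A minimal sketch showing where the sensitivity image enters a list-mode MLEM update: it is the per-voxel denominator, obtained by back projecting all possible LORs once. The toy system matrix and event stream are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.random((500, 64))                   # rows = all possible LORs (toy system)
x_true = np.ones(64); x_true[40:44] = 5.0
counts = rng.poisson(A @ x_true)            # events detected on each LOR
events = np.repeat(np.arange(500), counts)  # list-mode event stream (LOR indices)

sens = A.sum(axis=0)                        # sensitivity image: backproject all LORs
x = np.ones(64)
for _ in range(50):
    fwd = A[events] @ x                     # forward project each event's LOR
    x *= (A[events].T @ (1.0 / fwd)) / sens
print(x[40:44])
```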


Subjects
Algorithms; Data Interpretation, Statistical; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Computer Simulation; Information Storage and Retrieval/methods; Likelihood Functions; Models, Theoretical; Reproducibility of Results; Sensitivity and Specificity
14.
Am J Physiol Heart Circ Physiol ; 284(2): H654-67, 2003 Feb.
Article in English | MEDLINE | ID: mdl-12388225

ABSTRACT

The purpose of this study was to evaluate flow heterogeneity and impaired reflow during reperfusion after 60-min global no-flow ischemia in the isolated rabbit heart. Radiolabeled microspheres were used to measure relative flow in small left ventricular (LV) segments in five ischemia + reperfused hearts and in five nonischemic controls. Relative flow heterogeneity was expressed as relative dispersion (RD) and computed as standard deviation/mean. In postischemic vs. preischemic hearts, RD was increased for the whole LV (0.92 ± 0.41 vs. 0.37 ± 0.07, P < 0.05) as well as the subendocardium (Endo) and subepicardium considered separately (1.28 ± 0.74 vs. 0.30 ± 0.09 and 0.69 ± 0.22 vs. 0.38 ± 0.08; P < 0.05 for both comparisons, respectively) during early reperfusion. During late reperfusion, the increased RD for the whole LV and Endo remained significant (0.70 ± 0.22 vs. 0.37 ± 0.07 and 1.06 ± 0.55 vs. 0.30 ± 0.09; P < 0.05 for both comparisons, respectively). In addition to the increase in postischemic flow heterogeneity, there were some regions demonstrating severely impaired reflow, indicating that regional ischemia can persist despite restoration of normal global flow. Also, the relationship between regional and global flow was altered by the increased postischemic flow heterogeneity, substantially reducing the significance of measured global LV reflow. These observations emphasize the need to quantify regional flow during reperfusion after sustained no-flow ischemia in the isolated rabbit heart.
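A minimal sketch of the heterogeneity index used above, relative dispersion (RD = SD/mean) over LV segments; the segment flow values are hypothetical.

```python
import numpy as np

def relative_dispersion(segment_flows):
    """RD = sample standard deviation / mean of segment flows."""
    return np.std(segment_flows, ddof=1) / np.mean(segment_flows)

pre = np.array([0.9, 1.1, 1.0, 0.95, 1.05, 1.0])   # pre-ischemic segments
post = np.array([0.2, 1.6, 0.9, 1.8, 0.3, 1.2])    # post-ischemic segments
print(relative_dispersion(pre), relative_dispersion(post))
```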


Subjects
Coronary Circulation; Myocardial Ischemia/physiopathology; Animals; In Vitro Techniques; Male; Microspheres; Myocardial Reperfusion Injury/physiopathology; Rabbits; Ventricular Function, Left
15.
Phys Med Biol ; 47(15): 2759-71, 2002 Aug 07.
Article in English | MEDLINE | ID: mdl-12200937

ABSTRACT

In this paper we present a scatter correction method for a regularized list mode maximum likelihood reconstruction algorithm for the positron emission mammograph (PEM) that is being developed at our laboratory. The scatter events inside the object are modelled as additive Poisson random variables in the forward model of the reconstruction algorithm. The mean scatter sinogram is estimated using a Monte Carlo simulation program. With the assumption that the background activity is nearly uniform, the Monte Carlo scatter simulation only needs to run once for each PEM configuration. This saves computation time. The crystal scatters are modelled as a shift-invariant blurring in image domain because they are more localized. Thus, the useful information in the crystal scatters can be deconvolved in high-resolution reconstructions. The propagation of the noise from the estimated scatter sinogram into the reconstruction is analysed theoretically. The results provide an easy way to calculate the required number of events in the Monte Carlo scatter simulation for a given noise level in the image. The analysis is also applicable to other scatter estimation methods, provided that the covariance of the estimated scatter sinogram is available.
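A minimal sketch of the forward model described above: the mean scatter enters as an additive term, so the MLEM denominator becomes Ax + s̄. The scatter estimate would come from the Monte Carlo simulation; here it is a flat stand-in, and the system matrix is a toy.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.random((300, 64))                      # toy system matrix
x_true = np.ones(64); x_true[10:14] = 4.0
s_bar = np.full(300, 5.0)                      # mean scatter sinogram (stand-in)
y = rng.poisson(A @ x_true + s_bar)            # trues + scatter

x = np.ones(64)
for _ in range(100):                           # MLEM with additive scatter mean
    x *= (A.T @ (y / (A @ x + s_bar))) / A.sum(axis=0)
print(x[10:14])
```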


Subjects
Algorithms; Breast/diagnostic imaging; Image Enhancement/methods; Tomography, Emission-Computed/methods; Computer Simulation; Humans; Likelihood Functions; Monte Carlo Method; Quality Control; Scattering, Radiation; Stochastic Processes
16.
IEEE Trans Med Imaging ; 21(3): 216-25, 2002 Mar.
Article in English | MEDLINE | ID: mdl-11989846

ABSTRACT

Factor analysis is a powerful tool for the analysis of dynamic studies. One of the major drawbacks of factor analysis of dynamic structures (FADS) is that the solution is not mathematically unique when only nonnegativity constraints are used to determine factors and factor coefficients. In this paper, a method to correct for ambiguous FADS solutions has been developed. A nonambiguous solution (to within certain scaling factors) is obtained by constructing and minimizing a new objective function. The most common objective function consists of a least-squares term that, when minimized with nonnegativity constraints, forces agreement between the applied factor model and the measured data. In our method, this objective function is modified by adding a term that penalizes multiple components in the images of the factor coefficients; due to nonuniqueness effects, these coefficient images would otherwise mix more than one physiological component. The technique was tested on computer simulations, an experimental canine cardiac study using 99mTc-teboroxime, and a patient planar 99mTc-MAG3 renal study. The results show that the technique works well in comparison to the truth in computer simulations and to region of interest (ROI) measurements in the experimental studies.
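A minimal sketch of the structure of the modified objective: a nonnegative least-squares data term plus a term penalizing overlap between factor-coefficient images. The overlap penalty below is a simple stand-in for the paper's uniqueness term, and the data are synthetic.

```python
import numpy as np

def fads_objective(A, C, F, mu):
    """A: (pixels, time) data; C: (pixels, k) coefficients; F: (k, time) factors."""
    fit = np.sum((A - C @ F) ** 2)             # least-squares data term
    # Penalize pixels shared by several factors: sum of pairwise column products.
    overlap = sum(np.sum(C[:, i] * C[:, j])
                  for i in range(C.shape[1]) for j in range(i + 1, C.shape[1]))
    return fit + mu * overlap

# Hypothetical two-factor dynamic study: 100 pixels x 20 time frames.
rng = np.random.default_rng(7)
C = np.abs(rng.standard_normal((100, 2)))      # nonnegative coefficients
F = np.abs(rng.standard_normal((2, 20)))       # nonnegative factor curves
A = C @ F + 0.01 * rng.standard_normal((100, 20))
print(fads_objective(A, C, F, mu=0.1))
```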


Subjects
Algorithms; Computer Simulation; Image Enhancement/methods; Least-Squares Analysis; Linear Models; Animals; Dogs; Heart/diagnostic imaging; Humans; Kidney/diagnostic imaging; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity; Stochastic Processes; Tomography, Emission-Computed, Single-Photon/instrumentation; Tomography, Emission-Computed, Single-Photon/methods
17.
J Nucl Cardiol ; 9(2): 197-205, 2002.
Article in English | MEDLINE | ID: mdl-11986565

ABSTRACT

BACKGROUND: One of the major problems associated with technetium 99m teboroxime cardiac imaging is the high concentration of activity in the liver. In some cases it is impossible to diagnose defects on the inferior wall because of the finite resolution and scatter that cause images of the inferior wall and the liver to overlap. METHODS AND RESULTS: The least-squares factor analysis of dynamic structures method, with correction for non-unique solutions, was used to remove the liver activity from the image. The method was applied to dynamically acquired Tc-99m teboroxime data. The liver activity removal method was tested through use of computer simulations and tomographically acquired canine and patient cardiac studies. In all studies the least-squares factor analysis of dynamic structures method was able to extract the liver activity from the series of dynamic images, thereby making it possible to remove it quantitatively from the entire series. The method was used successfully to remove the liver activity that partially overlapped the inferior wall in normal hearts. The method tends to increase the contrast between defects and normal myocardial tissue in abnormal hearts. CONCLUSIONS: The method presented can be used to assist in diagnosis of cardiac disease when dynamically acquired teboroxime data are used. Because the contrast between the defect and normal myocardial tissue can be changed, the processed image cannot be used by itself to make an accurate diagnosis. However, with the liver activity removed, the image provides additional information that is very useful in the imaging of patients whose liver activity overlaps the inferior heart wall.


Subjects
Factor Analysis, Statistical; Liver/metabolism; Myocardial Infarction/diagnostic imaging; Organotechnetium Compounds; Oximes; Tomography, Emission-Computed, Single-Photon/methods; Animals; Computer Simulation; Dogs; Humans; Least-Squares Analysis; Myocardium/metabolism; Phantoms, Imaging
18.
Med Image Anal ; 6(1): 29-46, 2002 Mar.
Article in English | MEDLINE | ID: mdl-11836133

ABSTRACT

A four-dimensional deformable motion algorithm is described for use in the motion compensation of gated cardiac positron emission tomography. The algorithm makes use of temporal continuity and a non-uniform elastic material model to provide improved estimates of heart motion between time frames. Temporal continuity is utilized in two ways. First, incremental motion fields between adjacent time frames are calculated to improve estimation of long-range motion between distant time frames. Second, a consistency criterion is used to ensure that the image match between distant time frames is consistent with the deformations used to match adjacent time frames. The consistency requirement augments the algorithm's ability to estimate motion between noisy time frames, and the concatenated incremental motion fields improve estimation for large deformations. The estimated motion fields are used to establish a voxel correspondence between volumes and to produce a motion-compensated composite volume.
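A minimal sketch of concatenating incremental motion fields between adjacent time frames into a long-range field, by composing displacement fields; the fields here are small synthetic shifts, and scipy's map_coordinates does the resampling.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def compose(u01, u12):
    """Compose displacement fields (2, H, W): u02(x) = u01(x) + u12(x + u01(x))."""
    H, W = u01.shape[1:]
    gy, gx = np.mgrid[0:H, 0:W].astype(float)
    coords = np.stack([gy + u01[0], gx + u01[1]])   # sample u12 at x + u01(x)
    u12_warped = np.stack([map_coordinates(u12[c], coords, order=1, mode="nearest")
                           for c in range(2)])
    return u01 + u12_warped

u01 = np.full((2, 16, 16), 0.5)     # half-pixel shift, frame 0 -> 1
u12 = np.full((2, 16, 16), 0.5)     # half-pixel shift, frame 1 -> 2
u02 = compose(u01, u12)
print(u02.mean(axis=(1, 2)))        # ~[1.0, 1.0]: one-pixel total shift
```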


Subjects
Heart/diagnostic imaging; Image Processing, Computer-Assisted/methods; Tomography, Emission-Computed/methods; Algorithms; Humans; Image Processing, Computer-Assisted/instrumentation; Models, Theoretical; Motion; Phantoms, Imaging; Sensitivity and Specificity