Results 1 - 20 of 39
1.
Med Biol Eng Comput ; 48(5): 483-8, 2010 May.
Article in English | MEDLINE | ID: mdl-20127523

ABSTRACT

This work presents a spatial filtering method for the estimation of atrial fibrillation activity in the cutaneous electrocardiogram. A linear extraction filter is obtained by maximising the extractor output power on the significant spectral support of the signal of interest. An iterative procedure based on a quasi-maximum likelihood estimator is proposed to jointly estimate the significant spectral support and the extraction filter. Compared with a previously proposed spatio-temporal blind source separation method, our approach yields an improved atrial activity signal estimate as quantified by a higher spectral concentration of the extractor output. The proposed methodology can readily be adapted to signal extraction problems in other application domains.
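
Read as an optimization problem, the extraction step can be pictured as a generalized eigenvalue problem between a band-limited and a broadband covariance matrix; the paper's iterative quasi-maximum-likelihood refinement of the spectral support is not shown. An illustrative Python sketch under that reading (the function name, the fixed 3-9 Hz AF band and the single-pass estimate are assumptions, not the authors' implementation):

```python
import numpy as np

def band_power_filter(X, fs, band=(3.0, 9.0)):
    """Toy sketch: spatial filter w maximizing output power inside `band`
    relative to total output power (a generalized eigenvalue problem).
    X: channels x samples ECG matrix; band in Hz is an assumed AF range."""
    n_ch, n_s = X.shape
    F = np.fft.rfft(X, axis=1)
    freqs = np.fft.rfftfreq(n_s, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    R_band = (F[:, sel] @ F[:, sel].conj().T).real / n_s  # band-limited covariance
    R_all = (F @ F.conj().T).real / n_s                   # broadband covariance
    evals, evecs = np.linalg.eig(np.linalg.solve(R_all, R_band))
    w = np.real(evecs[:, np.argmax(evals.real)])          # leading eigenvector
    return w, w @ X                                       # filter and source estimate
```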


Subjects
Atrial Fibrillation/diagnosis; Signal Processing, Computer-Assisted; Algorithms; Atrial Fibrillation/physiopathology; Electrocardiography/methods; Heart Atria/physiopathology; Humans
2.
Phys Med Biol ; 54(20): 6079-93, 2009 Oct 21.
Article in English | MEDLINE | ID: mdl-19779215

ABSTRACT

EEG source analysis is a valuable tool for brain functionality research and for diagnosing neurological disorders, such as epilepsy. It requires a geometrical representation of the human head, or head model, which is often modeled as an isotropic conductor. However, some head tissues, such as the skull and the white matter, are known to have an anisotropic conductivity. Many studies have reported that these anisotropic conductivities influence the calculated electrode potentials, but few have assessed their influence on dipole estimation. In this study, we determine the dipole estimation errors incurred by not taking into account the anisotropic conductivities of the skull and/or brain tissues. Head models are therefore constructed with the same geometry, but with an anisotropically conducting skull and/or brain tissue compartment. These head models are used in simulation studies in which the dipole location and orientation errors caused by neglecting the anisotropic conductivities of the skull and brain tissues are calculated. Results show that not taking into account the anisotropic conductivities of the skull yields a dipole location error between 2 and 25 mm, with an average of 10 mm. When the anisotropic conductivities of the brain tissues are neglected, the dipole location error ranges between 0 and 5 mm, with an average of 2.3 mm. In all simulations, the dipole orientation error was smaller than 10 degrees. We conclude that the anisotropic conductivities of the skull have to be incorporated to improve the accuracy of EEG source analysis. The simulation results also suggest that incorporating the anisotropic conductivities of brain tissues is not necessary, although more studies are needed to confirm this.


Subjects
Electroencephalography/methods; Head/physiology; Skull/pathology; Algorithms; Anisotropy; Brain/pathology; Computer Simulation; Electrodes; Humans; Models, Neurological; Nervous System Diseases/diagnosis; Nervous System Diseases/pathology; Reproducibility of Results; Signal Processing, Computer-Assisted; Skull/anatomy & histology
3.
Eur J Nucl Med Mol Imaging ; 36(12): 1994-2001, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19526237

ABSTRACT

PURPOSE: The aim of this study is to optimize different parameters in the time-of-flight (TOF) reconstruction for the Philips GEMINI TF. The use of TOF in iterative reconstruction introduces additional variables to be optimized compared to conventional PET reconstruction. The parameters studied are the TOF kernel width, the kernel truncation (used to reduce reconstruction time) and the scatter correction method. METHODS: These parameters are optimized using measured phantom studies. All phantom studies were acquired with a very high number of counts to limit the effects of noise. A high number of iterations (33 subsets and 3 iterations) was used to reach convergence. The figures of merit are the uniformity in the background, the cold spot recovery and the hot spot contrast. As reference results we used the non-TOF reconstruction of the same data sets. RESULTS: It is shown that contrast recovery loss can only be avoided if the kernel is extended to more than 3 standard deviations. To obtain uniform reconstructions, the recommended scatter correction is TOF single scatter simulation (SSS), which also improves cold spot recovery and hot spot contrast. While the daily measurements of the system show a timing resolution in the range of 590-600 ps, the optimal reconstructions are obtained with a TOF kernel full-width at half-maximum (FWHM) of 650-700 ps. The kernel width seems to be less critical for the recovered contrast but has an important effect on background uniformity: using narrower or wider kernels results in a less uniform background and reduced hot and cold contrast recovery. CONCLUSION: The parameters studied have a large effect on the quantitative accuracy of the reconstructed images. The optimal settings from this study can be used as a guideline for an objective comparison of the gains obtained with TOF PET versus conventional PET reconstruction.
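
A quick back-of-envelope check (not from the paper) shows why truncating a Gaussian TOF kernel below 3 standard deviations costs contrast recovery: the discarded tail mass stops being negligible. A minimal sketch, assuming a Gaussian kernel and the 650 ps FWHM quoted above:

```python
from math import erf, sqrt

def retained_fraction(k_sigma):
    """Fraction of a Gaussian kernel's mass within +/- k_sigma."""
    return erf(k_sigma / sqrt(2.0))

fwhm_ps = 650.0               # optimal TOF kernel FWHM from the study
sigma_ps = fwhm_ps / 2.3548   # FWHM = 2*sqrt(2*ln(2))*sigma
for k in (1, 2, 3, 4):
    print(f"+/-{k} sigma ({k * sigma_ps:.0f} ps): "
          f"{100 * retained_fraction(k):.2f}% of kernel mass retained")
```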


Subjects
Image Processing, Computer-Assisted/methods; Positron-Emission Tomography/methods; Humans; Image Processing, Computer-Assisted/instrumentation; Positron-Emission Tomography/instrumentation; Scattering, Radiation; Time Factors
4.
Med Phys ; 36(4): 1053-60, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19472610

ABSTRACT

The GEANT4 application for tomographic emission (GATE) is one of the most detailed Monte Carlo simulation tools for SPECT and PET. It allows for realistic phantoms, complex decay schemes, and a large variety of detector geometries. However, only a fraction of the information in each particle history is available for postprocessing. In order to extend the analysis capabilities of GATE, a flexible framework was developed. This framework allows all detected events to be subdivided according to their type: in PET, true coincidences are separated from all others; in SPECT, geometrically collimated photons are separated from all others. The authors' framework can be applied to any isotope, phantom, and detector geometry available in GATE. It is designed to enhance the usability of GATE for the study of contamination and for the investigation of the properties of current and future prototype detectors. The authors apply the framework to a case study of Bexxar, first assuming labeling with 124I, then with 131I. It is shown that with 124I PET, results with an optimized window improve upon those with the standard window but achieve less than half of the ideal improvement. Nevertheless, 124I PET shows improved resolution compared to 131I SPECT with triple-energy-window scatter correction.


Subjects
Positron-Emission Tomography/methods; Tomography, Emission-Computed, Single-Photon/methods; Computer Simulation; Humans; Iodine Radioisotopes/chemistry; Kidney/diagnostic imaging; Monte Carlo Method; Phantoms, Imaging; Photons; Physics/methods; Positron-Emission Tomography/instrumentation; Radioisotopes/chemistry; Radiometry/methods; Scattering, Radiation; Software; Thorax/metabolism; Tomography, Emission-Computed, Single-Photon/instrumentation
5.
IEEE Trans Biomed Eng ; 56(3): 706-17, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19272875

ABSTRACT

Genetic absence epilepsy rats from Strasbourg are a strain of Wistar rats in which all animals exhibit spontaneous spike and wave discharges (SWDs) in the EEG. In this paper, we propose a novel method for the detection of SWDs, based on the key observation that SWDs are quasi-periodic signals. The method consists of the following steps: 1) calculation of the spectrogram; 2) estimation of the background spectrum and detection of stimulation artifacts; 3) harmonic analysis with continuity analysis to estimate the fundamental frequency; and 4) classification based on the ratio of the power in the harmonics to the total power of the spectrum. We evaluated the performance of the novel detection method and six SWD/seizure detection methods from the literature on a large database of labeled EEG data consisting of two datasets with a total duration of more than 26 days of recording. The method outperforms all tested SWD/seizure detection methods, showing a sensitivity and selectivity of 96% and 97%, respectively, on the first test set, and of 94% and 92%, respectively, on the second test set. The detection performance is less satisfactory (as for all other methods) for EEG fragments showing more irregular and less periodic SWDs.
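
Steps 3 and 4 reduce to a single scalar score per analysis window. A minimal sketch of such a score, assuming Welch spectra, a 5-12 Hz search range for the fundamental and a +/-0.5 Hz harmonic tolerance (all hypothetical choices, not the authors' exact settings):

```python
import numpy as np
from scipy import signal

def swd_score(eeg, fs, f0_range=(5.0, 12.0), n_harm=4):
    """Sketch of the harmonic-power criterion: fraction of spectral power
    that lies in the fundamental frequency and its harmonics."""
    f, pxx = signal.welch(eeg, fs=fs, nperseg=int(fs))
    # Simplified step 3: take the fundamental as the strongest peak
    # inside a plausible SWD frequency range.
    mask = (f >= f0_range[0]) & (f <= f0_range[1])
    f0 = f[mask][np.argmax(pxx[mask])]
    # Step 4: power in the harmonics of f0 relative to total power.
    harm_power = 0.0
    for k in range(1, n_harm + 1):
        sel = np.abs(f - k * f0) <= 0.5   # +/-0.5 Hz tolerance (assumed)
        harm_power += pxx[sel].sum()
    return harm_power / pxx.sum()         # threshold this ratio to detect
```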


Subjects
Algorithms; Electroencephalography; Epilepsy, Absence/physiopathology; Pattern Recognition, Automated/methods; Signal Processing, Computer-Assisted; Animals; Artifacts; Disease Models, Animal; Male; Rats; Rats, Wistar; Sensitivity and Specificity
6.
Phys Med Biol ; 54(6): 1673-89, 2009 Mar 21.
Article in English | MEDLINE | ID: mdl-19242052

ABSTRACT

The simultaneous recording of the electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI) can give new insights into how the brain functions. However, the strong electromagnetic field of the MR scanner generates artifacts that obscure the EEG and diminish its readability. Among them, the ballistocardiographic artifact (BCGa) that appears on the EEG is believed to be related to blood flow in scalp arteries leading to electrode movements. Average artifact subtraction (AAS) techniques, used to remove the BCGa, assume a deterministic nature of the artifact. This assumption may be too strong, considering the blood-flow-related nature of the phenomenon. In this work we propose a new method, based on canonical correlation analysis (CCA) and blind source separation (BSS) techniques, to reduce the BCGa in simultaneously recorded EEG-fMRI. We optimized the method to reduce the user's interaction to a minimum. When tested on six subjects, recorded at 1.5 T or 3 T, the average artifacts extracted with BSS-CCA and AAS did not show significant differences, indicating the absence of systematic errors. On the other hand, when compared on the basis of intra-subject variability, we found significant differences and better performance of the proposed method with respect to AAS. We demonstrate that our method deals with the intrinsic subject variability of the artifact that may cause averaging techniques to fail.
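
In CCA-based BSS, sources are estimated that are maximally correlated with a delayed copy of the data, which orders them by autocorrelation; artifact-related components can then be identified and removed before back-projection. A minimal dense sketch of that decomposition (the one-sample delay and the omitted component-selection step are assumptions, not the authors' exact procedure):

```python
import numpy as np

def bss_cca(X, delay=1):
    """Sketch of BSS via canonical correlation analysis: estimate sources
    maximally autocorrelated at `delay`. X: channels x samples, zero-mean."""
    Y, Z = X[:, :-delay], X[:, delay:]
    Cyy, Czz, Cyz = Y @ Y.T, Z @ Z.T, Y @ Z.T
    # CCA generalized eigenproblem: Cyy^-1 Cyz Czz^-1 Czy w = rho^2 w
    M = np.linalg.solve(Cyy, Cyz) @ np.linalg.solve(Czz, Cyz.T)
    evals, W = np.linalg.eig(M)
    order = np.argsort(-evals.real)      # strongest autocorrelation first
    W = np.real(W[:, order])
    return W, W.T @ X                    # unmixing matrix and sources

# Components whose time courses match the cardiac-locked BCG artifact would
# be zeroed and the rest back-projected with the inverse of W.T (not shown).
```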


Subjects
Artifacts; Ballistocardiography; Electroencephalography/methods; Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging/methods; Humans
7.
Phys Med Biol ; 54(3): 715-29, 2009 Feb 07.
Article in English | MEDLINE | ID: mdl-19131666

ABSTRACT

As an alternative to traditional parallel hole collimators, SPECT imaging can be performed using rotating slat collimators. While maintaining the spatial resolution, a gain in image quality could be expected from the higher photon collection efficiency of this type of collimator. However, using iterative methods for fully three-dimensional (3D) reconstruction is computationally much more expensive and converges more slowly than a classical SPECT reconstruction. It has been proposed to do 3D reconstruction by splitting the system matrix into two separate matrices, forcing the reconstruction to first estimate the sinograms from the rotating slat SPECT data before estimating the image. While alleviating the computational load by one order of magnitude, this split matrix approach results in fast computation of the projections in an iterative algorithm but does not solve the problem of slow convergence. There is thus a need for an algorithm which speeds up convergence while maintaining image quality for rotating slat collimated SPECT cameras. We therefore developed a reconstruction algorithm, based on the split matrix approach, which allows both fast calculation of the forward and backward projections and fast convergence. In this work, an algorithm of the maximum likelihood expectation maximization (MLEM) type, obtained from a split system matrix MLEM reconstruction, is proposed as a reconstruction method for rotating slat collimated SPECT data. We compare this new algorithm to the conventional split system matrix MLEM method and to a gold standard fully 3D MLEM reconstruction algorithm on the basis of computational load, convergence and contrast-to-noise. Furthermore, ordered subsets expectation maximization (OSEM) implementations of these three algorithms are compared. Calculations of computational load and convergence show speedup factors for the new method of 38 and 426 compared to the split matrix MLEM and the fully 3D MLEM, respectively, and of 16 and 21 compared to the split matrix OSEM and the fully 3D OSEM, respectively. A contrast-to-noise study based on simulated data shows that the new approach has accuracy comparable to that of the fully 3D reconstruction method. The algorithm developed in this study allows iterative image reconstruction of rotating slat collimated SPECT data with equal image quality in a computation time comparable to that of a classical SPECT reconstruction.
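
For reference, the baseline split-matrix idea: the system matrix factors as H = H2 @ H1 (image to sinogram, sinogram to slat data), and both the forward and backward steps of the MLEM update are applied in two stages. A dense toy sketch of that baseline update (not the accelerated algorithm proposed in the paper):

```python
import numpy as np

def split_mlem(y, H1, H2, n_iter=50, eps=1e-12):
    """Toy MLEM with a split system matrix H = H2 @ H1.
    y: measured slat data; H1: image -> sinogram; H2: sinogram -> data."""
    x = np.ones(H1.shape[1])                 # flat initial image
    sens = H1.T @ (H2.T @ np.ones(len(y)))   # sensitivity image
    for _ in range(n_iter):
        y_est = H2 @ (H1 @ x)                # two-stage forward projection
        ratio = y / np.maximum(y_est, eps)
        x *= (H1.T @ (H2.T @ ratio)) / np.maximum(sens, eps)
    return x
```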


Subjects
Algorithms; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Information Storage and Retrieval/methods; Tomography, Emission-Computed, Single-Photon/instrumentation; Tomography, Emission-Computed, Single-Photon/methods; Equipment Design; Equipment Failure Analysis; Humans; Phantoms, Imaging; Reproducibility of Results; Sensitivity and Specificity
8.
Phys Med Biol ; 53(19): 5405-19, 2008 Oct 07.
Article in English | MEDLINE | ID: mdl-18765890

ABSTRACT

Diffusion weighted magnetic resonance imaging offers a non-invasive tool to explore the three-dimensional structure of brain white matter in clinical practice. Anisotropic diffusion hardware phantoms are useful for the quantitative validation of this technique. This study provides guidelines on how to manufacture anisotropic fibre phantoms in a reproducible way and which fibre material to choose to obtain good quality diffusion weighted images. Several fibre materials are compared regarding their effect on the diffusion MR measurements of the water molecules inside the phantoms. The material properties that influence diffusion anisotropy are the fibre density and diameter, while the fibre surface relaxivity and magnetic susceptibility determine the signal-to-noise ratio. The effect on the T2 relaxation time of water in the phantoms has been modelled, and the diffusion behaviour inside the fibre phantoms has been quantitatively evaluated using Monte Carlo random walk simulations.


Subjects
Diffusion Magnetic Resonance Imaging/methods; Diffusion; Phantoms, Imaging; Anisotropy; Diffusion Magnetic Resonance Imaging/instrumentation; Magnetics; Reproducibility of Results; Surface Properties; Time Factors; Water/chemistry
9.
IEEE Trans Med Imaging ; 27(7): 943-59, 2008.
Article in English | MEDLINE | ID: mdl-18599400

ABSTRACT

Tomographic reconstruction from positron emission tomography (PET) data is an ill-posed problem that requires regularization. An attractive approach is to impose an l1-regularization constraint, which favors sparse solutions in the wavelet domain. This can be achieved quite efficiently thanks to the iterative algorithm developed by Daubechies et al., 2004. In this paper, we apply this technique and extend it for the reconstruction of dynamic (spatio-temporal) PET data. Moreover, instead of using classical wavelets in the temporal dimension, we introduce exponential-spline wavelets (E-spline wavelets) that are specially tailored to model time activity curves (TACs) in PET. We show that the exponential-spline wavelets naturally arise from the compartmental description of the dynamics of the tracer distribution. We address the issue of the selection of the "optimal" E-spline parameters (poles and zeros) and we investigate their effect on reconstruction quality. We demonstrate the usefulness of spatio-temporal regularization and the superior performance of E-spline wavelets over conventional Battle-Lemarié wavelets in a series of experiments: the 1-D fitting of TACs, and the tomographic reconstruction of both simulated and clinical data. We find that the E-spline wavelets outperform the conventional wavelets in terms of the reconstructed signal-to-noise ratio (SNR) and the sparsity of the wavelet coefficients. Based on our simulations, we conclude that replacing the conventional wavelets with E-spline wavelets leads to equal reconstruction quality for a 40% reduction of detected coincidences, meaning an improved image quality for the same number of counts or, equivalently, a reduced exposure to the patient for the same image quality.
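
The Daubechies-type algorithm is iterative soft-thresholding on the wavelet coefficients. A dense toy sketch of that generic scheme, with an arbitrary orthonormal synthesis matrix W standing in for the paper's E-spline wavelet transform (the names and step-size choice are illustrative assumptions):

```python
import numpy as np

def soft(c, t):
    """Soft-thresholding: the proximal step for the l1 penalty."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def ista(A, W, y, lam, n_iter=200):
    """Minimize ||y - A @ W @ c||^2 + lam * ||c||_1 over coefficients c.
    A: system matrix; W: orthonormal wavelet synthesis matrix (toy stand-in
    for the E-spline transform). Returns the reconstructed image W @ c."""
    AW = A @ W
    L = np.linalg.norm(AW, 2) ** 2             # squared spectral norm
    c = np.zeros(AW.shape[1])
    for _ in range(n_iter):
        grad = AW.T @ (AW @ c - y)             # half the true gradient, so
        c = soft(c - grad / L, lam / (2 * L))  # step 1/(2L) on the objective
    return W @ c
```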


Subjects
Imaging, Three-Dimensional/methods; Positron-Emission Tomography/methods; Signal Processing, Computer-Assisted; Algorithms; Artificial Intelligence; Cluster Analysis; Computer Simulation; Data Interpretation, Statistical; Feedback; Fourier Analysis; Humans; Information Storage and Retrieval/methods; Kidney/diagnostic imaging; Liver/diagnostic imaging; Pattern Recognition, Automated; Time Factors
10.
Clin Neurophysiol ; 119(8): 1756-1770, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18499517

ABSTRACT

OBJECTIVE: Methods for the detection of epileptiform events can be broadly divided into two main categories: temporal detection methods that exploit the EEG's temporal characteristics, and spatial detection methods that base detection on the results of an implicit or explicit source analysis. We describe how the framework of a spatial detection method was extended with temporal information to improve its performance. This results in a method that provides (i) automated localization of an epileptogenic focus and (ii) detection of focal epileptiform events in an EEG recording. For the detection, only one threshold value needs to be set. METHODS: The method comprises five consecutive steps: (1) dipole source analysis in a moving window, (2) automatic selection of focal brain activity, (3) dipole clustering to arrive at the identification of the epileptiform cluster, (4) derivation of a spatio-temporal template of the epileptiform activity, and (5) template matching. Routine EEG recordings from eight paediatric patients with focal epilepsy were labelled independently by two experts. The method was evaluated in terms of (i) ability to identify the epileptic focus, (ii) validity of the derived template, and (iii) detection performance. The clustering performance was evaluated using leave-one-out cross-validation. Detection performance was evaluated using precision-recall curves and compared to the performance of two temporal (mimetic and wavelet based) and one spatial (dipole analysis based) detection method. RESULTS: The method succeeded in identifying the epileptogenic focus in seven of the eight recordings. For these recordings, the mean distance between the epileptic focus estimated by the method and the region indicated by the labelling of the experts was 8 mm. Except for two EEG recordings where the dipole clustering step failed, the derived template corresponded to the epileptiform activity marked by the experts. Over the eight EEGs, the method showed a mean sensitivity and selectivity of 92% and 77%, respectively. CONCLUSIONS: The method allows automated localization of the epileptogenic focus and shows good agreement with the region indicated by the labelling of the experts. If the dipole clustering step is successful, the method allows detection of the focal epileptiform events, with a detection performance comparable to or better than that of the other methods. SIGNIFICANCE: The identification and quantification of epileptiform events is of considerable importance in the diagnosis of epilepsy. Our method allows the automatic identification of the epileptic focus, which is of value in epilepsy surgery. The method can also be used as an offline exploration tool for focal EEG activity, displaying the dipole clusters and corresponding time series.
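
Step (5) is the only step that requires a threshold. A minimal sketch of spatio-temporal template matching by normalized correlation (the sliding-window handling and threshold are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def template_detections(eeg, template, threshold):
    """Slide a spatio-temporal template over multichannel EEG and flag
    windows whose normalized correlation exceeds a single threshold.
    eeg: channels x samples; template: channels x width."""
    n_ch, width = template.shape
    t = (template - template.mean()) / template.std()
    scores = []
    for start in range(eeg.shape[1] - width + 1):
        win = eeg[:, start:start + width]
        w = (win - win.mean()) / (win.std() + 1e-12)
        scores.append((t * w).mean())            # correlation coefficient
    scores = np.asarray(scores)
    return np.flatnonzero(scores > threshold)    # candidate event onsets
```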


Subjects
Brain Mapping; Brain/physiopathology; Electroencephalography; Epilepsies, Partial/physiopathology; Algorithms; Child; Child, Preschool; Cluster Analysis; Electrodes; Epilepsies, Partial/pathology; Female; Humans; Male; Reproducibility of Results; Signal Processing, Computer-Assisted; Time Factors
11.
Med Phys ; 35(4): 1476-85, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18491542

ABSTRACT

Geant4 application for tomographic emission (GATE) is a geometry and tracking 4 (Geant4) application toolkit for accurate simulation of positron emission tomography and single photon emission computed tomography (SPECT) scanners. GATE simulations with realistic count levels are very CPU-intensive, taking up to several days on single-CPU computers. We therefore implemented both standard forced detection (FD) and convolution-based forced detection (CFD) with multiple projection sampling, which allows the simulation of all projections simultaneously in GATE. In addition, an FD- and CFD-specialized Geant4 navigator was developed to overcome the detailed but slow tracking algorithms in Geant4. This article focuses on the implementation and validation of these developments. The results show good agreement between the FD and CFD simulations and analog GATE simulations for Tc-99m SPECT. These combined developments accelerate GATE by three orders of magnitude in the case of FD; CFD is an additional two orders of magnitude faster than FD. This renders realistic simulations feasible within minutes on a single CPU. Future work will extend our framework to higher energy isotopes, which will require the inclusion of a septal penetration and collimator scatter model.


Subjects
Algorithms; Image Interpretation, Computer-Assisted/methods; Software; Tomography, Emission-Computed, Single-Photon/methods; Image Enhancement/methods; Reproducibility of Results; Sensitivity and Specificity; Time Factors
12.
Phys Med Biol ; 53(7): 1989-2002, 2008 Apr 07.
Article in English | MEDLINE | ID: mdl-18356576

ABSTRACT

The main remaining challenge for a gamma camera is to overcome the existing trade-off between collimator spatial resolution and system sensitivity. This problem, which strongly limits the performance of parallel hole collimated gamma cameras, can be overcome by applying new collimator designs such as rotating slat (RS) collimators, which have a much higher photon collection efficiency. The drawback of an RS collimated gamma camera is that image reconstruction is needed even to obtain planar images, resulting in noise accumulation. However, nowadays iterative reconstruction techniques with accurate system modeling can provide better image quality. Because the impact of this modeling on image quality differs from one system to another, an objective assessment of the image quality obtained with an RS collimator is needed in comparison to classical projection images obtained using a parallel hole (PH) collimator. In this paper, a comparative study of image quality, achieved with system modeling, is presented. RS data are reconstructed to planar images using maximum likelihood expectation maximization (MLEM) with an accurate Monte Carlo derived system matrix, while PH projections are deconvolved using a Monte Carlo derived point-spread function. Contrast-to-noise characteristics are used to show image quality for cold and hot spots of varying size. The influence of object size and contrast is investigated using the optimal contrast-to-noise ratio (CNR(o)). For a typical phantom setup, results show that cold spot imaging is slightly better for a PH collimator. For hot spot imaging, the CNR(o) of the RS images is found to increase with increasing lesion diameter and lesion contrast, while it decreases when background dimensions become larger. Only for very large background dimensions combined with low-contrast lesions could the use of a PH collimator be beneficial for hot spot imaging; in all other cases, the RS collimator scores better. Finally, the simulation of a planar bone scan on an RS collimator revealed a hot spot contrast improvement of up to 54% compared to a classical PH bone scan.


Subjects
Gamma Cameras; Image Interpretation, Computer-Assisted/methods; Algorithms; Computers; Humans; Image Processing, Computer-Assisted; Likelihood Functions; Models, Statistical; Models, Theoretical; Monte Carlo Method; Neoplasm Metastasis; Neoplasms/pathology; Phantoms, Imaging; Software; Tomography, X-Ray Computed/methods
13.
Phys Med Biol ; 53(7): 1877-94, 2008 Apr 07.
Article in English | MEDLINE | ID: mdl-18364544

ABSTRACT

To improve EEG source localization in the brain, the conductivities used in the head model play a very important role. In this study, we focus on modeling the anisotropic conductivity of the white matter. The anisotropic conductivity profile can be derived from diffusion weighted magnetic resonance images (DW-MRI). However, deriving these anisotropic conductivities from diffusion weighted MR images of the white matter is not straightforward. In the literature, two methods can be found for calculating the conductivity from the diffusion weighted images: one uses a fixed value for the ratio of the conductivity in different directions, while the other uses a conductivity profile obtained from a linear scaling of the diffusion ellipsoid. We propose a model which can be used to derive the conductivity profile from the diffusion tensor images. This model is based on a variable anisotropy ratio throughout the white matter and combines the linear relationship stated in the literature with a constraint on the magnitude of the conductivity tensor (also known as the volume constraint). This approach is referred to in the paper as approach A. We investigate the dipole estimation differences that arise when a more simplified model for white matter anisotropy (approach B) is used, while the electrode potentials are derived using a head model with a more realistic approach for the white matter anisotropy (approach A). We used a realistic head model, in which the forward problem was solved using a finite difference method that can incorporate anisotropic conductivities. As error measures we considered the dipole location error and the dipole orientation error. The results show that the dipole location errors are all below 10 mm, with an average of 4 mm in gray matter regions. The dipole orientation errors ranged up to 66.4 degrees, with an average of 11.6 degrees in gray matter regions. Qualitatively, the results show that the orientation and location errors depend on the orientation of the test dipole: the location error is larger when the orientation of the test dipole is similar to the orientation of the anisotropy, while the orientation error is larger when the orientation of the test dipole deviates from it. From these results, we conclude that the modeling of white matter anisotropy plays an important role in EEG source localization. More specifically, accurate source localization will require an accurate modeling of the white matter conductivity profile in each voxel.
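
A volume-constrained linear scaling can be written per voxel: keep the diffusion tensor's eigenvectors and eigenvalue ratios, and rescale so that the conductivity tensor's volume (the product of its eigenvalues) matches an isotropic target. A minimal sketch, assuming an isotropic white-matter conductivity of 0.14 S/m (an often-quoted literature value, not taken from this paper):

```python
import numpy as np

def conductivity_from_dti(D, sigma_iso=0.14):
    """Volume-constrained linear scaling of a 3x3 symmetric diffusion
    tensor D: share its eigenvectors and eigenvalue ratios, scaled so the
    product of conductivity eigenvalues equals sigma_iso**3 (S/m)."""
    evals, evecs = np.linalg.eigh(D)          # evals assumed positive
    scale = sigma_iso / np.cbrt(np.prod(evals))
    sigma_evals = scale * evals               # anisotropy ratio preserved
    return evecs @ np.diag(sigma_evals) @ evecs.T
```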


Subjects
Electroencephalography/methods; Algorithms; Animals; Anisotropy; Brain/pathology; Brain Mapping/methods; Computer Simulation; Diffusion; Electrodes; Electroencephalography/instrumentation; Equipment Design; Humans; Models, Biological; Neurons/metabolism; Reproducibility of Results; Signal Processing, Computer-Assisted
14.
Eur J Nucl Med Mol Imaging ; 35(5): 922-32, 2008 May.
Article in English | MEDLINE | ID: mdl-18219482

ABSTRACT

PURPOSE: Thallous (201Tl) chloride is a single-photon emission computed tomography (SPECT) tracer mainly used for assessing perfusion and viability of myocardial tissue. 201Tl emits X-rays around 72 keV and gammas at 167 keV, and has a half-life of 73 h. Regulations allow an intrinsic contamination of up to 3-5%, which is mainly caused by 200Tl (368 keV; 26 h) and by 202Tl (439 keV; 12.2 days). Counter-intuitively, given the low percentages in which these contaminants are present, their impact may be significant because the gamma camera is much more sensitive to these high-energy photon emissions. We therefore investigate the effects of the contaminants in terms of detected fractions of photons in projections and contrast degradation in reconstructed images. METHODS: Acquisitions of a digital thorax phantom filled with thallous (201Tl) chloride were simulated with a validated Monte Carlo tool, modelling 1% of contamination by each of 200Tl and 202Tl. In addition, measurements of a thorax phantom on a dual-headed gamma camera were performed. The product used was contaminated by 0.17% of 200Tl and 0.24% of 202Tl at activity reference time (ART); this ART is specified by the manufacturer and accounts for the difference in half-lives of 201Tl and its contaminants. These measurements were repeated at different dates associated with various contamination levels. RESULTS: Simulations showed that, with 1% of 200Tl and 202Tl, the total contamination in the 72 keV window can rise to one out of three detected photons. For the 167 keV window, the contamination is even more pronounced: more than four out of five detections in this photopeak window originate from contaminants. Measurements indicate that cold lesion contrast in myocardial perfusion SPECT imaging is maximal close to ART. In addition to a higher noise level, relative contrast is 15% lower two days before ART, which is explained by the increased 200Tl contamination. After ART, contrast decreased by 16% when the 202Tl contamination increased to the maximal allowed limit. CONCLUSIONS: Counter-intuitive to the low percentages in which they are typically present, penetration and downscatter of high-energy photons from 200Tl and 202Tl contribute significantly to thallous (201Tl) chloride images, reducing contrast and adding noise. These findings may prompt improved production methods and updated policies with regard to timing of usage, and they render the usefulness of adding the high photopeak window (167 keV) questionable. A window-based correction method for this contamination is advisable.
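
The time dependence around ART follows directly from the stated half-lives: the short-lived 200Tl dominates before ART and the long-lived 202Tl afterwards. A small worked sketch using the contamination levels quoted above (simple exponential decay of activity ratios; the printout format is illustrative):

```python
import numpy as np

half_life_h = {"201Tl": 73.0, "200Tl": 26.0, "202Tl": 12.2 * 24}
frac_at_art = {"200Tl": 0.0017, "202Tl": 0.0024}   # measured at ART

def contaminant_ratio(nuclide, t_hours):
    """Contaminant activity relative to 201Tl, t_hours from ART
    (negative = before ART). A_i(t) = A_i(0) * exp(-lambda_i * t)."""
    lam = lambda n: np.log(2) / half_life_h[n]
    return frac_at_art[nuclide] * np.exp(-(lam(nuclide) - lam("201Tl")) * t_hours)

for t in (-48, 0, 48):   # two days before, at, and after ART
    print(t, {n: f"{100 * contaminant_ratio(n, t):.2f}%" for n in frac_at_art})
```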


Subjects
Artifacts; Drug Contamination; Image Enhancement/methods; Image Interpretation, Computer-Assisted/methods; Thallium; Tomography, Emission-Computed, Single-Photon/methods; Phantoms, Imaging; Radiopharmaceuticals; Reproducibility of Results; Sensitivity and Specificity; Tomography, Emission-Computed, Single-Photon/instrumentation
15.
Article in English | MEDLINE | ID: mdl-19163051

ABSTRACT

An objective function is presented to recover a spectrally narrow-band signal from multichannel measurements, as in electrocardiogram recordings of atrial fibrillation. The criterion can be efficiently maximized through the eigenvalue decomposition of spectral correlation matrices of the whitened observations across appropriately chosen frequency bands. It is conjectured that the global optimum so attained recovers the source of interest when its spectral concentration around its modal frequency is maximal. Numerical experiments on synthetic data seem to support the validity of this hypothesis. Moreover, the components extracted from a patient data set with known atrial fibrillation show the characteristics of the associated f-wave as described in the medical literature.


Subjects
Atrial Fibrillation/diagnosis; Atrial Flutter/diagnosis; Diagnosis, Computer-Assisted/statistics & numerical data; Electrocardiography/statistics & numerical data; Algorithms; Atrial Fibrillation/physiopathology; Atrial Flutter/physiopathology; Biomedical Engineering; Databases, Factual; Humans; Monte Carlo Method; Signal Processing, Computer-Assisted
16.
Article in English | MEDLINE | ID: mdl-19163423

ABSTRACT

In this work we show how priors on signal statistics, in the form of cumulant guesses, can be used to extract an independent source from an observed mixture. The advantage of using statistical priors on the signal is that no specific knowledge is needed about its temporal behavior or its spatial distribution. We show that these statistics can be obtained either by reasoning on the theoretical values of an assumed waveform, or by using a subset of the observations whose statistics are known to be only mildly affected by interference. Results on an electrocardiographic recording confirm the above assumptions.


Subjects
Data Interpretation, Statistical; Electrocardiography/methods; Algorithms; Electrocardiography/instrumentation; Electronic Data Processing; Humans; Likelihood Functions; Models, Statistical; Models, Theoretical; Principal Component Analysis; Reproducibility of Results; Time Factors
17.
J Magn Reson ; 190(2): 189-99, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18023218

ABSTRACT

Diffusion weighted magnetic resonance imaging enables the visualization of fibrous tissues such as brain white matter. The validation of this non-invasive technique requires phantoms with a well-known structure and diffusion behavior. This paper presents anisotropic diffusion phantoms consisting of parallel fibers. The diffusion properties of the fiber phantoms are measured using diffusion weighted magnetic resonance imaging and bulk NMR measurements. To enable quantitative evaluation of the measurements, the diffusion in the interstitial space between fibers is modeled using Monte Carlo simulations of random walkers. The time-dependent apparent diffusion coefficient and kurtosis, quantifying the deviation from a Gaussian diffusion profile, are simulated in 3D geometries of parallel fibers with varying packing geometries and packing densities. The simulated diffusion coefficients are compared to the theory of diffusion in porous media, showing a good agreement. Based on the correspondence between simulations and experimental measurements, the fiber phantoms are shown to be useful for the quantitative validation of diffusion imaging on clinical MRI scanners.
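
The simulation side reduces to propagating random walkers and reading off moments of their displacements: the second moment gives the apparent diffusion coefficient, the fourth the excess kurtosis. A minimal 1-D sketch with free diffusion (the fibre-geometry reflection is left as a pluggable stub, and all parameter values are illustrative):

```python
import numpy as np

def adc_kurtosis_1d(steps=1000, walkers=20000, d_free=2.0e-3, dt=1e-4,
                    reflect=lambda x: x):
    """Monte Carlo random-walk estimate of the apparent diffusion
    coefficient (mm^2/s) and excess kurtosis from walker displacements.
    `reflect` applies the boundary geometry (identity = free diffusion)."""
    step = np.sqrt(2.0 * d_free * dt)      # 1-D step: var per step = 2*D*dt
    x = np.zeros(walkers)
    for _ in range(steps):
        x = reflect(x + step * np.random.choice([-1.0, 1.0], walkers))
    t = steps * dt
    adc = np.mean(x ** 2) / (2.0 * t)      # <x^2> = 2*D*t in 1-D
    kurt = np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3.0
    return adc, kurt                       # kurt ~ 0 for a Gaussian profile
```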


Subjects
Brain Mapping/methods; Diffusion Magnetic Resonance Imaging/methods; Phantoms, Imaging; Algorithms; Anisotropy; Body Water/metabolism; Computer Simulation; Image Processing, Computer-Assisted; Imaging, Three-Dimensional; Monte Carlo Method; Nerve Fibers
18.
J Neuroeng Rehabil ; 4: 46, 2007 Nov 30.
Article in English | MEDLINE | ID: mdl-18053144

ABSTRACT

BACKGROUND: The aim of electroencephalogram (EEG) source localization is to find the brain areas responsible for EEG waves of interest. It consists of solving forward and inverse problems. The forward problem is solved by starting from a given electrical source and calculating the potentials at the electrodes. These evaluations are necessary to solve the inverse problem, which is defined as finding the brain sources responsible for the measured potentials at the EEG electrodes. METHODS: While other reviews give an extensive summary of both the forward and inverse problems, this review article focuses on different aspects of solving the forward problem and is intended for newcomers in this research field. RESULTS: It starts with the generators of the EEG: the post-synaptic potentials in the apical dendrites of pyramidal neurons. These cells generate an extracellular current which can be modeled by Poisson's differential equation with Neumann and Dirichlet boundary conditions. The compartments in which these currents flow can be anisotropic (e.g. skull and white matter). In a three-shell spherical head model an analytical expression exists to solve the forward problem. During the last two decades researchers have tried to solve Poisson's equation in a realistically shaped head model obtained from 3D medical images, which requires numerical methods. The following methods are compared with each other: the boundary element method (BEM), the finite element method (FEM) and the finite difference method (FDM). In the last two methods anisotropically conducting compartments can conveniently be introduced. The focus is then set on the use of reciprocity in EEG source localization: it is introduced to speed up the forward calculations, which are then performed for each electrode position rather than for each dipole position. Solving Poisson's equation with FEM or FDM corresponds to solving a large sparse linear system, which requires iterative methods; successive over-relaxation, the conjugate gradient method and the algebraic multigrid method are discussed (a small worked example follows below). CONCLUSION: Solving the forward problem has been well documented in the past decades. In the past, simplified spherical head models were used, whereas nowadays a combination of imaging modalities is used to accurately describe the geometry of the head model. Efforts have been made to realistically describe the shape of the head model, as well as the heterogeneity of the tissue types, and to realistically determine the conductivities. However, the determination and validation of the in vivo conductivity values is still an important topic in this field. In addition, more studies are needed on the influence of all the parameters of the head model and of the numerical techniques on the solution of the forward problem.
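
As a toy illustration of the FDM step (not from the review): discretizing Poisson's equation on a small 2-D isotropic grid yields a sparse symmetric positive-definite system K v = f, which the conjugate gradient method solves without forming a dense matrix. The grid size, source placement and zero-Dirichlet boundary handling are all illustrative choices:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 50                                    # nodes per dimension (toy 2-D grid)
h = 1.0 / (n - 1)                         # grid spacing
main = 4.0 * np.ones(n * n)
off = -np.ones(n * n - 1)
off[np.arange(1, n * n) % n == 0] = 0.0   # no coupling across row boundaries
# 5-point Laplacian stencil with implicit zero-Dirichlet boundaries
K = diags([main, off, off, -np.ones(n * n - n), -np.ones(n * n - n)],
          [0, -1, 1, -n, n]) / h ** 2
f = np.zeros(n * n)                       # dipole-like source/sink pair
f[(n // 2) * n + n // 2] = 1.0
f[(n // 2) * n + n // 2 + 1] = -1.0
v, info = cg(K, f)                        # conjugate gradient solve
assert info == 0                          # 0 means converged
```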


Subjects
Brain Mapping; Brain/physiology; Electroencephalography; Humans; Models, Neurological
19.
Article in English | MEDLINE | ID: mdl-18003514

ABSTRACT

In this work it is shown that a contrast for independent component analysis based on prior knowledge of the source kurtosis signs (ica-sks) is able to extract atrial activity from the electrocardiogram when a constrained updating is introduced. A spectral concentration measure is used, allowing signal pair updates only when the spectral concentration increases. This strategy proves to be valid for independent source extraction with priors on the spectral concentration. Moreover, the method is computationally attractive, with a very low complexity compared to recently proposed methods based on spatio-temporal extraction of the atrial fibrillation signal.
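
The gatekeeping quantity is simple to state: power in a narrow band around the dominant (modal) frequency, divided by total power. A minimal sketch of such a measure (the +/-1 Hz band is an assumed tolerance, not the paper's value):

```python
import numpy as np

def spectral_concentration(x, fs, half_band=1.0):
    """Power within +/- half_band Hz of the modal frequency, relative to
    total power. Pair updates would be accepted only if this increases."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    f_mode = freqs[np.argmax(spec)]           # modal (peak) frequency
    band = np.abs(freqs - f_mode) <= half_band
    return spec[band].sum() / spec.sum()
```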


Subjects
Atrial Fibrillation/physiopathology; Atrial Flutter/physiopathology; Electrocardiography; Heart Atria/physiopathology; Algorithms; Humans
20.
Article in English | MEDLINE | ID: mdl-18003524

ABSTRACT

Tomographic reconstruction from PET data is an ill-posed problem that requires regularization. Recently, Daubechies et al. [1] proposed an l1 regularization of the wavelet coefficients that can be optimized using iterative thresholding schemes. In this paper, we extend this approach for the reconstruction of dynamic (spatio-temporal) PET data. Instead of using classical wavelets in the temporal dimension, we introduce exponential-spline wavelets that are specially tailored to model time activity curves (TACs) in PET. We show the usefulness of spatio-temporal regularization and the superior performance of E-spline wavelets over conventional Battle-Lemarié wavelets for a 1-D TAC fitting experiment and a tomographic reconstruction experiment.


Subjects
Image Processing, Computer-Assisted; Positron-Emission Tomography; Myocardium; Phantoms, Imaging