Results 1 - 13 of 13
1.
Z Med Phys ; 32(4): 403-416, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35597742

ABSTRACT

Photon-counting (PC) detectors for clinical computed tomography (CT) may offer improved imaging capabilities compared to conventional energy-integrating (EI) detectors, e.g. superior spatial resolution and detective efficiency. We here investigate if PCCT can reduce the administered dose in examinations aimed at quantifying trabecular bone microstructure. Five human vertebral bodies were scanned three times in an abdomen phantom (QRM, Germany) using an experimental dual-source CT (Somatom CounT, Siemens Healthineers, Germany) housing an EI detector (0.60 mm pixel size at the iso-center) and a PC detector (0.25 mm pixel size). A tube voltage of 120 kV was used. Tube current-time product for EICT was 355 mAs (23.8 mGy CTDI, 32 cm phantom). Dose-matched UHR-PCCT (UHRdm, 23.8 mGy) and noise-matched acquisitions (UHRnm, 10.5 mGy) were performed and reconstructed to a voxel size of 0.156 mm using a sharp kernel. Measurements of bone mineral density (BMD) and trabecular separation (Tb.Sp) and Tb.Sp percentiles reflecting the different scales of the trabecular interspacing were performed and compared to a gold-standard measurement using a peripheral CT device (XtremeCT, SCANCO Medical, Switzerland) with an isotropic voxel size of 0.082 mm and 6.6 mGy CTDI (10 cm phantom). The image noise was quantified and the relative error with respect to the gold-standard along with the agreement between CT protocols using Lin's concordance correlation coefficient (rCCC) were calculated. The mean ± StdDev of the measured image noise levels in EICT was 109.6 ± 3.9 HU. UHRdm acquisitions (same dose as EICT) showed a significantly lower noise level of 78.6 ± 4.6 HU (p = 0.0122). UHRnm (44% dose of EICT) showed a noise level of 115.8 ± 3.7 HU, very similar to EICT at the same spatial resolution. For BMD the overall mean ± StdDev for EI, UHRdm and UHRnm were 114.8 ± 28.6 mgHA/cm3, 121.6 ± 28.8 mgHA/cm3 and 121.5 ± 28.6 mgHA/cm3, respectively, compared to 123.1 ± 25.5 mgHA/cm3 for XtremeCT.
For Tb.Sp these values were 1.86 ± 0.54 mm, 1.80 ± 0.56 mm and 1.84 ± 0.52 mm, respectively, compared to 1.66 ± 0.48 mm for XtremeCT. The ranking of the vertebrae with regard to Tb.Sp data was maintained throughout all Tb.Sp percentiles and among the CT protocols and the gold-standard. The agreement between protocols was very good for all comparisons: UHRnm vs. EICT (BMD rCCC = 0.97; Tb.Sp rCCC = 0.998), UHRnm vs. UHRdm (BMD rCCC = 0.998; Tb.Sp rCCC = 0.993) and UHRdm vs. EICT (BMD rCCC = 0.97; Tb.Sp rCCC = 0.991). Consequently, the relative RMS-errors from linear regressions against the gold-standard for EICT, UHRdm and UHRnm were very similar for BMD (7.1%, 5.2% and 5.4%) and for Tb.Sp (3.3%, 3.3% and 2.9%), with a much lower radiation dose for UHRnm. Short-term reproducibility for BMD measurements was similar and below 0.2% for all protocols, but for Tb.Sp showed better results for UHR (about 1/3 of the level for EICT). In conclusion, CT with UHR-PC detectors demonstrated lower image noise and better reproducibility for assessments of bone microstructure at similar dose levels. For UHRnm, radiation exposure levels could be reduced by 56% without deterioration of performance levels in the assessment of bone mineral density and bone microstructure.
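As a quick reference for the agreement metric reported above, Lin's concordance correlation coefficient can be computed as follows (an illustrative sketch, not the authors' code):

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement series.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    which penalizes both poor correlation and systematic offset/scale bias.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()        # population covariance
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)
```

Identical series give a CCC of exactly 1; a constant offset between otherwise identical series pulls the value below 1, unlike Pearson's r.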


Subjects
Photons; Tomography, X-Ray Computed; Humans; Reproducibility of Results; Tomography, X-Ray Computed/methods; Phantoms, Imaging; Abdomen
2.
Med Phys ; 49(4): 2259-2269, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35107176

ABSTRACT

PURPOSE: With the rising number of computed tomography (CT) examinations and the trend toward personalized medicine, patient-specific dose estimates are becoming more and more important in CT imaging. However, current approaches are often too slow or too inaccurate to be applied routinely. Therefore, we propose the so-called deep dose estimation (DDE) to provide highly accurate patient dose distributions in real time. METHODS: To combine accuracy and computational performance, the DDE algorithm uses a deep convolutional neural network to predict patient dose distributions. To do so, a U-net-like architecture is trained to reproduce Monte Carlo simulations from a two-channel input consisting of a CT reconstruction and a first-order dose estimate. Here, the corresponding training data were generated using CT simulations based on 45 whole-body patient scans. For each patient, simulations were performed for different anatomies (pelvis, abdomen, thorax, head), different tube voltages (80 kV, 100 kV, 120 kV), different scan trajectories (circle, spiral), and with and without bowtie filtration and tube current modulation. Similar simulations were performed using a second set of eight whole-body CT scans from the Visual Concept Extraction Challenge in Radiology (Visceral) project to generate testing data. Finally, the DDE algorithm was evaluated with respect to the generalization to different scan parameters and the accuracy of organ dose and effective dose estimates based on an external organ segmentation. RESULTS: DDE dose distributions were quantified in terms of the mean absolute percentage error (MAPE) and a gamma analysis with respect to the ground truth Monte Carlo simulation. Both measures indicate that DDE generalizes well to different scan parameters and different anatomical regions with a maximum MAPE of 6.3% and a minimum gamma passing rate of 91%.
Evaluating the organ dose values for all organs listed in the International Commission on Radiological Protection (ICRP) recommendation shows an average error of 3.1% and a maximum error of 7.2% (bone surface). CONCLUSIONS: The DDE algorithm provides an efficient approach to determine highly accurate dose distributions. Being able to process a whole-body CT scan in about 1.5 s, it provides a valuable alternative to Monte Carlo simulations on a graphics processing unit (GPU). Here, the main advantage of DDE is that it can be used on top of any existing Monte Carlo code such that real-time performance can be achieved without major adjustments. Thus, DDE opens up new options not only for dosimetry but also for scan and protocol optimization.
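The MAPE figure of merit used above can be sketched as follows (an illustrative definition; the voxel threshold parameter is an assumption, not taken from the paper):

```python
import numpy as np

def mape(pred, ref, threshold=0.0):
    """Mean absolute percentage error between a predicted and a reference
    dose distribution, restricted to voxels whose reference dose exceeds
    `threshold` so that near-zero reference values do not blow up the ratio
    (a common convention; the threshold here is an assumption)."""
    pred, ref = np.asarray(pred, float), np.asarray(ref, float)
    mask = ref > threshold
    return 100.0 * np.mean(np.abs(pred[mask] - ref[mask]) / ref[mask])
```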


Subjects
Tomography, X-Ray Computed; Humans; Monte Carlo Method; Phantoms, Imaging; Radiation Dosage; Radiometry/methods; Tomography, X-Ray Computed/methods
3.
Med Phys ; 49(1): 93-106, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34796532

ABSTRACT

PURPOSE: Various studies have demonstrated that additional prefilters and/or reduced tube voltages have the potential to significantly increase the contrast-to-noise ratios at unit dose (CNRDs) and thereby to significantly reduce patient dose in clinical CT. An exhaustive analysis, accounting for a wide range of filter thicknesses and a wide range of tube voltages extending beyond the 70 to 150 kV range of today's CT systems, including their specific choice depending on the patient size, is, however, missing. Therefore, this work analyzes the dose reduction potential for patient-specific selectable prefilters combined with a wider range of tube voltages. We do so for soft tissue and iodine contrast in single energy CT. The findings may be helpful to guide further developments of x-ray tubes and automatic filter changers. METHODS: CT acquisitions were simulated for different patient sizes (semianthropomorphic phantoms for child, adult, and obese patients), tube voltages (35-150 kV), prefilter materials (tin and copper), and prefilter thicknesses (up to 5 mm). For each acquisition soft tissue and iodine CNRDs were determined. Dose was calculated using Monte Carlo simulations of a computed tomography dose index (CTDI) phantom. CNRD values of acquisitions with different parameters were used to evaluate dose reduction. RESULTS: Dose reduction through patient-specific prefilters depends on patient size and available tube current among others. With an available tube current time product of 1000 mAs dose reductions of 17% for the child, 32% for the adult and 29% for the obese phantom were achieved for soft tissue contrast. For iodine contrast dose reductions were 57%, 49%, and 39% for child, adult, and obese phantoms, respectively. Here, a tube voltage range extended to lower kV is important. CONCLUSIONS: Substantial dose reduction can be achieved by utilizing patient-specific prefilters. 
Tube voltages lower than 70 kV are beneficial for dose reduction with iodine contrast, especially for small patients. The optimal implementation of patient-specific prefilters benefits from higher tube power. Tin prefilters should be available in 0.1 mm steps or lower, copper prefilters in 0.3 mm steps or lower. At least 10 different prefilter thicknesses should be used to cover the dose optima of all investigated patient sizes and contrast mechanisms. In many cases it would be advantageous to adapt the prefilter thickness rather than the tube current to the patient size, that is, to always use the maximum available tube current and to control the exposure by adjusting the thickness of the prefilter.
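The dose-saving numbers above follow from comparing contrast-to-noise ratios at unit dose (CNRD). Under the standard assumption that CNR grows with the square root of dose at a fixed protocol, matching a reference CNR needs dose proportional to 1/CNRD², so the fractional saving can be sketched as:

```python
def dose_reduction(cnrd_ref, cnrd_new):
    """Fractional dose saving when switching to a protocol with a higher
    contrast-to-noise ratio at unit dose (CNRD). Assumes CNR scales with
    sqrt(dose), so the dose needed to match a reference CNR scales with
    1/CNRD**2 (a textbook scaling, not a result from the paper)."""
    return 1.0 - (cnrd_ref / cnrd_new) ** 2
```

For example, doubling the CNRD allows the dose to be quartered while keeping the CNR constant.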


Assuntos
Redução da Medicação , Tomografia Computadorizada por Raios X , Adulto , Criança , Humanos , Método de Monte Carlo , Imagens de Fantasmas , Doses de Radiação
4.
Nat Rev Cardiol ; 17(7): 427-450, 2020 07.
Article in English | MEDLINE | ID: mdl-32094693

ABSTRACT

Cardiac imaging has a pivotal role in the prevention, diagnosis and treatment of ischaemic heart disease. SPECT is most commonly used for clinical myocardial perfusion imaging, whereas PET is the clinical reference standard for the quantification of myocardial perfusion. MRI does not involve exposure to ionizing radiation, similar to echocardiography, which can be performed at the bedside. CT perfusion imaging is not frequently used but CT offers coronary angiography data, and invasive catheter-based methods can measure coronary flow and pressure. Technical improvements to the quantification of pathophysiological parameters of myocardial ischaemia can be achieved. Clinical consensus recommendations on the appropriateness of each technique were derived following a European quantitative cardiac imaging meeting and using a real-time Delphi process. SPECT using new detectors allows the quantification of myocardial blood flow and is now also suited to patients with a high BMI. PET is well suited to patients with multivessel disease to confirm or exclude balanced ischaemia. MRI allows the evaluation of patients with complex disease who would benefit from imaging of function and fibrosis in addition to perfusion. Echocardiography remains the preferred technique for assessing ischaemia in bedside situations, whereas CT has the greatest value for combined quantification of stenosis and characterization of atherosclerosis in relation to myocardial ischaemia. In patients with a high probability of needing invasive treatment, invasive coronary flow and pressure measurement is well suited to guide treatment decisions. In this Consensus Statement, we summarize the strengths and weaknesses as well as the future technological potential of each imaging modality.


Subjects
Myocardial Ischemia/diagnostic imaging; Delphi Technique; Echocardiography; Humans; Magnetic Resonance Imaging; Myocardial Ischemia/physiopathology; Myocardial Perfusion Imaging; Positron-Emission Tomography; Tomography, Emission-Computed, Single-Photon; Tomography, X-Ray Computed
5.
Med Phys ; 46(1): 238-249, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30390295

ABSTRACT

PURPOSE: X-ray scattering leads to CT images with a reduced contrast, inaccurate CT values as well as streak and cupping artifacts. Therefore, scatter correction is crucial to maintain the diagnostic value of CT and CBCT examinations. However, existing approaches are not able to combine both high accuracy and high computational performance. Therefore, we propose the deep scatter estimation (DSE): a deep convolutional neural network to derive highly accurate scatter estimates in real time. METHODS: Gold standard scatter estimation approaches rely on dedicated Monte Carlo (MC) photon transport codes. However, being computationally expensive, MC methods cannot be used routinely. To enable real-time scatter correction with similar accuracy, DSE uses a deep convolutional neural network that is trained to predict MC scatter estimates based on the acquired projection data. Here, the potential of DSE is demonstrated using simulations of CBCT head, thorax, and abdomen scans as well as measurements at an experimental table-top CBCT. Two conventional computationally efficient scatter estimation approaches were implemented as reference: a kernel-based scatter estimation (KSE) and the hybrid scatter estimation (HSE). RESULTS: The simulation study demonstrates that DSE generalizes well to varying tube voltages, varying noise levels as well as varying anatomical regions as long as they are appropriately represented within the training data. In any case the deviation of the scatter estimates from the ground truth MC scatter distribution is less than 1.8% while it is between 6.2% and 293.3% for HSE and between 11.2% and 20.5% for KSE. To evaluate the performance for real data, measurements of an anthropomorphic head phantom were performed. Errors were quantified by a comparison to a slit scan reconstruction. Here, the deviation is 278 HU (no correction), 123 HU (KSE), 65 HU (HSE), and 6 HU (DSE), respectively. 
CONCLUSIONS: The DSE clearly outperforms conventional scatter estimation approaches in terms of accuracy. DSE is nearly as accurate as Monte Carlo simulations but is superior in terms of speed (≈10 ms/projection) by orders of magnitude.
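For orientation, the kernel-based reference method (KSE) compared above belongs to a family of convolution scatter models; a strongly simplified 1-D sketch (amplitude and kernel width are illustrative assumptions, not parameters from the paper):

```python
import numpy as np

def kernel_scatter_estimate(meas, amplitude=0.1, width=51):
    """Toy 1-D kernel-based scatter estimate: scatter is modeled as a scaled,
    broadly smoothed copy of the measured detector row (here a simple box
    kernel), and subtracting it approximates the primary signal. Real KSE
    implementations use physically motivated kernels and calibration."""
    kernel = np.ones(width) / width                       # broad box kernel
    scatter = amplitude * np.convolve(meas, kernel, mode="same")
    primary = np.clip(meas - scatter, 0.0, None)          # no negative primaries
    return primary, scatter
```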


Assuntos
Anatomia , Tomografia Computadorizada de Feixe Cônico , Processamento de Imagem Assistida por Computador/métodos , Doses de Radiação , Espalhamento de Radiação , Artefatos , Humanos , Método de Monte Carlo , Imagens de Fantasmas , Razão Sinal-Ruído
6.
Med Phys ; 2018 Jun 11.
Article in English | MEDLINE | ID: mdl-29888791

ABSTRACT

PURPOSE: The purpose of this study is to investigate the necessity of detruncation for scatter estimation of truncated cone-beam CT (CBCT) data and to evaluate different detruncation algorithms. Scattered radiation results in some of the most severe artifacts in CT and depends strongly on the size and the shape of the scanned object. Especially in CBCT systems, the large cone angle and the small detector-to-isocenter distance lead to a large amount of detected scatter, resulting in cupping artifacts, streak artifacts, and inaccurate CT values. If a small field of measurement (FOM) is used, as is often the case in CBCT systems, data are truncated in the longitudinal and lateral directions. Since only truncated data are available as input for the scatter estimation, the already challenging correction of scatter artifacts becomes even more difficult. METHODS: The following detruncation methods are compared and evaluated with respect to scatter estimation: constant detruncation, cosine detruncation, adaptive detruncation, and prior-based detruncation using anatomical data from a similar phantom or patient, also compared to the case where no detruncation was performed. Each of the resulting detruncated reconstructions serves as input volume for a Monte Carlo (MC) scatter estimation and subsequent scatter correction. An evaluation is performed on a head simulation, measurements of a head phantom and a patient using a dental CBCT geometry with a FOM diameter of 11 cm. Additionally, a thorax phantom is measured to assess performance in a C-arm geometry with a FOM of up to 20 cm. RESULTS: If scatter estimation is based on simple detruncation algorithms like a constant or a cosine detruncation, scatter is estimated inaccurately, resulting in incorrect CT values as well as streak artifacts in the corrected volume.
For the dental CBCT phantom measurement, CT values for soft tissue were corrected from -204 HU (no scatter correction) to -87 HU (no detruncation), -218 HU (constant detruncation), -141 HU (cosine detruncation), -91 HU (adaptive detruncation), -34 HU (prior-based detruncation using a different prior) and -24 HU (prior-based detruncation using the identical prior), for a reference value of -26 HU measured in slit scan mode. In all cases the prior-based detruncation results in the best scatter correction, followed by the adaptive detruncation, as these algorithms provide a rather accurate model of high-density structures outside the FOM, compared to a simple constant or a cosine detruncation. CONCLUSIONS: Our contribution is twofold: first, we give a comprehensive comparison of various detruncation methods for the purpose of scatter estimation. We find that the choice of the detruncation method has a significant influence on the quality of MC-based scatter correction. Simple or no detruncation is often insufficient for artifact removal and results in inaccurate CT values. In contrast, prior-based detruncation can achieve a high CT-value accuracy and nearly artifact-free volumes from truncated CBCT data when combined with other state-of-the-art artifact corrections. Second, we show that prior-based detruncation is effective even with data from a different patient or phantom. The fact that data completion does not require data from the same patient dramatically increases the applicability and usability of this scatter estimation.
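The cosine detruncation evaluated above can be sketched in a few lines (an illustrative 1-D version; the pad length and the quarter-cosine taper are assumptions about the general idea, not the paper's exact implementation):

```python
import numpy as np

def cosine_detruncate(row, pad):
    """Extend a laterally truncated detector row on both sides by letting
    the edge value fall to zero along a quarter cosine over `pad` samples.
    This gives the MC scatter simulation a smooth, non-zero object model
    outside the measured field of measurement."""
    taper = np.cos(np.linspace(0.0, np.pi / 2.0, pad))  # 1 -> 0 roll-off
    left = row[0] * taper[::-1]                          # rises 0 -> row[0]
    right = row[-1] * taper                              # falls row[-1] -> 0
    return np.concatenate([left, row, right])
```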

7.
Med Phys ; 43(7): 3945, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27370113

ABSTRACT

PURPOSE: To introduce and evaluate an increment matrix approach (IMA) describing the signal statistics of energy-selective photon counting detectors including spatial-spectral correlations between energy bins of neighboring detector pixels. The importance of the occurring correlations for image-based material decomposition is studied. METHODS: An IMA describing the counter increase patterns in a photon counting detector is proposed. This IMA has the potential to decrease the number of required random numbers compared to Monte Carlo simulations by pursuing an approach based on convolutions. To validate and demonstrate the IMA, an approximate semirealistic detector model is provided, simulating a photon counting detector in a simplified manner, e.g., by neglecting count rate-dependent effects. In this way, the spatial-spectral correlations on the detector level are obtained and fed into the IMA. The importance of these correlations in reconstructed energy bin images and the corresponding detector performance in image-based material decomposition is evaluated using a statistically optimal decomposition algorithm. RESULTS: The results of IMA together with the semirealistic detector model were compared to other models and measurements using the spectral response and the energy bin sensitivity, finding a good agreement. Correlations between the different reconstructed energy bin images could be observed, and turned out to be of weak nature. These correlations were found to be not relevant in image-based material decomposition. An even simpler simulation procedure based on the energy bin sensitivity was tested instead and yielded similar results for the image-based material decomposition task, as long as the fact that one incident photon can increase multiple counters across neighboring detector pixels is taken into account. 
CONCLUSIONS: The IMA is computationally efficient as it requires about 10² random numbers per ray incident on a detector pixel instead of an estimated 10⁸ random numbers per ray as Monte Carlo approaches would need. The spatial-spectral correlations as described by IMA are not important for the studied image-based material decomposition task. Respecting the absolute photon counts, and thus the multiple counter increases by a single x-ray photon, the same material decomposition performance could be obtained with a simpler detector description using the energy bin sensitivity.
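The simpler simulation procedure mentioned in the conclusions, based on the energy bin sensitivity, can be sketched as follows (a toy model, not the paper's IMA; note that the sensitivities may sum to more than one precisely because a single photon can increase several counters across neighboring pixels):

```python
import numpy as np

def simulate_bin_counts(n_photons, bin_sensitivity, rng=None):
    """Sample energy-bin counter values for a photon-counting detector pixel
    from its energy bin sensitivities: each incident photon raises counter i
    on average `bin_sensitivity[i]` times, so expected counts are
    n_photons * sensitivity, here drawn as independent Poisson variates
    (an illustrative simplification that ignores inter-bin correlations)."""
    rng = np.random.default_rng(rng)
    lam = n_photons * np.asarray(bin_sensitivity, float)
    return rng.poisson(lam)
```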


Assuntos
Algoritmos , Modelos Estatísticos , Fótons , Raios X , Simulação por Computador , Processamento de Imagem Assistida por Computador , Método de Monte Carlo , Radiografia/instrumentação
8.
Med Phys ; 42(1): 469-78, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25563286

ABSTRACT

PURPOSE: Scattered radiation is one of the major problems affecting image quality in flat detector cone-beam computed tomography (CBCT). Previously, a new scatter estimation and correction method using primary beam modulation has been proposed. The original image processing technique used a frequency-domain-based analysis, which proved to be sensitive to the accuracy of the modulator pattern both spatially and in amplitude, as well as to the frequency of the modulation pattern. In addition, it cannot account for penumbra effects that occur, for example, due to the finite focal spot size, and the scatter estimate can be degraded by high-frequency components of the primary image. METHODS: In this paper, the authors present a new way to estimate the scatter using primary modulation. It is less sensitive to modulator nonidealities and, most importantly, can handle arbitrary modulator shapes and changes in modulator attenuation. The main idea is that the scatter estimation can be expressed as an optimization problem, which yields a separation of the scatter and the primary image. The method is evaluated using simulated and experimental CBCT data. The scattering properties of the modulator itself are analyzed using a Monte Carlo simulation. RESULTS: All reconstructions show strong improvements of image quality. To quantify the results, all images are compared to reference images (ideal simulations and collimated scans). CONCLUSIONS: The proposed modulator-based scatter reduction algorithm may open the field of flat detector-based imaging to become a quantitative modality. This may have significant impact on C-arm imaging and on image-guided radiation therapy.
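The separation idea behind primary modulation can be illustrated with a heavily simplified toy model (this is not the paper's optimization: it assumes the unmodulated primary and the scatter are locally constant over each pair of neighboring pixels carrying different modulator transmissions m1 != m2, so the 2x2 linear system I = m*p + s is solvable per pair):

```python
import numpy as np

def separate_primary_scatter(meas, mod):
    """Per-pair exact solution of I1 = m1*p + s, I2 = m2*p + s for primary p
    and scatter s, given measurements `meas` behind a modulator with known
    transmission pattern `mod` (alternating values assumed). The modulated
    primary differs between the pair, the smooth scatter does not."""
    I1, I2 = meas[0::2], meas[1::2]
    m1, m2 = mod[0::2], mod[1::2]
    p = (I1 - I2) / (m1 - m2)   # primary from the modulation contrast
    s = I1 - m1 * p             # remaining offset is the scatter
    return p, s
```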


Assuntos
Algoritmos , Tomografia Computadorizada de Feixe Cônico/métodos , Espalhamento de Radiação , Simulação por Computador , Tomografia Computadorizada de Feixe Cônico/instrumentação , Cabeça/diagnóstico por imagem , Humanos , Pulmão/diagnóstico por imagem , Modelos Teóricos , Método de Monte Carlo , Imagens de Fantasmas
9.
Med Phys ; 41(5): 051908, 2014 May.
Article in English | MEDLINE | ID: mdl-24784387

ABSTRACT

PURPOSE: Phase-correlated microcomputed tomography (micro-CT) imaging plays an important role in the assessment of mouse models of cardiovascular diseases and the determination of functional parameters such as the left ventricular volume. As the current gold standard, the phase-correlated Feldkamp reconstruction (PCF), shows poor performance for low-dose scans, more sophisticated reconstruction algorithms have been proposed to enable low-dose imaging. In this study, the authors focus on the McKinnon-Bates (MKB) algorithm, the low dose phase-correlated (LDPC) reconstruction, and the high-dimensional total variation minimization reconstruction (HDTV) and investigate their potential to accurately determine the left ventricular volume at different dose levels from 50 to 500 mGy. The results were verified in phantom studies of a five-dimensional (5D) mathematical mouse phantom. METHODS: Micro-CT data of eight mice, each administered with an x-ray dose of 500 mGy, were acquired, retrospectively gated for cardiac and respiratory motion, and reconstructed using PCF, MKB, LDPC, and HDTV. Dose levels down to 50 mGy were simulated by using only a fraction of the projections. Contrast-to-noise ratio (CNR) was evaluated as a measure of image quality. Left ventricular volume was determined using different segmentation algorithms (Otsu, level sets, region growing). Forward projections of the 5D mouse phantom were performed to simulate a micro-CT scan. The simulated data were processed the same way as the real mouse data sets. RESULTS: Compared to the conventional PCF reconstruction, the MKB, LDPC, and HDTV algorithms yield images of increased quality in terms of CNR. While the MKB reconstruction only provides small improvements, a significant increase of the CNR is observed in LDPC and HDTV reconstructions. The phantom studies demonstrate that left ventricular volumes can be determined accurately at 500 mGy.
For lower dose levels, which were simulated for the real mouse data sets, the HDTV algorithm shows the best performance. At 50 mGy, the deviation from the reference obtained at 500 mGy was less than 4%. The LDPC algorithm also provides reasonable results, with deviations of less than 10% at 50 mGy, while the PCF and MKB reconstructions show larger deviations even at higher dose levels. CONCLUSIONS: LDPC and HDTV increase the CNR and allow for quantitative evaluations even at dose levels as low as 50 mGy. The left ventricular volumes exemplarily illustrate that cardiac parameters can be accurately estimated at the lowest dose levels if sophisticated algorithms are used. This allows the dose to be reduced by a factor of 10 compared to today's gold standard and opens new options for longitudinal studies of the heart.
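The CNR image-quality measure used above has the generic form below (the study's exact ROI definitions are not given here, so masks are left to the caller):

```python
import numpy as np

def cnr(img, roi_mask, bg_mask):
    """Contrast-to-noise ratio: absolute contrast between an object ROI and
    a background ROI, divided by the background noise (standard deviation).
    `roi_mask` and `bg_mask` are boolean arrays selecting the two regions."""
    img = np.asarray(img, float)
    contrast = abs(img[roi_mask].mean() - img[bg_mask].mean())
    return contrast / img[bg_mask].std()
```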


Assuntos
Algoritmos , Ventrículos do Coração/diagnóstico por imagem , Intensificação de Imagem Radiográfica/métodos , Microtomografia por Raio-X/métodos , Animais , Camundongos , Modelos Biológicos , Imagens de Fantasmas , Doses de Radiação
11.
Med Phys ; 40(3): 031101, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23464281

ABSTRACT

PURPOSE: To discuss options in designing detector shapes in third generation CT and to quantify potential cost savings for compact third generation CT systems, and to extend the work from two-dimensional fan-beam CT to three-dimensional cone-beam CT for circular, sequential, and spiral scan trajectories. METHODS: Third generation CT scanners typically comprise detectors which are flat or whose shape is the segment of a cylinder or a sphere that is focused onto the focal spot of the x-ray source. There appear to be two design criteria that favor this choice of detector shape. One is the possibility of performing fan-beam and cone-beam filtered backprojection in the native geometry (without rebinning), and the other could be to enable the early use of focused antiscatter grids. It is less known, however, that other detector shapes may also have these properties. While these designs were evaluated for 2D CT from a theoretical standpoint more than one decade ago, the authors revisit and generalize these considerations, extend them to 3D circular, sequential, and spiral cone-beam CT, and propose an optimal design in terms of detector costs while keeping image quality constant. Their considerations and conclusions are based on the sampling density of the x-rays, including the effects of finite focal spot and finite detector element size. Proposing image reconstruction algorithms or numerically evaluating the results by reconstructing simulated projection data is not within the scope of this work. RESULTS: If the detector arc is curved to be nearly concentric with the circle describing the edge of the field of measurement, significantly less detector area and significantly fewer detector pixels are required compared to today's third generation CT systems, where the detector arc is centered about the focal spot. Combined with a detector that just covers the spiral Tam window, cost savings of 60% or more are possible in compact CT systems.
In terms of practicability the new designs appear to be nearly as easy to realize as today's third generation systems. CONCLUSIONS: Compact CT systems, which require the focal spot to be mounted close to the edge of the field of measurement, may significantly benefit from using detector shapes other than the typical equiangular detector that is focused onto the focal spot.
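For context, the size of a conventional source-focused detector follows from textbook fan-beam geometry (this sketch describes today's standard design as a baseline, not the concentric arcs proposed in the paper):

```python
import numpy as np

def focused_detector_length(r_fom, r_focus, r_focus_det):
    """Arc length of a third-generation detector focused on the focal spot.
    The fan half-angle needed to cover a field of measurement of radius
    `r_fom` from a source at distance `r_focus` from the isocenter is
    asin(r_fom / r_focus); the detector arc sits at distance `r_focus_det`
    from the source, so its length is 2 * half_angle * r_focus_det."""
    fan_half_angle = np.arcsin(r_fom / r_focus)
    return 2.0 * fan_half_angle * r_focus_det
```

Moving the focal spot closer to the field of measurement (compact systems) enlarges the fan angle rapidly, which is why alternative detector shapes become attractive there.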


Assuntos
Tomografia Computadorizada por Raios X/instrumentação , Artefatos , Tomografia Computadorizada de Feixe Cônico , Redução de Custos , Imageamento Tridimensional , Tomografia Computadorizada Espiral , Tomografia Computadorizada por Raios X/economia
12.
Phys Med Biol ; 57(21): 6849-67, 2012 Nov 07.
Article in English | MEDLINE | ID: mdl-23038048

ABSTRACT

The purpose of this study was to develop and evaluate the hybrid scatter correction algorithm (HSC) for CT imaging. To this end, two established ways to perform scatter correction, i.e. physical scatter correction based on Monte Carlo simulations and a convolution-based scatter correction algorithm, were combined in order to perform an object-dependent, fast and accurate scatter correction. Based on a reconstructed CT volume, the patient-specific scatter intensity is estimated by a coarse Monte Carlo simulation that uses a reduced number of simulated photons in order to reduce the simulation time. To further speed up the Monte Carlo scatter estimation, scatter intensities are simulated only for a fraction of all projections. In a second step, the high-noise estimate of the scatter intensity is used to calibrate the open parameters in a convolution-based algorithm, which is then used to correct measured intensities for scatter. Furthermore, the scatter-corrected intensities are used to reconstruct a scatter-corrected CT volume data set. To evaluate the scatter reduction potential of HSC, we conducted simulations in a clinical CT geometry and measurements with a flat detector CT system. In the simulation study, HSC-corrected images were compared to scatter-free reference images. For the measurements, no scatter-free reference image was available. Therefore, we used an image corrected with a low-noise Monte Carlo simulation as a reference. The results show that the HSC can significantly reduce scatter artifacts. Compared to the reference images, the error due to scatter artifacts decreased from 100% for uncorrected images to a value below 20% for HSC-corrected images for both the clinical (simulated data) and the flat detector CT geometry (measurement). Compared to a low-noise Monte Carlo simulation, with the HSC the number of photon histories can be reduced by about a factor of 100 per projection without losing correction accuracy.
Furthermore, it was sufficient to calibrate the parameters in the convolution model at an angular increment of about 20°. The reduction of the simulated photon histories together with the reduced amount of simulated Monte Carlo scatter projections decreased the total runtime of the scatter correction by about two orders of magnitude for the cases investigated here when using the HSC instead of a low-noise Monte Carlo simulation for scatter correction.
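The hybrid idea of calibrating a fast convolution model against a coarse Monte Carlo estimate can be sketched as follows (a one-parameter toy: the paper's convolution model has more open parameters, and the fixed box kernel here is an assumption):

```python
import numpy as np

def calibrate_scatter_amplitude(meas, mc_scatter, width=31):
    """Smooth the measured intensity with a fixed broad kernel to obtain a
    scatter-shaped basis function, then least-squares fit a single amplitude
    so the convolution model matches a (possibly very noisy) Monte Carlo
    scatter estimate. Returns the calibrated, low-noise scatter model."""
    kernel = np.ones(width) / width
    basis = np.convolve(meas, kernel, mode="same")
    a = np.dot(basis, mc_scatter) / np.dot(basis, basis)  # closed-form LS fit
    return a * basis
```

Because only the amplitude is fitted, the Monte Carlo input may use very few photon histories; the convolution model supplies the smooth shape.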


Assuntos
Artefatos , Método de Monte Carlo , Espalhamento de Radiação , Tomografia Computadorizada por Raios X/métodos , Algoritmos , Calibragem , Humanos , Fótons , Fatores de Tempo
13.
Med Phys ; 36(12): 5683-94, 2009 Dec.
Article in English | MEDLINE | ID: mdl-20095281

ABSTRACT

PURPOSE: Cardiac CT achieves its high temporal resolution by lowering the scan range from 2π to π plus fan angle (partial scan). This, however, introduces CT-value variations, depending on the angular position of the π range. These partial scan artifacts are of the order of a few HU and prevent the quantitative evaluation of perfusion measurements. The authors present the new algorithm partial scan artifact reduction (PSAR) that corrects a dynamic phase-correlated scan without a priori information. METHODS: In general, a full scan does not suffer from partial scan artifacts since all projections in [0, 2π] contribute to the data. To maintain the optimum temporal resolution and the phase correlation, PSAR creates an artificial full scan p_n^(AF) by projectionwise averaging a set of neighboring partial scans p_n^(P) from the same perfusion examination (typically N ≈ 30 phase-correlated partial scans distributed over 20 s and n = 1, ..., N). Corresponding to the angular range of each partial scan, the authors extract virtual partial scans p_n^(V) from the artificial full scan p_n^(AF). A standard reconstruction yields the corresponding images f_n^(P), f_n^(AF), and f_n^(V). Subtracting the virtual partial scan image f_n^(V) from the artificial full scan image f_n^(AF) yields an artifact image that can be used to correct the original partial scan image: f_n^(C) = f_n^(P) - f_n^(V) + f_n^(AF), where f_n^(C) is the corrected image. RESULTS: The authors evaluated the effects of scattered radiation on the partial scan artifacts using simulated and measured water phantoms and found a strong correlation. The PSAR algorithm has been validated with a simulated semianthropomorphic heart phantom and with measurements of a dynamic biological perfusion phantom. For the stationary phantoms, real full scans have been performed to provide theoretical reference values.
The improvement in the root mean square errors between the full and the partial scans with respect to the errors between the full and the corrected scans is up to 54% for the simulations and 90% for the measurements. CONCLUSIONS: The phase-correlated data now appear accurate enough for a quantitative analysis of cardiac perfusion.
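The image-domain correction stated in the abstract translates directly into code:

```python
import numpy as np

def psar_correct(f_partial, f_virtual, f_artificial_full):
    """PSAR correction as given in the abstract: f_C = f_P - f_V + f_AF.
    The difference between the artificial full scan image and the virtual
    partial scan image isolates the partial scan artifact, which is then
    added back to cancel the artifact in the original partial scan image.
    All three inputs are reconstructed image arrays of identical shape."""
    return f_partial - f_virtual + f_artificial_full
```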


Assuntos
Artefatos , Imagem de Perfusão do Miocárdio/métodos , Tomografia Computadorizada por Raios X/métodos , Coração/diagnóstico por imagem , Imagens de Fantasmas , Água