Results 1 - 20 of 23
1.
Sci Rep ; 13(1): 10907, 2023 07 05.
Article in English | MEDLINE | ID: mdl-37407807

ABSTRACT

Cryo-imaging has been used effectively to study the biodistribution of fluorescent cells or microspheres in animal models. Sequential slice-by-slice fluorescent imaging enables detection of fluorescent cells or microspheres and corresponding quantification of their distribution in tissue. However, if slices are too thin, there will be data overload and excessive scan times; if slices are too thick, cells can be missed. In this study, we developed a model for detection of fluorescent cells or microspheres to aid optimal slice thickness determination. Key factors include: section thickness (X), fluorescent cell intensity (Ifluo), effective tissue attenuation coefficient (µT), and a detection threshold (T). The model suggests an optimal slice thickness that provides near-ideal sensitivity while minimizing scan time. The model also suggests a correction method to compensate for missed cells in cases where image data were acquired with an overly large slice thickness. This approach allows cryo-imaging operators to use a larger slice thickness to expedite scanning without significant loss of cell counts. We validated the model using real data from two independent studies: fluorescent microspheres in a pig heart and fluorescently labeled stem cells in a mouse model. Results show that the slice thickness and detection sensitivity relationships from simulations and real data were well matched, with 99% correlation and 2% root-mean-square (RMS) error. We also discuss the detection characteristics in situations where key assumptions of the model are not met, such as variation in fluorescence intensity and non-uniform spatial distribution. Finally, we show that with proper settings, cryo-imaging can provide accurate quantification of fluorescent cell biodistribution with remarkably high recovery ratios (number of detections/number delivered). As cryo-imaging technology is used in many biological applications, our optimal slice thickness determination and data correction methods can play a crucial role in further advancing its usability and reliability.
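The abstract's four key factors suggest a simple depth-attenuation picture of detection. The sketch below is a minimal illustration of that idea, assuming Beer-Lambert attenuation of a cell's signal with depth below the block face and uniformly distributed cells; the paper's actual model and the parameter values here are not taken from the study.

```python
import numpy as np

def detection_sensitivity(X, I_fluo, mu_T, T):
    """Fraction of cells detected for slice thickness X (mm).

    Assumes Beer-Lambert attenuation of a cell's signal with depth z
    below the block face and cells uniformly distributed in depth:
    a cell is seen if I_fluo * exp(-mu_T * z) >= T.
    """
    if T >= I_fluo:
        return 0.0                      # threshold above source intensity
    z_max = np.log(I_fluo / T) / mu_T   # deepest detectable depth (mm)
    return min(1.0, z_max / X)

# Sweep slice thickness to find where sensitivity starts to drop
# (all values illustrative)
for X in (0.02, 0.05, 0.1, 0.2, 0.5):
    print(f"X = {X:5.2f} mm -> sensitivity = "
          f"{detection_sensitivity(X, I_fluo=1000, mu_T=20.0, T=50):.2f}")
```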


Subject(s)
Heart; Tomography, X-Ray Computed; Mice; Animals; Swine; Microspheres; Reproducibility of Results; Tissue Distribution; Tomography, X-Ray Computed/methods
2.
Magn Reson Med ; 89(6): 2441-2455, 2023 06.
Article in English | MEDLINE | ID: mdl-36744695

ABSTRACT

PURPOSE: Fast and accurate thigh muscle segmentation from MRI is important for quantitative assessment of thigh muscle morphology and composition. A novel deep learning (DL) based thigh muscle and surrounding tissues segmentation model was developed for fully automatic and reproducible cross-sectional area (CSA) and fat fraction (FF) quantification and tested in patients 10 years after anterior cruciate ligament reconstruction. METHODS: A DL model combining UNet and DenseNet was trained and tested using manually segmented thighs from 16 patients (32 legs). Segmentation accuracy was evaluated using Dice similarity coefficients (DSC) and average symmetric surface distance (ASSD). A UNet model was trained for comparison. These segmentations were used to obtain CSA and FF quantification. Reproducibility of CSA and FF quantification was tested with scan and rescan of six healthy subjects. RESULTS: The proposed combined UNet and DenseNet model had high agreement with manual segmentation (DSC > 0.97, ASSD < 0.24) and improved performance compared with UNet alone. For hamstrings of the operated knee, the automated pipeline's largest absolute difference from manual segmentation was 6.01% for CSA and 0.47% for FF. In reproducibility analysis, the average absolute difference in CSA quantification between scan and rescan was better for the automatic method than for manual segmentation (2.27% vs. 3.34%), whereas the average absolute difference in FF quantification was similar. CONCLUSIONS: The proposed method exhibits excellent accuracy and reproducibility in CSA and FF quantification compared with manual segmentation and can be used in large-scale patient studies.
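For reference, the two reported accuracy metrics can be computed from binary masks as in this minimal NumPy/SciPy sketch (the segmentation network itself is not shown; array names and spacing are illustrative):

```python
import numpy as np
from scipy import ndimage

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def assd(a: np.ndarray, b: np.ndarray, spacing=(1.0, 1.0)) -> float:
    """Average symmetric surface distance between two boolean masks."""
    def surface(m):
        return m & ~ndimage.binary_erosion(m)
    sa, sb = surface(a.astype(bool)), surface(b.astype(bool))
    # Distance from each mask's surface voxels to the other mask's surface
    da = ndimage.distance_transform_edt(~sb, sampling=spacing)[sa]
    db = ndimage.distance_transform_edt(~sa, sampling=spacing)[sb]
    return (da.sum() + db.sum()) / (len(da) + len(db))
```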


Subject(s)
Deep Learning; Thigh; Humans; Thigh/diagnostic imaging; Reproducibility of Results; Knee Joint; Muscle, Skeletal/diagnostic imaging; Magnetic Resonance Imaging/methods
3.
JACC Case Rep ; 7: 101722, 2023 Feb 01.
Article in English | MEDLINE | ID: mdl-36776793

ABSTRACT

In the following case series, we describe the clinical presentation of 2 patients with myocardial infarction with nonobstructive coronary arteries with different underlying pathophysiologic mechanisms. In both scenarios, cardiac magnetic resonance (CMR) imaging provided comprehensive tissue characterization with both conventional parametric mapping techniques and CMR fingerprinting. These cases demonstrate the diagnostic utility of CMR in elucidating the underlying etiology and guiding the appropriate therapeutic strategy. (Level of Difficulty: Advanced.)

4.
Curr Cardiol Rep ; 25(3): 119-131, 2023 03.
Article in English | MEDLINE | ID: mdl-36805913

ABSTRACT

PURPOSE OF REVIEW: Cardiac magnetic resonance fingerprinting (cMRF) has emerged as a technique for rapid, multi-parametric tissue property mapping that has the potential both to improve cardiac MRI exam efficiency and to expand the information captured. In this review, we describe the cMRF technique, summarize technical developments and in vivo reports, and highlight potential clinical applications. RECENT FINDINGS: Technical developments in cMRF continue to progress rapidly, including motion-compensated reconstruction, additional tissue property quantification, signal timecourse analysis, and synthetic LGE image generation. Such developments can simplify CMR protocols by combining multiple evaluations into a single acquisition and reducing the number of breath-held scans. cMRF continues to be reported for use in a range of pathologies; however, barriers to clinical implementation remain. Technical developments are described in this review, followed by a focus on the potential clinical applications they may support. Clinical translation of cMRF could shorten protocols, improve CMR accessibility, and provide additional information compared to conventional cardiac parametric mapping methods. Current needs for clinical implementation are discussed, as well as how those needs may be met to bring cMRF from its current research setting to a viable tool for patient care.


Subject(s)
Heart Diseases; Heart; Humans; Heart/diagnostic imaging; Magnetic Resonance Imaging/methods; Magnetic Resonance Spectroscopy; Heart Diseases/diagnostic imaging
5.
Invest Radiol ; 58(1): 60-75, 2023 Jan 01.
Article in English | MEDLINE | ID: mdl-36165880

ABSTRACT

ABSTRACT: Magnetic resonance imaging (MRI) is a valuable tool for evaluating musculoskeletal disease as it offers a range of image contrasts that are sensitive to underlying tissue biochemical composition and microstructure. Although MRI has the ability to provide high-resolution, information-rich images suitable for musculoskeletal applications, most MRI utilization remains in qualitative evaluation. Quantitative MRI (qMRI) provides additional value beyond qualitative assessment via objective metrics that can support disease characterization, disease progression monitoring, or therapy response. In this review, musculoskeletal qMRI techniques are summarized with a focus on techniques developed for osteoarthritis evaluation. Cartilage compositional MRI methods are described with a detailed discussion on relaxometric mapping (T2, T2*, T1ρ) without contrast agents. Methods to assess inflammation are described, including perfusion imaging, volume and signal changes, contrast-enhanced T1 mapping, and semiquantitative scoring systems. Quantitative characterization of structure and function by bone shape modeling and joint kinematics is described. Muscle evaluation by qMRI is discussed, including size (area, volume), relaxometric mapping (T1, T2, T1ρ), fat fraction quantification, diffusion imaging, and metabolic assessment by 31P-MR and creatine chemical exchange saturation transfer. Other notable technologies to support qMRI in musculoskeletal evaluation are described, including magnetic resonance fingerprinting, ultrashort echo time imaging, ultrahigh-field MRI, and hybrid MRI-positron emission tomography. Challenges for adopting and using qMRI in musculoskeletal evaluation are discussed, including the need for metal artifact suppression and qMRI standardization.
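As a concrete example of the relaxometric mapping the review describes, a per-voxel T2 estimate is commonly a mono-exponential fit to multi-echo signals. This is a minimal sketch of that fit; the echo times and signal values are made up, and clinical pipelines add constraints and noise handling:

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    """Mono-exponential decay model S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

def fit_t2(te_ms, signal):
    """Per-voxel T2 (ms) from multi-echo signal via least squares."""
    p0 = (signal[0], 40.0)               # rough starting guess
    (s0, t2), _ = curve_fit(mono_exp, te_ms, signal, p0=p0)
    return t2

te = np.array([10., 20., 30., 40., 50.])        # echo times (ms)
sig = 1200 * np.exp(-te / 35.0)                 # synthetic cartilage-like decay
print(f"fitted T2 = {fit_t2(te, sig):.1f} ms")  # ~35.0
```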


Subject(s)
Cartilage, Articular; Musculoskeletal Diseases; Humans; Cartilage, Articular/pathology; Magnetic Resonance Imaging/methods; Disease Progression; Musculoskeletal Diseases/pathology; Muscles
6.
Circ Heart Fail ; 15(10): e009322, 2022 10.
Article in English | MEDLINE | ID: mdl-35924562

ABSTRACT

Sarcopenia has been established as a predictor of poor outcomes in various clinical settings. It is particularly prevalent in heart failure, a clinical syndrome that poses significant challenges to health care worldwide. Despite this, sarcopenia remains overlooked and undertreated in cardiology practice. Understanding the currently proposed diagnostic process is paramount for the early detection and treatment of sarcopenia to mitigate downstream adverse health outcomes.


Subject(s)
Heart Failure; Sarcopenia; Humans; Heart Failure/diagnosis; Sarcopenia/diagnostic imaging; Sarcopenia/therapy; Frailty
7.
Doc Ophthalmol ; 144(2): 137-145, 2022 04.
Article in English | MEDLINE | ID: mdl-35247110

ABSTRACT

PURPOSE: A left ventricular assist device (LVAD) is an implantable cardiac pump that uses a magnetically levitated rotor to pump blood into circulation for patients with congestive heart failure. The continuous high-frequency motion of the pump can cause significant interference in electroretinography (ERG) recordings. We evaluate filtering methods to improve ERG quality in the presence of LVAD interference. METHODS: A patient with an implanted LVAD was referred to our clinic for ERG testing on suspicion of a retinal dystrophy. Full-field ERG (ffERG) and pattern ERG (pERG) were performed according to ISCEV standards. Recordings were acquired once in full-bandwidth mode and again in low-bandwidth mode. Digital low-pass and band-stop filtering were performed to mitigate ERG interference. Post-processing was also evaluated in a control subject with no implanted device. RESULTS: High-frequency interference was present in all ERG recordings and corresponded to the speed settings of the pump. When applied in post-processing, both low-pass and band-stop filters suppressed the interference and yielded readable ERGs without affecting peak times or amplitudes. By contrast, when recording in low-bandwidth mode, the filter roll-off was not steep enough to completely remove the interference, and peak delays were introduced that could not be readily corrected. CONCLUSIONS: LVAD interference in ERG waveforms can be successfully removed using simple digital filters. If post hoc data processing capabilities are unavailable, much of the interference can be removed by narrowing the acquisition bandwidth and averaging additional repeats of each stimulus response.
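A minimal sketch of the band-stop approach described here, using SciPy: a zero-phase IIR notch at the pump interference frequency, which is why peak times and amplitudes are preserved. The sampling rate and pump frequency below are assumed illustrative values, not those of the study, and pump harmonics would need additional notches:

```python
import numpy as np
from scipy import signal

fs = 2000.0       # ERG sampling rate (Hz); illustrative value
f_pump = 90.0     # interference frequency implied by pump speed (assumed)

# Band-stop (notch) at the pump frequency; Q controls the stop-band width
b, a = signal.iirnotch(w0=f_pump, Q=30.0, fs=fs)

def clean_erg(erg_trace):
    """Zero-phase application preserves a-wave/b-wave peak times."""
    return signal.filtfilt(b, a, erg_trace)

# Synthetic demo: an ERG-like transient plus pump interference
t = np.arange(0, 0.25, 1 / fs)
erg = 100 * t * np.exp(-t / 0.03)          # crude b-wave-like shape
noisy = erg + 15 * np.sin(2 * np.pi * f_pump * t)
filtered = clean_erg(noisy)
```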


Subject(s)
Heart Failure; Heart-Assist Devices; Retinal Dystrophies; Electroretinography/methods; Heart Failure/surgery; Humans
8.
Int J Cardiol ; 351: 107-110, 2022 Mar 15.
Article in English | MEDLINE | ID: mdl-34963645

ABSTRACT

BACKGROUND: Cardiac amyloidosis (CA) is an infiltrative cardiomyopathy with poor prognosis absent appropriate treatment. Elevated native myocardial T1 and T2 have been reported in CA, and tissue characterization by cardiac MRI may expedite diagnosis and treatment. Cardiac Magnetic Resonance Fingerprinting (cMRF) has the potential to enable tissue characterization for CA through rapid, simultaneous T1 and T2 mapping. Furthermore, cMRF signal timecourses may provide additional information beyond myocardial T1 and T2. METHODS: Nine CA patients and five controls were scanned at 3 T using a prospectively gated cMRF acquisition. Two cMRF-based analysis approaches were examined: (1) relaxometric-based linear discriminant analysis (LDA) using native T1 and T2, and (2) signal timecourse-based LDA. The Fisher coefficient was used to compare the separability of patient and control groups under both approaches. Leave-two-out cross-validation was employed to evaluate the classification error rates of both approaches. RESULTS: Elevated myocardial T1 and T2 were observed in patients vs controls (T1: 1395 ± 121 vs 1240 ± 36.4 ms, p < 0.05; T2: 36.8 ± 3.3 vs 31.8 ± 2.6 ms, p < 0.05). LDA scores were elevated in patients for relaxometric-based LDA (0.56 ± 0.28 vs 0.18 ± 0.13, p < 0.05) and timecourse-based LDA (0.97 ± 0.02 vs 0.02 ± 0.02, p < 0.05). The Fisher coefficient was greater for timecourse-based LDA (60.8) than for relaxometric-based LDA (1.6). Classification error rates were lower for timecourse-based LDA vs relaxometric-based LDA (12.6 ± 24.3 vs 22.5 ± 30.1%, p < 0.05). CONCLUSIONS: These findings suggest that cMRF may be a valuable technique for the detection and characterization of CA. Analysis of cMRF signal timecourse data may improve tissue characterization as compared to analysis of native T1 and T2 alone.
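A sketch of the relaxometric-based variant of this analysis, assuming scikit-learn. The (T1, T2) rows are fabricated for illustration and only loosely echo the reported group means; the timecourse-based variant would instead feed full cMRF signal timecourses as features:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeavePOut, cross_val_score

# Illustrative native (T1, T2) pairs in ms: first three rows CA patients,
# last three controls (values invented around the abstract's means)
X = np.array([[1395, 36.8], [1410, 38.0], [1380, 35.5],
              [1240, 31.8], [1250, 32.5], [1230, 30.9]])
y = np.array([1, 1, 1, 0, 0, 0])

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=LeavePOut(p=2))  # leave-two-out CV
print(f"mean leave-two-out accuracy: {scores.mean():.2f}")
```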


Subject(s)
Amyloidosis; Heart; Amyloidosis/diagnostic imaging; Humans; Magnetic Resonance Imaging; Magnetic Resonance Imaging, Cine/methods; Magnetic Resonance Spectroscopy; Myocardium; Phantoms, Imaging; Predictive Value of Tests
9.
Prog Nucl Magn Reson Spectrosc ; 122: 11-22, 2021 02.
Article in English | MEDLINE | ID: mdl-33632415

ABSTRACT

Quantitative cardiac magnetic resonance has emerged in recent years as an approach for evaluating a range of cardiovascular conditions, with T1 and T2 mapping at the forefront of these developments. Cardiac Magnetic Resonance Fingerprinting (cMRF) provides a rapid and robust framework for simultaneous quantification of myocardial T1 and T2 in addition to other tissue properties. Since the advent of cMRF, a number of technical developments and clinical validation studies have been reported. This review provides an overview of cMRF, recent technical developments, healthy subject and patient studies, anticipated technical improvements, and potential clinical applications. Recent technical developments include slice profile and pulse efficiency corrections, improvements in image reconstruction, simultaneous multislice imaging, 3D whole-ventricle imaging, motion-resolved imaging, fat-water separation, and machine learning for rapid dictionary generation. Future technical developments in cMRF, such as B0 and B1 field mapping, acceleration of acquisition and reconstruction, imaging of patients with implanted devices, and quantification of additional tissue properties are also described. Potential clinical applications include characterization of infiltrative, inflammatory, and ischemic cardiomyopathies, tissue characterization in the left atrium and right ventricle, post-cardiac transplantation assessment, reduction of contrast material, pre-procedural planning for electrophysiology interventions, and imaging of patients with implanted devices.


Subject(s)
Heart; Magnetic Resonance Imaging; Heart/diagnostic imaging; Humans; Magnetic Resonance Spectroscopy; Myocardium; Phantoms, Imaging
10.
Med Phys ; 48(1): 287-299, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33206403

ABSTRACT

PURPOSE: Myocardial perfusion imaging using computed tomography (MPI-CT) and coronary CT angiography (CTA) have the potential to make CT an ideal noninvasive imaging gatekeeper exam for invasive coronary angiography. However, beam hardening can prevent accurate blood flow estimation in dynamic MPI-CT and can create artifacts that resemble flow deficits in single-shot MPI-CT. In this work, we compare four automatic beam hardening correction algorithms (ABHCs) applied to CT images, for their ability to produce accurate single images of contrast and accurate MPI flow maps using images from conventional CT systems, without energy sensitivity. METHODS: Previously, we reported a method, herein called ABHC-1, where we iteratively optimized a cost function sensitive to beam hardening artifacts in MPI-CT images and used a low-order polynomial correction on projections of segmentation-processed CT images. Here, we report results from two new algorithms with higher-order polynomial corrections, ABHC-2 and ABHC-3 (with three and seven free parameters, respectively), having potentially better correction but likely reduced estimability. Additionally, we compared results to an algorithm reported by others in the literature (ABHC-NH). Comparisons were made on a digital static phantom with simulated water, bone, and iodine regions; on a digital dynamic anthropomorphic phantom with simulated blood flow; and on preclinical porcine experiments. We obtained CT images on a prototype spectral detector CT (Philips Healthcare) scanner that provided both conventional and virtual keV images, allowing us to quantitatively compare corrected CT images to virtual keV images. To test these methods' parameter-optimization sensitivity to noise, we evaluated results on images obtained using different mAs. RESULTS: In images of the static phantom, ABHC-2 reduced beam hardening artifacts better than our previous ABHC-1 algorithm, giving artifacts smaller than 1.8 HU, even in the presence of high noise which should affect parameter optimization. Taken together, the quality of static phantom results ordered ABHC-2 > ABHC-3 > ABHC-1 >> ABHC-NH. In an anthropomorphic MPI-CT simulator with homogeneous myocardial blood flow of 100 mL·min⁻¹·100 g⁻¹, blood flow estimation results were 122 ± 24 (FBP), 135 ± 24 (ABHC-NH), 104 ± 14 (ABHC-1), 100 ± 12 (ABHC-2), and 108 ± 18 (ABHC-3) mL·min⁻¹·100 g⁻¹, showing ABHC-2 as a clear winner. Visual and quantitative evaluations showed much improved homogeneity of myocardial flow with ABHC-2, nearly eliminating substantial artifacts in uncorrected flow maps which could be misconstrued as flow deficits. ABHC-2 performed universally better than ABHC-1, ABHC-3, and ABHC-NH in simulations with different acquisitions (varying noise and kVp values). In the presence of a simulated flow deficit, all ABHC methods retained the flow deficit, and ABHC-2 gave the most accurate flow ratio and homogeneity. ABHC-3-corrected phantom flow values were slightly better than ABHC-2 in noiseless images, suggesting that reduced quality in noisy images was due to reduced estimability. In an experiment with a pig expected to have uniform flow, ABHC-2 applied to conventional images improved flow maps to compare favorably with those from 70 keV images. CONCLUSION: The automated algorithm can be used with different parametric BH correction models. ABHC-2 improved MPI-CT blood flow estimation compared to other approaches and was robust to noisy images. In simulation and preclinical experiments, ABHC-2 gave results approaching gold-standard 70 keV measurements.


Subject(s)
Myocardial Perfusion Imaging; Algorithms; Animals; Artifacts; Phantoms, Imaging; Swine; Tomography, X-Ray Computed
11.
J Med Imaging (Bellingham) ; 6(4): 046001, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31720314

ABSTRACT

We created and evaluated a processing method for dynamic computed tomography myocardial perfusion imaging (CT-MPI) of myocardial blood flow (MBF), which combines a modified simple linear iterative clustering algorithm (SLIC) with robust perfusion quantification, hence the name SLICR. SLICR adaptively segments the myocardium into nonuniform super-voxels with similar perfusion time attenuation curves (TACs). Within each super-voxel, an α-trimmed-median TAC was computed to robustly represent the super-voxel, and a robust physiological model (RPM) was implemented to semi-analytically estimate MBF. SLICR processing was compared with another voxel-wise MBF preprocessing approach, which included a spatiotemporal bilateral filter (STBF) for noise reduction prior to perfusion quantification. Image data from a digital CT-MPI phantom and a porcine ischemia model were evaluated. SLICR was ~50-fold faster than voxel-wise RPM and other model-based methods while retaining sufficient resolution to show clinically relevant features, such as a transmural perfusion gradient. SLICR showed markedly improved accuracy and precision as compared with other methods. At a simulated MBF of 100 mL/min-100 g and a tube current-time product of 100 mAs (50% of nominal), the MBF estimates were 101 ± 12, 94 ± 56, and 54 ± 24 mL/min-100 g for SLICR, the voxel-wise Johnson-Wilson model, and a singular value decomposition model-independent method with STBF, respectively. SLICR estimated MBF precisely and accurately (103 ± 23 mL/min-100 g) at 25% of the nominal dose, while other methods resulted in larger errors. With the porcine model, the SLICR results were consistent with the induced ischemia. SLICR simultaneously accelerated and improved the quality of quantitative perfusion processing without compromising clinically relevant distributions of perfusion characteristics.
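The core SLICR ingredients, supervoxel clustering plus a robust per-supervoxel TAC, can be sketched as below. Plain SLIC from scikit-image stands in for the paper's modified, TAC-driven SLIC, and an α-trimmed median is used as described; array shapes and parameter values are illustrative:

```python
import numpy as np
from skimage.segmentation import slic

def trimmed_median(values, alpha=0.2):
    """Median after discarding the lowest/highest alpha fraction."""
    v = np.sort(values)
    k = int(alpha * len(v))
    return np.median(v[k:len(v) - k]) if len(v) > 2 * k else np.median(v)

def supervoxel_tacs(dyn, n_segments=500, alpha=0.2):
    """dyn: 4D array (t, z, y, x) of contrast-enhanced CT volumes.

    Clusters the time-averaged volume into supervoxels (plain SLIC here,
    unlike the paper's TAC-similarity-driven variant), then returns one
    robust TAC per supervoxel.
    """
    mean_vol = dyn.mean(axis=0)
    labels = slic(mean_vol, n_segments=n_segments,
                  compactness=0.1,      # tune to the intensity scale
                  channel_axis=None, start_label=1)
    tacs = {}
    for lab in np.unique(labels):
        mask = labels == lab
        tacs[lab] = np.array([trimmed_median(vol[mask], alpha)
                              for vol in dyn])
    return labels, tacs
```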

12.
Med Phys ; 46(4): 1648-1662, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30689216

ABSTRACT

PURPOSE: Computed tomography myocardial perfusion imaging (CT-MPI) and coronary CTA have the potential to make CT an ideal noninvasive imaging gatekeeper exam for invasive coronary angiography. However, beam hardening (BH) artifacts prevent accurate blood flow calculation in CT-MPI. BH correction methods require either energy-sensitive CT, which is not widely available, or, typically, a calibration-based method in conventional CT. We propose a calibration-free, automatic BH correction (ABHC) method suitable for CT-MPI and evaluate its ability to reduce BH artifacts in single "static-perfusion" images and to create accurate myocardial blood flow (MBF) maps in dynamic CT-MPI. METHODS: The algorithm takes CT DICOM images as input and iteratively optimizes parameters in a polynomial BH correction until a BH-sensitive cost function is minimized on output images. An input image was segmented into a soft tissue image and a highly attenuating material (HAM) image containing bones and regions of high iodine concentration, using mean HU and temporal enhancement properties. We forward projected HAM, corrected projection values according to a polynomial correction, and reconstructed a correction image to obtain the current iteration's BH-corrected image. The cost function was sensitive to BH streak artifacts and cupping. We evaluated the algorithm on simulated CT and physical phantom images, on preclinical porcine CT-MPI data with and without coronary obstruction, and on clinical CT-MPI data. Assessments included measures of BH artifact in single images as well as MBF estimates. We obtained CT images on a prototype spectral detector CT (SDCT, Philips Healthcare) scanner that provided both conventional and virtual keV images, allowing us to quantitatively compare corrected CT images to virtual keV images. To stress test the method, we evaluated results on images from a different scanner (iCT, Philips Healthcare) and different kVp values. RESULTS: In a CT-simulated digital phantom consisting of water with iodine cylinder insets, BH streak artifacts between simulated iodine inserts were reduced from 13 ± 2 to 0 ± 1 HU. In a similar physical phantom having higher iodine concentrations, BH streak artifacts were reduced from 48 ± 6 to 1 ± 5 HU and cupping was reduced by 86%, from 248 to 23 HU. In preclinical CT-MPI images without coronary obstruction, BH artifact was reduced from 24 ± 6 HU to less than 5 ± 4 HU at peak enhancement. The standard deviation across different regions of interest (ROIs) along the myocardium was reduced from 13.26 to 6.86 HU with ABHC, comparing favorably to measurements in the corresponding virtual keV image. Corrections greatly reduced variations in preclinical MBF maps obtained in normal animals without obstruction (FFR = 1). Coefficients of variation were 22% (conventional CT), 9% (ABHC), and 5% (virtual keV). Moreover, variations in flow tended to be localized after ABHC, giving results that would not be confused with a flow deficit in a coronary vessel territory. CONCLUSION: The automated algorithm can be used to reduce BH artifact in conventional CT and improve CT-MPI accuracy, particularly by removing regions of reduced estimated flow which might be misinterpreted as flow deficits.
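A compact 2D sketch of the ABHC loop's structure, assuming scikit-image radon/iradon as projection operators: forward project the HAM image, perturb its projections with a low-order polynomial, reconstruct a correction image, and search the polynomial coefficients to minimize a BH-sensitive cost. The cost below (soft-tissue gradient energy) is a simplified stand-in for the paper's streak- and cupping-sensitive cost, and the polynomial form is illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from skimage.transform import radon, iradon

def abhc_sketch(img, ham_mask, soft_mask, theta=np.arange(180.0)):
    """2D sketch of the calibration-free BHC loop (illustrative only).

    img       : reconstructed CT slice (HU), assumed square
    ham_mask  : bool mask of highly attenuating material (bone + iodine)
    soft_mask : bool mask of soft tissue, where streaks/cupping appear
    """
    p_ham = radon(np.where(ham_mask, img, 0.0), theta=theta)

    def corrected(coeffs):
        # Polynomial perturbation of HAM projections: a2*p^2 + a3*p^3
        dp = coeffs[0] * p_ham**2 + coeffs[1] * p_ham**3
        corr = iradon(dp, theta=theta, filter_name="ramp",
                      output_size=img.shape[0])
        return img - corr

    def cost(coeffs):
        # Stand-in BH-sensitive cost: spatial variation in soft tissue,
        # which streak and cupping artifacts inflate
        c = corrected(coeffs)
        gy, gx = np.gradient(np.where(soft_mask, c, 0.0))
        return np.sum(np.hypot(gy, gx)[soft_mask])

    res = minimize(cost, x0=np.zeros(2), method="Nelder-Mead")
    return corrected(res.x)
```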


Subject(s)
Algorithms; Coronary Occlusion/diagnostic imaging; Myocardial Perfusion Imaging/methods; Phantoms, Imaging; Radiographic Image Enhancement/methods; Radiographic Image Interpretation, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; Animals; Calibration; Female; Myocardial Perfusion Imaging/instrumentation; Swine; Tomography, X-Ray Computed/instrumentation
13.
Phys Med Biol ; 63(18): 185011, 2018 09 13.
Article in English | MEDLINE | ID: mdl-30113311

ABSTRACT

In this work, we clarified the role of acquisition parameters and quantification methods in myocardial blood flow (MBF) estimability for myocardial perfusion imaging using CT (MPI-CT). We used a physiologic model with a CT simulator to generate time-attenuation curves across a range of imaging conditions, i.e. tube current-time product, imaging duration, and temporal sampling, and physiologic conditions, i.e. MBF and arterial input function width. We assessed MBF estimability by precision (interquartile range of MBF estimates) and bias (difference between median MBF estimate and reference MBF) for multiple quantification methods. Methods included: six existing model-based deconvolution models, such as the plug-flow tissue uptake model (PTU), Fermi function model, and single-compartment model (SCM); two proposed robust physiologic models (RPM1, RPM2); model-independent singular value decomposition with Tikhonov regularization determined by the L-curve criterion (LSVD); and maximum upslope (MUP). Simulations show that MBF estimability is most affected by changes in imaging duration for model-based methods and by changes in tube current-time product and sampling interval for model-independent methods. Models with three parameters, i.e. RPM1, RPM2, and SCM, gave the least biased and most precise MBF estimates. The average relative bias (precision) for RPM1, RPM2, and SCM was ⩽11% (⩽10%), and these models produced high-quality MBF maps in CT-simulated phantom data as well as in a porcine model of coronary artery stenosis. In terms of precision, the methods ranked best-to-worst are: RPM1 > RPM2 > Fermi > SCM > LSVD > MUP [Formula: see text] other methods. In terms of bias, the models ranked best-to-worst are: SCM > RPM2 > RPM1 > PTU > LSVD [Formula: see text] other methods. Models with four or more parameters, particularly five-parameter models, had very poor precision (as much as 310% uncertainty) and/or significant bias (as much as 493%) and were sensitive to parameter initialization, suggesting the presence of multiple local minima. For improved estimates of MBF from MPI-CT, we recommend reduced models that incorporate prior knowledge of physiology and contrast agent uptake, such as the proposed RPM1 and RPM2 models.
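For concreteness, a single-compartment-style fit convolves a flow-scaled exponential impulse response with the arterial input function (AIF) and adjusts its parameters by least squares. The sketch below is a two-parameter simplification on synthetic curves (the paper's three-parameter models include, e.g., a delay term); the AIF shape, noise level, and unit conversion are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

dt = 1.0                                   # frame interval (s)
t = np.arange(0, 40, dt)
c_a = 300 * (t / 8.0) * np.exp(-t / 8.0)   # gamma-variate AIF (illustrative)

def scm(t, flow, k):
    """Tissue TAC = flow-scaled exponential IRF convolved with the AIF."""
    irf = flow * np.exp(-k * t)
    return np.convolve(c_a, irf)[:len(t)] * dt

# Make a noisy synthetic tissue curve at a known flow, then refit it
rng = np.random.default_rng(0)
meas = scm(t, flow=0.01, k=0.05) + rng.normal(0, 1.0, t.size)
(flow, k), _ = curve_fit(scm, t, meas, p0=(0.005, 0.1))
print(f"flow = {flow * 6000:.0f} mL/min/100 g (true 60)")  # 0.01/s ~ 60
```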


Subject(s)
Algorithms; Coronary Circulation; Coronary Vessels/physiology; Myocardial Perfusion Imaging/methods; Phantoms, Imaging; Radiographic Image Interpretation, Computer-Assisted/methods; Tomography, X-Ray Computed/methods; Animals; Swine
14.
Sci Rep ; 8(1): 4169, 2018 03 08.
Article in English | MEDLINE | ID: mdl-29520005

ABSTRACT

Measurement of left atrial (LA) wall thickness (WT) using cardiac computed tomography (CT) is observer dependent and cannot provide a rapid and comprehensive visualisation of the global LA WT. We aimed to develop an LA wall-mapping application that displays the global LA WT on a coplanar plane. The accuracy, intra-observer reproducibility, and inter-observer reproducibility of the application were validated using digital/physical phantoms and CT images of eight patients. The application's CT-based LA WT measures were further validated using six pig cardiac specimens. To evaluate accuracy, expanded maps of the physical phantom and pig LA were generated from the CT images and compared with the expanded map of the digital phantom and the LA wall of the pig hearts. Using our application, no significant differences (p > 0.05) were found between the physical and digital phantoms, or between pig heart specimens and their CT images. Moreover, whether the analysis was based on the LA physical phantom or on clinical patient images, the results consistently demonstrated high intra-observer reproducibility (ICC > 0.9) and inter-observer reproducibility (ICC > 0.8), and showed good correlation between measures from pig heart specimens and CT data (r = 0.96, p < 0.001). The application can process and analyse the LA architecture for further visualisation and quantification.


Subject(s)
Models, Cardiovascular; Phantoms, Imaging; Software; Tomography, X-Ray Computed; Animals; Female; Heart Atria/diagnostic imaging; Humans; Male; Reproducibility of Results; Swine; Tomography, X-Ray Computed/instrumentation; Tomography, X-Ray Computed/methods
15.
Article in English | MEDLINE | ID: mdl-32189825

ABSTRACT

There are several computational methods for estimating myocardial blood flow (MBF) using CT myocardial perfusion imaging (CT-MPI). Previous work has shown that model-based deconvolution methods are more accurate and precise than model-independent methods such as singular value decomposition and max-upslope. However, iterative optimization is computationally expensive and models are sensitive to image noise, limiting the utility of low x-ray dose acquisitions. We propose a new processing method, SLICR, which segments the myocardium into super-voxels using a modified simple linear iterative clustering (SLIC) algorithm and quantifies MBF via a robust physiologic model (RPM). We compared SLICR against voxel-wise SVD and voxel-wise model-based deconvolution methods (RPM, single-compartment, and Johnson-Wilson). We used image data from a digital CT-MPI phantom to evaluate the robustness of the processing methods to noise at reduced x-ray dose. We validated SLICR in a porcine model with and without partial occlusion of the LAD coronary artery with known pressure-wire fractional flow reserve. SLICR was ~50 times faster than voxel-wise RPM and other model-based methods while retaining sufficient resolution to show all clinically interesting features (e.g., a flow deficit in the endocardial wall). SLICR showed much better precision and accuracy than the other methods. For example, at a simulated MBF of 100 mL/min/100g and 100 mAs exposure (50% of nominal dose) in the digital simulator, MBF estimates were 101 ± 12 mL/min/100g, 160 ± 54 mL/min/100g, and 122 ± 99 mL/min/100g for SLICR, SVD, and Johnson-Wilson, respectively. SLICR even gave excellent results (103 ± 23 mL/min/100g) at 50 mAs, corresponding to 25% of nominal dose.

16.
Phys Med Biol ; 61(6): 2407-31, 2016 Mar 21.
Article in English | MEDLINE | ID: mdl-26943749

ABSTRACT

We optimized and evaluated dynamic myocardial CT perfusion (CTP) imaging on a prototype spectral detector CT (SDCT) scanner. Simultaneous acquisition of energy-sensitive projections on the SDCT system enabled projection-based material decomposition, which typically performs better than the image-based decomposition required by some other system designs. In addition to virtual monoenergetic, or keV, images, the SDCT provided conventional (kVp) images, allowing us to compare and contrast results. Physical phantom measurements demonstrated linearity of keV images, a requirement for quantitative perfusion. Comparisons of kVp to keV images demonstrated very significant reductions in tell-tale beam hardening (BH) artifacts in both phantom and pig images. In phantom images, consideration of iodine contrast-to-noise ratio and small residual BH artifacts suggested optimum processing at 70 keV. The processing pipeline for dynamic CTP measurements included 4D image registration, spatio-temporal noise filtering, and model-independent singular value decomposition deconvolution, automatically regularized using the L-curve criterion. In normal pig CTP, 70 keV perfusion estimates were homogeneous throughout the myocardium. At 120 kVp, flow was reduced by more than 20% in the BH-hypo-enhanced myocardium, a range that might falsely indicate actionable ischemia, considering the 0.8 threshold for actionable FFR. With partial occlusion of the left anterior descending (LAD) artery (FFR < 0.8), perfusion defects at 70 keV were correctly identified in the LAD territory. At 120 kVp, BH affected the size and flow of the ischemic area; e.g. with FFR ≈ 0.65, the anterior-to-lateral flow ratio was 0.29 ± 0.01, overestimating stenosis severity as compared to 0.42 ± 0.01 (p < 0.05) at 70 keV. On the non-ischemic inferior wall (not a LAD territory), the flow ratio was 0.50 ± 0.04, falsely indicating an actionable ischemic condition in a healthy territory; this ratio was 1.00 ± 0.08 at 70 keV. Results suggest that projection-based keV imaging with the SDCT system and proper processing could enable useful myocardial CTP, much improved over conventional CT.


Subject(s)
Myocardial Ischemia/diagnosis; Myocardial Perfusion Imaging/methods; Tomography, X-Ray Computed/methods; Animals; Myocardial Perfusion Imaging/instrumentation; Phantoms, Imaging; Swine; Tomography, X-Ray Computed/instrumentation
17.
Article in English | MEDLINE | ID: mdl-29568147

ABSTRACT

Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF) which can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR = 1.0) and under moderate ischemia (FFR = 0.7) using MPI-CT and compared to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). All three approaches overestimated MBF compared to cryo-imaging. JW produced the most accurate MBF, with an average error of 33.3 ± 19.2 mL/min/100g, whereas LSVD and ThSVD had greater overestimation, 59.5 ± 28.3 mL/min/100g and 78.3 ± 25.6 mL/min/100g, respectively. Relative blood flow as assessed by the LAD-to-remote myocardium flow ratio was strongly correlated between JW and cryo-imaging, with R2 = 0.97, compared to R2 = 0.88 and 0.78 for LSVD and ThSVD, respectively. We examined the tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
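A minimal version of the Tikhonov-regularized SVD deconvolution compared above (the ThSVD/LSVD family) follows directly from the discrete convolution model; here lam plays the role of the fixed or L-curve-chosen regularization parameter, and all curves are synthetic:

```python
import numpy as np
from scipy.linalg import toeplitz

def tikhonov_svd_deconv(c_a, c_t, dt, lam):
    """Model-independent deconvolution of a tissue TAC by the AIF.

    Solves c_t = dt * (A @ r) for the impulse response r, where A is the
    causal (lower-triangular) convolution matrix built from the AIF c_a.
    """
    A = dt * toeplitz(c_a, np.zeros_like(c_a))
    U, s, Vt = np.linalg.svd(A)
    f = s**2 / (s**2 + lam**2)          # Tikhonov filter factors
    r = Vt.T @ ((f / s) * (U.T @ c_t))
    return r                            # flow-scaled impulse response

t = np.arange(0, 40, 1.0)
c_a = 300 * (t / 8) * np.exp(-t / 8)                # illustrative AIF
r_true = 0.01 * np.exp(-0.05 * t)                   # flow-scaled residue
c_t = 1.0 * toeplitz(c_a, np.zeros_like(c_a)) @ r_true
r_est = tikhonov_svd_deconv(c_a, c_t, dt=1.0, lam=5.0)
print(f"estimated flow-scaled peak: {r_est.max():.4f}")  # ~0.01
```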

18.
Article in English | MEDLINE | ID: mdl-32210495

ABSTRACT

The detection of subendocardial ischemia exhibiting an abnormal transmural perfusion gradient (TPG) may help identify ischemic conditions due to microvascular dysfunction. We evaluated the effect of beam hardening (BH) artifacts on TPG quantification using myocardial CT perfusion (CTP). We used a prototype spectral detector CT scanner (Philips Healthcare) to acquire dynamic myocardial CTP scans in a porcine ischemia model with partial occlusion of the left anterior descending (LAD) coronary artery guided by pressure wire-derived fractional flow reserve (FFR) measurements. Conventional 120 kVp and 70 keV projection-based mono-energetic images were reconstructed from the same projection data and used to compute myocardial blood flow (MBF) using the Johnson-Wilson model. Under moderate LAD occlusion (FFR ~ 0.7), we used three 5 mm short-axis slices and divided the myocardium into three LAD segments and three remote segments. For each slice and each segment, we characterized TPG as the mean "endo-to-epi" transmural flow ratio (TFR). BH-induced hypoenhancement on the ischemic anterior wall at 120 kVp resulted in a significantly lower mean TFR value as compared to the 70 keV value (0.29 ± 0.01 vs. 0.55 ± 0.01; p < 1e-05). No significant difference was measured between 120 kVp and 70 keV mean TFR values in segments moderately affected or unaffected by BH. In the entire ischemic LAD territory, 120 kVp mean endocardial flow was significantly reduced as compared to mean epicardial flow (15.80 ± 10.98 vs. 40.85 ± 23.44 mL/min/100g; p < 1e-04). At 70 keV, BH was effectively minimized, resulting in a mean endocardial MBF of 40.85 ± 15.34 mL/min/100g vs. 74.09 ± 5.07 mL/min/100g (p = 0.0054) in the epicardium. We also found that BH artifact in the conventional 120 kVp images resulted in falsely reduced MBF measurements even under non-ischemic conditions.
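Given an MBF map and layer masks, the TFR reported here is simply a ratio of layer means, e.g.:

```python
import numpy as np

def transmural_flow_ratio(mbf_map, endo_mask, epi_mask):
    """Endo-to-epi transmural flow ratio for one myocardial segment.

    mbf_map  : MBF map in mL/min/100 g
    *_mask   : boolean masks for the subendocardial / subepicardial layers
    """
    return float(np.mean(mbf_map[endo_mask]) / np.mean(mbf_map[epi_mask]))

# With the abstract's 70 keV territory means: 40.85 / 74.09 ~ 0.55
```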

19.
Article in English | MEDLINE | ID: mdl-33953456

ABSTRACT

Myocardial perfusion imaging using CT (MPI-CT) and coronary CTA have the potential to make CT an ideal noninvasive gatekeeper for invasive coronary angiography. However, beam hardening artifacts (BHA) prevent accurate blood flow calculation in MPI-CT. BH correction (BHC) methods require either energy-sensitive CT, which is not widely available, or, typically, a calibration-based method. We developed a calibration-free, automatic BHC (ABHC) method suitable for MPI-CT. The algorithm works with any BHC method and iteratively determines model parameters using a proposed BHA-specific cost function. In this work, we use the polynomial BHC extended to three materials. The image is segmented into soft tissue, bone, and iodine images based on mean HU and temporal enhancement. Forward projections of the bone and iodine images are obtained, and in each iteration a polynomial correction is applied. Corrections are then back projected and combined to obtain the current iteration's BHC image. This process is iterated until the cost is minimized. We evaluated the algorithm on simulated and physical phantom images and on preclinical MPI-CT data. The scans were obtained on a prototype spectral detector CT (SDCT) scanner (Philips Healthcare). Mono-energetic reconstructed images were used as the reference. In the simulated phantom, BH streak artifacts were reduced from 12 ± 2 HU to 1 ± 1 HU and cupping was reduced by 81%. Similarly, in the physical phantom, BH streak artifacts were reduced from 48 ± 6 HU to 1 ± 5 HU and cupping was reduced by 86%. In preclinical MPI-CT images, BHA was reduced from 28 ± 6 HU to less than 4 ± 4 HU at peak enhancement. Results suggest that the algorithm can be used to reduce BHA in conventional CT and improve MPI-CT accuracy.

20.
Med Phys ; 42(10): 6098-111, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26429285

ABSTRACT

PURPOSE: The aims of this study were to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. METHODS: Detectability (d') was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre-Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, PC. The Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. RESULTS: Detection in IMR was greater than in FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit and model complexity according to AICc. With parameters fixed, the model reasonably predicted the detectability of human observers in blended FBP-IMR images. Semianalytic internal noise computation gave results equivalent to Monte Carlo, greatly speeding parameter estimation. Using Model-k4, the authors found an average detectability improvement of 2.7 ± 0.4 times that of FBP. IMR showed greater improvements in detectability with larger signals and relatively consistent improvements across signal contrast and x-ray dose. In the phantom tested, Model-k4 predicted an 82% dose reduction compared to FBP, verified with physical CT scans at 80% reduced dose. CONCLUSIONS: IMR improves detectability over FBP and may enable significant dose reductions. A channelized Hotelling observer with internal noise proportional to channel output standard deviation agreed well with human observers across a wide range of variables, even across reconstructions with drastically different image characteristics. Utility of the model observer was demonstrated by predicting the effect of image processing (blending), analyzing detectability improvements with IMR across dose, size, and contrast, and guiding real CT scan dose-reduction experiments. Such a model observer can be applied to optimizing parameters in advanced iterative reconstruction algorithms as well as guiding dose-reduction protocols in physical CT experiments.
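The Hotelling computation in channel space, with internal noise proportional to each channel output's standard deviation (the Model-k4-style choice), reduces to a few lines. This sketch omits the Laguerre-Gauss channelization, the 4-AFC mapping from d' to PC, and the semianalytic noise computation described above:

```python
import numpy as np

def cho_dprime(v_sig, v_bkg, k_internal=0.0, seed=0):
    """Channelized Hotelling detectability from channel outputs.

    v_sig, v_bkg : (n_images, n_channels) channel outputs (e.g., images
                   projected onto Laguerre-Gauss channels), signal-present
                   and signal-absent
    k_internal   : internal noise std per channel as a fraction of that
                   channel's output std (0 = ideal CHO)
    """
    rng = np.random.default_rng(seed)
    if k_internal > 0:
        v_sig = v_sig + rng.normal(0, k_internal * v_sig.std(axis=0),
                                   v_sig.shape)
        v_bkg = v_bkg + rng.normal(0, k_internal * v_bkg.std(axis=0),
                                   v_bkg.shape)
    dv = v_sig.mean(axis=0) - v_bkg.mean(axis=0)   # mean channel difference
    S = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))  # pooled covariance
    w = np.linalg.solve(S, dv)                     # Hotelling template
    return float(np.sqrt(dv @ w))                  # d' = sqrt(dv' S^-1 dv)
```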


Subject(s)
Computer Simulation; Image Processing, Computer-Assisted/methods; Machine Learning; Radiation Dosage; Tomography, X-Ray Computed; Humans; Observer Variation; Phantoms, Imaging; Quality Control