1.
Magn Reson Med; 89(1): 40-53, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36161342

ABSTRACT

PURPOSE: We have introduced an artificial intelligence framework, 31P-SPAWNN, in order to fully analyze phosphorus-31 (31P) magnetic resonance spectra. The flexibility and speed of the technique rival those of traditional least-squares fitting methods; the performance of the two approaches is compared in this work.

THEORY AND METHODS: Convolutional neural network architectures have been proposed for the analysis and quantification of 31P spectroscopy. The generation of training and test data using a fully parameterized model is presented herein. In vivo unlocalized free induction decay and three-dimensional 31P magnetic resonance spectroscopic imaging data were acquired from healthy volunteers before being quantified using either 31P-SPAWNN or traditional least-squares fitting techniques.

RESULTS: The presented experiments demonstrated both the reliability and accuracy of 31P-SPAWNN for estimating metabolite concentrations and spectral parameters. Simulated test data showed improved quantification using 31P-SPAWNN compared with LCModel. In vivo data analysis revealed higher accuracy at low signal-to-noise ratio using 31P-SPAWNN, with equivalent precision. Processing time using 31P-SPAWNN can be shortened by up to two orders of magnitude.

CONCLUSION: The accuracy, reliability, and computational speed of the method open new perspectives for integrating these applications in a clinical setting.


Subject(s)
Artificial Intelligence, Phosphorus, Humans, Reproducibility of Results, Magnetic Resonance Spectroscopy/methods, Neural Networks (Computer)
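
The abstract above describes training a convolutional network on spectra generated from a fully parameterized signal model. Below is a minimal sketch of such a data generator, assuming a simple Lorentzian (damped complex exponential) lineshape model; the metabolite list, chemical shifts, field strength, and parameter ranges are illustrative assumptions, not the values used by 31P-SPAWNN.

```python
# Sketch of a parameterized 31P spectrum simulator for generating labeled
# training pairs (spectrum, metabolite amplitudes). Illustrative only.
import numpy as np

def simulate_fid(amplitudes, shifts_ppm, dampings_hz, phases_rad,
                 n_points=1024, dwell_time=5e-4, f0_mhz=51.7, noise_sd=0.01,
                 rng=None):
    """Sum of exponentially damped complex sinusoids plus Gaussian noise.

    amplitudes  : peak amplitudes (arbitrary units)
    shifts_ppm  : chemical shifts relative to the reference (ppm)
    dampings_hz : Lorentzian damping rates (Hz)
    phases_rad  : zero-order phase per resonance (rad)
    f0_mhz      : 31P Larmor frequency, ~51.7 MHz at an assumed 3 T
    """
    rng = rng or np.random.default_rng()
    t = np.arange(n_points) * dwell_time
    fid = np.zeros(n_points, dtype=complex)
    for a, ppm, d, ph in zip(amplitudes, shifts_ppm, dampings_hz, phases_rad):
        freq_hz = ppm * f0_mhz  # 1 ppm of f0 (in MHz) is f0 Hz
        fid += a * np.exp(1j * (2 * np.pi * freq_hz * t + ph)) * np.exp(-d * t)
    fid += noise_sd * (rng.standard_normal(n_points)
                       + 1j * rng.standard_normal(n_points))
    return fid

def make_training_pair(rng):
    """Draw random model parameters; return (magnitude spectrum, amplitudes)."""
    # Three illustrative resonances: PCr, Pi, gamma-ATP (approximate shifts).
    shifts = np.array([0.0, 4.8, -2.5])
    amps = rng.uniform(0.2, 1.0, size=3)
    damp = rng.uniform(5.0, 30.0, size=3)
    phase = rng.uniform(-0.3, 0.3, size=3)
    fid = simulate_fid(amps, shifts, damp, phase, rng=rng)
    spectrum = np.fft.fftshift(np.fft.fft(fid))
    return np.abs(spectrum), amps

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    X, y = zip(*(make_training_pair(rng) for _ in range(8)))
    print(np.stack(X).shape, np.stack(y).shape)  # (8, 1024) (8, 3)
```

Pairs produced this way can serve as network inputs (spectra) and regression targets (amplitudes); a least-squares fit of the same parameterized model would provide the comparison baseline.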
2.
Phys Med; 102: 79-87, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36137403

ABSTRACT

MRI is a non-invasive medical imaging modality that is sensitive to patient motion, which constitutes a major limitation in most clinical applications. Solutions may arise from the reduction of acquisition times or from motion-correction techniques, either prospective or retrospective. Benchmarking the latter methods requires labeled motion-corrupted datasets, which are uncommon. To the best of our knowledge, no protocol for generating labeled datasets of MRI images corrupted by controlled motion has yet been proposed. Hence, we present a methodology allowing the acquisition of reproducible motion-corrupted MRI images as well as validation of the system's performance by motion estimation through rigid-body volume registration of fast 3D echo-planar imaging (EPI) time series. A proof of concept is presented to show how the protocol can be implemented to provide qualitative and quantitative results. An MRI-compatible video system displays a moving target that volunteers equipped with customized plastic glasses must follow to perform predefined head choreographies. Motion estimation using rigid-body EPI time-series registration demonstrated that head position can be accurately determined (with an average standard deviation of about 0.39 degrees). A spatio-temporal upsampling and interpolation method to cope with fast motion is also proposed in order to improve motion estimation. The proposed protocol is versatile and straightforward. It is compatible with all MRI systems and may provide insights into the origins of specific motion artifacts. The MRI and artificial intelligence research communities could benefit from this work to build in vivo labeled datasets of motion-corrupted MRI images suitable for training/testing any retrospective motion correction or machine learning algorithm.


Subject(s)
Artifacts, Artificial Intelligence, Brain/diagnostic imaging, Humans, Image Processing (Computer-Assisted)/methods, Magnetic Resonance Imaging/methods, Motion, Plastics, Prospective Studies, Retrospective Studies
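
The protocol above validates controlled head motion by rigid-body registration of a fast 3D EPI time series. Below is a minimal sketch of rigid-body volume registration, assuming a mean-squared-error cost optimized over three rotations and three translations with SciPy's Powell method; it is an illustrative stand-in, not the registration pipeline used in the study.

```python
# Sketch: estimate rigid-body motion between two 3D volumes (e.g. EPI frames).
import numpy as np
from scipy.ndimage import affine_transform, gaussian_filter
from scipy.optimize import minimize

def rotation_matrix(rx, ry, rz):
    """Compose a 3D rotation from Euler angles (radians) about x, y, z."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def apply_rigid(volume, params):
    """Resample `volume` under a rigid transform (3 rotations, 3 translations)."""
    rx, ry, rz, tx, ty, tz = params
    R = rotation_matrix(rx, ry, rz)
    center = (np.array(volume.shape) - 1) / 2.0
    # affine_transform maps output coords to input coords:
    # x_in = R @ (x_out - center) + center + t
    offset = center - R @ center + np.array([tx, ty, tz])
    return affine_transform(volume, R, offset=offset, order=1, mode='nearest')

def register_rigid(moving, reference):
    """Find rigid parameters that best map `moving` onto `reference` (MSE cost)."""
    def cost(params):
        return np.mean((apply_rigid(moving, params) - reference) ** 2)
    result = minimize(cost, x0=np.zeros(6), method='Powell')
    return result.x  # (rx, ry, rz, tx, ty, tz)

if __name__ == "__main__":
    # Synthetic smoothed cube standing in for a head volume; recover a known
    # 2-degree rotation and 1-voxel translation.
    rng = np.random.default_rng(0)
    vol = np.zeros((32, 32, 32))
    vol[10:22, 10:22, 10:22] = 1.0
    vol = gaussian_filter(vol, 2.0) + 0.01 * rng.standard_normal(vol.shape)
    true_params = np.array([np.deg2rad(2.0), 0.0, 0.0, 1.0, 0.0, 0.0])
    moved = apply_rigid(vol, true_params)
    est = register_rigid(vol, moved)
    print("true  (deg, vox):", np.rad2deg(true_params[:3]), true_params[3:])
    print("found (deg, vox):", np.rad2deg(est[:3]), est[3:])
```

Registering each frame of an EPI time series against a reference frame in this way yields a six-parameter motion trace per time point; the spatio-temporal upsampling mentioned in the abstract would refine such traces when motion is fast relative to the frame rate.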