Estimation of the biomechanical mammographic deformation of the breast using machine learning models.
Clin Biomech (Bristol, Avon); 110: 106117, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37826970
ABSTRACT
BACKGROUND:
A typical problem in the registration of MRI and X-ray mammography is the nonlinear deformation applied to the breast during mammography. We have developed a method for virtual deformation of the breast using a biomechanical model automatically constructed from MRI. The virtual deformation is applied in two steps: unloaded state estimation and compression simulation. The finite element method is used to solve the deformation process; however, its extensive computational cost prevents its use in clinical routine.
METHODS:
We propose three machine learning models to overcome this problem: an extremely randomized tree (first model), extreme gradient boosting (second model), and a deep learning-based bidirectional long short-term memory network with an attention layer (third model) to predict the deformation of a biomechanical model. We evaluated our methods on 516 breasts with realistic compression ratios of up to 76%.
FINDINGS:
We first applied one-fold validation, in which the second and third models performed better than the first. We then applied ten-fold validation. For the unloaded state estimation, the median RMSE of the second and third models is 0.8 mm and 1.2 mm, respectively. For the compression, the median RMSE is 3.4 mm for both models. We evaluated correlations between model accuracy and characteristics of the clinical datasets, such as compression ratio, breast volume, and tissue types.
INTERPRETATION:
Using the proposed models, we achieved accurate results comparable to the finite element model, with a speedup factor of 240 using the extreme gradient boosting model. The proposed models can replace the finite element simulation, enabling clinically relevant real-time application.
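The workflow the abstract describes, training a tree-based surrogate to predict FEM nodal displacements and scoring it with a median RMSE over ten folds, can be sketched as follows. This is an illustrative assumption-laden sketch, not the authors' pipeline: the per-node features (rest coordinates plus compression ratio), the synthetic data, and the error metric details are all placeholders chosen for demonstration.

```python
# Hypothetical sketch: an extremely randomized trees regressor as a
# surrogate for FEM-predicted nodal displacements, evaluated with
# ten-fold cross-validation and a median RMSE, mirroring the validation
# protocol in the abstract. Features and data are synthetic assumptions.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

# Synthetic stand-in: per-node rest coordinates (mm) plus a compression ratio
n_samples = 300
X = np.column_stack([
    rng.uniform(0.0, 100.0, (n_samples, 3)),   # rest position x, y, z (mm)
    rng.uniform(0.20, 0.76, n_samples),        # compression ratio
])
# Synthetic displacement targets: 3 components per node (mm)
y = 0.5 * X[:, :3] * X[:, 3:4] + rng.normal(0.0, 0.5, (n_samples, 3))

rmses = []
for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    model = ExtraTreesRegressor(n_estimators=100, random_state=0)
    model.fit(X[train], y[train])
    err = model.predict(X[test]) - y[test]
    # RMSE over the Euclidean displacement error of held-out nodes
    rmses.append(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))

print(f"median RMSE over 10 folds: {np.median(rmses):.2f} mm")
```

Once trained, such a surrogate replaces the per-case finite element solve with a single batched prediction, which is where the reported speedup over FEM simulation would come from.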
Database: MEDLINE
Main subject: Breast / Mammography
Language: English
Year of publication: 2023
Document type: Article