Multi-modal robust inverse-consistent linear registration.
Hum Brain Mapp; 36(4): 1365-80, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25470798
Registration performance can significantly deteriorate when image regions do not comply with model assumptions. Robust estimation improves registration accuracy by reducing or ignoring the contribution of voxels with large intensity differences, but existing approaches are limited to monomodal registration. In this work, we propose a robust and inverse-consistent technique for cross-modal affine image registration. The algorithm is derived from a contextual framework of image registration. The key idea is to use a modality-invariant representation of images based on local entropy estimation, and to incorporate a heteroskedastic noise model. This noise model allows us to draw the analogy to iteratively reweighted least squares estimation and to leverage existing weighting functions to account for differences in local information content in multimodal registration. Furthermore, we use the nonparametric windows density estimator to reliably calculate the entropy of small image patches. Finally, we derive the Gauss-Newton update and show that it is equivalent to the efficient second-order minimization for the fully symmetric registration approach. We illustrate the excellent performance of the proposed method on datasets containing outliers for the alignment of brain tumor, full head, and histology images.
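The two central ingredients of the abstract can be sketched in a few lines: a local-entropy image as a modality-invariant representation, and IRLS-style weights that down-weight voxels with large residuals. This is a minimal illustration, not the authors' implementation: it uses a plain per-patch histogram entropy estimator (the paper uses a nonparametric windows density estimator) and Huber weights as one example of the reweighting functions the IRLS analogy admits; the function names and parameters are hypothetical.

```python
import numpy as np

def local_entropy(img, patch=5, bins=16):
    """Modality-invariant representation: Shannon entropy of the intensity
    histogram in a small patch around each pixel. NOTE: a simple histogram
    estimator stands in for the paper's nonparametric windows estimator."""
    h, w = img.shape
    r = patch // 2
    pad = np.pad(img, r, mode="reflect")
    out = np.zeros((h, w), dtype=float)
    lo, hi = float(img.min()), float(img.max()) + 1e-12
    for i in range(h):
        for j in range(w):
            block = pad[i:i + patch, j:j + patch]
            counts, _ = np.histogram(block, bins=bins, range=(lo, hi))
            p = counts / counts.sum()
            nz = p[p > 0]                      # 0 * log 0 := 0
            out[i, j] = -np.sum(nz * np.log(nz))
    return out

def huber_weights(residual, k=1.345):
    """IRLS weights from Huber's rho: weight 1 for small residuals,
    k/|r| beyond the threshold, so outlier voxels (e.g. tumor regions)
    contribute less to the registration objective."""
    r = np.abs(residual)
    w = np.ones_like(r, dtype=float)
    mask = r > k
    w[mask] = k / r[mask]
    return w

# Usage sketch: entropy images of the two modalities would be compared
# voxel-wise, and the weighted residuals fed to a Gauss-Newton update
# of the affine parameters (not shown here).
flat = np.zeros((8, 8))                        # homogeneous patch -> entropy 0
ent = local_entropy(flat)
w = huber_weights(np.array([0.5, 3.0]))        # inlier keeps weight 1, outlier shrinks
```

Because both inputs are first mapped to entropy images, the residual compares information content rather than raw intensities, which is what makes a monomodal-style robust objective applicable across modalities.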
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Main subject: Algorithms / Image Processing, Computer-Assisted / Magnetic Resonance Imaging / Histological Techniques / Optical Imaging / Microscopy
Limits: Humans
Language: En
Publication year: 2015
Document type: Article