IEEE Trans Med Imaging; 43(7): 2599-2609, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38381642

ABSTRACT

Methods for unsupervised domain adaptation (UDA) help to improve the performance of deep neural networks on unseen domains without any labeled data. Especially in medical disciplines such as histopathology, this is crucial since large datasets with detailed annotations are scarce. While the majority of existing UDA methods focus on the adaptation from a labeled source to a single unlabeled target domain, many real-world applications with a long life cycle involve more than one target domain. Thus, the ability to sequentially adapt to multiple target domains becomes essential. In settings where the data from previously seen domains cannot be stored, e.g., due to data protection regulations, the above becomes a challenging continual learning problem. To this end, we propose to use generative feature-driven image replay in conjunction with a dual-purpose discriminator that not only enables the generation of images with realistic features for replay, but also promotes feature alignment during domain adaptation. We evaluate our approach extensively on a sequence of three histopathological datasets for tissue-type classification, achieving state-of-the-art results. We present detailed ablation experiments studying our proposed method components and demonstrate a possible use-case of our continual UDA method for an unsupervised patch-based segmentation task given high-resolution tissue images. Our code is available at: https://github.com/histocartography/multi-scale-feature-alignment.
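The abstract describes the method only at a high level; the sketch below is not the authors' implementation (that is available in the linked repository) but an illustrative PyTorch outline of the general idea of continual UDA with generative replay and a dual-purpose discriminator. All module sizes, the 4-class tissue setup, loss weights, and helper names are assumptions made for the example.

```python
# Illustrative sketch only (assumed PyTorch setup); not the authors' code.
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())

    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):
    """Class-conditional generator producing replay images standing in for
    previously seen domains; assumed already trained (adversarially, against
    the same discriminator) while the earlier domain was still available."""
    def __init__(self, z_dim=64, num_classes=4):
        super().__init__()
        self.embed = nn.Embedding(num_classes, z_dim)
        self.net = nn.Sequential(nn.Linear(2 * z_dim, 3 * 32 * 32), nn.Tanh())

    def forward(self, z, y):
        h = torch.cat([z, self.embed(y)], dim=1)
        return self.net(h).view(-1, 3, 32, 32)

feat = FeatureExtractor()
clf = nn.Linear(32, 4)                      # tissue-type classifier head
gen = Generator()
disc = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))  # dual-purpose

bce = nn.BCEWithLogitsLoss()
ce = nn.CrossEntropyLoss()
opt_task = torch.optim.Adam(list(feat.parameters()) + list(clf.parameters()), lr=1e-4)
opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-4)

def adaptation_step(x_target, batch=8, num_classes=4):
    """One illustrative step of adapting to a new, unlabeled target domain
    without access to data from previously seen domains."""
    z = torch.randn(batch, 64)
    y_replay = torch.randint(0, num_classes, (batch,))
    x_replay = gen(z, y_replay).detach()    # replayed images for old domains

    # Discriminator: separate replayed ("old domain") from target features.
    d_loss = bce(disc(feat(x_replay).detach()), torch.ones(batch, 1)) + \
             bce(disc(feat(x_target).detach()), torch.zeros(batch, 1))
    opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

    # Feature extractor + classifier: fool the discriminator so target features
    # align with replayed features, while replay keeps the classifier from
    # forgetting previously learned classes.
    align_loss = bce(disc(feat(x_target)), torch.ones(batch, 1))
    cls_loss = ce(clf(feat(x_replay)), y_replay)
    opt_task.zero_grad(); (align_loss + cls_loss).backward(); opt_task.step()

adaptation_step(torch.randn(8, 3, 32, 32))
```

In this reading, the same discriminator serves both roles described in the abstract: it provides the adversarial signal that makes generated replay images feature-realistic (during generator training, omitted here) and the alignment signal that pulls new target-domain features toward the replayed feature distribution during adaptation.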


Subjects
Image Processing, Computer-Assisted, Image Processing, Computer-Assisted/methods, Humans, Algorithms, Unsupervised Machine Learning, Deep Learning, Animals, Databases, Factual, Neural Networks, Computer