Cross-domain few-shot learning based on pseudo-Siamese neural network.
Gong, Yuxuan; Yue, Yuqi; Ji, Weidong; Zhou, Guohui.
Affiliation
  • Gong Y; School of Computer Science and Information Engineering, Harbin Normal University, Harbin, 150025, Heilongjiang, People's Republic of China.
  • Yue Y; School of Computer Science and Information Engineering, Harbin Normal University, Harbin, 150025, Heilongjiang, People's Republic of China.
  • Ji W; School of Computer Science and Information Engineering, Harbin Normal University, Harbin, 150025, Heilongjiang, People's Republic of China. kingjwd@126.com.
  • Zhou G; School of Computer Science and Information Engineering, Harbin Normal University, Harbin, 150025, Heilongjiang, People's Republic of China.
Sci Rep; 13(1): 1427, 2023 Jan 25.
Article in English | MEDLINE | ID: mdl-36697442
Cross-domain few-shot learning is one of the research highlights in machine learning. The difficulty lies in the drop in accuracy that a network trained on a single domain suffers when transferred across domains, caused by the distribution differences between domains. To alleviate this problem, and inspired by contour cognition and the human recognition process, we propose a few-shot learning method based on a pseudo-Siamese convolutional neural network. The original image and its sketch map are fed to separate branch networks during both pre-training and meta-learning. While the original image features are preserved, contour features are extracted in a parallel branch and trained simultaneously, improving the accuracy and generalization of learning. We conduct cross-domain few-shot learning experiments and achieve good results using mini-ImageNet as the source domain and EuroSAT and ChestX as the target domains. The results are also analyzed qualitatively using heatmaps to verify the feasibility of our method.
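
As an illustrative reading of the two-branch design described in the abstract (a minimal sketch, not the authors' released code; the Sobel-based sketch extraction, layer sizes, and class count are assumptions), a PyTorch-style pseudo-Siamese network might look like:

    # Illustrative sketch only: a pseudo-Siamese two-branch network in which
    # one branch encodes the original image and the other a contour ("sketch")
    # map. The Sobel edge extractor, 4-block backbone, and class count are
    # hypothetical choices, not taken from the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def sketch_map(images: torch.Tensor) -> torch.Tensor:
        """Approximate a contour/sketch map via Sobel gradient magnitude.
        images: (B, 3, H, W) in [0, 1]; returns (B, 1, H, W)."""
        gray = images.mean(dim=1, keepdim=True)
        kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                          device=images.device).view(1, 1, 3, 3)
        ky = kx.transpose(2, 3)
        gx = F.conv2d(gray, kx, padding=1)
        gy = F.conv2d(gray, ky, padding=1)
        return torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)

    def conv_block(cin: int, cout: int) -> nn.Sequential:
        return nn.Sequential(
            nn.Conv2d(cin, cout, 3, padding=1),
            nn.BatchNorm2d(cout),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )

    class PseudoSiameseFewShot(nn.Module):
        """Two branches share an architecture but NOT weights (hence
        'pseudo'-Siamese); their pooled features are fused for classification."""

        def __init__(self, num_classes: int = 64):
            super().__init__()
            self.image_branch = nn.Sequential(
                conv_block(3, 64), conv_block(64, 64),
                conv_block(64, 64), conv_block(64, 64),
            )
            self.sketch_branch = nn.Sequential(
                conv_block(1, 64), conv_block(64, 64),
                conv_block(64, 64), conv_block(64, 64),
            )
            self.head = nn.Linear(64 * 2, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            f_img = self.image_branch(x).flatten(2).mean(-1)               # (B, 64)
            f_skt = self.sketch_branch(sketch_map(x)).flatten(2).mean(-1)  # (B, 64)
            return self.head(torch.cat([f_img, f_skt], dim=1))

    if __name__ == "__main__":
        model = PseudoSiameseFewShot(num_classes=64)  # e.g. mini-ImageNet base classes
        logits = model(torch.rand(4, 3, 84, 84))      # 84x84 is standard for mini-ImageNet
        print(logits.shape)                           # torch.Size([4, 64])

Because the branches do not share weights, each can specialize in its own modality (appearance vs. contour); concatenating the two feature vectors is one simple way to combine both cues before the classifier, consistent with the abstract's idea of training a separate contour branch alongside the original image features.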

Full text: 1 Database: MEDLINE Language: English Journal: Sci Rep Year: 2023 Document type: Article
