A Triple-Double Convolutional Neural Network for Panchromatic Sharpening.
IEEE Trans Neural Netw Learn Syst; 34(11): 9088-9101, 2023 Nov.
Article in En | MEDLINE | ID: mdl-35263264
ABSTRACT
Pansharpening refers to the fusion of a panchromatic (PAN) image with a high spatial resolution and a multispectral (MS) image with a low spatial resolution, aiming to obtain a high spatial resolution MS (HRMS) image. In this article, we propose a novel deep neural network architecture with a level-domain-based loss function for pansharpening by taking into account the following double-type structures, i.e., double-level, double-branch, and double-direction, called the triple-double network (TDNet). By using the structure of TDNet, the spatial details of the PAN image can be fully exploited and progressively injected into the low spatial resolution MS (LRMS) image, thus yielding the high spatial resolution output. The specific network design is motivated by the physical formula of traditional multi-resolution analysis (MRA) methods. Hence, an effective MRA fusion module is also integrated into the TDNet. In addition, we adopt a few ResNet blocks and multi-scale convolution kernels to deepen and widen the network, effectively enhancing feature extraction and the robustness of the proposed TDNet. Extensive experiments on reduced- and full-resolution datasets acquired by the WorldView-3, QuickBird, and GaoFen-2 sensors demonstrate the superiority of the proposed TDNet over recent state-of-the-art pansharpening approaches. An ablation study further corroborates the effectiveness of the proposed approach. The code is available at https://github.com/liangjiandeng/TDNet.
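
The following PyTorch sketch illustrates the MRA-style detail-injection idea the abstract describes: spatial details extracted from the PAN image are added to the upsampled LRMS image, with ResNet blocks and multi-scale convolution kernels in the detail branch. Module names, channel sizes, and the single-level structure are illustrative assumptions, not the authors' actual TDNet; the official implementation is at https://github.com/liangjiandeng/TDNet.

# Hypothetical sketch of MRA-style detail injection for pansharpening (not the authors' TDNet).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResBlock(nn.Module):
    """Plain residual block used to deepen the detail-extraction branch."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


class MultiScaleConv(nn.Module):
    """Parallel 3x3 / 5x5 / 7x7 kernels to widen the receptive field."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.b3 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch, 5, padding=2)
        self.b7 = nn.Conv2d(in_ch, out_ch, 7, padding=3)

    def forward(self, x):
        return torch.cat([self.b3(x), self.b5(x), self.b7(x)], dim=1)


class MRAFusion(nn.Module):
    """MRA-motivated fusion: HRMS ~= upsampled LRMS + injected PAN details."""
    def __init__(self, ms_bands=8, feat=32):
        super().__init__()
        self.detail = nn.Sequential(
            MultiScaleConv(ms_bands + 1, feat),           # stacked PAN + upsampled MS
            nn.ReLU(inplace=True),
            ResBlock(3 * feat),
            ResBlock(3 * feat),
            nn.Conv2d(3 * feat, ms_bands, 3, padding=1),  # per-band detail map
        )

    def forward(self, lrms, pan):
        # Upsample the LRMS image to the PAN resolution (4x for WorldView-3/QuickBird/GaoFen-2).
        up = F.interpolate(lrms, size=pan.shape[-2:], mode="bicubic", align_corners=False)
        details = self.detail(torch.cat([up, pan], dim=1))
        return up + details  # detail injection, as in classical MRA formulas


if __name__ == "__main__":
    lrms = torch.rand(1, 8, 64, 64)      # low-resolution multispectral patch
    pan = torch.rand(1, 1, 256, 256)     # high-resolution panchromatic patch
    print(MRAFusion()(lrms, pan).shape)  # torch.Size([1, 8, 256, 256])

In the actual TDNet, this kind of fusion is organized into double-level, double-branch, and double-direction structures with a level-domain-based loss, so the sketch above should be read only as a compact illustration of the underlying MRA detail-injection formula.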

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: IEEE Trans Neural Netw Learn Syst Publication year: 2023 Document type: Article