Unpaired Multi-View Graph Clustering With Cross-View Structure Matching.
IEEE Trans Neural Netw Learn Syst; PP, 2023 Aug 02.
Article in En | MEDLINE | ID: mdl-37531311
Multi-view clustering (MVC), which fuses information from multiple views for better performance, has received increasing attention. Most existing MVC methods assume that multi-view data are fully paired, meaning the mappings between all corresponding samples across views are predefined or given in advance. In real-world applications, however, the correspondence is often incomplete due to data corruption or sensor differences, a situation referred to in the multi-view literature as the data-unpaired problem (DUP). Although several attempts have been made to address the DUP, they suffer from the following drawbacks: 1) most methods focus on feature representations while ignoring the structural information of multi-view data, which is essential for clustering; 2) existing methods for partially unpaired data rely on pregiven cross-view alignment information and therefore cannot handle fully unpaired data; and 3) the hyperparameters they inevitably introduce degrade the efficiency and applicability of the models. To tackle these issues, we propose a novel parameter-free graph clustering framework termed unpaired multi-view graph clustering with cross-view structure matching (UPMGC-SM). Unlike existing methods, UPMGC-SM effectively exploits the structural information of each view to refine cross-view correspondences. Moreover, UPMGC-SM is a unified framework for both fully and partially unpaired multi-view graph clustering, and existing graph clustering methods can adopt it to handle unpaired scenarios. Extensive experiments demonstrate the effectiveness and generalization of the proposed framework on both paired and unpaired datasets.
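The abstract's core idea, recovering cross-view correspondences from graph structure rather than raw features, can be illustrated with a minimal toy sketch. The snippet below is an assumption-laden illustration, not the authors' UPMGC-SM algorithm: it embeds each view's samples via the spectrum of a k-NN graph Laplacian and aligns the two views with a Hungarian assignment; the helper names structural_embedding and match_views are hypothetical.

# A minimal, illustrative sketch (not the authors' UPMGC-SM): align two
# unpaired views by matching spectral signatures of their k-NN graphs.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.neighbors import kneighbors_graph

def structural_embedding(X, k=10, dim=8):
    # Build a symmetric k-NN adjacency and embed each sample with the
    # first nontrivial eigenvectors of the unnormalized graph Laplacian.
    A = kneighbors_graph(X, n_neighbors=k, mode="connectivity")
    A = (0.5 * (A + A.T)).toarray()
    L = np.diag(A.sum(axis=1)) - A
    _, vecs = np.linalg.eigh(L)          # eigenvectors, ascending eigenvalues
    return vecs[:, 1:dim + 1]            # drop the constant eigenvector

def match_views(X1, X2, k=10, dim=8):
    # Solve a linear assignment over pairwise distances between structural
    # embeddings; eigenvector sign/ordering ambiguity is ignored here,
    # which is why this is only a toy illustration.
    E1 = structural_embedding(X1, k, dim)
    E2 = structural_embedding(X2, k, dim)
    cost = np.linalg.norm(E1[:, None, :] - E2[None, :, :], axis=-1)
    _, perm = linear_sum_assignment(cost)  # Hungarian algorithm
    return perm  # perm[i]: index in view 2 matched to sample i in view 1

Given two feature matrices X1 and X2 of shapes (n, d1) and (n, d2) whose rows are shuffled relative to each other, perm would serve as a recovered pairing before any downstream graph clustering is applied.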
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Language: En
Journal: IEEE Trans Neural Netw Learn Syst
Publication year: 2023
Document type: Article