Please use this identifier to cite or link to this item: https://repository.cihe.edu.hk/jspui/handle/cihe/4431
Title: Deep attention-guided graph clustering with dual self-supervision
Author(s): Liu, Hui
Peng, Z.
Jia, Y.
Hou, J.
Issue Date: 2023
Publisher: IEEE
Journal: IEEE Transactions on Circuits and Systems for Video Technology 
Volume: 33
Issue: 7
Start page: 3296
End page: 3307
Abstract: 
Existing deep embedding clustering methods fail to sufficiently utilize the available off-the-shelf information from feature embeddings and cluster assignments, limiting their performance. To address this issue, we propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC). Specifically, DAGC first utilizes a heterogeneity-wise fusion module to adaptively integrate the features of the auto-encoder and the graph convolutional network in each layer, and then uses a scale-wise fusion module to dynamically concatenate the multi-scale features from different layers. These modules learn an informative feature embedding via an attention-based mechanism. In addition, we design a distribution-wise fusion module that leverages cluster assignments to acquire clustering results directly. To better exploit the off-the-shelf information from the cluster assignments, we develop a dual self-supervision solution consisting of a soft self-supervision strategy with a Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss. Extensive experiments on nine benchmark datasets validate that our method consistently outperforms state-of-the-art methods. In particular, our method improves the ARI by more than 10.29% over the best baseline.
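The dual self-supervision described in the abstract pairs a soft KL-divergence term on the cluster assignment distribution with a hard pseudo-label term on confident samples. The sketch below (not the authors' code) illustrates this idea in PyTorch under common DEC-style assumptions; the names soft_assignment, target_distribution, and dual_self_supervision_loss, as well as the confidence threshold, are illustrative and not taken from the paper.

```python
# Hypothetical sketch of a dual self-supervision objective: a soft KL term on
# the assignment distribution plus a hard pseudo-label term on confident samples.
import torch
import torch.nn.functional as F


def soft_assignment(z, centers, alpha=1.0):
    """Student's t-distribution soft cluster assignment Q (DEC-style)."""
    dist = torch.cdist(z, centers) ** 2
    q = (1.0 + dist / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)


def target_distribution(q):
    """Sharpened target distribution P derived from Q."""
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)


def dual_self_supervision_loss(q, threshold=0.8):
    """Soft self-supervision (KL divergence) plus hard pseudo-label supervision."""
    p = target_distribution(q).detach()
    soft_loss = F.kl_div(q.log(), p, reduction="batchmean")

    # Hard self-supervision: treat high-confidence assignments as pseudo-labels.
    conf, pseudo = q.max(dim=1)
    mask = conf > threshold
    hard_loss = (
        F.nll_loss(q[mask].log(), pseudo[mask]) if mask.any() else q.new_zeros(())
    )
    return soft_loss + hard_loss
```

In this sketch, q would come from the embedding produced by the fused auto-encoder/GCN features; the actual fusion modules and loss weighting in DAGC are described in the paper itself.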
URI: https://repository.cihe.edu.hk/jspui/handle/cihe/4431
DOI: 10.1109/TCSVT.2022.3232604
CIHE Affiliated Publication: Yes
Appears in Collections: CIS Publication
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.