TY - JOUR
T1 - Graph embedding clustering
T2 - Graph attention auto-encoder with cluster-specificity distribution
AU - Xu, Huiling
AU - Xia, Wei
AU - Gao, Quanxue
AU - Han, Jungong
AU - Gao, Xinbo
N1 - Funding Information:
The authors would like to thank the anonymous reviewers and AE for their constructive comments and suggestions. This work is supported by the National Natural Science Foundation of China under Grant 61773302, the Natural Science Basic Research Plan in Shaanxi Province under Grants 2020JZ-19 and 2020JQ-327, the Initiative Postdocs Supporting Program BX20190262, the Fundamental Research Funds for the Central Universities, China, and the Innovation Fund of Xidian University.
Publisher Copyright:
© 2021 Elsevier Ltd
PY - 2021/10/1
Y1 - 2021/10/1
N2 - To explore the topological structure of data, numerous graph embedding clustering methods have been developed in recent years, yet none of them takes into account the cluster-specificity distribution of the node representations, resulting in suboptimal clustering performance. Moreover, most existing graph embedding clustering methods execute node representation learning and clustering in two separate steps, which increases the instability of their performance. Additionally, few of them simultaneously take node attribute reconstruction and graph structure reconstruction into account, which degrades the capability of graph learning. In this work, we integrate node representation learning and clustering into a unified framework and propose a new deep graph attention auto-encoder for node clustering that attempts to learn more favorable node representations by leveraging a self-attention mechanism and node attribute reconstruction. Meanwhile, a cluster-specificity distribution constraint, measured by the ℓ1,2-norm, is employed to make the node representations within the same cluster converge to a common distribution in the dimension space, while representations of different clusters have different distributions in the intrinsic dimensions. Extensive experimental results reveal that our proposed method is superior to several state-of-the-art methods in terms of clustering performance.
AB - To explore the topological structure of data, numerous graph embedding clustering methods have been developed in recent years, yet none of them takes into account the cluster-specificity distribution of the node representations, resulting in suboptimal clustering performance. Moreover, most existing graph embedding clustering methods execute node representation learning and clustering in two separate steps, which increases the instability of their performance. Additionally, few of them simultaneously take node attribute reconstruction and graph structure reconstruction into account, which degrades the capability of graph learning. In this work, we integrate node representation learning and clustering into a unified framework and propose a new deep graph attention auto-encoder for node clustering that attempts to learn more favorable node representations by leveraging a self-attention mechanism and node attribute reconstruction. Meanwhile, a cluster-specificity distribution constraint, measured by the ℓ1,2-norm, is employed to make the node representations within the same cluster converge to a common distribution in the dimension space, while representations of different clusters have different distributions in the intrinsic dimensions. Extensive experimental results reveal that our proposed method is superior to several state-of-the-art methods in terms of clustering performance.
KW - Cluster-specificity distribution
KW - Graph neural networks
KW - Nodes clustering
KW - Learning
KW - Neural Networks, Computer
KW - Cluster Analysis
UR - http://www.scopus.com/inward/record.url?scp=85106873055&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2021.05.008
DO - 10.1016/j.neunet.2021.05.008
M3 - Article
C2 - 34029998
AN - SCOPUS:85106873055
SN - 0893-6080
VL - 142
SP - 221
EP - 230
JO - Neural Networks
JF - Neural Networks
ER -