Dual-disentangled Deep Multiple Clustering
CoRR (2024)
Abstract
Multiple clustering has gathered significant attention in recent years due to
its potential to reveal multiple hidden structures of the data from different
perspectives. Most multiple clustering methods first derive feature
representations by controlling the dissimilarity among them, and then apply
traditional clustering methods (e.g., k-means) to obtain the final multiple
clustering results. However, the learned feature representations may be only
weakly relevant to the ultimate goal of producing distinct clusterings.
Moreover, these features are often not explicitly learned for the purpose of
clustering.
Therefore, in this paper, we propose a novel Dual-Disentangled deep Multiple
Clustering method named DDMC by learning disentangled representations.
Specifically, DDMC is achieved by a variational Expectation-Maximization (EM)
framework. In the E-step, the disentanglement learning module employs
coarse-grained and fine-grained disentangled representations to obtain a more
diverse set of latent factors from the data. In the M-step, the cluster
assignment module employs a clustering objective to improve the quality of
the cluster assignments. Our extensive experiments demonstrate that
DDMC consistently outperforms state-of-the-art methods across seven commonly
used tasks. Our code is available at https://github.com/Alexander-Yao/DDMC.
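The alternation the abstract describes can be illustrated with a minimal sketch. This is not DDMC itself: the learned disentanglement module is replaced by fixed random linear projections (one per clustering view, a hypothetical stand-in), and the M-step is plain k-means assignment. All function names here are illustrative, not from the paper's code.

```python
import numpy as np

def kmeans_step(z, centroids):
    # M-step stand-in: assign each point to its nearest centroid,
    # then recompute each non-empty centroid as the mean of its points.
    d = np.linalg.norm(z[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    for k in range(centroids.shape[0]):
        if np.any(labels == k):
            centroids[k] = z[labels == k].mean(axis=0)
    return labels, centroids

def multi_clustering_em(x, n_views=2, n_clusters=3, n_iters=10, seed=0):
    """Alternate representation (E-step) and assignment (M-step) per view."""
    rng = np.random.default_rng(seed)
    dim = x.shape[1]
    # Hypothetical stand-in for the disentanglement module: one random
    # projection per view (DDMC instead *learns* disentangled factors).
    projections = [rng.standard_normal((dim, dim)) for _ in range(n_views)]
    results = []
    for W in projections:
        z = x @ W  # E-step stand-in: view-specific representation
        centroids = z[rng.choice(len(z), n_clusters, replace=False)]
        for _ in range(n_iters):
            labels, centroids = kmeans_step(z, centroids)
        results.append(labels)
    return results
```

Each view yields its own partition of the data, which is the essence of multiple clustering; DDMC's contribution is learning the per-view representations jointly with the assignments rather than fixing them up front.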