Transformer-based Dynamic Fusion Clustering Network.

Knowl. Based Syst. (2022)

Abstract
Clustering is an advanced task in machine learning. Numerous studies have improved clustering performance by integrating deep learning into clustering technology. However, current deep clustering research still has some limitations: (1) it lacks a dynamic fusion mechanism that lets multiple deep networks jointly train on node information; (2) data-structure embedding methods are not mature enough, and as the deep network grows deeper, its ability to learn data representations declines, resulting in low performance. In contrast to these clustering methods, we propose the Transformer-based Dynamic Fusion Clustering Network (TDCN), a novel deep clustering network built mainly on the Transformer architecture that addresses these issues and improves clustering performance. Specifically, a new dynamic attention mechanism fuses the features of the Transformer and autoencoder (AE) networks in TDCN. To capture structural information in the data, a new transformation operation G is designed; G varies with the characteristics of the source data, helping to represent the data structure. In addition, TDCN stacks multi-layer, multi-scale heterogeneous networks to learn node representations and further integrates information at different scales through dedicated modules, enabling efficient extraction of information. The whole deep clustering network is trained with a dual self-supervision mechanism. Experiments indicate that our model achieves comparable or even better performance than state-of-the-art methods on five datasets.
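The abstract does not give the exact equations, but its two key ingredients — attention-based fusion of AE and Transformer features, and dual self-supervision — follow a pattern common in deep clustering (e.g., DEC/SDCN-style training). The sketch below is a minimal, hypothetical PyTorch illustration of that pattern; the `DynamicFusion` gate, its layer sizes, and the Student's t soft assignment are assumptions for illustration, not the paper's verified design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicFusion(nn.Module):
    """Hypothetical attention-style fusion of autoencoder (AE) and
    Transformer features, in the spirit of TDCN's dynamic fusion.
    The gating form and dimensions are illustrative assumptions."""

    def __init__(self, dim):
        super().__init__()
        # Learn a per-node, per-feature gate from the two concatenated views.
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h_ae, h_tf):
        # h_ae, h_tf: (n_nodes, dim) features from the AE and the Transformer.
        alpha = torch.sigmoid(self.gate(torch.cat([h_ae, h_tf], dim=-1)))
        # Convex combination whose weights vary dynamically per node.
        return alpha * h_ae + (1.0 - alpha) * h_tf


def soft_assignment(z, centroids, v=1.0):
    """DEC-style Student's t soft cluster assignment Q — a standard
    component of dual self-supervised deep clustering (assumed here;
    the abstract does not specify TDCN's exact formulation)."""
    dist2 = torch.cdist(z, centroids) ** 2  # (n_nodes, n_clusters)
    q = (1.0 + dist2 / v) ** (-(v + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)


def target_distribution(q):
    """Sharpened target P that self-supervises Q via KL(P || Q)."""
    p = q ** 2 / q.sum(dim=0)
    return p / p.sum(dim=1, keepdim=True)
```

In a DEC-style setup, training would minimize `F.kl_div(q.log(), p, reduction='batchmean')` (i.e., KL(P‖Q)) alongside the AE reconstruction loss; whether TDCN's dual self-supervision takes exactly this form cannot be determined from the abstract alone.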
Key words
Deep clustering, Dynamic attention mechanism, Transformer network, Self-supervised learning, Feature fusion