Self-paced non-convex regularized analysis–synthesis dictionary learning for unsupervised feature selection

Knowledge-Based Systems (2022)

Abstract
Owing to its ability to prevent over-fitting, reduce computational complexity and storage costs, and enhance interpretability, unsupervised feature selection (UFS) has received widespread attention across a large number of application scenarios. However, a majority of existing UFS methods cannot make full use of the diversity between data points, which usually leads to suboptimal solutions. In this paper, a novel UFS approach, referred to as self-paced analysis–synthesis dictionary learning (SPASDL), is proposed by integrating dictionary learning and self-paced learning into a joint framework. Specifically, we reconstruct and code the data with the synthesis dictionary and the analysis dictionary, respectively. This strategy avoids relying on pseudo labels, which may mislead the feature selection process. Further, we introduce self-paced learning into dictionary learning so that data points are reconstructed from easy to hard during training. To identify more discriminative features, we impose a non-convex sparse constraint on the synthesis dictionary. To optimize the resulting model, we develop an efficient iterative optimization algorithm based on an alternative search strategy (ASS). In addition, theoretical properties of the optimization algorithm, including convergence analysis and computational complexity, are investigated. Finally, experimental results on publicly available real-world datasets demonstrate the effectiveness and superiority of the proposed method.
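For orientation only, the objective described above can be sketched in a generic form; the notation here (data points x_i, synthesis dictionary D, analysis dictionary P, self-paced weights v_i, age parameter k, trade-off parameter λ, and a non-convex row-sparsity regularizer R) is an illustrative assumption and need not match the paper's exact formulation:

\[
\min_{D,\;P,\;\mathbf{v}\in[0,1]^{n}} \;\; \sum_{i=1}^{n} v_i \,\bigl\lVert x_i - D P x_i \bigr\rVert_2^{2} \;+\; \lambda\, R(D) \;+\; f(\mathbf{v};\, k)
\]

In this sketch, P x_i acts as the analysis code of sample x_i, D reconstructs the sample from that code, R(D) stands for the non-convex sparse constraint that highlights discriminative features, and f(v; k) is a self-paced regularizer that gradually admits harder samples as the age parameter k grows. An alternative search strategy would then alternate updates of D, P, and v until convergence.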
Keywords
Unsupervised feature selection,Self-paced learning,Non-convex regularization,Alternative search strategy