Self-paced and Bayes-decision-rule linear KNN prediction

International Journal of Machine Learning and Cybernetics (2022)

Abstract
A testing sample can first be encoded linearly with labeled samples and then classified with KNN on the per-class sums of the obtained weights, thereby avoiding the consistent-distribution assumption between training and testing samples that most existing classification methods rely on, explicitly or implicitly. Building on this idea, this study proposes a novel self-paced and Bayes-decision-rule linear KNN prediction method, SBLD-KNN, with three goals: (1) class-aware information is explicitly reflected in a grouping-effect regularization term, so that the linear encoder remains sparse while the weights within each class exhibit a grouping effect; (2) the resultant predictor behaves like the Bayes decision rule for minimum error; (3) a self-paced regularization term is designed to adaptively truncate the weights of labeled samples and thereby enhance generalization. To this end, the objective function of SBLD-KNN is formulated and optimized with an alternating optimization strategy, and its Bayes-decision-rule behavior is analyzed theoretically. Experimental results on benchmark datasets demonstrate the effectiveness of SBLD-KNN compared with competing methods, including its simplified version BD-KNN, which uses weight truncation rather than self-paced learning.
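To make the prediction rule concrete, the following is a minimal sketch of the generic linear-reconstruction KNN prediction described above: the test sample is encoded as a (ridge-regularized) linear combination of the labeled samples, the weights are summed per class, and the class with the largest sum is returned. The ridge encoding, the parameter lam, and the function name linear_knn_predict are illustrative assumptions; the sketch omits the grouping-effect and self-paced terms of the actual SBLD-KNN objective.

```python
import numpy as np

def linear_knn_predict(X_train, y_train, x_test, lam=0.1):
    """Sketch of linear-reconstruction KNN prediction (not the full SBLD-KNN).

    The test sample is encoded as a ridge-regularized linear combination of
    the labeled samples; the per-class sums of the obtained weights act as
    class scores, and the class with the largest sum is predicted.
    """
    # Encoding weights w solve (X X^T + lam I) w = X x_test,
    # i.e. minimize ||X^T w - x_test||^2 + lam ||w||^2 over w.
    G = X_train @ X_train.T + lam * np.eye(X_train.shape[0])
    w = np.linalg.solve(G, X_train @ x_test)

    # Sum the weights of the samples in each class and pick the maximum.
    classes = np.unique(y_train)
    scores = np.array([w[y_train == c].sum() for c in classes])
    return classes[np.argmax(scores)]

# Toy usage on two well-separated classes (hypothetical data).
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(linear_knn_predict(X, y, np.array([0.95, 1.05])))  # expected: 1
```

In this sketch the per-class weight sum plays the role of a class score, which is the quantity the paper relates to the Bayes decision rule; the grouping-effect and self-paced regularizers would additionally shape and truncate the weights w before the summation.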
Keywords
KNN, Linear reconstruction, Class-aware information, Grouping effect, Self-paced learning, Bayes-decision-rule