Cost-effective Batch-mode Multi-label Active Learning.

Xiaoqiang Gui, Xudong Lu, Guoxian Yu

Neurocomputing (2021)

Abstract
Active learning aims to select the most valuable unlabeled instances for annotation, and thus to maximize the performance of a learner trained on the selected instances. Batch-mode active learning methods select a batch of informative, low-redundancy unlabeled instances at each iteration; they are therefore more efficient than myopic active learning methods, which typically retrain the model after querying each single instance. However, batch-mode multi-label active learning is not yet well explored, since it is difficult to select a batch of instances or instance-label pairs with high informativeness but low redundancy for query and budget saving. In this paper, we propose a novel approach, CBMAL (Cost-effective Batch-mode Multi-label Active Learning), to address this challenge. CBMAL first selects a batch of informative instance-label pairs using uncertainty, label correlation, and label-space sparsity. Next, CBMAL leverages information from both the feature and label dimensions to pick out, from this first batch, a small batch of instance-label pairs with the highest informativeness and lowest redundancy. These instance-label pairs are then queried and used to update the learner for the next iteration. Experimental results on six benchmark datasets demonstrate that CBMAL reduces both query and time costs while achieving better performance than state-of-the-art methods.
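The two-stage selection described in the abstract can be sketched in simplified form. The snippet below is a hypothetical illustration, not the paper's exact CBMAL algorithm: it stands in for stage one with pairwise prediction uncertainty alone (omitting the label-correlation and label-sparsity terms), and for stage two with a greedy feature-space redundancy filter. All function and parameter names are assumptions for illustration.

```python
import numpy as np

def select_batch(probs, X, k_candidates=20, k_final=5):
    """Hypothetical two-stage batch selection of instance-label pairs.

    probs: (n_instances, n_labels) predicted label probabilities
    X:     (n_instances, n_features) feature matrix
    Returns a list of (instance_index, label_index) pairs to query.
    """
    n, L = probs.shape

    # Stage 1: score each instance-label pair by uncertainty
    # (probability closest to 0.5 -> score closest to 1.0).
    uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)
    flat = uncertainty.ravel()
    candidates = np.argsort(flat)[::-1][:k_candidates]  # most uncertain pairs

    # Stage 2: greedily keep pairs whose instances are not
    # near-duplicates (cosine similarity) of already-chosen ones.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    Xn = X / np.clip(norms, 1e-12, None)
    chosen = []
    for idx in candidates:
        i = idx // L  # instance index of this pair
        if all(Xn[i] @ Xn[j // L] < 0.95 for j in chosen):
            chosen.append(idx)
        if len(chosen) == k_final:
            break
    return [(idx // L, idx % L) for idx in chosen]
```

After each query round, the learner would be retrained on the newly labeled pairs and `probs` re-estimated, so the per-pair uncertainty scores are refreshed before the next batch is selected.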
Keywords
Multi-label learning, Batch-mode active learning, Information and redundancy, Label correlations, Label sparsity