An Interactive Method to Improve Crowdsourced Annotations.

IEEE Transactions on Visualization and Computer Graphics (2019)

Cited by 62 | Views 163
Abstract
To effectively infer correct labels from noisy crowdsourced annotations, learning-from-crowds models have introduced expert validation. However, little research has been done on facilitating the validation procedure. In this paper, we propose an interactive method to assist experts in verifying uncertain instance labels and unreliable workers. Given the instance labels and worker reliability ...
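The abstract refers to inferring labels from noisy crowd annotations while estimating worker reliability. The sketch below is not the paper's model; it is a minimal one-coin, EM-style aggregator (alternating between label posteriors and per-worker accuracy) included only to illustrate the general idea. All names (infer_labels, annotations, reliability) are illustrative assumptions.

```python
# Minimal sketch: aggregate noisy binary crowd labels while estimating
# worker reliability (one-coin model). Not the method from this paper.
import numpy as np

def infer_labels(annotations, n_iters=20):
    """annotations: dict mapping (instance_id, worker_id) -> label in {0, 1}."""
    instances = sorted({i for i, _ in annotations})
    workers = sorted({w for _, w in annotations})

    # Initialize label probabilities with per-instance majority vote.
    probs = {i: np.mean([l for (ii, _), l in annotations.items() if ii == i])
             for i in instances}
    reliability = {w: 0.8 for w in workers}  # initial guess

    for _ in range(n_iters):
        # M-step: reliability = expected agreement with current soft labels.
        for w in workers:
            pairs = [(probs[i], l) for (i, ww), l in annotations.items() if ww == w]
            agree = [p if l == 1 else 1 - p for p, l in pairs]
            reliability[w] = float(np.clip(np.mean(agree), 1e-3, 1 - 1e-3))
        # E-step: posterior of label 1 from weighted worker votes.
        for i in instances:
            log_odds = 0.0
            for (ii, w), l in annotations.items():
                if ii != i:
                    continue
                r = reliability[w]
                log_odds += np.log(r / (1 - r)) * (1 if l == 1 else -1)
            probs[i] = 1.0 / (1.0 + np.exp(-log_odds))

    labels = {i: int(probs[i] >= 0.5) for i in instances}
    return labels, probs, reliability

if __name__ == "__main__":
    # Three workers label two instances; worker "w3" disagrees with the others.
    ann = {(0, "w1"): 1, (0, "w2"): 1, (0, "w3"): 0,
           (1, "w1"): 0, (1, "w2"): 0, (1, "w3"): 1}
    labels, probs, rel = infer_labels(ann)
    print(labels, rel)
```

Instances with label posteriors near 0.5 and workers with low estimated reliability are natural candidates for the kind of expert validation the paper aims to support.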
Keywords
Data visualization, Labeling, Data models, Task analysis, Visual analytics, Reliability