Feature selection via dependence maximization

Journal of Machine Learning Research (2012)

Abstract
We introduce a framework for feature selection based on dependence maximization between the selected features and the labels of an estimation problem, using the Hilbert-Schmidt Independence Criterion. The key idea is that good features should be highly dependent on the labels. Our approach leads to a greedy procedure for feature selection. We show that a number of existing feature selectors are special cases of this framework. Experiments on both artificial and real-world data show that our feature selector works well in practice.
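Below is a minimal sketch, not the paper's reference implementation, of the idea described in the abstract: a greedy procedure that selects features by maximizing a biased empirical HSIC estimate between the selected features and the labels. The forward-selection direction, the Gaussian kernel on features, the linear kernel on labels, the bandwidth, and all function names are illustrative assumptions; the paper's procedure may differ in such details.

```python
# Sketch: greedy HSIC-based feature selection (illustrative assumptions, see note above).
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X (assumed kernel choice)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(K, L):
    """Biased empirical HSIC: (m-1)^{-2} * tr(K H L H), with H the centering matrix."""
    m = K.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

def greedy_hsic_selection(X, y, n_features):
    """Forward selection: at each step add the feature that maximizes HSIC
    between the currently selected features and the labels."""
    m, d = X.shape
    Y = y.reshape(m, -1).astype(float)
    L = Y @ Y.T                      # linear kernel on labels (assumed choice)
    selected, remaining = [], list(range(d))
    for _ in range(n_features):
        best_j, best_score = None, -np.inf
        for j in remaining:
            K = rbf_kernel(X[:, selected + [j]])
            score = hsic(K, L)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Usage on synthetic data: only the first two features carry signal.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
    print(greedy_hsic_selection(X, y, n_features=3))
```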
Keywords
real-world data, feature selection, good feature, Hilbert-Schmidt Independence Criterion, estimation problem, greedy procedure, key idea, dependence maximization, feature selector, selected feature, kernel methods