Discretization and Feature Selection Based on Bias Corrected Mutual Information Considering High-Order Dependencies

PAKDD (1) 2020

Abstract
Mutual Information (MI) based feature selection methods are popular due to their ability to capture nonlinear relationships among variables. However, existing works rarely address the error (bias) that arises from estimating MI with finite samples. To the best of our knowledge, no existing method addresses the bias of the high-order interaction term, which is essential for a better approximation of joint MI. In this paper, we first derive the amount of bias of this term. Furthermore, to select features using a \(\chi^2\) based search, we show that this term follows a \(\chi^2\) distribution. Based on these two theoretical results, we propose Discretization and feature Selection based on bias corrected Mutual information (DSbM). We also extend DSbM with simultaneous forward selection and backward elimination (DSbM\(_\mathrm{fb}\)). We demonstrate the superiority of DSbM over four state-of-the-art methods in terms of accuracy and the number of selected features on twenty benchmark datasets. Experimental results further show that DSbM outperforms the existing methods in terms of accuracy, Pareto optimality, and the Friedman test. We also observe that, compared to DSbM, DSbM\(_\mathrm{fb}\) selects fewer features and achieves higher accuracy on some datasets.
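To make the general idea concrete, the sketch below shows bias-corrected MI feature selection in Python. It is an illustration only, not the paper's DSbM estimator: instead of the paper's correction for the high-order interaction term, it uses the classical Miller-Madow entropy correction, and the function names (entropy_mm, mi_mm, forward_select) are hypothetical.

```python
import numpy as np

def entropy_mm(labels):
    """Plug-in entropy with the Miller-Madow bias correction."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    # Miller-Madow: add (K - 1) / (2N) for K occupied bins, N samples.
    # This is a stand-in for the paper's correction of the
    # high-order interaction term, which is not reproduced here.
    return h + (len(counts) - 1) / (2 * counts.sum())

def mi_mm(x, y):
    """Bias-corrected MI I(X;Y) between two discrete variables."""
    # Encode each (x, y) pair as a single joint outcome label.
    joint = np.unique(np.stack([x, y], axis=1),
                      axis=0, return_inverse=True)[1]
    return entropy_mm(x) + entropy_mm(y) - entropy_mm(joint)

def forward_select(X, y, k):
    """Greedy forward selection: repeatedly add the feature whose
    joint MI with the class, together with the features already
    selected, is largest."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        def joint_mi(j):
            cols = X[:, selected + [j]]
            joint = np.unique(cols, axis=0, return_inverse=True)[1]
            return mi_mm(joint, y)
        best = max(remaining, key=joint_mi)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Estimating MI on the full joint outcome space, as forward_select does, is exactly where finite-sample bias grows with the number of selected features, which is the motivation for correcting the high-order term rather than only pairwise quantities.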
Keywords
feature selection, bias corrected mutual information, high-order