
Estimating Feature-Label Dependence Using Gini Distance Statistics

Silu Zhang, Xin Dang, Dao Nguyen, Dawn Wilkins, Yixin Chen

IEEE Transactions on Pattern Analysis and Machine Intelligence (2021)

Abstract
Identifying statistical dependence between the features and the label is a fundamental problem in supervised learning. This paper presents a framework for estimating dependence between numerical features and a categorical label using the generalized Gini distance, an energy distance in reproducing kernel Hilbert spaces (RKHS). Two Gini distance based dependence measures are explored: Gini distance covariance and Gini distance correlation. Unlike Pearson covariance and correlation, which do not characterize independence, these Gini distance based measures characterize both dependence and independence of random variables. The test statistics are simple to calculate and do not require probability density estimation. Uniform convergence bounds and asymptotic bounds are derived for the test statistics, and comparisons with distance covariance statistics are provided. It is shown that Gini distance statistics converge faster than distance covariance statistics in the uniform convergence bounds, and hence yield tighter upper bounds on both Type I and Type II errors. Moreover, the probability of the Gini distance covariance statistic under-performing the distance covariance statistic in Type II error decreases to 0 exponentially as the sample size increases. Extensive experimental results are presented to demonstrate the performance of the proposed method.
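The abstract emphasizes that the test statistics are simple to compute and require no density estimation. The sketch below illustrates this for the plain Euclidean special case, using the standard empirical forms of Gini distance covariance and correlation (marginal Gini mean difference minus the class-probability-weighted within-class Gini mean differences, normalized by the marginal term). This is a minimal sketch under those assumptions: the function and variable names are mine, and it does not implement the paper's RKHS-kernelized generalization.

```python
import numpy as np

def gini_distance_statistics(X, y):
    """Empirical Gini distance covariance and correlation between a
    numerical feature matrix X (n x d) and a categorical label vector y.

    Euclidean (non-kernel) form, assumed here for illustration:
        Delta   = E||X - X'||           (marginal Gini mean difference)
        Delta_k = E||X_k - X_k'||       (within-class Gini mean difference)
        gCov(X, y) = Delta - sum_k p_k * Delta_k
        gCor(X, y) = gCov(X, y) / Delta
    """
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]
    y = np.asarray(y)
    n = X.shape[0]

    def mean_pairwise_distance(Z):
        # U-statistic estimate of E||Z - Z'|| over distinct pairs.
        m = Z.shape[0]
        if m < 2:
            return 0.0
        diffs = Z[:, None, :] - Z[None, :, :]
        dists = np.sqrt((diffs ** 2).sum(axis=-1))
        return dists.sum() / (m * (m - 1))   # diagonal terms are zero

    delta = mean_pairwise_distance(X)
    within = 0.0
    for label in np.unique(y):
        Xk = X[y == label]
        pk = Xk.shape[0] / n                 # empirical class probability
        within += pk * mean_pairwise_distance(Xk)

    gcov = delta - within
    gcor = gcov / delta if delta > 0 else 0.0
    return gcov, gcor

# Hypothetical usage: a feature whose mean shifts with the class label
# should yield positive gCov and gCor.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = rng.normal(loc=y, scale=1.0, size=200)
print(gini_distance_statistics(X, y))
```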
Key words
Energy distance, feature selection, Gini distance covariance, Gini distance correlation, distance covariance, reproducing kernel Hilbert space, dependence test, supervised learning