Scale Invariant Conditional Dependence Measures.
ICML'13: Proceedings of the 30th International Conference on Machine Learning - Volume 28 (2013)
Abstract
In this paper we develop new dependence and conditional dependence measures and provide their estimators. An attractive property of these measures and estimators is that they are invariant to any monotone increasing transformations of the random variables, which is important in many applications including feature selection. Under certain conditions we show the consistency of these estimators, derive upper bounds on their convergence rates, and show that the estimators do not suffer from the curse of dimensionality. However, when the conditions are less restrictive, we derive a lower bound which proves that in the worst case the convergence can be arbitrarily slow similarly to some other estimators. Numerical illustrations demonstrate the applicability of our method.
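The invariance property described above can be illustrated with a much simpler rank-based statistic than the measures developed in the paper. The sketch below uses Spearman's rank correlation (not the paper's estimator, just a familiar example of the same principle): because it depends on the data only through ranks, applying any monotone increasing transformation to either variable leaves it unchanged.

```python
import numpy as np

def ranks(x):
    # Map each sample to its rank (0-based) within the array.
    return np.argsort(np.argsort(x))

def spearman_rho(x, y):
    # Pearson correlation computed on ranks: a scale-invariant
    # dependence measure, since ranks ignore monotone rescalings.
    rx = ranks(x).astype(float)
    ry = ranks(y).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + 0.5 * rng.normal(size=500)

r_raw = spearman_rho(x, y)
# exp(.) and cubing are monotone increasing, so ranks (and hence
# the statistic) are unchanged.
r_transformed = spearman_rho(np.exp(x), y**3)
assert abs(r_raw - r_transformed) < 1e-12
```

The paper's measures share this rank/copula-style invariance but additionally handle conditional dependence and multivariate variables, which Spearman's rho does not.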