k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions.

Entropy (2011)

Abstract
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with the moment based counterpart via simulations. The results show that these two methods are comparable.
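The k-nearest neighbor approach referenced in the abstract builds on the classical Kozachenko-Leonenko estimator, which estimates differential entropy from the distance of each sample to its k-th nearest neighbor. A minimal sketch of the generic Euclidean version is shown below for illustration; the paper itself adapts this idea to hyperspherical (directional) data, and the function name `knn_entropy` and the parameter choices here are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.special import digamma, gamma

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate (in nats).

    x : (n, d) array of i.i.d. samples.
    Generic Euclidean sketch only; the paper's estimator works with
    distances on the hypersphere instead.
    """
    n, d = x.shape
    # Pairwise Euclidean distances between all samples.
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)           # exclude self-distance
    eps = np.sort(dist, axis=1)[:, k - 1]    # distance to k-th nearest neighbor
    # log-volume of the unit ball in R^d: pi^(d/2) / Gamma(d/2 + 1)
    log_cd = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))
```

For example, samples drawn uniformly from the unit square have true differential entropy 0 nats, and the estimate converges to that value as the sample size grows, which is the consistency property the paper proves for its hyperspherical analogue.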
Keywords
hyperspherical distribution,directional data,differential entropy,cross entropy,Kullback-Leibler divergence,k-nearest neighbor