Properties of a Generalized Divergence Related to Tsallis Generalized Divergence

IEEE Transactions on Information Theory (2020)

Abstract
In this paper, we investigate the partition inequality, joint convexity, and Pinsker's inequality for a divergence that generalizes the Tsallis relative entropy and the Kullback–Leibler divergence. The generalized divergence is defined in terms of a deformed exponential function, which replaces the Tsallis $q$-exponential. We also construct a family of probability distributions related to the generalized divergence. We find necessary and sufficient conditions for the partition inequality to be satisfied, and we establish a sufficient condition for joint convexity. We prove that the generalized divergence satisfies the partition inequality and is jointly convex if, and only if, it coincides with the Tsallis relative entropy. As an application of the partition inequality, we obtain a criterion for Pinsker's inequality.
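For context, a minimal sketch of the standard Tsallis quantities that the abstract refers to, in one common convention; the paper's generalized divergence replaces the $q$-exponential with a more general deformed exponential, and its exact definition and notation are not reproduced here.

% Standard Tsallis q-exponential and q-logarithm (one common convention,
% assumed here for illustration; the paper may use different notation).
\[
  \exp_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
  \qquad
  \ln_q(x) = \frac{x^{1-q} - 1}{1-q}, \quad x > 0,\ q \neq 1 .
\]
% Tsallis relative entropy in the same convention; the Kullback-Leibler
% divergence is recovered in the limit q -> 1.
\[
  D_q(p \,\|\, r) = \sum_{i} p_i \, \ln_q\!\left(\frac{p_i}{r_i}\right),
  \qquad
  \lim_{q \to 1} D_q(p \,\|\, r) = \sum_{i} p_i \ln\frac{p_i}{r_i} .
\]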
Keywords
Probability distribution, Entropy, Information theory, Indexes, Econometrics, Optimization, Statistical learning