Lower Bound for Derivatives of Costa's Differential Entropy

2021 IEEE International Symposium on Information Theory (ISIT)

Abstract
Let $H(X_{t})$ be the differential entropy of an $n$-dimensional random vector $X_{t}$ introduced by Costa. Cheng and Geng conjectured that $C_{1}(m, n)$: $(-1)^{m+1}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{t})\geq 0$. McKean conjectured that $C_{2}(m, n)$: $(-1)^{m+1}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{t})\geq (-1)^{m+1}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{Gt})$. McKean's conjecture was previously considered only in the univariate case: $C_{2}(1,1)$ and $C_{2}(2,1)$ were proved by McKean, and $C_{2}(i, 1)$, $i=3,4,5$, were proved by Zhang, Anantharam, and Geng under the log-concave condition. In this paper, we prove $C_{2}(1, n)$ and $C_{2}(2, n)$, and observe that McKean's conjecture might not be true for $n > 1$ and $m > 2$. We further propose a weaker conjecture $C_{3}(m, n)$: $(-1)^{m+1}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{t}) \geq (-1)^{m+1}\frac{1}{n}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{Gt})$, and prove $C_{3}(3,2)$, $C_{3}(3,3)$, and $C_{3}(3,4)$ under the log-concave condition. A systematic procedure for proving $C_{l}(m, n)$ is proposed, and the results mentioned above are proved using this procedure.
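To see why the right-hand side of McKean's conjecture is a meaningful lower bound, one can check the sign pattern of the derivatives in the Gaussian baseline case. The sketch below assumes (this specific form is not stated in the abstract) that $X_{Gt} \sim N(0, tI_{n})$, so that $H(X_{Gt}) = \frac{n}{2}\log(2\pi e t)$, and verifies symbolically that $(-1)^{m+1}(\mathrm{d}^{m}/\mathrm{d}t^{m})H(X_{Gt}) = \frac{n}{2}\,(m-1)!/t^{m} > 0$ for $t > 0$:

```python
# Sketch: sign pattern of derivatives of differential entropy in the
# Gaussian baseline case.  Assumption (not given in the abstract):
# X_{Gt} ~ N(0, t*I_n), so H(X_{Gt}) = (n/2) * log(2*pi*e*t).
import sympy as sp

t, n = sp.symbols('t n', positive=True)
H_G = (n / 2) * sp.log(2 * sp.pi * sp.E * t)

for m in range(1, 6):
    # (-1)^{m+1} * d^m/dt^m H(X_{Gt})
    signed = sp.simplify((-1) ** (m + 1) * sp.diff(H_G, t, m))
    # Each value simplifies to (n/2)*(m-1)!/t**m, strictly positive for t > 0
    print(m, signed)
```

Since each signed derivative is positive, the conjecture $C_{2}(m,n)$ strengthens $C_{1}(m,n)$, and $C_{3}(m,n)$ (with the extra factor $1/n$) sits strictly between the two.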
Key words
n-dimensional random vector X, univariate case, log-concave condition, Costa differential entropy