A Framework for Information-Theoretic Converses.

ISIT 2023

Abstract
A new approach to information-theoretic converses is proposed based on Shannon's original sphere-packing argument. Typical-sequence arguments are hardened with decoding sets to accommodate structured codewords. For the point-to-point discrete memoryless channel, each decoding set is shown to contain at least $2^{nH(Y|X)}$ typical $y$-sequences whenever the probability of decoding error vanishes. Since a codebook of type $p(x)$ generates at most $2^{nH(Y)}$ typical $y$-sequences, the error probability is bounded away from zero when $R > \max_{p(x)} I(X;Y)$. Kolmogorov's zero-one law is then applied to show that the error probability in fact tends to one, unifying the weak and strong converses. In preparation for the capacity of the relay channel, i.i.d. codebooks are shown, via the zero-one law and a sphere-absorption argument, to exhibit a clustering property: their orbits in the $y$-space asymptotically either coincide or separate into clusters of indistinguishable codewords. The capacity of the relay channel is shown to be $\max_{p(x_s, x_r)} \min\left\{ I(X_s, X_r; Y_d),\; I(X_s; Y_r Y_d \mid X_r) - \delta \right\}$, where $\delta := \min\left\{ \left| I(\hat{Y}_r; Y_r \mid X_r Y_d) - C_0 \right|^+,\; I(X_s; Y_d \mid X_r Y_r) \right\}$, $C_0 := I(X_r; Y_d)$, and $\hat{Y}_r$ emulates $X_s$ in a virtual source-relay channel.
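The counting argument behind the converse can be illustrated numerically: a rate-$R$ codebook needs $2^{nR}$ decoding sets of volume at least $2^{nH(Y|X)}$ inside a space of at most $2^{nH(Y)}$ typical $y$-sequences, so reliable decoding forces $R \le \max_{p(x)} H(Y) - H(Y|X) = \max_{p(x)} I(X;Y)$. The following minimal sketch evaluates this bound for a binary symmetric channel (a choice made here for illustration; the BSC, its crossover probability, and the function names are not taken from the paper), sweeping input distributions to locate the maximizing $p(x)$:

```python
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(px1, eps):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover eps and P(X=1) = px1."""
    py1 = px1 * (1 - eps) + (1 - px1) * eps  # output distribution P(Y=1)
    return h2(py1) - h2(eps)                 # H(Y|X) = H(eps) for every input

eps = 0.11
# Sweep p(x) on a grid: the sphere-packing bound is max over p(x) of I(X;Y).
best = max(mutual_information_bsc(p / 1000, eps) for p in range(1001))
# The maximum is attained at the uniform input, giving capacity 1 - H(eps);
# any rate R above this value makes the 2^{nR} decoding spheres overflow
# the 2^{nH(Y)} typical output sequences.
print(best, 1 - h2(eps))
```

The grid includes $p = 0.5$, where $H(Y) = 1$, so the sweep recovers the closed-form value $1 - H(\varepsilon)$ exactly.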
Keywords
clustering property, vanishing decoding error probability, decoding set, error probability, i.i.d. codebooks, indistinguishable codewords, information-theoretic converses, Kolmogorov's zero-one law, point-to-point discrete memoryless channel, Shannon's original sphere-packing argument, sphere-absorption argument, structured codewords, typical-sequence arguments, typical y-sequences, virtual source-relay channel