Staggered Quantizers for Perfect Perceptual Quality: A Connection between Quantizers with Common Randomness and Without
arXiv (2024)
Abstract
The rate-distortion-perception (RDP) framework has attracted significant
recent attention due to its application in neural compression. It is important
to understand the underlying mechanism connecting procedures with common
randomness and those without. In contrast to previous efforts, we study this
problem from a quantizer design perspective. By analyzing an idealized setting,
we provide an interpretation of the advantage of dithered quantization in the
RDP setting, which further allows us to make a conceptual connection between
randomized (dithered) quantizers and quantizers without common randomness. This
new understanding leads to a new procedure for RDP coding based on staggered
quantizers.
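As background for the dithered quantization the abstract refers to, the following is a minimal sketch of a subtractive dithered scalar quantizer, where encoder and decoder share a common uniform dither. The step size `delta` and the Gaussian source are illustrative assumptions; this is not the paper's staggered construction.

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.5  # quantizer step size (illustrative choice)

def encode(x, u):
    # uniform scalar quantizer applied to the dithered input
    return np.round((x + u) / delta)

def decode(k, u):
    # subtract the shared dither from the reconstruction
    return k * delta - u

x = rng.normal(size=10_000)                               # source samples
u = rng.uniform(-delta / 2, delta / 2, size=x.shape)      # common randomness
x_hat = decode(encode(x, u), u)
err = x_hat - x
# The reconstruction error is uniform on (-delta/2, delta/2) and
# statistically independent of the source -- the property that makes
# dithered quantization attractive in the RDP setting.
```

The key effect is that the error behaves like additive noise independent of the input, which is what common randomness buys over a deterministic quantizer.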