Nonlinear Transform Coding

IEEE Journal of Selected Topics in Signal Processing (2021)

Abstract
We review a class of methods that can be collected under the name nonlinear transform coding (NTC), which over the past few years have become competitive with the best linear transform codecs for images, and have superseded them in terms of rate-distortion performance under established perceptual quality metrics such as MS-SSIM. We assess the empirical rate-distortion performance of NTC with the help of simple example sources, for which the optimal performance of a vector quantizer is easier to estimate than with natural data sources. To this end, we introduce a novel variant of entropy-constrained vector quantization. We provide an analysis of various forms of stochastic optimization techniques for NTC models; review architectures of transforms based on artificial neural networks, as well as learned entropy models; and provide a direct comparison of a number of methods to parameterize the rate-distortion trade-off of nonlinear transforms, introducing a simplified one.
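For context, the rate-distortion trade-off mentioned above is usually parameterized as a Lagrangian objective. The following is a sketch of the standard NTC training loss, with notation assumed here rather than taken from the abstract: g_a and g_s denote the analysis and synthesis transforms with parameters θ and φ, p_ŷ a learned entropy model, ⌊·⌉ rounding (quantization), d a distortion measure, and λ the trade-off weight.

\[
L(\theta,\phi) \;=\; \mathbb{E}_{x}\Big[\, -\log_2 p_{\hat y}\!\big(\lfloor g_a(x;\theta)\rceil\big) \;+\; \lambda\, d\big(x,\; g_s(\lfloor g_a(x;\theta)\rceil;\phi)\big)\Big]
\]

The first term estimates the rate under the entropy model, the second the reconstruction distortion; sweeping λ traces out the model's rate-distortion curve.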
Keywords
Artificial neural networks, data compression, machine learning, rate-distortion, source coding, transform coding, unsupervised learning