HELMINGER ET AL.: KNOWLEDGE DISTILLATION FOR GAN BASED COMPRESSION

Microdosing: Knowledge Distillation for GAN-based Compression

semanticscholar (2021)

Abstract
Recently, significant progress has been made in learned image and video compression. In particular, the use of Generative Adversarial Networks has led to impressive results in the low-bit-rate regime. However, model size remains an important issue in current state-of-the-art proposals, and existing solutions require significant computational effort on the decoding side. This limits their usage in realistic scenarios and their extension to video compression. In this paper, we demonstrate how to leverage knowledge distillation to obtain equally capable image decoders at a fraction of the original number of parameters. We investigate several aspects of our solution, including sequence specialization with side information for image coding. Finally, we also show how to transfer the obtained benefits to the setting of video compression. Altogether, our proposal reduces the decoder model size by a factor of 20 and cuts decoding time by 50%.
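The core idea of knowledge distillation referenced in the abstract can be illustrated with a toy sketch: a small "student" model is trained purely to reproduce the outputs of a larger "teacher" model, so capacity is traded for fidelity to the teacher rather than to ground-truth data. Everything below is illustrative and assumed, not taken from the paper: the teacher is a random two-layer network standing in for a GAN decoder, the student is a single linear layer, and the sizes are chosen only to make the parameter-count gap visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a large GAN decoder: a random two-layer
# network. Names and sizes are illustrative, not from the paper.
latent_dim, out_dim, hidden = 8, 16, 64
W1 = rng.normal(size=(hidden, latent_dim))
W2 = rng.normal(size=(out_dim, hidden)) / np.sqrt(hidden)

def teacher(z):
    """'Teacher' decoder: maps a latent code to a reconstruction."""
    return W2 @ np.tanh(W1 @ z)

# 'Student' decoder with 12x fewer parameters: a single linear layer
# trained only to mimic the teacher's outputs (knowledge distillation).
Ws = np.zeros((out_dim, latent_dim))

lr = 0.02
for _ in range(3000):
    z = rng.normal(size=latent_dim)
    err = Ws @ z - teacher(z)      # distillation residual
    Ws -= lr * np.outer(err, z)    # SGD step on 0.5 * ||err||^2

ratio = (W1.size + W2.size) / Ws.size
print(f"parameter reduction: {ratio:.0f}x")
```

The student never sees training images, only latent codes and the teacher's responses to them; this is the mechanism that lets the paper shrink the decoder while keeping it "equally capable" on the distribution the teacher was trained for.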