PTS-LIC: Pruning Threshold Searching for Lightweight Learned Image Compression.

2023 IEEE International Conference on Visual Communications and Image Processing (VCIP)(2023)

Abstract
Learned Image Compression (LIC), which uses neural networks to compress images, has advanced rapidly in recent years, and hyperprior-module-based LIC models now outperform classical codecs. However, LIC models are too heavy, in both computation and parameter count, to deploy on edge devices. To address this, prior work has applied structural pruning to LIC models, but these methods either cause noticeable performance degradation or neglect to find an appropriate pruning threshold for each LIC model, leaving their pruning results sub-optimal. This paper proposes Pruning Threshold Searching (PTS) on the hyperprior module for LIC models of different qualities. Our method removes most parameters and calculations while matching the performance of the unpruned models: it removes at least 49.8% of parameters and 28.5% of calculations from Channel-Wise-Context-Model-based models, and 29.1% of parameters from Cheng-2020 models.
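The abstract does not spell out the search procedure, but the keywords mention binary search. Below is a minimal, hypothetical sketch of how a per-model pruning threshold could be binary-searched against a compression budget; the function names, the L1-norm importance criterion, and the budget-based stopping rule are all assumptions for illustration, not the paper's actual algorithm.

```python
def pruned_fraction(norms, threshold):
    # Fraction of channels whose importance score (here assumed to be the
    # filter's L1 norm) falls below the pruning threshold.
    return sum(1 for n in norms if n < threshold) / len(norms)

def search_threshold(norms, target, iters=50):
    # Binary-search the largest threshold whose pruned fraction does not
    # exceed the target budget. Because pruned_fraction is monotonically
    # non-decreasing in the threshold, bisection converges on the boundary.
    lo, hi = 0.0, max(norms)
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if pruned_fraction(norms, mid) <= target:
            lo = mid  # still within budget: try a larger threshold
        else:
            hi = mid  # too aggressive: back off
    return lo

# Example: ten channels with L1 norms 0.1 .. 1.0, budget of 50% pruned.
norms = [i / 10 for i in range(1, 11)]
t = search_threshold(norms, target=0.5)
```

In the paper's setting, the budget check would presumably be replaced by a rate-distortion test on the fine-tuned pruned model, so that the search keeps performance equal to the unpruned baseline rather than merely hitting a parameter count.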
Key words
Image Compression, Pruning Threshold, Learned Image Compression, Neural Network, Edge Devices, Hyperprior, Model Performance, Search Method, Bitrate, Binary Search, Conv Layer, Fine-Tuned Model, Pruning Method