
AV1 In-Loop Filtering Using a Wide-Activation Structured Residual Network

2019 IEEE International Conference on Image Processing (ICIP)

Abstract
The in-loop filter, an important component of modern video coding, improves both the subjective and objective quality of reconstructed frames. Recently, Convolutional Neural Networks (CNNs) have demonstrated superiority over traditional methods in addressing the in-loop filtering problem. In this paper, we develop a CNN-based in-loop filter, namely the Wide Activation Residual Network (WARN), for the AV1 encoder. On top of a plain Residual Network (ResNet), we introduce wide activation in each residual block, yielding a more reasonable allocation of network parameters. When incorporating WARN into the video encoder, particularly for inter coding, obtaining globally optimal performance is intricate. After simplifying this into an end-to-end trainable problem, we propose a skipping method that takes advantage of the hierarchical reference structure in AV1. Experimental results show that WARN achieves up to 14.42% and 9.64% BD-rate reduction in intra and inter coding, respectively.
Key words
AV1, CNN, video compression, in-loop filter
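The abstract describes wide-activation residual blocks stacked on a plain ResNet trunk, with the network predicting a correction added to the reconstructed frame. The sketch below illustrates that general idea in PyTorch, assuming a WDSR-style design (expand channels before the ReLU, shrink back after). The class names, channel widths, expansion factor, block count, and single-channel luma input are illustrative assumptions, not the configuration reported in the paper, and the frame-level skipping scheme for inter coding is not modeled here.

```python
# Minimal sketch of a wide-activation residual in-loop filter.
# Hypothetical configuration; not the authors' exact WARN architecture.
import torch
import torch.nn as nn


class WideActivationBlock(nn.Module):
    """Residual block that widens the feature maps before the activation."""

    def __init__(self, channels: int = 32, expansion: int = 4):
        super().__init__()
        wide = channels * expansion
        self.body = nn.Sequential(
            nn.Conv2d(channels, wide, kernel_size=3, padding=1),   # widen before ReLU
            nn.ReLU(inplace=True),
            nn.Conv2d(wide, channels, kernel_size=3, padding=1),   # narrow back to trunk width
        )

    def forward(self, x):
        return x + self.body(x)  # identity skip around the wide activation


class WARNSketch(nn.Module):
    """Toy in-loop filter: predicts a residual correction for the decoded frame."""

    def __init__(self, channels: int = 32, num_blocks: int = 8):
        super().__init__()
        self.head = nn.Conv2d(1, channels, kernel_size=3, padding=1)   # luma plane in
        self.blocks = nn.Sequential(
            *[WideActivationBlock(channels) for _ in range(num_blocks)]
        )
        self.tail = nn.Conv2d(channels, 1, kernel_size=3, padding=1)   # predicted correction

    def forward(self, recon):
        # Global residual: output = reconstructed frame + learned correction.
        return recon + self.tail(self.blocks(self.head(recon)))


if __name__ == "__main__":
    # Example: filter a batch of 64x64 reconstructed luma patches.
    model = WARNSketch()
    patches = torch.rand(4, 1, 64, 64)
    filtered = model(patches)
    print(filtered.shape)  # torch.Size([4, 1, 64, 64])
```

The wide-then-narrow convolution pair keeps the trunk (and thus the skip connections) slim while giving the nonlinearity more channels to act on, which is the parameter-allocation rationale the abstract alludes to.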