
Ensemble Knowledge Distillation of Self-Supervised Speech Models

IEEE International Conference on Acoustics, Speech, and Signal Processing (2023)

Key words
Self-supervised Learning, Ensemble Knowledge Distillation, SUPERB, Distortions