
PILE: Pairwise Iterative Logits Ensemble for Multi-Teacher Labeled Distillation

arXiv (2022)

Abstract
Pre-trained language models have become a crucial part of ranking systems and have recently achieved impressive results. To maintain high performance while keeping computation efficient, knowledge distillation is widely used. In this paper, we focus on two key questions in knowledge distillation for ranking models: 1) how to ensemble knowledge from multiple teachers; 2) how to utilize the label information of the data during distillation. We propose a unified algorithm called Pairwise Iterative Logits Ensemble (PILE) that tackles both questions simultaneously. PILE iteratively ensembles the logits of multiple teachers under the supervision of label information, and it achieves competitive performance in both offline and online experiments. The proposed method has been deployed in a real-world commercial search system.
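The abstract does not spell out the algorithm, so the following is a minimal, hypothetical sketch of one way a label-supervised, pairwise, iterative logits ensemble could look: teachers are merged into the ensemble one at a time, and a merge is kept only if it does not reduce pairwise ranking agreement with the labels. The function names, the greedy merge rule, and the toy data below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def pairwise_accuracy(logits, labels):
    """Fraction of document pairs (i, j) with labels[i] > labels[j]
    that the given logits rank in the same order."""
    correct, total = 0, 0
    for i in range(len(labels)):
        for j in range(len(labels)):
            if labels[i] > labels[j]:
                total += 1
                if logits[i] > logits[j]:
                    correct += 1
    return correct / total if total else 0.0

def pile_ensemble_sketch(teacher_logits, labels):
    """Greedy, label-supervised ensemble of teacher logits (hypothetical).

    teacher_logits: (n_teachers, n_docs) scores for one query's documents.
    labels: (n_docs,) graded relevance labels for those documents.
    """
    # Visit teachers from most to least label-consistent.
    order = sorted(range(len(teacher_logits)),
                   key=lambda k: -pairwise_accuracy(teacher_logits[k], labels))
    ensemble = teacher_logits[order[0]].astype(float).copy()
    n_merged = 1
    for k in order[1:]:
        # Tentatively average the next teacher into the ensemble.
        candidate = (ensemble * n_merged + teacher_logits[k]) / (n_merged + 1)
        # Keep the merge only if pairwise agreement with labels does not drop.
        if pairwise_accuracy(candidate, labels) >= pairwise_accuracy(ensemble, labels):
            ensemble, n_merged = candidate, n_merged + 1
    return ensemble  # ensembled soft targets for distilling the student

# Toy usage: three teachers scoring four documents for one query.
teachers = np.array([[2.0, 1.0, 0.5, 0.1],
                     [1.5, 1.8, 0.2, 0.3],
                     [0.2, 0.1, 1.0, 0.9]])
labels = np.array([2, 1, 1, 0])
print(pile_ensemble_sketch(teachers, labels))
```

The resulting ensembled logits would serve as soft targets when training the student ranker; the actual PILE update rule should be taken from the paper itself.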
Keywords
distillation, pairwise iterative logits ensemble, multi-teacher