An Optimized Method for Large-Scale Pre-Training in Symbolic Music

Shike Liu, Hongguang Xu, Ke Xu

2022 IEEE 16th International Conference on Anti-counterfeiting, Security, and Identification (ASID), 2022

Abstract
A better understanding of music can effectively improve the performance of music recommendation and generation. Although it has been shown that directly applying the BERT pre-training method is effective in the field of symbolic music, the performance of BERT still has significant room for improvement. In this paper, we focus on the BERT model and propose a method to enhance its performance in the symbolic music domain. To mitigate the problem of information leakage between adjacent music tokens during pre-training, we propose a masking strategy that optimizes pre-training by corrupting the data with a novel mechanism. Furthermore, the pre-training datasets used in our work cover both classical and popular music, providing more comprehensive knowledge of different kinds of music, and a dynamic masking strategy is employed to make full use of the data. We evaluate our improved model on four downstream tasks: melody extraction, velocity prediction, composer classification, and emotion classification. Experiments demonstrate that our proposed method has better music-understanding ability than the baselines.
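The abstract does not spell out the masking mechanism, but a common way to reduce leakage between adjacent, highly correlated music tokens is to mask contiguous spans (for example, whole bars) rather than independent positions, and to re-sample the mask on every pass over the data (dynamic masking). The sketch below illustrates this idea in Python; the function name `span_mask`, its parameters, and the 80/10/10 replacement scheme are illustrative assumptions borrowed from standard BERT-style pre-training, not details taken from the paper.

```python
import random

def span_mask(tokens, mask_id, mask_prob=0.15, span_len=3, vocab_size=1000):
    """Mask contiguous spans of tokens instead of independent positions.

    Masking a whole span prevents the model from trivially recovering a
    masked token from its immediate, highly correlated neighbours, which
    is the "information leakage" problem the abstract refers to. All
    names and parameter values here are illustrative, not from the paper.
    """
    tokens = list(tokens)
    labels = [-100] * len(tokens)              # -100 = position ignored by the loss
    num_to_mask = max(1, int(len(tokens) * mask_prob))
    masked = 0
    while masked < num_to_mask:
        start = random.randrange(len(tokens))  # random span start
        for i in range(start, min(start + span_len, len(tokens))):
            if labels[i] != -100:
                continue                       # already inside a masked span
            labels[i] = tokens[i]              # target: predict the original token
            r = random.random()
            if r < 0.8:
                tokens[i] = mask_id            # 80%: replace with [MASK]
            elif r < 0.9:
                tokens[i] = random.randrange(vocab_size)  # 10%: random token
            # remaining 10%: keep the original token unchanged
            masked += 1
    return tokens, labels
```

Calling `span_mask` freshly each time a sequence is loaded, rather than fixing the mask once at preprocessing time, implements dynamic masking: repeated epochs over the same corpus see different corruptions of each sequence.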
Keywords
music understanding,symbolic music,mask strategy,BERT