Task-Consistent Meta Learning for Low-Resource Speech Recognition

NLPCC (1) (2023)

Abstract
We propose a new meta-learning framework that improves on previous approaches to low-resource speech recognition. Meta-learning has proven to be a powerful paradigm for transferring knowledge from prior tasks to facilitate the learning of a novel task. However, when task environments are complex and tasks pull the learner in diverse directions, averaging all task gradients fails to capture meta-knowledge effectively. To address this challenge, we propose a task-consistent multilingual meta-learning (TCMML) method that adopts the gradient agreement algorithm to steer the model parameters in a direction on which tasks are more consistent: if a task's gradient matches the average gradient, its weight in the meta-optimization is increased, and vice versa. Experiments on two datasets demonstrate that the proposed system achieves performance comparable or even superior to state-of-the-art baselines on low-resource languages, and that it combines easily with various meta-learning methods.
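The weighting rule described above can be sketched in a few lines. This is a minimal NumPy illustration of gradient-agreement weighting, not the paper's implementation: the function names and the normalization are assumptions, and each per-task gradient is represented as a flattened vector.

```python
import numpy as np

def gradient_agreement_weights(task_grads):
    """Weight each task gradient by its inner product with the average
    task gradient. Tasks aligned with the average receive larger weights;
    conflicting tasks receive smaller (possibly negative) weights.

    task_grads: list of flattened per-task gradient vectors (hypothetical input format).
    """
    g = np.stack(task_grads)          # shape (num_tasks, num_params)
    g_avg = g.mean(axis=0)            # plain average gradient
    scores = g @ g_avg                # agreement score per task
    # Normalize by total absolute agreement so weights are on a comparable scale.
    return scores / (np.abs(scores).sum() + 1e-12)

def meta_gradient(task_grads):
    """Combine per-task gradients with agreement weights instead of a
    plain average (a simplified stand-in for the meta-update)."""
    w = gradient_agreement_weights(task_grads)
    g = np.stack(task_grads)
    return (w[:, None] * g).sum(axis=0)
```

For example, given two tasks whose gradients roughly agree and one that points the opposite way, the conflicting task receives a negative weight, so the combined update is dominated by the consistent tasks rather than being washed out by a plain average.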
Keywords
speech recognition, meta-learning, task-consistent, low-resource