Generalized Dictionary for Multitask Learning with Boosting.

International Joint Conference on Artificial Intelligence (2016)

Abstract
While multitask learning has been extensively studied, most existing methods rely on linear models (e.g., linear regression, logistic regression), which may fail on more general (nonlinear) problems. In this paper, we present a new approach that combines dictionary learning with gradient boosting to achieve multitask learning with general (nonlinear) basis functions. Specifically, for each task we learn a sparse representation over a nonlinear dictionary that is shared across the set of tasks. Each atom of the dictionary is a nonlinear feature mapping of the original input space, learned in function space by gradient boosting. The resulting model is a hierarchical ensemble in which the top layer consists of the task-specific sparse coefficients and the bottom layer consists of the boosted models common to all tasks. The proposed method combines the advantages of both dictionary learning and boosting for multitask learning: knowledge is shared across tasks via the dictionary, while boosting provides flexibility and strong generalization. More importantly, this general framework can be used to adapt any learning algorithm to (nonlinear) multitask learning. Experimental results on both synthetic and benchmark real-world datasets confirm the effectiveness of the proposed approach for multitask learning.
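The following is a minimal sketch of the general idea described in the abstract, not the authors' exact algorithm: it alternates between (1) growing each shared dictionary atom by one gradient-boosting step and (2) re-solving the task-specific sparse coefficients. It assumes a squared-error loss, uses scikit-learn regression trees as weak learners and Lasso for the sparse coding step, and all names and hyperparameters are illustrative only.

# Hedged sketch of a shared nonlinear dictionary learned by boosting,
# with task-specific sparse coefficients. Illustrative, not the paper's method.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Lasso


class BoostedAtom:
    """One dictionary atom: an additive ensemble of shallow regression trees."""

    def __init__(self, learning_rate=0.1, max_depth=2):
        self.trees, self.lr, self.max_depth = [], learning_rate, max_depth

    def predict(self, X):
        out = np.zeros(X.shape[0])
        for tree in self.trees:
            out += self.lr * tree.predict(X)
        return out

    def boost(self, X, residual):
        """Add one tree fitted to the functional negative gradient."""
        tree = DecisionTreeRegressor(max_depth=self.max_depth)
        tree.fit(X, residual)
        self.trees.append(tree)


def fit_shared_dictionary(tasks, n_atoms=5, n_rounds=20, alpha=0.01, seed=0):
    """tasks: list of (X_t, y_t) pairs; squared-error loss is assumed."""
    rng = np.random.default_rng(seed)
    atoms = [BoostedAtom() for _ in range(n_atoms)]
    coefs = [rng.normal(scale=0.1, size=n_atoms) for _ in tasks]

    for _ in range(n_rounds):
        # Step 1: boost each atom on pooled, coefficient-weighted residuals.
        for k, atom in enumerate(atoms):
            X_all, grad_all = [], []
            for (X, y), w in zip(tasks, coefs):
                pred = sum(w[j] * atoms[j].predict(X) for j in range(n_atoms))
                # negative functional gradient of 0.5*(y - pred)^2 w.r.t. atom k
                grad_all.append(w[k] * (y - pred))
                X_all.append(X)
            atom.boost(np.vstack(X_all), np.concatenate(grad_all))

        # Step 2: re-solve each task's sparse coefficients over the atoms.
        for t, (X, y) in enumerate(tasks):
            Phi = np.column_stack([atom.predict(X) for atom in atoms])
            coefs[t] = Lasso(alpha=alpha, fit_intercept=False).fit(Phi, y).coef_

    return atoms, coefs


# Toy usage: two related nonlinear regression tasks sharing basis functions.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X1, X2 = rng.uniform(-3, 3, (200, 1)), rng.uniform(-3, 3, (200, 1))
    y1 = np.sin(X1[:, 0]) + 0.5 * X1[:, 0] + 0.1 * rng.normal(size=200)
    y2 = np.sin(X2[:, 0]) - 0.3 * X2[:, 0] + 0.1 * rng.normal(size=200)
    atoms, coefs = fit_shared_dictionary([(X1, y1), (X2, y2)])
    print("task-specific coefficients:", [np.round(c, 2) for c in coefs])

Each task's predictor is a sparse linear combination of the shared boosted atoms, which mirrors the two-layer structure described above; the paper's actual optimization and loss functions may differ.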
Keywords
generalized dictionary, boosting, learning