Mapping Across Relational Domains for Transfer Learning with Word Embeddings-Based Similarity

Inductive Logic Programming (ILP 2021), 2022

Abstract
Statistical machine learning models are a concise representation of probabilistic dependencies among the attributes of an object. Most models assume that training and testing data come from the same distribution. Transfer learning has emerged as an essential technique for scenarios where this assumption does not hold, as it leverages the knowledge acquired in one or more learning tasks as a starting point for solving a new task. Statistical Relational Learning (SRL) extends statistical learning to represent and learn from data with several objects and their relations. In this way, SRL deals with data described by a rich vocabulary of classes, objects, their properties, and relationships. When applying transfer learning to SRL, the primary challenge is to transfer the learned structure by mapping the vocabulary from a source domain to a different target domain. To address the problem of transferring across domains, we propose TransBoostler, which uses pre-trained word embeddings to guide the mapping, as the name of a predicate usually has a semantic connotation that can be mapped to a vector space model. After transferring, TransBoostler employs theory revision operators to further adapt the mapped model to the target data. In the experimental results, TransBoostler successfully transferred trees from a source to a distinct target domain, performing equal to or better than previous work while requiring less training time.
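To illustrate the general idea of guiding a predicate mapping with word-embedding similarity, the following is a minimal Python sketch. It is not the paper's implementation: the toy random embeddings and predicate names are placeholders, and in practice the vectors would come from a pre-trained model (e.g. word2vec or fastText) keyed by predicate name. The sketch greedily maps each source predicate to the most similar unused target predicate by cosine similarity.

import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def map_predicates(source_preds, target_preds, embed):
    # Greedily assign each source predicate the most similar
    # unused target predicate, ranked by the cosine similarity
    # of the embeddings of their names.
    mapping = {}
    available = set(target_preds)
    for s in source_preds:
        if not available:
            break
        best = max(available, key=lambda t: cosine(embed[s], embed[t]))
        mapping[s] = best
        available.remove(best)
    return mapping

# Hypothetical toy embeddings standing in for pre-trained vectors.
rng = np.random.default_rng(0)
embed = {name: rng.normal(size=50)
         for name in ["advisedby", "publication", "workedunder", "movie"]}

print(map_predicates(["advisedby", "publication"],
                     ["workedunder", "movie"], embed))

In the actual system, a step like this would only produce the initial vocabulary mapping; theory revision operators would then adapt the transferred trees to the target data.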
Keywords
Transfer learning, Statistical relational learning, Word embeddings