Function Contrastive Learning of Transferable Representations

International Conference on Machine Learning (ICML), Vol. 139 (2021)

Abstract
Few-shot learning seeks to find models that are capable of fast adaptation to novel tasks. Unlike typical few-shot learning algorithms, we propose a contrastive learning method which is not trained to solve a set of tasks, but rather attempts to find a good representation of the underlying data-generating processes (functions). This allows for finding representations which are useful for an entire series of tasks sharing the same function. In particular, our training scheme is driven by a self-supervision signal indicating whether two sets of samples stem from the same underlying function. Our experiments on a number of synthetic and real-world datasets show that the representations we obtain can outperform strong baselines in terms of downstream performance and noise robustness, even when these baselines are trained in an end-to-end manner.
Keywords
Few-Shot Learning, Transfer Learning, Representation Learning, Robust Learning, Unsupervised Learning