Automated Contrastive Learning Strategy Search for Time Series
arXiv (2024)
Abstract
In recent years, Contrastive Learning (CL) has become a predominant
representation learning paradigm for time series. Most existing methods in the
literature focus on manually building specific Contrastive Learning Strategies
(CLS) by human heuristics for certain datasets and tasks. However, manually
developing CLS usually requires excessive prior knowledge of the datasets and
tasks, e.g., professional understanding of medical time series in healthcare,
as well as substantial human labor and extensive experiments to determine the detailed
learning configurations. In this paper, we present an Automated Machine
Learning (AutoML) practice at Microsoft, which automatically learns to
contrastively learn representations for various time series datasets and tasks,
namely Automated Contrastive Learning (AutoCL). We first construct a principled
universal search space of size over 3×10¹², covering data augmentation,
embedding transformation, contrastive pair construction and contrastive losses.
Further, we introduce an efficient reinforcement learning algorithm, which
optimizes CLS based on performance on the validation tasks, to obtain more
effective CLS within the space. Experimental results on various real-world
tasks and datasets demonstrate that AutoCL can automatically find a
suitable CLS for a given dataset and task. From the candidate CLS found by
AutoCL on several public datasets/tasks, we compose a transferable Generally
Good Strategy (GGS), which performs strongly on other datasets. We also
provide empirical analyses as guidance for the future design of CLS.
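To make the search setup concrete, the following is a minimal illustrative sketch, not the paper's implementation: a toy CLS search space over the four axes the abstract names (data augmentation, embedding transformation, contrastive pair construction, contrastive loss), explored with a simple epsilon-greedy loop as a stand-in for the paper's reinforcement learning algorithm. All option names and the scoring function are hypothetical placeholders; the real space exceeds 3×10¹² configurations.

```python
# Hypothetical sketch of automated CLS search; NOT the paper's actual
# search space, option names, or RL algorithm.
import random

# Toy search space over the four axes named in the abstract.
SEARCH_SPACE = {
    "augmentation": ["jitter", "scaling", "permutation", "cropping"],
    "embedding_transform": ["identity", "projection_head"],
    "pair_construction": ["instance", "temporal", "cross_view"],
    "loss": ["infonce", "triplet", "hierarchical"],
}

def sample_strategy(rng):
    """Sample one Contrastive Learning Strategy (CLS) uniformly."""
    return {axis: rng.choice(opts) for axis, opts in SEARCH_SPACE.items()}

def validation_score(strategy, rng):
    """Placeholder for downstream validation performance (hypothetical):
    in the real system this would train with the CLS and evaluate the
    learned representations on a validation task."""
    base = 0.5
    if strategy["loss"] == "infonce":
        base += 0.1  # pretend InfoNCE tends to score higher here
    return base + rng.uniform(-0.05, 0.05)

def search(n_trials=50, epsilon=0.3, seed=0):
    """Epsilon-greedy search: mostly explore random CLS, sometimes
    re-evaluate the current best; keep the highest-scoring strategy."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_trials):
        if best is None or rng.random() < epsilon:
            candidate = sample_strategy(rng)  # explore
        else:
            candidate = best                  # exploit
        score = validation_score(candidate, rng)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score
```

The actual method replaces the uniform sampler and epsilon-greedy rule with a learned RL policy over the space, and the placeholder score with real validation-task performance; the sketch only shows the outer search loop's shape.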