Toward Learning Joint Inference Tasks for IASS-MTS Using Dual Attention Memory With Stochastic Generative Imputation

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Irregularly, asynchronously, and sparsely sampled multivariate time series (IASS-MTS) are characterized by sparse and uneven time intervals and nonsynchronous sampling rates, posing significant challenges for machine learning models to learn complex relationships within and beyond IASS-MTS to support various inference tasks. Existing methods typically either focus solely on single-task forecasting or simply chain a separate imputation preprocessing step to a subsequent classification stage. However, these methods often ignore valuable annotated labels or fail to discover meaningful patterns from unlabeled data. Moreover, separate prefilling may introduce errors due to noise in the raw records and thus degrade downstream prediction performance. To overcome these challenges, we propose the time-aware dual attention and memory-augmented network (DAMA) with stochastic generative imputation (SGI). Our model constructs a joint task learning architecture that unifies the imputation and classification tasks collaboratively. First, we design a new time-aware DAMA that accounts for irregular sampling rates, inherent data nonalignment, and sparse values in IASS-MTS data. The proposed network integrates both attention and memory to effectively analyze complex interactions within and across IASS-MTS for the classification task. Second, we develop the SGI network, which uses auxiliary information from sequence data to infer the missing observations of the time series. By balancing the joint tasks, our model facilitates interaction between them, leading to improved performance on both classification and imputation. Third, we evaluate our model on real-world datasets and demonstrate its superior performance in terms of imputation accuracy and classification results, outperforming the baselines.
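The joint-task idea summarized above (balancing an imputation objective against a classification objective) can be sketched as a weighted sum of two losses. This is a minimal illustrative sketch only; the helper names, the masked-MSE imputation loss, the cross-entropy classification loss, and the trade-off weight `lam` are assumptions for exposition, not the paper's actual DAMA/SGI formulation.

```python
import numpy as np

def imputation_loss(x_hat, x_true, mask):
    # Mean squared error over observed entries only
    # (mask == 1 marks values that were actually observed).
    diff = (x_hat - x_true) * mask
    return float((diff ** 2).sum() / max(mask.sum(), 1.0))

def classification_loss(logits, label):
    # Cross-entropy computed via a numerically stable softmax.
    z = logits - logits.max()
    p = np.exp(z) / np.exp(z).sum()
    return float(-np.log(p[label] + 1e-12))

def joint_loss(x_hat, x_true, mask, logits, label, lam=0.5):
    # Balance the two tasks with a single trade-off weight lam,
    # so gradients from both objectives shape the shared model.
    return (lam * imputation_loss(x_hat, x_true, mask)
            + (1.0 - lam) * classification_loss(logits, label))

# Tiny usage example with a 3-step univariate series and 2 classes.
x_true = np.array([1.0, 2.0, 3.0])
x_hat = np.array([1.1, 2.0, 0.0])     # third entry unobserved, so ignored
mask = np.array([1.0, 1.0, 0.0])
logits = np.array([2.0, -1.0])
total = joint_loss(x_hat, x_true, mask, logits, label=0, lam=0.5)
```

In a real joint-training setup, `lam` would be tuned (or scheduled) so that neither task dominates the shared representation.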
Keywords
Deep generative model, dual attention memory, irregularly sampled multivariate time series, joint task learning, stochastic imputation