Deep Neural Networks for Predicting Task Time Series in Cloud Computing Systems

2019 IEEE 16th International Conference on Networking, Sensing and Control (ICNSC) (2019)

Cited by 28 | Viewed 25
Abstract
Cloud services hosted in cloud data centers have become an essential part of Internet services. Despite their numerous benefits, cloud providers face challenges in accurately predicting large-scale task time series. Such prediction benefits providers because appropriate resource provisioning can then be performed to fully satisfy service-level agreements with users without wasting computing and networking resources. In this work, we first apply a logarithmic transformation before smoothing the task sequence to reduce its standard deviation. A Savitzky-Golay (S-G) filter is then used to eliminate extreme points and noise in the original sequence. Next, we propose an integrated prediction method that combines the S-G filter with Long Short-Term Memory (LSTM) networks to predict task time series at the next time slot. We further adopt gradient clipping to eliminate the exploding-gradient problem, and train the model with the Adam optimizer to achieve the best results. Experimental results demonstrate that the proposed method achieves better prediction accuracy than several commonly used prediction methods.
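The preprocessing steps described in the abstract (logarithmic transformation, Savitzky-Golay smoothing) and the gradient-clipping step can be sketched as below. The window length, polynomial order, and clipping threshold are illustrative assumptions, since the abstract does not give concrete values, and the LSTM model itself is omitted for brevity.

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess(task_counts, window=11, polyorder=3):
    """Log-transform a task-count series, then smooth it with an S-G filter.

    window and polyorder are assumed values; the paper does not state them.
    """
    logged = np.log1p(task_counts)  # logarithmic operation reduces the std. deviation
    # Savitzky-Golay filtering removes extreme points and noise interference
    smoothed = savgol_filter(logged, window, polyorder)
    return smoothed

def clip_gradient(grad, max_norm=5.0):
    """Clip a gradient vector by its global norm to curb exploding gradients."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

# Example: a noisy synthetic task series (not real trace data)
rng = np.random.default_rng(0)
series = np.abs(1000 + 200 * np.sin(np.linspace(0, 6, 200))
                + rng.normal(0, 80, 200))
smooth = preprocess(series)
```

The smoothed sequence would then be windowed into input/target pairs and fed to an LSTM trained with Adam, applying `clip_gradient` to the backpropagated gradients at each step.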
Keywords
Cloud data centers, LSTM, recurrent neural networks, task time series prediction, Savitzky-Golay filter