A hybrid CNN-LSTM model for predicting server load in cloud computing

The Journal of Supercomputing (2022)

Abstract
Complex resource usage patterns of scaling Cloud workloads and heterogeneous infrastructure remain a challenge for accurate modelling of server load, which is key to effective capacity sizing and provisioning in data centers. Recently, Long Short-Term Memory (LSTM) networks have been used for host load prediction. However, learning complex noisy variations in host load is still an issue that needs to be addressed. In this work, we propose pCNN-LSTM, a hybrid prediction approach comprising 1-dimensional Convolutional Neural Networks (1D CNN) and LSTM, to predict CPU utilization on Cloud servers at multiple consecutive time-steps. It consists of three parallel dilated 1D CNN layers with different dilation rates for pattern extraction from noisy host CPU usage, and an LSTM layer that learns temporal dependencies within the raw usage values as well as within the patterns extracted by the 1D CNN layers. Convolutions with different dilation rates enable the model to learn CPU load variations at different scales. The prediction skill of pCNN-LSTM is demonstrated using the Google cluster trace, the Alibaba trace and the Bitbrains data set, and performance is measured using Mean Squared Error (MSE) and Root Mean Squared Error (RMSE). pCNN-LSTM achieves up to 15%, 13% and 16% improvements in host load prediction with the Google trace, Alibaba trace and Bitbrains data set, respectively, over LSTM, Bidirectional LSTM (BLSTM), CNN-LSTM, CNN-BLSTM and two of its variants, demonstrating the effectiveness of the multi-scale learning capability of pCNN-LSTM and establishing its applicability as an adaptive prediction method for improved capacity planning and provisioning.
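The core mechanism the abstract describes is dilated 1D convolution: a filter whose taps are spaced `dilation` steps apart, so branches with different dilation rates observe the load series at different time scales before an LSTM consumes them. The sketch below is illustrative only (not the authors' code): it implements a causal dilated 1D convolution in NumPy and stacks three branch outputs with the raw series, as the parallel branches of pCNN-LSTM would before the LSTM layer. The kernel, series values and dilation rates (1, 2, 4) are hypothetical choices for demonstration.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """Causal dilated 1D convolution (illustrative sketch).

    Receptive field = (len(kernel) - 1) * dilation + 1, so branches with
    larger dilation rates see further back in time with the same kernel size.
    """
    span = (len(kernel) - 1) * dilation   # how far back the filter reaches
    # left-pad with zeros so output length equals input length (causal padding)
    padded = np.concatenate([np.zeros(span), x])
    out = np.zeros(len(x))
    for t in range(len(x)):
        # taps are `dilation` steps apart within the padded window
        taps = padded[t : t + span + 1 : dilation]
        out[t] = np.dot(taps, kernel)
    return out

# Toy CPU-utilization series and a moving-average kernel (hypothetical values)
cpu = np.array([0.2, 0.4, 0.3, 0.8, 0.6, 0.5, 0.9, 0.7])
kernel = np.array([1/3, 1/3, 1/3])

# Three parallel branches at different dilation rates, mimicking pCNN-LSTM's
# multi-scale pattern extraction
branches = [dilated_conv1d(cpu, kernel, d) for d in (1, 2, 4)]

# Concatenating branch outputs with the raw values would form the per-step
# feature vector fed to the LSTM layer
features = np.stack(branches + [cpu], axis=-1)  # shape (8, 4)
```

With dilation 1 the branch is an ordinary 3-tap moving average; dilations 2 and 4 widen the receptive field to 5 and 9 time-steps without adding parameters, which is what lets the model capture both short bursts and slower load trends.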
Keywords
Cloud computing, Capacity planning, Long short-term memory network, 1-dimensional Convolutional Neural Networks, Dilated convolutions, Receptive fields, Temporal patterns