μOpt: An Efficient Optimal Autoscaler for Microservice Applications

2023 IEEE International Conference on Autonomic Computing and Self-Organizing Systems (ACSOS), 2023

Abstract
Microservices are a popular architecture for cloud-based applications subject to stringent performance requirements. To effectively serve variable workloads, autoscaling allocates computational resources, ideally at the lowest possible cost. Although several autoscaling techniques have already been proposed in the literature, they suffer from high computational complexity. Here we propose μOpt, a computationally efficient model-based autoscaler for microservices. By solving a nonlinear optimization problem that embeds a layered queueing network (LQN) model, μOpt computes optimal configurations that maximize performance while minimizing allocated resources. We validate μOpt on a benchmark microservice application, reporting fast solution times (~10⁻¹ s) that enable prompt reactions to highly variable workloads. Compared to a state-of-the-art autoscaler based on LQNs and genetic algorithms, μOpt achieves higher performance (~6%-8%) with significantly fewer allocated resources (~15%-35%) under both synthetic and real-world workloads.
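To illustrate the general idea of model-based autoscaling described above, the following is a minimal sketch in Python. It assumes a single-tier M/M/c queueing approximation instead of the paper's full LQN model, and brute-force search over replica counts instead of the nonlinear optimization solver μOpt actually uses; all function names, parameters, and the numbers in the usage example are illustrative, not taken from the paper.

```python
import math

def erlang_c(c: int, rho: float) -> float:
    """Probability that an arriving request must wait (Erlang-C formula)
    for c servers with offered load rho = lambda / mu, requiring rho < c."""
    s = sum(rho**k / math.factorial(k) for k in range(c))
    top = rho**c / math.factorial(c) * (c / (c - rho))
    return top / (s + top)

def mean_response_time(c: int, lam: float, mu: float) -> float:
    """Predicted mean response time of an M/M/c queue: service time plus expected wait."""
    rho = lam / mu
    if rho >= c:
        return math.inf  # unstable: offered load exceeds capacity
    wait = erlang_c(c, rho) / (c * mu - lam)
    return 1.0 / mu + wait

def min_replicas(lam: float, mu: float, slo: float, c_max: int = 256) -> int:
    """Smallest replica count whose predicted mean response time meets the SLO."""
    for c in range(1, c_max + 1):
        if mean_response_time(c, lam, mu) <= slo:
            return c
    raise ValueError("SLO unreachable within c_max replicas")

# Hypothetical example: 120 req/s arrival rate, 20 req/s per replica,
# 150 ms mean-response-time target -> prints the minimal replica count (7).
print(min_replicas(lam=120.0, mu=20.0, slo=0.150))
```

An actual autoscaler in this vein would re-solve such a model at each control interval with the currently measured workload, which is why the paper's ~10⁻¹ s solution times matter for reacting to rapidly varying traffic.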
Keywords
Microservices, Autoscaling, Performance Modeling, Optimization, Layered Queueing Networks