Green, Pervasive, and Cloud Computing

Lecture Notes in Computer Science (2019)

Abstract
Machine Learning (ML) solutions need to deal efficiently with a huge amount of available data, addressing scalability concerns without sacrificing predictive performance. Moreover, this data arrives as a continuous and evolving stream, imposing new constraints, e.g., limited memory and energy resources. Likewise, energy-aware ML algorithms are gaining relevance due to the power constraints of hardware platforms in several real-life applications, such as the Internet of Things (IoT). Many algorithms have been proposed to cope with the mutable nature of data streams, the Very Fast Decision Tree (VFDT) being one of the most widely used. An adaptation of the VFDT, called Strict VFDT (SVFDT), can significantly reduce memory usage without sacrificing predictive performance or time efficiency. However, the energy consumption of the VFDT and SVFDT during data stream processing has been overlooked. In this work, we compare the four-way relationship between predictive performance, memory costs, time efficiency, and energy consumption, tuning the hyperparameters of the algorithms to optimise the resources devoted to them. Experiments over 23 benchmark datasets revealed that the SVFDT-I is the most energy-friendly algorithm and greatly reduces memory consumption, being statistically superior to the VFDT.
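The sketch below is not from the paper; it only illustrates how a VFDT-style Hoeffding tree is typically trained and evaluated prequentially (test-then-train) on a data stream, using the `river` library as an assumed stand-in. The SVFDT variants studied in the paper are not available in `river`, and the dataset (`datasets.Phishing`) and hyperparameter names (`grace_period`, `delta`) are assumptions chosen to show the kind of settings one would tune when trading off accuracy, memory, time, and energy.

```python
# Minimal sketch, assuming the `river` streaming-ML library:
# prequential evaluation of a VFDT-style Hoeffding tree.
from river import datasets, metrics, tree

stream = datasets.Phishing()            # small benchmark stream yielding (x, y) pairs
model = tree.HoeffdingTreeClassifier(
    grace_period=200,                   # instances observed at a leaf before attempting a split
    delta=1e-7,                         # split confidence used in the Hoeffding bound
)
accuracy = metrics.Accuracy()

for x, y in stream:
    y_pred = model.predict_one(x)       # test first ...
    if y_pred is not None:
        accuracy.update(y, y_pred)
    model.learn_one(x, y)               # ... then train on the same instance

print(f"Prequential accuracy: {accuracy.get():.3f}")
```

In this setup, memory and energy costs would be measured around the same loop (e.g., tree size and per-instance processing time), which is the kind of four-way comparison the abstract describes.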