Reviewing and Discussing Graph Reduction in Edge Computing Context

Computation (2022)

Abstract
Much effort has been devoted to efficiently transferring different machine-learning algorithms, and especially deep neural networks, to edge devices in order to meet, among others, real-time, storage, and energy-consumption requirements. The limited resources of edge devices and the need to save energy to extend battery life have encouraged an interesting trend of reducing neural networks and graphs while keeping their predictive performance almost untouched. In this work, an alternative to the latest techniques for finding these reductions in network size is proposed, seeking a simple way to shrink networks while preserving, as far as possible, their predictive performance, tested on well-known datasets.
Key words
graph reduction, edge computing, artificial intelligence, pruning
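
The abstract and keywords point to pruning as the mechanism for shrinking networks while trying to keep their predictive performance. As a point of reference, the sketch below shows magnitude-based weight pruning, a common baseline for this kind of reduction; it is not the method proposed in the paper, and the layer shape and sparsity level are illustrative assumptions.

```python
# Minimal sketch of magnitude-based weight pruning: zero out the weights
# with the smallest absolute values. Purely illustrative; not the paper's
# reduction method. Layer shape and sparsity level are assumptions.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(64, 32))              # a hypothetical dense layer
    w_pruned = magnitude_prune(w, sparsity=0.8)
    kept = np.count_nonzero(w_pruned) / w.size
    print(f"non-zero weights kept: {kept:.0%}")  # roughly 20%
```

In practice, pruning of this kind is usually followed by a short fine-tuning pass so the remaining weights can compensate for the removed connections.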