
Improving Noise Tolerance of Hardware Accelerated Artificial Neural Networks

ICMLA(2018)

Abstract
This paper analyzes the effect of noise during training and inference for hardware-accelerated deep artificial neural networks. Noise is a critical consideration when designing machine learning hardware because real devices and circuits are non-ideal. We find that both forward-propagation noise and weight-update noise can degrade inference, but do not necessarily harm the training result. A pipelining approach that supports redundant runs of each input image is proposed to mitigate forward-propagation noise, and a complementary approach that uses multiple parallel memory cells to represent one synaptic weight is proposed to mitigate weight-update noise. With these approaches, MNIST classification accuracy on a deep neural network can be brought close to the ideal accuracy obtained when no noise is present.
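The two redundancy schemes summarized in the abstract can be illustrated with a toy Monte-Carlo sketch. This is a hypothetical Gaussian noise model with illustrative noise levels and redundancy factors (`R`, `K`), not the paper's actual hardware or noise parameters; it only shows why averaging redundant runs and parallel weight cells suppresses noise by roughly the square root of the redundancy.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Forward-propagation noise, mitigated by redundant runs ---
W = rng.normal(size=(10, 784))   # toy layer weights
x = rng.normal(size=784)         # one "input image"

def noisy_forward(x, sigma=0.5):
    # Hypothetical read-noise model: each analog output is
    # perturbed by independent Gaussian noise.
    return W @ x + rng.normal(scale=sigma, size=10)

clean = W @ x
single_err = np.linalg.norm(noisy_forward(x) - clean)

R = 16  # redundant pipelined runs of the same input
avg_out = np.mean([noisy_forward(x) for _ in range(R)], axis=0)
redundant_err = np.linalg.norm(avg_out - clean)

# --- Weight-update noise, mitigated by K parallel memory cells ---
target_w = 0.3   # intended synaptic weight value
K = 8            # parallel cells representing one weight
n_trials = 1000  # Monte-Carlo trials to estimate RMS write error
writes = target_w + rng.normal(scale=0.1, size=(n_trials, K))
single_rms = np.sqrt(np.mean((writes[:, 0] - target_w) ** 2))
parallel_rms = np.sqrt(np.mean((writes.mean(axis=1) - target_w) ** 2))

print(f"forward noise: single run {single_err:.3f}, "
      f"{R} redundant runs {redundant_err:.3f}")
print(f"weight noise:  single cell RMS {single_rms:.3f}, "
      f"{K} parallel cells RMS {parallel_rms:.3f}")
```

In both cases the averaged error shrinks by roughly a factor of the square root of the redundancy (here about 4x for `R = 16` and about 2.8x for `K = 8`), which is the statistical intuition behind trading extra hardware or latency for noise tolerance.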
Key words
machine learning hardware, non-volatile memory, deep neural networks, noise, inference, pipelining