Stochastic Data-Driven Hardware Resilience To Efficiently Train Inference Models For Stochastic Hardware Implementations

2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019

Abstract
Machine-learning algorithms are being employed in an increasing range of applications, spanning high-performance and energy-constrained platforms. It has been noted that the statistical nature of these algorithms opens up new opportunities for throughput and energy efficiency, by moving hardware into design regimes not limited to deterministic models of computation. This work aims to enable high accuracy in machine-learning inference systems whose computations are substantially affected by hardware variability. Previous work has addressed this by training inference-model parameters for a particular instance of variation-affected hardware. Here, training is instead performed for the distribution of variation-affected hardware, eliminating the need for instance-by-instance training. The approach is referred to as Stochastic Data-Driven Hardware Resilience (S-DDHR), and it is demonstrated for an in-memory-computing architecture based on magnetoresistive random-access memory (MRAM). S-DDHR successfully addresses different samples of stochastic hardware, which would otherwise suffer degraded performance due to hardware variability.
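As a rough illustration of the idea only (not the authors' MRAM variation model or training code), the sketch below trains a small network while injecting a freshly sampled, hypothetical Gaussian gain/offset perturbation on every forward pass. This makes the learned parameters target the distribution of variation-affected hardware rather than a single measured instance; the layer name, noise model, and sigma values are assumptions for illustration.

```python
# Minimal sketch of distribution-level variation-aware training.
# The Gaussian per-output gain/offset model is a hypothetical stand-in
# for the MRAM in-memory-computing variation statistics in the paper.
import torch
import torch.nn as nn

class StochasticVariationLinear(nn.Linear):
    """Linear layer whose output is perturbed by a freshly sampled
    hardware-variation instance on each training forward pass."""

    def __init__(self, in_features, out_features, gain_sigma=0.05, offset_sigma=0.02):
        super().__init__(in_features, out_features, bias=True)
        self.gain_sigma = gain_sigma      # assumed spread of per-column gain
        self.offset_sigma = offset_sigma  # assumed spread of per-column offset

    def forward(self, x):
        y = super().forward(x)
        if self.training:
            # Draw one variation instance per mini-batch, emulating a
            # different sample of stochastic hardware each time.
            gain = 1.0 + self.gain_sigma * torch.randn(self.out_features, device=y.device)
            offset = self.offset_sigma * torch.randn(self.out_features, device=y.device)
            y = y * gain + offset
        return y

# Tiny model trained against the variation distribution, not one instance.
model = nn.Sequential(
    StochasticVariationLinear(16, 32), nn.ReLU(),
    StochasticVariationLinear(32, 4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 16)            # placeholder inputs
t = torch.randint(0, 4, (64,))     # placeholder labels
for _ in range(5):
    opt.zero_grad()
    loss = loss_fn(model(x), t)    # each step sees a new variation sample
    loss.backward()
    opt.step()
```

At inference time the layer behaves as a plain linear layer; the claim being illustrated is only that exposing training to many sampled variation instances removes the need for instance-by-instance retraining.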
Keywords
Fault tolerance, In-memory computing, Machine learning, Statistical computing