Challenges in Machine Learning Techniques to Estimate Reliability from Transistors to Circuits

2023 IEEE International Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems (DFT)

Abstract
Transistor and circuit reliability estimation faces various challenges in both traditional and machine learning (ML) based approaches. In this work, we provide an overview of the toughest challenges faced by traditional physics-based reliability estimation, such as the exposure of sensitive transistor data, infeasible execution times, and complex material-defect interactions. We also highlight challenges specific to ML-based approaches, such as aging recovery and history effects and the high training effort. We present multiple solutions to overcome these challenges, including high-performance physics-based aging models, history-aware machine learning, and techniques to reduce training data sets. We show, for the first time, that circuit reliability estimation can bypass the transistor level entirely by using ML-generated degraded standard cell libraries. Our high-performance aging models and circuit simulators provide speedups ranging from 4,000x to 240,000x, while our standard cell ML techniques achieve 99.9% accuracy with inference times below 1 second.
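To illustrate the core idea of bypassing transistor-level aging simulation with an ML-generated degraded cell library, the sketch below trains a regression model that maps per-cell stress conditions directly to delay degradation. This is a minimal illustration, not the authors' implementation: the feature set (supply voltage, temperature, duty cycle, stress time), the gradient-boosting model, and the synthetic labels standing in for physics-based aging data are all assumptions made for this example.

```python
# Minimal sketch (not the paper's implementation): an ML model that predicts
# degraded standard-cell timing directly, so circuit-level reliability
# estimation can skip transistor-level aging simulation. Features, model
# choice, and synthetic data are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-cell stress descriptors: supply voltage (V),
# temperature (C), input duty cycle, and stress time (years).
n = 5000
X = np.column_stack([
    rng.uniform(0.6, 0.9, n),   # VDD
    rng.uniform(25, 125, n),    # temperature
    rng.uniform(0.0, 1.0, n),   # duty cycle
    rng.uniform(0.1, 10.0, n),  # stress time
])

# Placeholder target: relative delay degradation of a cell arc (%),
# standing in for labels produced by a physics-based aging model.
y = (2.0 * X[:, 0] + 0.01 * X[:, 1] + 1.5 * X[:, 2]) * np.log1p(X[:, 3])
y += rng.normal(0, 0.05, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)

# A trained model can annotate a "degraded" cell library in well under a
# second, which a standard timing flow can then consume directly.
print("R^2 on held-out cells:", model.score(X_test, y_test))
```

In such a flow, the model's predictions would populate aged timing values for each cell in the library, and conventional static timing analysis would then estimate circuit-level reliability without any transistor-level runs.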
Keywords
Machine Learning, Reliability, Transistor Reliability, Circuit Reliability, Aging, Bias Temperature Instability, Hot-Carrier Degradation, Self-Heating