
Neural Network Compiler for Parallel High-Throughput Simulation of Digital Circuits.

IPDPS (2023)

Abstract
Register Transfer Level (RTL) simulation and verification of Digital Circuits are extremely important and costly tasks in the Integrated Circuits industry. While some simulators exploit the parallelism inherent in the structure of Digital Circuits to run on multi-core CPUs, the maximum throughput they achieve quickly reaches a plateau, as described by Amdahl's Law. Recent research from Nvidia has obtained much higher throughput in simulations using GPUs, highlighting the potential of these devices for Digital Circuit simulation. However, it required sophisticated algorithms to support GPU simulation. In addition, the unbalanced structure of real-life Digital Circuits poses difficulties for processing on multi-threaded devices. In this paper, we present a Digital Circuit compiler that utilizes Neural Networks to exploit the various forms of parallelism in RTL simulation, making use of PyTorch, a widely used Neural Network framework that facilitates their simulation on GPUs. By using properties of Boolean Functions, we developed a novel algorithm that converts any Digital Circuit into a Neural Network, together with optimization techniques that push thread computational capability to the limit. The results show three orders of magnitude higher throughput than the Verilator RTL simulator, an improvement of one order of magnitude over the state-of-the-art GPU techniques from Nvidia. We believe that the use of Neural Networks not only provides a significant improvement in simulation and verification tasks in the Integrated Circuits industry, but also opens a line of research for simulators at the logic and physical gate level.
Key words
Integrated Circuits, Design and Verification of Digital Circuits, RTL Synthesis and Simulation, Neural Network Representation, Parallel Computation, GPU
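The abstract describes mapping Digital Circuits to Neural Network operations so that many stimuli can be simulated in parallel on a GPU through PyTorch. The snippet below is a minimal sketch of that general idea, not the paper's actual compilation algorithm: it expresses a few Boolean gates as element-wise tensor arithmetic on {0, 1} tensors and evaluates a batch of input vectors in one pass. The gate helpers and the example circuit are illustrative assumptions.

```python
# Hypothetical sketch: batched simulation of a tiny combinational circuit
# with PyTorch tensor operations. This is not the compiler from the paper.
import torch


def and_gate(a, b):
    # Boolean AND as element-wise multiplication on {0, 1} tensors.
    return a * b


def or_gate(a, b):
    # Boolean OR via the identity a OR b = a + b - a*b on {0, 1} tensors.
    return a + b - a * b


def not_gate(a):
    # Boolean NOT as 1 - a on {0, 1} tensors.
    return 1.0 - a


def simulate(inputs):
    # inputs: tensor of shape (batch, 3) holding stimuli for signals x, y, z.
    # Computes out = (x AND y) OR (NOT z) for every stimulus in parallel.
    x, y, z = inputs[:, 0], inputs[:, 1], inputs[:, 2]
    return or_gate(and_gate(x, y), not_gate(z))


if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # A batch of random Boolean stimuli evaluated in a single tensor pass.
    stimuli = torch.randint(0, 2, (1024, 3), device=device).float()
    outputs = simulate(stimuli)
    print(outputs.shape)  # torch.Size([1024])
```

Because each gate becomes a vectorized tensor operation, the same circuit evaluation is applied to the whole batch of stimuli at once, which is the kind of throughput-oriented parallelism the abstract attributes to the GPU-based approach.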