Scorch: A Library for Sparse Deep Learning
CoRR (2024)
Abstract
The rapid growth in the size of deep learning models strains the capabilities
of traditional dense computation paradigms. Leveraging sparse computation has
become increasingly popular for training and deploying large-scale models, but
existing deep learning frameworks lack extensive support for sparse operations.
To bridge this gap, we introduce Scorch, a library that seamlessly integrates
efficient sparse tensor computation into the PyTorch ecosystem, with an initial
focus on inference workloads on CPUs. Scorch provides a flexible and intuitive
interface for sparse tensors, supporting diverse sparse data structures. Scorch
introduces a compiler stack that automates key optimizations, including
automatic loop ordering, tiling, and format inference. Combined with a runtime
that adapts its execution to both dense and sparse data, Scorch delivers
substantial speedups over hand-written PyTorch Sparse (torch.sparse) operations
without sacrificing usability. More importantly, Scorch enables efficient
computation of complex sparse operations that lack hand-optimized PyTorch
implementations. This flexibility is crucial for exploring novel sparse
architectures. We demonstrate Scorch's ease of use and performance gains on
diverse deep learning models across multiple domains. With only minimal code
changes, Scorch achieves 1.05-5.78x speedups over PyTorch Sparse on end-to-end
tasks. Scorch's seamless integration and performance gains make it a valuable
addition to the PyTorch ecosystem. We believe Scorch will enable wider
exploration of sparsity as a tool for scaling deep learning and inform the
development of other sparse libraries.
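As context for the comparison above, the PyTorch Sparse (torch.sparse) baseline that Scorch is benchmarked against exposes hand-written kernels for a fixed set of operations. The sketch below shows one such operation, sparse-dense matrix multiplication in COO format, using only the public PyTorch API; it is an illustration of the baseline, not of Scorch's own interface, which is not shown in this abstract.

```python
import torch

# A 3x3 sparse matrix in COO format: three nonzeros at
# (0, 2), (1, 0), and (2, 1), built with the public PyTorch API.
indices = torch.tensor([[0, 1, 2],
                        [2, 0, 1]])
values = torch.tensor([3.0, 4.0, 5.0])
A = torch.sparse_coo_tensor(indices, values, (3, 3))

# A dense right-hand side.
x = torch.ones(3, 2)

# Hand-optimized sparse-dense matmul from torch.sparse.
# Scorch targets this class of kernel, plus compound sparse
# operations that have no such hand-written implementation.
y = torch.sparse.mm(A, x)

# The result agrees with the equivalent dense computation.
y_dense = A.to_dense() @ x
```

Operations outside this curated set (for example, fused chains of sparse contractions) must otherwise be expressed as sequences of these kernels with dense intermediates, which is the gap the abstract says Scorch's compiler stack addresses.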