PartIR: Composing SPMD Partitioning Strategies for Machine Learning

Sami Alabed, Daniel Belov, Bart Chrzaszcz, Juliana Franco, Dominik Grewe, Dougal Maclaurin, James Molloy, Tom Natan, Tamara Norman, Xiaoyue Pan, Adam Paszke, Norman A. Rink, Michael Schaarschmidt, Timur Sitdikov, Agnieszka Swietlik, Dimitrios Vytiniotis, Joel Wee

CoRR (2024)

Abstract
Training modern large neural networks (NNs) requires a combination of parallelization strategies encompassing data, model, and optimizer sharding. As strategies grow in complexity, it becomes necessary for partitioning tools to be 1) expressive, allowing the composition of simpler strategies, and 2) predictable, allowing performance to be estimated analytically. We present PartIR, our design for an NN partitioning system. PartIR is focused on an incremental approach to rewriting and is hardware- and runtime-agnostic. We present a simple but powerful API for composing sharding strategies and a simulator to validate them. The process is driven by high-level, programmer-issued partitioning tactics, which can be both manual and automatic. Importantly, the tactics are specified separately from the model code, making them easy to change. We evaluate PartIR on several different models to demonstrate its predictability, expressibility, and ability to reach peak performance.
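
The abstract's central point is that partitioning decisions live outside the model code. As an illustration only, and not PartIR's own API, the sketch below uses plain JAX to express the same idea: the model function carries no sharding annotations, while a two-axis device mesh composes a data-parallel and a model-parallel sharding around it. The 2x4 device layout and the axis names "data" and "model" are assumptions chosen for the example.

# Illustrative sketch only -- not PartIR's API. Plain JAX showing the idea of
# composing a data-parallel and a model-parallel sharding outside the model code.
# Assumes 8 devices arranged as a 2x4 mesh; axis names are arbitrary.
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

def layer(x, w):
    # Model code is written with no partitioning annotations at all.
    return jnp.tanh(x @ w)

mesh = Mesh(np.array(jax.devices()).reshape(2, 4), axis_names=("data", "model"))

# The strategy lives here, separate from the model: the batch is split along
# the 'data' axis and the weights along the 'model' axis; together they compose
# into a 2D (data x model) SPMD partitioning.
x = jax.device_put(jnp.ones((16, 128)), NamedSharding(mesh, P("data", None)))
w = jax.device_put(jnp.ones((128, 256)), NamedSharding(mesh, P(None, "model")))

y = jax.jit(layer)(x, w)   # the SPMD partitioner propagates the input shardings
print(y.sharding)          # the output is sharded across both mesh axes

Changing the mesh shape or the partition specs swaps the strategy without touching layer, which mirrors the separation of partitioning tactics from model code that the abstract describes.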