Scaling distributed machine learning with the parameter server
OSDI, pp. 583-598, 2014.
We propose a parameter server framework for distributed machine learning problems. Both data and workloads are distributed over worker nodes, while the server nodes maintain globally shared parameters, represented as dense or sparse vectors and matrices. The framework manages asynchronous data communication between nodes, and supports flexible consistency models, elastic scalability, and continuous fault tolerance.
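The architecture the abstract describes can be sketched in a few lines: server nodes hold globally shared parameters as key-to-vector maps, and worker nodes pull current values and push gradient updates. This is a minimal single-process sketch of that pattern, not the paper's actual API; the names `ParameterServer`, `push`, `pull`, and `worker_step` are illustrative assumptions.

```python
# Minimal single-process sketch of the parameter-server pattern: server
# nodes keep shared parameters as key -> vector maps; workers pull current
# values and push gradients. Names here are illustrative, not the paper's API.

class ParameterServer:
    def __init__(self, learning_rate=0.1):
        self.params = {}           # key -> list of floats (one dense segment)
        self.lr = learning_rate

    def pull(self, key):
        """Workers fetch the current value of a parameter segment."""
        return list(self.params.get(key, []))

    def push(self, key, grad):
        """Workers send gradients; the server applies an SGD-style update."""
        vec = self.params.setdefault(key, [0.0] * len(grad))
        for i, g in enumerate(grad):
            vec[i] -= self.lr * g

def worker_step(server, key, data):
    # Hypothetical worker loop body: pull weights, compute a toy
    # "gradient" (here just the raw data), and push it back.
    weights = server.pull(key)
    grad = [x for x in data]       # stand-in for a real gradient computation
    server.push(key, grad)
```

In the real system, `push` and `pull` are asynchronous range-based messages over sharded servers; here both are plain method calls so the update flow is visible.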