
Balancing Communication and Computation in Gradient Tracking Algorithms for Decentralized Optimization

Albert S. Berahas, Raghu Bollapragada, Shagun Gupta

arXiv (2023)

Abstract
Gradient tracking methods have emerged as one of the most popular approaches for solving decentralized optimization problems over networks. In this setting, each node in the network has a portion of the global objective function, and the goal is to collectively optimize this function. At every iteration, gradient tracking methods perform two operations (steps): $(1)$ compute local gradients, and $(2)$ communicate information with local neighbors in the network. The complexity of these two steps varies across different applications. In this paper, we present a framework that unifies gradient tracking methods and is endowed with flexibility with respect to the number of communication and computation steps. We establish unified theoretical convergence results for the algorithmic framework with any composition of communication and computation steps, and quantify the improvements achieved as a result of this flexibility. The framework recovers the results of popular gradient tracking methods as special cases, and allows for a direct comparison of these methods. Finally, we illustrate the performance of the proposed methods on quadratic functions and binary classification problems.
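The following is a minimal sketch, not the paper's algorithm or code, of one standard gradient tracking iteration on a Metropolis-weighted ring network. The parameter `n_comm` (an illustrative name, not the paper's notation) applies the mixing matrix multiple times per iteration, loosely mirroring the flexibility between communication and computation steps that the abstract describes.

```python
import numpy as np

def make_ring_mixing_matrix(n):
    """Doubly stochastic mixing matrix for a ring of n >= 3 nodes."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, (i - 1) % n] = 1.0 / 3.0   # left neighbor
        W[i, (i + 1) % n] = 1.0 / 3.0   # right neighbor
        W[i, i] = 1.0 / 3.0             # self weight
    return W

def gradient_tracking(grads, x0, W, alpha=0.05, n_comm=1, iters=200):
    """grads: list of per-node gradient callables; x0: (n, d) initial iterates.
    n_comm communication rounds per iteration are folded into W**n_comm."""
    n, d = x0.shape
    Wc = np.linalg.matrix_power(W, n_comm)
    x = x0.copy()
    g = np.stack([grads[i](x[i]) for i in range(n)])  # local gradients
    y = g.copy()                                       # gradient trackers
    for _ in range(iters):
        x = Wc @ x - alpha * y                         # mix, then local step
        g_new = np.stack([grads[i](x[i]) for i in range(n)])
        y = Wc @ y + g_new - g                         # track average gradient
        g = g_new
    return x

# Toy quadratic example: node i holds f_i(x) = 0.5 * ||A_i x - b_i||^2
rng = np.random.default_rng(0)
n, d = 8, 5
A = rng.normal(size=(n, 4, d))
b = rng.normal(size=(n, 4))
grads = [lambda x, Ai=A[i], bi=b[i]: Ai.T @ (Ai @ x - bi) for i in range(n)]
x_final = gradient_tracking(grads, np.zeros((n, d)),
                            make_ring_mixing_matrix(n), n_comm=2)
print(np.linalg.norm(x_final - x_final.mean(axis=0)))  # consensus error
```

Increasing `n_comm` trades extra communication per iteration for faster consensus; whether that trade pays off depends on the relative cost of communication and gradient computation in a given application, which is the balance the paper studies.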
Keywords
gradient tracking algorithms