Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

Date
2019-01-01
Authors
Zhang, Xin
Zhu, Zhengyuan
Liu, Jia
Bentley, Elizabeth
Department
Computer Science; Statistics
Abstract

Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. One of the most well-known approaches to solving network consensus optimization problems is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD performs poorly on high-dimensional consensus problems due to the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are three-fold: i) We develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under any unbiased compression operator; ii) We rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) We reveal an interesting phase transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state-of-the-art of network consensus optimization theory.
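The abstract's convergence guarantee hinges on the compression operator being unbiased, i.e., E[Q(x)] = x. As an illustration of what such an operator looks like (this is a standard QSGD-style stochastic quantizer, not necessarily the specific operator used in the paper), the following sketch randomly rounds each normalized coordinate up or down so that the rounding error vanishes in expectation; the function name and the `levels` parameter are illustrative choices:

```python
import numpy as np

def quantize_unbiased(x, levels=4, rng=None):
    """Stochastic quantizer Q with E[Q(x)] = x (QSGD-style sketch).

    Each |x_i| / ||x|| is randomly rounded to one of `levels` uniform
    grid points in [0, 1]; rounding up with probability equal to the
    fractional part makes the quantizer unbiased in expectation.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return np.zeros_like(x)
    scaled = np.abs(x) / norm * levels        # each entry in [0, levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                  # fractional part = P(round up)
    rounded = lower + (rng.random(x.shape) < prob_up)
    return np.sign(x) * norm * rounded / levels
```

Only the norm (one float), the signs, and the small integer levels per coordinate need to be transmitted, which is the source of the communication savings; averaging many independent quantizations of the same vector recovers it, reflecting the unbiasedness that the paper's analysis requires.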

Comments

This proceeding was published as Zhang, Xin, Jia Liu, Zhengyuan Zhu, and Elizabeth S. Bentley. "Compressed distributed gradient descent: Communication-efficient consensus over networks." In IEEE INFOCOM 2019-IEEE Conference on Computer Communications. (2019): 2431-2439. DOI: 10.1109/INFOCOM.2019.8737489.
