Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

dc.contributor.author Zhang, Xin
dc.contributor.author Zhu, Zhengyuan
dc.contributor.author Liu, Jia
dc.contributor.author Bentley, Elizabeth
dc.contributor.department Computer Science
dc.contributor.department Statistics
dc.date 2020-06-18T03:05:08.000
dc.date.accessioned 2020-07-02T06:55:44Z
dc.date.available 2020-07-02T06:55:44Z
dc.date.embargo 2020-06-17
dc.date.issued 2019-01-01
dc.description.abstract <p>Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. To solve network consensus optimization problems, one of the most well-known approaches is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD's performance is unsatisfactory for solving high-dimensional network consensus problems due to the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are three-fold: i) We develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under any unbiased compression operator; ii) We rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) We reveal an interesting phase transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state-of-the-art of network consensus optimization theory.</p>
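The abstract states that ADC-DGD converges under any unbiased compression operator, i.e. a (possibly randomized) map Q satisfying E[Q(x)] = x. A minimal sketch of one such operator is per-coordinate stochastic rounding onto a uniform grid; this is an illustration of the unbiasedness property only, not the specific operator or algorithm from the paper:

```python
import numpy as np

def unbiased_quantize(x, levels=4, rng=None):
    """Stochastically round each coordinate of x onto a uniform grid with
    `levels` steps spanning [min(x), max(x)], so that E[Q(x)] = x.

    Illustrative unbiased compression operator (hypothetical example);
    not the operator used in the ADC-DGD paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = x.min(), x.max()
    if hi == lo:                      # constant vector: nothing to quantize
        return x.copy()
    step = (hi - lo) / levels
    scaled = (x - lo) / step          # position of each coordinate on the grid
    floor = np.floor(scaled)
    prob = scaled - floor             # P(round up); makes E[rounded] = scaled
    rounded = floor + (rng.random(x.shape) < prob)
    return lo + rounded * step
```

Each coordinate is rounded up with probability equal to its fractional grid position, so the expected output equals the input while only a grid index (log2(levels+1) bits per coordinate, plus the range) needs to be communicated.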
dc.description.comments <p>This proceeding was published as Zhang, Xin, Jia Liu, Zhengyuan Zhu, and Elizabeth S. Bentley. "Compressed distributed gradient descent: Communication-efficient consensus over networks." In <em>IEEE INFOCOM 2019-IEEE Conference on Computer Communications</em>. (2019): 2431-2439. DOI: <a href="https://doi.org/10.1109/INFOCOM.2019.8737489" target="_blank">10.1109/INFOCOM.2019.8737489</a>. </p>
dc.format.mimetype application/pdf
dc.identifier archive/lib.dr.iastate.edu/stat_las_conf/13/
dc.identifier.articleid 1012
dc.identifier.contextkey 18147708
dc.identifier.s3bucket isulib-bepress-aws-west
dc.identifier.submissionpath stat_las_conf/13
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/90246
dc.language.iso en
dc.source.bitstream archive/lib.dr.iastate.edu/stat_las_conf/13/2019_ZhuZhengyuan_CompressedDistributed.pdf|||Fri Jan 14 19:41:28 UTC 2022
dc.source.uri 10.1109/INFOCOM.2019.8737489
dc.subject.disciplines Applied Statistics
dc.subject.disciplines Databases and Information Systems
dc.subject.disciplines Programming Languages and Compilers
dc.subject.keywords Convergence
dc.subject.keywords Optimization
dc.subject.keywords Linear programming
dc.subject.keywords Information exchange
dc.subject.keywords Eigenvalues and eigenfunctions
dc.subject.keywords Robot sensing systems
dc.subject.keywords Wireless sensor networks
dc.title Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks
dc.type article
dc.type.genre conference
dspace.entity.type Publication
relation.isAuthorOfPublication 51db2a08-8f9d-4f97-bdbc-6790b3d5a608
relation.isOrgUnitOfPublication f7be4eb9-d1d0-4081-859b-b15cee251456
relation.isOrgUnitOfPublication 264904d9-9e66-4169-8e11-034e537ddbca
File (Original bundle)
Name: 2019_ZhuZhengyuan_CompressedDistributed.pdf
Size: 262.73 KB
Format: Adobe Portable Document Format