Straggler-Resistant Distributed Matrix Computation via Coding Theory: Removing a Bottleneck in Large-Scale Data Processing

Date
2020-05-01
Authors
Das, Anindya Bijoy
Tang, Li
Abstract

The current big data era routinely requires the processing of large-scale data on massive distributed computing clusters. Such clusters often suffer from "stragglers": worker nodes that are slow or have failed. In the absence of a careful assignment of tasks to the worker nodes, the overall completion time of a computational job on these clusters is typically dominated by the stragglers. In recent years, approaches based on coding theory (referred to as "coded computation") have been used effectively for straggler mitigation. Coded computation offers significant benefits for specific classes of problems such as distributed matrix computations, which play a crucial role in several parts of the machine learning pipeline. The essential idea is to create redundant tasks so that the desired result can be recovered as long as a certain number of worker nodes complete their tasks. In this survey article, we provide an overview of recent developments in the field of coding for straggler-resilient distributed matrix computations.
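The redundancy idea in the abstract can be illustrated with a minimal sketch (not a scheme from the article itself): split a matrix into two row blocks, hand three workers the coded blocks A1, A2, and A1 + A2, and recover the full product Ax from any two of the three results, so one straggler cannot delay the job. The surveyed schemes generalize this via polynomial and related codes.

```python
import numpy as np

# Toy (3, 2) MDS-style coded matrix-vector multiplication.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

A1, A2 = A[:2], A[2:]                         # uncoded row blocks
tasks = {"w0": A1, "w1": A2, "w2": A1 + A2}   # coded task per worker

# Suppose worker w1 straggles; only w0 and w2 respond.
r0 = tasks["w0"] @ x                          # = A1 x
r2 = tasks["w2"] @ x                          # = (A1 + A2) x

# Decode: A2 x = (A1 + A2) x - A1 x, then stack to obtain A x.
recovered = np.concatenate([r0, r2 - r0])
assert np.allclose(recovered, A @ x)
```

Any other pair of workers would suffice equally: from w0 and w1 the result is direct, and from w1 and w2 one subtracts to recover A1 x.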

Type
article
Comments

This is a manuscript of an article published as Ramamoorthy, Aditya, Anindya Bijoy Das, and Li Tang. "Straggler-Resistant Distributed Matrix Computation via Coding Theory: Removing a Bottleneck in Large-Scale Data Processing." IEEE Signal Processing Magazine 37, no. 3 (2020): 136-145. DOI: 10.1109/MSP.2020.2974149. Posted with permission.

Rights Statement
Copyright
2020