Publications
Network Coding for Distributed Storage Systems
TLDR: We introduce a general technique to analyze storage architectures that combine any form of coding and replication, and present two new schemes for maintaining redundancy using erasure codes.
Citations: 1,572 · Influence: 307
Low-complexity image denoising based on statistical modeling of wavelet coefficients
TLDR: We introduce a simple spatially adaptive statistical model for wavelet image coefficients and apply it to image denoising.
Citations: 826 · Influence: 67
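The wavelet-domain denoising idea can be illustrated with a generic soft-thresholding sketch; this is a simplified stand-in for the paper's spatially adaptive estimator, and the Haar transform and threshold value below are arbitrary choices:

```python
# Illustrative wavelet denoising sketch (generic soft thresholding, not
# the paper's estimator): small detail coefficients, which are mostly
# noise, are shrunk toward zero before inverting the transform.

def haar_step(x):
    # One level of a Haar transform: pairwise averages and differences.
    avg = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    det = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return avg, det

def inv_haar_step(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def soft_threshold(coeffs, t):
    # Shrink each coefficient toward zero by t; zero out small ones.
    return [max(abs(v) - t, 0.0) * (1 if v >= 0 else -1) for v in coeffs]

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]  # piecewise-flat + noise
avg, det = haar_step(noisy)
denoised = inv_haar_step(avg, soft_threshold(det, 0.08))
```

The small detail coefficients are all zeroed, so the output is piecewise-constant within each pair, smoothing the noise while preserving the large step.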
Rate-distortion methods for image and video compression
TLDR: In this article we provide an overview of rate-distortion (R-D) based optimization techniques and their practical application to image and video coding.
Citations: 855 · Influence: 52
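The core of such R-D optimization is Lagrangian mode selection: for each coding unit, pick the operating point minimizing D + λR. A minimal sketch with hypothetical per-block (rate, distortion) options:

```python
# Lagrangian rate-distortion optimization sketch: for each coding unit,
# choose the option minimizing J = D + lam * R. Sweeping lam traces the
# lower convex hull of the unit's R-D points.

def best_mode(points, lam):
    # points: list of (rate_bits, distortion) options for one block.
    return min(points, key=lambda rd: rd[1] + lam * rd[0])

block_options = [(8, 1.0), (4, 4.0), (2, 9.0)]  # hypothetical quantizers
low_rate = best_mode(block_options, lam=5.0)     # large lam favors rate
high_quality = best_mode(block_options, lam=0.1) # small lam favors quality
```

A large λ penalizes rate and selects the coarse 2-bit quantizer; a small λ selects the fine 8-bit one.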
Speeding Up Distributed Machine Learning Using Codes
TLDR: Codes are widely used in many engineering applications to offer robustness; here we show how erasure codes can speed up distributed machine learning by mitigating stragglers in computation and bottlenecks in data shuffling.
Citations: 292 · Influence: 52
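The coded-computation idea can be sketched with a toy parity scheme; this is a hypothetical (3, 2) layout in pure Python, not the paper's general construction:

```python
# Straggler-tolerant coded computation sketch: split A into two blocks
# and add a parity block A1 + A2, so any 2 of the 3 worker products
# reconstruct A @ x.

def mat_vec(m, x):
    return [sum(a * b for a, b in zip(row, x)) for row in m]

A1 = [[1, 2], [3, 4]]
A2 = [[5, 6], [7, 8]]
x = [1, 1]
parity = [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(A1, A2)]

# Each "worker" computes one block product.
y1, y2, yp = mat_vec(A1, x), mat_vec(A2, x), mat_vec(parity, x)

# Worker 2 straggles: recover its share from the parity result.
y2_recovered = [p - a for p, a in zip(yp, y1)]
assert y1 + y2_recovered == [3, 7, 11, 15]  # A @ x for A stacked [A1; A2]
```

The decoder never waits for the slowest worker, trading a small amount of redundant computation for latency.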
Best wavelet packet bases in a rate-distortion sense
TLDR: We present a fast rate-distortion (R-D) optimal scheme for coding adaptive trees whose individual nodes spawn descendants forming a disjoint and complete basis cover for the space spanned by their parent nodes.
Citations: 838 · Influence: 48
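The tree-pruning flavor of such best-basis search can be sketched as a bottom-up comparison of each parent's cost against its children's total cost; the Lagrangian costs below are hypothetical:

```python
# Best-basis pruning sketch: at each node, keep the parent's own
# representation or split into its children, whichever has the lower
# total Lagrangian cost J = D + lam * R (costs here are hypothetical).

def best_basis(node):
    # node: (own_cost, [child, ...]); an empty child list marks a leaf.
    cost, children = node
    if not children:
        return cost, "leaf"
    child_cost = sum(best_basis(c)[0] for c in children)
    if child_cost < cost:
        return child_cost, "split"
    return cost, "merge"

tree = (10.0, [(3.0, []), (4.0, [])])  # children cheaper: split
assert best_basis(tree) == (7.0, "split")
```

Because the children span exactly the parent's subspace, this local compare-and-prune yields a globally optimal basis in one bottom-up pass.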
Distributed source coding using syndromes (DISCUS): design and construction
TLDR: We address the problem of distributed source coding, i.e., compression of correlated sources that are not co-located and/or cannot communicate with each other to minimize their joint description cost.
Citations: 509 · Influence: 45
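The syndrome trick can be illustrated with a toy binary example, not the paper's trellis-based construction:

```python
# Toy DISCUS-style sketch: X is 3 bits and the decoder's side information
# Y differs from X in at most one position. The encoder sends only the
# 2-bit syndrome of X with respect to the repetition code {000, 111};
# the decoder picks the coset member closest to Y.

all_words = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

def syndrome(x):
    return (x[0] ^ x[1], x[0] ^ x[2])

def decode(s, y):
    # The two words sharing syndrome s differ in all 3 bits, so Y
    # (within distance 1 of X) identifies X uniquely.
    coset = [w for w in all_words if syndrome(w) == s]
    return min(coset, key=lambda w: sum(a ^ b for a, b in zip(w, y)))

x = (1, 0, 1)
y = (1, 1, 1)  # correlated side information, known only to the decoder
assert decode(syndrome(x), y) == x  # 2 bits sent instead of 3
```

The encoder never sees Y; the correlation is exploited entirely at the decoder, which is the essence of distributed source coding.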
Fractional repetition codes for repair in distributed storage systems
TLDR: We introduce a new class of exact Minimum-Bandwidth Regenerating (MBR) codes for distributed storage systems, characterized by a low-complexity uncoded repair process that can tolerate multiple node failures.
Citations: 177 · Influence: 44
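The uncoded-repair idea can be sketched with a hypothetical placement of coded packets:

```python
# Fractional-repetition sketch: 6 coded packets (assumed to come from an
# outer MDS code) are each stored on exactly 2 of 4 nodes, so a failed
# node is repaired by copying its packets verbatim from survivors;
# no decoding is needed during repair. The placement is hypothetical.

placement = {
    0: {1, 2, 3},
    1: {1, 4, 5},
    2: {2, 4, 6},
    3: {3, 5, 6},
}

def repair(failed):
    # Map each lost packet to a surviving node that holds a copy.
    survivors = {n: pkts for n, pkts in placement.items() if n != failed}
    return {p: next(n for n, pkts in survivors.items() if p in pkts)
            for p in placement[failed]}

assert repair(0) == {1: 1, 2: 2, 3: 3}  # uncoded, table-based repair
```

Repair is a table lookup plus raw copies, which is what keeps the repair complexity low.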
A Survey on Network Codes for Distributed Storage
TLDR: This paper surveys the research results on network codes for distributed storage.
Citations: 623 · Influence: 41
Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates
TLDR: We develop distributed learning algorithms that are provably robust against Byzantine failures, with a focus on achieving optimal statistical performance.
Citations: 187 · Influence: 38
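One of the robust aggregation rules analyzed for this setting, coordinate-wise median, can be sketched as follows; the gradient values are hypothetical:

```python
# Coordinate-wise median aggregation sketch: a bounded fraction of
# adversarial (Byzantine) worker gradients cannot pull the aggregate
# arbitrarily far from the honest gradients.

def coordinate_median(grads):
    out = []
    for i in range(len(grads[0])):
        vals = sorted(g[i] for g in grads)
        m = len(vals)
        out.append(vals[m // 2] if m % 2
                   else (vals[m // 2 - 1] + vals[m // 2]) / 2)
    return out

honest = [[1.0, 2.0], [1.1, 1.9], [0.9, 2.1]]
byzantine = [[1e6, -1e6]]  # one adversarial worker
agg = coordinate_median(honest + byzantine)
```

Even with the adversarial gradient included, the aggregate stays near the honest cluster (about [1.05, 1.95]) rather than being dragged toward 1e6.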
Multiple description source coding using forward error correction codes
TLDR: We present an efficient multiple description (MD) source coding scheme to achieve robust communication over unreliable channels.
Citations: 324 · Influence: 37