# DeEPCA: Decentralized Exact PCA with Linear Convergence Rate

```bibtex
@article{Ye2021DeEPCADE,
  title   = {DeEPCA: Decentralized Exact PCA with Linear Convergence Rate},
  author  = {Haishan Ye and Tong Zhang},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2102.03990}
}
```

Due to the rapid growth of smart agents such as weakly connected computational nodes and sensors, developing decentralized algorithms that can perform computations on local agents has become a major research direction. This paper considers the problem of decentralized principal component analysis (PCA), a statistical method widely used for data analysis. We introduce a technique called subspace tracking to reduce the communication cost, and apply it to power iterations. This leads to a…
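As background for the setting the abstract describes, the following is a minimal sketch (not the paper's DeEPCA algorithm itself) of a decentralized power iteration: each agent holds a local covariance matrix, performs a local power step, mixes iterates with its neighbors through a gossip matrix, and re-orthonormalizes. The planted covariance model, the four-node ring topology, and the mixing weights `W` are all assumptions for illustration; DeEPCA's subspace-tracking step, which reduces the communication cost of this consensus stage, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 4, 8, 2  # agents, ambient dimension, target subspace dimension

# Planted model (assumed for illustration): a shared covariance with a clear
# top-k eigengap, plus small agent-specific symmetric perturbations.
U = np.linalg.qr(rng.standard_normal((d, d)))[0]
B = U @ np.diag([10.0, 8.0] + [1.0] * (d - 2)) @ U.T
A = []
for _ in range(n):
    E = 0.1 * rng.standard_normal((d, d))
    A.append(B + (E + E.T) / 2)

# Doubly stochastic mixing matrix for a 4-node ring (hypothetical topology).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

# All agents start from the same random orthonormal k-frame.
X0 = np.linalg.qr(rng.standard_normal((d, k)))[0]
X = [X0.copy() for _ in range(n)]

for _ in range(100):
    # Local power step: multiply by the local covariance.
    Y = [A[i] @ X[i] for i in range(n)]
    # One round of gossip averaging with immediate neighbors.
    Y = [sum(W[i, j] * Y[j] for j in range(n)) for i in range(n)]
    # Local QR keeps each iterate orthonormal (on the Stiefel manifold).
    X = [np.linalg.qr(Y[i])[0] for i in range(n)]

# Compare agent 0's subspace against the top-k eigenvectors of the
# global (averaged) covariance via the projector distance.
A_bar = sum(A) / n
V = np.linalg.eigh(A_bar)[1][:, -k:]
err = np.linalg.norm(X[0] @ X[0].T - V @ V.T)
print(round(err, 3))  # small, but not exactly zero with one gossip round per step
```

With only a single averaging round per power step, the recovered subspace carries a residual consensus error; removing that bias at linear rate is precisely the gap DeEPCA targets.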

## 5 Citations

A Communication-Efficient and Privacy-Aware Distributed Algorithm for Sparse PCA

- Mathematics
- 2021

As a prominent variant of principal component analysis (PCA), sparse PCA attempts to find sparse loading vectors when conducting dimension reduction. This paper aims to calculate sparse PCA through…

FAST-PCA: A Fast and Exact Algorithm for Distributed Principal Component Analysis

- Computer Science, Engineering · ArXiv
- 2021

A distributed PCA algorithm called FAST-PCA (Fast and exAct diSTributed PCA) is proposed; it is communication-efficient and provably converges linearly and exactly to the principal components, yielding dimension reduction as well as uncorrelated features.

Decentralized Riemannian Gradient Descent on the Stiefel Manifold

- Computer Science, Mathematics · ICML
- 2021

DRGTA is the first decentralized algorithm with exact convergence for distributed optimization on the Stiefel manifold; it uses multi-step consensus to keep the iterates in the local consensus region.

Decentralized Optimization Over the Stiefel Manifold by an Approximate Augmented Lagrangian Function

- Computer Science, Mathematics · ArXiv
- 2021

This paper proposes a decentralized algorithm for optimization over the Stiefel manifold, called DESTINY, which invokes only a single round of communication per iteration and combines gradient tracking techniques with a novel approximate augmented Lagrangian function.

Distributed Principal Subspace Analysis for Partitioned Big Data: Algorithms, Analysis, and Implementation

- Computer Science, Engineering · IEEE Transactions on Signal and Information Processing over Networks
- 2021

This paper revisits the problem of distributed PSA/PCA under the general framework of an arbitrarily connected network of machines that lacks a central server, studying the interplay between network topology and communication cost as well as the effects of straggler machines on the proposed algorithms.

## References

Showing 1–10 of 33 references

Can Decentralized Algorithms Outperform Centralized Algorithms? A Case Study for Decentralized Parallel Stochastic Gradient Descent

- Computer Science, Mathematics · NIPS
- 2017

This paper studies a D-PSGD algorithm and provides the first theoretical analysis that indicates a regime in which decentralized algorithms might outperform centralized algorithms for distributed stochastic gradient descent.

Multi-consensus Decentralized Accelerated Gradient Descent

- Computer Science, Mathematics · ArXiv
- 2020

A novel algorithm is proposed that can achieve near optimal communication complexity, matching the known lower bound up to a logarithmic factor of the condition number of the problem.

Harnessing smoothness to accelerate distributed optimization

- Mathematics, Computer Science · 2016 IEEE 55th Conference on Decision and Control (CDC)
- 2016

This paper proposes a distributed algorithm that, despite using the same amount of communication per iteration as DGD, effectively harnesses the function's smoothness and converges to the optimum at a rate of O(1/t) when the objective function is strongly convex and smooth.

Accelerated linear iterations for distributed averaging

- Mathematics, Computer Science · Annu. Rev. Control.
- 2011

It is shown that the augmented linear iteration can solve the distributed averaging problem faster than the original linear iteration, but the adjustable parameter must be chosen carefully.

The decentralized estimation of the sample covariance

- Computer Science · 2008 42nd Asilomar Conference on Signals, Systems and Computers
- 2008

This work shows that a completely distributed scheme based on near-neighbor communications is feasible, and applies the proposed method to the estimation of the direction of arrival of a signal source.

Distributed adaptive estimation of covariance matrix eigenvectors in wireless sensor networks with application to distributed PCA

- Mathematics, Computer Science · Signal Process.
- 2014

A distributed adaptive algorithm is presented to estimate the eigenvectors corresponding to the Q largest or smallest eigenvalues of the network-wide sensor signal covariance matrix in a wireless sensor network, with convergence proofs as well as numerical simulations demonstrating the convergence and optimality of the algorithm.

Performance Analysis of the Decentralized Eigendecomposition and ESPRIT Algorithm

- Mathematics, Computer Science · IEEE Transactions on Signal Processing
- 2016

It is shown that the decentralized power method is not an asymptotically consistent estimator of the eigenvectors of the true measurement covariance matrix unless the averaging consensus protocol is carried out over an infinitely large number of iterations.

Fast linear iterations for distributed averaging

- Computer Science, Mathematics · Syst. Control. Lett.
- 2004

This work considers the problem of finding a linear iteration that yields distributed averaging consensus over a network, i.e., that asymptotically computes the average of some initial values given at the nodes, and gives several extensions and variations on the basic problem.
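The linear iteration this reference studies can be sketched in a few lines: repeatedly applying a doubly stochastic weight matrix drives every node's value to the network average. The four-node ring and the particular weights below are assumptions for illustration; the cited work is about optimizing such weights for the fastest convergence.

```python
import numpy as np

# Linear averaging iteration x_{t+1} = W x_t on a hypothetical 4-node ring.
# W is symmetric and doubly stochastic, so the average is preserved each step
# and (since the second-largest eigenvalue magnitude is < 1) the iterates
# converge to the all-average vector.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.array([1.0, 5.0, 3.0, 7.0])  # initial node values; their average is 4.0
for _ in range(100):
    x = W @ x

print(x)  # every entry approaches 4.0
```

The convergence rate is governed by the second-largest eigenvalue magnitude of `W`, which is exactly the quantity the cited optimization targets.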

Fast and privacy preserving distributed low-rank regression

- Computer Science · 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2017

This paper proposes a fast and privacy preserving distributed algorithm for handling low-rank regression problems with nuclear norm constraint, called the fast DeFW algorithm, which incorporates a carefully designed decentralized power method step to reduce the complexity by distributed computation over network.

Decentralized Riemannian Gradient Descent on the Stiefel Manifold

- Computer Science, Mathematics · ICML
- 2021

DRGTA is the first decentralized algorithm with exact convergence for distributed optimization on the Stiefel manifold; it uses multi-step consensus to keep the iterates in the local consensus region.