A communication-efficient and privacy-aware distributed algorithm for sparse PCA
@article{Wang2021ACA,
  title   = {A communication-efficient and privacy-aware distributed algorithm for sparse PCA},
  author  = {Lei Wang and Xin Liu and Yin Zhang},
  journal = {Computational Optimization and Applications},
  year    = {2021},
  pages   = {1-40}
}
Sparse principal component analysis (PCA) improves interpretability of the classic PCA by introducing sparsity into the dimension-reduction process. Optimization models for sparse PCA, however, are generally non-convex, non-smooth and more difficult to solve, especially on large-scale datasets requiring distributed computation over a wide network. In this paper, we develop a distributed and centralized algorithm called DSSAL1 for sparse PCA that aims to achieve low communication overheads by…
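For orientation, a representative ℓ1-regularized sparse PCA model of the kind discussed here (written as an illustration of the problem class, not quoted from the paper) is

$$\min_{X \in \mathbb{R}^{n \times k}} \; -\tfrac{1}{2}\,\mathrm{tr}\!\left(X^{\top} A^{\top} A X\right) + \mu\,\|X\|_{1} \quad \text{s.t.} \quad X^{\top} X = I_{k},$$

where the rows of $A \in \mathbb{R}^{m \times n}$ are samples (partitioned across agents in the distributed setting), $\|X\|_{1}$ is the entrywise $\ell_1$ norm that induces sparse loadings, and $\mu > 0$ trades explained variance against sparsity. The non-smooth penalty on top of the non-convex orthogonality constraint is what makes both centralized and distributed solvers non-trivial.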
2 Citations
Smoothing Gradient Tracking for Decentralized Optimization over the Stiefel Manifold with Non-smooth Regularizers
- Computer Science, Mathematics
- 2023
This paper proposes the first decentralized algorithm for non-smooth optimization over the Stiefel manifold and establishes a convergence guarantee with iteration complexity $\mathcal{O}(\epsilon^{-4})$.
Decentralized Optimization Over the Stiefel Manifold by an Approximate Augmented Lagrangian Function
- Computer Science, IEEE Transactions on Signal Processing
- 2022
This paper proposes a decentralized algorithm, called DESTINY, which invokes only a single round of communication per iteration and combines gradient tracking techniques with a novel approximate augmented Lagrangian function.
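As background, a minimal sketch of Euclidean gradient tracking for decentralized consensus optimization (names and parameters are illustrative; the Stiefel-manifold and approximate-augmented-Lagrangian ingredients specific to DESTINY are not reproduced here):

```python
import numpy as np

def gradient_tracking(grads, x0, W, alpha=0.05, iters=500):
    """Euclidean gradient tracking: each agent mixes its iterate with its
    neighbors (one communication round per iteration, encoded by the doubly
    stochastic matrix W) while a second variable y_i tracks the
    network-average gradient."""
    n = len(grads)
    x = np.tile(x0, (n, 1)).astype(float)             # one local copy per agent
    g = np.array([grads[i](x[i]) for i in range(n)])  # local gradients
    y = g.copy()                                      # gradient trackers
    for _ in range(iters):
        x_next = W @ x - alpha * y                    # mix, then descend
        g_next = np.array([grads[i](x_next[i]) for i in range(n)])
        y = W @ y + g_next - g                        # update the tracker
        x, g = x_next, g_next
    return x.mean(axis=0)

# Example: three agents with quadratics f_i(x) = 0.5*||x - c_i||^2;
# the consensus minimizer is the mean of the c_i.
cs = [np.array([1.0, 0.0]), np.array([3.0, 2.0]), np.array([5.0, 4.0])]
grads = [lambda x, c=c: x - c for c in cs]
W = np.array([[0.5, 0.25, 0.25], [0.25, 0.5, 0.25], [0.25, 0.25, 0.5]])
print(gradient_tracking(grads, np.zeros(2), W))       # approx [3., 2.]
```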
68 References
A Distributed and Secure Algorithm for Computing Dominant SVD Based on Projection Splitting
- Computer Science, ArXiv
- 2020
This paper proposes a novel formulation for building consensus by equalizing the subspaces spanned by the splitting variables instead of equalizing the variables themselves, and presents convergence analysis and extensive experimental results indicating that the proposed algorithm, while safely guarding data privacy, has strong potential to deliver cutting-edge performance.
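The subspace-consensus idea can be illustrated with a small numpy check (illustrative only, not the paper's algorithm): two orthonormal bases that differ by an orthogonal mixing are distinct as variables but span the same subspace, so a consensus measure built from the spanned subspaces already regards them as equal.

```python
import numpy as np

def variable_gap(Xs):
    """Largest pairwise difference of the splitting variables themselves."""
    return max(np.linalg.norm(Xi - Xj) for Xi in Xs for Xj in Xs)

def subspace_gap(Xs):
    """Largest pairwise difference of the projections X X^T, i.e. of the
    subspaces spanned by the variables (invariant to orthogonal mixing)."""
    Ps = [X @ X.T for X in Xs]
    return max(np.linalg.norm(Pi - Pj) for Pi in Ps for Pj in Ps)

rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((50, 3)))   # one orthonormal basis
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))    # orthogonal mixing
Xs = [X, X @ Q]                                     # same subspace, different bases
print(variable_gap(Xs) > 1e-6, subspace_gap(Xs) < 1e-8)  # True True
```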
DeEPCA: Decentralized Exact PCA with Linear Convergence Rate
- Computer Science, J. Mach. Learn. Res.
- 2021
DeEPCA is the first decentralized PCA algorithm whose number of communication rounds per power iteration is independent of the target precision; it has a convergence rate similar to that of centralized PCA while achieving the best communication complexity among existing PCA algorithms.
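As a point of reference, the centralized subspace (power) iteration that DeEPCA-style methods decentralize can be sketched as follows (a generic baseline, not the DeEPCA protocol itself):

```python
import numpy as np

def subspace_power_iteration(C, k, iters=100, seed=0):
    """Orthogonal (power) iteration for the top-k eigenspace of a covariance
    matrix C; decentralized variants replace the global product C @ X with
    neighbor communication at each power step."""
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((C.shape[0], k)))
    for _ in range(iters):
        X, _ = np.linalg.qr(C @ X)   # multiply, then re-orthonormalize
    return X

# In a distributed setting C = (1/m) * sum_i A_i^T A_i, so C @ X is an
# average of local products A_i^T (A_i X), which is what must be communicated.
```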
Privacy Preservation in Distributed Subgradient Optimization Algorithms
- Computer Science, IEEE Transactions on Cybernetics
- 2018
In this paper, privacy-preserving features for distributed subgradient optimization algorithms are considered, and the two introduced mechanisms, projection operation and asynchronous heterogeneous-stepsize optimization, are shown to guarantee that agents' privacy is effectively protected.
A Linearly Convergent Algorithm for Distributed Principal Component Analysis
- Computer Science, Signal Process.
- 2022
ADMM Based Privacy-Preserving Decentralized Optimization
- Computer Science, IEEE Transactions on Information Forensics and Security
- 2019
This work introduces a new ADMM that allows time-varying penalty matrices and rigorously proves that it has a convergence rate of $O(1/t)$.
Nonconvex alternating direction method of multipliers for distributed sparse principal component analysis
- Computer Science, 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
- 2015
The proposed algorithms are able to handle a few sparsity-promoting regularizers as well as different forms of data partition, and they are shown to converge to stationary solutions of various nonconvex SPCA formulations.
Distributed Picard Iteration: Application to Distributed EM and Distributed PCA
- Computer Science, Mathematics, ArXiv
- 2021
Two distributed algorithms are derived, distributed EM and distributed PCA, whose local linear convergence (LLC) guarantees follow from those proved for the distributed Picard iteration (DPI).
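The underlying fixed-point scheme can be sketched in an idealized form (illustrative only: here the network average is computed exactly, whereas the DPI approximates it with consensus rounds):

```python
import numpy as np

def distributed_picard(local_maps, x0, iters=200):
    """Idealized Picard iteration x_{t+1} = G(x_t), where the global map G is
    the average of per-agent maps."""
    x = x0
    for _ in range(iters):
        x = np.mean([g(x) for g in local_maps], axis=0)
    return x

# Example: each agent's map is a contraction toward a local target c_i;
# the global fixed point is the average target.
maps = [lambda x, c=c: 0.5 * x + 0.5 * c for c in (1.0, 3.0, 5.0)]
print(distributed_picard(maps, 0.0))  # ~3.0
```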
Sparse principal component analysis via regularized low rank matrix approximation
- Computer Science
- 2008
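A minimal alternating-minimization sketch of one regularized rank-one approximation of this flavor (the specific penalty and update rule here are assumptions chosen for illustration, not a transcription of the paper's model): the loading vector is soft-thresholded, hence sparse, and the score vector is renormalized.

```python
import numpy as np

def sparse_rank_one(A, lam, iters=100, seed=0):
    """Minimize ||A - u v^T||_F^2 + 2*lam*||v||_1 with ||u||_2 = 1 by
    alternating closed-form updates; v acts as a sparse principal loading."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(A.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        z = A.T @ u                                         # unpenalized loading
        v = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)   # soft-threshold
        Av = A @ v
        if np.linalg.norm(Av) > 0:
            u = Av / np.linalg.norm(Av)                     # unit-norm score
    return u, v
```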
On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Computer Science, MathematicsJournal of the American Statistical Association
- 2009
A simple algorithm for selecting a subset of coordinates with largest sample variances is provided, and it is shown that if PCA is done on the selected subset, then consistency is recovered, even if p(n) ≫ n.
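A compact sketch of the select-then-PCA idea described above (the selection rule and names here are illustrative; the paper's precise thresholding level is not reproduced):

```python
import numpy as np

def subset_pca(X, n_keep, k=1):
    """Select the coordinates with the largest sample variances, run PCA on
    that subset only, and embed the loadings back into R^p with zeros
    elsewhere, so the estimate is sparse by construction."""
    p = X.shape[1]
    variances = X.var(axis=0)
    keep = np.argsort(variances)[::-1][:n_keep]      # top-variance coordinates
    Xs = X[:, keep] - X[:, keep].mean(axis=0)        # center the subset
    cov = Xs.T @ Xs / (Xs.shape[0] - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    loadings = np.zeros((p, k))
    loadings[keep, :] = eigvecs[:, ::-1][:, :k]      # top-k eigenvectors
    return loadings
```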
A feasible method for optimization with orthogonality constraints
- Computer Science, Math. Program.
- 2013
The Cayley transform, a Crank-Nicolson-like update scheme, is applied to preserve the constraints, and curvilinear search algorithms with lower flop counts are developed on this basis, achieving high efficiency on polynomial optimization, nearest correlation matrix estimation, and extreme eigenvalue problems.
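The constraint-preserving update can be sketched directly: with a skew-symmetric generator built from the gradient, the Cayley transform keeps $X^\top X = I$ exactly up to rounding (a minimal sketch; step-size selection, the low-rank solve, and the curvilinear search are omitted).

```python
import numpy as np

def cayley_step(X, G, tau):
    """One Cayley-transform update for min f(X) s.t. X^T X = I, where G is
    the gradient of f at X; the skew-symmetric W makes the update feasible."""
    n = X.shape[0]
    W = G @ X.T - X @ G.T                      # skew-symmetric generator
    I = np.eye(n)
    return np.linalg.solve(I + 0.5 * tau * W, (I - 0.5 * tau * W) @ X)

# Feasibility check: X stays (numerically) orthonormal after the step.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((6, 3)))
X_new = cayley_step(X, rng.standard_normal((6, 3)), tau=0.1)
print(np.allclose(X_new.T @ X_new, np.eye(3)))  # True
```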