Pursuit of Low-Rank Models of Time-Varying Matrices Robust to Sparse and Measurement Noise

@article{Akhriev2020PursuitOL,
  title={Pursuit of Low-Rank Models of Time-Varying Matrices Robust to Sparse and Measurement Noise},
  author={Albert Akhriev and Jakub Marecek and Andrea Simonetto},
  journal={arXiv preprint arXiv:1809.03550},
  year={2020}
}
In the tracking of time-varying low-rank models of time-varying matrices, we present a method robust to both uniformly distributed measurement noise and arbitrarily distributed "sparse" noise. In theory, we bound the tracking error. In practice, our use of randomised coordinate descent is scalable and yields encouraging results on changedetection.net, a benchmark.
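The abstract describes fitting a low-rank model under sparse corruptions via randomised coordinate descent. As a rough, hedged illustration (not the authors' algorithm), the sketch below fits a robust low-rank factorisation M ≈ U Vᵀ by randomised coordinate descent on a Huber loss; the Huber loss standing in for the paper's robust objective, the step size, and all function names are assumptions.

```python
import numpy as np

def huber_grad(r, delta=1.0):
    # Gradient of the Huber loss: quadratic near zero, linear (robust) in the tails,
    # so large "sparse" outliers exert only a bounded pull on the fit.
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_low_rank(M, rank=2, iters=5000, step=0.05, delta=1.0, seed=0):
    """Illustrative randomised coordinate descent for a robust (Huber-loss)
    low-rank factorisation M ~ U @ V.T.  A sketch only, not the paper's method."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        if rng.random() < 0.5:
            # Update one randomly chosen row of U against its residual.
            i = rng.integers(m)
            r = U[i] @ V.T - M[i]
            U[i] -= step * huber_grad(r, delta) @ V / n
        else:
            # Symmetrically, update one randomly chosen row of V.
            j = rng.integers(n)
            r = U @ V[j] - M[:, j]
            V[j] -= step * huber_grad(r, delta) @ U / m
    return U, V
```

Because each step touches only one row of one factor, the per-iteration cost is O(n·rank) or O(m·rank), which is what makes coordinate-descent schemes of this kind scale to large or streaming matrices.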
Estimation of Sensitivities: Low-rank Approach and Online Algorithms for Streaming Measurements
An online proximal-gradient method is proposed to estimate sensitivities on-the-fly from real-time measurements, and convergence results in terms of dynamic regret are offered in this case.
Matrix Completion Under Interval Uncertainty: Highlights
An overview of inequality-constrained matrix completion, with a particular focus on alternating least-squares (ALS) methods; an ALS algorithm, MACO by Marecek et al., outperforms others.
Low-Rank Methods in Event Detection and Subsampled Point-to-Subspace Proximity Tests
The proposed algorithm uses a variant of low-rank factorisation that considers interval uncertainty sets around "known entries" on a suitable flattening of the input data to obtain a low-rank model, and bounds the one-sided error as a function of the number of coordinates employed, using techniques from learning theory and computational geometry.
Time-Varying Convex Optimization: Time-Structured Algorithms and Applications
A broad class of state-of-the-art algorithms for time-varying optimization is reviewed, with an eye to performing both algorithmic development and performance analysis, to exemplify the wide engineering relevance of the analytical tools and pertinent theoretical foundations.
On Sampling Complexity of the Semidefinite Affine Rank Feasibility Problem
An analytical bound is proposed on the number of relaxations that are sufficient to solve in order to obtain a solution of a generic instance of the semidefinite affine rank feasibility problem, or to prove that there is no solution.

References

Showing 1-10 of 94 references.
Global Optimality of Local Search for Low Rank Matrix Recovery
It is shown that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements, which yields a polynomial-time global convergence guarantee for stochastic gradient descent.
Convergence of Gradient Descent for Low-Rank Matrix Approximation
A proof of global convergence of gradient search for low-rank matrix approximation is provided, based on the interpretation of the problem as an optimization on the Grassmann manifold and the Fubini-Study distance on this space.
Large-Scale Convex Minimization with a Low-Rank Constraint
This work proposes an efficient greedy algorithm that can scale to large matrices arising in several applications, such as matrix completion for collaborative filtering and robust low-rank matrix approximation.
Recovery of Low-Rank Plus Compressed Sparse Matrices With Application to Unveiling Traffic Anomalies
First-order algorithms are developed to solve the nonsmooth convex optimization problem with provable iteration-complexity guarantees, and their ability to outperform existing alternatives is corroborated.
An Online Algorithm for Separating Sparse and Low-Dimensional Signal Sequences From Their Sum
This paper designs and extensively evaluates an online algorithm, called practical recursive projected compressive sensing (Prac-ReProCS), for recovering a time sequence of sparse vectors St and a …
An Online Parallel and Distributed Algorithm for Recursive Estimation of Sparse Signals
In this paper, we consider a recursive estimation problem for linear regression where the signal to be estimated admits a sparse representation and measurement samples are only sequentially …
On a Problem of Weighted Low-Rank Approximation of Matrices
  • Aritra Dutta, Xin Li
  • SIAM J. Matrix Anal. Appl.
  • 2017
An algorithm based on the alternating direction method is proposed to solve the weighted low-rank approximation problem, and it is compared with state-of-the-art general algorithms such as the weighted total alternating least squares and the EM algorithm.
A Batch-Incremental Video Background Estimation Model Using Weighted Low-Rank Approximation of Matrices
This work builds a batch-incremental background estimation model using a special weighted low-rank approximation of matrices that is superior to existing state-of-the-art background estimation algorithms such as GRASTA, ReProCS, incPCP, and GFL.
Robust Matrix Factorization with Unknown Noise
A low-rank matrix factorization problem is studied with Mixture of Gaussians (MoG) noise, which is a universal approximator for any continuous distribution and is hence able to model a wider range of real noise distributions.
Weighted Low-Rank Approximation of Matrices and Background Modeling
This work demonstrates through extensive experiments that, by inserting a simple weight in the Frobenius norm, the approximation can be made robust to outliers, similarly to the $\ell_1$ norm.