This paper examines the problem of locating outlier columns in a large, otherwise low-rank, matrix. We propose a simple two-step adaptive sensing and inference approach and establish theoretical guarantees for its performance; our results show that accurate outlier identification is achievable using very few linear summaries of the original data matrix, as …
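To make the two-step idea concrete, here is a minimal sketch under simplifying assumptions (it is not the paper's exact procedure): step one takes a small random sketch of the rows, i.e. a few linear summaries per column, and estimates the low-rank column space from a random subset of the sketched columns; step two flags columns whose residual energy outside that estimated subspace is large. The dimensions, the norm-based trimming safeguard, and the detection threshold are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a rank-r matrix with a few corrupted (outlier) columns.
n, p, r, k = 100, 500, 2, 5
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
M = L.copy()
outliers = rng.choice(p, size=k, replace=False)
M[:, outliers] += 10 * rng.standard_normal((n, k))

# Step 1: compress rows with a random sketch (a few linear summaries per column)
# and estimate the low-rank column space from a small random subset of columns.
m = 4 * r + 10                                   # sketch dimension, much smaller than n
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
Y = Phi @ M                                      # m x p matrix of linear summaries
sample = rng.choice(p, size=10 * r, replace=False)
Ys = Y[:, sample]
# Simple safeguard for this toy example: drop the largest-norm sampled columns,
# which removes any outliers that happened to be sampled.
keep = np.argsort(np.linalg.norm(Ys, axis=0))[: int(0.8 * Ys.shape[1])]
U, _, _ = np.linalg.svd(Ys[:, keep], full_matrices=False)
U_r = U[:, :r]

# Step 2: flag columns whose residual energy outside the estimated subspace is large.
residual = Y - U_r @ (U_r.T @ Y)
scores = np.linalg.norm(residual, axis=0)
flagged = np.flatnonzero(scores > 0.5 * scores.max())

print("true outlier columns:", np.sort(outliers))
print("flagged columns     :", np.sort(flagged))
```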
We propose a stochastic variance reduced optimization algorithm for solving sparse learning problems with cardinality constraints. Sufficient conditions are provided, under which the proposed algorithm enjoys strong linear convergence guarantees and optimal estimation accuracy in high dimensions. We further extend the proposed algorithm to an asynchronous …
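As an illustration of the flavor of algorithm described, the sketch below combines a stochastic variance reduced gradient (SVRG-style) update with hard thresholding, which enforces the cardinality constraint by projection, on a toy sparse linear regression problem. The step size rule, epoch counts, and problem sizes are illustrative choices, not the paper's settings or guarantees.

```python
import numpy as np

def hard_threshold(w, s):
    """Projection onto the cardinality constraint: keep the s largest-magnitude entries."""
    out = np.zeros_like(w)
    top = np.argsort(np.abs(w))[-s:]
    out[top] = w[top]
    return out

def svrg_ht(X, y, s, step, epochs=30, seed=0):
    """Stochastic variance reduced gradient with hard thresholding for
    min 1/(2n) ||y - Xw||^2  subject to  ||w||_0 <= s."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(epochs):
        w_ref = w.copy()
        full_grad = X.T @ (X @ w_ref - y) / n            # full gradient at the snapshot
        for _ in range(2 * n):                           # inner loop
            i = rng.integers(n)
            xi = X[i]
            # variance-reduced stochastic gradient
            g = xi * (xi @ w - y[i]) - xi * (xi @ w_ref - y[i]) + full_grad
            w = hard_threshold(w - step * g, s)
    return w

# Toy sparse linear regression problem
rng = np.random.default_rng(1)
n, p, s = 200, 500, 10
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:s] = rng.uniform(1.0, 2.0, size=s) * rng.choice([-1.0, 1.0], size=s)
y = X @ w_true + 0.01 * rng.standard_normal(n)

step = 1.0 / np.max(np.sum(X ** 2, axis=1))              # conservative step size
w_hat = svrg_ht(X, y, s, step)
print("support recovered   :", set(np.flatnonzero(w_hat)) == set(range(s)))
print("max coefficient error:", np.max(np.abs(w_hat - w_true)))
```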
We propose a general theory for studying the geometry of nonconvex objective functions with underlying symmetric structures. Specifically, we characterize the locations of stationary points and the null space of the associated Hessian matrices via the lens of invariant groups. As a major motivating example, we apply the proposed general theory to …
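As a small numerical illustration of the kind of symmetry such a theory exploits, the sketch below uses the standard symmetric low-rank factorization objective as a stand-in example: it is invariant under right-multiplication by any orthogonal matrix, so each point lies on a flat orbit, the gradient is equivariant, and at a stationary point the directions tangent to the orbit necessarily lie in the null space of the Hessian.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 3
X_star = rng.standard_normal((n, r))
M = X_star @ X_star.T                       # ground-truth symmetric low-rank matrix

def f(X):
    """Symmetric low-rank factorization objective f(X) = 1/4 ||X X^T - M||_F^2."""
    return 0.25 * np.linalg.norm(X @ X.T - M, "fro") ** 2

def grad(X):
    """Gradient of f: (X X^T - M) X."""
    return (X @ X.T - M) @ X

X = rng.standard_normal((n, r))
Q, _ = np.linalg.qr(rng.standard_normal((r, r)))     # random orthogonal r x r matrix

# Invariance under X -> X Q means the whole orbit {X Q} is a level set, and the
# gradient is equivariant; at a stationary point X*, orbit-tangent directions
# (X* A with A skew-symmetric) therefore lie in the Hessian's null space.
print("f(X) == f(X Q):         ", np.isclose(f(X), f(X @ Q)))
print("grad(X Q) == grad(X) Q: ", np.allclose(grad(X @ Q), grad(X) @ Q))
```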
With the rapid development of both information technology and the management of modern medical regulation, the generation of medical records is becoming increasingly intelligent. In this paper, Case-Based Reasoning is applied to the process of generating records for dental cases. Based on an analysis of the features of dental records, a case base is …
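As a rough, entirely hypothetical illustration of the retrieve-and-reuse steps of Case-Based Reasoning in a record-generation setting, the sketch below matches a query against a tiny case base by weighted attribute similarity and reuses the closest case's record as a draft; the feature names, weights, and record texts are invented for illustration and are not the paper's case representation.

```python
# Hypothetical feature names, weights, and records; not the paper's case representation.
from dataclasses import dataclass

@dataclass
class DentalCase:
    features: dict      # structured attributes of a past case
    record_text: str    # the record written for that case

def similarity(query, case, weights):
    """Weighted exact-match similarity over shared attributes."""
    matched = sum(w for key, w in weights.items() if query.get(key) == case.get(key))
    return matched / sum(weights.values())

def retrieve(query, case_base, weights):
    """Retrieve the most similar past case from the case base."""
    return max(case_base, key=lambda c: similarity(query, c.features, weights))

case_base = [
    DentalCase({"tooth": 36, "symptom": "cold sensitivity", "diagnosis": "caries"},
               "Tooth 36: deep caries with cold sensitivity; composite restoration performed."),
    DentalCase({"tooth": 11, "symptom": "fracture", "diagnosis": "trauma"},
               "Tooth 11: crown fracture after trauma; temporary restoration placed."),
]
weights = {"tooth": 1.0, "symptom": 2.0, "diagnosis": 3.0}

query = {"tooth": 36, "symptom": "cold sensitivity", "diagnosis": "caries"}
best = retrieve(query, case_base, weights)
print(best.record_text)   # reuse: the retrieved record serves as a draft for the new case
```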
This paper describes an R package named flare, which implements a family of new high dimensional regression methods (LAD Lasso, SQRT Lasso, ℓq Lasso, and Dantzig selector) and their extensions to sparse precision matrix estimation (TIGER and CLIME). These methods exploit different nonsmooth loss functions to gain modeling flexibility, estimation robustness, …
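flare itself is an R package; as a language-neutral illustration of one objective it implements, the sketch below solves the SQRT Lasso program, minimize ||y - Xb||_2 / sqrt(n) + λ||b||_1, with a generic convex solver (cvxpy) in Python. The data, the choice of λ, and the solver are illustrative and unrelated to flare's own interface or algorithms.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# SQRT Lasso: the square-root loss lets lambda be chosen without knowing the
# noise level, one of the robustness properties this family of methods targets.
b = cp.Variable(p)
lam = 1.1 * np.sqrt(np.log(p) / n)                 # scale-free regularization level
objective = cp.Minimize(cp.norm(y - X @ b, 2) / np.sqrt(n) + lam * cp.norm(b, 1))
cp.Problem(objective).solve()

beta_hat = b.value
print("indices of the largest estimated coefficients:",
      np.sort(np.argsort(-np.abs(beta_hat))[:s]))
```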
In this paper we consider the task of locating salient group-structured features in potentially high-dimensional images; the salient feature detection here is modeled as a Robust Principal Component Analysis problem, in which the aim is to locate groups of outlier columns embedded in an otherwise low rank matrix. We adapt an adaptive compressive sensing …
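A simplified sketch of the group-structured detection step: columns are scored in predefined groups, and a group is flagged when the aggregate residual energy of its columns outside an estimated low-rank subspace is large. The plain truncated-SVD subspace estimate and the contiguous grouping below are simplifications for illustration; the paper's approach works from compressive samples instead.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, r, group_size = 64, 256, 3, 8         # columns are organized into contiguous groups
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))
M = L.copy()
outlier_group = 12                          # corrupt every column in one group
cols = slice(outlier_group * group_size, (outlier_group + 1) * group_size)
M[:, cols] += 5 * rng.standard_normal((n, group_size))

# Estimate the low-rank "background" subspace (a plain truncated SVD here), then
# score each group of columns by its aggregate residual energy outside it.
U, _, _ = np.linalg.svd(M, full_matrices=False)
U_r = U[:, :r]
residual = M - U_r @ (U_r.T @ M)
col_energy = np.sum(residual ** 2, axis=0)
group_scores = col_energy.reshape(-1, group_size).sum(axis=1)   # one score per group

print("flagged group:", int(np.argmax(group_scores)), "| true outlier group:", outlier_group)
```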
The Adaptive Compressive Outlier Sensing (ACOS) method, proposed recently in (Li & Haupt, 2015), is a randomized sequential sampling and inference method designed to locate column outliers in large, otherwise low rank, matrices. While the original ACOS established conditions on the sample complexity (i.e., the number of scalar linear measurements) …
In this paper, we study robust principal component analysis on tensors, in the setting where frame-wise outliers exist. We propose a convex formulation to decompose a tensor into a low rank component and a frame-wise sparse component. Theoretically, we guarantee that exact subspace recovery and outlier identification can be achieved under mild model …
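As a simplified sketch of a convex program in this spirit, the example below vectorizes each frame into a column of a single unfolding, penalizes the low rank component with the nuclear norm and the frame-wise sparse component with a sum of per-frame ℓ2 norms (an outlier-pursuit-style program). The paper's tensor formulation may differ, for instance by combining several unfoldings, and the weight λ below is an illustrative choice.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
d1, d2, n_frames, r, k = 10, 10, 40, 2, 3           # frame size, #frames, rank, #outlier frames
frames_lowrank = rng.standard_normal((d1 * d2, r)) @ rng.standard_normal((r, n_frames))
X = frames_lowrank.copy()
bad = rng.choice(n_frames, size=k, replace=False)
X[:, bad] += 3 * rng.standard_normal((d1 * d2, k))  # corrupt whole frames

# Convex decomposition of the unfolding: nuclear norm for the low rank part,
# sum of per-frame l2 norms for the frame-wise sparse part.
L = cp.Variable((d1 * d2, n_frames))
C = cp.Variable((d1 * d2, n_frames))
lam = 0.5                                            # illustrative trade-off weight
objective = cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.norm(C, 2, axis=0)))
cp.Problem(objective, [L + C == X]).solve()

frame_scores = np.linalg.norm(C.value, axis=0)
print("true outlier frames :", np.sort(bad))
print("largest |C| frames  :", np.sort(np.argsort(frame_scores)[-k:]))
```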