
- Tianyi Zhou, Dacheng Tao
- ICML
- 2011

Low-rank and sparse structures have been profoundly studied in matrix completion and compressed sensing. In this paper, we develop “Go Decomposition” (GoDec) to efficiently and robustly estimate the low-rank part L and the sparse part S of a matrix X = L + S + G with noise G. GoDec alternately assigns the low-rank approximation of X − S to L and the sparse…
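A minimal NumPy sketch of the alternating scheme the abstract describes, using a plain truncated SVD for the low-rank step and hard thresholding for the sparse step (the paper itself accelerates the low-rank step with bilateral random projections; names and defaults here are illustrative):

```python
import numpy as np

def godec(X, rank, card, iters=20):
    """Alternating sketch: L <- rank-`rank` approximation of X - S,
    S <- `card`-sparse approximation of X - L via hard thresholding."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    for _ in range(iters):
        # low-rank step: best rank-`rank` approximation of X - S
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # sparse step: keep the `card` largest-magnitude entries of X - L
        R = X - L
        keep = np.argsort(np.abs(R), axis=None)[-card:]
        S = np.zeros_like(X)
        S.flat[keep] = R.flat[keep]
    return L, S
```

On synthetic data with an exactly low-rank L and a few large spikes S, a handful of iterations typically drives the residual X − L − S close to the noise floor.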

- Tianyi Zhou, Dacheng Tao
- IEEE Transactions on Image Processing
- 2013

Learning tasks such as classification and clustering usually perform better and cost less (time and space) on compressed representations than on the original data. Previous works mainly compress data via dimension reduction. In this paper, we propose “double shrinking” to compress image data on both dimensionality and cardinality via building…

- Tianyi Zhou, Dacheng Tao, Xindong Wu
- Machine Learning
- 2011

Directly applying single-label classification methods to the multi-label learning problems substantially limits both the performance and speed due to the imbalance, dependence and high dimensionality of the given label matrix. Existing methods either ignore these three problems or reduce one with the price of aggravating another. In this paper, we propose a…

- Tianyi Zhou, Dacheng Tao
- 2012 IEEE International Symposium on Information…
- 2012

Low-rank structures have been profoundly studied in data mining and machine learning. In this paper, we show that a dense matrix X's low-rank approximation can be rapidly built from its left and right random projections Y1 = XA1 and Y2 = X^T A2, or bilateral random projection (BRP). We then show a power scheme…
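The closed form behind this claim can be sketched directly: from the two projections alone, a rank-r approximation is L = Y1 (A2^T Y1)^{-1} Y2^T. Gaussian test matrices A1 and A2 are an assumption here (the paper also adds a power scheme for matrices with slowly decaying spectra):

```python
import numpy as np

def brp_lowrank(X, r, seed=0):
    """Bilateral random projection sketch: rank-r approximation of X
    built only from Y1 = X A1 and Y2 = X^T A2."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    A1 = rng.standard_normal((n, r))  # right test matrix (assumed Gaussian)
    A2 = rng.standard_normal((m, r))  # left test matrix (assumed Gaussian)
    Y1 = X @ A1           # left random projection, m x r
    Y2 = X.T @ A2         # right random projection, n x r
    # solve (A2^T Y1) Z = Y2^T instead of forming the inverse explicitly
    return Y1 @ np.linalg.solve(A2.T @ Y1, Y2.T)
```

When X is exactly rank r, this reconstruction is exact up to floating-point error; for higher-rank X it degrades gracefully with the spectrum's decay.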

- Tianyi Zhou, Dacheng Tao
- AISTATS
- 2013

Recovering a large low-rank matrix from highly corrupted, incomplete or sparse outlier-overwhelmed observations is the crux of various intriguing statistical problems. We explore the power of the “greedy bilateral (GreB)” paradigm in reducing both time and sample complexities for solving these problems. GreB models a low-rank variable as a bilateral…

- Tianyi Zhou, Dacheng Tao
- IJCAI
- 2013

In low-rank & sparse matrix decomposition, the entries of the sparse part are often assumed to be i.i.d. sampled from a random distribution. But the structure of the sparse part, as the central interest of many problems, has rarely been studied. One motivating problem is tracking multiple sparse object flows (motions) in video. We introduce “shifted subspaces…

- Tianyi Zhou, Dacheng Tao
- 2013 IEEE International Symposium on Information…
- 2013

We consider recovering d-level quantization of a signal from k-level quantization of linear measurements. This problem has great potential in practical systems, but has not been fully addressed in compressed sensing (CS). We tackle it by proposing k-bit Hamming compressed sensing (HCS). It reduces the decoding to a series of hypothesis tests of the bin…

- Tianyi Zhou, Dacheng Tao
- SMC
- 2009

In this paper, we present the manifold elastic net (MEN) for sparse variable selection. MEN combines the merits of manifold regularization and elastic net regularization, so it considers both the nonlinear manifold structure of a dataset and the sparse property of the redundant data representation. Face-based gender recognition has received much…
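As a reference point for the elastic-net half of MEN, here is a proximal-gradient sketch of the plain elastic net objective 0.5/n ||y − Xw||² + l1 ||w||₁ + 0.5 l2 ||w||²; the manifold-regularization term of MEN is omitted, and all names and defaults are illustrative:

```python
import numpy as np

def elastic_net(X, y, l1=0.1, l2=0.1, iters=500):
    """ISTA-style solver: smooth gradient on the least-squares + ridge part,
    soft-thresholding as the proximal step for the l1 penalty."""
    n, d = X.shape
    w = np.zeros(d)
    # step size from a Lipschitz bound on the smooth part's gradient
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n + l2)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n + l2 * w
        z = w - step * grad
        # soft threshold: proximal operator of step * l1 * ||.||_1
        w = np.sign(z) * np.maximum(np.abs(z) - step * l1, 0.0)
    return w
```

On a sparse ground truth, the recovered coefficients are shrunk (the usual elastic-net bias) but the support is identified correctly.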

- Tianyi Zhou, Dacheng Tao, Xindong Wu
- Data Mining and Knowledge Discovery
- 2010

It is difficult to find the optimal sparse solution of a manifold learning based dimensionality reduction algorithm. The lasso or the elastic net penalized manifold learning based dimensionality reduction is not directly a lasso penalized least square problem and thus the least angle regression (LARS) (Efron et al., Ann Stat 32(2):407–499, 2004), one of the…

- Tianyi Zhou, Dacheng Tao, Xindong Wu
- 2010 IEEE International Conference on Data Mining
- 2010

Support vector machines (SVMs) are invaluable tools for many practical applications in artificial intelligence, e.g., classification and event recognition. However, popular SVM solvers are not sufficiently efficient for applications with a large number of samples as well as a large number of features. In this paper, we thus present NESVM, a fast gradient SVM…
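In the spirit of the gradient SVM described here, a toy sketch that replaces the nonsmooth hinge loss with a Huber-style smoothed surrogate and runs Nesterov's accelerated gradient; the particular smoothing, step size, and constants below are assumptions, not the paper's exact construction:

```python
import numpy as np

def smooth_hinge_svm(X, y, lam=0.1, mu=0.1, iters=200):
    """Accelerated-gradient sketch for a linear SVM with a smoothed hinge
    loss: flat for margin >= 1, linear below 1 - mu, quadratic ramp between."""
    n, d = X.shape
    w = np.zeros(d)
    v = w.copy()  # extrapolation point
    # Lipschitz estimate for the smoothed-hinge + ridge gradient
    Lc = np.linalg.norm(X, 2) ** 2 / (mu * n) + lam
    t = 1.0
    for _ in range(iters):
        m = y * (X @ v)  # margins at the extrapolation point
        # derivative of the smoothed hinge w.r.t. the margin
        g = np.where(m >= 1, 0.0, np.where(m <= 1 - mu, -1.0, (m - 1) / mu))
        grad = (X.T @ (g * y)) / n + lam * v
        w_new = v - grad / Lc
        # Nesterov momentum update
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        v = w_new + ((t - 1) / t_new) * (w_new - w)
        w, t = w_new, t_new
    return w
```

On well-separated two-class data this converges quickly to a separating hyperplane, illustrating why smoothing plus acceleration pays off when n and d are large.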