The problem of feature selection has aroused considerable research interest in the past few years. Traditional learning-based feature selection methods separate embedding learning and feature ranking. In this paper, we introduce a novel unsupervised feature selection approach via Joint Embedding Learning and Sparse Regression (JELSR). Instead of simply …
Feature selection has aroused considerable research interest during the last few decades. Traditional learning-based feature selection methods separate embedding learning and feature ranking. In this paper, we propose a novel unsupervised feature selection framework, termed joint embedding learning and sparse regression (JELSR), in which the …
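The two JELSR entries above combine an embedding-learning term with a row-sparse regression whose coefficients rank the features. The snippet below is a minimal sketch of those two building blocks in sequence, using scikit-learn's SpectralEmbedding and MultiTaskLasso (whose penalty is the L2,1 row-sparsity norm); the paper's contribution is to optimize the two jointly rather than in stages, and all parameter values here are illustrative assumptions.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.linear_model import MultiTaskLasso

def jelsr_style_feature_scores(X, n_components=5, alpha=0.1):
    """X: (n_samples, n_features). Returns one relevance score per feature."""
    # 1) Embedding learning: low-dimensional spectral embedding of the sample graph.
    Y = SpectralEmbedding(n_components=n_components,
                          affinity="nearest_neighbors").fit_transform(X)
    # 2) Sparse regression: MultiTaskLasso imposes an L2,1 penalty, so whole
    #    features are driven to zero jointly across all embedding dimensions.
    W = MultiTaskLasso(alpha=alpha).fit(X, Y).coef_   # shape (n_components, n_features)
    # 3) Score each feature by the norm of its coefficient column.
    return np.linalg.norm(W, axis=0)

# usage: keep the k features with the largest scores
# scores = jelsr_style_feature_scores(X); top_k = np.argsort(scores)[::-1][:k]
```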
Matrices, or more generally multi-way arrays (tensors), are common forms of data encountered in a wide range of real applications. How to classify this kind of data is an important research topic in both pattern recognition and machine learning. In this paper, by analyzing the relationship between two well-known traditional classification approaches, …
In many real applications of machine learning and data mining, we are often confronted with high-dimensional data. How to cluster high-dimensional data remains a challenging problem due to the curse of dimensionality. In this paper, we address this problem via joint dimensionality reduction and clustering. Different from traditional approaches …
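As a rough illustration of coupling dimensionality reduction with clustering, the sketch below alternates between clustering in a low-dimensional subspace and re-fitting a discriminative subspace to the current cluster assignments. This is a generic alternation under assumed defaults, not the paper's exact formulation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans

def joint_reduce_and_cluster(X, n_clusters=3, n_dims=2, n_iters=5):
    # Initial subspace and clustering.
    Z = PCA(n_components=n_dims).fit_transform(X)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Z)
    for _ in range(n_iters):
        # Re-learn a discriminative subspace from the current pseudo-labels ...
        lda = LinearDiscriminantAnalysis(n_components=min(n_dims, n_clusters - 1))
        Z = lda.fit_transform(X, labels)
        # ... then re-cluster in that subspace.
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Z)
    return labels, Z
```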
The problem of image classification has aroused considerable research interest in the field of image processing. Traditional methods often convert an image to a vector and then use a vector-based classifier. In this paper, a novel multiple rank regression model (MRR) for matrix data classification is proposed. Unlike traditional vector-based methods, we …
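A multiple-rank (bilinear) model of this flavor predicts f(X) = sum_k u_k^T X v_k + b = tr(U^T X V) + b directly from a matrix sample X, avoiding vectorization. The sketch below fits such a model by alternating ridge regressions over U and V; the rank K, the ridge strength, and the squared loss on +/-1 labels are assumptions for illustration, not the paper's exact MRR algorithm.

```python
import numpy as np
from sklearn.linear_model import Ridge

def fit_multiple_rank(Xs, y, K=2, n_iters=10, alpha=1.0, seed=0):
    """Xs: (n, p, q) array of matrix samples; y: (n,) targets, e.g. +/-1 labels."""
    n, p, q = Xs.shape
    rng = np.random.default_rng(seed)
    U, V = rng.standard_normal((p, K)), rng.standard_normal((q, K))
    b = 0.0
    for _ in range(n_iters):
        # With V fixed, tr(U^T X V) is linear in vec(U): features are vec(X V).
        F = (Xs @ V).reshape(n, -1)
        r = Ridge(alpha=alpha).fit(F, y)
        U, b = r.coef_.reshape(p, K), r.intercept_
        # With U fixed, it is linear in vec(V): features are vec(X^T U).
        F = (np.transpose(Xs, (0, 2, 1)) @ U).reshape(n, -1)
        r = Ridge(alpha=alpha).fit(F, y)
        V, b = r.coef_.reshape(q, K), r.intercept_
    return U, V, b

def predict_multiple_rank(Xs, U, V, b):
    # sum_k u_k^T X v_k + b for every sample.
    return np.einsum('npq,pk,qk->n', Xs, U, V) + b
```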
Feature selection and feature transformation, the two main ways to reduce dimensionality, are often presented separately. In this paper, a feature selection method is proposed by combining the popular transformation-based dimensionality reduction method, linear discriminant analysis (LDA), with sparsity regularization. We impose row sparsity on the …
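The general recipe described here, a discriminative projection learned under a row-sparsity penalty, with features then ranked by the norm of their row in the transformation matrix, can be sketched as below. The LDA term is approximated by regressing the LDA embedding with MultiTaskLasso (which supplies the L2,1 penalty); this is an illustrative stand-in under assumed parameters, not the paper's exact objective.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import MultiTaskLasso

def sparse_discriminant_feature_scores(X, y, alpha=0.05):
    # Supervised low-dimensional target: the LDA embedding of the samples.
    Z = LinearDiscriminantAnalysis().fit_transform(X, y)
    # Row-sparse regression of X onto that embedding (L2,1 penalty on the coefficients).
    W = MultiTaskLasso(alpha=alpha).fit(X, Z).coef_   # (n_components, n_features)
    # One score per feature: norm of its coefficient column; larger = more relevant.
    return np.linalg.norm(W, axis=0)
```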
Citations between papers can be treated as causal relationships. In addition, some citation networks share a number of similarities with the causal networks studied in network cosmology, e.g., similar in- and out-degree distributions. Hence, it is possible to model citation networks using network cosmology. The causal network models built on homogeneous …
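The degree-distribution comparison mentioned above can be checked by treating citations as directed edges and tabulating in- and out-degrees. The snippet below is a toy sketch with an invented edge list, purely to illustrate the computation.

```python
from collections import Counter
import networkx as nx

# (citing paper, cited paper) pairs; the edge list here is purely illustrative.
citations = [("paperB", "paperA"), ("paperC", "paperA"), ("paperC", "paperB")]
G = nx.DiGraph(citations)

in_dist = Counter(d for _, d in G.in_degree())    # frequency of each citation count
out_dist = Counter(d for _, d in G.out_degree())  # frequency of each reference-list length
print(in_dist, out_dist)
```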