Raman Sankaran

Motivated by real-world problems, such as object categorization, we study a particular mixed-norm regularization for Multiple Kernel Learning (MKL). It is assumed that the given set of kernels is grouped into distinct components, where each component is crucial for the learning task at hand. The formulation hence employs l∞ regularization for promoting …
This paper presents novel algorithms and applications for a particular class of mixed-norm regularization based Multiple Kernel Learning (MKL) formulations. The formulations assume that the given kernels are grouped, and employ l1 norm regularization for promoting sparsity within the RKHS norms of each group and ls, s ≥ 2, norm regularization for promoting …
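The abstract above is truncated, but the group-wise mixed norm it refers to can be illustrated with a hedged sketch: take the l1 norm of the per-kernel RKHS norms inside each group, then an ls norm (s ≥ 2) across groups. The grouping, variable names, and toy values below are illustrative assumptions, not the paper's notation or algorithm.

```python
import numpy as np

def mixed_norm(rkhs_norms, groups, s=2.0):
    """Illustrative (l1, ls) mixed norm: l1 within each group, ls across groups.

    rkhs_norms: per-kernel RKHS norms ||f_j|| (assumed non-negative).
    groups: list of index arrays, one per group (assumed grouping).
    s: outer norm order, s >= 2 as in the abstract.
    """
    group_l1 = np.array([np.sum(np.abs(rkhs_norms[idx])) for idx in groups])
    return float(np.sum(group_l1 ** s) ** (1.0 / s))

# Toy usage: 6 kernels split into 3 assumed groups.
norms = np.array([0.5, 0.0, 1.2, 0.3, 0.0, 0.8])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(mixed_norm(norms, groups, s=2.0))
```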
Multiple Kernel Learning (MKL) on Support Vector Machines (SVMs) has been a popular line of research in recent times, owing to its success in application problems such as object categorization. This success is due to the fact that MKL can choose from a variety of feature kernels and identify the optimal kernel combination. However, the initial formulation …
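As a rough illustration of the kernel-combination idea mentioned here (not the paper's algorithm), MKL works with a weighted combination K = Σ_m η_m K_m of base kernel matrices, which is then used inside an SVM. The weights and random kernels below are placeholders; an actual MKL solver would learn η jointly with the SVM.

```python
import numpy as np

def combine_kernels(kernels, eta):
    """Convex combination K = sum_m eta_m * K_m of base Gram matrices.

    kernels: list of (n, n) PSD Gram matrices computed from different features.
    eta: non-negative weights; here we simply normalize them onto the simplex.
    """
    eta = np.asarray(eta, dtype=float)
    eta = eta / eta.sum()
    return sum(w * K for w, K in zip(eta, kernels))

# Toy usage with two random linear kernels on 4 points.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(4, 3)), rng.normal(size=(4, 5))
K1, K2 = X1 @ X1.T, X2 @ X2.T
K = combine_kernels([K1, K2], eta=[0.7, 0.3])
print(K.shape)
```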
Recent literature [1] suggests that embedding a graph on a unit sphere leads to better generalization for graph transduction. However, the choice of an optimal embedding and an efficient algorithm to compute it remain open. In this paper, we show that orthonormal representations, a class of unit-sphere graph embeddings, are PAC learnable. Existing …
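For context on the embeddings referred to here: an orthonormal representation of a graph assigns a unit vector to each vertex such that the vectors of non-adjacent vertices are orthogonal. The check below is an illustrative sketch of that definition on a made-up toy graph, not the learning algorithm studied in the paper.

```python
import numpy as np

def is_orthonormal_representation(U, edges, tol=1e-8):
    """Check whether the rows of U form an orthonormal representation of a graph.

    U: (n, d) matrix, one embedding vector per vertex.
    edges: set of frozensets {i, j} listing the adjacent pairs.
    Requires unit-norm rows and u_i . u_j = 0 for every non-adjacent pair.
    """
    n = U.shape[0]
    if not np.allclose(np.linalg.norm(U, axis=1), 1.0, atol=tol):
        return False
    for i in range(n):
        for j in range(i + 1, n):
            if frozenset((i, j)) not in edges and abs(U[i] @ U[j]) > tol:
                return False
    return True

# Toy usage: path graph 0-1-2; the standard basis vectors satisfy the definition.
U = np.eye(3)
edges = {frozenset((0, 1)), frozenset((1, 2))}
print(is_orthonormal_representation(U, edges))  # True
```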
In this article, we provide additional statements and proofs complementing the main paper. The section numbers in this document correspond to the respective sections in the main paper. 3 Related Work: OWL, OSCAR, and SLOPE. 3.1 Proof of Proposition 3.1. Proof. 1. Let a = …
The failure of LASSO to identify groups of correlated predictors in linear regression has sparked significant research interest. Recently, various norms [1, 2] were proposed, best described as instances of ordered weighted l1 norms (OWL) [3], as an alternative to the l1 regularization used in LASSO. OWL can identify groups of correlated variables …
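Since the abstract refers to the ordered weighted l1 (OWL) norm family [3], a short sketch of its standard definition may help: Ω_w(β) = Σ_i w_i |β|_[i], where |β|_[1] ≥ … ≥ |β|_[p] are the absolute entries of β sorted in decreasing order and w is a non-negative, non-increasing weight vector; OSCAR and SLOPE correspond to particular weight choices. The weights in the toy usage below are illustrative assumptions.

```python
import numpy as np

def owl_norm(beta, w):
    """Ordered weighted l1 norm: dot product of sorted |beta| with weights w.

    beta: coefficient vector.
    w: non-negative, non-increasing weights of the same length.
       A constant positive w recovers a scaled l1 norm; decreasing weights
       give OSCAR/SLOPE-style penalties.
    """
    abs_sorted = np.sort(np.abs(beta))[::-1]  # |beta| in decreasing order
    return float(np.dot(np.asarray(w, dtype=float), abs_sorted))

# Toy usage with linearly decreasing weights (an illustrative choice).
beta = np.array([0.3, -1.5, 0.0, 0.9])
w = np.array([4.0, 3.0, 2.0, 1.0])
print(owl_norm(beta, w))
```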