Learning Multilingual Word Embeddings in Latent Metric Space: A Geometric Approach
This work proposes a novel geometric approach for learning bilingual mappings from monolingual embeddings and a bilingual dictionary; it outperforms previous approaches on the bilingual lexicon induction and cross-lingual word similarity tasks.
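As context for the bilingual-mapping task (not the paper's latent-metric-space method), a standard baseline is the orthogonal Procrustes mapping: given embeddings of dictionary word pairs, find the orthogonal matrix that best aligns source vectors to target vectors. A minimal numpy sketch:

```python
import numpy as np

def procrustes_map(X, Y):
    """Orthogonal W minimizing ||X @ W - Y||_F (classical Procrustes baseline,
    not the paper's geometric latent-metric approach)."""
    # X, Y: (n_pairs, dim) source/target embeddings of dictionary pairs
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Synthetic check: targets are an unknown rotation of the sources.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # hidden rotation
X = rng.standard_normal((20, 5))
Y = X @ Q
W = procrustes_map(X, Y)
print(np.allclose(X @ W, Y))  # True: the rotation is recovered
```

The closed-form SVD solution is exact when the true mapping is orthogonal; the paper's contribution lies in going beyond this rigid-mapping assumption.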
A Convex Feature Learning Formulation for Latent Task Structure Discovery
The main contribution is a convex formulation that employs a graph-based regularizer and simultaneously discovers a small number of groups of related tasks with similar task parameters, as well as the feature space shared within each group.
Efficient Output Kernel Learning for Multiple Tasks
This paper shows that, for a certain class of regularizers on the output kernel, the positive-semidefiniteness constraint can be dropped because it is automatically satisfied in the relaxed problem, leading to an unconstrained dual problem that can be solved efficiently.
On p-norm Path Following in Multiple Kernel Learning for Non-linear Feature Selection
This work conjectures that, for certain ℓp-MKL formulations, the number of features selected in the optimal solution decreases monotonically as p is decreased from an initial value to unity, and proves the conjecture for a generic family of kernel-target-alignment-based formulations.
Generalized hierarchical kernel learning
A generic regularizer enables the proposed formulation of Hierarchical Kernel Learning to be employed in Rule Ensemble Learning (REL), where the goal is to construct an ensemble of conjunctive propositional rules.
Riemannian adaptive stochastic gradient algorithms on matrix manifolds
This work proposes novel stochastic gradient algorithms for problems on Riemannian matrix manifolds by adapting the row and column subspaces of the gradients, and achieves a convergence rate of order $\mathcal{O}(\log(T)/\sqrt{T})$, where $T$ is the number of iterations.
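The two Riemannian ingredients involved (tangent-space projection of the Euclidean gradient, followed by a retraction back to the manifold) can be illustrated with plain Riemannian gradient descent on the unit sphere; the paper's subspace-adaptation machinery is omitted here, so this is only a minimal sketch:

```python
import numpy as np

def riemannian_gd_sphere(A, steps=500, lr=0.1, seed=0):
    """Minimize x^T A x over the unit sphere by Riemannian gradient descent.
    Illustrative only; the paper additionally adapts row/column subspaces."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)                  # start on the sphere
    for _ in range(steps):
        egrad = 2 * A @ x                   # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x     # project onto the tangent space at x
        x = x - lr * rgrad                  # gradient step in the tangent space
        x /= np.linalg.norm(x)              # retraction: renormalize to the sphere
    return x

A = np.diag([3.0, 1.0, 0.5])
x = riemannian_gd_sphere(A)
print(x @ A @ x)  # ≈ 0.5, the smallest eigenvalue of A
```

Minimizing the Rayleigh quotient on the sphere converges to the eigenvector of the smallest eigenvalue, which makes the correctness of the projection/retraction steps easy to check.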
McTorch, a manifold optimization library for deep learning
McTorch, a manifold optimization library for deep learning that extends PyTorch, is introduced; it decouples manifold definitions from optimizers, so that once a new manifold is added it can be used with any existing optimizer, and vice versa.
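The decoupling idea can be sketched as a small design pattern (this is a hypothetical interface for illustration, not McTorch's actual API): a manifold exposes a tangent projection and a retraction, and any gradient-based optimizer is written against that interface, so manifolds and optimizers compose freely.

```python
import numpy as np

class Sphere:
    """Hypothetical manifold interface (illustration, not McTorch's API)."""
    def proj(self, x, g):
        # Project an ambient-space gradient onto the tangent space at x.
        return g - (x @ g) * x
    def retract(self, x, v):
        # Map a tangent-space step back onto the manifold.
        y = x + v
        return y / np.linalg.norm(y)

class ManifoldSGD:
    """Optimizer written only against proj/retract, so it works with any manifold."""
    def __init__(self, manifold, lr=0.1):
        self.m, self.lr = manifold, lr
    def step(self, x, egrad):
        rgrad = self.m.proj(x, egrad)
        return self.m.retract(x, -self.lr * rgrad)

opt = ManifoldSGD(Sphere(), lr=0.2)
x = np.array([1.0, 0.0])
x = opt.step(x, np.array([0.0, 1.0]))
print(np.linalg.norm(x))  # the iterate stays on the unit sphere (norm ≈ 1.0)
```

Adding a new manifold means implementing `proj` and `retract`; adding a new optimizer means writing update logic against those two calls, which is the "any manifold with any optimizer" property the summary describes.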
A Dual Framework for Low-rank Tensor Completion
This work proposes a variant of the latent trace norm that encourages learning a non-sparse combination of tensors, and develops a dual framework for solving the low-rank tensor completion problem.
A Unified Framework for Structured Low-rank Matrix Learning
We propose a novel optimization framework for learning a low-rank matrix which is also constrained to lie in a linear subspace. Exploiting duality theory, we present a factorization-based formulation of the problem.
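A toy numpy sketch of the underlying idea (not the paper's dual framework): a low-rank matrix constrained to a linear subspace, where the subspace here is the set of symmetric matrices, so the factorization X = U Uᵀ encodes both constraints (rank ≤ r and symmetry) by construction and can be fit by plain gradient descent.

```python
import numpy as np

# Target: a rank-2 symmetric matrix (symmetric matrices form a linear subspace).
rng = np.random.default_rng(0)
n, r = 8, 2
U_true = rng.standard_normal((n, r))
X_true = U_true @ U_true.T

# Factorized parameterization X = U U^T enforces rank <= r and symmetry.
U = 0.1 * rng.standard_normal((n, r))   # small random initialization
lr = 0.01
for _ in range(5000):
    R = U @ U.T - X_true                # residual (symmetric by construction)
    U -= lr * 4 * (R @ U)               # gradient of ||U U^T - X_true||_F^2
X_hat = U @ U.T

print(np.allclose(X_hat, X_true, atol=1e-4))  # the target is recovered
```

The paper's contribution is a principled dual treatment of such structured low-rank problems; this sketch only shows how a factorization can bake both the rank and the subspace constraint into the parameterization.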
Multi-task Multiple Kernel Learning
A novel mirror-descent-based algorithm is proposed for learning a shared kernel, constructed from a given set of base kernels, across multiple tasks, leading to improved generalization in all the tasks.