This work proposes a novel geometric approach for learning bilingual mappings from monolingual embeddings and a bilingual dictionary; it outperforms previous approaches on the bilingual lexicon induction and cross-lingual word similarity tasks.
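The paper's own geometric method is not reproduced here; as a minimal point of reference, the classical orthogonal Procrustes baseline for bilingual mappings can be sketched as follows (toy random data stands in for real embeddings):

```python
import numpy as np

# Toy "embeddings": rows are word vectors in the source (X) and target (Y)
# languages, aligned via a bilingual dictionary (row i of X translates to
# row i of Y). W_true is a hidden orthogonal map we try to recover.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
W_true, _ = np.linalg.qr(rng.standard_normal((4, 4)))
Y = X @ W_true

# Orthogonal Procrustes: W = U V^T, where U, V come from the SVD of X^T Y,
# minimizes ||XW - Y||_F over orthogonal W.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

print(np.allclose(X @ W, Y, atol=1e-8))  # → True
```

This baseline constrains the mapping to be orthogonal; the geometric approaches discussed above generalize beyond this setting.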

The main contribution is a convex formulation that employs a graph-based regularizer and simultaneously discovers a small number of groups of related tasks, with close-by task parameters, as well as the feature space shared within each group.

It is shown in this paper that, for a certain class of regularizers on the output kernel, the positive-semidefiniteness constraint can be dropped because it is automatically satisfied by the relaxed problem, leading to an unconstrained dual problem that can be solved efficiently.

This work proposes a novel conjecture stating that, for certain $\ell_p$-MKL formulations, the number of features selected in the optimal solution monotonically decreases as $p$ is decreased from an initial value to unity, and proves the conjecture for a generic family of kernel-target-alignment based formulations.

A generic regularizer enables the proposed Hierarchical Kernel Learning formulation to be employed in Rule Ensemble Learning (REL), where the goal is to construct an ensemble of conjunctive propositional rules.

This work proposes novel stochastic gradient algorithms for problems on Riemannian matrix manifolds by adapting the row and column subspaces of gradients, achieving a convergence rate of order $\mathcal{O}(\log(T)/\sqrt{T})$, where $T$ is the number of iterations.
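The adaptive subspace machinery of the paper is not shown here; the underlying building block, a Riemannian gradient step (project the Euclidean gradient onto the tangent space, take a step, retract back onto the manifold), can be illustrated on the unit sphere:

```python
import numpy as np

# Riemannian gradient ascent on the unit sphere: maximize f(x) = x^T A x,
# whose maximizer is the leading eigenvector of A. This shows the generic
# projection + retraction pattern, not the paper's adaptive method.
rng = np.random.default_rng(1)
A = np.diag([3.0, 1.0, 0.5])  # leading eigenvector is e_1

x = rng.standard_normal(3)
x /= np.linalg.norm(x)  # start on the sphere

for _ in range(200):
    egrad = 2 * A @ x                 # Euclidean gradient of x^T A x
    rgrad = egrad - (x @ egrad) * x   # project onto the tangent space at x
    x = x + 0.1 * rgrad               # ascent step in the tangent direction
    x /= np.linalg.norm(x)            # retraction: renormalize onto the sphere

print(abs(x[0]))  # close to 1: converged to the leading eigenvector
```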

McTorch, a manifold optimization library for deep learning that extends PyTorch, is introduced; it decouples manifold definitions from optimizers, so that once a new manifold is added it can be used with any existing optimizer, and vice versa.
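The decoupling idea can be sketched in plain Python (these class and method names are illustrative assumptions, not McTorch's actual API): a manifold object supplies the geometry, an optimizer supplies the update rule, and any pairing of the two works.

```python
import numpy as np

class Sphere:
    """Unit sphere: points x with ||x|| = 1 (illustrative, not McTorch's API)."""
    def proj(self, x, g):        # project a Euclidean gradient to the tangent space
        return g - (x @ g) * x
    def retract(self, x, v):     # map a tangent-space step back onto the manifold
        y = x + v
        return y / np.linalg.norm(y)

class Euclidean:
    """Flat space: projection and retraction are trivial."""
    def proj(self, x, g):
        return g
    def retract(self, x, v):
        return x + v

class RiemannianSGD:
    """Update rule only; works with any manifold exposing proj/retract."""
    def __init__(self, manifold, lr=0.1):
        self.m, self.lr = manifold, lr
    def step(self, x, egrad):
        rgrad = self.m.proj(x, egrad)
        return self.m.retract(x, -self.lr * rgrad)

# Minimize f(x) = x^T A x over the sphere: converges to the smallest eigenvector.
A = np.diag([3.0, 1.0, 0.5])
x = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
opt = RiemannianSGD(Sphere(), lr=0.1)
for _ in range(300):
    x = opt.step(x, 2 * A @ x)
print(abs(x[2]))  # close to 1: smallest eigenvector e_3
```

Swapping `Sphere()` for `Euclidean()` (or any other manifold) reuses the same optimizer unchanged, which is the point of the decoupled design.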

This work proposes a variant of the latent trace norm that helps in learning a non-sparse combination of tensors, and develops a dual framework for solving the low-rank tensor completion problem.

We propose a novel optimization framework for learning a low-rank matrix that is also constrained to lie in a linear subspace. Exploiting duality theory, we present a factorization that…

A novel mirror-descent based algorithm is proposed for learning a shared kernel, constructed from a given set of base kernels, leading to improved generalization across all the tasks.