Publications
Holographic Embeddings of Knowledge Graphs
Learning embeddings of entities and relations is an efficient and versatile method to perform machine learning on relational data such as knowledge graphs. In this work, we propose holographic…
  • Citations: 464
  • Influence: 112
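Holographic embeddings score a knowledge-graph triple by composing entity vectors with circular correlation, which can be computed via the FFT. A minimal NumPy sketch of that scoring operation (function names are illustrative, not from the paper's code):

```python
import numpy as np

def circular_correlation(a, b):
    # [a * b]_k = sum_i a_i * b_{(i+k) mod d},
    # computed in O(d log d) via the FFT.
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

def hole_score(e_s, w_r, e_o):
    # Score of a triple (s, r, o): the relation vector is matched
    # against the correlation of subject and object embeddings.
    return float(w_r @ circular_correlation(e_s, e_o))
```

Circular correlation is non-commutative, so the model can represent asymmetric relations while keeping a single d-dimensional vector per entity.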
Kernels for Vector-Valued Functions: a Review
Kernel methods are among the most popular techniques in machine learning. From a regularization perspective they play a central role in regularization theory as they provide a natural choice for the…
  • Citations: 360
  • Influence: 46
On regularization algorithms in learning theory
In this paper we discuss a relation between Learning Theory and Regularization of linear ill-posed inverse problems. It is well known that Tikhonov regularization can be profitably used in the…
  • Citations: 193
  • Influence: 22
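In the learning setting, Tikhonov regularization amounts to kernel ridge regression. A minimal sketch under a Gaussian kernel (the kernel choice and parameter names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # k(x, x') = exp(-gamma * ||x - x'||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def tikhonov_fit(X, y, lam, gamma=1.0):
    # Tikhonov regularization: solve (K + n*lam*I) alpha = y,
    # then predict with the kernel sections at the training points.
    n = len(X)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xt: gaussian_kernel(Xt, X, gamma) @ alpha
```

The identity matrix here plays the role of the Tikhonov penalty; other spectral filters (cut-off, Landweber) replace the map lambda -> 1/(lambda + n*lam) by a different filter function.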
Less is More: Nyström Computational Regularization
We study Nyström-type subsampling approaches to large-scale kernel methods, and prove learning bounds in the statistical learning setting, where random sampling and high-probability estimates are…
  • Citations: 138
  • Influence: 22
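A sketch of the Nyström-subsampled kernel ridge estimator, under stated assumptions (Gaussian kernel, uniform landmark sampling; names are illustrative):

```python
import numpy as np

def nystrom_krr(X, y, m, lam, gamma=1.0, seed=0):
    # Nystrom computational regularization: restrict the kernel
    # ridge solution to the span of m randomly chosen landmark
    # points, reducing training cost from O(n^3) to O(n m^2).
    rng = np.random.default_rng(seed)
    Xm = X[rng.choice(len(X), size=m, replace=False)]
    k = lambda A, B: np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    Knm, Kmm = k(X, Xm), k(Xm, Xm)
    n = len(X)
    # Normal equations restricted to the landmark subspace.
    beta = np.linalg.solve(Knm.T @ Knm + n * lam * Kmm, Knm.T @ y)
    return lambda Xt: k(Xt, Xm) @ beta
```

The paper's point is that the subsample size m itself acts as a regularization parameter: subsampling is part of the statistical trade-off, not just a computational shortcut.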
Generalization Properties of Learning with Random Features
We study the generalization properties of ridge regression with random features in the statistical learning framework. We show for the first time that $O(1/\sqrt{n})$ learning bounds can be achieved…
  • Citations: 137
  • Influence: 15
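A sketch of ridge regression on random Fourier features approximating a Gaussian kernel (the feature map and hyperparameter names are assumptions for illustration, not the paper's code):

```python
import numpy as np

def rff_ridge(X, y, D=300, lam=1e-4, sigma=0.3, seed=0):
    # Random features: phi(x) = sqrt(2/D) * cos(W x + b) is a
    # D-dimensional randomized approximation of the Gaussian
    # kernel; plain linear ridge regression is run on phi(X).
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], D)) / sigma
    b = rng.uniform(0.0, 2.0 * np.pi, D)
    phi = lambda A: np.sqrt(2.0 / D) * np.cos(A @ W + b)
    P = phi(X)
    w = np.linalg.solve(P.T @ P + lam * np.eye(D), P.T @ y)
    return lambda Xt: phi(Xt) @ w
```

In the paper's analysis the number of features D behaves like a regularization parameter alongside lam: with suitably many features the optimal learning rate is preserved at a fraction of the cost of the exact kernel method.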
On Early Stopping in Gradient Descent Learning
In this paper we study a family of gradient descent algorithms to approximate the regression function from reproducing kernel Hilbert spaces (RKHSs), the family being characterized by a…
  • Citations: 400
  • Influence: 14
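The basic member of such a family can be sketched as gradient descent on the empirical least-squares risk in the RKHS, where the iteration count plays the role of the regularization parameter (the constant step size and stopping rule here are illustrative choices):

```python
import numpy as np

def gd_regression(K, y, step=1.0, t=100):
    # Gradient iteration on the empirical risk: each coefficient
    # is corrected by the residual at its training point. Stopping
    # after t steps regularizes: small t means strong smoothing,
    # large t approaches the unregularized fit.
    alpha = np.zeros_like(y, dtype=float)
    n = len(y)
    for _ in range(t):
        alpha -= (step / n) * (K @ alpha - y)
    return alpha
```

For a kernel with k(x, x) <= 1 the eigenvalues of K/n lie in [0, 1], so a unit step size keeps the residual contraction stable; the bias-variance trade-off is then governed entirely by when the iteration is stopped.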
On Learning with Integral Operators
A large number of learning algorithms, for example, spectral clustering, kernel Principal Components Analysis and many manifold methods are based on estimating eigenvalues and eigenfunctions of…
  • Citations: 137
  • Influence: 13
Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review
The paper reviews and extends an emerging body of theoretical results on deep learning including the conditions under which it can be exponentially better than shallow learning. A class of deep…
  • Citations: 230
  • Influence: 11
Manifold Regularization
In this lecture we introduce a class of learning algorithms, collectively called manifold regularization algorithms, suited for predicting/classifying data embedded in high-dimensional spaces. We…
  • Citations: 90
  • Influence: 11
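One representative algorithm in this class is Laplacian-regularized least squares, where a graph Laplacian built over labeled and unlabeled points adds an intrinsic smoothness penalty to the usual RKHS one. A minimal sketch (kernel, graph weights, and parameter names are assumptions for illustration):

```python
import numpy as np

def laplacian_rls(K, y, labeled, lam_a, lam_i, L):
    # Minimize  sum_{i in labeled} (f(x_i) - y_i)^2
    #         + lam_a * ||f||_K^2 + lam_i * f' L f,   with f = K @ alpha.
    # Setting the gradient to zero and cancelling one factor of K
    # (K assumed positive definite) yields the linear system below.
    n = K.shape[0]
    J = np.zeros((n, n))
    J[labeled, labeled] = 1.0          # diagonal indicator of labeled points
    alpha = np.linalg.solve(J @ K + lam_a * np.eye(n) + lam_i * (L @ K), J @ y)
    return K @ alpha                   # predictions at all n points
```

With lam_i = 0 this reduces to ordinary regularized least squares on the labeled points alone; the Laplacian term is what lets the unlabeled points shape the solution along the data manifold.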
Nonparametric sparsity and regularization
In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our…
  • Citations: 59
  • Influence: 11