Publications
A dual coordinate descent method for large-scale linear SVM
TLDR: This paper presents a novel dual coordinate descent method for linear SVM with L1- and L2-loss functions.
Citations: 877 · Highly influential: 121
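The first entry above describes dual coordinate descent for linear SVMs. As a rough illustration of the idea (not the paper's exact algorithm), the sketch below solves the L1-loss SVM dual min_a ½aᵀQa − eᵀa subject to 0 ≤ a_i ≤ C, where Q_ij = y_i y_j x_iᵀx_j, maintaining w = Σ_i a_i y_i x_i so each coordinate update is O(d). The function name and toy data are illustrative only.

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=50, rng=None):
    """Sketch of dual coordinate descent for an L1-loss linear SVM.

    Minimizes 0.5*a'Qa - e'a over 0 <= a_i <= C with
    Q_ij = y_i y_j x_i.x_j, keeping w = sum_i a_i y_i x_i up to date.
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    Qii = (X * X).sum(axis=1)           # diagonal of Q (y_i^2 = 1)
    for _ in range(epochs):
        for i in rng.permutation(n):
            if Qii[i] == 0:
                continue
            G = y[i] * w.dot(X[i]) - 1.0            # dual gradient, coordinate i
            new_ai = min(max(alpha[i] - G / Qii[i], 0.0), C)  # exact 1-D minimizer, clipped
            w += (new_ai - alpha[i]) * y[i] * X[i]  # incremental update of w
            alpha[i] = new_ai
    return w

# Toy usage: a linearly separable two-class problem
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)
w = dcd_linear_svm(X, y)
pred = np.sign(X @ w)
```

The sketch omits a bias term and shrinking heuristics; it is only meant to show why maintaining w makes each coordinate update cheap.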
A sequential dual method for large-scale multi-class linear SVMs
TLDR: This paper presents a fast sequential dual method for efficient training of direct multi-class formulations of linear Support Vector Machines.
Citations: 152 · Highly influential: 17
Predictive Approaches for Choosing Hyperparameters in Gaussian Processes
TLDR: We propose and investigate predictive approaches based on Geisser's predictive sample reuse (PSR) methodology and the related Stone's cross-validation (CV) methodology.
Citations: 113 · Highly influential: 12
A Sequential Dual Method for Structural SVMs
TLDR: This paper proposes a fast sequential dual method (SDM) for structural SVMs.
Citations: 25 · Highly influential: 3
A distributed block coordinate descent method for training $l_1$ regularized linear classifiers
TLDR: In this paper, we develop a distributed block coordinate descent (DBCD) method that is efficient on distributed platforms in which communication costs are high.
Citations: 28 · Highly influential: 3
Semi-supervised Gaussian Process Ordinal Regression
TLDR: In this work, we propose a novel approach for semi-supervised ordinal regression using Gaussian Processes (GP).
Citations: 16 · Highly influential: 2
A Parallel SGD method with Strong Convergence
TLDR: This paper proposes a novel parallel stochastic gradient descent method in which parallel sets of SGD iterations (each set operating on one node, using the data residing there) find the direction used in each iteration of a batch descent method.
Citations: 13 · Highly influential: 2
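The parallel-SGD entry above describes using per-node SGD runs to produce a direction for an outer batch step. A minimal sketch of that pattern (node parallelism simulated sequentially, least-squares loss, all names and settings assumed, not the paper's actual method):

```python
import numpy as np

def parallel_sgd_step(w, shards, lr=0.05, inner=25, step=1.0, rng=None):
    """One outer iteration (sketch): each simulated 'node' runs SGD on its
    local shard starting from the shared iterate w; the averaged displacement
    serves as a descent direction for a batch-style step."""
    rng = rng or np.random.default_rng(0)
    dirs = []
    for X, y in shards:
        wk = w.copy()
        for _ in range(inner):
            i = rng.integers(len(y))
            r = wk @ X[i] - y[i]       # least-squares residual on one sample
            wk -= lr * r * X[i]        # local SGD step
        dirs.append(wk - w)            # node's proposed displacement
    return w + step * np.mean(dirs, axis=0)

# Toy run: shard a noiseless least-squares problem across two simulated nodes
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
w_true = np.array([1.0, -1.0])
y = X @ w_true
shards = [(X[:50], y[:50]), (X[50:], y[50:])]

w = np.zeros(2)
loss0 = np.mean((X @ w - y) ** 2)
for _ in range(30):
    w = parallel_sgd_step(w, shards, rng=rng)
loss = np.mean((X @ w - y) ** 2)
```

Averaging displacements rather than raw gradients is what lets each node take many cheap local steps between communication rounds.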
Semi-supervised classification using sparse Gaussian process regression
TLDR: In this paper, we propose a new algorithm for solving the semi-supervised binary classification problem using sparse GP regression (GPR) models.
Citations: 4 · Highly influential: 2
Distributed Newton Methods for Deep Neural Networks
TLDR: We propose a novel distributed Newton method for training deep neural networks.
Citations: 10 · Highly influential: 1
A Sparse Nonlinear Classifier Design Using AUC Optimization
TLDR: AUC (Area under the ROC curve) is an important performance measure for applications where the data is highly imbalanced.
Citations: 6 · Highly influential: 1
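The last entry concerns AUC optimization. For context, AUC equals the Wilcoxon-Mann-Whitney statistic: the probability that a randomly chosen positive example is scored above a randomly chosen negative one, with ties counting half. A minimal sketch of computing it directly from that definition (function name assumed):

```python
import numpy as np

def auc(scores, labels):
    """AUC via the Wilcoxon-Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly, ties counting 0.5."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diff = pos[:, None] - neg[None, :]          # all pairwise score differences
    correct = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    return correct / (len(pos) * len(neg))

# A perfect ranking of positives above negatives gives AUC = 1.0
s = np.array([0.9, 0.8, 0.3, 0.1])
l = np.array([1, 1, 0, 0])
```

The O(n_pos * n_neg) pairwise form shown here is fine for illustration; rank-based implementations are preferred at scale.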