Support Vector Regression Machines
This work compares support vector regression (SVR) with a committee regression technique (bagging) based on regression trees, and with ridge regression performed in feature space. It anticipates that SVR will have advantages in high-dimensional spaces because the SVR optimization does not depend on the dimensionality of the input space.
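The distinguishing ingredient of SVR mentioned above is its loss function: residuals inside an ε-tube around the fit cost nothing, and only larger deviations are penalized. A minimal sketch of that loss (the function name and toy values are illustrative, not from the paper):

```python
import numpy as np

def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
    # SVR penalizes only residuals larger than eps (the "tube" around the fit);
    # predictions inside the tube incur zero loss.
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.05, 2.5, 3.0])
losses = epsilon_insensitive_loss(y_true, y_pred)
# first and third predictions fall inside the tube (loss 0);
# the second overshoots by 0.5, giving loss 0.5 - 0.1 = 0.4
```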
Learning to rank using gradient descent
- C. Burges, Tal Shaked, Greg Hullender
- Computer Science, International Conference on Machine Learning
- 7 August 2005
RankNet is introduced: an implementation of these ideas that uses a neural network to model the underlying ranking function. Test results are presented on toy data and on data from a commercial internet search engine.
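RankNet's core idea is pairwise: model the probability that item i should rank above item j as a sigmoid of the score difference, and minimize cross-entropy over labeled pairs. The sketch below uses a linear scorer as a stand-in for the paper's neural network, with made-up toy data:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_ranknet(pairs, dim, lr=0.1, epochs=100):
    """pairs: list of (x_i, x_j) where x_i is labeled more relevant than x_j.

    Modeled probability that i ranks above j: P_ij = sigmoid(s_i - s_j).
    We minimize the cross-entropy -log(P_ij) (target probability 1),
    whose gradient w.r.t. w is -(1 - P_ij) * (x_i - x_j).
    """
    rng = np.random.default_rng(0)
    w = rng.normal(size=dim)
    for _ in range(epochs):
        for x_i, x_j in pairs:
            p = sigmoid((x_i - x_j) @ w)
            w += lr * (1.0 - p) * (x_i - x_j)  # gradient ascent on log P_ij
    return w

# toy data: relevance increases with the first feature
x_a, x_b, x_c = np.array([3.0, 0.1]), np.array([2.0, 0.2]), np.array([1.0, 0.3])
w = train_ranknet([(x_a, x_b), (x_b, x_c), (x_a, x_c)], dim=2)
scores = [x @ w for x in (x_a, x_b, x_c)]  # learned scores respect the pair labels
```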
Advances in kernel methods: support vector learning
Includes chapters such as "Support vector machines for dynamic reconstruction of a chaotic system" by Klaus-Robert Müller et al. and "Pairwise classification and support vector machines" by Ulrich Kressel.
From RankNet to LambdaRank to LambdaMART: An Overview
- C. Burges
- Computer Science
- 23 June 2010
RankNet, LambdaRank, and LambdaMART have proven to be very successful algorithms for solving real-world ranking problems. Because the details are spread across several papers and reports, this report provides a self-contained, detailed, and complete description of them.
A Tutorial on Support Vector Machines for Pattern Recognition
- C. Burges
- Computer Science, Data Mining and Knowledge Discovery
- 1 June 1998
Several arguments supporting the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Adapting boosting for information retrieval measures
- Qiang Wu, C. Burges, K. Svore, Jianfeng Gao
- Computer Science, Information Retrieval (Boston)
- 1 June 2010
This work presents a new ranking algorithm that combines the strengths of two previous methods: boosted tree classification and LambdaRank. It also shows how to find the optimal linear combination of any two rankers, and uses this method to solve the line-search problem exactly during boosting.
Input space versus feature space in kernel-based methods
The geometry of feature space is reviewed, and the connection between feature space and input space is discussed via the question of how, given a vector in feature space, one can find a preimage in input space.
MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text
- Matthew Richardson, C. Burges, Erin Renshaw
- Computer Science, Conference on Empirical Methods in Natural…
- 1 October 2013
MCTest is presented: a freely available set of stories and associated questions intended for research on the machine comprehension of text. It requires machines to answer multiple-choice reading-comprehension questions about fictional stories, directly tackling the high-level goal of open-domain machine comprehension.
Learning to Rank with Nonsmooth Cost Functions
A class of simple, flexible algorithms called LambdaRank is introduced, which avoids these difficulties by working with implicit cost functions; using neural network models, it can be extended to any nonsmooth, multivariate cost function.
Comparing support vector machines with Gaussian kernels to radial basis function classifiers
- B. Schölkopf, K. Sung, V. Vapnik
- Computer Science, IEEE Transactions on Signal Processing
- 1 November 1997
The results show that on the United States Postal Service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system; the SV approach is thus not only theoretically well-founded but also superior in a practical application.