
- Mark D. Reid, Robert C. Williamson
- Journal of Machine Learning Research
- 2011

We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves, and information. We do this by systematically studying integral and variational representations of these objects and in so doing identify their primitives, all of which are related to cost-sensitive binary…

- Novi Quadrianto, Kristian Kersting, Mark D. Reid, Tibério S. Caetano, Wray L. Buntine
- 2009 Ninth IEEE International Conference on Data…
- 2009

Quantile regression refers to the process of estimating the quantiles of a conditional distribution and has many important applications within econometrics and data mining, among other domains. In this paper, we show how to estimate these conditional quantile functions within a Bayes risk minimization framework using a Gaussian process prior. The resulting…
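The quantile (pinball) loss that underlies quantile regression can be sketched as follows. This is the standard pinball loss, not the paper's Gaussian-process formulation, and the function names are illustrative:

```python
import numpy as np

def pinball_loss(y, pred, tau):
    """Pinball (quantile) loss: its expectation is minimised by the tau-quantile."""
    diff = y - pred
    return float(np.mean(np.maximum(tau * diff, (tau - 1) * diff)))

# Minimising the empirical pinball loss over a grid of candidate predictions
# recovers (approximately) the empirical tau-quantile of the sample.
rng = np.random.default_rng(0)
y = rng.normal(size=1000)
grid = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(y, p, 0.9) for p in grid]
best = float(grid[int(np.argmin(losses))])
```

With `tau = 0.5` the pinball loss reduces to half the absolute error, whose minimiser is the median; asymmetric `tau` tilts the penalty to target other quantiles.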

- Mark D. Reid, Robert C. Williamson
- Journal of Machine Learning Research
- 2010

We study losses for binary classification and class probability estimation and extend the understanding of them from margin losses to general composite losses, which are the composition of a proper loss with a link function. We characterise when margin losses can be proper composite losses, and explicitly show how to determine a symmetric loss in full from half…
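The notion of a composite loss (a proper loss composed with a link function) can be illustrated with the familiar binary case: the logistic margin loss is exactly the log loss composed with the inverse of the logit link. The names below are illustrative, not taken from the paper:

```python
import math

def log_loss(y, p):
    """Proper loss for class probability estimation: y in {0, 1}, p in (0, 1)."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def sigmoid(v):
    """Inverse of the logit link, mapping a real-valued score to a probability."""
    return 1.0 / (1.0 + math.exp(-v))

def logistic_margin_loss(y_pm, v):
    """Margin form of the same loss: y_pm in {-1, +1}, v a real-valued score."""
    return math.log1p(math.exp(-y_pm * v))

# Composite view: the logistic margin loss equals the proper log loss
# evaluated at sigmoid(v), i.e. proper loss composed with the link's inverse.
v = 0.7
assert abs(log_loss(1, sigmoid(v)) - logistic_margin_loss(+1, v)) < 1e-12
```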

- Mark D. Reid, Robert C. Williamson
- ICML
- 2009

We present tight surrogate regret bounds for the class of proper (*i.e.*, Fisher consistent) losses. The bounds generalise the margin-based bounds due to Bartlett et al. (2006). The proof uses Taylor's theorem and leads to new representations for loss and regret and a simple proof of the integral representation of proper losses. We also present a…

- Elodie Vernet, Robert C. Williamson, Mark D. Reid
- NIPS
- 2011

We consider loss functions for multiclass prediction problems. We show when a multiclass loss can be expressed as a “proper composite loss”, which is the composition of a proper loss and a link function. We extend existing results for binary losses to multiclass losses. We subsume results on “classification calibration” by relating it to properness. We…

- Peng Sun, Mark D. Reid, Jie Zhou
- ICML
- 2012

This paper presents an improvement to model learning when using multi-class LogitBoost for classification. Motivated by the statistical view, LogitBoost can be seen as additive tree regression. Two important factors in this setting are: 1) coupled classifier output due to a sum-to-zero constraint, and 2) the dense Hessian matrices that arise when computing…
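The sum-to-zero constraint mentioned above reflects a standard redundancy in softmax class scores; a minimal sketch of that redundancy, not of the paper's LogitBoost implementation:

```python
import numpy as np

def softmax(f):
    """Map class scores to probabilities (shift-invariant by construction)."""
    z = np.exp(f - f.max())
    return z / z.sum()

f = np.array([1.2, -0.3, 0.7])

# Adding any constant to all scores leaves the class probabilities unchanged,
# so one degree of freedom in the scores is redundant...
assert np.allclose(softmax(f), softmax(f + 5.0))

# ...which is why LogitBoost-style methods pin it down by constraining the
# scores to sum to zero (here, by centering).
f_centered = f - f.mean()
assert abs(f_centered.sum()) < 1e-12
assert np.allclose(softmax(f), softmax(f_centered))
```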

- Mark D. Reid, Robert C. Williamson
- COLT
- 2009

We generalise the classical Pinsker inequality, which relates variational divergence to Kullback-Leibler divergence, in two ways: we consider arbitrary f-divergences in place of the KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best-possible inequality for this doubly generalised…
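For reference, the classical Pinsker inequality being generalised here, stated with the variational divergence $V(P,Q) = \int |p - q| \, d\mu$ for densities $p, q$:

```latex
V(P, Q) \;\le\; \sqrt{2\,\mathrm{KL}(P \,\|\, Q)}
```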

- Tim van Erven, Peter Grünwald, Nishant A. Mehta, Mark D. Reid, Robert C. Williamson
- Journal of Machine Learning Research
- 2015

The speed with which a learning algorithm converges as it is presented with more data is a central problem in machine learning: a fast rate of convergence means less data is needed for the same level of performance. The pursuit of fast rates in online and statistical learning has led to the discovery of many conditions in learning theory under which fast…

- Qinfeng Shi, Mark D. Reid, Tibério S. Caetano, Anton van den Hengel, Zhenhua Wang
- IEEE Transactions on Pattern Analysis and Machine…
- 2015

We propose a novel hybrid loss for multiclass and structured prediction problems that is a convex combination of a log loss for Conditional Random Fields (CRFs) and a multiclass hinge loss for Support Vector Machines (SVMs). We provide a sufficient condition for when the hybrid loss is Fisher consistent for classification. This condition depends on a…
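A minimal binary-classification sketch of the convex-combination idea; the paper itself treats the multiclass and structured cases, and the names here are illustrative:

```python
import math

def crf_log_loss(y, v):
    """Log loss in margin form (binary case): y in {-1, +1}, v a real score."""
    return math.log1p(math.exp(-y * v))

def hinge_loss(y, v):
    """SVM hinge loss: y in {-1, +1}, v a real score."""
    return max(0.0, 1.0 - y * v)

def hybrid_loss(y, v, alpha):
    """Convex combination of the two losses, with mixing weight alpha in [0, 1]."""
    return alpha * crf_log_loss(y, v) + (1 - alpha) * hinge_loss(y, v)
```

At `alpha = 1` the hybrid reduces to the CRF-style log loss and at `alpha = 0` to the SVM hinge loss; intermediate values interpolate between the two.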

We strengthen recent connections between prediction markets and learning by showing that a natural class of market makers can be understood as performing stochastic mirror descent when trader demands are sequentially drawn from a fixed distribution. This provides new insights into how market prices (and price paths) may be interpreted as a summary of the…