Greedy function approximation: A gradient boosting machine
- J. Friedman
- Computer Science
- 1 October 2001
A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
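As a rough illustration of the least-squares case described above: each stage fits a small regression tree to the current residuals (the negative gradient of squared-error loss) and adds a shrunken copy of it to the model. This is a minimal sketch, not Friedman's reference implementation; the function names and parameter defaults are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boost_ls(X, y, n_stages=100, learning_rate=0.1, max_depth=2):
    """Least-squares gradient boosting sketch: start from the mean, then
    repeatedly fit a small tree to the residuals and add a shrunken copy."""
    f0 = float(np.mean(y))                 # initial constant model
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_stages):
        residual = y - pred                # negative gradient of 0.5*(y - f)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_gradient_boost(f0, trees, X, learning_rate=0.1):
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```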
Classification and Regression Trees
- L. Breiman, J. Friedman, R. Olshen, C. J. Stone
- Computer Science
- 1 September 1984
This chapter discusses tree classification in the context of medicine, where right-sized trees and honest estimates are considered, and Bayes rules and partitions are used as guides to optimal pruning.
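CART's minimal cost-complexity pruning is exposed directly by scikit-learn, which can serve as a quick illustration of the "right-sized tree" idea; the dataset and random seeds below are arbitrary choices for the sketch.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The pruning path enumerates the nested subtrees of minimal cost-complexity.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)
for alpha in path.ccp_alphas:
    tree = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f}  leaves={tree.get_n_leaves()}  "
          f"test acc={tree.score(X_te, y_te):.3f}")
```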
Regularization Paths for Generalized Linear Models via Coordinate Descent
- J. Friedman, T. Hastie, R. Tibshirani
- Computer Science, Journal of Statistical Software
- 2 February 2010
In comparative timings, the new algorithms are considerably faster than competing methods, can handle large problems, and deal efficiently with sparse features.
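The core update in the Gaussian (lasso) case is a univariate soft-threshold applied cyclically to each coefficient. Below is a minimal sketch of that update, assuming a centered response; it is illustrative only and omits the paper's screening, warm starts, and path-wise machinery.

```python
import numpy as np

def soft_threshold(z, gamma):
    """S(z, gamma) = sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1.
    Assumes y is centered; columns of X need not be standardized."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n        # (1/n) X_j' X_j
    resid = y - X @ beta                       # full residual, kept current
    for _ in range(n_sweeps):
        for j in range(p):
            # rho = (1/n) X_j' (partial residual excluding coordinate j)
            rho = X[:, j] @ resid / n + col_norm[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_norm[j]
            resid += X[:, j] * (beta[j] - new_bj)
            beta[j] = new_bj
    return beta
```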
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition
- T. Hastie, R. Tibshirani, J. Friedman
- Computer Science, Springer Series in Statistics
- 1 March 2005
This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorization, and spectral clustering.
The Elements of Statistical Learning
- T. Hastie, R. Tibshirani, J. Friedman
- Computer Science
- 2001
Sparse inverse covariance estimation with the graphical lasso
- J. Friedman, T. Hastie, R. Tibshirani
- Computer Science, Biostatistics
- 1 July 2008
Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
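scikit-learn ships an implementation of the graphical lasso; a short usage sketch follows, with synthetic data (the penalty weight and data shape are arbitrary choices, not values from the paper).

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))     # 200 observations of 10 variables

model = GraphicalLasso(alpha=0.1)      # alpha: l1 penalty on the precision matrix
model.fit(X)
precision = model.precision_           # sparse estimate of the inverse covariance
print((np.abs(precision) > 1e-8).sum(), "nonzero entries")
```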
Additive logistic regression: A statistical view of boosting (Special Invited Paper)
- J. Friedman, T. Hastie, R. Tibshirani
- Computer Science
- 1 April 2000
This work shows that the seemingly mysterious phenomenon of boosting can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood, and develops more direct approximations that exhibit nearly identical results to boosting.
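The paper's central observation is that discrete AdaBoost performs stagewise additive modeling under exponential loss. A compact sketch of that procedure is below; the stump depth and round count are arbitrary, and this is an illustration of the idea rather than the paper's own code.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    """Discrete AdaBoost with stumps; y must take values in {-1, +1}.
    Each round is one stagewise additive step under exponential loss."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)     # misclassified points gain weight
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    score = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(score)
```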
Stochastic gradient boosting
- J. Friedman
- Computer Science
- 28 February 2002
Regularized Discriminant Analysis
- J. Friedman
- Computer Science
- 1 March 1989
Alternatives to the usual maximum likelihood estimates for the covariance matrices are proposed, characterized by two parameters, the values of which are customized to individual situations by jointly minimizing a sample-based estimate of future misclassification risk.
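The two parameters act as successive shrinkages of each class covariance: one blends toward the pooled estimate, the other toward a scaled identity. A sketch of that estimator follows (the function name is mine); in the paper, both parameters are chosen by minimizing a cross-validated estimate of misclassification risk.

```python
import numpy as np

def rda_covariance(S_k, S_pooled, lam, gamma):
    """Regularized class covariance in the spirit of Friedman (1989):
    lam in [0, 1] shrinks the class estimate toward the pooled covariance;
    gamma in [0, 1] then shrinks toward a scaled identity."""
    p = S_k.shape[0]
    S = (1.0 - lam) * S_k + lam * S_pooled
    return (1.0 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)
```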
...