Semantic Scholar
Loss functions for classification
Known as: Logistic loss
In machine learning and mathematical optimization, loss functions for classification are computationally feasible loss functions representing the…
Source: Wikipedia
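As a quick illustration of the logistic loss named above, here is a minimal sketch (not from the page itself; the function name and the margin convention y ∈ {−1, +1} are illustrative assumptions):

```python
import math

def logistic_loss(margin: float) -> float:
    """Logistic loss ln(1 + e^{-m}) of the margin m = y * f(x),
    for a label y in {-1, +1} and a real-valued classifier score f(x)."""
    if margin >= 0:
        # exp(-margin) <= 1 here, so direct evaluation is safe
        return math.log1p(math.exp(-margin))
    # rewrite ln(1 + e^{-m}) = -m + ln(1 + e^{m}) to avoid overflow
    # when the margin is a large negative number
    return -margin + math.log1p(math.exp(margin))
```

A correct, confident prediction (large positive margin) incurs near-zero loss, while a confident mistake (large negative margin) is penalized roughly linearly — the convexity and smoothness that make this loss computationally feasible for empirical risk minimization.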
Related topics (12 relations)
Cross-validation (statistics)
Deep learning
Empirical risk minimization
Gradient descent
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
2018
Online Anomaly Detection With Minimax Optimal Density Estimation in Nonstationary Environments
Kaan Gokcesu, S. Kozat
IEEE Transactions on Signal Processing, 2018. Corpus ID: 20173525
We introduce a truly online anomaly detection algorithm that sequentially processes data to detect anomalies in time series. In…
Highly Cited, 2018
Convolutional neural network-based multimodal image fusion via similarity learning in the shearlet domain
Haithem Hermessi, Olfa Mourali, E. Zagrouba
Neural Computing & Applications, 2018. Corpus ID: 4301257
Recently, deep learning has shown effectiveness in multimodal image fusion. In this paper, we propose a fusion method for CT…
2017
On the problem of on-line learning with log-loss
Yaniv Fogel, M. Feder
International Symposium on Information Theory, 2017. Corpus ID: 572176
In this paper we consider the problem of on-line learning with respect to the logarithmic loss, where the learner provides a…
Highly Cited, 2015
Online Stochastic Linear Optimization under One-bit Feedback
Lijun Zhang, Tianbao Yang, Rong Jin, Y. Xiao, Zhi-Hua Zhou
International Conference on Machine Learning, 2015. Corpus ID: 12620738
In this paper, we study a special bandit setting of online stochastic linear optimization, where only one bit of information is…
Highly Cited, 2015
A Learning-Rate Schedule for Stochastic Gradient Methods to Matrix Factorization
Wei-Sheng Chin, Yong Zhuang, Yu-Chin Juan, Chih-Jen Lin
Pacific-Asia Conference on Knowledge Discovery…, 2015. Corpus ID: 17495221
Stochastic gradient methods are effective for solving matrix factorization problems. However, it is well known that the performance…
2014
Dropout Training for Support Vector Machines
Ning Chen, Jun Zhu, Jianfei Chen, Bo Zhang
AAAI Conference on Artificial Intelligence, 2014. Corpus ID: 2946429
Dropout and other feature noising schemes have shown promising results in controlling over-fitting by artificially corrupting…
2012
Probabilistic estimation while ignoring outliers
2012. Corpus ID: 16523868
Logistic regression learns a parameterized mapping from feature vectors to probability vectors and is, for example, central to…
2011
A Primal-Dual Convergence Analysis of Boosting
Matus Telgarsky
Journal of Machine Learning Research, 2011. Corpus ID: 8650577
Boosting combines weak learners into a predictor with low empirical risk. Its dual constructs a high entropy distribution upon…
2011
Efficiently Learning a Distance Metric for Large Margin Nearest Neighbor Classification
Kyoungup Park, Chunhua Shen, Zhihui Hao, Junae Kim
AAAI Conference on Artificial Intelligence, 2011. Corpus ID: 7208464
We consider the problem of learning a Mahalanobis distance metric for improving nearest neighbor classification. Our work is…
Highly Cited, 2008
A quasi-Newton approach to non-smooth convex optimization
Jin Yu, S. Vishwanathan, Simon Günter, N. Schraudolph
International Conference on Machine Learning, 2008. Corpus ID: 47157025
We extend the well-known BFGS quasi-Newton method and its limited-memory variant LBFGS to the optimization of non-smooth convex…