Corpus ID: 15105362

Learning Halfspaces and Neural Networks with Random Initialization

@article{Zhang2015LearningHA,
  title={Learning Halfspaces and Neural Networks with Random Initialization},
  author={Yuchen Zhang and Jason D. Lee and Martin J. Wainwright and Michael I. Jordan},
  journal={arXiv preprint arXiv:1511.07948},
  year={2015}
}
We study non-convex empirical risk minimization for learning halfspaces and neural networks. For loss functions that are $L$-Lipschitz continuous, we present algorithms to learn halfspaces and multi-layer neural networks that achieve arbitrarily small excess risk $\epsilon>0$. The time complexity is polynomial in the input dimension $d$ and the sample size $n$, but exponential in the quantity $(L/\epsilon^2)\log(L/\epsilon)$. These algorithms run multiple rounds of random initialization…
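The abstract's central idea — run several rounds of random initialization, train each with stochastic gradient descent on a Lipschitz surrogate loss, and keep the round with the lowest empirical risk — can be sketched roughly as follows. This is a minimal illustration for the halfspace case only; the logistic surrogate, step size, round count, and all function names are assumptions for the sketch, not details taken from the paper.

```python
import math
import random

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def empirical_risk(w, X, y):
    """Average logistic loss over the sample.

    The logistic loss is a 1-Lipschitz surrogate for the 0-1 loss on
    halfspaces; it stands in here for the generic L-Lipschitz loss.
    """
    return sum(math.log1p(math.exp(-yi * dot(w, xi)))
               for xi, yi in zip(X, y)) / len(y)

def sgd_round(X, y, rng, steps=2000, lr=0.1):
    """One round: draw a random initial weight vector, then run SGD."""
    d = len(X[0])
    # Random initialization: Gaussian weights, scaled by 1/sqrt(d).
    w = [rng.gauss(0.0, 1.0) / math.sqrt(d) for _ in range(d)]
    for _ in range(steps):
        i = rng.randrange(len(X))
        margin = y[i] * dot(w, X[i])
        # Gradient of the logistic loss at this example.
        coef = -y[i] / (1.0 + math.exp(margin))
        for j in range(d):
            w[j] -= lr * coef * X[i][j]
    return w

def learn_halfspace(X, y, rounds=10, seed=0):
    """Run several independently initialized SGD rounds; keep the best."""
    rng = random.Random(seed)
    best_w, best_risk = None, float("inf")
    for _ in range(rounds):
        w = sgd_round(X, y, rng)
        risk = empirical_risk(w, X, y)
        if risk < best_risk:
            best_w, best_risk = w, risk
    return best_w
```

On linearly separable data, the best-of-several-rounds predictor should classify most of the training sample correctly; the paper's actual guarantees (arbitrarily small excess risk with the stated complexity) rely on a more careful analysis than this sketch conveys.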
