ℓ1-regularized Neural Networks are Improperly Learnable in Polynomial Time

@inproceedings{Zhang20161regularizedNN,
  title={ℓ1-regularized Neural Networks are Improperly Learnable in Polynomial Time},
  author={Yuchen Zhang and Jason D. Lee and Michael I. Jordan},
  booktitle={ICML},
  year={2016}
}
We study the improper learning of multi-layer neural networks. Suppose that the neural network to be learned has k hidden layers and that the ℓ1-norm of the incoming weights of any neuron is bounded by L. We present a kernel-based method, such that with probability at least 1 − δ, it learns a predictor whose generalization error is at most ε worse than that of the neural network. The sample complexity and the time complexity of the presented method are polynomial in the input dimension and in (1…
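
The paper's own kernel construction is not reproduced in this abstract, but the notion of improper learning it relies on can be illustrated with a small stand-in: the target is a one-hidden-layer network whose incoming weight vectors have bounded ℓ1-norm, while the learned predictor comes from a different hypothesis class (here, an assumed generic kernel ridge regressor with an RBF kernel; the kernel, activation, and all constants below are illustrative assumptions, not the paper's algorithm).

# Minimal sketch of improper learning: fit a kernel predictor to data
# generated by a small neural network with l1-bounded incoming weights.
# The RBF kernel is a stand-in; it is NOT the kernel from the paper.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

d, n_train, n_test = 10, 2000, 500   # input dimension and sample sizes (assumed)
L = 2.0                              # assumed l1-bound on incoming weights

# Target network: one hidden layer; each neuron's incoming weight vector is
# rescaled so its l1-norm equals L, matching the boundedness assumption.
W = rng.normal(size=(5, d))
W = L * W / np.abs(W).sum(axis=1, keepdims=True)
v = rng.normal(size=5) / 5.0

def target_net(X):
    # Smooth, bounded activation; the paper's activation class is not assumed here.
    return np.tanh(X @ W.T) @ v

X_train = rng.uniform(-1, 1, size=(n_train, d))
X_test = rng.uniform(-1, 1, size=(n_test, d))
y_train = target_net(X_train)
y_test = target_net(X_test)

# Improper learner: the output predictor is a kernel machine, not a network.
learner = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3)
learner.fit(X_train, y_train)

print("test MSE of kernel predictor:",
      mean_squared_error(y_test, learner.predict(X_test)))

Running the sketch simply shows a kernel predictor approximating a network-generated target; the paper's contribution is the polynomial sample and time complexity guarantee, which this toy example does not establish.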