Identification of Shallow Neural Networks by Fewest Samples
@article{Fornasier2018IdentificationOS,
  title   = {Identification of Shallow Neural Networks by Fewest Samples},
  author  = {M. Fornasier and J. Vyb{\'i}ral and I. Daubechies},
  journal = {ArXiv},
  year    = {2018},
  volume  = {abs/1804.01592}
}
We address the uniform approximation of sums of ridge functions $\sum_{i=1}^m g_i(a_i\cdot x)$ on ${\mathbb R}^d$, representing the shallowest form of feed-forward neural network, from a small number of query samples, under mild smoothness assumptions on the functions $g_i$ and near-orthogonality of the ridge directions $a_i$. The sample points are randomly generated and universal, in the sense that the sampled queries on those points will allow the proposed recovery algorithms to …
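The abstract's model, a sum of ridge functions $f(x)=\sum_{i=1}^m g_i(a_i\cdot x)$, has the useful property that every gradient $\nabla f(x)=\sum_i g_i'(a_i\cdot x)\,a_i$ lies in the span of the ridge directions. A minimal sketch of this idea (not the paper's algorithm; the profiles $g_i$, the dimensions, and the finite-difference/PCA recovery below are illustrative assumptions) estimates gradients at random query points and recovers the active subspace by SVD:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: d = 10, m = 2 ridge functions with
# orthonormal directions a_1, a_2 (rows of A).
d, m = 10, 2
A = np.linalg.qr(rng.standard_normal((d, m)))[0].T

def f(x):
    # f(x) = g_1(a_1 . x) + g_2(a_2 . x) with smooth example profiles g_i.
    t = A @ x
    return np.tanh(t[0]) + np.sin(t[1])

# Gradients of f lie in span{a_1, a_2}; estimate them by central
# finite differences of queried samples at random points.
eps, n_samples = 1e-5, 50
G = np.zeros((n_samples, d))
for k in range(n_samples):
    x = rng.standard_normal(d)
    for j in range(d):
        e = np.zeros(d)
        e[j] = eps
        G[k, j] = (f(x + e) - f(x - e)) / (2 * eps)

# The top-m right singular vectors of the gradient matrix span
# (approximately) span{a_1, ..., a_m}.
_, s, Vt = np.linalg.svd(G, full_matrices=False)
est_span = Vt[:m]

# Projecting the true directions onto the estimated span loses
# almost nothing when recovery succeeds.
residual = np.linalg.norm(A - (A @ est_span.T) @ est_span)
print(residual)  # small, e.g. below 1e-3
```

Recovering the span is only the first step; identifying the individual directions $a_i$ and profiles $g_i$ within that span is the harder problem the paper's recovery algorithms address.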
6 Citations
- Robust and Resource Efficient Identification of Two Hidden Layer Neural Networks. ArXiv, 2019.
- Subspace power method for symmetric tensor decomposition and generalized PCA. ArXiv, 2019.
- Stable Recovery of Entangled Weights: Towards Robust Identification of Deep Neural Networks from Minimal Samples. ArXiv, 2021.