Training Neural Networks is ∃R-complete

@article{Abrahamsen2021TrainingNN,
  title={Training Neural Networks is ∃R-complete},
  author={Mikkel Abrahamsen and Linda Kleist and Tillmann Miltzow},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.09798}
}
Given a neural network, training data, and a threshold, it was known that it is NP-hard to find weights for the neural network such that the total error is below the threshold. We determine the algorithmic complexity of this fundamental problem precisely by showing that it is ∃R-complete. This means that the problem is equivalent, up to polynomial-time reductions, to deciding whether a system of polynomial equations and inequalities with integer coefficients and real unknowns has a solution…
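
To make the statement concrete, the two decision problems can be written side by side. The LaTeX sketch below uses notation of our own choosing (a network function f_w with m real weights, squared error as the loss, a threshold δ); the paper's exact error measure and formulation may differ.

% Neural-network training as a decision problem (notation ours; the
% paper's exact error measure may differ). Given an architecture f_w
% with m real weights, data points (x_1, y_1), ..., (x_n, y_n), and a
% threshold \delta, decide whether
\[
  \exists\, w \in \mathbb{R}^m \;:\; \sum_{i=1}^{n} \bigl( f_w(x_i) - y_i \bigr)^2 \le \delta .
\]
% The existential theory of the reals (ETR): decide whether a system of
% polynomial equations and inequalities with integer coefficients has a
% real solution, for instance
\[
  \exists\, x, y \in \mathbb{R} \;:\; x^2 + y^2 < 1 \;\wedge\; 4xy > 1 .
\]
% (Satisfiable, e.g. x = y = 0.6.) \exists\mathbb{R}-completeness means
% the training problem and ETR reduce to each other in polynomial time.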
