
Corpus ID: 231979073

Training Neural Networks is ∃R-complete

@article{Abrahamsen2021TrainingNN,
title={Training Neural Networks is $\exists\mathbb{R}$-complete},
author={Mikkel Abrahamsen and Linda Kleist and Tillmann Miltzow},
journal={arXiv preprint},
year={2021},
volume={abs/2102.09798}
}

Given a neural network, training data, and a threshold, it was known to be NP-hard to find weights for the neural network such that the total error is below the threshold. We determine the algorithmic complexity of this fundamental problem precisely, by showing that it is ∃R-complete. This means that the problem is equivalent, up to polynomial-time reductions, to deciding whether a system of polynomial equations and inequalities with integer coefficients and real unknowns has a solution.
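To make the decision problem concrete, here is a minimal sketch (not from the paper; the network shape, data, and threshold are hypothetical). Checking a *given* weight vector against the threshold is straightforward; the ∃R-hardness shown in the paper concerns the existential question of whether *any* real-valued weights achieve error at most the threshold.

```python
# Hedged illustration of the training decision problem:
# "Do there exist weights w such that total_error(w, data) <= threshold?"
# Below we only VERIFY one candidate w, which is the easy direction;
# deciding existence over all real w is the ∃R-complete problem.

def relu(x):
    return max(0.0, x)

def network(w, x):
    # Hypothetical 1-hidden-unit ReLU network: y = w2 * relu(w1*x + b1) + b2
    w1, b1, w2, b2 = w
    return w2 * relu(w1 * x + b1) + b2

def total_error(w, data):
    # Sum of squared errors over the training set
    return sum((network(w, x) - y) ** 2 for x, y in data)

data = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
w = (1.0, 0.0, 1.0, 0.0)   # acts as the identity on nonnegative inputs
threshold = 1e-9

print(total_error(w, data) <= threshold)  # True for this candidate
```

Note the connection to the existential theory of the reals (ETR): `total_error(w, data) <= threshold` is itself a system of polynomial inequalities in the unknowns `w` (once the ReLU's piecewise cases are branched on), which is exactly the kind of formula an ∃R instance consists of.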