# Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function

@article{Langer2020AnalysisOT,
  title={Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function},
  author={Sophie Langer},
  journal={J. Multivar. Anal.},
  year={2020},
  volume={182},
  pages={104695}
}

## 15 Citations

### On the universal consistency of an over-parametrized deep neural network estimate learned by gradient descent

- Computer Science
- 2022

It is shown that, given a suitable random initialization of the network, a suitably small stepsize for the gradient descent, and a number of gradient descent steps slightly larger than the reciprocal of the stepsize, the estimate is universally consistent in the sense that its expected L2 error converges to zero for all distributions of the data in which the response variable is square integrable.

### Estimation of a regression function on a manifold by fully connected deep neural networks

- Computer Science, Mathematics
- Journal of Statistical Planning and Inference
- 2022

### VC dimension of partially quantized neural networks in the overparametrized regime

- Computer Science
- ICLR
- 2022

It is shown that HANNs can have VC dimension significantly smaller than the number of weights while being highly expressive, and that empirical risk minimization over HANNs in the overparametrized regime achieves the minimax rate for classification with Lipschitz posterior class probability.

### Analysis of convolutional neural network image classifiers in a rotationally symmetric model

- Computer Science, Mathematics
- 2022

Under suitable structural and smoothness assumptions on the a posteriori probability, it is shown that least squares plug-in classifiers based on convolutional neural networks are able to circumvent the curse of dimensionality in binary image classification if a resolution-dependent error term is neglected.

### Research on improved convolutional wavelet neural network

- Computer Science
- Scientific Reports
- 2021

A wavelet neural network (WNN) is implemented that can overcome the shortcomings of BPNN and RBFNN and achieve better performance, and the proposed wavelet-based convolutional neural network (WCNN) can reduce the mean squared error and the error rate of a CNN, meaning that WCNN attains better maximum precision than CWNN.

### Statistical theory for image classification using deep convolutional neural networks with cross-entropy loss

- Computer Science
- 2020

Under suitable assumptions on the smoothness and structure of the a posteriori probability, it is shown that these estimates achieve a rate of convergence which is independent of the dimension of the image.

### Music Genre Classification Based on Deep Learning

- Computer Science
- Mobile Information Systems
- 2022

Experimental results show that the proposed method can effectively improve the accuracy of music classification and is helpful for music channel classification.

### On the Rate of Convergence of a Classifier Based on a Transformer Encoder

- Computer Science
- IEEE Transactions on Information Theory
- 2022

It is shown that this Transformer classifier is able to circumvent the curse of dimensionality provided the a posteriori probability satisfies a suitable hierarchical composition model.

### A Nonlinear Autoregressive Exogenous (NARX) Neural Network Model for the Prediction of Timestamp Influence on Bitcoin Value

- Business
- IEEE Access
- 2021

Simulation analysis indicates that the performance variation of the Bitcoin digital currency is strongly influenced by its transaction timestamp, with a prediction accuracy of 96%; the contribution of this research lies in the finding that specific Bitcoin transaction events repeat themselves over and over again.

### A Material Removal Prediction Method Based On Multi-Scale Attention Mechanism

- Materials Science
- 2022

The exact removal of material in abrasive belt grinding determines the final machining quality of the workpiece. However, it is difficult to determine the removal state of materials in actual…

## References

Showing 1–10 of 32 references.

### On the rate of convergence of fully connected very deep neural network regression estimates

- Computer Science
- The Annals of Statistics
- 2021

This paper shows that it is possible to get similar results also for least squares estimates based on simple fully connected neural networks with ReLU activation functions, based on new approximation results concerning deep neural networks.

### Nonparametric regression using deep neural networks with ReLU activation function

- Computer Science
- The Annals of Statistics
- 2020

The theory suggests that for nonparametric regression, scaling the network depth with the sample size is natural and the analysis gives some insights into why multilayer feedforward neural networks perform well in practice.

### Estimation of a Function of Low Local Dimensionality by Deep Neural Networks

- Computer Science
- IEEE Transactions on Information Theory
- 2022

It is shown that the least squares regression estimates using DNNs are able to achieve dimensionality reduction in case that the regression function has locally low dimensionality.

### Convergence rates for single hidden layer feedforward networks

- Computer Science, Mathematics
- Neural Networks
- 1994

### Universal approximation bounds for superpositions of a sigmoidal function

- Computer Science
- IEEE Transactions on Information Theory
- 1993

The approximation rate and the parsimony of the parameterization of the networks are shown to be advantageous in high-dimensional settings, and the integrated squared approximation error cannot be made smaller than order 1/n^{2/d} uniformly for functions satisfying the same smoothness assumption.
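The contrast this entry refers to can be sketched as follows; this is a rough statement of the known rates with constants omitted, assuming a target f whose Fourier transform has finite first moment C_f and an approximant f_n built from n sigmoidal units:

```latex
% Dimension-free upper bound for superpositions of n sigmoidal units:
\[
  \|f - f_n\|_{L_2(\mu)}^2 \;\le\; \frac{(2C_f)^2}{n},
\]
% versus the lower bound for approximation from any fixed
% n-dimensional linear space of basis functions, uniformly over
% the same smoothness class in dimension d:
\[
  \sup_{f} \, \inf_{f_n} \, \|f - f_n\|_{L_2(\mu)}^2
  \;\gtrsim\; n^{-2/d}.
\]
```

The upper rate is free of the dimension d, while the linear-approximation lower bound degrades exponentially in d, which is the sense in which the parameterization is parsimonious in high-dimensional settings.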

### Universal Approximation Using Feedforward Neural Networks: A Survey of Some Existing Methods, and Some New Results

- Computer Science
- Neural Networks
- 1998

### Approximation and estimation bounds for artificial neural networks

- Computer Science
- Machine Learning
- 2004

The analysis involves Fourier techniques for the approximation error, metric entropy considerations for the estimation error, and a calculation of the index of resolvability of minimum complexity estimation of the family of networks.

### On deep learning as a remedy for the curse of dimensionality in nonparametric regression

- Computer Science
- The Annals of Statistics
- 2019

It is shown that least squares estimates based on multilayer feedforward neural networks are able to circumvent the curse of dimensionality in nonparametric regression.

### The phase diagram of approximation rates for deep neural networks

- Computer Science
- NeurIPS
- 2020

It is proved that using both sine and ReLU activations theoretically leads to very fast, nearly exponential approximation rates, thanks to the emerging capability of the network to implement efficient lookup operations.

### Neural Network Learning - Theoretical Foundations

- Computer Science
- 1999

The authors explain the role of scale-sensitive versions of the Vapnik Chervonenkis dimension in large margin classification, and in real prediction, and discuss the computational complexity of neural network learning.