Consistency of Extreme Learning Machines and Regression under Non-Stationarity and Dependence for ML-Enhanced Moving Objects
@inproceedings{Steland2020ConsistencyOE,
  title={Consistency of Extreme Learning Machines and Regression under Non-Stationarity and Dependence for ML-Enhanced Moving Objects},
  author={Ansgar Steland},
  year={2020}
}
Supervised learning by extreme learning machines, i.e. neural networks with randomly drawn hidden weights, is studied under a non-stationary spatial-temporal sampling design, which especially addresses settings where an autonomous object moving through a non-stationary spatial environment collects and analyzes data. The stochastic model allows for spatial heterogeneity and weak dependence. As efficient and computationally cheap learning methods, (unconstrained) least squares, ridge regression and `s…
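The methods named in the abstract can be illustrated with a minimal sketch of an extreme learning machine trained by ridge regression. This is not the paper's estimator or its spatial-temporal sampling design; the toy data, network size, and regularization parameter below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy sine over a 1-D coordinate
# (a stand-in for measurements collected along a trajectory).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# Extreme learning machine: hidden weights and biases are drawn at
# random and never trained; only the linear readout is fitted.
n_hidden = 50
W = rng.standard_normal((X.shape[1], n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)  # random feature map

# Ridge-regularized least-squares readout (lam = 0 gives plain OLS).
lam = 1e-3
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because only the readout layer is estimated, training reduces to a single regularized least-squares solve, which is what makes these methods computationally cheap.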