# Accounting for uncertainty of non-linear regression models by divisive data resorting

@inproceedings{Polar2021AccountingFU, title={Accounting for uncertainty of non-linear regression models by divisive data resorting}, author={Andrew Polar and Michael Poluektov}, year={2021} }

This paper focuses on building models of stochastic systems with aleatoric uncertainty. The nature of the considered systems is such that identical inputs can result in different outputs, i.e. the output is a random variable. This paper suggests a novel algorithm of boosted ensemble training of multiple models for obtaining a probability distribution of an individual output as a function of a system input. The deterministic component in the ensemble can be an arbitrarily-chosen regression…
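The ensemble idea can be sketched roughly as follows. This is an illustration of the general approach, not the paper's exact divisive-data-resorting algorithm: the bin count, ensemble size, and polynomial stand-in models below are all assumptions. Records are binned by input, outputs within each bin are sorted, and contiguous rank bands are dealt to the ensemble members, so that member k approximately learns the k-th quantile of the conditional output distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stochastic system: identical inputs give different outputs.
def system(x):
    return np.sin(3 * x) + rng.normal(0.0, 0.3, size=x.shape)

n_models = 9   # number of ensemble members (assumed)
n_bins = 50    # number of input bins used for resorting (assumed)

x = rng.uniform(-1, 1, n_models * n_bins * 4)
y = system(x)

# Sort records by input so each bin covers a contiguous input range.
order = np.argsort(x)
x, y = x[order], y[order]

# Within each bin, sort outputs and assign contiguous rank bands to the
# ensemble members, so member k sees outputs near the k-th quantile.
per_model_x = [[] for _ in range(n_models)]
per_model_y = [[] for _ in range(n_models)]
bin_size = len(x) // n_bins
for b in range(n_bins):
    xb = x[b * bin_size:(b + 1) * bin_size]
    yb = y[b * bin_size:(b + 1) * bin_size]
    rank = np.argsort(yb)
    band = bin_size // n_models
    for k in range(n_models):
        pick = rank[k * band:(k + 1) * band]
        per_model_x[k].extend(xb[pick])
        per_model_y[k].extend(yb[pick])

# Each deterministic member is an arbitrary regressor; a low-degree
# polynomial stands in here.
models = [np.polyfit(per_model_x[k], per_model_y[k], deg=5)
          for k in range(n_models)]

# At a given input, the ensemble outputs form an empirical distribution
# of the stochastic output rather than a single point estimate.
x0 = 0.5
samples = np.sort([np.polyval(c, x0) for c in models])
print(samples)
```

The key departure from plain bagging is the within-bin resorting step: without it, each member would average the output noise away and the ensemble spread would reflect only estimation variance, not the aleatoric distribution.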

## References

Showing 1-10 of 22 references.

Linearly Combining Density Estimators via Stacking

- Computer Science, Machine Learning
- 2004

Stacking is used to form a linear combination of finite mixture model and kernel density estimators for non-parametric multivariate density estimation; it outperforms other strategies such as choosing the single best model based on cross-validation, combining with uniform weights, and even using the single best model chosen by “cheating” and examining the test set.
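A minimal sketch of the stacking idea for density estimators: fit several candidate estimators, then choose combination weights by held-out likelihood. The Gaussian kernel estimators, bandwidths, and grid search below are assumptions for illustration, not the cited paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bimodal data the candidate estimators will model.
data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(2, 1.0, 300)])
rng.shuffle(data)
train, held = data[:400], data[400:]

def kde(points, bandwidth):
    # Simple Gaussian kernel density estimator.
    def density(x):
        z = (x[:, None] - points[None, :]) / bandwidth
        return np.exp(-0.5 * z**2).sum(axis=1) / (
            len(points) * bandwidth * np.sqrt(2 * np.pi))
    return density

# Two candidate estimators with different bandwidths (assumed choices).
d1, d2 = kde(train, 0.2), kde(train, 1.0)

# Stacking: pick the convex combination maximising held-out log-likelihood.
best_w, best_ll = 0.0, -np.inf
for w in np.linspace(0, 1, 101):
    ll = np.log(w * d1(held) + (1 - w) * d2(held) + 1e-300).sum()
    if ll > best_ll:
        best_w, best_ll = w, ll
print("stacked weight on narrow-bandwidth KDE:", best_w)
```

Because the weights are fit on data the candidate estimators never saw, the combination cannot be worse (on held-out likelihood) than either estimator alone, which is the essence of the stacking argument.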

A decision-theoretic generalization of on-line learning and an application to boosting

- Computer Science, EuroCOLT
- 1995

The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and it is shown that the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases but applicable to a considerably more general class of learning problems.

Bagging predictors

- Computer Science, Machine Learning
- 2004

Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy.
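A minimal bagging sketch: each predictor is trained on a bootstrap resample of the data and the predictions are averaged. The polynomial base learner here is an arbitrary stand-in for the trees and subset-selection regressions used in the cited tests.

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy quadratic data.
x = rng.uniform(-1, 1, 200)
y = x**2 + rng.normal(0, 0.1, 200)

def bagged_predict(x0, n_models=25, deg=4):
    """Average predictions of models fit on bootstrap resamples."""
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(x), len(x))   # sample with replacement
        coeffs = np.polyfit(x[idx], y[idx], deg)
        preds.append(np.polyval(coeffs, x0))
    return np.mean(preds)

print(bagged_predict(0.5))  # roughly x0**2 = 0.25
```

Averaging over resamples reduces the variance of an unstable base learner, which is where the accuracy gains reported for trees and subset selection come from.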

Bayesian Learning via Stochastic Dynamics

- Computer Science, NIPS
- 1992

Bayesian methods avoid overfitting and poor generalization by averaging the outputs of many networks with weights sampled from the posterior distribution given the training data, by simulating a stochastic dynamical system that has the posterior as its stationary distribution.
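The posterior-averaging idea can be sketched with a toy one-parameter model. A plain Metropolis sampler stands in here for the paper's stochastic dynamics; the flat prior, noise level, and all settings below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Data from a linear model with known Gaussian noise (sd 0.2).
x = rng.uniform(0, 1, 50)
y = 2.0 * x + rng.normal(0, 0.2, 50)

def log_post(w):
    # Log-posterior over the slope w, flat prior.
    return -0.5 * np.sum((y - w * x) ** 2) / 0.2**2

# Metropolis sampling from the posterior.
w, chain = 0.0, []
for _ in range(5000):
    prop = w + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(w):
        w = prop
    chain.append(w)

# Bayesian prediction: average model outputs over posterior samples
# (after discarding burn-in) instead of committing to one point estimate.
samples = np.array(chain[1000:])
print(samples.mean())
```

Averaging predictions over posterior samples, rather than using a single maximum-likelihood fit, is what gives the overfitting resistance described above.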

A deep machine learning algorithm for construction of the Kolmogorov-Arnold representation

- Computer Science, Mathematics, Eng. Appl. Artif. Intell.
- 2021

The Strength of Weak Learnability

- Computer Science, Machine Learning
- 2005

In this paper, a method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy, and it is shown that these two notions of learnability are equivalent.
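The weak-to-strong conversion can be illustrated with an AdaBoost-style loop over decision stumps (a later concrete instance of this boosting idea, not the construction in the cited paper; the one-dimensional data and stump learner are assumptions). Each weak learner only needs to beat chance under the current weights; reweighting concentrates effort on previously misclassified points.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy 1-D threshold concept with labels in {-1, +1}.
x = rng.uniform(-1, 1, 300)
labels = np.sign(x + 0.1 * rng.normal(size=300))
labels[labels == 0] = 1

weights = np.full(len(x), 1.0 / len(x))
stumps, alphas = [], []
for _ in range(20):
    # Weak learner: best single-threshold classifier under current weights.
    best = None
    for t in np.linspace(-1, 1, 41):
        for sg in (1, -1):
            pred = sg * np.sign(x - t)
            pred[pred == 0] = sg
            err = weights[pred != labels].sum()
            if best is None or err < best[0]:
                best = (err, t, sg)
    err, t, sg = best
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    pred = sg * np.sign(x - t)
    pred[pred == 0] = sg
    # Reweight: misclassified points gain weight, correct ones lose it.
    weights *= np.exp(-alpha * labels * pred)
    weights /= weights.sum()
    stumps.append((t, sg))
    alphas.append(alpha)

def strong(xq):
    # Strong classifier: weighted vote of the weak learners.
    s = sum(a * sg * np.sign(xq - t) for (t, sg), a in zip(stumps, alphas))
    return np.sign(s)

acc = (strong(x) == labels).mean()
print("training accuracy:", acc)
```

The equivalence result means that as long as each stump's weighted error stays below 1/2, repeating this loop drives the combined error arbitrarily low.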

A Review of Uncertainty Quantification in Deep Learning: Techniques, Applications and Challenges

- Computer Science, Inf. Fusion
- 2021

Analysis of Kolmogorov's superposition theorem and its implementation in applications with low and high dimensional data

- Computer Science, Mathematics
- 2008

This dissertation analyzes Kolmogorov’s superposition theorem for high dimensions, provides a thorough discussion of the proof and of its numerical implementation in dimension two, and presents high-dimensional extensions with complete and detailed proofs.
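The superposition theorem discussed here has a compact statement: every continuous function $f$ on $[0,1]^n$ can be written as

$$
f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),
$$

where the outer functions $\Phi_q$ and the inner functions $\phi_{q,p}$ are continuous functions of a single variable. Numerical implementations of the theorem amount to constructing approximations of these one-dimensional functions.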

Modelling of Non-linear Control Systems using the Discrete Urysohn Operator

- Mathematics, Computer Science, J. Frankl. Inst.
- 2020