• Mathematics, Computer Science
  • Published in ArXiv 2019

Noise Regularization for Conditional Density Estimation

@article{Rothfuss2019NoiseRF,
  title={Noise Regularization for Conditional Density Estimation},
  author={Jonas Rothfuss and F{\'a}bio Ferreira and Simon B{\"o}hm and Simon Walther and Maxim Ulrich and Tamim Asfour and Andreas Krause},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.08982}
}
Modelling statistical relationships beyond the conditional mean is crucial in many settings. Conditional density estimation (CDE) aims to learn the full conditional probability density from data. Though highly expressive, neural network based CDE models can suffer from severe over-fitting when trained with the maximum likelihood objective. Due to the inherent structure of such models, classical regularization approaches in the parameter space are rendered ineffective. To address this issue, we…
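The core idea the abstract describes — regularizing in data space rather than parameter space by perturbing training samples with noise before evaluating the likelihood — can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation; `log_prob_fn`, `noise_std`, and the toy conditional model are placeholder assumptions.

```python
import numpy as np

def noise_regularized_nll(x, y, log_prob_fn, noise_std=0.1, rng=None):
    """Negative log-likelihood for one batch with noise regularization:
    add fresh Gaussian noise to both the conditional variable x and the
    target y before evaluating the model's conditional log-density.
    Resampling the noise every batch smooths the empirical distribution
    and discourages the estimator from collapsing onto the data points."""
    rng = np.random.default_rng(rng)
    x_noisy = x + rng.normal(0.0, noise_std, size=x.shape)
    y_noisy = y + rng.normal(0.0, noise_std, size=y.shape)
    # Standard maximum-likelihood objective, evaluated on perturbed data.
    return -np.mean(log_prob_fn(x_noisy, y_noisy))

# Toy conditional model: log N(y; 0, 1) up to a constant, ignoring x.
toy_log_prob = lambda x, y: -0.5 * np.sum(y ** 2, axis=1)

x_batch = np.zeros((8, 14))  # 14-dimensional conditional variable
y_batch = np.zeros((8, 1))   # one-dimensional target

nll_plain = noise_regularized_nll(x_batch, y_batch, toy_log_prob,
                                  noise_std=0.0, rng=0)
nll_noisy = noise_regularized_nll(x_batch, y_batch, toy_log_prob,
                                  noise_std=0.1, rng=0)
```

With `noise_std=0.0` this reduces to the plain maximum-likelihood objective; a positive `noise_std` evaluates the same objective on jittered samples, which acts as the data-space regularizer.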


Citations

Publications citing this paper.

PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees


References

Publications referenced by this paper.

Machine learning - a probabilistic perspective

  • Kevin P. Murphy
  • Computer Science
  • Adaptive computation and machine learning series
  • 2012

Nonparametric Econometrics: Theory and Practice


Variational Inference with Normalizing Flows


Using additive noise in back-propagation training


On Estimation of a Probability Density Function and Mode

