Corpus ID: 53294779

Neural Likelihoods via Cumulative Distribution Functions

  • Pawel M. Chilinski, Ricardo Silva
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • We leverage neural networks as universal approximators of monotonic functions to build a parameterization of conditional cumulative distribution functions. By a modification of backpropagation, applied both to parameters and to outputs, we show that we are able to build black-box density estimators that are competitive against recently proposed models, while avoiding assumptions concerning the base distribution in a mixture model. That is, it makes no use of parametric models as building blocks…
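A minimal sketch of the core idea described in the abstract (the toy architecture and all names below are illustrative, not the paper's actual model): constrain a small network's weights on the input y to be positive so its output is monotone in y, rescale the output to a valid CDF, and differentiate with respect to y to obtain the density.

```python
import numpy as np

def softplus(x):
    return np.logaddexp(0.0, x)  # numerically stable log(1 + e^x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MonotoneCDF:
    """Toy 1-D CDF model: one hidden layer whose weights on y are made
    positive via softplus, so the output is monotone non-decreasing in y."""

    def __init__(self, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=hidden)  # raw input->hidden weights
        self.b = rng.normal(size=hidden)  # hidden biases
        self.c = rng.normal(size=hidden)  # raw hidden->output weights

    def _forward(self, y):
        w, v = softplus(self.a), softplus(self.c)
        h = sigmoid(np.outer(y, w) + self.b)  # (n, hidden), monotone in y
        return w, v, h

    def cdf(self, y):
        w, v, h = self._forward(y)
        # As y -> -inf, h -> 0 and sigmoid(h @ v) -> 1/2; as y -> +inf,
        # h -> 1 and sigmoid(h @ v) -> sigmoid(sum(v)).  Rescale to [0, 1].
        lo, hi = 0.5, sigmoid(v.sum())
        return (sigmoid(h @ v) - lo) / (hi - lo)

    def pdf(self, y):
        # Density is dF/dy, obtained here by the chain rule; in the paper
        # this role is played by backpropagating with respect to the input.
        w, v, h = self._forward(y)
        s = sigmoid(h @ v)
        lo, hi = 0.5, sigmoid(v.sum())
        return s * (1 - s) * ((h * (1 - h) * w) @ v) / (hi - lo)

model = MonotoneCDF()
ys = np.linspace(-5.0, 5.0, 201)
F, p = model.cdf(ys), model.pdf(ys)
```

Fitting such a model would maximize the sum of log pdf values over observations by gradient ascent on (a, b, c); the positivity reparameterization keeps F monotone throughout training, which is what lets the derivative be read as a density.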



    Publications referenced by this paper:
    • The Neural Autoregressive Distribution Estimator
    • MADE: Masked Autoencoder for Distribution Estimation
    • A Deep and Tractable Density Estimator
    • Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
    • Density estimation using Real NVP
    • NICE: Non-linear Independent Components Estimation
    • Transformation Autoregressive Networks
    • Neural Autoregressive Flows
    • Probability density estimation using artificial neural networks