Corpus ID: 17236010

Posterior convergence rates in non-linear latent variable models

Debdeep Pati, Anirban Bhattacharya, David B. Dunson
arXiv: Statistics Theory
Non-linear latent variable models have become increasingly popular in a variety of applications. However, there has been little study on theoretical properties of these models. In this article, we study rates of posterior contraction in univariate density estimation for a class of non-linear latent variable models where unobserved U(0,1) latent variables are related to the response variables via a random non-linear regression with an additive error. Our approach relies on characterizing the… 
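The model summarized above — responses generated as a random non-linear function of U(0,1) latent variables plus additive error — can be sketched as follows. This is a minimal illustration, not the paper's construction: the squared-exponential kernel, length scale, and noise level are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gp_path(grid, length_scale=0.2):
    """Draw one GP sample path with a squared-exponential kernel (an assumption)."""
    d = grid[:, None] - grid[None, :]
    K = np.exp(-0.5 * (d / length_scale) ** 2) + 1e-8 * np.eye(len(grid))
    return rng.multivariate_normal(np.zeros(len(grid)), K)

grid = np.linspace(0.0, 1.0, 200)
mu = sample_gp_path(grid)            # one random non-linear regression function
u = rng.uniform(size=5000)           # unobserved U(0,1) latent variables
y = np.interp(u, grid, mu) + rng.normal(scale=0.1, size=u.size)  # observed responses
```

The induced marginal density of `y` is what the contraction-rate analysis concerns.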

Latent factor density regression models

The proposed approach is based on a novel extension of the recently proposed latent variable density estimation model in which the response variables conditioned on the predictors are modeled as unknown functions of uniformly distributed latent variables and the predictors with an additive Gaussian error.

Bayesian nonparametric modeling and theory for complex data

The dissertation focuses on solving some important theoretical and methodological problems associated with Bayesian modeling of infinite dimensional ‘objects’, popularly called nonparametric Bayes.

Rates of convergence for nonparametric estimation of singular distributions using generative adversarial networks

The convergence rate of a GAN type estimator with respect to the Wasserstein metric is found to be faster than that obtained by likelihood approaches, which provides insights into why GAN approaches perform better in many real problems.

Some Recent Advances in Non- and Semiparametric Bayesian Modeling with Copulas, Mixtures, and Latent Variables

Some Recent Advances in Non- and Semiparametric Bayesian Modeling with Copulas, Mixtures, and Latent Variables, by Jared S. Murray, Department of Statistical Science, Duke University.

Density Regression with Bayesian Additive Regression Trees

This work considers a continuous latent variable model in general covariate spaces, which it calls DR-BART, and proves that the posterior induced by the model concentrates quickly around true generative functions that are sufficiently smooth.

Bayesian Factor Analysis for Inference on Interactions

A latent factor joint model is proposed, which includes shared factors in both the predictor and response components while assuming conditional independence, and a Bayesian approach to inference is proposed under this Factor analysis for INteractions (FIN) framework.

Single Factor Transformation Priors for Density Regression

A flexible class of priors based on random nonlinear functions of a uniform latent variable with an additive residual is proposed, related to Gaussian process latent variable models proposed in the machine learning literature.

Rates of contraction of posterior distributions based on Gaussian process priors

The rate of contraction of the posterior distribution based on sampling from a smooth density model when the prior models the log density as a (fractionally integrated) Brownian motion is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process.

Posterior convergence rates of Dirichlet mixtures at smooth densities

We study the rates of convergence of the posterior distribution for Bayesian density estimation with Dirichlet mixtures of normal distributions as the prior. The true density is assumed to be twice
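One draw from a truncated Dirichlet mixture-of-normals prior of the kind analyzed above can be sketched via stick-breaking. The truncation level, normal base measure, and common component scale are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

alpha, K = 1.0, 50                    # DP concentration and truncation (illustrative)
v = rng.beta(1.0, alpha, size=K)
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))  # stick-breaking weights
mu = rng.normal(0.0, 2.0, size=K)     # component means from a normal base measure
sigma = 0.5                           # common component scale (illustrative)

grid = np.linspace(-12.0, 12.0, 1000)
dens = (w[:, None]
        * np.exp(-0.5 * ((grid[None, :] - mu[:, None]) / sigma) ** 2)
        / (sigma * np.sqrt(2.0 * np.pi))).sum(axis=0)  # one random density draw
```

Repeated draws of `dens` visualize the prior whose posterior contraction is being studied.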

Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models

A novel probabilistic interpretation of principal component analysis (PCA) is presented, based on a Gaussian process latent variable model (GP-LVM) and related to popular spectral techniques such as kernel PCA and multidimensional scaling.

Gaussian Process Latent Variable Models for Visualisation of High Dimensional Data

A new underlying probabilistic model for principal component analysis (PCA) is introduced; if the prior's covariance function constrains the mappings to be linear, the model is equivalent to PCA, and it is extended by considering less restrictive covariance functions that allow non-linear mappings.
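The linear special case noted above — where the model reduces to PCA — can be illustrated with the maximum-likelihood solution of probabilistic PCA, whose loadings span the same subspace as the classical principal directions. The toy data and latent dimension here are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))   # correlated toy data
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / len(Xc)               # sample covariance
evals, evecs = np.linalg.eigh(S)      # eigenvalues in ascending order
q = 2                                 # latent dimension (illustrative)
U_q = evecs[:, -q:]                   # top-q principal directions (classical PCA)
sigma2 = evals[:-q].mean()            # ML noise variance: mean of discarded eigenvalues
W_ml = U_q @ np.diag(np.sqrt(evals[-q:] - sigma2))  # PPCA ML loadings (up to rotation)
```

The columns of `W_ml` lie exactly in the span of `U_q`, which is the sense in which the linear-kernel model recovers PCA.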

The Logistic Normal Distribution for Bayesian, Nonparametric, Predictive Densities

This article models the common density of an exchangeable sequence of observations by a generalization of the process derived from a logistic transform of a Gaussian process. The support of
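A minimal sketch of the logistic transform of a Gaussian process as a random density — exponentiate a GP sample path and normalize on a grid. The squared-exponential kernel and its length scale are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

grid = np.linspace(0.0, 1.0, 400)
d = grid[:, None] - grid[None, :]
K = np.exp(-0.5 * (d / 0.15) ** 2) + 1e-8 * np.eye(len(grid))  # SE kernel (assumed)
w = rng.multivariate_normal(np.zeros(len(grid)), K)  # Gaussian process sample path
dx = grid[1] - grid[0]
p = np.exp(w)
p /= p.sum() * dx                     # logistic (exponentiate-and-normalize) transform
```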

Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities

We study the rates of convergence of the maximum likelihood estimator (MLE) and posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of

Towards a Faster Implementation of Density Estimation With Logistic Gaussian Process Priors

It is established that imputation results in quite accurate computation of the Bayes estimate for a logistic Gaussian process prior for density estimation, and simulation studies show that accuracy and high speed can be combined.

Adaptive Bayesian density estimation with location-scale mixtures

Posterior concentration rates are derived for Bayesian density estimators based on finite location-scale mixtures of exponential power distributions, with priors based on these mixture models.


An exact and easily computable expression for the mean integrated squared error (MISE) of the kernel estimator of a general normal mixture density is given for Gaussian kernels of arbitrary order.
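Rather than the exact MISE formula of the paper, a Monte Carlo approximation of the integrated squared error for a Gaussian kernel estimator on a normal mixture can be sketched; averaging `ise` over many samples would estimate the MISE. Sample size, bandwidth, and mixture parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def mixture_pdf(x):                   # two-component normal mixture (illustrative)
    return 0.5 * normal_pdf(x, -1.5, 0.5) + 0.5 * normal_pdf(x, 1.5, 0.5)

n, h = 400, 0.25                      # sample size and bandwidth (illustrative)
comp = rng.integers(0, 2, size=n)
x = rng.normal(np.where(comp == 0, -1.5, 1.5), 0.5)   # draw from the mixture

grid = np.linspace(-6.0, 6.0, 1000)
dx = grid[1] - grid[0]
kde = normal_pdf(grid[:, None], x[None, :], h).mean(axis=1)       # Gaussian KDE
ise = ((kde - mixture_pdf(grid)) ** 2).sum() * dx  # ISE for this one sample
```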