A comparison of variational approximations for fast inference in mixed logit models

@article{Depraetere2017ACO,
  title={A comparison of variational approximations for fast inference in mixed logit models},
  author={Nicolas Depraetere and Martina Vandebroek},
  journal={Computational Statistics},
  year={2017},
  volume={32},
  pages={93--125}
}
Variational Bayesian methods aim to address some of the weaknesses (computation time, storage costs and convergence monitoring) of mainstream Markov chain Monte Carlo based inference, at the cost of a biased but more tractable approximation to the posterior distribution. We investigate the performance of variational approximations in the context of the mixed logit model, which is one of the most widely used models for discrete choice data. A typical treatment using the variational Bayesian methodology…
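For context, a minimal sketch of the model and objective in standard notation (generic notation, not quoted from the paper): the mixed logit probability that decision maker n chooses alternative j integrates a multinomial logit kernel over a mixing distribution f(\beta \mid \theta),

\[ P(y_n = j \mid X_n, \theta) = \int \frac{\exp(x_{nj}^\top \beta)}{\sum_{k=1}^{J} \exp(x_{nk}^\top \beta)} \, f(\beta \mid \theta) \, d\beta , \]

and variational Bayes sidesteps MCMC by choosing a tractable family q and maximizing the evidence lower bound

\[ \log p(y) \;\ge\; \mathbb{E}_{q}\big[\log p(y, \beta, \theta)\big] - \mathbb{E}_{q}\big[\log q(\beta, \theta)\big], \]

so that inference becomes an optimization problem rather than a sampling problem.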
Citations

Variational Bayesian Inference for Mixed Logit Models with Unobserved Inter- and Intra-Individual Heterogeneity
This paper derives a VB method for posterior inference in mixed logit models with unobserved inter- and intra-individual heterogeneity and demonstrates that VB can be a fast, scalable and accurate alternative to MSL and MCMC estimation, especially in applications in which fast predictions are paramount.
Bayesian Estimation of Mixed Multinomial Logit Models: Advances and Simulation-Based Evaluations
Several VB methods for MMNL are extended to admit utility specifications with both fixed and random utility parameters; the results suggest that all VB variants perform as well as MCMC and MSLE at prediction and at recovering all model parameters, with the exception of the covariance matrix of the multivariate normal mixing distribution.
Mixed Variational Inference
  • Nikolaos Gianniotis
  • 2019 International Joint Conference on Neural Networks (IJCNN)
  • 2019
This work proposes a method that combines the Laplace approximation with the variational approach, retaining applicability to non-conjugate models, posterior correlations, and a reduced number of free variational parameters.
Semi-Parametric Hierarchical Bayes Estimates of New Yorkers' Willingness to Pay for Features of Shared Automated Vehicle Services
This paper compares the performance of the MVN, F-MON and DP-MON mixing distributions using simulated data and real data sourced from a stated choice study on preferences for SAV services in New York City, and shows that the DP-MON mixing distribution provides superior fit to the data and performs at least as well as the competing methods at out-of-sample prediction.
A Dirichlet process mixture model of discrete choice: Comparisons and a case study on preferences for shared automated vehicles
This work empirically validates the model framework in a case study on motorists' route choice preferences and finds that the proposed Dirichlet process mixture model of discrete choice outperforms a latent class MNL model and mixed MNL models with common parametric mixing distributions in terms of both in-sample fit and out-of-sample predictive ability.
Learning Document Embeddings Along With Their Uncertainties
This paper presents the Bayesian subspace multinomial model (Bayesian SMM), a generative log-linear model that learns to represent documents as Gaussian distributions, thereby encoding uncertainty in their covariances.
Discrete Choice Analysis with Machine Learning Capabilities
This paper discusses the capabilities that are essential to models applied in policy analysis settings and the limitations of directly applying off-the-shelf machine learning methodologies in such settings, and identifies an area where machine learning paradigms can be leveraged.
Variational approximation for importance sampling
Numerical results show that using a variational approximation as the proposal distribution can improve the performance of importance sampling and sequential importance sampling.
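To illustrate the idea (a minimal sketch, not the paper's implementation: the skew-normal target is a toy stand-in, and q_mean and q_sd are hypothetical values standing in for an ELBO-optimized Gaussian fit), a variational approximation can serve as the proposal in self-normalized importance sampling:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def log_target(x):
    # Toy unnormalized "posterior": a skew-normal log-density.
    return stats.skewnorm.logpdf(x, a=4.0)

# Pretend these came from a Gaussian variational fit to the target
# (hypothetical values for illustration only).
q_mean, q_sd = 0.8, 0.7

x = rng.normal(q_mean, q_sd, size=10_000)                   # draws from the proposal q
log_w = log_target(x) - stats.norm.logpdf(x, q_mean, q_sd)  # log importance weights
w = np.exp(log_w - log_w.max())                             # stabilize before normalizing
w /= w.sum()                                                # self-normalized weights

post_mean = np.sum(w * x)     # IS estimate of the posterior mean E[x]
ess = 1.0 / np.sum(w ** 2)    # effective sample size diagnostic
print(f"E[x] ~= {post_mean:.3f}, ESS ~= {ess:.0f} of {x.size}")

Unlike reading off the variational mean directly, the importance weights correct for the mismatch between q and the exact posterior, at the price of some variance when the proposal's tails are too light.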
An online updating method for time-varying preference learning
A new online-updating model is proposed that can accurately and efficiently estimate an individual's preferences from their discrete choices, and it achieves the highest accuracy in preference learning and behavior prediction.
Scalable and Accurate Variational Bayes for High-Dimensional Binary Regression Models
State-of-the-art methods for Bayesian inference on regression models with binary responses are either computationally impractical or inaccurate in high dimensions. To cover this gap we propose a…

References

Showing 1-10 of 44 references
Variational Inference for Large-Scale Models of Discrete Choice
Discrete choice models are commonly used by applied statisticians in numerous fields, such as marketing, economics, finance, and operations research. When agents in discrete choice models are assumed…
Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors
This is the first time that a fully variational Bayesian treatment of multiclass GP classification has been developed without having to resort to additional explicit approximations to the non-Gaussian likelihood term.
Efficient Bounds for the Softmax Function and Applications to Approximate Inference in Hybrid models
The softmax link is used in many probabilistic models dealing with both discrete and continuous data. However, efficient Bayesian inference for this type of model is still an open problem due to the…
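The flavour of such bounds can be made concrete. One well-known inequality of this type bounds the softmax normalizer by a sum of logistic terms: for any \alpha \in \mathbb{R},

\[ \log \sum_{k=1}^{K} e^{x_k} \;\le\; \alpha + \sum_{k=1}^{K} \log\big(1 + e^{x_k - \alpha}\big), \]

which follows from \( \sum_k e^{x_k - \alpha} \le \prod_k \big(1 + e^{x_k - \alpha}\big) \). Each logistic term can in turn be bounded by a quadratic in x_k (see the Jaakkola-Jordan bound quoted further down), yielding a surrogate that is conjugate to Gaussian priors.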
Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations
Structured additive regression models are perhaps the most commonly used class of models in statistical applications. It includes, among others, (generalized) linear models, (generalized) additive…
Gaussian Variational Approximate Inference for Generalized Linear Mixed Models
Variational approximation methods have become a mainstay of contemporary machine learning methodology, but currently have little presence in statistics. We devise an effective variational…
Bayesian parameter estimation via variational methods
It is shown that an accurate variational transformation can be used to obtain a closed-form approximation to the posterior distribution of the parameters, thereby yielding an approximate posterior predictive model.
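The variational transformation referred to here is, in its standard form, a quadratic lower bound on the logistic log-likelihood: writing \( \sigma(x) = 1/(1 + e^{-x}) \), for any variational parameter \xi > 0,

\[ \log \sigma(x) \;\ge\; \log \sigma(\xi) + \frac{x - \xi}{2} - \lambda(\xi)\,\big(x^2 - \xi^2\big), \qquad \lambda(\xi) = \frac{1}{4\xi} \tanh\!\Big(\frac{\xi}{2}\Big), \]

with equality at x = \pm\xi. Because the bound is quadratic in x, a Gaussian prior remains conjugate and the approximate posterior is available in closed form, which is what makes the approximate posterior predictive model tractable.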
Comparing different sampling schemes for approximating the integrals involved in the efficient design of stated choice experiments
The semi-Bayesian approach for constructing efficient stated choice designs requires the evaluation of the design selection criterion value over numerous draws taken from the prior parameter…
Variational Message Passing
Variational Message Passing is introduced, a general-purpose algorithm for applying variational inference to Bayesian networks; it can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation.
Inadequacy of interval estimates corresponding to variational Bayesian approximations
It is shown that the covariance matrices from variational Bayes approximations are normally 'too small' compared with those of the maximum likelihood estimator, so that the resulting interval estimates for the parameters will be unrealistically narrow.
Discrete Choice Methods with Simulation
Discrete Choice Methods with Simulation by Kenneth Train has been available in the second edition since 2009 and contains two additional chapters, one on endogenous regressors and one on the expectation-maximization (EM) algorithm.