# A comparison of variational approximations for fast inference in mixed logit models

```bibtex
@article{Depraetere2017ACO,
  title   = {A comparison of variational approximations for fast inference in mixed logit models},
  author  = {Nicolas Depraetere and Martina Vandebroek},
  journal = {Computational Statistics},
  year    = {2017},
  volume  = {32},
  pages   = {93--125}
}
```

Variational Bayesian methods aim to address some of the weaknesses (computation time, storage costs and convergence monitoring) of mainstream Markov chain Monte Carlo based inference at the cost of a biased but more tractable approximation to the posterior distribution. We investigate the performance of variational approximations in the context of the mixed logit model, which is one of the most used models for discrete choice data. A typical treatment using the variational Bayesian methodology…
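For context, the quantity the paper approximates is the standard mixed logit choice probability: a logit probability integrated over a mixing distribution for the random coefficients. In generic notation (not taken from the paper), with attributes $x_{nj}$ for alternative $j$ faced by individual $n$ and mixing density $f(\beta \mid \theta)$:

$$
P_{nj} = \int \frac{\exp\!\left(x_{nj}^{\top}\beta\right)}{\sum_{k=1}^{J}\exp\!\left(x_{nk}^{\top}\beta\right)} \, f(\beta \mid \theta) \, d\beta
$$

This integral has no closed form, which is why simulation (MSL), MCMC, or variational approximations are needed.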

#### 10 Citations

Variational Bayesian Inference for Mixed Logit Models with Unobserved Inter- and Intra-Individual Heterogeneity

- Mathematics, Economics
- 2019

This paper derives a VB method for posterior inference in mixed logit models with unobserved inter- and intra-individual heterogeneity and demonstrates that VB can be a fast, scalable and accurate alternative to MSL and MCMC estimation, especially in applications in which fast predictions are paramount.

Bayesian Estimation of Mixed Multinomial Logit Models: Advances and Simulation-Based Evaluations

- Computer Science, Mathematics
- ArXiv
- 2019

This paper extends several VB methods for MMNL to utility specifications with both fixed and random parameters; simulation evidence suggests that all VB variants perform as well as MCMC and MSLE at prediction and at recovering all model parameters, with the exception of the covariance matrix of the multivariate normal mixing distribution.

Mixed Variational Inference

- Mathematics, Computer Science
- 2019 International Joint Conference on Neural Networks (IJCNN)
- 2019

This work proposes a method that combines the Laplace approximation with the variational approach and retains applicability to non-conjugate models, preserves posterior correlations, and uses a reduced number of free variational parameters.

Semi-Parametric Hierarchical Bayes Estimates of New Yorkers' Willingness to Pay for Features of Shared Automated Vehicle Services

- Computer Science, Economics
- 2019

This paper compares the performance of the MVN, F-MON and DP-MON mixing distributions using simulated data and real data sourced from a stated choice study on preferences for SAV services in New York City, and shows that the DP-MON mixing distribution provides superior fit to the data and performs at least as well as the competing methods at out-of-sample prediction.

A Dirichlet process mixture model of discrete choice: Comparisons and a case study on preferences for shared automated vehicles

- Computer Science, Mathematics
- 2018

This work empirically validates the model framework in a case study on motorists' route choice preferences and finds that the proposed Dirichlet process mixture model of discrete choice outperforms a latent class MNL model and mixed MNL models with common parametric mixing distributions in terms of both in-sample fit and out-of-sample predictive ability.

Learning Document Embeddings Along With Their Uncertainties

- Computer Science
- IEEE/ACM Transactions on Audio, Speech, and Language Processing
- 2020

This paper presents the Bayesian subspace multinomial model (Bayesian SMM), a generative log-linear model that learns to represent documents as Gaussian distributions, thereby encoding their uncertainty in the covariance.

Discrete Choice Analysis with Machine Learning Capabilities

- Computer Science, Economics
- ArXiv
- 2021

This paper discusses capabilities that are essential to models applied in policy analysis settings and the limitations of direct applications of off-the-shelf machine learning methodologies to such settings, and identifies an area where machine learning paradigms can be leveraged.

Variational approximation for importance sampling

- Computer Science
- Comput. Stat.
- 2021

Numerical results show that using variational approximation as the proposal can improve the performance of importance sampling and sequential importance sampling.
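The idea of using a fitted approximation as an importance sampling proposal can be sketched in a few lines. The snippet below (an illustrative sketch, not code from the cited paper) implements self-normalized importance sampling for a posterior mean, with a Gaussian proposal standing in for the variational approximation; `log_target` only needs the target density up to a constant:

```python
import numpy as np

def snis_mean(log_target, proposal_mean, proposal_std, n_draws=10_000, seed=0):
    """Self-normalized importance sampling estimate of E_target[x],
    using a univariate Gaussian proposal (e.g. a fitted variational
    approximation). `log_target` is the unnormalized log target density."""
    rng = np.random.default_rng(seed)
    x = rng.normal(proposal_mean, proposal_std, size=n_draws)
    # log proposal density q(x) for a N(proposal_mean, proposal_std^2)
    log_q = (-0.5 * ((x - proposal_mean) / proposal_std) ** 2
             - np.log(proposal_std * np.sqrt(2 * np.pi)))
    # log importance weights: log p(x) - log q(x)
    log_w = log_target(x) - log_q
    log_w -= log_w.max()              # stabilize before exponentiating
    w = np.exp(log_w)
    return np.sum(w * x) / np.sum(w)  # self-normalized estimate
```

With a standard normal target (`lambda x: -0.5 * x**2`) and a deliberately offset, over-dispersed proposal, the estimate recovers the true mean of zero; a better-matched (e.g. variational) proposal reduces the weight variance and hence the Monte Carlo error.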

An online updating method for time-varying preference learning

- Computer Science
- 2020

A new online-updating model is proposed that can accurately and efficiently estimate an individual's preferences from their discrete choices, and it achieves the highest accuracy in preference learning and behavior prediction.

Scalable and Accurate Variational Bayes for High-Dimensional Binary Regression Models

- Mathematics
- 2019

State-of-the-art methods for Bayesian inference on regression models with binary responses are either computationally impractical or inaccurate in high dimensions. To cover this gap we propose a…

#### References

Showing 1–10 of 44 references.

Variational Inference for Large-Scale Models of Discrete Choice

- Mathematics
- 2010

Discrete choice models are commonly used by applied statisticians in numerous fields, such as marketing, economics, finance, and operations research. When agents in discrete choice models are assumed…

Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors

- Computer Science, Mathematics
- Neural Computation
- 2006

This is the first time that a fully variational Bayesian treatment for multiclass GP classification has been developed without having to resort to additional explicit approximations to the non-Gaussian likelihood term.

Efficient Bounds for the Softmax Function and Applications to Approximate Inference in Hybrid Models

- 2008

The softmax link is used in many probabilistic models dealing with both discrete and continuous data. However, efficient Bayesian inference for this type of model is still an open problem due to the…

Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations

- Mathematics
- 2009

Structured additive regression models are perhaps the most commonly used class of models in statistical applications. It includes, among others, (generalized) linear models, (generalized) additive…

Gaussian Variational Approximate Inference for Generalized Linear Mixed Models

- Mathematics
- 2012

Variational approximation methods have become a mainstay of contemporary machine learning methodology, but currently have little presence in statistics. We devise an effective variational…

Bayesian parameter estimation via variational methods

- Mathematics, Computer Science
- Stat. Comput.
- 2000

It is shown that an accurate variational transformation can be used to obtain a closed-form approximation to the posterior distribution of the parameters, thereby yielding an approximate posterior predictive model.

Comparing different sampling schemes for approximating the integrals involved in the efficient design of stated choice experiments

- Mathematics
- 2010

The semi-Bayesian approach for constructing efficient stated choice designs requires the evaluation of the design selection criterion value over numerous draws taken from the prior parameter…

Variational Message Passing

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2005

Variational Message Passing is introduced, a general-purpose algorithm for applying variational inference to Bayesian networks; it can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation.

Inadequacy of interval estimates corresponding to variational Bayesian approximations

- Computer Science
- AISTATS
- 2005

It is shown that the covariance matrices from the variational Bayes approximations are normally 'too small' compared with those for the maximum likelihood estimator, so that resulting interval estimates for the parameters will be unrealistically narrow.

Discrete Choice Methods with Simulation

- Computer Science
- 2016

Discrete Choice Methods with Simulation by Kenneth Train has been available in the second edition since 2009 and contains two additional chapters, one on endogenous regressors and one on the expectation–maximization (EM) algorithm.
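The simulation approach covered in Train's book replaces the intractable mixing integral with a Monte Carlo average of logit probabilities over draws of the random coefficients. A minimal sketch (illustrative only; function and variable names are my own, assuming independent normal coefficients) of the simulated probabilities that MSL estimation is built on:

```python
import numpy as np

def simulated_mixed_logit_prob(x, mu, sigma, n_draws=1000, seed=0):
    """Simulated mixed logit choice probabilities for one choice set.

    x     : (J, K) array of attributes for J alternatives, K attributes
    mu    : mean of the normal mixing distribution for each coefficient
    sigma : std. dev. of the mixing distribution (independent normals)

    Averages logit probabilities over draws of beta ~ N(mu, sigma^2).
    """
    rng = np.random.default_rng(seed)
    J, K = x.shape
    betas = rng.normal(mu, sigma, size=(n_draws, K))  # coefficient draws
    u = betas @ x.T                      # (n_draws, J) utilities
    u -= u.max(axis=1, keepdims=True)    # stabilize the softmax
    p = np.exp(u)
    p /= p.sum(axis=1, keepdims=True)    # logit probabilities per draw
    return p.mean(axis=0)                # simulated choice probabilities
```

Because each draw yields a proper logit probability vector, the simulated probabilities are strictly positive and sum to one; MSL then maximizes the log of these simulated probabilities over the mixing parameters.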