
- Matthew D. Hoffman, David M. Blei, Francis R. Bach
- NIPS
- 2010

We develop an online variational Bayes (VB) algorithm for Latent Dirichlet Allocation (LDA). Online LDA is based on online stochastic optimization with a natural gradient step, which we show converges to a local optimum of the VB objective function. It can handily analyze massive document collections, including those arriving in a stream. We study the…
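The stochastic step described in the abstract above can be sketched in a few lines: blend the current global variational parameters with a minibatch-based intermediate estimate under a decaying Robbins-Monro step size. This is a minimal illustrative sketch, not the authors' implementation; the function and parameter names (`online_vb_step`, `tau0`, `kappa`) are hypothetical.

```python
import numpy as np

def online_vb_step(lam, lam_hat, t, tau0=1.0, kappa=0.7):
    """One stochastic natural-gradient step on global variational
    parameters, in the spirit of online LDA: interpolate between the
    current estimate lam and the minibatch intermediate estimate
    lam_hat with step size rho_t = (tau0 + t) ** -kappa."""
    rho = (tau0 + t) ** (-kappa)
    return (1.0 - rho) * lam + rho * lam_hat
```

With `kappa` in (0.5, 1], the step sizes satisfy the usual stochastic-approximation conditions, which is what underlies the convergence claim in the abstract.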

- Matthew D. Hoffman, David M. Blei, Chong Wang, John William Paisley
- Journal of Machine Learning Research
- 2013

The distinction between local and global variables will be important for us to develop online inference. In Bayesian statistics, for example, think of β as parameters with a prior and z1:n as hidden variables which are individual to each observation. In a Bayesian mixture of Gaussians the global variables β are the mixture components and mixture…
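The local/global split in the mixture-of-Gaussians example can be made concrete with a toy coordinate update: per-observation responsibilities play the role of the local variables z1:n, and the component means play the role of the global variables β. This is a simplified sketch with fixed, equal-variance components, not the paper's full variational update; all names are hypothetical.

```python
import numpy as np

def mixture_step(x, mu, sigma=1.0):
    """One toy coordinate update for a mixture of Gaussians.
    Local step: responsibilities r[i, k] for each observation.
    Global step: component means recomputed from weighted data."""
    # local: r[i, k] ∝ exp(-(x_i - mu_k)^2 / (2 sigma^2))
    d2 = (x[:, None] - mu[None, :]) ** 2
    logr = -d2 / (2.0 * sigma ** 2)
    r = np.exp(logr - logr.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # global: means as responsibility-weighted averages of the data
    mu_new = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    return r, mu_new
```

In stochastic variational inference, only the local step is done per minibatch; the global parameters are then nudged with a noisy natural-gradient step rather than recomputed from the full data set.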

- Matthew D. Hoffman, Andrew Gelman
- Journal of Machine Learning Research
- 2014

Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) algorithm that avoids the random walk behavior and sensitivity to correlated parameters that plague many MCMC methods by taking a series of steps informed by first-order gradient information. These features allow it to converge to high-dimensional target distributions much more quickly than…
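The gradient-informed steps mentioned above come from leapfrog integration of Hamiltonian dynamics. A minimal sketch of that integrator (not the paper's No-U-Turn sampler, which adds automatic tuning of the step count on top of it):

```python
import numpy as np

def leapfrog(q, p, grad_logp, step_size, n_steps):
    """Leapfrog-integrate position q and momentum p for n_steps,
    using the gradient of the log target density. Alternates
    half-step momentum updates with full-step position updates."""
    p = p + 0.5 * step_size * grad_logp(q)
    for _ in range(n_steps - 1):
        q = q + step_size * p
        p = p + step_size * grad_logp(q)
    q = q + step_size * p
    p = p + 0.5 * step_size * grad_logp(q)
    return q, p
```

Because leapfrog is symplectic, the Hamiltonian is approximately conserved along the trajectory, which keeps the Metropolis acceptance rate high even for long, non-random-walk moves.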

- David M. Mimno, Matthew D. Hoffman, David M. Blei
- ICML
- 2012

We present a hybrid algorithm for Bayesian topic models that combines the efficiency of sparse Gibbs sampling with the scalability of online stochastic inference. We used our algorithm to analyze a corpus of 1.2 million books (33 billion words) with thousands of topics. Our approach reduces the bias of variational inference and generalizes to many Bayesian…
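The Gibbs-sampling ingredient referred to above is, for LDA-style topic models, the collapsed conditional over one token's topic assignment. A toy sketch of that conditional, p(z = k) ∝ (n_dk + α)(n_kw + β)/(n_k + Vβ), with hypothetical count arrays (this is the standard collapsed form, not the paper's full hybrid algorithm):

```python
import numpy as np

def topic_conditional(doc_topic_counts, word_topic_counts, topic_counts,
                      alpha, beta, vocab_size):
    """Collapsed Gibbs conditional for one token's topic in LDA:
    how often the document uses each topic, times how often each
    topic generates this word, normalized by topic size."""
    probs = ((doc_topic_counts + alpha)
             * (word_topic_counts + beta)
             / (topic_counts + vocab_size * beta))
    return probs / probs.sum()
```

Sparsity helps because for most (document, word) pairs only a handful of topics have nonzero counts, so the product above need only be evaluated over a short list of candidates.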

- Matthew D. Hoffman, David M. Blei, Perry R. Cook
- ISMIR
- 2009

Many songs in large music databases are not labeled with semantic tags that could help users sort out the songs they want to listen to from those they do not. If the words that apply to a song can be predicted from audio, then those predictions can be used both to automatically annotate a song with tags, allowing users to get a sense of what qualities…

- Matthew D. Hoffman, David M. Blei, Perry R. Cook
- ICML
- 2010

Recent research in machine learning has focused on breaking audio spectrograms into separate sources of sound using latent variable decompositions. These methods require that the number of sources be specified in advance, which is not always possible. To address this problem, we develop Gamma Process Nonnegative Matrix Factorization (GaP-NMF), a Bayesian…
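The decomposition underlying GaP-NMF is nonnegative matrix factorization of the spectrogram. For orientation, here is one classic Lee-Seung multiplicative update for plain NMF under the Euclidean objective ||V − WH||²; GaP-NMF replaces this point estimate with a Bayesian model whose gamma-process prior lets the number of active sources be inferred rather than fixed. This sketch shows only the underlying NMF step, not the paper's inference procedure.

```python
import numpy as np

def nmf_step(V, W, H, eps=1e-10):
    """One multiplicative update pair for NMF minimizing
    ||V - W @ H||^2. Updates stay nonnegative because they only
    multiply current values by nonnegative ratios."""
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Each update is guaranteed not to increase the reconstruction error, so iterating drives W and H toward a local optimum.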

Artificial neural networks typically have a fixed, non-linear activation function at each neuron. We have designed a novel form of piecewise linear activation function that is learned independently for each neuron using gradient descent. With this adaptive activation function, we are able to improve upon deep neural network architectures composed of static…
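A piecewise linear activation of the kind described can be written as a ReLU plus a sum of learned hinge terms, h(x) = max(0, x) + Σ_s a_s · max(0, −x + b_s), where the slopes a and thresholds b are the per-neuron parameters learned by gradient descent. A minimal forward-pass sketch (the parameterization is one standard form of such a unit; treat it as illustrative rather than as the paper's exact definition):

```python
import numpy as np

def apl(x, a, b):
    """Adaptive piecewise linear unit: ReLU plus S learned hinges.
    a and b are length-S sequences of slopes and thresholds; with
    S = 0 this reduces to a plain ReLU."""
    out = np.maximum(0.0, x)
    for a_s, b_s in zip(a, b):
        out = out + a_s * np.maximum(0.0, -x + b_s)
    return out
```

Since the unit is built from maxima of affine functions, it stays cheap to evaluate and its gradients with respect to a and b are piecewise constant, so it drops straight into backpropagation.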

- Samuel J Gershman, Matthew D. Hoffman, David M. Blei
- ICML
- 2012

Variational methods are widely used for approximate posterior inference. However, their use is typically limited to families of distributions that enjoy particular conjugacy properties. To circumvent this limitation, we propose a family of variational approximations inspired by nonparametric kernel density estimation. The locations of these kernels and…
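The variational family described above is, in its simplest 1-D form, a uniform mixture of Gaussian kernels whose locations (and bandwidth) are the free parameters to be optimized. A sketch of evaluating such a density (hypothetical helper, not the paper's code):

```python
import numpy as np

def kernel_variational_density(x, locations, bandwidth):
    """Density of a uniform mixture of Gaussian kernels, one
    centered at each entry of locations, evaluated at points x.
    x: shape (N,), locations: shape (K,). Returns shape (N,)."""
    z = (x - locations[:, None]) / bandwidth
    kernels = np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return kernels.mean(axis=0)
```

Because mixtures of Gaussians can approximate a wide class of densities, optimizing the kernel locations gives a flexible posterior approximation without requiring conjugacy.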

Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It uses Bayesian methods to sample the objective efficiently using an acquisition…
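A common choice for the acquisition function mentioned above is expected improvement: given the GP posterior mean and standard deviation at a candidate point, it scores how much the candidate is expected to improve on the incumbent best value. The textbook closed form for minimization, as a standalone sketch (not tied to any particular package):

```python
import math

def expected_improvement(mu, sigma, best):
    """Expected improvement of a candidate with GP posterior mean mu
    and standard deviation sigma over the incumbent best (minimum)
    observed value: EI = (best - mu) * Phi(z) + sigma * phi(z),
    with z = (best - mu) / sigma."""
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # phi(z)
    return (best - mu) * cdf + sigma * pdf
```

The two terms trade off exploitation (low predicted mean) against exploration (high predictive uncertainty), which is what lets the optimizer sample an expensive black-box objective efficiently.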

- Paris Smaragdis, Cédric Févotte, Gautham J. Mysore, Nasser Mohammadiha, Matthew D. Hoffman
- IEEE Signal Processing Magazine
- 2014

Source separation models that make use of nonnegativity in their parameters have been gaining increasing popularity in the last few years, spawning a significant number of publications on the topic. Although these techniques are conceptually similar to other matrix decompositions, they are surprisingly more effective in extracting perceptually meaningful…