There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks. Scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the convolutional deep belief network, a hierarchical generative model which scales to realistic image sizes. …
There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks (DBNs); however, scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the convolutional deep belief network, a hierarchical generative model that scales to …
Automatically detecting human social intentions from spoken conversation is an important task for dialogue understanding. Since the social intentions of the speaker may differ from what is perceived by the hearer, systems that analyze human conversations need to be able to extract both the perceived and the intended social meaning. We investigate this …
Stochastic variational inference finds good posterior approximations of probabilistic models with very large data sets. It optimizes the variational objective with stochastic optimization, following noisy estimates of the natural gradient. Operationally, stochastic inference iteratively subsamples from the data, analyzes the subsample, and updates …
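The subsample-analyze-update loop described in that abstract is easy to picture on a small conjugate example. Below is a minimal Python sketch of that loop for a toy model (unknown Gaussian mean with known unit variance and a standard-normal prior); the toy model, step-size schedule, and parameter names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
data = rng.normal(loc=2.5, scale=1.0, size=N)    # synthetic "large" data set

# Variational posterior over the unknown mean, in natural parameters:
# lam = [precision * mean, precision]; initialized at the prior N(0, 1).
prior = np.array([0.0, 1.0])
lam = prior.copy()

batch_size = 100
for t in range(1, 2001):
    rho = (t + 10.0) ** -0.7                     # Robbins-Monro step size
    batch = rng.choice(data, size=batch_size)    # 1) subsample the data
    # 2) analyze the subsample: optimal natural parameters as if the whole
    #    data set looked like this minibatch (rescaled sufficient statistics)
    lam_hat = prior + (N / batch_size) * np.array([batch.sum(), float(batch_size)])
    # 3) update: the noisy natural-gradient step is a convex combination
    lam = (1.0 - rho) * lam + rho * lam_hat

post_mean, post_var = lam[0] / lam[1], 1.0 / lam[1]
print(f"approximate posterior over the mean: {post_mean:.3f} +/- {post_var**0.5:.4f}")
```

Because the toy model is conjugate, each noisy natural-gradient step reduces to a convex combination of the current natural parameters and the minibatch-optimal ones, which is what the loop above implements.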
Automatically extracting social meaning and intention from spoken dialogue is an important task for dialogue systems and social computing. We describe a system for detecting elements of interactional style: whether a speaker is awkward, friendly, or flirtatious. We create and use a new spoken corpus of 991 4-minute speed-dates. Participants rated their …
We develop a Bayesian nonparametric Poisson factorization model for recommendation systems. Poisson factorization implicitly models each user's limited budget of attention (or money) that allows consumption of only a small subset of the available items. In our Bayesian nonparametric variant, the number of latent components is theoretically unbounded and …
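As an illustration of the Poisson factorization setup this abstract builds on, here is a rough generative sketch in Python of a finite (truncated) variant: gamma-distributed user and item weights and Poisson-distributed counts. The hyperparameters and truncation level K are assumptions for illustration; the Bayesian nonparametric model described above instead lets the number of components be unbounded.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, K = 50, 80, 10   # K is a finite truncation, for illustration only

# Gamma-distributed user preferences and item attributes; small shape
# parameters encourage sparsity, mimicking each user's limited "budget".
theta = rng.gamma(shape=0.3, scale=1.0, size=(n_users, K))   # user weights
beta = rng.gamma(shape=0.3, scale=1.0, size=(n_items, K))    # item weights

# Observed counts (e.g., clicks or purchases) are Poisson with rate theta @ beta.T
rates = theta @ beta.T
counts = rng.poisson(rates)

print("fraction of zero entries:", (counts == 0).mean())
```

The sparse gamma weights concentrate most of the Poisson rates near zero, so most user-item counts come out zero, which is the "limited budget" behavior the abstract refers to.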
Models for recommender systems use latent factors to explain the preferences and behaviors of users with respect to a set of items (e.g., movies, books, academic papers). Typically, the latent factors are assumed to be static and, given these factors, the observed preferences and behaviors of users are assumed to be generated without order. These …
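To make the static latent-factor assumption concrete, here is a minimal Python sketch of the classic setup: fixed user and item vectors whose inner product explains each observed preference, fit by plain SGD in which the order of observations does not matter. The squared-error objective, dimensions, and names are illustrative assumptions rather than the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, K = 30, 40, 5

# Toy observed (user, item, rating) triples
observations = [(rng.integers(n_users), rng.integers(n_items),
                 rng.normal(3.0, 1.0)) for _ in range(500)]

U = 0.1 * rng.standard_normal((n_users, K))   # static user factors
V = 0.1 * rng.standard_normal((n_items, K))   # static item factors

lr, reg = 0.01, 0.1
for epoch in range(20):
    for u, i, r in observations:              # observation order is irrelevant here
        err = r - U[u] @ V[i]
        u_vec, v_vec = U[u].copy(), V[i].copy()
        U[u] += lr * (err * v_vec - reg * u_vec)   # gradient step on user factor
        V[i] += lr * (err * u_vec - reg * v_vec)   # gradient step on item factor

print("predicted rating for user 0, item 0:", U[0] @ V[0])
```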
Modern data analysis requires an iterative cycle: in the probabilistic modeling framework, a simple model is fit to the data, and it is refined as we gather more knowledge about the data's hidden structure. However, fitting complex models to large datasets is mathematically and computationally challenging. We develop an automated tool called …