Corpus ID: 6756014

Learning Programs: A Hierarchical Bayesian Approach

@inproceedings{Liang2010LearningPA,
  title={Learning Programs: A Hierarchical Bayesian Approach},
  author={Percy Liang and Michael I. Jordan and Dan Klein},
  booktitle={ICML},
  year={2010}
}
We are interested in learning programs for multiple related tasks given only a few training examples per task. Since the program for a single task is underdetermined by its data, we introduce a nonparametric hierarchical Bayesian prior over programs which shares statistical strength across multiple tasks. The key challenge is to parametrize this multi-task sharing. For this, we introduce a new representation of programs based on combinatory logic and provide an MCMC algorithm that can perform…
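To make the combinatory-logic representation mentioned in the abstract concrete, here is a minimal sketch (an illustration of standard S/K combinator reduction, not the paper's actual code; the class and function names are hypothetical) of programs encoded as application trees over the S and K combinators and evaluated by rewriting:

# Minimal illustrative sketch (not from the paper): programs as binary
# application trees over the S and K combinators, with leftmost reduction.
# Rewrite rules:  K x y -> x    and    S x y z -> x z (y z).
from dataclasses import dataclass

@dataclass(frozen=True)
class App:
    fn: object   # "S", "K", or another App
    arg: object

def reduce_step(term):
    """Apply one leftmost reduction if possible; return (term, changed)."""
    # K x y -> x
    if isinstance(term, App) and isinstance(term.fn, App) and term.fn.fn == "K":
        return term.fn.arg, True
    # S x y z -> x z (y z)
    if (isinstance(term, App) and isinstance(term.fn, App)
            and isinstance(term.fn.fn, App) and term.fn.fn.fn == "S"):
        x, y, z = term.fn.fn.arg, term.fn.arg, term.arg
        return App(App(x, z), App(y, z)), True
    # Otherwise try to reduce inside subterms.
    if isinstance(term, App):
        fn, changed = reduce_step(term.fn)
        if changed:
            return App(fn, term.arg), True
        arg, changed = reduce_step(term.arg)
        if changed:
            return App(term.fn, arg), True
    return term, False

def normalize(term, max_steps=100):
    for _ in range(max_steps):
        term, changed = reduce_step(term)
        if not changed:
            break
    return term

# The identity combinator I = S K K, so (S K K) v reduces to v.
I = App(App("S", "K"), "K")
print(normalize(App(I, "v")))   # prints: v

Because combinator terms contain no bound variables, every subtree is a self-contained program fragment, which is the property that makes it natural to share fragments across related tasks under a hierarchical prior.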
Sampling for Bayesian Program Learning
An algorithm that uses a symbolic solver to efficiently sample programs via random parity constraints is proposed, giving theoretical guarantees on how well the samples approximate the true posterior, and empirical results showing the algorithm is efficient in practice.
How To Train Your Program
This work frames the approach as a design pattern of probabilistic programming referred to herein as ‘stump and fungus’, and illustrates realization of the pattern on a didactic case study.
Learning Probabilistic Programs
Encouraging empirical results are established that suggest that Markov chain Monte Carlo probabilistic programming inference techniques coupled with higher-order probabilistic programming languages are now sufficiently powerful to enable successful inference of this kind in nontrivial domains.
Library learning for neurally-guided Bayesian program induction
This work contributes a program induction algorithm called EC2 that learns a DSL while jointly training a neural network to efficiently search for programs in the learned DSL, to synthesize functions on lists, edit text, and solve symbolic regression problems.
Probabilistic Meta-Representations Of Neural Networks
This work considers a richer prior distribution in which units in the network are represented by latent variables, and the weights between units are drawn conditionally on the values of the collection of those variables.
Semantics-Aware Program Sampling
An algorithm is presented for specifying and sampling from distributions over programs via a perturb-max approximation, along with a simple modification to the Perturb-and-MAP sampling algorithm that allows interpolation between syntactic priors and priors over semantics.
A Machine Learning Framework for Programming by Example
It is shown how machine learning can be used to speed up this seemingly hopeless search problem by learning weights that relate textual features describing the provided input-output examples to plausible sub-components of a program.
Unsupervised Learning by Program Synthesis
An unsupervised learning algorithm is introduced that combines probabilistic modeling with solver-based techniques for program synthesis and can learn many visual concepts from only a few examples and recover some English inflectional morphology.
Learning to Learn Programs from Examples: Going Beyond Program Structure
Programming-by-example technologies let end users construct and run new programs by providing examples of the intended program behavior. But the few provided examples seldom uniquely determine the …
Bootstrap Learning via Modular Concept Discovery
An iterative procedure is presented for exploring a domain of problems, where the solution space is that of typed functional programs and the gained information is stored as a stochastic grammar over programs; it is shown that the learner discovers modular concepts for these domains.

References

Showing 1-10 of 16 references
Adaptor Grammars: A Framework for Specifying Compositional Nonparametric Bayesian Models
This paper presents a general-purpose inference algorithm for adaptor grammars, making it easy to define and use such models, and illustrates how several existing nonparametric Bayesian models can be expressed within this framework.
A Bayesian Model of the Acquisition of Compositional Semantics
We present an unsupervised, cross-situational Bayesian learning model for the acquisition of compositional semantics. We show that the model acquires the correct grammar for a toy version of English …
Church: a language for generative models
This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.
A Rational Analysis of Rule-Based Concept Learning
A new model of human concept learning that provides a rational analysis of learning feature-based concepts is proposed, built upon Bayesian inference for a grammatically structured hypothesis space: a concept language of logical rules.
Natively probabilistic computation
Stochastic digital circuits that model the probability algebra just as traditional Boolean circuits model the Boolean algebra are introduced; these circuits can be used to build massively parallel, fault-tolerant machines for sampling and allow one to efficiently run Markov chain Monte Carlo methods on models with hundreds of thousands of variables in real time.
Hierarchical Dirichlet Processes
We consider problems involving groups of data where each observation within a group is a draw from a mixture model and where it is desirable to share mixture components between groups. We assume that … (see the sketch after this reference list for an illustration of this kind of cross-group sharing).
Programming by Demonstration Using Version Space Algebra
This work formalizes programming by demonstration as a machine learning problem: given the changes in the application state that result from the user's demonstrated actions, learn the general program that maps from one application state to the next.
Functional Genetic Programming with Combinators
Prior program representations for genetic programming that incorporated features of modern programming languages solved harder problems than earlier representations, but required more complex genetic …
Watch what I do: programming by demonstration
Part 1, Systems: Pygmalion, Tinker, a predictive calculator, Rehearsal World, SmallStar, Peridot, Metamouse, TELS, Eager, Garnet, the Turvy experience, Chimera, the Geometer's Sketchpad, Tourmaline, a history-based …
The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator
The two-parameter Poisson-Dirichlet distribution, denoted PD(α,θ), is a probability distribution on the set of decreasing positive sequences with sum 1. The usual Poisson-Dirichlet distribution with …
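To make the last entry above more concrete: PD(α,θ) arises as the distribution of the size-ranked limiting table proportions of the two-parameter (Pitman-Yor) Chinese restaurant process with discount α and concentration θ. The following is a minimal simulation sketch of that seating process (illustrative only, not taken from any of the referenced papers; the function name is made up):

# Hypothetical sketch: the two-parameter (Pitman-Yor) Chinese restaurant process.
# With n customers seated at K tables, customer n+1 joins existing table k with
# probability (n_k - alpha) / (n + theta) and opens a new table with probability
# (theta + alpha * K) / (n + theta). The size-ranked table proportions converge
# to a draw from PD(alpha, theta).
import random

def pitman_yor_crp(n_customers, alpha=0.5, theta=1.0, seed=0):
    rng = random.Random(seed)
    counts = []                          # counts[k] = number of customers at table k
    for n in range(n_customers):
        r = rng.random() * (n + theta)   # total unnormalized mass is n + theta
        if r < theta + alpha * len(counts):
            counts.append(1)             # open a new table
        else:
            r -= theta + alpha * len(counts)
            for k, c in enumerate(counts):
                r -= c - alpha           # existing table k has mass c - alpha
                if r < 0:
                    counts[k] += 1
                    break
    return counts

sizes = sorted(pitman_yor_crp(2000), reverse=True)
print(len(sizes), sizes[:5])  # number of tables and the five largest table sizes

With α > 0 the number of distinct tables grows polynomially in the number of customers rather than logarithmically, giving the heavy-tailed reuse behaviour that makes this two-parameter family attractive as a prior over reusable components.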
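The Hierarchical Dirichlet Processes entry above describes sharing mixture components between groups of data; the sketch referenced there follows. It is a hedged illustration using a truncated stick-breaking construction with a finite Dirichlet approximation at the group level (a common approximation, not the construction from that paper); all names and parameter values are made up.

# Hypothetical sketch: HDP-style sharing via truncated stick-breaking.
# A global measure picks T shared atoms with weights beta ~ GEM(gamma); each
# group j re-weights those same atoms (pi_j approximately ~ DP(alpha, beta)),
# so mixture components are shared across groups with group-specific weights.
import numpy as np

rng = np.random.default_rng(0)
T = 20                      # truncation level of the stick-breaking construction
gamma, alpha = 1.0, 1.0     # global and group-level concentration parameters

# Global stick-breaking weights and a shared set of atoms (1-D Gaussian means).
v = rng.beta(1.0, gamma, size=T)
beta = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
beta /= beta.sum()          # renormalize the truncated weights
atoms = rng.normal(0.0, 5.0, size=T)

# Each group's weights: a finite Dirichlet approximation to DP(alpha, beta).
for j in range(3):
    pi_j = rng.dirichlet(alpha * beta + 1e-6)
    draws = atoms[rng.choice(T, size=5, p=pi_j)] + rng.normal(0.0, 0.1, size=5)
    print(f"group {j}: dominant shared atom {atoms[np.argmax(pi_j)]:.2f}, draws {np.round(draws, 2)}")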