How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate…
Figure 2: (a) A probabilistic program in Julia defining a simple 3-state hidden Markov model (HMM), inspired by an example in [62]. An HMM is a widely used probabilistic model for sequential and time-series data which assumes the data were generated by transitioning stochastically between a discrete number of hidden states [98]. The first four lines define the model parameters and the data. Here trans is the 3 × 3 state-transition matrix, initial is the initial state distribution, and state mean gives the mean observation for each of the three states; actual observations are assumed to be noisy versions of these means, corrupted by Gaussian noise. The function hmm() starts the definition of the HMM, drawing the sequence of states with the @assume statements and conditioning on the observed data with the @observe statements. Finally, @predict states that we wish to infer the states and data; this inference is done automatically by the universal inference engine, which reasons over the configurations of this computer program. It would be trivial to modify this program so that the HMM parameters are unknown rather than fixed. (b) A graphical model corresponding to the HMM probabilistic program, showing dependencies between the parameters (cyan), hidden state variables (green) and observed data (yellow). This graphical model highlights the compositional nature of probabilistic models.
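The figure itself is not reproduced here, so the following is a minimal sketch of what such a Julia program could look like, written in the @assume/@observe/@predict style the caption describes. It assumes a probabilistic programming interface (an early Turing.jl-style macro layer) that supplies these macros and an inference engine that executes the model; the parameter values, data and noise level are illustrative placeholders, not the values used in the figure.

    using Distributions   # provides the Categorical and Normal distributions

    # Model parameters and data (the "first four lines" described in the caption);
    # all numeric values below are illustrative assumptions, not the figure's values.
    initial   = Categorical([1/3, 1/3, 1/3])      # initial state distribution
    trans     = [Categorical([0.10, 0.50, 0.40],) # 3 x 3 state-transition matrix,
                 Categorical([0.20, 0.20, 0.60]), #   represented row by row
                 Categorical([0.15, 0.15, 0.70])]
    statemean = [-1.0, 1.0, 0.0]                  # mean observation for each state
    data      = [0.9, 0.8, 0.7, 0.0, -0.025]      # observed sequence (placeholder)

    # hmm() defines the model: hidden states are drawn with @assume, the observed
    # data are conditioned on with @observe (Gaussian noise around the state mean),
    # and @predict declares which quantities the inference engine should infer.
    function hmm()
        n = length(data)
        states = Vector{Int}(undef, n)
        @assume states[1] ~ initial
        @observe data[1] ~ Normal(statemean[states[1]], 0.4)
        for i in 2:n
            @assume states[i] ~ trans[states[i-1]]              # stochastic transition
            @observe data[i] ~ Normal(statemean[states[i]], 0.4)
        end
        @predict states
    end

Under this reading, the modification the caption calls trivial, making the HMM parameters unknown rather than fixed, would amount to replacing the fixed trans and statemean assignments with @assume draws from prior distributions (for example, Dirichlet priors on the rows of trans and Gaussian priors on the entries of statemean), after which the same universal inference engine would infer the parameters alongside the hidden states.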