Jouni Hartikainen

In this paper, we show how temporal (i.e., time-series) Gaussian process regression models in machine learning can be reformulated as linear-Gaussian state space models, which can be solved exactly with classical Kalman filtering theory. The result is an efficient non-parametric learning algorithm, whose computational complexity grows linearly with respect …
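The abstract's core idea can be sketched in a few lines. As an illustrative instance (not the paper's exact algorithm), a GP with the exponential (Ornstein-Uhlenbeck, i.e., Matérn-1/2) covariance k(t, t') = σ² exp(−|t − t'|/ℓ) is equivalent to a one-dimensional linear SDE, so regression over n sorted time points reduces to an O(n) Kalman filter; the function and parameter names below are made up for the example:

```python
import numpy as np

def ou_gp_kalman(t, y, ell=1.0, sigma2=1.0, noise=0.1):
    """Kalman filter for GP regression with the exponential (OU) kernel
    k(t, t') = sigma2 * exp(-|t - t'| / ell).  Runs in O(n) time,
    versus O(n^3) for naive GP regression."""
    m, P = 0.0, sigma2              # stationary prior: zero mean, variance sigma2
    means, variances = [], []
    t_prev = t[0]
    for tk, yk in zip(t, y):
        # Predict: exactly discretised OU dynamics x_k = a x_{k-1} + q_k
        a = np.exp(-(tk - t_prev) / ell)
        m, P = a * m, a * a * P + sigma2 * (1.0 - a * a)
        # Update with the observation y_k = x_k + r_k, r_k ~ N(0, noise)
        S = P + noise                # innovation variance
        K = P / S                    # Kalman gain
        m = m + K * (yk - m)
        P = (1.0 - K) * P
        means.append(m)
        variances.append(P)
        t_prev = tk
    return np.array(means), np.array(variances)

# toy usage: filter noisy samples of a slow sine wave
t = np.linspace(0.0, 5.0, 50)
y = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(50)
m, v = ou_gp_kalman(t, y, ell=1.0, sigma2=1.0, noise=0.01)
```

The linear scaling comes from the Markov structure of the OU process: each filtered estimate depends only on the previous one, not on the full kernel matrix.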
Nonlinear Kalman filter and Rauch-Tung-Striebel smoother type recursive estimators for nonlinear discrete-time state space models with multivariate Student's t-distributed measurement noise are presented. The methods approximate the posterior state at each time step using the variational Bayes method. The nonlinearities in the dynamic and measurement models …
Gaussian process-based machine learning is a powerful Bayesian paradigm for nonparametric nonlinear regression and classification. In this article, we discuss connections of Gaussian process regression with Kalman filtering and present methods for converting spatiotemporal Gaussian process regression problems into infinite-dimensional state-space models. …
We show how spatio-temporal Gaussian process (GP) regression problems (or the equivalent Kriging problems) can be formulated as infinite-dimensional Kalman filtering and Rauch-Tung-Striebel (RTS) smoothing problems, and present a procedure for converting spatio-temporal covariance functions into infinite-dimensional stochastic differential equations (SDEs). …
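A concrete, finite-dimensional instance of such a covariance-to-SDE conversion (the spatio-temporal case in the abstract is infinite-dimensional) is the purely temporal Matérn-3/2 kernel, whose standard state-space form is a two-dimensional linear SDE; the helper name below is illustrative:

```python
import numpy as np

def matern32_to_ssm(ell=1.0, sigma2=1.0):
    """State-space (SDE) form of the Matern-3/2 covariance
    k(tau) = sigma2 * (1 + lam*|tau|) * exp(-lam*|tau|), lam = sqrt(3)/ell.
    The GP f(t) is the first component of dx = F x dt + L dW, where the
    driving white noise has spectral density q."""
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0],
                  [-lam**2, -2.0 * lam]])       # companion form of (d/dt + lam)^2
    L = np.array([[0.0], [1.0]])
    q = 4.0 * lam**3 * sigma2                   # spectral density of driving noise
    P_inf = np.array([[sigma2, 0.0],
                      [0.0, lam**2 * sigma2]])  # stationary covariance
    return F, L, q, P_inf

F, L, q, P_inf = matern32_to_ssm(ell=0.5, sigma2=2.0)
# sanity check: the stationary covariance solves the Lyapunov
# equation F P + P F^T + q L L^T = 0
residual = F @ P_inf + P_inf @ F.T + q * (L @ L.T)
```

The Lyapunov-equation check confirms that the SDE's stationary distribution reproduces the kernel's marginal variance, which is what makes the state-space and kernel formulations equivalent.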
Latent force models (LFMs) are hybrid models combining mechanistic principles with non-parametric components. In this article, we shall show how LFMs can be equivalently formulated and solved using the state variable approach. We shall also show how the Gaussian process prior used in LFMs can be equivalently formulated as a linear state-space model driven by …
In this note we shall present a new Gaussian approximation based framework for approximate optimal smoothing of non-linear stochastic state space models. The approximation framework can be used for efficiently solving non-linear fixed-interval, fixed-point and fixed-lag optimal smoothing problems. We shall also numerically compare accuracies of …
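The fixed-interval case referred to above can be illustrated with the linear-Gaussian Rauch-Tung-Striebel smoother, which runs a backward pass over Kalman-filter results; this is a minimal linear sketch, not the paper's nonlinear method, where A and Q would instead come from a Gaussian approximation (e.g., the unscented transform):

```python
import numpy as np

def kalman_filter(y, A, Q, H, R, m0, P0):
    """Forward pass: standard Kalman filter for x_k = A x_{k-1} + q_k,
    y_k = H x_k + r_k."""
    m, P = m0, P0
    ms, Ps = [], []
    for yk in y:
        m, P = A @ m, A @ P @ A.T + Q                  # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                 # update
        m = m + K @ (yk - H @ m)
        P = P - K @ S @ K.T
        ms.append(m)
        Ps.append(P)
    return np.array(ms), np.array(Ps)

def rts_smoother(ms, Ps, A, Q):
    """Backward pass: fixed-interval RTS smoother over filter results."""
    ms_s, Ps_s = ms.copy(), Ps.copy()
    for k in range(len(ms) - 2, -1, -1):
        m_pred = A @ ms[k]
        P_pred = A @ Ps[k] @ A.T + Q
        G = Ps[k] @ A.T @ np.linalg.inv(P_pred)        # smoother gain
        ms_s[k] = ms[k] + G @ (ms_s[k + 1] - m_pred)
        Ps_s[k] = Ps[k] + G @ (Ps_s[k + 1] - P_pred) @ G.T
    return ms_s, Ps_s

# toy usage: smooth noisy observations of a scalar random walk
A, Q = np.array([[1.0]]), np.array([[0.01]])
H, R = np.array([[1.0]]), np.array([[0.1]])
rng = np.random.default_rng(1)
x = np.cumsum(0.1 * rng.standard_normal(30))
y = (x + np.sqrt(0.1) * rng.standard_normal(30)).reshape(-1, 1)
ms, Ps = kalman_filter(y, A, Q, H, R, np.zeros(1), np.eye(1))
ms_s, Ps_s = rts_smoother(ms, Ps, A, Q)
```

The smoother uses future as well as past observations, so its posterior variances never exceed the filter's; at the final time step the two estimates coincide.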
Gaussian processes (GPs) are powerful tools for probabilistic modeling purposes. They can be used to define prior distributions over latent functions in hierarchical Bayesian models. The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function. The inference can then be …
Latent force models (LFMs) are flexible models that combine mechanistic modelling principles (i.e., physical models) with nonparametric data-driven components. Several key applications of LFMs need nonlinearities, which results in analytically intractable inference. In this work we show how non-linear LFMs can be represented as nonlinear white noise driven …