Despite the breakthroughs in accuracy and speed of single image super-resolution using faster and deeper convolutional neural networks, one central problem remains largely unsolved: how do we recover the finer texture details when we super-resolve at large upscaling factors? The behavior of optimization-based super-resolution methods is principally driven…
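To ground the reference to optimization-based super-resolution, the dominant objective in that family is a pixel-wise mean squared error between the reconstruction and the ground-truth high-resolution image; the notation below (a mapping G applied to low-resolution inputs I^{LR}, with targets I^{HR}) is generic rather than taken from this paper:

```latex
\mathcal{L}_{\mathrm{MSE}} \;=\; \frac{1}{N} \sum_{n=1}^{N} \big\lVert I^{HR}_{n} - G\!\big(I^{LR}_{n}\big) \big\rVert_{2}^{2}
```

Objectives of this kind reward per-pixel averages over plausible reconstructions, which is one reason fine texture is hard to recover at large upscaling factors.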
Recently, several models based on deep neural networks have achieved great success in terms of both reconstruction accuracy and computational performance for single image super-resolution. In these methods, the low resolution (LR) input image is upscaled to the high resolution (HR) space using a single filter, commonly bicubic interpolation, before…
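As a concrete picture of the pipeline described above, the sketch below pre-upscales the LR input with a single bicubic filter and only then applies a learned network in HR space. It assumes PyTorch; the function name and the `network` argument are placeholders, not anything defined in the paper.

```python
import torch
import torch.nn.functional as F

def upscale_then_restore(lr_image: torch.Tensor, network, scale: int = 4) -> torch.Tensor:
    # lr_image: (batch, channels, height, width) low-resolution input
    # Step 1: bring the image to HR size with a single fixed filter (bicubic interpolation).
    hr_sized = F.interpolate(lr_image, scale_factor=scale, mode="bicubic", align_corners=False)
    # Step 2: run the learned model entirely in HR space.
    return network(hr_sized)
```

Note that every subsequent convolution then operates at HR resolution, which is the main computational cost of this arrangement.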
Information theoretic active learning has been widely studied for probabilistic models. For simple regression an optimal myopic policy is easily tractable. However, for other tasks and with more complex models, such as classification with nonparametric models, the optimal solution is harder to compute. Current approaches make approximations to achieve…
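To make the kind of information-theoretic criterion at stake concrete, a widely used acquisition rule for classification scores a candidate point by the mutual information between its unknown label and the model parameters. The Monte Carlo sketch below illustrates that rule in general, not the specific approximation developed in this work; all names are placeholders.

```python
import numpy as np

def mutual_information_scores(prob_samples: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """prob_samples: (n_posterior_samples, n_candidates, n_classes), each slice a
    predictive distribution under one sample of the model parameters."""
    mean_probs = prob_samples.mean(axis=0)
    # Entropy of the marginal predictive distribution, H[y | x, D].
    entropy_of_mean = -(mean_probs * np.log(mean_probs + eps)).sum(axis=-1)
    # Expected entropy under the parameter posterior, E_theta[ H[y | x, theta] ].
    expected_entropy = -(prob_samples * np.log(prob_samples + eps)).sum(axis=-1).mean(axis=0)
    # Mutual information between label and parameters; query the highest-scoring points.
    return entropy_of_mean - expected_entropy
```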
Herding and kernel herding are deterministic methods of choosing samples which summarise a probability distribution. A related task is choosing samples for estimating integrals using Bayesian quadrature. We show that the criterion minimised when selecting samples in kernel herding is equivalent to the posterior variance in Bayesian quadrature. We then…
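For reference, one standard way to write the greedy sample-selection step of kernel herding (in notation chosen here, not the paper's) is:

```latex
x_{n+1} \;=\; \arg\max_{x} \left[ \; \mathbb{E}_{x' \sim p}\!\left[ k(x, x') \right] \;-\; \frac{1}{n+1} \sum_{m=1}^{n} k(x, x_m) \; \right]
```

The trade-off here, coverage of the target distribution p against redundancy with previously chosen samples, is what the abstract relates to the posterior variance of a Bayesian quadrature estimate.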
We present a new model based on Gaussian processes (GPs) for learning pairwise preferences expressed by multiple users. Inference is simplified by using a preference kernel for GPs which allows us to combine supervised GP learning of user preferences with unsupervised dimensionality reduction for multi-user systems. The model not only exploits collaborative…
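The preference-kernel idea can be illustrated with a common construction: model the preference of item x_i over x_j through the difference of a latent utility function, which induces a kernel over pairs built from a base kernel over items. The sketch below uses that construction with an RBF base kernel; it is an illustration under that assumption, not necessarily the paper's exact definition.

```python
import numpy as np

def rbf(a: np.ndarray, b: np.ndarray, lengthscale: float = 1.0) -> np.ndarray:
    """Base (squared exponential) kernel over individual items; a: (n, d), b: (m, d)."""
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * sq_dists / lengthscale ** 2)

def preference_kernel(xi, xj, xk, xl, lengthscale: float = 1.0) -> np.ndarray:
    """Kernel between preference pairs (x_i over x_j) and (x_k over x_l), derived by
    assuming the preference is driven by the difference f(x_i) - f(x_j) of a latent
    GP-distributed utility f. Illustrative construction, not the paper's definition."""
    return (rbf(xi, xk, lengthscale) + rbf(xj, xl, lengthscale)
            - rbf(xi, xl, lengthscale) - rbf(xj, xk, lengthscale))
```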
Humans develop rich mental representations that guide their behavior in a variety of everyday tasks. However, it is unknown whether these representations, often formalized as priors in Bayesian inference, are specific for each task or subserve multiple tasks. Current approaches cannot distinguish between these two possibilities because they cannot extract…
A central challenge in cognitive science is to measure and quantify the mental representations humans develop – in other words, to 'read' subjects' minds. In order to eliminate potential biases in reporting mental contents due to verbal elaboration, subjects' responses in experiments are often limited to binary decisions or discrete choices that do not…
Modern applications and progress in deep learning research have created renewed interest in generative models of text and of images. However, even today it is unclear what objective functions one should use to train and evaluate these models. In this paper we present two contributions. Firstly, we present a critique of scheduled sampling, a…
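As background for the critique, scheduled sampling trains a sequence model by randomly mixing ground-truth and model-generated previous tokens, with the mixing probability annealed over training. The sketch below illustrates that mechanism; `predict_next` is a stand-in for the model's own prediction, and none of the names come from this paper.

```python
import random

def scheduled_sampling_inputs(target_tokens, predict_next, teacher_forcing_prob, start_token):
    """Build the sequence of previous-token inputs used during one training pass:
    at each step the ground-truth previous token is fed with probability
    `teacher_forcing_prob`, otherwise the model's own previous prediction is fed.
    The probability is typically annealed towards 0 as training progresses."""
    inputs, previous = [], start_token
    for truth in target_tokens:
        inputs.append(previous)
        prediction = predict_next(previous)
        previous = truth if random.random() < teacher_forcing_prob else prediction
    return inputs
```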
Executive summary: In this framework talk, we consider the impact of approximate inference in the context of Bayesian decision theory. We argue for the need to shift focus from the traditional approach of solely approximating the Bayesian posterior to developing loss-calibrated approximate Bayesian inference methods, in analogy to what is done in…
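The decision-theoretic setting referred to here can be written in one line: the Bayes-optimal action minimises posterior expected loss, while in practice the intractable posterior is replaced by an approximation q. The notation below is generic, not taken from the talk:

```latex
h^{*} \;=\; \arg\min_{h} \; \mathbb{E}_{p(\theta \mid \mathcal{D})}\!\left[ L(\theta, h) \right]
\qquad \text{vs.} \qquad
\hat{h} \;=\; \arg\min_{h} \; \mathbb{E}_{q(\theta)}\!\left[ L(\theta, h) \right]
```

The argument is that q is usually fitted without any reference to the loss L, and that loss-calibrated approximate inference methods aim to close that gap.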