Junier B. Oliva

Background. Modern neuroimaging data have provided a much-needed window into the intricacies of the human brain. Neuroimaging techniques such as functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and diffusion tensor imaging (DTI) often yield many thousands of functional observations per subject. While some success has been had …
We analyze ‘distribution to distribution regression’, where one regresses a mapping for which both the covariate (input) and the response (output) are distributions. No parametric form is assumed for the input or output distributions, nor are strong assumptions made on the measure from which the input distributions are drawn. We develop an estimator and derive …
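To make the distribution-to-distribution setting concrete, here is a minimal sketch (not the estimator developed in the paper): each input and output distribution is observed only through a sample set, densities are estimated by KDE on a fixed grid, and a simple Nadaraya-Watson smoother over the estimated input densities predicts the output density. The helper names (`kde_on_grid`, `d2d_predict`) and the toy data are illustrative assumptions.

```python
import numpy as np

def kde_on_grid(sample, grid, bw=0.1):
    # Gaussian kernel density estimate of a 1-d sample, evaluated on a fixed grid.
    diffs = (grid[:, None] - sample[None, :]) / bw
    return np.exp(-0.5 * diffs ** 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

def d2d_predict(query_sample, train_in, train_out, grid, bw=0.1, h=0.2):
    # Nadaraya-Watson smoother over estimated input densities: weight each
    # training pair by a Gaussian kernel on the L2 distance between input
    # density estimates, then average the corresponding output density estimates.
    q = kde_on_grid(query_sample, grid, bw)
    dists = np.array([np.linalg.norm(q - kde_on_grid(s, grid, bw)) for s in train_in])
    w = np.exp(-0.5 * (dists / h) ** 2)
    w /= w.sum()
    outs = np.stack([kde_on_grid(s, grid, bw) for s in train_out])
    return w @ outs  # predicted output density on the grid

# Toy usage: inputs are samples from N(mu, 1), outputs from N(2*mu, 1).
rng = np.random.default_rng(0)
grid = np.linspace(-6, 6, 200)
mus = rng.uniform(-1, 1, size=30)
train_in = [rng.normal(m, 1.0, size=200) for m in mus]
train_out = [rng.normal(2 * m, 1.0, size=200) for m in mus]
pred_density = d2d_predict(rng.normal(0.5, 1.0, size=200), train_in, train_out, grid)
```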
We study the problem of distribution to real regression, where one aims to regress a mapping f that takes in a distribution input covariate P ∈ I (for a non-parametric family of distributions I) and outputs a real-valued response Y = f(P) + ε. This setting was recently studied in [15], where the “Kernel-Kernel” estimator was introduced and shown to have a …
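The following sketch only illustrates the problem setup Y = f(P) + ε; it is neither the Kernel-Kernel estimator of [15] nor the estimator studied in the paper. Each distribution P_i is observed through a sample, summarized here by crude histogram features (an assumption for illustration), and a ridge regression maps those features to the real response.

```python
import numpy as np
from sklearn.linear_model import Ridge

def featurize(sample, grid):
    # Crude stand-in for a density estimate: histogram features on fixed bins.
    hist, _ = np.histogram(sample, bins=grid, density=True)
    return hist

rng = np.random.default_rng(1)
grid = np.linspace(-5, 5, 41)

# Each covariate is a distribution P_i = N(mu_i, 1), observed only through a sample;
# the response is a noisy real value Y_i = f(P_i) + eps_i, with f depending on mu_i.
mus = rng.uniform(-2, 2, size=100)
samples = [rng.normal(m, 1.0, size=300) for m in mus]
Y = np.sin(mus) + 0.05 * rng.standard_normal(100)

Phi = np.stack([featurize(s, grid) for s in samples])
model = Ridge(alpha=1.0).fit(Phi[:70], Y[:70])
print("held-out MSE:", np.mean((model.predict(Phi[70:]) - Y[70:]) ** 2))
```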
Kernel methods are ubiquitous tools in machine learning. However, there is often little justification for the common practice of selecting a kernel a priori. Even if a universal approximating kernel is selected, the quality of the finite-sample estimator may be greatly affected by the choice of kernel. Furthermore, when directly applying kernel methods, one …
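A small sketch of the point that the kernel choice matters in finite samples: rather than fixing an RBF bandwidth a priori, one can at least cross-validate over it (and the regularizer) in kernel ridge regression. This is standard practice shown for illustration, not the kernel-learning approach proposed in the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Search over the RBF bandwidth (gamma) and ridge penalty (alpha) by cross-validation;
# the finite-sample fit can change substantially with the kernel even though the
# RBF kernel is universal.
search = GridSearchCV(
    KernelRidge(kernel="rbf"),
    param_grid={"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.1, 1.0, 10.0]},
    cv=5,
)
search.fit(X, y)
print("selected kernel hyperparameters:", search.best_params_)
```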
In many scientific and engineering applications, we are tasked with the optimisation of an expensive-to-evaluate black-box function f. Traditional methods for this problem assume access to only this single function. However, in many cases cheap approximations to f may be obtainable. For example, the expensive real-world behaviour of a robot can …
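As a toy illustration of why cheap approximations help (a heuristic sketch, not the multi-fidelity algorithm studied in the work): screen many candidate points with a cheap low-fidelity surrogate f_low, then spend the expensive f_high evaluations only on the most promising few. The functions and the two-stage screening rule below are assumptions made for the example.

```python
import numpy as np

def f_high(x):
    # Expensive black-box objective (stand-in for, e.g., a real robot experiment).
    return -(x - 0.7) ** 2

def f_low(x):
    # Cheap, slightly biased approximation (e.g., a simulator).
    return -(x - 0.65) ** 2 + 0.01 * np.sin(20 * x)

rng = np.random.default_rng(3)
candidates = rng.uniform(0, 1, size=1000)
screened = candidates[np.argsort(f_low(candidates))[-10:]]   # top 10 by the cheap proxy
best = screened[np.argmax(f_high(screened))]                 # only 10 expensive queries
print("estimated maximiser:", best)
```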
We analyze the problem of regression when both the input covariates and the output responses are functions from a nonparametric function class. Function to function regression (FFR) covers a wide range of interesting applications, including time-series prediction problems as well as more general tasks such as studying a mapping between two separate types of …
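One common way to make FFR operational, sketched below under illustrative assumptions (a cosine basis, ridge regression on coefficients; this is not the paper's estimator): represent each observed input and output function by its coefficients in a fixed orthonormal basis, then regress output coefficients on input coefficients.

```python
import numpy as np
from sklearn.linear_model import Ridge

t = np.linspace(0, 1, 128)

def cosine_basis(t, K=10):
    # Orthonormal cosine basis evaluated on the grid t, shape (len(t), K).
    return np.stack(
        [np.ones_like(t)] + [np.sqrt(2) * np.cos(np.pi * k * t) for k in range(1, K)],
        axis=1,
    )

B = cosine_basis(t)
proj = np.linalg.pinv(B)                 # least-squares projection onto the basis

rng = np.random.default_rng(4)
shifts = rng.uniform(0, 0.3, size=60)
F_in = np.stack([np.sin(2 * np.pi * (t + s)) for s in shifts])        # input functions
F_out = np.stack([np.sin(2 * np.pi * (t + s)) ** 2 for s in shifts])  # output functions

C_in, C_out = F_in @ proj.T, F_out @ proj.T          # basis coefficients of each function
model = Ridge(alpha=1e-2).fit(C_in[:40], C_out[:40]) # coefficient-to-coefficient regression
pred_funcs = model.predict(C_in[40:]) @ B.T          # reconstruct predicted output functions
```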
Many interesting machine learning problems are best posed by considering instances that are distributions, or sample sets drawn from distributions. Previous work on machine learning tasks with distributional inputs has relied on pairwise kernel evaluations between pdfs (or sample sets). While such an approach is fine for smaller datasets, the …
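To show what "pairwise kernel evaluations between sample sets" looks like in practice, here is one standard construction (an illustrative choice, not necessarily the kernels used in the prior work cited): an RBF kernel on the squared MMD between sample sets. Building the full Gram matrix requires comparing every pair of sets, which is what limits this style of approach to smaller datasets.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # Base RBF kernel between two point sets, returns the pairwise kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(S1, S2, gamma=0.5):
    # Squared maximum mean discrepancy between two sample sets (biased estimate).
    return rbf(S1, S1, gamma).mean() + rbf(S2, S2, gamma).mean() - 2 * rbf(S1, S2, gamma).mean()

rng = np.random.default_rng(5)
sets = [rng.normal(rng.uniform(-1, 1), 1.0, size=(100, 2)) for _ in range(20)]

N = len(sets)
K = np.empty((N, N))
for i in range(N):
    for j in range(N):
        # O(N^2) set-to-set comparisons to fill the Gram matrix over sample sets.
        K[i, j] = np.exp(-mmd2(sets[i], sets[j]) / 2.0)
```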
A grand challenge of 21st-century cosmology is to accurately estimate the cosmological parameters of our Universe. A major approach to estimating these parameters is to use the large-scale matter distribution of the Universe. Galaxy surveys provide the means to map out cosmic large-scale structure in three dimensions. Information about galaxy …
Multi-task learning attempts to simultaneously leverage data from multiple domains in order to estimate related functions on each domain. For example, transfer learning, a special case of multi-task learning, is often employed when one has a good estimate of a function on a source domain but is unable to estimate a related function well on a target domain …
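A minimal sketch of the borrowing-strength idea behind multi-task learning, under illustrative assumptions (linear tasks, a shared-plus-deviation ridge model, and the hypothetical helper `multitask_ridge`; this is not the paper's method): each task's weights are w_shared + v_t, with the per-task deviation v_t heavily shrunk so data-poor tasks lean on the shared component.

```python
import numpy as np

def multitask_ridge(Xs, ys, lam_shared=0.1, lam_task=10.0, iters=50):
    # Alternating ridge solves: a shared weight vector fit on pooled residuals,
    # and a small per-task deviation fit with a strong penalty.
    d = Xs[0].shape[1]
    w = np.zeros(d)
    vs = [np.zeros(d) for _ in Xs]
    for _ in range(iters):
        A = sum(X.T @ X for X in Xs) + lam_shared * np.eye(d)
        b = sum(X.T @ (y - X @ v) for X, y, v in zip(Xs, ys, vs))
        w = np.linalg.solve(A, b)                      # update shared weights
        vs = [np.linalg.solve(X.T @ X + lam_task * np.eye(d), X.T @ (y - X @ w))
              for X, y in zip(Xs, ys)]                 # update per-task deviations
    return w, vs

rng = np.random.default_rng(6)
w_true = rng.standard_normal(5)
Xs = [rng.standard_normal((n, 5)) for n in (200, 200, 10)]   # last task is data-poor
ys = [X @ (w_true + 0.1 * rng.standard_normal(5)) + 0.05 * rng.standard_normal(len(X))
      for X in Xs]
w_shared, deviations = multitask_ridge(Xs, ys)
```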