
- Junier B. Oliva
- 2016

Background. Modern neuroimaging data have provided a much-needed window into the intricacies of the human brain. Neuroimaging techniques such as functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and diffusion tensor imaging (DTI) often yield many thousands of functional observations per subject. While some success has been had…

- Junier B. Oliva, Barnabás Póczos, Jeff G. Schneider
- ICML
- 2013

We analyze ‘distribution to distribution regression’, where one regresses a mapping whose covariates (inputs) and responses (outputs) are both distributions. No parametric forms are assumed for the input or output distributions, nor are any strong assumptions made on the measure from which the input distributions are drawn. We develop an estimator and derive…

- Junier B. Oliva, Willie Neiswanger, Barnabás Póczos, Jeff G. Schneider, Eric P. Xing
- AISTATS
- 2014

We study the problem of distribution to real regression, where one aims to regress a mapping f that takes in a distribution input covariate P ∈ I (for a non-parametric family of distributions I) and outputs a real-valued response Y = f(P) + ε. This setting was recently studied in [15], where the “KernelKernel” estimator was introduced and shown to have a…
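As a rough illustration of the distribution-to-real setup above (this is a minimal sketch, not the paper's KernelKernel estimator), one can summarize each input sample set by a kernel density estimate and apply a Nadaraya–Watson smoother over L2 distances between those densities. All function names, bandwidths, and the toy task below are assumptions made for the example:

```python
import numpy as np

def kde_on_grid(sample, grid, bw=0.1):
    # Gaussian KDE evaluated on a fixed grid: a simple finite summary
    # of the distribution that generated `sample`.
    diffs = (grid[:, None] - sample[None, :]) / bw
    return np.exp(-0.5 * diffs**2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

def dist_to_real_predict(train_samples, train_y, test_sample, grid, h=0.5):
    # Nadaraya-Watson smoother: weight training responses by a Gaussian
    # kernel on the L2 distance between estimated input densities.
    q = kde_on_grid(test_sample, grid)
    dists = np.array([np.linalg.norm(kde_on_grid(s, grid) - q)
                      for s in train_samples])
    w = np.exp(-0.5 * (dists / h) ** 2)
    return float(np.dot(w, train_y) / w.sum())

rng = np.random.default_rng(0)
grid = np.linspace(-4, 4, 101)
# Toy task: the response is the (unobserved) mean of the sampling distribution.
means = rng.uniform(-1, 1, size=40)
train_samples = [rng.normal(m, 1.0, size=200) for m in means]
test_sample = rng.normal(0.5, 1.0, size=200)
pred = dist_to_real_predict(train_samples, means, test_sample, grid)
print(round(pred, 2))
```

Because the prediction is a convex combination of the training responses, it always stays within their range; the estimator's quality depends on both the KDE bandwidth and the smoothing bandwidth h.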

The use of distributions and of high-level features from deep architectures has become commonplace in modern computer vision. Each of these methodologies has separately achieved a great deal of success in many computer vision tasks. However, there has been little work attempting to leverage the power of these two methodologies jointly. To this end, this paper…

Kernel methods are ubiquitous tools in machine learning. However, there is often little reason for the common practice of selecting a kernel a priori. Even if a universal approximating kernel is selected, the quality of the finite sample estimator may be greatly affected by the choice of kernel. Furthermore, when directly applying kernel methods, one…
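One standard way to make the kernel choice tunable rather than fixed a priori (a sketch of the general idea, not the method this abstract proposes) is to replace the kernel with an explicit random Fourier feature map, whose bandwidth then becomes an ordinary parameter to fit. The function names below are made up for the example:

```python
import numpy as np

def rff(X, n_features=200, bw=1.0, seed=0):
    # Random Fourier features approximating a Gaussian kernel of bandwidth bw.
    # With an explicit feature map, bw can be tuned like any other parameter.
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / bw, size=(X.shape[1], n_features))
    b = rng.uniform(0, 2 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.random.default_rng(3).normal(size=(50, 2))
K_approx = rff(X) @ rff(X).T
# Compare against the exact Gaussian kernel matrix.
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-d2 / 2)
print(np.abs(K_approx - K_exact).mean() < 0.1)
```

The approximation error shrinks as n_features grows, so the feature count trades accuracy for computation.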

In many scientific and engineering applications, we are tasked with the optimisation of an expensive-to-evaluate black-box function f. Traditional methods for this problem assume the availability of only this single function. However, in many cases, cheap approximations to f may be obtainable. For example, the expensive real-world behaviour of a robot can…
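The benefit of a cheap approximation can be shown with a toy two-fidelity screening sketch (an illustration under assumed stand-in functions, not the paper's algorithm): the cheap surrogate narrows the candidate set, and the expensive budget is spent only on a shortlist.

```python
import numpy as np

# Toy setup: f_cheap is a biased, low-cost surrogate of the expensive f_exp.
# Both are made-up formulas standing in for real black-box evaluations.
def f_exp(x):
    return -(x - 0.3) ** 2 + 1.0   # "expensive" objective, optimum at x = 0.3

def f_cheap(x):
    return -(x - 0.35) ** 2 + 0.8  # right shape, slightly wrong optimum

candidates = np.linspace(0.0, 1.0, 201)
# Screen all candidates at the cheap fidelity, then evaluate the
# expensive function only on the top few.
shortlist = candidates[np.argsort(f_cheap(candidates))[-5:]]
best_x = max(shortlist, key=f_exp)
print(round(float(best_x), 2))
```

With 201 cheap calls but only 5 expensive ones, the shortlist still lands near the true optimum because the surrogate preserves the objective's shape.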

- Junier B. Oliva, Willie Neiswanger, Barnabás Póczos, Eric P. Xing, Jeff G. Schneider
- AISTATS
- 2015

We analyze the problem of regression when both input covariates and output responses are functions from a nonparametric function class. Function to function regression (FFR) covers a large range of interesting applications, including time-series prediction problems and more general tasks like studying a mapping between two separate types of…

Many interesting machine learning problems are best posed by considering instances that are distributions, or sample sets drawn from distributions. Previous work on machine learning tasks with distributional inputs has relied on pairwise kernel evaluations between pdfs (or sample sets). While such an approach is fine for smaller datasets, the…
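One common instantiation of the pairwise approach described above (a sketch of the baseline the abstract contrasts against, not the proposed method) builds a Gram matrix over sample sets using maximum mean discrepancy; the quadratic number of set-to-set comparisons is exactly what becomes expensive at scale:

```python
import numpy as np

def mmd2(x, y, bw=1.0):
    # Squared maximum mean discrepancy between two 1-d sample sets,
    # a standard way to compare distributions through their samples.
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2 * bw**2)).mean()
    return k(x, x) + k(y, y) - 2 * k(x, y)

rng = np.random.default_rng(1)
# Three sample sets: two from nearby distributions, one far away.
sets = [rng.normal(m, 1.0, size=100) for m in (0.0, 0.1, 3.0)]
# Pairwise Gram matrix over the sets: cost grows quadratically in the
# number of sets, which is the bottleneck for large collections.
G = np.array([[np.exp(-mmd2(a, b)) for b in sets] for a in sets])
print(G[0, 1] > G[0, 2])  # similar distributions get a larger kernel value
```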

- Siamak Ravanbakhsh, Junier B. Oliva, +4 authors Barnabás Póczos
- ICML
- 2016

A grand challenge of 21st-century cosmology is to accurately estimate the cosmological parameters of our Universe. A major approach to estimating the cosmological parameters is to use the large-scale matter distribution of the Universe. Galaxy surveys provide the means to map out cosmic large-scale structure in three dimensions. Information about galaxy…

- Xuezhi Wang, Junier B. Oliva, Jeff G. Schneider, Barnabás Póczos
- IJCAI
- 2016

Multi-task learning attempts to simultaneously leverage data from multiple domains in order to estimate related functions on each domain. For example, a special case of multi-task learning, transfer learning, is often employed when one has a good estimate of a function on a source domain, but is unable to estimate a related function well on a target domain…
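The transfer-learning special case can be illustrated with a minimal sketch (a toy construction under an assumed linear source/target relationship, not the paper's method): fit on the plentiful source data, then use a handful of target points to learn only a small correction.

```python
import numpy as np

rng = np.random.default_rng(2)
# Source domain: plentiful data from y = 2x plus noise.
xs = rng.uniform(-1, 1, 500)
ys = 2.0 * xs + 0.1 * rng.normal(size=500)
# Target domain: only 5 points, related by a constant offset (y = 2x + 1).
xt = rng.uniform(-1, 1, 5)
yt = 2.0 * xt + 1.0 + 0.1 * rng.normal(size=5)

slope = np.polyfit(xs, ys, 1)[0]        # slope estimated on source data
offset = np.mean(yt - slope * xt)       # tiny target sample fixes the bias
pred = slope * 0.5 + offset             # transfer prediction at x = 0.5
print(round(float(pred), 1))
```

The target sample alone is too small to fit the slope reliably, but it suffices to estimate a single offset once the slope is transferred from the source.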