
We propose and analyse estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are derived from the von Mises expansion and are based on the theory of influence functions, which appear in the semiparametric statistics literature. We show that estimators based either on data-splitting or a… (More)

We consider nonparametric estimation of L2, Rényi-α and Tsallis-α divergences between continuous distributions. Our approach is to construct estimators for particular integral functionals of two densities and translate them into divergence estimators. For the integral functionals, our estimators are based on corrections of a preliminary plug-in… (More)
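As an illustration of the plug-in idea only (a minimal 1-D sketch, not the authors' corrected estimator), one can form kernel density estimates of the two densities and numerically integrate the squared difference to estimate the L2 divergence. The function names and the bandwidth choice here are illustrative assumptions.

```python
import numpy as np

def kde(train, h):
    """Return a Gaussian kernel density estimate fit on `train`."""
    def p(x):
        z = (np.atleast_1d(x)[:, None] - train[None, :]) / h
        return np.exp(-0.5 * z**2).sum(axis=1) / (len(train) * h * np.sqrt(2 * np.pi))
    return p

def l2_divergence_plugin(x, y, h=0.3):
    """Plug-in estimate of ∫ (p - q)^2 dx via a Riemann sum on a grid."""
    lo = min(x.min(), y.min()) - 3 * h
    hi = max(x.max(), y.max()) + 3 * h
    grid = np.linspace(lo, hi, 2000)
    diff = kde(x, h)(grid) - kde(y, h)(grid)
    return (diff**2).sum() * (grid[1] - grid[0])

rng = np.random.default_rng(0)
same = l2_divergence_plugin(rng.normal(0, 1, 500), rng.normal(0, 1, 500))
apart = l2_divergence_plugin(rng.normal(0, 1, 500), rng.normal(3, 1, 500))
# `same` should be near 0; `apart` clearly positive
```

The uncorrected plug-in estimator above inherits the bias of the density estimates; the corrections discussed in the abstract are what recover faster rates.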

Bayesian Optimisation (BO) is a technique for optimising a D-dimensional function which is typically expensive to evaluate. While there have been many successes for BO in low dimensions, scaling it to high dimensions has been notoriously difficult. Existing work on the topic operates under very restrictive settings. In this paper, we identify two key… (More)

- Kirthevasan Kandasamy, Akshay Krishnamurthy, Barnabás Póczos, Larry Wasserman, James M Robins
- 2014

We propose and analyze estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are based on the theory of influence functions, which appear in the semiparametric statistics literature. We show that estimators based either on data-splitting or a leave-one-out technique enjoy fast rates of… (More)
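As a toy illustration of the data-splitting idea (my simplification for the quadratic functional θ = ∫p²(x)dx, not the paper's general construction), one can fit a density estimate on one half of the sample and average the first-order influence-function correction over the other half:

```python
import numpy as np

def kde(train, h):
    """Gaussian kernel density estimate fit on `train`."""
    def p(x):
        z = (np.atleast_1d(x)[:, None] - train[None, :]) / h
        return np.exp(-0.5 * z**2).sum(axis=1) / (len(train) * h * np.sqrt(2 * np.pi))
    return p

def quadratic_functional(sample, h=0.3):
    """One-step data-splitting estimator of theta = ∫ p(x)^2 dx."""
    n = len(sample) // 2
    half1, half2 = sample[:n], sample[n:]
    p1 = kde(half1, h)
    grid = np.linspace(sample.min() - 3 * h, sample.max() + 3 * h, 2000)
    plugin = (p1(grid)**2).sum() * (grid[1] - grid[0])     # plug-in ∫ p̂₁²
    # influence function for ∫p² at x is ψ(x) = 2p(x) − 2θ; average it on half 2
    correction = 2 * p1(half2).mean() - 2 * plugin
    return plugin + correction   # equals 2·mean p̂₁(X₂) − ∫ p̂₁²

rng = np.random.default_rng(1)
est = quadratic_functional(rng.normal(0.0, 1.0, 1000))
# true value for N(0,1) is 1/(2√π) ≈ 0.2821
```

The correction term cancels the first-order bias of the plug-in estimate, which is the mechanism behind the fast rates discussed in the abstract.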

- Akshay Krishnamurthy, Kirthevasan Kandasamy, Barnabás Poczós, Larry Wasserman
- 2015

We give a comprehensive theoretical characterization of a nonparametric estimator for the squared L2 divergence between two continuous distributions. We first bound the rate of convergence of our estimator, showing that it is √n-consistent provided the densities are sufficiently smooth. In this smooth regime, we then show that our estimator is asymptotically… (More)

This paper studies active posterior estimation in a Bayesian setting when the likelihood is expensive to evaluate. Existing techniques for posterior estimation are based on generating samples representative of the posterior. Such methods do not consider efficiency in terms of likelihood evaluations. In order to be query efficient, we treat posterior… (More)

Bayesian Optimization (BO) is commonly used to optimize blackbox objective functions which are expensive to evaluate. A common approach is to model the objective function with a Gaussian Process (GP). Applying GPs in higher-dimensional settings is generally difficult due to the curse of dimensionality in nonparametric regression. Existing works… (More)
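To make the GP-based BO loop concrete, here is a bare-bones 1-D sketch (purely illustrative; the kernel, the UCB acquisition, and all hyperparameters are my assumptions, not the paper's method): fit a GP posterior to the evaluations so far, then query the point maximizing an upper-confidence-bound acquisition.

```python
import numpy as np

def rbf(a, b, ls=0.4):
    """Squared-exponential kernel between 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Standard GP regression: posterior mean and std at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mu, np.sqrt(np.clip(np.diag(cov), 0, None))

def bayes_opt(f, lo, hi, n_init=3, n_iter=20, beta=2.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, n_init)
    y = np.array([f(x) for x in X])
    grid = np.linspace(lo, hi, 500)
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, grid)
        xn = grid[np.argmax(mu + beta * sd)]   # UCB acquisition
        X, y = np.append(X, xn), np.append(y, f(xn))
    return X[np.argmax(y)], y.max()

f = lambda x: -(x - 2.0) ** 2          # toy "expensive" objective, max at x = 2
xbest, ybest = bayes_opt(f, 0.0, 4.0)
```

Every iteration inverts an n×n kernel matrix, so the cost of the loop grows cubically in the number of evaluations; in high dimensions the statistical difficulty of the GP regression itself becomes the bottleneck, which is the issue the abstract addresses.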

In many scientific and engineering applications, we are tasked with the optimisation of an expensive-to-evaluate black-box function f. Traditional methods for this problem assume only the availability of this single function. However, in many cases, cheap approximations to f may be obtainable. For example, the expensive real-world behaviour of a robot can… (More)
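One simple way to exploit such approximations (a hypothetical two-fidelity screening strategy, not the method proposed in the paper) is to rank candidates with the cheap approximation and spend the expensive-evaluation budget only on the leaders:

```python
import numpy as np

def multi_fidelity_search(f_cheap, f_expensive, candidates, budget=5):
    """Screen all candidates cheaply, then evaluate only the top `budget` exactly."""
    scores = np.array([f_cheap(x) for x in candidates])
    top = np.argsort(scores)[-budget:]                 # best `budget` by cheap score
    exact = {i: f_expensive(candidates[i]) for i in top}
    best = max(exact, key=exact.get)
    return candidates[best], exact[best]

# toy example: the cheap function is a slightly biased version of the expensive one
f_exp = lambda x: -(x - 1.5) ** 2
f_chp = lambda x: -(x - 1.4) ** 2
cands = np.linspace(0, 3, 301)
xb, yb = multi_fidelity_search(f_chp, f_exp, cands)
```

The screening works as long as the cheap approximation ranks candidates roughly correctly; a badly biased approximation can discard the true optimum, which is why principled multi-fidelity methods model the relationship between fidelities rather than trusting the cheap one outright.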