Publications
High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
TLDR: It is shown that, with particular choices of kernel functions, non-redundant features with strong statistical dependence on output values can be found in terms of kernel-based independence measures such as the Hilbert-Schmidt independence criterion, and the globally optimal solution can be computed efficiently.
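As a minimal illustration of the kernel-based independence measure this entry names, here is a sketch of the biased empirical Hilbert-Schmidt independence criterion (HSIC) for two 1-D samples; this is not the paper's HSIC Lasso feature-selection procedure, and the RBF bandwidth is an illustrative assumption:

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gram matrix of an RBF kernel on a 1-D sample.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n-1)^2, nonnegative."""
    n = len(x)
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=200)
dep = hsic(x, x ** 2)                      # strongly dependent pair
ind = hsic(x, rng.normal(size=200))        # independent pair
```

For dependent pairs such as `(x, x**2)` the statistic is noticeably larger than for independent draws, which is what makes it usable as a feature-selection score.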
K2-ABC: Approximate Bayesian Computation with Kernel Embeddings
TLDR: This paper proposes a fully nonparametric ABC paradigm which circumvents the need for manually selecting summary statistics, and uses maximum mean discrepancy (MMD) as a dissimilarity measure between the distributions over observed and simulated data.
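The MMD dissimilarity this entry refers to can be sketched with a biased (V-statistic) estimator; the RBF bandwidth and the toy "observed/simulated" data below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def mmd2(x, y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between two 1-D
    samples under an RBF kernel; always nonnegative."""
    def gram(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2 * gram(x, y).mean()

rng = np.random.default_rng(1)
obs = rng.normal(0.0, 1.0, size=300)       # stand-in for observed data
sim_far = rng.normal(1.0, 1.0, size=300)   # simulation with shifted mean
sim_near = rng.normal(0.0, 1.0, size=300)  # simulation matching obs
```

A simulator whose output distribution matches the observations yields a much smaller `mmd2` than a mismatched one, which is the signal ABC needs in place of hand-picked summary statistics.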
A Linear-Time Kernel Goodness-of-Fit Test
TLDR: A novel adaptive test of goodness-of-fit with computational cost linear in the number of samples; it is proved that under a mean-shift alternative, the test always has greater relative efficiency than a previous linear-time kernel test, regardless of the choice of parameters for that test.
Interpretable Distribution Features with Maximum Testing Power
TLDR: In real-world benchmarks on high-dimensional text and image data, linear-time tests using the proposed semimetrics achieve comparable performance to the state-of-the-art quadratic-time maximum mean discrepancy test, while returning human-interpretable features that explain the test results.
Squared-loss Mutual Information Regularization: A Novel Information-theoretic Approach to Semi-supervised Learning
TLDR: SMIR is convex under mild conditions, improving on the nonconvexity of mutual-information regularization, and offers all four of the following abilities to semi-supervised algorithms: an analytical solution, out-of-sample classification, multi-class classification, and probabilistic output.
Bayesian Manifold Learning: The Locally Linear Latent Variable Model (LL-LVM)
TLDR: The LL-LVM encapsulates the local-geometry-preserving intuitions that underlie non-probabilistic methods such as locally linear embedding (LLE), and makes it easy to evaluate the quality of hypothesised neighbourhood relationships, select the intrinsic dimensionality of the manifold, construct out-of-sample extensions, and combine the manifold model with additional probabilistic models that capture the structure of coordinates within the manifold.
An Adaptive Test of Independence with Analytic Kernel Embeddings
TLDR: A new computationally efficient dependence measure, and an adaptive statistical test of independence, are proposed; they perform comparably to the state-of-the-art quadratic-time HSIC test, and outperform competing O(n) and O(n log n) tests.
Large sample analysis of the median heuristic
In kernel methods, the median heuristic has been widely used as a way of setting the bandwidth of RBF kernels. While its empirical performance makes it a safe choice under many circumstances, there…
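The heuristic this entry analyses is simple to state: set the RBF bandwidth to the median of the pairwise distances in the sample. A minimal sketch for 1-D data (the helper name is mine, not from the paper):

```python
import numpy as np

def median_heuristic(x):
    """Median of the pairwise distances of a 1-D sample,
    used as the bandwidth of an RBF kernel."""
    dists = np.abs(x[:, None] - x[None, :])
    # keep each unordered pair once (strict upper triangle)
    return np.median(dists[np.triu_indices(len(x), k=1)])

print(median_heuristic(np.array([0.0, 1.0, 2.0])))  # pairwise distances 1, 2, 1 -> 1.0
```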
Kernel Distributionally Robust Optimization
TLDR: This paper provides both theoretical analyses that extend the robustness properties of kernel methods, and practical algorithms that can be applied to general optimization problems, not limited to kernelized models.
Cognitive Bias in Ambiguity Judgements: Using Computational Models to Dissect the Effects of Mild Mood Manipulation in Humans
TLDR: A novel variant of the cognitive (judgement) bias task, aimed at dissecting the effects of affect manipulations on perceptual and value computations for decision-making under ambiguity in humans, found that Unpleasant Room subjects were (‘pessimistically’) biased towards choosing the SAFE key under ambiguity, but also weighed WINS more heavily than LOSSes compared to Pleasant Room subjects.