Person-independent, pose-invariant estimation of facial expressions and of action unit (AU) intensity is important for situation analysis and for automated video annotation. We evaluated raw 2D shape data of the CK+ database, used Procrustes transformation and the multi-class SVM leave-one-out method for classification. We found close to 100% …
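The Procrustes alignment step mentioned above can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation; it assumes each shape is an n×2 array of landmark coordinates and removes translation, scale, and rotation relative to a reference shape:

```python
import numpy as np

def procrustes_align(shape, reference):
    """Align a 2D landmark shape (n_points x 2) to a reference via
    translation, scaling, and rotation (ordinary Procrustes analysis)."""
    # Center both shapes at the origin.
    X = shape - shape.mean(axis=0)
    Y = reference - reference.mean(axis=0)
    # Normalise scale (Frobenius norm).
    X = X / np.linalg.norm(X)
    Y = Y / np.linalg.norm(Y)
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return X @ (U @ Vt)

# A square rotated by 45 degrees aligns back onto the reference square.
ref = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
theta = np.pi / 4
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
aligned = procrustes_align(ref @ rot.T, ref)
ref_norm = (ref - ref.mean(axis=0)) / np.linalg.norm(ref - ref.mean(axis=0))
print(np.allclose(aligned, ref_norm))  # → True
```

The aligned, normalised shapes can then be fed as feature vectors to a multi-class SVM.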
Since the pioneering work of Shannon, entropy, mutual information, association, divergence measures and kernels on distributions have found a broad range of applications in many areas of machine learning. Entropies provide a natural notion to quantify the uncertainty of random variables; mutual information and association indices measure the dependence …
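The two quantities named above can be illustrated on a discrete joint distribution; this is a textbook sketch, unrelated to any particular estimator from the paper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits for a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability table."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Independent variables carry zero mutual information ...
uniform = np.full((2, 2), 0.25)            # X, Y independent fair coins
print(mutual_information(uniform))         # → 0.0
# ... while a deterministic relation carries one full bit.
copy = np.array([[0.5, 0.0], [0.0, 0.5]])  # Y = X
print(mutual_information(copy))            # → 1.0
```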
Kernel methods represent one of the most powerful tools in machine learning to tackle problems expressed in terms of function values and derivatives, due to their capability to represent and model complex relations. While these methods show good versatility, they are computationally intensive and scale poorly to large data, as they require operations …
Estimation of facial expressions, as spatio-temporal processes, can take advantage of kernel methods if one considers facial landmark positions and their motion in 3D space. We applied support vector classification with kernels derived from dynamic time-warping similarity measures. We achieved over 99% accuracy, measured by area under the ROC curve, using …
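A pure-Python sketch of the dynamic time-warping similarity underlying such kernels is below. The Gaussian-of-DTW kernel shown (`dtw_kernel`, with a made-up bandwidth `gamma`) is only one common way to derive a kernel from a DTW distance, and is in general not positive definite:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_kernel(a, b, gamma=1.0):
    """Illustrative similarity derived from the DTW distance."""
    return np.exp(-gamma * dtw_distance(a, b))

# Time-shifted copies of the same trajectory stay close under DTW.
s = [0, 1, 2, 3, 2, 1, 0]
t = [0, 0, 1, 2, 3, 2, 1, 0]
print(dtw_distance(s, t))   # → 0.0
print(dtw_kernel(s, t))     # → 1.0
```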
We introduce the Locally Linear Latent Variable Model (LL-LVM), a probabilistic model for non-linear manifold discovery that describes a joint distribution over observations, their manifold coordinates and locally linear maps conditioned on a set of neighbourhood relationships. The model allows straightforward variational optimisation of the posterior …
Here, we introduce the blind subspace deconvolution (BSSD) problem, which is the extension of both the blind source deconvolution (BSD) and the independent subspace analysis (ISA) tasks. We treat the undercomplete BSSD (uBSSD) case. Applying temporal concatenation, we reduce this problem to ISA. The associated 'high-dimensional' ISA problem can be handled by …
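The temporal concatenation step can be sketched as stacking time-shifted copies of the observed multichannel signal into one higher-dimensional observation, which an ISA algorithm then processes; the `depth` parameter below is an illustrative assumption, not a value from the paper:

```python
import numpy as np

def temporal_concatenation(x, depth):
    """Stack `depth` consecutive samples of a multichannel signal x
    (T x d) into a 'tall' observation ((T - depth + 1) x d*depth).
    An ISA algorithm can then be run on the concatenated data."""
    T, d = x.shape
    return np.hstack([x[i:T - depth + 1 + i] for i in range(depth)])

rng = np.random.default_rng(0)
x = rng.standard_normal((100, 3))       # 3 observed channels, 100 samples
X = temporal_concatenation(x, depth=4)
print(X.shape)                          # → (97, 12)
```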
We treat the problem of searching for hidden multi-dimensional independent auto-regressive processes. First, we transform the problem to Independent Subspace Analysis (ISA). Our main contribution concerns ISA. We show that under certain conditions, ISA is equivalent to a combinatorial optimization problem. For the solution of this optimization we apply the …
In this paper, the classical detection filter design problem is considered as an input reconstruction problem. Input reconstruction is viewed as a dynamic inversion problem. This approach is based on the existence of the left inverse and arrives at detector architectures whose outputs are the fault signals, while the inputs are the measured system inputs and …
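A toy illustration of input reconstruction through a left inverse, assuming a discrete-time linear system with directly measured state; the matrices `A` and `B` are made up for the example and are not from the paper:

```python
import numpy as np

# Discrete-time linear system x[k+1] = A x[k] + B u[k].  When B has full
# column rank, its left inverse pinv(B) recovers the (fault) input from
# consecutive states: u[k] = pinv(B) (x[k+1] - A x[k]).
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[1.0], [0.5]])

rng = np.random.default_rng(0)
u = rng.standard_normal(50)                 # unknown input signal
x = np.zeros((51, 2))
for k in range(50):
    x[k + 1] = A @ x[k] + B.ravel() * u[k]  # simulate the plant

B_left = np.linalg.pinv(B)                  # left inverse of B
u_hat = np.array([(B_left @ (x[k + 1] - A @ x[k]))[0] for k in range(50)])
print(np.allclose(u_hat, u))                # → True
```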
Here, we address the problem of Independent Subspace Analysis (ISA). We develop a technique that (i) builds upon joint decorrelation for a set of functions, (ii) can be related to kernel based techniques, (iii) can be interpreted as a self-adjusting, self-grouping neural network solution, (iv) can be used both for real and for complex problems, and (v) can …
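As background for the decorrelation idea, the sketch below shows plain whitening, the standard preprocessing in ISA pipelines that decorrelates the data before any grouping step; it is only an illustrative first step, not the paper's joint decorrelation of a set of functions:

```python
import numpy as np

def whiten(X):
    """Decorrelate zero-mean data X (n x d) so its covariance becomes I."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T   # symmetric whitening matrix
    return Xc @ W

rng = np.random.default_rng(1)
mix = np.array([[2., 1., 0.], [0., 1., 0.], [0., 0., 3.]])
X = rng.standard_normal((1000, 3)) @ mix        # correlated observations
Z = whiten(X)
print(np.allclose(Z.T @ Z / len(Z), np.eye(3)))  # → True
```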
We develop a dictionary learning method which is (i) online, (ii) enables overlapping group structures with (iii) non-convex sparsity-inducing regularization and (iv) handles the partially observable case. Structured sparsity and the related group norms have recently gained widespread attention in group-sparsity regularized problems in the case when the …
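To illustrate the group-sparsity idea, the sketch below implements the proximal operator of the convex group-lasso penalty for non-overlapping groups; note the paper itself treats the harder setting of overlapping groups and non-convex regularizers:

```python
import numpy as np

def group_soft_threshold(w, groups, lam):
    """Proximal operator of the group-lasso penalty lam * sum_g ||w_g||_2
    for non-overlapping index groups: shrink each group towards zero and
    zero it out entirely when its norm falls below lam."""
    out = np.zeros_like(w)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * w[g]
    return out

w = np.array([3.0, 4.0, 0.3, 0.4])
groups = [[0, 1], [2, 3]]
w_prox = group_soft_threshold(w, groups, lam=1.0)
print(w_prox)  # first group (norm 5) is shrunk, second (norm 0.5) vanishes
```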