The model order reduction methodology of reduced basis (RB) techniques offers efficient treatment of parametrized partial differential equations (P²DEs) by providing both approximate solution procedures and efficient error estimates. RB methods have so far mainly been applied to finite element schemes for elliptic and parabolic problems. In the current …
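As background for the RB abstracts in this list, the generic RB ansatz can be sketched as follows; the notation is standard and not taken verbatim from the paper. For a parameter \mu, the high-dimensional solution u(\mu) is approximated in a low-dimensional space spanned by precomputed snapshot functions \varphi_1, \dots, \varphi_N:

\[ u(\mu) \approx u_N(\mu) = \sum_{i=1}^{N} c_i(\mu)\, \varphi_i, \qquad \|u(\mu) - u_N(\mu)\| \le \Delta_N(\mu), \]

where the coefficients c_i(\mu) solve a small N-dimensional system and \Delta_N(\mu) is a rigorous, cheaply evaluable a posteriori error bound.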
Kernel methods are becoming increasingly popular for various kinds of machine learning tasks, the most famous being the support vector machine (SVM) for classification. The SVM is well understood when using conditionally positive definite (cpd) kernel functions. However, non-cpd kernels arise in practice and still need to be applied in SVMs. The procedure of …
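A standard illustration of the distinction (general background, not taken from the abstract): the Gaussian kernel is positive definite, while the popular sigmoid kernel is in general not (conditionally) positive definite, so SVM training with it lacks the usual theoretical guarantees:

\[ k_{\mathrm{gauss}}(x, y) = \exp\!\big(-\|x - y\|^2 / (2\sigma^2)\big) \ \text{(pd)}, \qquad k_{\mathrm{sig}}(x, y) = \tanh\!\big(\alpha \langle x, y \rangle + c\big) \ \text{(in general not cpd)}. \]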
In this contribution we describe a novel classification approach for on-line handwriting recognition. The technique combines dynamic time warping (DTW) and support vector machines (SVMs) by establishing a new SVM kernel, which we call the Gaussian DTW (GDTW) kernel. This kernel approach has a main advantage over common HMM techniques: it does not assume a …
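A minimal sketch of how such a kernel can be formed, substituting the DTW distance into a Gaussian kernel; the bandwidth gamma, the squared-Euclidean local cost, and the toy trajectories below are illustrative assumptions, not details from the paper:

    import numpy as np

    def dtw_distance(a, b):
        # Classic O(n*m) dynamic program for the DTW distance between
        # two sequences of feature vectors a (n x d) and b (m x d),
        # using a squared-Euclidean local cost.
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.sum((a[i - 1] - b[j - 1]) ** 2)
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def gdtw_kernel(a, b, gamma=0.1):
        # Gaussian DTW kernel: a Gaussian (RBF) kernel in which the
        # Euclidean distance is replaced by the DTW distance.
        return np.exp(-gamma * dtw_distance(a, b))

    # Two toy pen trajectories of different lengths (hypothetical data).
    x = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
    y = np.array([[0.0, 0.1], [0.5, 0.6], [1.1, 1.0], [2.0, 0.4]])
    print(gdtw_kernel(x, y))

Note that, since DTW is not a metric, the resulting Gram matrices need not be positive definite, which connects this construction to the non-cpd issue discussed above.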
Kernel methods are a class of well established and successful algorithms for pattern analysis thanks to their mathematical elegance and good performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed so far based on the so-called kernel trick. The objective of this paper is twofold. First, we derive an additional kernel …
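The kernel trick referred to here is standard background rather than a contribution of the paper: any algorithm that accesses data only through inner products can be made nonlinear by replacing those inner products with kernel evaluations,

\[ \langle x, y \rangle \;\longrightarrow\; k(x, y) = \langle \varphi(x), \varphi(y) \rangle_{\mathcal{H}}, \]

where \varphi maps into a (possibly infinite-dimensional) feature space \mathcal{H} and never needs to be computed explicitly.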
We present a new approach to treat nonlinear operators in reduced basis approximations of parametrized evolution equations. Our approach is based on empirical interpolation of nonlinear differential operators and their Fréchet derivatives. Efficient offline/online decomposition is obtained for discrete operators that allow an efficient evaluation for a …
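As a sketch of empirical operator interpolation in its generic form (standard EIM-type notation, assumed here rather than quoted from the paper): a nonlinear operator evaluation is replaced by an expansion in precomputed collateral basis functions q_1, \dots, q_M, with coefficients fixed by interpolation at a few "magic" degrees of freedom x_1, \dots, x_M:

\[ \mathcal{N}(u; \mu) \approx \sum_{m=1}^{M} \sigma_m(u; \mu)\, q_m, \qquad \sum_{m=1}^{M} \sigma_m(u; \mu)\, q_m(x_k) = \mathcal{N}(u; \mu)(x_k), \quad k = 1, \dots, M, \]

so that, online, only the M point evaluations of \mathcal{N} are needed instead of a full high-dimensional operator application.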
In many learning problems, prior knowledge about pattern variations can be formalized and beneficially incorporated into the analysis system. The corresponding notion of invariance is commonly used in conceptually different ways. We propose a more distinguishing treatment, in particular in the active field of kernel methods for machine learning and pattern …
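One common way to make this notion concrete (a textbook construction, given for illustration and not necessarily the treatment proposed in the paper) is to average a base kernel k over a finite transformation group G, which yields a kernel invariant under the group action while preserving positive definiteness:

\[ k_{\mathrm{inv}}(x, y) = \frac{1}{|G|^2} \sum_{g \in G} \sum_{g' \in G} k(g x, g' y). \]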
During the last decades, reduced basis (RB) methods have been developed into a broad methodology for the model reduction of problems governed by parametrized partial differential equations (P²DEs). In particular, equations of elliptic and parabolic type with linear, low-polynomial or monotonic nonlinearities have been treated successfully by RB methods …
Within the framework of kernel methods, linear data analysis methods have almost completely been extended to their nonlinear counterparts. In this paper, we focus on nonlinear kernel techniques based on the Mahalanobis distance. Two approaches are distinguished here. The first one assumes an invertible covariance operator, while the second one uses a regularized …
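For orientation (standard definitions, assumed rather than quoted from the abstract): the Mahalanobis distance in input space and its regularized feature-space analogue, on which such kernel techniques are built, read

\[ d^2(x) = (x - \mu)^{\top} \Sigma^{-1} (x - \mu), \qquad d_{\varepsilon}^2(\varphi(x)) = (\varphi(x) - \mu_{\varphi})^{\top} (C_{\varphi} + \varepsilon I)^{-1} (\varphi(x) - \mu_{\varphi}), \]

where \mu_{\varphi} and C_{\varphi} are the mean and covariance operator of the mapped training data; both quantities can be evaluated purely through kernel values, e.g. via an eigendecomposition of the centered Gram matrix as in kernel PCA.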