Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given <i>l</i> training samples, SVR is formulated as a convex quadratic programming (QP) problem with <i>l</i> pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for …
The sequential minimal optimization (SMO) algorithm is one of the simplest decomposition methods for learning support vector machines (SVMs). Keerthi and Gilbert have recently studied the convergence of the SMO algorithm and proved that it always stops within a finite number of iterations. In this letter, we point out the …
Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods …
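The iteration described above can be sketched in miniature. The snippet below is a minimal illustration, not any specific paper's method: it solves a box-constrained QP by repeatedly selecting a working set of a single variable and solving that one-variable subproblem in closed form. (Real SVM duals also carry an equality constraint, which is why SMO must update a pair of variables at a time.)

```python
import numpy as np

def decompose_qp(Q, p, C, iters=200):
    """Minimize 0.5 x^T Q x - p^T x subject to 0 <= x_i <= C by cycling
    through the variables and solving each one-variable subproblem
    exactly (a working set of size one). No large matrix factorization
    is needed: each step touches only one row of Q."""
    n = len(p)
    x = np.zeros(n)
    for t in range(iters):
        i = t % n                      # cyclic selection; KKT-violation
                                       # based selection is also common
        g = Q[i] @ x - p[i]            # partial derivative at current x
        # Closed-form minimizer in x_i, clipped back into the box
        x[i] = np.clip(x[i] - g / Q[i, i], 0.0, C)
    return x

# Tiny example with a positive-definite Q
Q = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.2],
              [0.0, 0.2, 1.5]])
p = np.array([1.0, 1.0, 1.0])
x = decompose_qp(Q, p, C=10.0)
```

Because each subproblem is solved exactly, the objective never increases, and for a positive-definite `Q` the iterates approach the QP's solution.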
A novel method for training support vector machines (SVMs) is proposed to speed up SVMs in the test phase. It has three main steps. First, an SVM is trained on all the training samples, producing a number of support vectors. Second, the support vectors that contribute less to the shape of the decision surface are excluded from the training set. …
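The second step can be sketched as follows. The abstract is truncated before the paper's actual exclusion criterion, so this sketch uses the magnitude of the dual coefficient |alpha_i y_i| as a crude stand-in for a support vector's contribution to the decision surface; the function names and the RBF kernel are assumptions, not the paper's notation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def prune_support_vectors(sv, alpha_y, keep_frac=0.5):
    """Drop the support vectors with the smallest |alpha_i * y_i|.
    NOTE: |alpha| is only a simple proxy for a vector's influence on
    the decision surface; the paper's criterion is not shown here."""
    k = max(1, int(np.ceil(keep_frac * len(sv))))
    order = np.argsort(-np.abs(alpha_y))   # largest contributions first
    keep = np.sort(order[:k])              # preserve original ordering
    return sv[keep], alpha_y[keep]

def decision(x, sv, alpha_y, b=0.0, gamma=1.0):
    # f(x) = sum_i alpha_i y_i K(sv_i, x) + b
    return rbf_kernel(x[None, :], sv, gamma) @ alpha_y + b

# Hypothetical output of step one: four support vectors and coefficients
sv = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
alpha_y = np.array([0.9, -0.1, 0.5, -0.8])
sv_kept, ay_kept = prune_support_vectors(sv, alpha_y, keep_frac=0.5)
```

Per the method's third step (truncated above), the SVM would then be retrained on the reduced set, so fewer kernel evaluations are needed at test time.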
Computational Optimization and Applications manuscript No. Abstract Nonnegative matrix factorization (NMF) is the problem of approximating a given nonnegative matrix by the product of two nonnegative matrices. The multiplicative updates proposed by Lee and Seung are widely used as efficient computational methods for NMF. However, the global convergence of …