Norikazu Takahashi

The sequential minimal optimization (SMO) algorithm is one of the simplest decomposition methods for learning support vector machines (SVMs). Keerthi and Gilbert have recently studied the convergence property of the SMO algorithm and given a proof that the SMO algorithm always stops within a finite number of iterations. In this letter, we point out the …
Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods …
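To illustrate the decomposition idea, the sketch below performs one SMO-style step for the C-SVM dual: all variables except a two-element working set {i, j} are frozen, and the resulting one-dimensional subproblem is solved analytically and clipped to the feasible box. The function name and interface are hypothetical, chosen only for this illustration; it is a minimal sketch, not the specific selection rule or implementation studied in the papers above.

```python
import numpy as np

def smo_pair_update(alpha, y, K, errors, i, j, C, tol=1e-12):
    """One decomposition step over the working set {i, j}.

    alpha  : current dual variables (updated in place)
    y      : labels in {-1, +1}
    K      : kernel (Gram) matrix
    errors : E_k = f(x_k) - y_k for the current alpha
    Returns True if alpha was changed.
    """
    if i == j:
        return False
    # Feasible segment [L, H] implied by 0 <= alpha <= C and the
    # equality constraint sum_k alpha_k y_k = const, restricted to the pair.
    if y[i] != y[j]:
        L = max(0.0, alpha[j] - alpha[i])
        H = min(C, C + alpha[j] - alpha[i])
    else:
        L = max(0.0, alpha[i] + alpha[j] - C)
        H = min(C, alpha[i] + alpha[j])
    if H - L < tol:
        return False
    # Curvature of the subproblem along the feasible line.
    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
    if eta <= tol:
        return False  # sketch: simply skip non-positive-curvature pairs
    alpha_j_new = alpha[j] + y[j] * (errors[i] - errors[j]) / eta
    alpha_j_new = min(H, max(L, alpha_j_new))  # clip to the box
    if abs(alpha_j_new - alpha[j]) < tol:
        return False
    # Keep sum_k alpha_k y_k unchanged.
    alpha[i] += y[i] * y[j] * (alpha[j] - alpha_j_new)
    alpha[j] = alpha_j_new
    return True
```

Only the 2x2 block of the kernel matrix and the two current errors are touched per step, which is why such methods avoid large matrix computations.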
Nonnegative matrix factorization (NMF) is the problem of approximating a given nonnegative matrix by the product of two nonnegative matrices. The multiplicative updates proposed by Lee and Seung are widely used as efficient computational methods for NMF. However, the global convergence of …
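For context, here is a minimal NumPy sketch of the Lee and Seung multiplicative updates for the Frobenius-norm NMF objective. The small constant eps added to the denominators is an assumption of this sketch (a common guard against division by zero, not part of the original updates); that the unmodified updates can produce zero denominators is one reason their global convergence requires care.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-12, seed=0):
    """Lee-Seung multiplicative updates for min_{W,H >= 0} ||V - W H||_F^2.

    V : (m, n) nonnegative matrix, r : factorization rank.
    eps is a small guard against zero denominators (an assumption of
    this sketch, not part of the original updates).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # elementwise multiplicative updates; nonnegativity is preserved
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```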
Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given l training samples, SVR is formulated as a convex quadratic programming (QP) problem with l pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each …
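For reference, the standard epsilon-insensitive SVR dual, written here in generic notation (not necessarily the exact formulation used in the paper), shows where the l pairs of variables (alpha_i, alpha_i^*) come from:

$$
\begin{aligned}
\min_{\alpha,\,\alpha^*}\quad & \tfrac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l}(\alpha_i-\alpha_i^*)(\alpha_j-\alpha_j^*)K(x_i,x_j)
 + \varepsilon\sum_{i=1}^{l}(\alpha_i+\alpha_i^*) - \sum_{i=1}^{l} y_i(\alpha_i-\alpha_i^*) \\
\text{s.t.}\quad & \sum_{i=1}^{l}(\alpha_i-\alpha_i^*)=0, \qquad 0 \le \alpha_i,\ \alpha_i^* \le C, \quad i=1,\dots,l.
\end{aligned}
$$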