
- Norikazu Takahashi, Tetsuo Nishi
- IEEE Transactions on Neural Networks
- 2005

The sequential minimal optimization (SMO) algorithm is one of the simplest decomposition methods for learning support vector machines (SVMs). Keerthi and Gilbert recently studied the convergence properties of the SMO algorithm and proved that it always stops within a finite number of iterations. In this letter, we point out the… (More)
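The finite-termination result concerns SMO's stopping rule: iterate until no pair of variables violates the KKT optimality condition. A minimal sketch of that violation test, assuming the standard SVM dual with kernel matrix `K`, labels `y` in {-1, +1}, and box constraint `C` (all names here are illustrative, not taken from the paper):

```python
import numpy as np

def smo_violation_gap(alpha, y, K, C):
    """KKT violation gap m(alpha) - M(alpha) for the SVM dual.

    SMO-type solvers repeatedly pick a maximal violating pair and stop
    once this gap drops below a tolerance. Assumes both index sets are
    non-empty, which holds for any mixed-label data with 0 <= alpha <= C.
    """
    Q = (y[:, None] * y[None, :]) * K      # Q_ij = y_i y_j K(x_i, x_j)
    grad = Q @ alpha - 1.0                 # gradient of the dual objective
    up = ((alpha < C) & (y == 1)) | ((alpha > 0) & (y == -1))
    low = ((alpha < C) & (y == -1)) | ((alpha > 0) & (y == 1))
    return np.max(-y[up] * grad[up]) - np.min(-y[low] * grad[low])
```

At `alpha = 0` with both labels present the gap is 2, so at least one SMO step is needed; optimality is declared once the gap falls within tolerance.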

- Ryota Hibi, Norikazu Takahashi
- ICONIP
- 2011

- Jun Guo, Norikazu Takahashi, Tetsuo Nishi
- IEICE Transactions
- 2006

- J. Guo, N. Takahashi, T. Nishi
- Proceedings of the 2005 European Conference on…
- 2005

A novel method for training support vector machines (SVMs) is proposed to speed up SVMs in the test phase. It has three main steps. First, an SVM is trained on all the training samples, producing a number of support vectors. Second, support vectors that contribute less to the shape of the decision surface are excluded from the training set.… (More)
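The second step — discarding support vectors that contribute little to the decision surface — can be sketched as below. The snippet above does not quote the paper's actual contribution measure, so using the magnitude of the dual coefficient |alpha_i y_i| as a proxy is an assumption here:

```python
import numpy as np

def prune_support_vectors(sv, dual_coef, keep_fraction=0.5):
    """Keep the fraction of support vectors with the largest |dual coefficient|.

    sv        : (n_sv, d) support vectors from the first training pass
    dual_coef : (n_sv,) values alpha_i * y_i
    Returns the reduced set, to be retrained on in the third step.
    Using |dual_coef| as the 'contribution' score is an assumption.
    """
    k = max(1, int(np.ceil(keep_fraction * len(dual_coef))))
    keep = np.argsort(-np.abs(dual_coef))[:k]   # indices of the k largest
    return sv[keep], dual_coef[keep]
```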

- Norikazu Takahashi, Jun Guo, Tetsuo Nishi
- IEEE Transactions on Neural Networks
- 2008

Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given *l* training samples, SVR is formulated as a convex quadratic programming (QP) problem with *l* pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for… (More)
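The pairs of dual variables come from the ε-insensitive loss that defines SVR: each sample contributes one variable for deviations above the ε-tube and one for deviations below it. A sketch of that loss (symbols illustrative):

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """SVR loss: zero inside the eps-tube, linear outside it.

    In the dual QP, each sample i gets a pair (alpha_i, alpha_i^*)
    penalizing violations above and below the tube respectively.
    """
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)
```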

- Snehal A. Mulay, Sandya Peddabachigari, +23 authors Jun Ma
- 2011

Support vector machines (SVMs) are classifiers originally designed for binary classification, but many classification applications require solving multi-class problems. A decision-tree-based support vector machine, which combines support vector machines and decision trees, can be an effective way of solving multi-class problems. This method can decrease the… (More)
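One concrete saving from the tree arrangement: for N classes, a binary tree of SVMs needs only N-1 classifiers (one per internal node), versus N(N-1)/2 for one-vs-one or N for one-vs-all. The abstract does not say which baseline it compares against; the counts below are the standard ones:

```python
def num_binary_svms(n_classes, scheme):
    """Number of binary SVMs each multi-class scheme must train."""
    if scheme == "tree":          # decision-tree SVM: one per internal node
        return n_classes - 1
    if scheme == "one-vs-one":    # one SVM per pair of classes
        return n_classes * (n_classes - 1) // 2
    if scheme == "one-vs-all":    # one SVM per class
        return n_classes
    raise ValueError(scheme)
```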

- Norikazu Takahashi, Tetsuo Nishi
- ISCAS
- 2006

The global dynamical behavior of one-dimensional cellular neural networks (1-D CNNs) with the antisymmetric template A = [s, p, -s] and the Dirichlet boundary condition is studied in this paper. Under the assumption that the outputs of the boundary cells are fixed to 1 or -1, a… (More)

- Yasushi Otsuka, Satoshi Tsuchikawa, Aiko Mitake, Norikazu Takahashi, Chikao Kouno
- Rinsho shinkeigaku = Clinical neurology
- 2002

We report a case of typical transient global amnesia (TGA). A 64-year-old right-handed man suddenly became unable to retain recent memories, without any precipitating event. There were no neurological deficits except for the recent memory disturbance. Laboratory data, EEG, and MRI showed no findings that could account for the attack. Examination of single… (More)

- Norikazu Takahashi, Tetsuo Nishi
- IEEE Transactions on Neural Networks
- 2006

Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods… (More)
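The decomposition idea — pick a small working set, solve the restricted QP exactly, repeat — can be illustrated in its simplest form with a working set of size one (projected coordinate descent on a box-constrained QP; all names here are illustrative):

```python
import numpy as np

def decompose_box_qp(Q, p, lo, hi, sweeps=100):
    """Minimize 0.5 x^T Q x + p^T x subject to lo <= x <= hi.

    Each step fixes all variables but one (a size-1 working set),
    solves that one-variable subproblem exactly, and clips to the box,
    so no large matrix factorization is ever needed.
    """
    x = np.full(len(p), lo, dtype=float)
    for _ in range(sweeps):
        for i in range(len(p)):
            g = Q[i] @ x + p[i]                # partial derivative at x_i
            if Q[i, i] > 0:
                x[i] = np.clip(x[i] - g / Q[i, i], lo, hi)
    return x
```

Real SVM decomposition methods select the working set by maximal KKT violation rather than cycling, but the memory argument is the same: only one row of Q is touched per update.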

- Jun Guo, Norikazu Takahashi, Tetsuo Nishi
- ICONIP
- 2006