Chong Jin Ong

In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the …
Sequential minimal optimization (SMO) is a popular algorithm for training support vector machines (SVMs), but it still requires a large amount of computation time on large-scale problems. This paper proposes a parallel implementation of SMO for training SVMs, developed using the message passing interface (MPI). Specifically, the …
In this paper, we apply popular Bayesian techniques to the support vector classifier. We propose a novel differentiable loss function, called the trigonometric loss function, with the desirable characteristic of natural normalization in the likelihood function, and then follow standard Gaussian process techniques to set up a Bayesian framework. In this framework, …
This paper describes an improved algorithm for the numerical solution of the support vector machine (SVM) classification problem for all values of the regularization parameter C. The algorithm is motivated by the work of Hastie and follows the main idea of tracking the optimality conditions of the SVM solution for ascending values of C. It …
This paper considers a nonlinear feedback control policy that extends those provided by command governors and reference governors. Like these control approaches, it applies to discrete-time linear systems with hard constraints and set-bounded disturbances. The control policy retains the main properties of traditional governors, such as …
This paper considers fast algorithms for computing the Euclidean distance between objects that are modeled by convex polytopes in three-dimensional space. The algorithms, designated by RGJK, are modifications of the Gilbert–Johnson–Keerthi algorithm that follow the scheme originated by Cameron. Each polytope is represented by its vertices and a list of …
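GJK-type algorithms such as the one abstracted above reduce the distance query between two convex polytopes A and B to the distance from the origin to the Minkowski difference A − B, probed through repeated support-function evaluations. A minimal sketch of the vertex-based support mapping such algorithms rely on (function names and the cube test shape are illustrative, not from the paper):

```python
import numpy as np

def support(vertices, d):
    """Vertex of a convex polytope farthest in direction d.

    For a polytope stored as an (n, 3) array of vertices, the support
    function is a single argmax over dot products.
    """
    vertices = np.asarray(vertices, dtype=float)
    return vertices[np.argmax(vertices @ d)]

def minkowski_diff_support(A, B, d):
    """Support point of the Minkowski difference A - B in direction d.

    GJK iterates this mapping to build a simplex approaching the point
    of A - B closest to the origin; its norm is the distance A-to-B.
    """
    return support(A, d) - support(B, -d)
```

Because each query touches only the vertex list, adjacency information (the "list of" neighbors per vertex that the abstract mentions) can accelerate the argmax via hill climbing on large polytopes.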
In this paper, we propose a unified non-quadratic loss function for regression, known as the soft insensitive loss function (SILF). SILF is a flexible model that possesses most of the desirable characteristics of popular non-quadratic loss functions, such as the Laplacian, Huber's, and Vapnik's ε-insensitive loss functions. We describe the properties of SILF and …
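A minimal sketch of a soft insensitive loss of this kind, assuming a common two-parameter form (insensitivity ε and smoothing β, both illustrative defaults): zero inside a reduced insensitive zone, quadratic (Huber-like) in a transition band, and linear (ε-insensitive style) in the tails.

```python
import numpy as np

def silf(delta, eps=0.1, beta=0.5):
    """Sketch of a soft insensitive loss on residual delta.

    Piecewise: 0 for |delta| <= (1-beta)*eps, a quadratic blend on
    (1-beta)*eps < |delta| < (1+beta)*eps, and |delta| - eps beyond.
    The pieces match at the band edges, so the loss is C^1 smooth.
    """
    a = np.abs(delta)
    lo, hi = (1.0 - beta) * eps, (1.0 + beta) * eps
    quad = (a - lo) ** 2 / (4.0 * beta * eps)   # transition band
    lin = a - eps                               # epsilon-insensitive tail
    return np.where(a <= lo, 0.0, np.where(a < hi, quad, lin))
```

As β → 0 the band vanishes and the ε-insensitive loss is recovered; as β → 1 with ε fixed, the flat zone vanishes and the loss behaves like Huber's.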
The least squares support vector machine (LS-SVM) formulation corresponds to the solution of a linear system of equations. Several approaches to its numerical solution have been proposed in the literature. In this letter, we propose an improved method for the numerical solution of the LS-SVM problem and show that it can be solved using one reduced system of …
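For reference, the linear system in question is the standard LS-SVM KKT system [[0, 1ᵀ], [1, K + I/γ]] [b; α] = [0; y]. A minimal sketch of forming and solving it directly (the RBF kernel and the parameters γ, σ are illustrative choices, not the letter's improved method):

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT linear system with a Gaussian (RBF) kernel."""
    n = len(y)
    # Kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))
    # Bordered system: first row/column encode the bias constraint 1'alpha = 0.
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return b, alpha, K
```

A dense `np.linalg.solve` is O(n³); the reduced-system reformulations the letter refers to aim precisely at avoiding the cost and conditioning issues of this naive bordered solve.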
This brief deals with the estimator design problem for discrete-time switched neural networks with time-varying delay. One main difficulty is the asynchronous mode switching between the neural network and the estimator. Our goal is to design a mode-dependent estimator for the switched neural networks under average dwell-time switching such that the estimation …