
- Jacqueline W. T. Wong, Wing-Kay Kan, Gilbert H. Young
- SIGIR Forum
- 1996

An important step in building up the document database of a full-text retrieval system is to classify each document under one or more *classes* according to the *topical domains* that the document discusses. This is commonly referred to as *classification*. *Automatic classification* attempts to replace human classifiers by using computers to…

- John Sum, Andrew Chi-Sing Leung, Peter Kwong-Shun Tam, Gilbert H. Young, Wing-Kay Kan, Lai-Wan Chan
- IEEE Trans. Neural Networks
- 1999

Recently we proposed a simple winner-take-all (WTA) neural network circuit. Assuming no external input, we derived an analytic equation for its network response time. In this paper, we further analyze the network response time for a class of winner-take-all circuits involving self-decay and show that the network response time of such a class of…
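The abstract concerns response-time analysis rather than the circuit itself, but the general idea of a WTA network with self-decay can be sketched as a toy discrete-time iteration (the update rule and all parameters here are illustrative, not those of the paper):

```python
import numpy as np

def winner_take_all(x, decay=0.1, inhibit=0.5, rate=0.1, steps=200):
    """Toy discrete-time WTA dynamics (illustrative, not the paper's circuit):
    each unit self-decays and is inhibited by the total activity of the
    other units; activities are clipped to stay non-negative."""
    a = np.asarray(x, dtype=float).copy()
    for _ in range(steps):
        total = a.sum()
        # self-decay plus lateral inhibition from all other units
        a = a + rate * (-decay * a - inhibit * (total - a))
        a = np.clip(a, 0.0, None)
    return a

a = winner_take_all([1.0, 0.6, 0.3])
# only the unit with the largest initial activity remains active
```

Under these dynamics the weaker units are driven to zero first, after which only the strongest unit retains a nonzero activity; the number of steps until the losers die out is one simple notion of "network response time."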

- Chi-Hong Leung, Wing-Kay Kan
- JASIS
- 1997

- Andrew Chi-Sing Leung, Gilbert H. Young, John Sum, Wing-Kay Kan
- IEEE Trans. Neural Networks
- 1999

In this paper, the regularization effect of applying the forgetting recursive least square (FRLS) training technique to feedforward neural networks is studied. We derive our result from the corresponding equations for the expected prediction error and the expected training error. By comparing these error equations with other equations obtained previously from the…

- John Sum, Andrew Chi-Sing Leung, Gilbert H. Young, Wing-Kay Kan
- IEEE Trans. Neural Networks
- 1999

When using the extended Kalman filter approach to train and prune a feedforward neural network, one usually encounters the problems of how to set the initial condition and how to use the result obtained to prune the network. In this paper, some cues on setting the initial condition are presented and illustrated with a simple example. Then…

- Chi-Hong Leung, Wing-Kay Kan
- Neural Parallel & Scientific Comp.
- 1996

- John Sum, Wing-Kay Kan, Gilbert H. Young
- Neural Computing & Applications
- 1999

This paper presents several aspects of the application of the NARX model and the recurrent neural network (RNN) model to system identification and control. We show that every RNN can be transformed into a first-order NARX model, and vice versa, under the condition that the neuron transfer function is similar to the NARX transfer function. If the neuron…
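As a minimal illustration of the kind of correspondence described above (a toy first-order case with a tanh transfer function; the weights and input sequence are made up for the example, not taken from the paper), a first-order NARX update and a single-neuron RNN with matching parameters trace out identical trajectories:

```python
import numpy as np

def narx_step(y_prev, u, a=0.8, b=0.5):
    # First-order NARX: next output depends only on the previous output and input
    return np.tanh(a * y_prev + b * u)

def rnn_step(s_prev, u, w=0.8, v=0.5):
    # Single-neuron RNN with tanh activation; the output is the state itself
    return np.tanh(w * s_prev + v * u)

u_seq = [0.1, -0.3, 0.7, 0.0]
y, s = 0.0, 0.0
ys, ss = [], []
for u in u_seq:
    y, s = narx_step(y, u), rnn_step(s, u)
    ys.append(y)
    ss.append(s)
# with matching parameters and transfer functions the two models coincide
```

The point of the sketch is only that when the RNN's hidden state equals the model's output and both use the same transfer function, the two update equations are the same map under different names.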

- John Sum, Andrew Chi-Sing Leung, Gilbert H. Young, Lai-Wan Chan, Wing-Kay Kan
- Neural Computation
- 1999

Pruning a neural network to a reasonably smaller size, and if possible improving its generalization, has long been investigated. Conventionally, pruning is based on an error sensitivity measure, and the nature of the problem being solved is usually stationary. In this article, we present an adaptive pruning algorithm for…

In this paper, we describe and evaluate a set of Web-based and workflow-sensitive educational techniques that can increase the effectiveness of teaching, collaboration, and resource sharing. These techniques allow educators to pursue the goal of developing high-quality teaching and learning using fast-growing Internet technologies. We construct a…

- Wing-Kay Kan
- 1988