
- Steve Lawrence, C. Lee Giles, Ah Chung Tsoi, Andrew D. Back
- IEEE Trans. Neural Networks
- 1997

We present a hybrid neural-network system for human face recognition that compares favourably with other methods. The system combines local image sampling, a self-organizing map (SOM) neural network, and a convolutional neural network. The SOM provides a quantization of the image samples into a topological space where inputs that are nearby in the original space…
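The SOM quantization step described above can be illustrated with a minimal self-organizing map in NumPy. This is an illustrative sketch, not the paper's implementation: the grid size, learning rate, and Gaussian neighbourhood width are arbitrary choices, and the function names are invented for this example.

```python
import numpy as np

def train_som(samples, grid=(5, 5), epochs=20, lr=0.5, sigma=1.5, seed=0):
    """Train a small self-organizing map: each grid node holds a codebook
    vector; the winning node and its grid neighbours move toward each sample."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = samples.shape[1]
    codebook = rng.standard_normal((h, w, dim))
    # Grid coordinates of every node, used by the neighbourhood function.
    coords = np.stack(
        np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1
    )
    for _ in range(epochs):
        for x in samples:
            # Best-matching unit: node whose codebook vector is nearest to x.
            dists = np.linalg.norm(codebook - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), (h, w))
            # Gaussian neighbourhood around the BMU, measured on the grid.
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma**2))
            codebook += lr * g[..., None] * (x - codebook)
    return codebook

def quantize(codebook, x):
    """Map a sample to the grid coordinates of its best-matching unit."""
    dists = np.linalg.norm(codebook - x, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)
```

After training, `quantize` maps each image sample to a grid location, so samples that are close in input space tend to land on the same or neighbouring nodes — the topological quantization the abstract refers to.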

- Andrew D. Back, Ah Chung Tsoi
- Neural Computation
- 1991

A new neural network architecture involving either a local-feedforward global-feedforward and/or a local-recurrent global-feedforward structure is proposed. A learning rule minimizing a mean-square-error criterion is derived. The performance of this algorithm (the local-recurrent global-feedforward architecture) is compared with a local-feedforward…

- Franco Scarselli, Marco Gori, Ah Chung Tsoi, Markus Hagenbuchner, Gabriele Monfardini
- IEEE Transactions on Neural Networks
- 2009

Many underlying relationships among data in several areas of science and engineering, e.g., computer vision, molecular chemistry, molecular biology, pattern recognition, and data mining, can be represented in terms of graphs. In this paper, we propose a new neural network model, called the graph neural network (GNN) model, that extends existing neural network…
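In the GNN model, node states are defined as the fixed point of a contraction map and computed by iterating the update until it converges. The sketch below assumes a deliberately simple transition function — one tanh layer over the node label concatenated with the sum of neighbour states; the function name and shapes are illustrative, not the paper's.

```python
import numpy as np

def gnn_states(adj, labels, W, b, iters=100, tol=1e-6):
    """Iterate x_v <- tanh(W [label_v ; sum of neighbour states] + b) until
    the states reach a (near) fixed point. If the weights are small enough,
    the update is a contraction, so iteration converges from any start."""
    n = labels.shape[0]
    s = W.shape[0]                       # state dimension
    x = np.zeros((n, s))
    for _ in range(iters):
        agg = adj @ x                    # sum of neighbour states per node
        inp = np.hstack([labels, agg])   # shape (n, label_dim + s)
        x_new = np.tanh(inp @ W.T + b)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x
```

The fixed-point states can then feed an output network per node or per graph; the key property is that the result does not depend on the (zero) initialization, only on the graph, the labels, and the weights.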

- Markus Hagenbuchner, Alessandro Sperduti, Ah Chung Tsoi
- IEEE Trans. Neural Networks
- 2003

Recent developments in the area of neural networks produced models capable of dealing with structured data. Here, we propose the first fully unsupervised model, namely an extension of traditional self-organizing maps (SOMs), for the processing of labeled directed acyclic graphs (DAGs). The extension is obtained by using the unfolding procedure adopted in…

- Ah Chung Tsoi, Andrew D. Back
- IEEE Trans. Neural Networks
- 1994

In this paper, we will consider a number of local-recurrent-global-feedforward (LRGF) networks that have been introduced by a number of research groups in the past few years. We first analyze the various architectures, with a view to highlighting their differences. Then we introduce a general LRGF network structure that includes most of the network…
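A minimal illustration of the local-recurrent global-feedforward idea: feedback is confined to an IIR filter inside each synapse, while the network as a whole stays feedforward. The coefficient names `b` (feedforward taps) and `a` (feedback taps) are assumed for illustration and do not follow any particular paper's notation.

```python
import numpy as np

def iir_synapse(x, b, a):
    """Local-recurrent synapse: an IIR filter
        s[n] = sum_k b[k] x[n-k] + sum_k a[k] s[n-1-k],
    so recurrence lives inside the synapse, not across the whole network."""
    s = np.zeros(len(x))
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc += sum(a[k] * s[n - 1 - k] for k in range(len(a)) if n - 1 - k >= 0)
        s[n] = acc
    return s

def lrgf_neuron(x, b, a):
    """Neuron output: a static nonlinearity applied to the filtered input."""
    return np.tanh(iir_synapse(x, b, a))
```

With a single feedback tap `a = [0.5]` and `b = [1.0]`, the synapse's impulse response decays geometrically (1, 0.5, 0.25, …), giving the neuron a fading memory of past inputs without any global recurrent loop.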

- Franco Scarselli, Ah Chung Tsoi
- Neural Networks
- 1998

- Ah Chung Tsoi, Liangsuo Ma
- ICASSP
- 2003

In this paper, the problem of blind deconvolution of dynamical systems is considered using a state space approach. A balanced parameterized canonical form is used as a model for the underlying dynamical system instead of the more common controller or observable canonical form. The results are compared with those obtained using a controller canonical form…
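The forward model underlying such a state-space formulation can be sketched as follows. This only simulates x[n+1] = A x[n] + B u[n], y[n] = C x[n]; the blind-deconvolution step — recovering u from y alone — is the hard part the paper addresses and is not shown here.

```python
import numpy as np

def simulate_state_space(A, B, C, u, x0=None):
    """Run a linear state-space model forward:
        x[n+1] = A x[n] + B u[n],   y[n] = C x[n].
    Blind deconvolution would attempt to recover u given only y."""
    x = np.zeros(A.shape[0]) if x0 is None else x0
    ys = []
    for un in u:
        ys.append(C @ x)          # observe current state
        x = A @ x + B * un        # advance the state with the input
    return np.array(ys)
```

For a first-order system with A = [[0.5]], B = [1], C = [1], an impulse input produces the output 0, 1, 0.5, 0.25, … — the system's impulse response delayed by one step, since the output is read before the state update.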

Financial forecasting is an example of a signal processing problem which is challenging due to small sample sizes, high noise, non-stationarity, and non-linearity. Neural networks have been very successful in a number of signal processing applications. We discuss fundamental limitations and inherent difficulties when using neural networks for the processing…

One of the most important aspects of any machine learning paradigm is how it scales according to problem size and complexity. Using a task with known optimal training error, and a pre-specified maximum number of training updates, we investigate the convergence of the backpropagation algorithm with respect to a) the complexity of the required function…
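The experimental protocol described — a task whose optimal training error is known exactly, plus a pre-specified maximum number of updates — can be sketched with plain gradient descent on a one-parameter quadratic, a stand-in for backpropagation chosen so the optimum is known to be zero. The function and parameter names are invented for this sketch.

```python
def updates_to_converge(target_w=2.0, lr=0.1, tol=1e-6, max_updates=1000):
    """Gradient descent on E(w) = (w - target_w)^2, whose optimal training
    error is exactly zero. Report how many updates are needed to bring the
    error below tol, or max_updates if the budget is exhausted first."""
    w = 0.0
    for n in range(1, max_updates + 1):
        grad = 2.0 * (w - target_w)
        w -= lr * grad
        if (w - target_w) ** 2 < tol:
            return n
    return max_updates
```

Sweeping the tolerance (or, in the real experiment, the complexity of the target function) and recording the update count against the fixed budget gives exactly the kind of convergence-versus-complexity curve the abstract describes.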

- Ah Chung Tsoi, Andrew D. Back
- Neurocomputing
- 1997

Paper [1] aimed at providing a unified presentation of neural network architectures. We show in the present comment (i) that the canonical form of recurrent neural networks presented by Nerrand et al. [2] many years ago provides the desired unification, (ii) that what Tsoi and Back call Nerrand's canonical form is not the canonical form presented by Nerrand…