
- Paul Kang-Hoh Phua, Daohua Ming
- IEEE Trans. Neural Networks
- 2003

In this paper, we propose the use of parallel quasi-Newton (QN) optimization techniques to improve the rate of convergence of the training process for neural networks. The parallel algorithms are developed by using the self-scaling quasi-Newton (SSQN) methods. At the beginning of each iteration, a set of parallel search directions is generated. Each of…
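The self-scaling update that the SSQN family builds on can be sketched as follows. This is a generic Oren–Luenberger-style self-scaling BFGS step, not the authors' parallel algorithm; the 2-D quadratic test problem and exact line search are illustrative assumptions.

```python
import numpy as np

def ss_bfgs_update(H, s, y):
    """Self-scaling BFGS update of the inverse-Hessian approximation H:
    H is first rescaled by gamma = (s.y)/(y.H y), then updated with the
    standard BFGS formula using step s and gradient change y."""
    Hy = H @ y
    gamma = (s @ y) / (y @ Hy)        # self-scaling factor
    H = gamma * H
    rho = 1.0 / (s @ y)
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative use: minimize f(x) = 0.5 x.A x - b.x, where the
# exact line search along a direction has a closed form.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x, H = np.zeros(2), np.eye(2)
for _ in range(20):
    g = A @ x - b
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                        # quasi-Newton search direction
    alpha = -(g @ p) / (p @ A @ p)    # exact step for a quadratic
    s = alpha * p
    y = A @ s                         # gradient change along s
    H = ss_bfgs_update(H, s, y)
    x = x + s
```

The parallel variants in the paper generate several search directions per iteration instead of the single `p` above; this sketch shows only the sequential self-scaling update.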

- X. Zou, Ionel Michael Navon, M. Berger, Paul Kang-Hoh Phua, Tamar Schlick, François-Xavier Le Dimet
- SIAM Journal on Optimization
- 1993

Computational experience with several limited-memory quasi-Newton and truncated Newton methods for unconstrained nonlinear optimization is described. Comparative tests were conducted on a well-known test library [J. J. Moré, B. S. Garbow, and K. E. Hillstrom, ACM Trans. Math. Software, 7 (1981), pp. 17-41], on several synthetic problems allowing control of…
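The limited-memory idea compared in this study can be illustrated with the standard L-BFGS two-loop recursion, which applies an implicit inverse-Hessian approximation built from the last m curvature pairs without ever forming a matrix. This is a generic sketch, not one of the specific codes benchmarked in the paper; the quadratic test problem is an assumption for illustration.

```python
import numpy as np
from collections import deque

def lbfgs_direction(grad, history):
    """Two-loop recursion: returns -H_k @ grad, where H_k is the
    L-BFGS inverse-Hessian approximation built from the stored
    (s, y) pairs (oldest first in `history`)."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(history):
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if history:                        # initial scaling H0 = gamma * I
        s, y = history[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(history, reversed(alphas)):
        rho = 1.0 / (s @ y)
        beta = rho * (y @ q)
        q += (a - beta) * s
    return -q

# Illustrative use on a convex quadratic f(x) = 0.5 x.A x - b.x,
# keeping at most m = 5 curvature pairs.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, hist = np.zeros(2), deque(maxlen=5)
for _ in range(20):
    g = A @ x - b
    if np.linalg.norm(g) < 1e-10:
        break
    p = lbfgs_direction(g, hist)
    alpha = -(g @ p) / (p @ A @ p)     # exact step for a quadratic
    s = alpha * p
    hist.append((s, A @ s))            # y = gradient change = A s
    x = x + s
```

Truncated Newton methods, the other family in the comparison, instead approximate the Newton step by running an inner conjugate-gradient solve on the Newton equations and stopping it early.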

- Paul Kang-Hoh Phua
- International Journal of High Speed Computing
- 1991

Vectorization techniques are applied here to the non-linear conjugate-gradient method for large-scale unconstrained minimization. Until now, the main thrust of vectorization work has been directed towards linear conjugate-gradient methods designed to solve symmetric linear systems of algebraic equations.…
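The non-linear conjugate-gradient iteration itself is short; here is a sketch using the Fletcher–Reeves beta. In an array language such as NumPy, each step is already expressed as whole-vector operations, which is the kind of structure hardware vectorization exploits. The quadratic test problem and closed-form line search below are illustrative assumptions; for a general non-linear objective a line-search routine takes their place.

```python
import numpy as np

def fletcher_reeves(grad_f, line_search, x0, iters=50, tol=1e-10):
    """Non-linear conjugate gradient with the Fletcher-Reeves
    formula beta = |g_new|^2 / |g|^2 for the direction update."""
    x = x0.copy()
    g = grad_f(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(x, d)
        x = x + alpha * d
        g_new = grad_f(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative use on f(x) = 0.5 x.A x - b.x, where the exact
# line search along d has a closed form.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b
exact = lambda x, d: -((A @ x - b) @ d) / (d @ A @ d)
x = fletcher_reeves(grad, exact, np.zeros(2))
```

On a quadratic this reduces to the linear conjugate-gradient method; the paper's concern is carrying such vector-level structure over to the non-linear case on vector hardware.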

- Paul Kang-Hoh Phua, Yuelin Zeng, Daohua Ming
- PDPTA
- 1999
