Partial least-squares algorithm for weights initialization of backpropagation network

Abstract

This paper proposes a hybrid scheme for initializing the weights and selecting the optimal number of hidden nodes of the backpropagation network (BPN) by applying the loading weights and factor number of the partial least-squares (PLS) algorithm. The joint PLS and BPN method (PLSBPN) starts with a small residual error, modifies the latent weight matrices, and obtains a near-global minimum in the calibration phase. Performances of the BPN, PLS, and PLSBPN were compared for the near-infrared spectroscopic analysis of glucose concentrations in aqueous matrices. The results showed that the PLSBPN had the smallest root mean square error. The PLSBPN approach significantly alleviates some conventional problems of the BPN method by providing good initial weights, reducing the calibration time, obtaining an optimal solution, and easily determining the number of hidden nodes. © 2002 Elsevier Science B.V. All rights reserved.
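The core idea of the abstract — extract PLS loading-weight vectors and reuse them as the input-to-hidden weights of a BPN, with the number of PLS factors fixing the number of hidden nodes — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a single response variable, uses the standard NIPALS iteration for PLS, and the variable names (`pls_nipals`, `W_init`, `n_hidden`) are hypothetical.

```python
import numpy as np

def pls_nipals(X, y, n_components):
    """Extract PLS loading-weight vectors via NIPALS (single response y)."""
    Xr = X - X.mean(axis=0)            # center predictors
    yr = y - y.mean()                  # center response
    W = np.zeros((X.shape[1], n_components))
    for k in range(n_components):
        w = Xr.T @ yr                  # covariance-direction weight vector
        w /= np.linalg.norm(w)         # unit-norm loading weights
        t = Xr @ w                     # latent scores
        p = Xr.T @ t / (t @ t)         # X loadings
        q = (yr @ t) / (t @ t)         # y loading
        Xr = Xr - np.outer(t, p)       # deflate X
        yr = yr - q * t                # deflate y
        W[:, k] = w
    return W

# Hypothetical usage: synthetic calibration data standing in for NIR spectra.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))                          # 40 samples, 10 channels
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=40)

n_hidden = 3                           # number of PLS factors = hidden nodes
W_init = pls_nipals(X, y, n_hidden)    # shape (10, 3): input-to-hidden weights
```

`W_init` would then seed the first-layer weight matrix before backpropagation training begins, in place of the usual random initialization — which is the mechanism the abstract credits with the reduced calibration time and near-global minimum.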

DOI: 10.1016/S0925-2312(01)00708-1

7 Figures and Tables

Cite this paper

@article{Hsiao2003PartialLA,
  title   = {Partial least-squares algorithm for weights initialization of backpropagation network},
  author  = {Tzu-Chien Ryan Hsiao and Chii-Wann Lin and Huihua Kenny Chiang},
  journal = {Neurocomputing},
  year    = {2003},
  volume  = {50},
  pages   = {237--247}
}