TY - GEN
T1 - Superior training of artificial neural networks using weight-space partitioning
AU - Gupta, H. V.
AU - Hsu, Kuo-Lin
AU - Sorooshian, S.
PY - 1997
Y1 - 1997
N2 - Linear least squares simplex (LLSSIM) is a new algorithm for batch training of three-layer feedforward artificial neural networks (ANNs), based on a partitioning of the weight space. The input-hidden weights are trained using a "multi-start downhill simplex" global search algorithm, and the hidden-output weights are estimated using "conditional linear least squares". Monte Carlo testing shows that LLSSIM provides globally superior weight estimates with significantly fewer function evaluations than conventional backpropagation, adaptive backpropagation, and conjugate gradient strategies.
AB - Linear least squares simplex (LLSSIM) is a new algorithm for batch training of three-layer feedforward artificial neural networks (ANNs), based on a partitioning of the weight space. The input-hidden weights are trained using a "multi-start downhill simplex" global search algorithm, and the hidden-output weights are estimated using "conditional linear least squares". Monte Carlo testing shows that LLSSIM provides globally superior weight estimates with significantly fewer function evaluations than conventional backpropagation, adaptive backpropagation, and conjugate gradient strategies.
UR - http://www.scopus.com/inward/record.url?scp=0030682699&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0030682699&partnerID=8YFLogxK
U2 - 10.1109/ICNN.1997.614192
DO - 10.1109/ICNN.1997.614192
M3 - Conference contribution
AN - SCOPUS:0030682699
SN - 0780341228
SN - 9780780341227
T3 - IEEE International Conference on Neural Networks - Conference Proceedings
SP - 1919
EP - 1923
BT - 1997 IEEE International Conference on Neural Networks, ICNN 1997
T2 - 1997 IEEE International Conference on Neural Networks, ICNN 1997
Y2 - 9 June 1997 through 12 June 1997
ER -