Superior training of artificial neural networks using weight-space partitioning

H. V. Gupta, Kuo Lin Hsu, S. Sorooshian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

17 Scopus citations

Abstract

Linear least squares simplex (LLSSIM) is a new algorithm for batch training of three-layer feedforward artificial neural networks (ANNs), based on a partitioning of the weight space. The input-hidden weights are trained using a "multi-start downhill simplex" global search algorithm, and the hidden-output weights are estimated using "conditional linear least squares". Monte Carlo testing shows that LLSSIM provides globally superior weight estimates with significantly fewer function evaluations than conventional backpropagation, adaptive backpropagation, and conjugate gradient strategies.
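The weight-space partitioning idea can be illustrated with a minimal sketch (not the authors' implementation): for a network with a linear output layer, once the input-hidden weights are fixed, the optimal hidden-output weights solve an ordinary least-squares problem on the hidden activations, so the global search only has to explore the reduced input-hidden weight space. The sketch below assumes a sigmoid hidden layer, uses SciPy's Nelder-Mead (downhill simplex) with random restarts as a stand-in for the paper's multi-start scheme, and all function and variable names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def hidden_activations(X, W1):
    """Sigmoid hidden-layer outputs for input-hidden weights W1 (bias appended)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    return 1.0 / (1.0 + np.exp(-Xb @ W1))

def conditional_lls(X, Y, W1):
    """With W1 fixed, the optimal hidden-output weights follow from ordinary
    linear least squares on the hidden activations (the 'conditional linear
    least squares' step)."""
    H = hidden_activations(X, W1)
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])  # bias for output layer
    W2, *_ = np.linalg.lstsq(Hb, Y, rcond=None)
    return W2, Hb @ W2                             # weights and fitted outputs

def loss_given_W1(w1_flat, X, Y, n_in, n_hidden):
    """Objective over the input-hidden weights only: the hidden-output weights
    are eliminated analytically, so the global search runs in a reduced space."""
    W1 = w1_flat.reshape(n_in + 1, n_hidden)
    _, Y_hat = conditional_lls(X, Y, W1)
    return np.mean((Y - Y_hat) ** 2)

# Toy usage: multi-start downhill simplex over the input-hidden weights only.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
Y = np.sin(3 * X)                                  # toy regression target
n_in, n_hidden = 1, 4
best = min(
    (minimize(loss_given_W1, rng.normal(size=(n_in + 1) * n_hidden),
              args=(X, Y, n_in, n_hidden), method="Nelder-Mead")
     for _ in range(5)),                           # random restarts
    key=lambda r: r.fun,
)
print("best MSE:", best.fun)
```

Note the design point this sketch captures: because the hidden-output weights are solved exactly at every candidate point, each function evaluation seen by the simplex already reflects the best achievable error for that input-hidden configuration, which is what lets the method reach good weights in far fewer evaluations than gradient descent over the full weight set.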

Original language: English (US)
Title of host publication: 1997 IEEE International Conference on Neural Networks, ICNN 1997
Pages: 1919-1923
Number of pages: 5
DOIs
State: Published - 1997
Event: 1997 IEEE International Conference on Neural Networks, ICNN 1997 - Houston, TX, United States
Duration: Jun 9 1997 – Jun 12 1997

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
Volume: 3
ISSN (Print): 1098-7576

Conference

Conference: 1997 IEEE International Conference on Neural Networks, ICNN 1997
Country/Territory: United States
City: Houston, TX
Period: 6/9/97 – 6/12/97

ASJC Scopus subject areas

  • Software
