Finite size scaling in neural networks

Walter Nadler, Wolfgang Fink

Research output: Contribution to journal › Article › peer-review

Abstract

We demonstrate that the fraction of pattern sets that can be stored in single- and hidden-layer perceptrons exhibits finite size scaling. This feature allows one to estimate the critical storage capacity αc from simulations of relatively small systems. We illustrate this approach by determining αc, together with the finite size scaling exponent ν, for storing Gaussian patterns in committee and parity machines with binary couplings and up to K = 5 hidden units.
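
Note: a minimal sketch of the standard finite size scaling ansatz that underlies this kind of analysis (the exact scaling function and notation used in the paper are assumptions here): the fraction P_N(α) of pattern sets of load α = p/N that a network with N couplings can store is expected to collapse onto a single curve when plotted against the rescaled distance from the critical load.

  % Assumed standard form of the finite size scaling ansatz; the paper's
  % own notation and scaling variable may differ.
  \begin{equation}
    P_N(\alpha) \simeq f\!\left[(\alpha - \alpha_c)\, N^{1/\nu}\right]
  \end{equation}
  % \alpha_c and \nu are then estimated by adjusting them until the
  % P_N(\alpha) curves measured at several small sizes N collapse onto a
  % single function f, or by locating their size-independent crossing point.

Under this ansatz, data from relatively small simulated systems suffice to extrapolate αc, which is the approach the abstract describes.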

Original language: English (US)
Pages (from-to): 555-558
Number of pages: 4
Journal: Physical Review Letters
Volume: 78
Issue number: 3
DOIs
State: Published - Jan 20, 1997
Externally published: Yes

ASJC Scopus subject areas

  • General Physics and Astronomy
