Abstract
We introduce a novel type of neural network, termed the parallel Hopfield network, that can simultaneously effect the dynamics of many different, independent Hopfield networks in parallel in the same piece of neural hardware. Numerically, we find that under certain conditions each Hopfield subnetwork has a finite memory capacity approaching that of the equivalent isolated attractor network, while a simple signal-to-noise analysis offers qualitative, and some quantitative, insight into the workings (and failures) of the system.
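The paper's parallel construction is not detailed in this abstract, but the baseline it benchmarks against — an isolated Hopfield attractor network with Hebbian storage and sign-threshold updates — can be sketched as follows. This is a minimal illustration under standard assumptions (random ±1 patterns, outer-product weights, synchronous updates), not the paper's parallel architecture; all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of binary (+1/-1) units
P = 5    # stored patterns, well below the ~0.14*N capacity limit

# Random +1/-1 patterns to memorize
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product weight matrix with zeroed self-connections
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Iterate synchronous sign updates until a fixed point or step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt 10% of the bits of the first pattern, then let the dynamics
# relax back toward the stored attractor
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recovered = recall(probe)

# Overlap of 1.0 means the pattern was recalled exactly
overlap = (recovered @ patterns[0]) / N
```

At this low load (P/N = 0.05) the crosstalk noise between patterns is small relative to the signal, which is the regime where the signal-to-noise argument mentioned in the abstract predicts reliable recall.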
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 831-850 |
| Number of pages | 20 |
| Journal | Neural Computation |
| Volume | 21 |
| Issue number | 3 |
| DOIs | |
| State | Published - Mar 2009 |
| Externally published | Yes |
ASJC Scopus subject areas
- Arts and Humanities (miscellaneous)
- Cognitive Neuroscience