In this paper we consider the ART1 neural network architecture introduced by Carpenter and Grossberg. In their original paper, Carpenter and Grossberg made the following conjecture: in the fast learning case, if the F2 layer in ART1 has at least N nodes, then each member of a list of N input patterns presented cyclically at the F1 layer of ART1 will have direct access to an F2 layer node after at most N list presentations. We demonstrate that the conjecture fails for certain large values of L, a network parameter associated with the adaptation of the bottom-up traces in ART1. It is worth noting that previous work has shown the conjecture to be true for small L values.
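The role of L can be illustrated with the standard ART1 fast-learning equations, which the abstract only refers to implicitly: in the fast learning case, the bottom-up trace from F1 node i to a committed F2 node with template X converges to L / (L - 1 + |X|) if i is in X, and to 0 otherwise, and an input pattern has direct access to the F2 node whose bottom-up input it maximizes. The sketch below is illustrative only (the templates and values are hypothetical, not the paper's counterexample construction):

```python
import numpy as np

def fast_learn_bottom_up(template, L):
    """Asymptotic fast-learning bottom-up traces for one committed F2 node.

    template : 0/1 array over F1 nodes encoding the learned template X.
    Returns L / (L - 1 + |X|) on the template's active nodes, 0 elsewhere
    (standard ART1 fast-learning values; L > 1 is assumed).
    """
    norm = np.sum(template)  # |X|, number of active F1 nodes in the template
    return np.where(template > 0, L / (L - 1 + norm), 0.0)

def f2_choice(pattern, bottom_up):
    """Index of the F2 node maximizing the bottom-up input T_j = sum_i b_ij * I_i."""
    return int(np.argmax(bottom_up @ pattern))

# Hypothetical example: two committed nodes with nested templates.
L = 2.0
A = np.array([1, 1, 0, 0], dtype=float)  # template of node 0, |X| = 2
B = np.array([1, 1, 1, 1], dtype=float)  # template of node 1, |X| = 4
W = np.vstack([fast_learn_bottom_up(A, L), fast_learn_bottom_up(B, L)])

print(f2_choice(A, W))  # node accessed when A is presented
print(f2_choice(B, W))  # node accessed when B is presented
```

Because the trace magnitude L / (L - 1 + |X|) depends on L, changing L can change which F2 node a given input maximizes, which is why the conjecture's validity can differ between small and large L.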
Title of host publication: Proceedings - 1992 International Joint Conference on Neural Networks, IJCNN 1992
Publisher: Institute of Electrical and Electronics Engineers Inc.
Published: 1992
Event: 1992 International Joint Conference on Neural Networks, IJCNN 1992 - Baltimore, United States, Jun 7 1992 → Jun 11 1992
Publication series: Proceedings of the International Joint Conference on Neural Networks
ASJC Scopus subject areas: Artificial Intelligence