Abstract
Recurrent neural networks are complex parametric dynamic systems that can exhibit a wide range of behaviors. We consider the task of grammatical inference with recurrent neural networks: specifically, classifying natural language sentences as grammatical or ungrammatical. Can a recurrent neural network be made to exhibit the same kind of discriminatory power that is provided by the Principles and Parameters linguistic framework, or Government and Binding theory? We attempt to train a network, without the bifurcation into learned and innate components assumed by Chomsky, to produce the same judgments as native speakers on sharply grammatical/ungrammatical data. We consider how a recurrent neural network could possess linguistic capability, and investigate the properties of Elman, Narendra & Parthasarathy (N&P), and Williams & Zipser (W&Z) recurrent networks, and Frasconi-Gori-Soda (FGS) locally recurrent networks in this setting. We show that both Elman and W&Z recurrent neural networks are able to learn an appropriate grammar.
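To make the classification setup concrete, below is a minimal sketch of an Elman (simple recurrent) network scoring a sentence for grammaticality. It assumes sentences are encoded as sequences of one-hot part-of-speech vectors and that the final hidden state feeds a sigmoid output unit; the tag inventory size, hidden-layer size, and encoding are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TAGS = 8      # assumed size of the part-of-speech tag inventory
N_HIDDEN = 10   # assumed number of hidden (and context) units

# Weights: input->hidden, hidden->hidden (recurrent/context), hidden->output
W_xh = rng.normal(0.0, 0.1, (N_HIDDEN, N_TAGS))
W_hh = rng.normal(0.0, 0.1, (N_HIDDEN, N_HIDDEN))
W_hy = rng.normal(0.0, 0.1, (1, N_HIDDEN))

def grammaticality_score(sentence):
    """Run the Elman network over a sentence (a list of tag indices)
    and return a sigmoid score from the final hidden state."""
    h = np.zeros(N_HIDDEN)
    for tag in sentence:
        x = np.zeros(N_TAGS)
        x[tag] = 1.0
        # Elman update: the previous hidden state re-enters via the
        # recurrent weights W_hh (the "context" units)
        h = np.tanh(W_xh @ x + W_hh @ h)
    # Output near 1 would indicate a grammatical sentence
    return 1.0 / (1.0 + np.exp(-(W_hy @ h)[0]))

# Usage: score a toy tag sequence (untrained weights, so the value
# is arbitrary until the network is trained on labeled sentences)
print(grammaticality_score([0, 3, 2, 5]))
```

In practice the weights would be trained on the grammatical/ungrammatical judgment data; the W&Z architecture differs mainly in being fully recurrent and trained with real-time recurrent learning, while FGS networks restrict recurrence to local feedback connections.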
| Original language | English (US) |
| --- | --- |
| Pages | 1853-1858 |
| Number of pages | 6 |
| State | Published - 1996 |
| Externally published | Yes |
| Event | Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4) - Washington, DC, USA. Duration: Jun 3 1996 → Jun 6 1996 |
Other

| Other | Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4) |
| --- | --- |
| City | Washington, DC, USA |
| Period | 6/3/96 → 6/6/96 |
ASJC Scopus subject areas
- Software