Abstract
Connectionist language modelling typically struggles with syntactic systematicity, the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word categories using a developmentally plausible, infant-scale database of grounded sensorimotor conceptual representations, together with a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar through correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour, indicating general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream; it outperforms previous contemporary models, which contain primarily noun and verb word categories, and generalises broadly to novel, untrained sensorimotor-grounded sentences composed of unfamiliar nouns and verbs. Limitations and implications for later grammar learning are discussed.
Field | Value
---|---
Original language | English (US)
Pages (from-to) | 25-55
Number of pages | 31
Journal | Connection Science
Volume | 24
Issue number | 1
DOIs |
State | Published - Mar 2012
Externally published | Yes
Keywords
- generalisation
- representational grounding
- self-organising map
- sentence processing
- systematicity
ASJC Scopus subject areas
- Software
- Human-Computer Interaction
- Artificial Intelligence