Abstract
We critically evaluate Piantadosi’s claim that deep neural networks have obsoleted linguistic theory. The Generative Enterprise seeks to explain why human language exhibits discrete infinity yet is not unrestricted: all languages appear to obey the same basic underlying properties. Through the lens of the Strong Minimalist Thesis (SMT), driven by evolutionary considerations, inquiry has focused on maximally simple operations such as Merge, for building structure, and Minimal Search, for establishing structural relations. By contrast, the computationally expensive setting of billions of parameters in current deep neural networks provides no biologically plausible explanation of human language. Moreover, we show through simple examples that the performance of current systems turns out to be highly overrated.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 59-74 |
| Number of pages | 16 |
| Journal | Italian Journal of Linguistics |
| Volume | 37 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2025 |
Keywords
- discrete infinity
- Merge
- Minimal Search
- Strong Minimalist Thesis (SMT)
ASJC Scopus subject areas
- Language and Linguistics
- Linguistics and Language