The Basque Problem: Do AI Models Actually Understand Universal Grammar?
The debate over Artificial Intelligence and language is often framed as a battle between “nature” and “nurture.” On one side, you have the nativist view, championed historically by linguists like Noam Chomsky. This view argues that human beings are born with an innate “Universal Grammar”—a set of hard-wired constraints that allow children to learn complex languages from relatively little data. On the other side, you have the empiricist view, currently dominating the field of Deep Learning. This view posits that general-purpose learning algorithms (like Transformers), given enough data, can learn anything, including the complex rules of syntax, without any pre-wired grammatical knowledge. ...