Evolved structure of language shows lineage-specific trends in word-order universals

Via Simon Greenhill:

Dunn M, Greenhill SJ, Levinson SC, & Gray RD (2011). Evolved structure of language shows lineage-specific trends in word-order universals. Nature.

Some colleagues and I have a new paper out in Nature showing that the evolved structure of language shows lineage-specific trends in word-order universals. I’ve written an overview/FAQ on this paper here, and there’s a nice review of it here and here.

The Abstract:

Languages vary widely but not without limit. The central goal of linguistics is to describe the diversity of human languages and explain the constraints on that diversity. Generative linguists following Chomsky have claimed that linguistic diversity must be constrained by innate parameters that are set as a child learns a language. In contrast, other linguists following Greenberg have claimed that there are statistical tendencies for co-occurrence of traits reflecting universal systems biases, rather than absolute constraints or parametric variation. Here we use computational phylogenetic methods to address the nature of constraints on linguistic diversity in an evolutionary framework. First, contrary to the generative account of parameter setting, we show that the evolution of only a few word-order features of languages are strongly correlated. Second, contrary to the Greenbergian generalizations, we show that most observed functional dependencies between traits are lineage-specific rather than universal tendencies. These findings support the view that—at least with respect to word order—cultural evolution is the primary factor that determines linguistic structure, with the current state of a linguistic system shaping and constraining future states.
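
The "computational phylogenetic methods" mentioned in the abstract amount, broadly, to testing whether pairs of binary word-order traits evolve together on language family trees, in the spirit of Pagel-style comparisons of "dependent" versus "independent" Markov models of trait change (the kind of test implemented in tools such as BayesTraits). The sketch below is a minimal, hypothetical illustration of that general idea in Python, not the authors' code: the tree, the tip states, and the rate values are all invented, and real analyses estimate the rates and integrate over tree uncertainty rather than fixing them.

```python
# Minimal sketch of a test for correlated evolution of two binary traits on a
# phylogeny (Pagel-style "dependent" vs. "independent" models). Everything
# below -- the tree, tip states, and rates -- is invented for illustration.
import numpy as np
from scipy.linalg import expm

# Combined states for traits (A, B): 0=(0,0), 1=(0,1), 2=(1,0), 3=(1,1)

def q_independent(a01, a10, b01, b10):
    """Rate matrix when traits A and B evolve independently (4 rates)."""
    Q = np.zeros((4, 4))
    Q[0, 2] = Q[1, 3] = a01   # A: 0 -> 1 (regardless of B)
    Q[2, 0] = Q[3, 1] = a10   # A: 1 -> 0
    Q[0, 1] = Q[2, 3] = b01   # B: 0 -> 1 (regardless of A)
    Q[1, 0] = Q[3, 2] = b10   # B: 1 -> 0
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def q_dependent(r):
    """Rate matrix when each trait's rates depend on the other's state (8 rates)."""
    a01_b0, a01_b1, a10_b0, a10_b1, b01_a0, b01_a1, b10_a0, b10_a1 = r
    Q = np.zeros((4, 4))
    Q[0, 2], Q[1, 3] = a01_b0, a01_b1   # A: 0 -> 1, given B = 0 or 1
    Q[2, 0], Q[3, 1] = a10_b0, a10_b1   # A: 1 -> 0
    Q[0, 1], Q[2, 3] = b01_a0, b01_a1   # B: 0 -> 1, given A = 0 or 1
    Q[1, 0], Q[3, 2] = b10_a0, b10_a1   # B: 1 -> 0
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def partial_likelihood(node, Q, tips):
    """Felsenstein pruning: conditional likelihoods of the subtree below `node`."""
    if isinstance(node, str):                    # leaf: observed combined state
        L = np.zeros(4)
        L[tips[node]] = 1.0
        return L
    left, bl_left, right, bl_right = node        # internal node with two children
    return (expm(Q * bl_left) @ partial_likelihood(left, Q, tips)) * \
           (expm(Q * bl_right) @ partial_likelihood(right, Q, tips))

def log_likelihood(tree, Q, tips):
    # Uniform prior over the four root states
    return float(np.log(partial_likelihood(tree, Q, tips) @ np.full(4, 0.25)))

# Invented four-language tree, each node written as
# (left child, left branch length, right child, right branch length),
# and invented combined tip states coded 0..3.
tree = (("Lang1", 0.3, "Lang2", 0.3), 0.4, ("Lang3", 0.5, "Lang4", 0.2), 0.2)
tips = {"Lang1": 3, "Lang2": 3, "Lang3": 0, "Lang4": 0}

ll_indep = log_likelihood(tree, q_independent(0.5, 0.5, 0.5, 0.5), tips)
ll_dep = log_likelihood(tree, q_dependent([0.2, 0.9, 0.9, 0.2, 0.2, 0.9, 0.9, 0.2]), tips)
print(f"log-likelihood, independent model: {ll_indep:.3f}")
print(f"log-likelihood, dependent model:   {ll_dep:.3f}")
# In a real analysis the rates are estimated (by maximum likelihood or MCMC) and
# the models are compared with a likelihood-ratio test or Bayes factor; support
# for the dependent model is evidence that the two word-order features co-evolve.
```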

 

4 thoughts on “Evolved structure of language shows lineage-specific trends in word-order universals”

  1. LINGUISTIC NON SEQUITURS

    (1) The Dunn et al article in Nature is not about language evolution (in the Darwinian sense); it is about language history.

    (2) Universal grammar (UG) is a complex set of rules, discovered by Chomsky and his co-workers. UG turns out to be universal (i.e., all known languages are governed by its rules), and its rules turn out to be unlearnable on the basis of what the child says and hears, so they must be inborn in the human brain and genome.

    (3) Although UG itself is universal, it has some free parameters that are set by learning. Word order (subject-object vs. object-subject) is one of those learned parameters. The parameter settings themselves differ across language families and are hence, of course, not universal but cultural.

    (4) Hence the Dunn et al results on the history of word-order are not, as claimed, refutations of UG.

    Harnad, S. (2008) Why and How the Problem of the Evolution of Universal Grammar (UG) is Hard. Behavioral and Brain Sciences 31: 524-525

  2. Couldn’t these broad patterns, or supposed rules, simply be the product of domain-general biases that shape language? Our brain is highly flexible precisely because such flexibility is adaptive: we can learn things over a far shorter time scale than would be possible by genetically encoding a behaviour. This, of course, doesn’t rule out a co-evolutionary scenario…

    I really think you should read Deacon’s latest paper on the topic: “A role for relaxed selection in the evolution of the language capacity”.
