Monday, 27 September 2010

Evolution of Language and the Evolution of Syntax: Same Debate, Same Solution?

I have recently realised that only after submitting my Masters thesis have I really started to understand my main argument (a bit late, perhaps?).  The argument was in the paper, but perhaps not clearly enough, probably because I wasn't thinking about it clearly enough myself at the time.  I'll try to explain:

One of the debates in language evolution research is whether language evolved abruptly or gradually, in many steps.  It is one of the topics I am most fascinated by, and it was the subject of my MSc dissertation (you can read it here).

Abruptist arguments hold that language evolved through a mutation (suggested in writings by Gould and Lewontin, Piattelli-Palmarini, Crow, Klein, Hornstein, Lanyon, and even Chomsky), and some go further and argue that language could not have evolved gradually, because an intermediate stage between language and non-language could not have existed (as in Berwick's writings).  Gradualist arguments, which most contemporary evolutionary linguists side with and which I too find much more biologically plausible, approach language not as a monolithic thing, but as something made up of many different components that evolved in several stages over time (as in the writings of Pinker, Jackendoff, Burling, Hurford, Kirby, Aitchison, Kinsella, Heine and Kuteva, Fitch, and Johansson, to name a few).

In gradualist accounts of language evolution, however, you often see reconstructions with a single step from a protolanguage (essentially a syntax-less language) to complex grammar.  I view this step as containing the same problems as the idea that language evolved abruptly.  Syntax, too, is not a monolithic thing, and acquiring it takes more than one cognitive step.  Modern languages contain complex syntax that requires certain memory capacities, and I think certain types of grammatical relationships would take more than the learning of a single rule, such as "merge".

In my thesis, I reviewed the literature on gradual and abruptist arguments for language evolution, and posited an intermediate stage of syntactic complexity in which a language might have only one level of embedding in its grammar.  It's a shaky and underdeveloped example of an intermediate stage of language, and it requires a lot of exploration; but my reason for positing it in the first place is that I think we need to think about the evolution of syntax the way many researchers now see the evolution of language as a whole: not as a monolithic thing that evolved in one fell swoop as a consequence of a genetic mutation, but as a series of steps of increasing complexity.
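As a loose illustration of the idea (my own toy sketch, not anything from the thesis), you can model clause embedding as nested structures and ask whether a "speaker" limited to a fixed embedding depth could handle a given sentence.  A protolanguage would sit at depth 0, the posited intermediate stage at depth 1, and fully recursive modern syntax would have no limit.  The sentence structures and the depth limit here are invented for the example:

```python
# Toy model: a clause is a list of words; an embedded clause is a nested list.
# embedding_depth() measures how deeply clauses are nested; parsable() asks
# whether a grammar capped at a given embedding depth could handle the clause.

def embedding_depth(clause):
    """Depth of clause embedding: 0 for a flat clause, +1 per nested clause."""
    depths = [embedding_depth(part) + 1 for part in clause if isinstance(part, list)]
    return max(depths, default=0)

def parsable(clause, max_embedding):
    """Could a grammar limited to max_embedding levels handle this clause?"""
    return embedding_depth(clause) <= max_embedding

flat = ["dog", "bites", "man"]                                # protolanguage, depth 0
one_level = ["man", ["dog", "bites"], "runs"]                 # one embedded clause
two_levels = ["man", ["dog", ["cat", "saw"], "bit"], "runs"]  # clause within a clause

print(parsable(flat, 0))        # True
print(parsable(one_level, 1))   # True: the posited intermediate stage
print(parsable(two_levels, 1))  # False: needs fully recursive syntax
```

The point of the sketch is only that "depth 1" is a well-defined, stable stopping point between no embedding and unbounded recursion, not a vague halfway house.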

Derek Bickerton, one of my favourite authors on evolutionary linguistics, has written a number of excellent books and papers on the subject.  But he also argues that language likely made a jump from a syntax-less protolanguage straight to the fully modern complex syntax seen in languages today.  To me that seems counterintuitive.  Children learn syntax in steps, and non-human species seem able to grasp only simple syntax.  Does this not suggest that a stable stage of intermediate syntax is possible?

So my argument is not very well developed, and the example in my thesis of what an intermediate stage of syntax might look like is a poor one.  But the errors you will likely find in my suppositions are beside the point; I've come to realise that what I'm really trying to say is that we are treating the gap between non-syntax and syntax the way we have historically treated the gap between non-language and language: as a great leap.  And I would really like to explore the idea that relationships between words could first have been simple, and that those relationships then grew in complexity over time.



  1. As with any data that describe gradual change, we can choose to draw various thresholds at any point along them. Or we can go to the furthest extreme and say that every individual datum in the sequence represents something slightly different. Neither an absence of thresholds nor a preponderance of them is an effective interpretive device, and both lead to states remarkably similar to each other, a sort of epistemological stalemate. There has to be a certain acceptance that thresholds must exist in our interpretations in order to make them viable. In a sense, our models can never be gradual. It's of course complicated by the fact that we do not have records of unbroken chains of individuals until quite recent times. The nature of our data must be a factor in the thresholds that we do choose to draw. I think I'm right in asserting that linguistics dominated by Chomsky, as it was for a large part of the last century, was not a discipline that engaged much with any temporal aspect of language, and for this reason we might expect to find that those linguists of a Chomskyan bent also advocate simpler threshold models (simpler in terms of thresholds, but more complex, read 'unlikely', in terms of biology!)

    The reduction of syntax is the addition of thresholds, and from an interpretive stance it is something we desire as gradualists. I wouldn't be surprised, though, if we ultimately deduce that syntax as a complex structure does emerge piecemeal from an analogue capacity, such as memory, as you suggest.


