11

Role Links for Words





I would be the last person to suggest that the social calculus described in the preceding chapter was the only thing causing representation of thematic roles in primate minds. Other factors have surely contributed to the same end. But if you think about it, only the social calculus could have built up categories that had the right degree of clarity and abstractness.

If you understand what causes things you may be able to predict them. Accurate prediction may save your life. Saving your life gives you more time to procreate, so current evolutionary thought suggests that prediction should be favored. If you know that movements of long grass when there is no wind may be caused by a predator stalking you, you can take appropriate action in time, rather than passively awaiting the tiger's charge. But are we to suppose that the genes code for every eventuality, and abstract across categories as diverse as moving grass and falling rain? Or is it more likely that each animal has to learn the consequences of these things from its own experience?

I suspect that the answer may lie somewhere in between: that nothing so specific as the detection of tell-tale grass swirls is hardwired, but that prey animals have vision that responds to things like movement in grass, and also mechanisms that tell them that something is causing that movement and that that something is a predator. Nothing here needs to incorporate an abstract concept of agency. All the information is specific to that one kind of event.

However, we're not talking about one kind of event. We're talking about a variety of events – grooming, food sharing, chasing, fighting – in which any one animal is sometimes the performer, sometimes the object of the action. (Sometimes it's me grooming you, sometimes you grooming me, sometimes a third party, and so on.) When all of this takes place among animals of a high social intelligence who need to keep track of one another's behavior to avoid being shortchanged by freeloaders, it would hardly be surprising if some quite abstract analysis of the roles developed.

What I think happened was that the social calculus set up the categories of agent, theme, and goal, and that these categories (or thematic roles, which is what linguists call them) were then exapted to produce the basis for sentence structures. Some linguists might object that thematic roles are semantic, and syntax is, well, syntax. Syntax is autonomous, a totally self-contained country with its own rules and regulations, and ever more shall be so. But I think here that the linguists are ignoring the nature of evolution. It's like saying, well, swim-bladders were a flotation device, and lungs are for breathing, so lungs can't have evolved from swim-bladders. However, we know that they did.

Nothing in evolution is a complete novelty. Everything is a modified version of something that came before – even if the modification sometimes changes the original past recognition. So, syntax could not have emerged as a pure novelty. But there was semantics before there was syntax, and if some aspect of semantics could be expressed in terms of syntax, then that aspect makes a prime suspect for the source of syntax. What happened to it afterwards is another matter.

What matters now is not so much the semantic nature of the roles themselves. What was important was getting the primary roles of goal, agent, and theme represented in the primate mind. It didn't have to go on being important, once the function of those roles diverged, once they were exapted to play a part in language. The really important thing is that those key roles had to be expressed in sentences:

» With a verb like "sleep" or "run," you have to express one role.

» With a verb like "make" or "break," you have to express two.

» With a verb like "give" or "persuade," you have to express three.

In other words, you know in advance, in every clause of every sentence, depending on the clause's verb, whether you will have to be looking for one, two, or three noun-phrases to which these roles are attached – that is to say, for one, two, or three "obligatory arguments."
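You can think of this as a lookup table in the mental lexicon. Here is a minimal sketch in Python (purely illustrative; the verbs and the role assignments are simplifications, not a formal analysis from this chapter):

```python
# Toy sketch: a lexicon pairing verbs with their obligatory thematic roles.
# The role assignments here are illustrative simplifications.
VALENCY = {
    "sleep":    ["theme"],                    # one obligatory argument
    "run":      ["agent"],                    # one obligatory argument
    "make":     ["agent", "theme"],           # two
    "break":    ["agent", "theme"],           # two
    "give":     ["agent", "theme", "goal"],   # three
    "persuade": ["agent", "theme", "goal"],   # three
}

def expected_arguments(verb):
    """How many noun-phrases must a clause built on this verb supply?"""
    return len(VALENCY[verb])

print(expected_arguments("give"))   # -> 3
```

Given the verb, the table predicts exactly how many obligatory arguments the clause must supply.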

I have assumed throughout, for reasons discussed at length elsewhere, that language began in the form of a structureless protolanguage, something like an early-stage pidgin, without any formal structure – just handfuls of words or gestures strung together. Syntax began when people began to map thematic roles onto their protolinguistic output. What this means is simply that when they talked about anything that had happened, they would put in the obligatory arguments. Instead of saying things like "Ig take," they would have to say "Ig take meat," even if everyone knew it was meat they were talking about. Instead of saying "hit Og," they would have to say "Ig hit Og," even though everyone knew it was Ig who had done it. And once they knew what had to be there, they could go on to longer and longer sentences, for the following reason.

An argument is simply the combination of a thematic role (agent, etc.) with whatever words represent a participant in an action, state, or event. More often than not, those words take the form of a noun-phrase (remember, we call it a noun-phrase even if there's only a single noun or pronoun there). But they could just as easily be a whole clause, because what an argument actually is depends on the verb. If the verb is "break," then its theme is going to be some hard physical object that might conceivably be broken ("He couldn't break the lock" is fine, but "He broke the blanket" is the kind of sentence you get from little kids who haven't yet learned the verb "tear"). But if the verb is "say," then its theme is going to be anything that can be said, including clauses that could, if they stood alone, be sentences in their own right. Thus, in "He said he was tired," the clause "he was tired" is just as much an argument of the verb as is the noun-phrase "his prayers" in "He said his prayers."


They say there's no free lunch, but here's an exception. However long a sentence is, you can always make it longer, thanks to recursivity. By mapping argument structure onto utterances, you get recursivity for free. And recursivity is one of the defining characteristics of true language – it's why sentences can be infinite, why Faulkner's 1,300-word sentence can never be the longest English sentence, why there never can be a longest English sentence.

Recursivity should be distinguished from mere iteration. Iteration looks like

1 + 1 + 1 + 1 + 1 ...

Recursivity looks, for anyone who may be familiar with obsolete versions of generative grammar, like this: 

S → NP VP

[A Sentence can be rewritten as a Noun-Phrase and a Verb-Phrase]

VP → V (NP / S)

[A Verb-Phrase can be rewritten as a Verb and
(optionally) a Noun-Phrase or even a Sentence]

What this means is that units at higher levels of structure (like NP and S in the first line) can be reintroduced at lower levels, so that the process of sentence building simply recycles and recycles for as long as you want it to. It's recursivity that gives language its wonderful flexibility, allowing us to introduce several ideas into the same sentence, let them brush against one another and strike sparks – the sparks of novelty and creativity that form the most distinctive feature of our species.
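A toy generator makes the recycling concrete. The sketch below (the vocabulary is invented, and the coin-flip and depth cap exist only to keep the demonstration finite) implements just the two rewrite rules above; because VP can reintroduce a whole S, the output can embed clause within clause indefinitely:

```python
import random

# Toy generator for the two rules above:  S -> NP VP,  VP -> V (NP / S).
# Because VP can reintroduce S, sentence-building recycles indefinitely;
# the depth cap is only here to keep the demonstration finite.
NOUNS = ["the girl", "the cat", "Mary", "Bill"]
VERBS = ["said", "knew", "saw"]               # verbs taking NP or S themes

def S(depth=0):
    return f"{random.choice(NOUNS)} {VP(depth)}"

def VP(depth):
    verb = random.choice(VERBS)
    if depth < 3 and random.random() < 0.5:
        return f"{verb} {S(depth + 1)}"       # recursion: a whole S inside VP
    return f"{verb} {random.choice(NOUNS)}"   # base case: a plain NP

print(S())   # e.g. "Mary knew Bill said the cat saw the girl"
```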

Updating the old generative formula, recursivity now looks like this:

 

AS → V + A1 (+ A2 (+ A3))

[An Argument Structure can be rewritten as a Verb
plus one, two, or three Arguments]

Ai → NP / PP / AS

[An Argument can be rewritten as a Noun-Phrase,
Prepositional Phrase, or Argument Structure]

However, this formula still lacks an important element that the old formula had: linearization.


Linearization is not something we wanted. It was something that was forced on us by our choice of a physical medium for language. When we speak, we use only a single channel, the vocal channel, and we can't make more than one sound at a time. If we'd chosen sign, we could in principle have produced two or three units simultaneously – even though this possibility isn't exploited as much as it might be in the sign languages of the deaf. But with the vocal channel we're stuck with one sound after another, one word after another. Thus, sounds and words have to be linked in some definite order, and that means word order becomes one possible means of resolving ambiguities and indicating structural relationships.

To many, especially those who are not syntacticians, word order looms large in syntax. To some, it's the be-all and end-all. Once words have been arranged in a fixed order, they think, we're done. Far from it, for several reasons. First, even in English (a fairly strict word-order language, as languages go) you can put the same words in any number of orders – "Mary made the video," "The video was made by Mary," "It was the video that Mary made."

    For every complex problem, there is a simple, easy to understand, incorrect answer.
        – Albert Szent-Györgyi

And secondly, the really crucial relationships in language are not horizontal but vertical (whether something is higher or lower in a tree diagram of a sentence, whether a particular tree branch includes or doesn't include a particular item, and so on). Horizontal linear relationships can't explain why "Bob's sister hurt herself" is grammatical but "Bob's sister hurt himself" isn't. They can't explain why "How do you know who left?" is grammatical but "Who do you know how left?" isn't, even though the corresponding statement, "I know how Mary left – by Yellow Cab," is fine. They can't explain why the subject of "work for" in "Bill wants someone to work for" is "Bill," while in "Bill wants someone to work for him" it's "someone." And these are just three randomly chosen examples out of countless thousands that horizontal relationships can't explain and vertical relationships can.

However, you can easily get from an abstract map of argument structure to a linear string by starting with the verb and adding arguments to it in a certain order. This order isn't set in marble (we can monkey about with it, even in English, as the "Mary and the video" examples showed). But there is (in English at least) an order we can take as basic. Interestingly enough, this order is the basic order that we find in creole languages worldwide, and because the original creole language speakers didn't have a fixed word order in their input, it's a fair bet that this order is the most natural among all the variants.

First, if there is a goal argument, this is attached to the right of the verb, followed by the theme argument, which is attached to the right of Verb + goal. Then, if there are any optional arguments like time, they are attached to the right of Verb + goal + theme. Finally, agent is attached to the left of everything else. In other words, if you have a sentence like

Mary gave Bill a watch last week.

it's built up in the following stages:

gave Bill [Verb + goal]

gave Bill a watch [Verb + goal + theme]

gave Bill a watch last week [Verb + goal + theme + time]

Mary gave Bill a watch last week. [agent prefixed onto verb phrase]


When there's no goal, then theme moves up next to the verb, but nothing else changes:

Mary kissed Bill.

When there's no agent, then theme moves up into its vacant slot:

Mary dreamed.

The same thing happens if you want to tell a story from the perspective of Mary, or if you don't know (or don't care) who the agent was:

Mary was kissed [by Bill].
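This attachment procedure is mechanical enough to write down. The following rough sketch (the role names are illustrative, and the function is my reading of the order just described, not a formula from the book) builds the sentence in the same stages:

```python
# Rough sketch of the attachment order described above; role names are
# illustrative. Goal, then theme, then optional roles like time attach to
# the verb's right; the agent is prefixed to the left; if there is no
# agent, the theme moves up into the vacant subject slot.
def linearize(verb, roles):
    tail = [roles[r] for r in ("goal", "theme", "time") if r in roles]
    if "agent" in roles:
        return " ".join([roles["agent"], verb] + tail)
    # No agent: promote the theme into the empty slot to the verb's left.
    tail = [roles[r] for r in ("goal", "time") if r in roles]
    return " ".join([roles.get("theme", ""), verb] + tail).strip()

print(linearize("gave", {"agent": "Mary", "goal": "Bill",
                         "theme": "a watch", "time": "last week"}))
# -> Mary gave Bill a watch last week
print(linearize("kissed", {"agent": "Mary", "theme": "Bill"}))
# -> Mary kissed Bill
print(linearize("dreamed", {"theme": "Mary"}))
# -> Mary dreamed
```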

Once you know this much, you can really start to parse and understand complex sentences.


Let's look at an example of how this works in practice by comparing it to protolanguage limitations. First imagine you're a protolanguage speaker and somebody has said something like

I saw Og taking Ug's meat.

Well, not exactly that, because before there was true language, it's highly unlikely that there were things like past tenses or -ing forms or apostrophe-s. So it would probably have come out more like "I see Og take Ug meat." How could you have parsed this?

Well, you couldn't. You might have got to "I see," with a small problem about whether the speaker had seen, would see, or was seeing right now. But then you would have been confronted by "Og," and you would have no way of telling whether "Og" was the theme argument of "see" or the agent argument of "take," or both. The third option, both, would have helped you to understand the sentence better, but it's the least likely to be chosen, because in protolanguage, words are in one place and do one thing and one thing only. A speaker familiar with protolanguage would have been far more likely to assume that either the theme of "see" had been omitted (so "Og" would be merely agent of "take") or that the agent of "take" had been omitted (so "Og" would be the theme of "see"), because that's the kind of thing that happens all the time in protolanguage.

But this is nothing compared with "Ug meat." No surprise to find two nouns together (this happens all the time in protolanguage), but their relationship is up for grabs, even with the verb "take" around to help out. Could it mean that Ug was taken to the meat, or from the meat, or the meat was taken to him, or from him?

Now there's no doubt that an intelligent protolanguage speaker could eventually, with the help of context, of knowing the people concerned and their behavior, have figured out the correct meaning. But life goes by too fast to allow us the luxury of sitting around trying to figure out the meaning of every six-word sentence, and there will always be times when we fail. We need something automatic, like what we have in language today. Let's see how, with our modern equipment, we might get to understand

I saw Og taking Ug's meat.

even without all the -ing's and s's and so on that we nowadays have to help us.

The first verb is "see." We know that "see" needs two arguments. "I" as agent of "see" precedes it and is easily identified. The theme should follow it, but the word immediately following it, "Og," is then followed by another verb, which also requires a preceding agent. The alternative conclusion is that the theme of "see" is all the rest of the sentence – a reasonable conclusion, because what one sees can be a situation or event just as easily as it can be an object. Now all we have to do is look for an agent of "take" ("Og," obviously) and a theme of "take." Since "take" is a two-place and not a three-place predicate, we know there can't be two more obligatory arguments, so again the plausible conclusion is that "Ug meat" is a compound phrase of the type "possessor-possessed."
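That chain of reasoning can be mimicked in a few lines. The toy parser below (my drastic simplification, using nothing but the verbs' known argument counts) recovers the same nested grouping:

```python
# Toy parser for strings like "I see Og take Ug meat", mimicking the
# reasoning above: a verb's known argument count tells the hearer how to
# group the surrounding words. A drastic simplification, of course.
VERBS = {"see": 2, "take": 2}   # two-place predicates

def parse(tokens):
    agent, verb, rest = tokens[0], tokens[1], tokens[2:]
    assert verb in VERBS
    if any(t in VERBS for t in rest):
        # Another verb follows, so the whole remainder must be an
        # embedded clause serving as this verb's theme.
        theme = parse(rest)
    else:
        # No more verbs and only one theme slot left, so the remaining
        # words form one compound argument ("possessor-possessed").
        theme = " ".join(rest)
    return (agent, verb, theme)

print(parse("I see Og take Ug meat".split()))
# -> ('I', 'see', ('Og', 'take', 'Ug meat'))
```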

The precise identification of this last phrase is the only thing that could give us trouble. What tells a modern speaker the relationship between the last two words is, of course, the apostrophe "s" ("Ug's meat") that marks the first noun as possessor of the second. And that's just one example of grammatical morphology – all the words (and things smaller than words, inflections and the like) that don't refer to real-world entities, but indicate relationships between entities (like "to," which gives the idea that things change places), or more formal relationships between different bits of the sentence (like the "s" on present-tense English verbs that tells you their subject is third-person singular).

Notice that even the functions that don't relate to formal structure tell us things about that structure. An apostrophe "s" on a noun indicates part of a genitive (possessive) construction. "The" signals that the rest of a noun-phrase will follow it – we know not to look for any part of that noun-phrase to the left of "the." As markers of structure, and in particular of boundaries between phrases and clauses (not all, but many grammatical items serve this function), all of these things help us to parse and understand sentences. They don't put an end to ambiguity – we don't know, unless explicitly told, whether "I saw the boy with the telescope" means "I saw the boy who had a telescope" or "I used a telescope to see the boy." But they reduce ambiguity to an acceptable level.


We've looked at how syntax works when we're parsing and comprehending sentences. Now let's see how it works in production. In the epoch of protolanguage, the impulse codes that represented words were presumably dispatched one at a time to the motor areas controlling speech. Once the social calculus had been mapped onto protolanguage, things would have changed. If the brain happened to come up with a verb, the regions where nouns and verbs were stored – the temporal and frontal lobes – immediately started talking with one another.

Say the verb was "kick." The frontal lobe called up the temporal lobe, so to speak, and said, "Hey, this needs two arguments, so send me up an agent and a theme, a kicker and a kickee." Say the temporal lobe sent "girl" and "cat." Quick as a flash, the frontal lobe started to fit together first "kickcat" and then "girlkickcat." If you want to know when this started happening, let's say maybe 150,000 years ago.

Nowadays, of course, it's a bit more bureaucratic. Before being incorporated, nouns now have to be checked with the memory store to see if the hearer ought to know what you're talking about; depending on the result, you'll get either "a girl" or "the girl." Verbs, before being incorporated, have to hook up with morphemes that indicate things like number and person and tense, adding things like "-ed" to make "kicked," so the message that finally arrives at the motor controls of speech is whatever sequence of signals spells out "thegirlkickedthecat."
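As a cartoon of those bookkeeping steps (a sketch of the sequence just described, not a claim about actual neural machinery), the pipeline might look like this:

```python
# Cartoon of the production steps above, not a claim about neural wiring:
# nouns are checked against a memory of what the hearer already knows,
# the verb picks up its tense morphology, and the result is shipped out.
hearer_knows = {"girl", "cat"}          # referents already introduced

def noun_phrase(noun):
    article = "the" if noun in hearer_knows else "a"
    hearer_knows.add(noun)              # later mentions will get "the"
    return f"{article} {noun}"

def past_tense(verb):
    return verb + ("d" if verb.endswith("e") else "ed")

message = f"{noun_phrase('girl')} {past_tense('kick')} {noun_phrase('cat')}"
print(message)                          # -> the girl kicked the cat
```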

This is probably not how you think that you say a sentence. You likely think that you first have a thought, something like <THE GIRL KICKED THE CAT>, and then you find words to fit it, "The girl kicked the cat." But that's just your conscious mind's imaginary reconstruction of something you can't possibly have experienced, and is probably about as accurate as our naïve inference about the sun going around the earth. What actually happens is something much more complex and chaotic than what I described in the previous paragraph.

The thing that you think you deliberately and consciously decided to say is, however, simply the winner in a Darwinian competition between it and dozens of other things, some of similar meaning, others quite different, that you might have said instead. The parts of your mind that deal with language are incessantly bouncing words and phrases about, whether you want them to or not. If you don't utter them, they still appear in the form of inner speech, that James Joycean internal monologue that, when you try to meditate, or sleep after a hectic day, you find almost impossible to turn off. Even sleep doesn't turn them off entirely; they go on to script the dialog in your dreams.


Being human, we like to fool around with our language. Precisely because this template of argument structure remains forever in our subconscious minds, we can afford to fool around, if only within strict limits and in pretty specific situations. For instance, I can omit themes after certain verbs ("Bill's eaten already," "Mary sang nicely"). But what do you suppose Bill ate? Coal? Sawdust? And did Mary sing an aria or the New York phone directory? Obviously what they ate or sang was something edible or singable, and if the kind of food or song doesn't matter, you needn't mention it. There are quite a few verbs like that, verbs that have implicit arguments, like "write" (what else can you write but words?) or "drink" (what can you drink but liquids?). Or take "They played golf," "They played tennis," "They played together." Nobody is going to think that "together" is some new game they hadn't heard of before. When you acquire syntax, you don't lose words, and the meanings of words will clue you in if someone takes liberties with regular syntax.

In addition to leaving things out, we can put things in. We're not limited to just the thematic role(s) that a verb specifies. We can add in optional ones, so long as they're drawn from the very short list available: place (the only other role that's ever obligatory, and that only for a tiny handful of verbs like "put"), time, beneficiary (doing something for someone), instrument (doing it with something), source (taking it from someone or something). Maybe there are one or two more thematic roles – you can get into arguments with linguists (who will argue about almost anything) over exactly how many roles there really are, but there are very few, and I've mentioned all the important and common ones.

But, you may say, all this is predicated on the supposition that things start with a verb – that your brain throws up a verb first and then pulls out things that can carry the thematic roles specified by that verb. Suppose it started with a noun; wouldn't that mess things up?

Not at all. Probably what's driving your train of thought (if you can call it that; it's more like a bunch of Roman chariots racing toward the finish line, jockeying for position, each one trying to knock others out of the race) is the memory or imagination of some complete episode, so that if a noun comes uppermost, it drags the verb of an accompanying action with it.

WHC: That is, by the way, the verb test that I said made the brain area in front of your left ear work so much harder, compared to merely naming the noun.

Even without an episode to hold things together, there's no problem. A very common psychological test involves giving subjects a noun and then asking for a verb that goes with it. Try it on your friends. If you say "bicycle" they'll probably say "ride"; if you say "knife" they'll probably say "cut" (if they say "stab," watch them carefully). If they say "cut" for "bicycle" or "ride" for "knife," or if they just gawk at you, you can assume they're brain-damaged in some way. There are very fast links between verbs and the nouns they habitually associate with, links that can work in either direction.

So far, we've pretty much stuck to nouns and verbs and skipped all the other stuff – the articles and prepositions and particles, not to mention the inflections of nouns and verbs that add information about number and tense. Where do all these things come from?

Well, they're totally absent from ape "talk," and extremely rare, if present at all, in early pidgins and toddler talk. In other words, they're absent from varieties of protolanguage. And this is exactly what you would expect if one of their major functions is that of signaling structure.


But, as pointed out already, argument structure on its own can't remove all the ambiguities from syntax. And so, in the millennia that followed the birth of syntax, our ancestors must have been competing with one another to produce devices that would make that syntax more readily parsable, hence easier to understand automatically.

This means there would have been Darwinian competition, with the best and the brightest trying all manner of ways to disambiguate the ambiguous sentences as economically as they could. In other words, the newly emerged syntax would itself have acted as a selective pressure, tilting the balance in favor of any changes in the nervous system that would lead to the construction of more readily parsable sentences. The consequent adaptations would have improved the fitness of individuals, because those who could get their points across better would have tended to occupy leadership roles and thus gained access to a wider selection of mates. All it takes in evolution is a marginal edge, and if you have it, sooner or later your genes will replace those of others who don't.


Wait, you say, isn't this discredited Lamarckism, the belief that what you do in your lifetime can somehow get into your genes? Not at all. It's a form-follows-function principle known as the Baldwin effect.

James Mark Baldwin, a late nineteenth-century psychologist, pointed out that changes in behavior could change selection pressures. For instance, in the words of one authority on Baldwin, Robert J. Richards,

Some ground-feeding birds that happened to enter a new swampy terrain might learn in each generation to wade out onto a pond to feed off the bottom. Those that were flexible enough to acquire such responses would survive. In time, congenital variations might begin slowly to replace acquired traits, with natural selection molding them into instincts for wading and pecking at the right-sized objects. So what began as learned behavior and acquired modification might in time become innately determined and part of the hereditary legacy of the species. Organic selection [Baldwin's own name for the Baldwin effect] thus imitated Lamarckian inheritance but remained strictly neo-Darwinian.

What happened next in the evolution of syntax most probably followed along these lines. Once our ancestors had changed their behavior by producing complex sentences, a varying aptitude for developing means to make those sentences easier to understand would have come under the automatic and irresistible pressure of natural selection. Those children who could accomplish, spontaneously and automatically, what their elders could only do by conscious and difficult effort – produce devices for disambiguating sentences – would have had a leg up on their less well endowed competitors in the social competitions. There may have been many different devices at first. The way evolution normally works is to throw up all manner of possibilities and then let competition thin them out.

WHC: Not necessarily the "cream of the crop" of early devices, Derek – not any more than VHS was better than Betamax, or Windows was better than the Mac's operating system. Sometimes you just standardize on something that is merely "good enough." In economics, they talk of "market capture." The best doesn't always win, because not everyone can afford to stay in the game.

We can be fairly sure that what we find today represents the cream of whatever crop there was then.

Don't get me wrong, though. What got fixed was definitely not particular markers – words with a specific shape and sound for marking tense or structural boundaries. What got fixed was that there had to be specific markers for such things. Children came to expect them and look for them. And if they didn't find them – if, for example, they received input from a primitive pidgin – they put them back in. If you want to know the formula for what turned out to be the most favored solution, check it out in the appendix.

In other words, what I formerly conceived as a single step from protolanguage to true language can be broken down into two stages, one of exaptation (the core phrase-and-clause-producing argument-structure machine) and one of Baldwinian evolution (adding mechanisms useful for marking the new structures with grammatical morphemes and making them more readily processable). These Baldwinian universals simply formed part of the cascade of change that was triggered the moment the syntax engine started running: a cascade that included more rapid neural processing and clearer and faster articulation, as well as other ambiguity-reducing devices.


However, the need for grammatical morphology to mark unit boundaries and other features meant that the language had to produce new words and inflections (which probably started their lives as full words; see below). These words had little if any referential meaning. (What does "of" mean in "the discovery of America" except "Look out, this noun phrase doesn't end with 'discovery'"?) Where did they come from?

Well, for a typical example, look at what happened in Tok Pisin, a new language in New Guinea that's often called a pidgin but has been a creole since World War II. Its earlier pidgin form had only "him" as third-person-singular pronoun, regardless of position – "Him catch him," for example, though "catch" came out sounding something like "kis." The second "him" got reduced over time to lightly stressed "im," so that speakers began to hear it as attached to the verb that preceded it. Soon it lost its meaning as a separate pronoun and just meant something like, "Wait, a theme argument is coming next." Simultaneously, things were kept acoustically and perceptually separate by changing the form of the third-person-singular pronoun to "em." So "he catches him" is now "em i kisim em," which looks like "him he catches him him" but isn't, of course (the "i" is just another grammatical marker that means, "Look out, a verb is coming next!").

We can only suppose that the earliest true language did the same as modern creoles have done: took content words from the prior protolanguage and bleached and downgraded them, first into free grammatical morphemes, then into mere inflections. Most contemporary creoles have yet to reach the final stage, but the whole cycle has been experienced by many, perhaps all, older languages.

But, you may say, surely protolanguage had some grammatical morphemes. What about words like "in" or "on" or "to" or "from"? Surely they would have needed words like these just to find their way around and tell people where things were.

We've suggested that protolanguage most probably arose out of extractive foraging. Whether it did or not, extractive foraging must have been one of the major uses to which protolanguage was put. In the last few years, biologists, ethologists, and anthropologists alike have so focused their attention on the social life of primates that they seem to have forgotten that primates had to eat. And because, in order to eat, they would almost certainly have foraged in small groups and reported back to the main group, they would have needed to be able to give directions and describe locations, as well as tell what kind of food could be found at the end of the trail.

But how do you arrive at something as abstract as an "on" or an "in"? Again creoles give us clues. In a number of these languages, the preposition corresponding to "on" comes from a noun meaning "the top." The preposition corresponding to "in" comes from a noun meaning "the inside." The preposition corresponding to "under" comes from a noun meaning "the bottom," and so on. In Guyanese Creole you often hear an expression that sounds like "a road kaana." If you recognize the third word as the local pronunciation of "corner," you will assume this means "at the corner of the road." Wrong. It actually means "by the road" or "beside the road." The "a" is a general locative/directional particle that just marks the optional thematic role place, and the noun "corner" is turned into a postposition (that's just a preposition coming after, instead of before, the noun) equivalent to "beside."

So it's quite likely that the original protolanguage got nouns that meant "top," "bottom," "side," "corner," and so on, and that when the syntactic engine came along, these came in handy both to mark the thematic role place and to distinguish the various kinds of place where things might be. Indeed, if you look at creoles, you can find examples of every kind of grammatical morpheme a language might need, all produced by this process of bleaching and downgrading content words. If modern humans can do this kind of thing, why not our ancestors of a hundred-odd thousand years ago?


This chapter has shown how a protolanguage could be turned into a full language by a single exaptation followed by a series of Baldwin effects that any such exaptation would be bound to bring in its wake. But I'm sure a rather obvious question has been bugging you for quite some time now.

I've claimed that protolanguage has been around for maybe as long as two million years, and that a social calculus has been around for a lot longer than that. So how was it that, when protolanguage first emerged, the social calculus didn't immediately get mapped onto it to give us syntax and modern language a couple of million years ago? Why was syntax delayed?

This puzzled me for a long time, in fact until I talked to Bill. Then I knew that what he proposed just had to be right. So I'll let him explain it to you.

