(Note: Published in European Review, 12(4): 551-565 (2004). This version may differ slightly from the printed version; the printed version is the `authorized' version.)
Human language is qualitatively different from animal communication systems in at least two separate ways. Human languages contain tens of thousands of arbitrary learned symbols (mainly words). No other animal communication system involves learning the component symbolic elements afresh in each individual's lifetime, and certainly not in such vast numbers. Human language also has complex compositional syntax. The meanings of our sentences are composed from the meanings of the constituent parts (e.g. the words). This is obvious to us, but no other animal communication system (with honeybees as an odd but distracting exception) puts messages together in this way. A recent theoretical claim that the sole distinguishing feature of human language is recursion is discussed, and related to these features of learned symbols and compositional syntax. It is argued that recursive thought could have existed in prelinguistic hominids, and that the key step to language was the innovative disposition to learn massive numbers of arbitrary symbols.
Pre-theoretically we all have a good idea what a language is. The first thing that comes to mind is our own native language. Hellenocentric ancient Greeks are said to have drawn the boundary of real language at that point, labelling all forms other than their own as merely `barbar'. Many of us still commit the mistake of equating `language' with the standard variety dominant in a culture. Many Arabs, for instance, do not believe that Colloquial varieties of Arabic, which they use systematically every day, are proper languages. Racist myths are still sometimes propagated that languages of peoples with pre-modern material culture, for instance Australian aborigines, may only have vocabularies of a few hundred words, or have no real grammars. In fact, there is no evidence that any existing branch of humanity differs in any way whatsoever, as far as a capacity for complex language is concerned. An overwhelming body of natural experiments, in the form of adoption of children between societies belonging to different language families, testifies that (pathology aside) any human child is born with the capacity to master any human language. Nor has any known human society failed to exploit this capacity; there are no natural self-sustaining groups of humans without a very complex language. Language pathology itself, incidentally, does not seem to correlate with particular populations. A well-known case of hereditary language pathology affects the KE family, who live in London (Gopnik, 1990, 1994). This is now safely attributed to a particular mutation (Lai et al., 2001; Enard et al., 2002). But such language-affecting genetic abnormalities are, as far as we know, equally rare across the world. Complex language defines humans.
The boundary between humans and other species is very clear. Both in terms of phenotypic behavioural traits, the most salient of which is language, and in terms of capacity for interbreeding, there are no semi-humans or subhumans. The statistical success of breeding between members of different groups may vary somewhat. But Homo sapiens is not problematic like a `ring-species', whose members form a continuum, with neighbours on the continuum able to interbreed, but no interbreeding capacity between the distant extremes of the continuum. (See Stebbins, 1949; Din et al., 1996; Wake, 1997; Highton, 1998; Irwin, 2000; Irwin, Bensch & Price, 2001; Wiltenmuth & Nishikawa, 1998.) The relative recency of the spread of humans across the globe, after leaving Africa, in a space of roughly 100,000 years, has left our species with a remarkable uniformity. It seems likely that this geographical explosion was hugely facilitated by the uniform species-specific capacity for language, which enables efficient sharing of complex practical information.
Language makes science possible, by making available a vast web of concepts at least partly defined in terms of words. Language makes it possible to theorize about language itself. At a very basic level, English, for example, provides us with the opportunity to distinguish between the meanings of language and communication. Let's take advantage of what language gives us, and not be loose with such words. In a scientific enterprise, it is counterproductive to conflate language with communication. Almost all species communicate in some way, but this does not equate with having language. To apply the term language to the communication of honeybees, or of vervet monkeys, or of whales, is to miss the chance to express an important qualitative distinction. It is not only that human communication is more complex, as might possibly be measured along some continuum. There are two features of human language (including manual sign languages) that are simply absent from the natural communication systems of any other species. One is learned arbitrary symbols, and the other is recursive, semantically compositional, syntax.
I will in this article concentrate on clarifying what is intended by the descriptors `Learned, arbitrary symbols' and `Semantically compositional syntax', and on exploring the relationship between them that builds the impressive human language capacity. But, while acknowledging the qualitative presence of these features in language, and their qualitative absence in animal communication, a range of other features which humans do share, to some degree or other, with non-humans, should not be forgotten. I will not give an exhaustive list, but such features include a capacity for imitation, some `Theory of Mind', an altruistic, cooperative society, well developed motor control of the organs exapted for communication, and the like. One cannot study, or theorize about, the presence of learned arbitrary symbols or syntax, away from the context of a host of enabling conditions which provided the landscape in which these distinctive features could be exploited. And obviously, even qualitatively new features have precursors -- they don't just come from nowhere. But in this article, I will concentrate on learned arbitrary symbols and recursive compositional syntax.
One can find arbitrary relationships between signified and signifier in animal communication. The most celebrated example is that of vervet monkeys (Seyfarth & Cheney, 1990, 1996), who have been shown to use a `bark', a `cough' or a `chutter' to communicate the presence, respectively, of a leopard, an eagle, or a snake. There is nothing (as far as we know) inherently leopardlike in a bark, or inherently barklike in a leopard. It seems more reasonable to grant that the vervets are using genuinely arbitrary symbols, than to assume that vervets perceive the world in ways radically different from our own, so that for them the relationships here are somehow iconic or causal. So some arbitrary symbolic behaviour can be found in nature. And in fact such very limited arbitrary symbol systems can be seen in many species, especially in their alarm calls. There is a vast difference in degree between the inventories of arbitrary symbols used by animals (up to about thirty distinct calls used by wild chimpanzees) and the vocabularies of human languages, which contain many tens of thousands of items.
The huge quantitative gap between non-human call systems and human vocabularies is complemented by what appears to be a qualitative difference. Human vocabularies are completely learned, in the early lifetimes of individuals. The arbitrary calls of vervet monkeys, and the predator-specific alarm calls of chickens seem to be innate. This needs some discussion. Take the category MARTIAL EAGLE, the main aerial predator on vervets. Young vervets will sometimes give the martial eagle alarm call inappropriately, say in response to a crow, or even to a falling leaf. As they mature, they refine their behaviour, such that as adults they only give the martial eagle call in response to martial eagles. It seems that as newborns they already have a crude elementary category, or concept, of MOVING-THING-OF-A-CERTAIN-SIZE-IN-THE-SKY. This category gets refined as they mature, so there is a certain amount of learning involved in fixing the signified category itself. Vervets do not adapt the acoustic fidelity of their calls as they grow up. ``There is ... no suggestion of experiential effects on the physical characteristics of these signals. Vocal learning does not appear to occur either in the chicken (Konishi, 1963) or in nonhuman primates (Snowdon, 1990; Owren et al., 1992).'' (Evans, 1997:10) Vervet monkey genes determine a vocal tract physiology which itself defines a range of possible noises `within reach'. And probably the barks, coughs and chutters are at the top end of the range of noises which come naturally to vervets, given their physiology.
A symbol embodies a relationship between a signified (some concept or intention) and a signifier (the actual signal, a noise or a gesture). We just looked at the possible innateness of the two ends of the symbolic relationship, the concepts of the various predators and the associated noises. The most interesting question is whether the relationship itself, the vervets' particular arbitrary association between the concepts and the respective noises, is in any sense innate. On the evidence so far, these particular pairings between concepts and noises do seem to be predetermined in young monkeys, rather than learned. A useful, though very difficult, experiment would try to introduce into a population of vervets the `wrong' pairings, say bark for a snake, cough for a leopard, and chutter for an eagle. If it proved (relatively!) easy to convert a group of vervets to this alternative system, the observed arbitrary pairings would not be innate. But the weight of opinion among the relevant researchers is that these specific symbolic relationships are innate in the vervets, and probably all other such animal call systems are equally innately predetermined.
None of this conflicts with the well-known fact that animals are able to learn small sets of arbitrary signal-meaning relationships. The most celebrated trained apes, Kanzi (Savage-Rumbaugh and Lewin, 1994) and Nim (Terrace, 1987), have been able to acquire arbitrary symbolic vocabularies of several hundred items. Domesticated animals can be trained to respond systematically to human words. Charles Snowdon makes an interesting observation: ``Vervet monkeys can learn to respond to the alarm calls of superb starlings (Spreo superbus) (Hauser 1988), and ring-tailed lemurs (Lemur catta) can respond appropriately to the aerial and terrestrial alarm calls given by Verreaux's sifakas (Propithecus v. verreauxi) (Oda & Masataka, 1996). Domestic animals respond to human commands as well. This evidence of cross-species learning goes well beyond within-species learnability.'' (Snowdon, 1999:82) It should come as no surprise that the most innately hardwired, or genetically assimilated, arbitrary symbolic pairings used by a species are those used within the species, and that responses to calls made by other species need to be learned.
Bottlenose dolphins (Tursiops truncatus) have calls that are learned, rather than innate, but their calls cannot be claimed to belong to a learned symbolic communication system in the same sense as the vocabularies of human languages. Each animal develops an individually distinctive signature whistle in the first few months of its life. ``Signature whistle development is strongly influenced by learning. Scientists think that signature whistles are used in individual recognition, which is supported by the fact that they can primarily be heard if individuals cannot see each other.'' (Janik, website) There is some conflict between the idea of learning and the development of an individually distinctive signature whistle. Human learning of vocabulary involves acquiring the same arbitrary pairings between form and meaning as were used in the previous generation, and in most cases have existed for many generations. The dolphins' signature whistles do qualify as arbitrarily pairing a particular whistle with a particular dolphin, rather like a proper name. And once such a naming whistle has been invented by a dolphin to `name' itself, this same whistle can be learned and imitated by other dolphins. But in dolphin society such symbols are invented afresh with each newborn dolphin. There is not the continuity of arbitrary pairing between referents and signals that can be seen in the history of human languages. And dolphins use their calls only to symbolize their own identity.
It is not surprising that communication systems involving arbitrary symbols should be innate, genetically determined within the species that use them. What is truly remarkable is the extent to which humans have radically departed from this pattern. Humans still preserve, universally, some natural signals conveying emotional responses, such as smiling in response to certain kinds of pleasure and wide-open eyes in response to alarm. We give and interpret these few signals naturally, without any apparent need to learn them. But, paradoxically, what is most obviously common to humans is their ability to learn to behave differently, from one group to another. Humans learn tens of thousands of arbitrary meanings within a few years, at some times up to about twenty new items per day. Enormous size of vocabulary necessarily goes with learning. The genome of any complex animal encodes a huge amount of information, determining in byzantine detail and with microscopic accuracy the reliable long-term functioning of the healthy animal. No-one has devised an accurate measure for comparing the quantity of information in the epigenetic instructions that build adult bodies with the quantity of information contained in the dictionary of a typical human language. The evidence indicates that nature's preferred way of endowing an adult creature with a vast vocabulary is not to encode it in the genes but to give the creature the plastic capacity to acquire what it needs during ontogeny. Interestingly, `Nature' has only done this once, with humans. Other species get along fine in their ecological niches with relatively limited innate communication systems.
A species such as Homo sapiens which makes such copious use of arbitrary symbolic mappings between concepts and external forms (words) is an intrinsically information-hungry and trusting species. The ordinary children who pay such rapt attention to the noises made by fellow humans around them are innately disposed to take those noises seriously, as meaning something. And taking the noises seriously can only be sustained if there is some benefit. That is, the noisemakers (the adults around the child) have to be imparting genuine truthful or useful information. The information conveyed by language need not all be useful in the immediately practical sense of helping to put food in the child's mouth or make it comfortable. The child also benefits indirectly from gaining a strong sense of community with the group, able to participate in the group's efforts and share in the rewards. In humans, language contributes enormously to the organization of the group's efforts and the distribution of the booty. Somehow, humans alone got to be a species that could realize the advantage of passing information using cheap arbitrary symbols, whose value everyone accepts on trust.
Let's start by clarifying the meaning of `syntax'. I shall use it here in a way deliberately divorced from semantics. Semantics involves interpretation; syntax only involves patterning. As we are dealing with language, which is mostly realized in linear, sequential behaviour, I'll restrict discussion to patterning in linear (one-dimensional) sequences. This is clearly an oversimplification, because although the raw signal reaching the ear is nothing but a series of different air pressures, the brain immediately transforms this into a representation with more than one dimension, separating out bands of energy at different pitches sharing the same time-slice. The simplification of syntax to patterning in one dimension, the dimension of time, will not be harmful here.
How can sequences of events over time display patterning? Here are some examples. Many birdsongs and the territorial songs of some primates, such as gibbons, are composed of recognizable re-used subunits. That is, a human analyst can break a song up into several distinct portions which can occur in different positions in other songs. The signal is composed of reusable units drawn from an inventory. Any constraints on the sequencing of such units constitute the very rudimentary beginnings of syntactic organization. For instance, if a song or call must always begin with a unit of a certain type, it has a very elementary syntax. Music in the Western tradition is organized into bars, with the same number of beats to a bar throughout any given piece of music. The musical bar is a repeated abstract element. It is abstract, because there are many different possible bars, but they must all conform to the same simple rule. Signals whose only patterned property is that they must begin with a certain type of element, and `music' constrained only to concatenate rhythmic units containing the same number of beats, are at the bottom end of the scale of interest for syntacticians; indeed they are so simple as to be off the scale of interest for syntacticians interested in language. Nevertheless, such patterning shows the germ of syntactic organization.
Any finite set of calls can in principle be stored holistically; each separate call can be individually listed, with any recurring similarities between them, and between their parts, ignored. If every call or song begins with some element drawn from a set of specialized initial elements, and ends with some element drawn from a different set of specialized final elements, and if these sets are quite large, a simple listing of all the possible calls is obviously uneconomical. If there are, say, 20 possible starting elements and another 20 possible finishing elements, the whole set of possible calls numbers 400. It is uneconomical to remember 400 separate calls, when it is easier to store a simple syntactic `grammar' that defines the two sets and the order in which items drawn from them must occur. We assume that animals' brains represent their behavioural patterns somewhat economically. Systems of calls which contain more `slots', with large and different classes of elementary units privileged to occupy these slots, are more likely to be stored in the brains of the controlling animals as simple grammars, rather than as inventories listing all the possible calls independently of each other. Animals with such call systems can be credited with some simple control of syntax. By contrast, it would not be sensible to attribute syntax to a cuckoo, whose call contains two separate notes, always in the same sequence.
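The storage economy just described can be sketched in code. This is a toy illustration only; the element names and the Python framing are my own, not part of any observed call system.

```python
# A toy illustration of the storage economy described above.
# The element names ("start0", "end0", ...) are invented for the sketch.

initial_elements = {f"start{i}" for i in range(20)}  # 20 specialized initial units
final_elements = {f"end{j}" for j in range(20)}      # 20 specialized final units

# Holistic storage: every possible call listed separately.
holistic_inventory = [(s, f) for s in initial_elements for f in final_elements]
assert len(holistic_inventory) == 400  # 20 x 20 calls to memorize one by one

# Grammatical storage: just the two sets (40 items) plus one ordering rule.
def is_well_formed(call):
    """The entire `grammar': an initial unit followed by a final unit."""
    first, last = call
    return first in initial_elements and last in final_elements

assert all(is_well_formed(call) for call in holistic_inventory)
```

The point of the sketch is the asymmetry in storage: the grammar holds 40 elements and one rule, while the holistic inventory holds 400 independent items, and the gap widens multiplicatively as slots or classes are added.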
Defining syntax in this way, some simple examples of syntax can be found in the animal world. Complex birdsong and the territorial or identifying calls of some primates, and probably the calls of cetaceans, such as whales and dolphins, have a degree of syntactic organization. This should come as no great surprise. Calls with this degree of syntax are examples of serial behaviour routines, which are extremely widespread among animals. In many spheres of their lives, animals perform complex acts consisting of routine sequences, with the elementary actions filling the slots in the sequences varying according to the demands of the moment. Generally in non-communicative serial behaviour, such as searching for a twig, picking one up, stripping it of leaves, inserting it into a termite mound, withdrawing it, and eating the termites, each action in the sequence is quite naturally determined by the preceding sequence. You can't strip the leaves off a twig until you have found it and picked it up. But with communicative calls, this element of natural, or externally determined sequencing is absent. A chaffinch call begins with a trill and ends with a flourish; why couldn't it be the other way around? There are no doubt certain physiological constraints making certain orderings preferable. For instance, if a call is produced by expulsion of breath from the lungs, we might expect the last elements of calls to have lower pitch, because less air pressure is available at the end of a call, when most of the air has been emptied out. But such constraints alone cannot explain the wide variety of patterning found in animal calls.
Monkeys can be trained to put sets of objects (typically icons on a computer screen) into fixed sequences, up to about nine objects (Brannon & Terrace, 1998). They do this artificial exercise in a laboratory, and are rewarded for it. The trained bonobo, Kanzi, has developed, untrained, a very strong preference for a particular fixed ordering of two-element signals consisting of a lexigram and a pointing gesture (Savage-Rumbaugh & Lewin, 1994). But what motivates the sequencing of elementary units in an animal call in the wild? If we were discussing human language, a large component of the answer would be obvious. Words are put in different sequences to convey different messages, and the messages are actually composed from the meanings of the words. This is semantic compositionality. A compositionally organized communication system is one in which the meaning of a whole sequence is a function of the meanings of its elementary parts.
Perplexingly, the best example of semantically compositional communication outside humans has been found in a species only very distantly related to us, honeybees. Honeybees communicate the location of food by a two-part signal; one part conveys the distance of the food from the hive, and the other conveys the direction. Composing distance and direction yields location. This system differs in many ways from human languages. Most obviously, it is extremely limited; it is also innate, and iconic, rather than symbolic, as the individual parts of the signal are iconically related to their meanings (faster waggle means nearer food, angle of dance to the vertical means angle of flight to the sun).
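The compositional character of the bee signal can be illustrated with a toy calculation. The function, the coordinate framing, and the numbers here are my own illustrative assumptions, not a model of actual bee behaviour.

```python
import math

# A toy illustration of composing the two parts of the bee signal:
# one part contributes distance, the other direction, and the
# composition yields a location relative to the hive.

def food_location(distance_m, angle_deg):
    """Compose distance and direction (angle measured from a fixed
    reference direction, standing in for the sun's azimuth) into
    x, y coordinates relative to the hive."""
    rad = math.radians(angle_deg)
    return (distance_m * math.sin(rad), distance_m * math.cos(rad))

# Changing either part alone changes the composed meaning:
# same distance with a different direction, or same direction
# with a different distance, picks out a different location.
x, y = food_location(100, 90)
assert abs(x - 100) < 1e-9 and abs(y) < 1e-9
```

This is exactly what semantic compositionality means in miniature: the meaning of the whole two-part signal is a function of the meanings of its parts.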
To date, no clear example of semantically compositional communication has been found in non-human vertebrates. I will mention one example which, it has been suggested, comes close. Klaus Zuberbühler (2002) has found a predator alarm system, like that of the vervet monkeys, in the Campbell's monkeys and Diana monkeys of the Ivory Coast rainforest. They give one call to warn of a leopard and another to warn of a crowned hawk eagle. On hearing these calls, the monkeys take appropriate evasive action. Mainly, these calls are given in isolation, so there is no question of syntax. But the monkeys can produce another call, which Zuberbühler calls a `boom'. When this boom precedes one of the regular alarm calls (often with a surprisingly long interval, as much as 25 seconds), the regular alarm calls do not trigger the usual evasive actions. It is as if the monkeys know that the meaning of the alarm call is modified by the preceding boom. It would be extravagant to attribute any human-like meaning, such as negation, or a modal operator (e.g. maybe), to the boom.
While noting such cases, it is safe to say that semantically compositional learned symbolic signals have not been found in non-human animals. We return to our earlier question: if a degree of syntactic patterning can be observed in the communicative calls of animals, and yet the elementary parts of the calls do not contribute any meaning to the meaning of the whole call, what is the function of the syntactic patterning? Here it is useful to remind ourselves that even humans sometimes learn to repeat quite long sequences of words without compositional semantics.
``According to the most authoritative critical voice, Frits Staal, mantras are pieces of texts basically devoid of meaning which take on the function of ritual objects. Endowed with phonological and pragmatic properties but devoid of syntax and semantics, they do not conform to Western and (non-esoteric) Indian theories of language, therefore they cannot be considered as either linguistic entities or speech acts.'' (Rambelli, 1993, citing Staal, 1985, 1988) Humans are also extremely adept at learning complex musical tunes, an essentially syntactic, a-semantic exercise. We remember complex sentences largely helped by their meanings, but meaning is of no help when it comes to remembering tunes. The individual notes and phrases of a tune don't have meanings like words, which combine to yield the meaning of the whole tune.
One possible function for animal calls with some degree of syntactic patterning, but no compositional semantics, is to display virtuosity and thereby convey fitness. It is possible that there has been some element of sexual selection (as with the peacock's tail) in the evolution of an ability to produce complex calls.
Both humans and many non-human animals show ability to order sequences of actions, including those comprising communicative signals. As mantras and musical tunes show, this ability in humans is not always supported by compositional semantic interpretation of the sequences. And the corresponding ability in animals is never supported by semantic interpretation. Thus both humans and non-humans show some limited raw syntactic ability. Even in raw (i.e. semantically unsupported) syntactic ability, humans outperform non-humans. But the human ability to manipulate extremely long and complex signals is founded on the signals' semantically compositional structure. And this semantic compositionality necessarily rests, in its turn, on the fact that the elementary units of the signal carry (arbitrary) meaning. Without an inventory of elementary symbols, there is nothing for complex syntax to put together in a meaningful way.
A recent widely cited article (Hauser et al., 2002) has argued that the single distinctive feature of the human capacity for language is recursion. Human languages exhibit recursion; animal communication doesn't, and it is an open question whether animal non-communicative cognition shows any recursive capacity. So the argument goes. There is much that is plausible in this position, though I believe it to be oversimplified, and to underrate the power of symbols. We'll examine the issue, beginning with some definitions.
Recursion needs to be distinguished from iteration. Both terms are used with some precision in the theory of computing, but in the borrowing into the discourse of linguistics, some of this precision has been lost. Iteration is doing the same thing over and over again. An iterative procedure makes a computer repeat some action over and over again, until some criterion is met. An example from cookery would be ``Beat the egg-whites until they are stiff''. Walking involves iterated striding.
A recursive procedure is a procedure which is (partially) defined in terms of itself. Such circularity or self-reference is not necessarily vicious. Recursion can lead to infinite regress, although it need not. A non-computational analogue of recursion is provided by the fairground hall-of-mirrors effect. You see a reflection of yourself, and behind it you see a reflection of the reflection, and behind that you see a reflection of the reflection of the reflection, and so on, potentially ad infinitum. As a human, you know the reflections can go on ad infinitum, but we are all let down by the ultimate physical imperfection of the mirrors, no matter how good they might be. A computer program using a badly defined recursive procedure can go into a potentially infinite regress and make the computer grind to a standstill, because it runs out of memory to store a record of the whole mountain of successive loops it has gone through. Memory is essential to recursion, but not to iteration.
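The runaway-regress point can be made concrete with a minimal sketch; the function name and the recursion limit chosen here are my own devices for keeping the demonstration small.

```python
import sys

# A minimal sketch of runaway recursion: a procedure defined in terms
# of itself with no stopping condition exhausts the memory that records
# the nested calls (here, Python's call stack).

def reflect():
    """A reflection of a reflection of a reflection ... with no base case."""
    return reflect()

sys.setrecursionlimit(1000)  # keep the demonstration small
caught = False
try:
    reflect()
except RecursionError:
    caught = True  # the regress was cut short when call-stack memory ran out
assert caught
```

Just as the hall of mirrors is let down by the physical imperfection of the mirrors, the program is let down by finite memory: each pending call must be remembered until it returns, which is precisely why memory is essential to recursion but not to iteration.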
In computing, a recursive procedure can often be defined as an iterative one. I'll give an example from syntax. Take sentences of a type used just above, sentences like this:
You see yourself.
You see yourself looking at yourself.
You see yourself looking at yourself looking at yourself.
You see yourself looking at yourself looking at yourself looking at yourself.
Here is an iterative specification of such sentences:
Start with You see yourself and then add as many instances of looking at yourself as you like, until you get tired, or are murdered by your companion.
Note that no part of this instruction refers to itself, so it is not
recursive. Now, here is a recursive way of defining such sentences:
Make a sentence of the form You see SOMETHING where SOMETHING is not a word you will actually say, but a place-holder for the object-phrase of the sentence. In its turn, you form this SOMETHING either as the simple word yourself or as the phrase yourself looking at SOMETHING.
Here, the SOMETHING is an abstract element defined recursively.
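The two definitions can be rendered as short procedures; this is a sketch, and the function names and the depth parameter n are my own devices for generating the sentences.

```python
# A sketch of the two definitions above, in Python.

def sentence_iterative(n):
    """The iterative definition: start with 'You see yourself', then
    add n instances of 'looking at yourself'. No self-reference."""
    parts = ["You see yourself"]
    for _ in range(n):
        parts.append("looking at yourself")
    return " ".join(parts)

def something(n):
    """The recursively defined SOMETHING place-holder: either the simple
    word 'yourself', or 'yourself looking at' followed by another SOMETHING."""
    if n == 0:
        return "yourself"
    return "yourself looking at " + something(n - 1)

def sentence_recursive(n):
    """The recursive definition: 'You see' followed by a SOMETHING."""
    return "You see " + something(n)

# Both definitions generate exactly the same set of sentences.
for n in range(4):
    assert sentence_iterative(n) == sentence_recursive(n)
```

Note that `something` calls itself, while `sentence_iterative` merely repeats an action; the two procedures produce identical strings, which is the point taken up next, that they nevertheless assign different structures to those strings.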
These two alternative ways of defining the same set of sentences implicitly assign different structures to the sentences. The iterative definition assigns either a `flat' structure, in which the added instances of looking at yourself simply follow one another, or a right-branching structure. In the structure assigned by the recursive definition, by contrast, the successive instances of the SOMETHING phrase are nested inside each other, like Russian dolls. This recursive structure makes a number of combined grammatical and semantic claims. In particular, it represents the object of the main verb see as the whole rest of the sentence, and the object of the first looking at as the whole SOMETHING that follows it, and so on. This is correct; that is how the sentence is understood, if it is seriously taken as meaningful, rather than as merely a trick utterance which happens to repeat itself.
As another example, consider phrases such as John's brother's wife, or John's brother's wife's uncle, or John's brother's wife's uncle's neighbour. These are not so outlandish. English grammar places no principled limit on the length of such phrases, though our memory and patience inhibit us from producing or decoding ones which get too long. A recursive definition of such phrases might label such phrases as SOMEONE phrases, and define them, in brief as: SOMEONE'S NOUN, where SOMEONE can also be a Proper Name, such as John. Here again, the recursive definition gets it semantically right, because it identifies such strings as John, John's brother, John's brother's wife, and so on, as constituents which can be directly mapped onto their real referents, the actual John, his brother, and his sister-in-law. By contrast, an iterative definition that might be proposed, just repeating ad lib any number of instances of -'s NOUN after the initial Proper Name would be semantically unmotivated.
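This recursive definition can likewise be sketched as a procedure; the function and the list of relational nouns are illustrative assumptions of mine.

```python
# A sketch of the recursive SOMEONE definition above.

def someone_phrase(nouns, name="John"):
    """SOMEONE is either a Proper Name, or SOMEONE 's NOUN.
    Each layer of the recursion is a constituent with a real referent:
    John, John's brother, John's brother's wife, and so on."""
    if not nouns:
        return name
    return someone_phrase(nouns[:-1], name) + "'s " + nouns[-1]

assert someone_phrase([]) == "John"
assert someone_phrase(["brother", "wife"]) == "John's brother's wife"
assert someone_phrase(["brother", "wife", "uncle", "neighbour"]) == \
    "John's brother's wife's uncle's neighbour"
```

Because each recursive call returns a complete SOMEONE phrase, every intermediate string the procedure builds is itself a constituent that maps onto a referent, which is just what the iterative `repeat -'s NOUN ad lib' definition fails to capture.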
An organism that can handle recursion needs a certain kind of memory. It has to be able to keep track of each successive layer of nesting for the purpose of assigning meanings. Informally, such a creature has to be able to remember why it is processing a larger X, at the same time as it is processing some number of smaller Xs inside the larger one, for all Xs down the line. The claim of Hauser et al. (2002) is that humans alone, at least as far as their communication systems (languages) are concerned, have this recursive capacity. The discussion in that paper is remarkably taciturn on the issue which I have taken here to be most relevant, namely the role of compositional semantics in motivating recursive syntax.
To relate the recursion/iteration issue to the earlier discussion, recursive definitions of human sentences are faithful to their compositional semantics. Even the most syntactically complex imaginable birdsong, so long as it is not designed to be interpreted compositionally, as a function of the meanings of its elementary parts, would not require a recursive description. Many varieties of birdsong involve repeating certain elements, often many times; these varieties of song exploit iteration, but not recursion. Not only is the syntax of human languages distinct from animal calls in being semantically compositional; it is also distinctively recursive.
While compositional semantics provides the motivation for syntactic recursion, it does not absolutely necessitate it. It is possible to imagine a quasi-human language which only had main clauses (i.e. no subordinate clauses) and no embedded possessives or self-embedded constructions of any kind. Such a language could possibly be very rich, with a vocabulary as large as that of any real language, and might even have `simple' sentences much longer and more elaborately structured than those of real languages using recursion. In fact, although the introduction of recursion might be seen as an evolutionary step up in complexity from non-recursive languages, recursion can actually be seen as a simplifying device, allowing more efficient storage of possible sentence patterns compatible with their meanings.
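The compression point can be illustrated briefly. Two rules suffice to generate unboundedly many possessive-phrase patterns, where a non-recursive language would need a separately stored template for every depth. The names and nouns below are illustrative only.

```python
# Two rules: SOMEONE -> ProperName | SOMEONE 's NOUN
# Invented toy lexicon, for illustration only.
NAMES = ["John"]
NOUNS = ["brother", "wife"]

def exact(depth):
    """All SOMEONE phrases with exactly `depth` possessive embeddings."""
    if depth == 0:
        return list(NAMES)                      # base case: Proper Name
    return [p + "'s " + n for p in exact(depth - 1) for n in NOUNS]

# Depths 0..3 yield 1 + 2 + 4 + 8 = 15 distinct patterns,
# all from the same two stored rules:
print(sum(len(exact(d)) for d in range(4)))  # -> 15
```

A non-recursive grammar covering the same ground would have to list each of these templates individually, which is the sense in which recursion is a simplifying device.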
The argument that recursive syntax is motivated by semantic considerations presupposes that the conceptual structures expressed by the sentences of languages are themselves best characterized by recursive descriptions. Sentences with clauses embedded within clauses, embedded within clauses, map compositionally onto non-linguistic representations which are similarly nested. If an ape can think that another ape thinks that a third ape is its ally, its thoughts have a recursive structure. The extent to which apes can represent such nested beliefs about the beliefs of others is uncertain, and controversial. (See Call et al., 1998; Heyes, 1998; Tomasello et al., 2003.) If non-human animals know in some sense that things have parts that have subparts which have subsubparts, then again their mental representations, independent of language, have a recursive structure. It is not known whether animals are capable of such mental representations.
``If one entertains the hypothesis that recursion evolved to solve other computational problems, such as navigation, number quantification or social relationships, then it is possible that other animals have such abilities. Why did humans, but no other animal, take the power of recursion to create an open-ended and limitless system of communication?'' (Hauser et al. 2002:1578)
The question is exactly the right one, but the authors show no sign of seeking an answer in the capacity to acquire arbitrary symbols. The Chomskyan approach to language has always emphasized the individual mind, and paid scant attention to the interaction of individual minds in social groups. A disposition to acquire and use arbitrary elementary symbols in massive numbers characterizes humans no less than recursive syntax. And it is on the foundation of symbols that the recursive syntax of communicative language is built. A group organized in such a way that the arbitrary symbols uttered by speakers are taken on trust as significant fosters an appetite in the hearers for acquiring knowledge of more symbols. If some capacity for recursive calculation existed before the rise of learned arbitrary symbols, it is not surprising that this capacity should immediately be co-opted for symbolic communicative purposes. The real human breakthrough was the use of, and trust in, learned symbols, as Terry Deacon (Deacon, 1997) and Chris Knight (Knight, 2002), in their radically different ways, have emphasized.
(I gratefully acknowledge the influence of conversations with Mike Oliphant, Terry Deacon and Chris Knight in helping me to work this out. Apologies if they'd prefer not to be blamed.)
Brannon, Elizabeth M., & Herbert Terrace, 1998, ``Ordering of the Numerosities 1 to 9 by monkeys'', Science, 282(5389):746-749.
Call, J., Hare, B., & Tomasello, M., 1998, ``Chimpanzee gaze following in an object-choice task'', Animal Cognition, 1:89-99.
Deacon, Terrence, 1997, The Symbolic Species: The Co-evolution of Language and the Brain, W. W. Norton, New York.
Din, W., R. Anand, P. Boursot, D. Darviche, B. Dod, E. Jouvin-Marche, A. Orth, G. P. Talwar, P. A. Cazenave, and F. Bonhomme, 1996, ``Origin and radiation of the house mouse: Clues from nuclear genes'', Journal of Evolutionary Biology, 9(5):519-539.
Enard, Wolfgang, Molly Przeworski, Simon E. Fisher, Cecilia S. Lai, Victor Wiebe, Takashi Kitano, Anthony P. Monaco, and Svante Pääbo, 2002, ``Molecular evolution of FOXP2, a gene involved in speech and language'', Nature, 418:869-872.
Evans, Christopher S., 1997, ``Referential Signals'', Perspectives in Ethology, 12:99-143.
Gopnik, Myrna, 1990, ``Feature Blindness: a Case Study'', Language Acquisition, 1(2):139-164.
Gopnik, Myrna, 1994, ``Impairments of Tense in a Familial Language Disorder'', Journal of Neurolinguistics, 8(2):109-133.
Hauser, Marc D., 1988, ``How infant vervet monkeys learn to recognize starling alarm calls'', Behaviour, 105:187-201.
Hauser, Marc, Noam Chomsky, and Tecumseh Fitch, 2002, ``The Faculty of Language: What Is It, Who Has it, and How Did It Evolve?'', Science, 298:1569-1579.
Heyes, Cecilia M., 1998, ``Theory of Mind in nonhuman primates'', Behavioral and Brain Sciences, 21:101-134.
Highton, R. 1998, ``Is Ensatina eschscholtzii a ring-species?'', Herpetologica, 54(2):254-278.
Irwin, D.E., 2000, ``Song variation in an avian ring species'', Evolution, 54(3):998-1010.
Irwin, D.E., S. Bensch and T. D. Price, 2001, ``Speciation in a ring'', Nature, 409(6818):333-337.
Janik, Vincent M., website http://www.st-andrews.ac.uk/~bmscg/Tursiops.htm
Knight, Chris, 2002, ``Language and Revolutionary Consciousness'', In The Transition to Language, edited by Alison Wray, Oxford University Press, pp.138-160.
Konishi, M., 1963, ``The role of auditory feedback in the vocal behavior of the domestic fowl'', Zeitschrift für Tierpsychologie, 20:349-367.
Lai, Cecilia S.L., Simon E. Fisher, Jane A. Hurst, Faraneh Vargha-Khadem, and Anthony P. Monaco, 2001, ``A forkhead-domain gene is mutated in a severe speech and language disorder'', Nature, 413:519-523.
Oda, R., and N. Masataka, 1996, ``Interspecific responses of ring-tailed lemurs to playback of antipredator alarm calls given by Verreaux's sifakas'', Ethology, 102:441-453.
Owren, M.J., Dieter, J.A., Seyfarth, R.M. and Cheney, D.L. 1992, ``Food calls produced by adult female rhesus (Macaca mulatta) and Japanese (M. fuscata) macaques, their normally-raised offspring, and offspring cross-fostered between species'', Behaviour, 120:218-231.
Rambelli, Fabio, 1993, ``Sounds for Thought'', The Semiotic Review of Books, 4(2):1-2.
Savage-Rumbaugh, S., & R. Lewin, 1994, Kanzi: The Ape at the Brink of the Human Mind, Wiley, New York.
Seyfarth, Robert M., and Dorothy Cheney, 1990, How Monkeys See the World, Chicago: University of Chicago Press.
Seyfarth, Robert M., and Dorothy Cheney, 1996, ``Inside the mind of a monkey'', in Readings in Animal Cognition, edited by Marc Bekoff and Dale Jamieson, MIT Press, Cambridge, MA, pp.337-343. (Reprinted from New Scientist, 4th January 1992, 25-29.)
Snowdon, Charles T., 1990, ``Language capacities of nonhuman animals'', Yearbook of Physical Anthropology, 33:215-243.
Snowdon, Charles T. 1999, ``An Empiricist View of Language Evolution and Development'', in The Origins of Language: What Nonhuman Primates Can Tell Us, edited by Barbara J. King, School of American Research Press, Santa Fe, New Mexico. pp.79-114.
Staal, Frits, 1985, ``Mantras and Bird Song'', Journal of the American Oriental Society, 105(3):549-558.
Staal, Frits, 1988, Rules Without Meaning: Essays on Ritual, Mantras and the Science of Man, Peter Lang, New York.
Stebbins, Robert C., 1949, Speciation in salamanders of the plethodontid genus Ensatina, University of California Press, Berkeley.
Terrace, Herbert, 1987, Nim, Columbia University Press, New York.
Tomasello, M., Call, J., and Hare, B., 2003, ``Chimpanzees understand psychological states -- the question is which ones and to what extent'', Trends in Cognitive Sciences, 7:153-156.
Wake, D.B., 1997, ``Incipient species formation in salamanders of the Ensatina complex'', Proceedings of the National Academy of Sciences of the United States of America, 94(15):7761-7767.
Wiltenmuth, E. B., and K. C. Nishikawa, 1998, ``Geographical variation in agonistic behaviour in a ring species of salamander, Ensatina eschscholtzii'', Animal Behaviour, 55:1595-1606.
Zuberbühler, Klaus, 2002, ``A syntactic rule in forest monkey communication'', Animal Behaviour, 63(2):293-299.