Without Miracles

11 The Evolution, Acquisition, and Use of Language

It [language] certainly is not a true instinct, for every language has to be learned. It differs, however, widely from all ordinary arts, for man has an instinctive tendency to speak, as we see in the babble of our young children; while no child has an instinctive tendency to brew, bake, or write. Moreover, no philologist now supposes that any language has been deliberately invented; it has been slowly and unconsciously developed by many steps.

--Charles Darwin[1]

Of all the behaviors in which humans engage, probably none is so complex and yet commonplace as speaking and listening to language. Indeed, it is rather unusual to observe a gathering of two or more conscious humans who are not involved in the continuous use of language.

It is not difficult to think of many ways in which language is an important tool for the human species. Perhaps most important, language helps us to accomplish things that no individual could achieve alone. Organized hunting, warfare, communal agriculture, and the construction of dams, canals, roadways, buildings, and transport vehicles all depend on language to coordinate the activities of the individuals involved. Language allows people to share their experiences, successes, and failures with others, making it possible for knowledge to be shared among the members of a community. Language in its more permanent written form, first on stone, then paper, and now increasingly on computer disks, makes it possible for us to understand something about those who lived in other places and times. And language, both spoken and written, appears essential for the development of science and technology in that it allows individuals to make their ideas and theories public and thus subject to both the skepticism and further development of their peers.

We may normally first think of language as a tool for communication among individuals, but it also appears to be used silently as a medium of thought. Although its specific role in facilitating and shaping our thought processes remains the subject of lively debate among scholars, the subjective experience of being human suggests that language plays an important role in human cognition and consciousness.[2]

Human language is also remarkable for how it is put together, that is, its grammar. The structural complexity of all 5000 to 6000 extant languages is such that despite centuries of analysis, linguists have yet to come up with a complete and accurate grammatical description of any one of them. Linguists have discovered important regularities in many languages, and have shown how languages that appear quite different on the surface share underlying similarities. Nonetheless, certain structural aspects of all languages have so far eluded formal understanding. Since we are so intimately familiar with our own native tongue, we seldom consider this complexity, yet it becomes evident to anyone who has attempted to learn a foreign language as an adult.

This complexity would be just a curiosity, rather than a puzzle of fit demanding explanation, if it did not contribute to the usefulness and expressive power of language. We all experience occasional difficulty in putting our thoughts into words, but all languages, from the cautious, conservative French of the academicians to the ever-changing inner-city English of African-Americans, provide for a wide range of expressive possibilities, and we are seldom unable to describe or comment on what we consider to be an important aspect of our physical or social world. We have nouns to refer to both physical objects such as cars and horses and abstract concepts such as justice and love. We have verbs such as walk to refer to actions, others such as believe to refer to the states of organisms and objects, and still others such as sit which can refer to either states or actions. We have adjectives such as red to modify nouns, and adverbs such as quickly to modify verbs and adjectives. We have grammatical devices such as word endings and word orders to signal the relationships among nouns and verbs. We have tense and mood systems to specify further the time and manner in which events did take, will take, or would take place. We can use language deceptively to describe events that did not or will not take place, and we can even use it imaginatively and hypothetically to describe things that might exist, will exist, or that we would like to exist.

We also have to consider the human physiological characteristics that make language possible, and how language fits so well the physical and social environment in which we use it. The human vocal tract is unique among all animals in allowing the production of a rich variety of sounds that are used for speaking. This is accomplished by producing a continuous stream of sounds (most originating in the vocal cords) that is modified by actions of the tongue, lips, and teeth. Large portions of the brain are involved in producing and receiving speech, and newborn infants appear to be "prewired" for the categorical perception of speech sounds. The use of sound as a medium of language is well matched to the ocean of air around us. Speech sounds travel quickly through air and, unlike light, can penetrate walls and turn corners. Speaking, rather than gesturing manually, keeps the hands free for other tasks and is effective in both the light of day and dark of night.[3]

Human language thus appears to be very well designed for communicating with others and controlling our physical and social environments. It provides another striking example of a puzzle of fit. We must consider at least three aspects of language. First is its origin and evolution as a characteristic of our species. Second is the puzzle of how it is possible that all normal children, who take so many years to develop fully their cognitive, physical, and social skills, are able to learn the language of their community with amazing speed and apparent ease. Finally, we must look into just how language allows us to communicate our thoughts, questions, requests, and desires to others.

The Origin and Evolution of Language

Attempting to reconstruct the evolution of human language is fraught with difficulties. Whereas the physical remains of our ancestors may endure many millions of years, unfortunately we have no records of early speech since "language leaves no bones."[4] However, this has not prevented many scholars from proposing accounts of the origin of language and how it evolved into the thousands of tongues that are spoken today. "For instance, it has been argued that language arose from mimicry of animal calls, imitations of physical sounds, or grunts of exertion--the infamous 'bow-wow,' 'ding-dong,' and 'heave-ho' theories."[5] Such unfounded speculations became so rampant in nineteenth-century Europe that in 1866 the Société de Linguistique de Paris banned the topic altogether from its meetings and publications.

But although evolutionary evidence of a purely linguistic nature does not exist, recent research on hominid fossils and studies of the modern human brain and vocal tract are beginning to shed light on the evolution of language, moving us beyond the realm of pure speculation.

The Biological Prerequisites of Human Speech

Since speaking requires the production of a continuously and finely varied stream of sound, we can gain some knowledge concerning the evolution of language by studying the evolution of the vocal tract and comparing it with corresponding systems in other mammals. The human system is different from that of all other terrestrial mammals in one striking way. Darwin himself noted "the strange fact that every particle of food and drink which we swallow has to pass over the orifice of the trachea, with some risk of falling into the lungs."[6] Unlike mammals that maintain separate pathways for breathing and feeding, enabling them to breathe and drink at the same time, adult humans are at much higher risk of having food enter their respiratory systems; indeed, many thousands die each year from choking.[7] In addition, our relatively short palate and lower jaw are less efficient for chewing than those of nonhuman primates and our early human-like ancestors, and provide less space for teeth.[8]

But if the design of the human throat and mouth is far from optimal for eating and breathing, it is superbly suited for producing speech sounds. All mammals produce oral sounds by passing air from the lungs through the vocal cords, which are housed in the larynx (or "Adam's apple"). The risk of choking to which we are exposed results from our larynx being located quite low in the throat. This low position permits us to use the large cavity above the larynx formed by the throat and mouth (supralaryngal tract) as a sound filter. By varying the position of the tongue and lips, we can vary the frequencies that are filtered and thus produce different vowel sounds such as the [i] of seat, the [u] of stupid, and the [a] of mama.[9] We thus see an interesting trade-off in the evolution of the throat and mouth, with safety and efficiency in eating and breathing sacrificed to a significant extent for the sake of speaking. This suggests that the evolution of language must have provided advantages for survival and reproduction that more than offset these other disadvantages. We will save discussion of what these advantages might have been for a bit later.

The importance of the evolution of the human vocal tract in fitting the functions of speech is suggested by studies of Neanderthal man, who lived in Europe and the Middle East about 100,000 years ago. To find out what types of speech sounds Neanderthals might have been able to produce, Philip Lieberman, of Brown University in Rhode Island, and his associates reconstructed the vocal tract of these hominids based on fossil evidence and applied computer-modeling techniques to it. They concluded that Neanderthals had a relatively high larynx and relatively flat tongue, and could therefore have produced only a limited number of nasal vowel sounds. Most significant, they would have been unable to produce the sounds [i], [u], and [a], the three quantal vowel sounds that are most easily distinguished by human listeners. Thus Lieberman concludes that even "if Neanderthal hominids had had the full perceptual ability of modern human beings, their speech communications, at minimum, would have had an error rate of 30% higher than ours."[10]

Also existing in Europe at the same time were the Cro-Magnons, a separate hominid species who appear to have been slightly taller but with lighter bones and less powerful muscles than the Neanderthals. One could easily imagine that the Neanderthals' superior strength would have been an advantage for hunting and for any competitive encounters with their Cro-Magnon "cousins." But the Cro-Magnons appeared to have one important advantage in their favor--a modern vocal tract capable of producing all the sounds of human speech. It is therefore tempting to speculate that both the disappearance of Neanderthals about 35,000 years ago and the survival and continued evolution of Cro-Magnons into what we are today were due at least in part to the superior linguistic ability of our ancestors.[11]

The study of the evolution of our vocal tract also provides hints concerning the evolution of our brain. Obviously, the throat and mouth would not have evolved the way they did to facilitate language production and comprehension while compromising eating and respiration if the brain had not been capable of producing and comprehending language.

Lieberman has proposed three major stages in the evolution of the neural bases for language. First was lateralization of the brain, meaning that each half became specialized for different functions. For most of us, the left hemisphere provides most of the neural circuitry required for language production. For about 90% of us, it also controls the dominant (right) hand used for tasks involving fine motor control, suggesting that lateralization may have originally evolved in response to selection pressure for skilled hand movements. As neurologist Doreen Kimura noted,

[making and using tools] requires the asymmetric use of the two arms, and in modern man, this asymmetry is systematic. One hand, usually the left, acts as the stable balancing hand; the other, the right, acts as the moving hand in such acts as chopping, for example. When only one hand is needed, it is generally the right that is used. It seems not too farfetched to suppose that cerebral asymmetry of function developed in conjunction with the asymmetric activity of the two limbs during tool use, the left hemisphere, for reasons uncertain, becoming the hemisphere specialized for precise limb positioning. When a gestural system [for language] was employed, therefore it would presumably also be controlled primarily from the left hemisphere. If speech were indeed a later development, it would be reasonable to suppose that it would also come under the direction of the hemisphere already well developed for precise motor control.[12]

That brain lateralization had prelinguistic origins is supported by recent findings of handedness and lateralization among nonhuman primates.[13]

The second component of language evolution involved the evolution of brain structures responsible for the voluntary, intentional control of speech. Although we usually take the voluntary and intentional nature of language for granted, it is of interest to contrast human use of language with the communication systems of other animals. Of particular interest are the chimpanzee observations of Jane Goodall:

Chimpanzee vocalizations are closely tied to emotion. The production of a sound in the absence of the appropriate emotional state seems to be an almost impossible task for a chimpanzee. . . . A chimpanzee can learn to suppress calls in situations when the production of sounds might, by drawing attention to the signaler, place him in an unpleasant or dangerous position, but even this is not easy. On one occasion when Figan was an adolescent, he waited in camp until the senior male had left and we were able to give him some bananas (he had none before). His excited food calls quickly brought the big males racing back and Figan lost his fruit. A few days later he waited behind again, and once more received his bananas. He made no loud sounds, but the calls could be heard deep in his throat, almost causing him to gag.[14]

Figan's difficulty in concealing news of the bananas from his associates contrasts sharply with the ease with which humans can use language to deceive and manipulate others, to talk of the past, and to plan for the future. Lieberman attributes our control over language to certain changes in the brain, including the evolution of what is referred to as Broca's area, as well as the enlargement of the prefrontal cortex (the part of the brain just behind the forehead) and a rewiring of concentrations of neurons referred to as the basal ganglia.

The third component in the evolution of human language involved the ability to put sounds and words in specific orders and to perceive these orders as meaningful. In all languages, the order in which words and parts of words are produced and perceived is crucial to the meaning. The sentence Mary saw John conveys a different meaning from John saw Mary. Of all the communication systems used by the earth's animals, it appears that only human language derives its expressive power from the recombination of a finite (though large) number of words and word parts into an infinite number of different sequential orders. The ways in which words may be ordered, and how these different orders relate to meaning, constitute a language's syntax.
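The combinatorial point is easy to make concrete with a toy calculation. The short sketch below, which is purely illustrative (the three-word example sentence comes from the text; everything else is an assumption for demonstration), shows both that reordering the same words yields distinct sequences and that the number of possible orderings grows explosively with sentence length:

```python
from itertools import permutations
from math import factorial

# The same three words, reordered, convey different meanings (or none at all).
words = ["Mary", "saw", "John"]
orders = [" ".join(p) for p in permutations(words)]
print(orders)  # 6 distinct orderings, including "Mary saw John" and "John saw Mary"

# Orderings of n words grow as n!: a mere 10-word sentence already
# admits over 3.6 million possible sequences.
print(factorial(3))   # 6
print(factorial(10))  # 3628800
```

Of course, syntax is far more than permutation counting: a grammar picks out the small subset of orderings that are meaningful and assigns each its interpretation, which is why the number of possible sentences is effectively unbounded even though vocabulary is finite.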

Lieberman suggests that the evolution of motor control for speech itself provided the basis for the development of syntax. This is because articulating even a simple word such as cat requires the precisely timed sequential coordination of movements of the tongue, lips, and jaws. The order of sounds within words also makes a difference as to their meaning, with cat different from tack. Lieberman thus concludes that "speech motor control is the preadaptive basis, that is, the starting point [for syntax]. Once syntax became a factor in human communication, the selective advantages that it confers . . . would have set the stage for natural selection that specifically enhanced these abilities--independently of speech motor control."[15]

The importance of language and the advantages it provides us in communicating, coordinating our activities, and thinking suggest that, in addition to being a product of our evolution, language also played a large part in shaping our evolution, particularly that of our brain. Lieberman has consequently

. . . propose[d] that natural selection to enhance faster and more reliable communication is responsible for the second stage of the evolution of these mechanisms--the evolution of the modern human brain. Communication places the heaviest functional load on "circuitry" for both electronic devices and brains. The transistors and solid-state devices that made digital computers a useful tool were first developed for communication systems. Indeed one can argue that the demands of communication preempt the highest levels of technology and organization of a culture, whether couriers on horses or lasers and fiber-optic bundles are the means employed. In short, evolution for efficient, rapid communication resulted in a brain that has extremely efficient information-processing devices that enhance our ability to use syntax. These brain mechanisms also may be the key to human cognitive ability. As many scholars have noted, human language is creative; its rule-governed syntax and morphology allow us to express "new" sentences that describe novel situations or convey novel thoughts. The key to enhanced cognitive ability likewise seems to be our ability to apply prior knowledge and "rules" or principles to new problems.[16]

Given the progress that has been made in understanding the evolution of language, together with modern biology's acceptance of natural selection as the explanation for the appearance of design in the structure and behavior of organisms, it might come as somewhat of a surprise to learn that some scholars reject natural selection as an explanation for the appearance, structure, and use of language. What is particularly noteworthy about these critics of natural selection is that some of them are widely recognized as leading thinkers and researchers in their respective fields, among them Noam Chomsky and Stephen Jay Gould.

Chomsky not only rejects natural selection as an explanation for the evolution of human language, but also rejects Darwinian explanations for certain well-understood biological phenomena. He has stated that "evolutionary theory appears to have very little to say about speciation, or about any kind of innovation. It can explain how you get a different distribution of qualities that are already present, but it does not say much about how new qualities can emerge."[17] This is a curious statement, given that Darwin proposed natural selection to account for speciation specifically and that the essence of his theory is still accepted today by mainstream biologists as the sole explanation for adaptive evolution and speciation. As for evolution "not say[ing] much about how new qualities can emerge," those unconvinced by the biological evidence will see in part IV how artificial evolution is being applied increasingly to create useful, innovative products from drugs to computer programs. An attempt to understand better Chomsky's rationale for rejecting a selectionist account of language will be offered later in this chapter, where we take a look at how children learn the language of their community.

Gould's reluctance to accept a selectionist account of human language stems from his more general concern that adaptationist explanations of biological traits are often misapplied. In an important paper,[18] he and Lewontin proposed that certain biological traits may not be due solely to adaptive natural selection, but rather may have their origins as side effects or by-products of other evolutionary changes, which are then "seized" at a certain time by a new function. We referred to this phenomenon of exaptation (originally called preadaptation by Darwin) in addressing the evolution of the human brain in chapter 5. The exaptationist perspective proposes that the ability to communicate by language had little or no role in the changes that made the brain capable of language. "The brain, in becoming large for whatever adaptive reasons, acquired a plethora of cooptable features. Why shouldn't the capacity for language be among them? Why not seize this possibility as something discrete at some later point in evolution, grafting upon it a range of conceptual capacities that achieve different expression in other species (and in our ancestry)?"[19]

Perhaps we can better understand the exaptationist view of language by leaving biology for a short time and considering an example of technological change.[20] The very first cameras had chemically treated plates of glass that captured the image projected by the lens. In attempts to make photography less costly and more convenient, celluloid sheets were introduced, followed by rolls of celluloid film. But whereas long rolls of flexible film were intended solely to facilitate still photography, they made it possible also to take many pictures in quick succession, thus leading to the development of the motion picture camera. We therefore cannot say that roll film evolved to make the motion picture camera possible, since the idea of a motion picture camera probably did not even exist when the first rolls of film were produced. Instead, to use Darwin's term, the roll film of the still camera was preadapted, although quite accidentally and unintentionally, for use in the motion picture camera. To use Gould's more neutral and more accurate terminology, this feature of the still camera was exapted for use in motion picture cameras. So, in effect, Chomsky and Gould assert that the human brain is analogous to roll film in that it evolved for reasons originally unrelated to language concerns; but once it reached a certain level of size and complexity, language was possible.

Exaptation is an important conceptual tool in understanding the evolution of biological structures and behaviors, but it alone cannot account for the continued evolution of adapted complexity. Although roll film made the first motion picture cameras possible, modern movie cameras and the film they use are more complex and better adapted to the production of high-quality movies than the very first movie cameras. And whereas some of these additional technological developments may have also been exapted from other fields, for example, developments in electronics and chemistry making possible more accurate light meters and more sensitive film, the way in which all the component parts of a modern movie camera work together can be understood only as resulting from selection processes operating at the level of the entire camera, not just its component parts.

Similarly, certain preexisting structures and functions of the human brain and vocal tract may have been taken over (or exapted) for use in language. However, this cannot by itself account for the ways in which the brain, the vocal tract, and language fit together to create a total system that is quite remarkably adapted to serve the functions for which language is used. The fact remains that the process of cumulative blind variation and selection is the only process currently understood that can account for the nonprovidential appearance of the adaptive complexity that is seen in the design of language, and the design of the human brain and vocal tract for language. As Pinker and Bloom point out in their important discussion of the role of natural selection in the evolution of language, "language shows signs of complex design for the communication of propositional structures, and the only explanation for the origin of organs with complex design is the process of natural selection."[21]

The Evolution of Language

But what about the sounds, structures, and rules that make up language? How did they originate and evolve over time, leading to the languages spoken throughout the world today? As already noted, we unfortunately have no records of how language was used by our prehistoric ancestors. Nonetheless, our current knowledge of evolution provides at least a general scenario of how it evolved. As Pinker and Bloom observed, for language to have evolved by natural selection:

There must have been genetic variation among individuals in their grammatical competence. There must have been a series of steps leading from no language at all to language as we now find it, each step small enough to have been produced by a random mutation or recombination, and each intermediate grammar useful to its possessor. Every detail of grammatical competence that we wish to ascribe to selection must have conferred a reproductive advantage on its speakers, and this advantage must be large enough to have become fixed in the ancestral population. And there must be enough evolutionary time and genomic space separating our species from nonlinguistic primate ancestors.[22]

But since there are so many conceivable ways in which language could have conferred a "reproductive advantage on its speakers" and so few conclusive data on this subject, we can only speculate on which ones actually were important. We already mentioned the use of language to coordinate human activity, and it is not difficult to imagine how the ability to plan and coordinate hunting, agricultural, and warfare activities would have conferred survival advantages to individuals and groups with language skills. Also, as mentioned earlier, language makes it possible for individuals to share knowledge, thereby avoiding the mistakes and errors that others have already made.

We cannot go back in time to see how language was used by early humans, but we can learn from the behavior of communities of hunter-gatherers who still live today in much the same way as all our ancestors did as recently as 12,000 years ago when the total human population of the earth was only about 10 million.[23] Among such groups are the !Kung of the Kalahari Desert in Namibia and Botswana who use language to discuss

everything from the location of food sources to the behavior of predators to the movements of migratory game. Not only stories, but great stores of knowledge are exchanged around the fire among the !Kung and the dramatizations--perhaps best of all--bear knowledge critical to survival. A way of life that is difficult enough would, without such knowledge, become simply impossible.[24]

Although sharing knowledge of the location of food would certainly seem to be a function of language providing important survival advantages, survival in itself cannot ensure that an individual's genes and the language abilities that go with them will be inherited. Inheritance requires reproduction, and human reproduction, like all sexual reproduction, requires a partner of the opposite sex. It should not be surprising, therefore, to find that language plays an important role in sexual selection. "Just as female birds seem to have favored elaborate songs by males (not to mention long and shiny feathers) when choosing a mate, so prehuman females might have promoted a fancier form of language"[25] by preferring men with more impressive language skills.

Of course, we do not know, and will probably never know, the actual events and selection pressures that gradually transformed the hoots, grunts, and cries of our ancestors into our current remarkable vocal communication system. But two things are known: first, language is a highly complex and adaptive tool without which human life as we know it would be impossible; and second, at some time in the past it did not exist. And regardless of arguments that natural selection is not quite up to the task of explaining the emergence and refinement of language, the fact remains that Darwinian evolution is the only currently available, nonmiraculous explanation for the appearance of our most remarkable and useful ability, an ability that some believe may be responsible for our complete experience of human consciousness.[26]

The Child's Acquisition of Language

Fortunately, the child does not have to be concerned about the evolution of language, the brain, and the vocal organs that make language possible. These already exist in the community and in the biological structures found above her shoulders. The child's task is therefore "simply" to apply her biological endowment, provided by natural selection, to learn the language she hears spoken by her parents and community, whether it be Estonian, Eskimo, or English. We should not expect this to be too much of an ordeal, since after countless centuries of language use, natural selection should have provided a good fit between the child's abilities and the requirements of the task.

Language as Learned

The acquisition of language may not initially appear to be very different from the other things that children learn. It may seem unremarkable that an American child who hears countless hours of language spoken to and around her will eventually begin to produce the same sounds, words, and grammatical structures. Parents also provide considerable encouragement for their children to speak, as is evident in the smiles and hugs that typically follow the first utterance of "mama" or "dada." Children learn to do many things--put on their clothes, drink from cups, open and close doors, and even operate the television set and VCR--apparently from observing and imitating the actions they see others performing and from being reinforced by the satisfaction of the consequences of their actions. Why should language acquisition be any different?

This is essentially what was proposed by B. F. Skinner, introduced in our discussion of learning in chapter 7. It will be recalled that Skinner's theory of learning attempted to explain the acquisition of any new behavior as a process of operant conditioning by which new behaviors (such as a rat pushing a bar) would be learned to the extent that they were reinforced in some way by the environment (such as receiving food). He extended his behaviorist view of learning to human language.

In all verbal behavior under stimulus control there are three important events to be taken into account: a stimulus, a response, and a reinforcement. These are contingent upon each other, as we have seen, in the following way: the stimulus, acting prior to the emission of the response, sets the occasion upon which the response is likely to be reinforced. Under this contingency, through a process of operant discrimination, the stimulus becomes the occasion upon which the response is likely to be emitted.[27]

But he recognized at the outset that the social use of language is in one important respect quite unlike nonverbal behaviors operating on an inanimate environment.

When a man walks toward an object, he usually finds himself closer to it; if he reaches for it, physical contact is likely to follow; and if he grasps and lifts it, or pushes or pulls it, the object frequently changes position in appropriate directions. All this follows from simple geometrical and mechanical principles. . . . However, when we use language to act upon the world, as when we ask another for some water, the glass of water reaches the speaker only as the result of a complex series of events including the behavior of the listener. . . . Indeed, it is characteristic of such [verbal] behavior that it is impotent against the physical world.[28]

Consequently, Skinner maintained that although the links among stimulus, response, and reinforcement may be less obvious and more indirect for verbal behavior, they nonetheless exist and can be used to explain language learning and use. For example, he noted that in a given language community certain verbal behaviors such as "Wait!" and "Sh-h!" are typically followed by certain consequences, such as someone waiting or being quiet. Such a result depends, of course, on the cooperation and behavior of the other person. But if the consequence is achieved, that particular verbal response will be strengthened and be more likely to occur in a similar instance in the future. Thus, in Skinner's account, language learning, like all other learning, is completely dependent on contingencies of reinforcement.

In essence, Skinner's analysis of verbal behavior is an attempt to show how language is shaped by the environment in the same way that a rat's lever pushing or pigeon's key pecking can be controlled by providing and withholding food. By giving reinforcement for the sounds, words, and sentences the child produces that approximate the adult form of the language, and by withholding such reinforcement when an utterance is in some way deviant, the child's verbal behavior is gradually shaped over time to approximate the language of the community. Skinner argued that such contingencies of reinforcement are not only responsible for the child's learning language, but are the determining factors for all behavior, including adults' language behavior.

In insisting that reinforcement is the key to understanding language behavior, Skinner had to stretch the concept of reinforcement to cover situations that are quite unlike those found in studies of animal learning. For instance, he claimed that many verbal behaviors are "automatically self-reinforcing," as when "the child is reinforced automatically when he duplicates the sounds of airplanes, streetcars, automobiles, vacuum cleaners, birds, dogs, cats, and so on."[29] And "the young child alone in the nursery may automatically reinforce his own exploratory verbal behavior when he produces sounds which he has heard in the speech of others."[30] He stretched the concept of reinforcement to situations where the person producing language is not even present when the reinforcement takes place, as when a public speaker or writer is reinforced by "the fact that effects of verbal behavior may be multiplied by exposing many ears to the same sound waves or many eyes to the same page."[31]

Chomsky, in his influential and widely cited review[32] of Skinner's book, underscored these and many other problems with an operant conditioning analysis of human behavior, including language. Concerning the role of reinforcement, after having first cited the above and other examples, Chomsky concluded:

From this sample, it can be seen that the notion of reinforcement has totally lost whatever objective meaning it may ever have had. Running through these examples, we see that a person can be reinforced though he emits no response at all, and that the reinforcing stimulus need not impinge on the reinforced person or need not even exist (it is sufficient that it be imagined or hoped for). When we read that a person plays what music he likes, says what he likes, thinks what he likes, reads what books he likes, etc. BECAUSE he finds it reinforcing to do so, or that we write books or inform others of facts BECAUSE we are reinforced by what we hope will be the ultimate behavior of reader or listener, we can only conclude that the term reinforcement has a purely ritual function. The phrase "X is reinforced by Y" . . . is being used as a cover term for "X wants Y," "X likes Y," "X wishes that Y were the case," etc. Invoking the term reinforcement has no explanatory force, and any idea that this paraphrase introduces any new clarity or objectivity into the description of wishing, liking, etc., is a serious delusion.[33]

It is interesting to note that Chomsky's analysis is not inconsistent with perceptual control theory discussed in chapter 8, as he implied that people behave to satisfy their internal "wishes," "likes," and "wants," and not because certain behaviors were reinforced in the past.

Language as Innately Provided

In his subsequent writings on language acquisition, Chomsky has attacked Skinner's and other learning theories by focusing on the syntactic structure of language and the fact that all normal children show impressive knowledge of this structure despite considerable variation in their exposure to language. Chomsky's insights and his development of what is called generative grammar revolutionized our understanding of language. The particular generative grammars being developed to explain aspects of various languages are probably beyond the grasp of anyone who has not formally studied modern linguistics. The general notion of a generative grammar is fortunately more accessible.

Let us first consider what syntax is and why it is necessary for language. Due to the nature of the human vocal tract, we do not normally produce (or perceive) more than one speech sound at a time. This makes oral language a serial medium, meaning that sounds are strung together one after another like beads on a string. As already noted, the order in which sounds are uttered to form words (compare pot with top) and the order in which words are uttered to form sentences (compare The dog ate the pig with The pig ate the dog) are related to the meaning of a sentence. Languages differ in the degree to which word order is crucial for understanding, and English is particularly choosy, with most orders being meaningless, or nearly so. As linguist Derek Bickerton observed:

Try to rearrange any ordinary sentence consisting of ten words. There are, in principle, exactly 3,628,800 ways in which you could do this, but for the first sentence of this paragraph only one of them gives a correct and meaningful result. That means 3,628,799 of them are ungrammatical. How did we learn this? Certainly, no parent or teacher ever told us. The only way in which we can know it is by possessing, as it were, some recipe for how to construct sentences, a recipe so complex and exhaustive that it automatically rules out all 3,628,799 wrong ways of putting together a ten word sentence and allows only the right one. But since such a recipe must apply to all sentences, not just the example given, that recipe will, for every language, rule out more ungrammatical sentences than there are atoms in the cosmos--and there are at least five thousand different languages![34]

Although many languages are not as strict as English concerning word order, all of them require at least certain orders of basic sounds (called phonemes) to make up words, even if they are more flexible in the permissible orderings of words to form sentences.[35]
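Bickerton's arithmetic is easy to verify. The short Python sketch below computes the number of orderings of a ten-word sentence, and also shows that repeated words (such as the two occurrences of the in The dog ate the pig) reduce the number of distinct orderings; the five-word example sentence is taken from the text above.

```python
from math import factorial
from itertools import permutations

# Bickerton's count: a ten-word sentence has 10! possible orderings.
print(factorial(10))  # 3628800

# A shorter example small enough to enumerate directly. The repeated
# word "the" makes some orderings indistinguishable, so the set of
# distinct orderings is 5!/2! = 60 rather than 5! = 120.
words = ["the", "dog", "ate", "the", "pig"]
orderings = set(permutations(words))
print(len(orderings))  # 60
```

Of these sixty distinct orderings, of course, only a couple are grammatical English sentences, which is Bickerton's point.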

Syntax refers to the principles that govern the permissible orderings of words in a language and how these orderings are related to meaning. According to Chomsky, we produce syntactic--properly ordered--sentences not by memorizing a list of words and sentences, and not even by learning the general structural patterns that make up sentences and then using these as frames to create new sentences. Instead, we know quite abstract rules of the language that we use to generate sentences, most of which are novel in that they have neither been heard nor spoken before by the speaker. Let us look at a few simple examples of the generative rules of syntax proposed by Chomsky for English.

(1) S -> NP + VP

(2) NP -> Det + N

(3) VP -> V + NP

Rule (1) states that a sentence (S) may be composed of a noun phrase (NP) plus a verb phrase (VP). Noun phrase is defined in (2) as a determiner (Det) such as a, the, this, or that followed by a noun. And (3) defines a possible verb phrase (VP) as a verb (V) followed by another noun phrase. One learns these rules, together with some others including recursive rules that permit the use of a sentence as a noun phrase as in I saw John throw the ball. Once one also learns which words fit into which classes, one can produce an unlimited number of sentences, such as The earthquake destroyed the city, or That dog has big ears. In addition, transformational rules transform a sentence into related sentences, so that The earthquake destroyed the city can be transformed into the question Did the earthquake destroy the city? or into the passive voice sentence The city was destroyed by the earthquake.
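Rules (1) through (3) can even be run as a small program. The Python sketch below treats them as rewrite rules and recursively expands S until only words remain; the tiny lexicon and the random choice among expansions are illustrative assumptions of mine, not part of Chomsky's theory.

```python
import random

# Phrase-structure rules (1)-(3) plus a small illustrative lexicon.
# Each symbol maps to a list of possible expansions.
rules = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["that"]],
    "N":   [["earthquake"], ["city"], ["dog"]],
    "V":   [["destroyed"], ["saw"]],
}

def expand(symbol):
    """Recursively rewrite a symbol until only words (terminals) remain."""
    if symbol not in rules:  # a terminal word, not a grammatical category
        return [symbol]
    expansion = random.choice(rules[symbol])
    words = []
    for sym in expansion:
        words.extend(expand(sym))
    return words

print(" ".join(expand("S")))  # e.g. "the dog destroyed that city"
```

Every sentence this sketch produces has the shape Det N V Det N, and a recursive rule permitting a sentence inside a noun phrase would let it generate sentences of unbounded length, which is the sense in which such a grammar is "generative."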

The magnitude of the child's achievement in learning language can be appreciated by considering the complexity of these rules. Take, for example, a simple question such as Did you see my toy? To produce such a question the child must know that such a yes-no question is formed from the corresponding declarative (nonquestion) sentence by moving the first auxiliary verb before the subject, or by adding the auxiliary verb do to this position if no auxiliary is already present. This is a rather complex rule, and yet it is tacitly known by every English-speaking child who can ask a yes-no question! It is furthermore clear that such rules are not explicitly taught by parents to their children, since few parents could even state them, and it is highly unlikely that a typical three- or four-year-old could comprehend them even if the parents did.
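The auxiliary-fronting rule just described can be sketched as a toy procedure. In the Python sketch below, the list of auxiliaries, the assumption of a one-word subject, and the neglect of tense and agreement (real do-support would yield did rather than do for a past-tense sentence) are all simplifications of my own, meant only to make the rule concrete.

```python
# A small, illustrative set of English auxiliaries (not exhaustive).
AUXILIARIES = {"is", "are", "was", "were", "can", "will", "must",
               "do", "does", "did", "has", "have"}

def yes_no_question(words):
    """Toy aux-fronting: assumes a one-word subject and lowercase input.

    If the word after the subject is an auxiliary, move it to the front;
    otherwise supply "do" (ignoring tense and agreement).
    """
    subject, rest = words[0], words[1:]
    if rest and rest[0] in AUXILIARIES:
        # "daddy is in the kitchen" -> "is daddy in the kitchen"
        return [rest[0], subject] + rest[1:]
    # Do-support: "you see my toy" -> "do you see my toy"
    return ["do", subject] + rest

print(yes_no_question(["daddy", "is", "in", "the", "kitchen"]))
# ['is', 'daddy', 'in', 'the', 'kitchen']
print(yes_no_question(["you", "see", "my", "toy"]))
# ['do', 'you', 'see', 'my', 'toy']
```

Even this crude version requires identifying the subject and the auxiliary, grammatical categories that the child must somehow command without ever being told they exist.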

To make the acquisition of syntax even more remarkable, Chomsky says that the examples of language that children hear are inadequate for them to figure out the underlying generative rules on their own. Children hear only a relatively small subset of sentences, and due to lapses of attention or memory, false starts, slips of the tongue, interruptions, and other disturbances, many of the sentences they do hear are not well-formed. When one also considers noise from nearby machines, television sets, airplanes, and other sources, together with the frequent ear infections that many young children experience, the occasions on which a child hears clearly articulated, grammatical sentences become even less frequent. Chomsky has referred to these characteristics as the "poverty of the stimulus," implying that language as heard by a child is not sufficiently clear, accurate, and structured for the child to be able to deduce and learn its underlying generative rules.

It is also clear that children go well beyond what they hear in coming up with grammatical rules. For example, many if not all children living in an English-speaking environment will use words such as breaked, drawed, holded, and cutted despite never having heard adults use such words. But although they make certain types of errors, there are many other possible errors which they never seem to make. For example, no child has been heard to say "Kitchen the in is Daddy?" to turn "Daddy is in the kitchen" into a question.

Chomsky, together with many other linguists and child language researchers influenced by his theories, has thus concluded that children do not and could not acquire language by operant conditioning as proposed by Skinner, or indeed by any method of learning as learning is generally understood in psychology today. The fact that all normal children, regardless of intelligence level, are quite proficient speakers of the language to which they are exposed by the age of four years suggests to Chomsky that much of that knowledge is part of the human biological endowment. And although different languages certainly vary in terms of their sounds, words, and grammatical constructions, these differences can be understood as variations on a theme, the theme being characteristics that are shared by all natural languages and that linguists call "universal grammar." For Chomsky, knowledge of universal grammar is as much a part of Baby Sue's biological inheritance as is the nose on her face.

In effect, then, Chomsky believes that since language cannot be acquired by environmental instruction (due primarily to the poverty of the stimulus), to a very large extent it must be innate. To be sure, he and like-minded linguists recognize that children must be exposed to a language for it to be acquired, but rather minimal and haphazard exposure is all that is required to trigger its acquisition. In this respect spoken language is very different from other abilities such as mathematics and reading skills, which normally require special experiences for their development, such as formal schooling and prolonged practice. Even then many children and adults have difficulty acquiring these academic skills.

Chomsky is not alone in his innatist view of human cognitive abilities related to language. While he focused on syntactic knowledge, psycholinguist and philosopher Jerry Fodor has maintained for many years that all human conceptual knowledge is innate. Concepts such as TRIANGLE, DOG, and FREEDOM, and therefore the meaning we attach to these words, must be innate since it is impossible for someone to learn a concept that was not in some form already known before. It will be noted that this argument bears an uncanny resemblance to the one made by Plato in the dialogue with Meno discussed in chapter 6. For Fodor, what appears to be learning is actually "fixation of belief,"[36] using experience to select among a host of innate ideas. This may initially seem consistent with a selectionist (and therefore evolutionary and constructive) theory of learning and cognitive development. But Fodor places some severe limitations on what he believes such selection can achieve, as seen in his statement cited in chapter 9:

There literally isn't such a thing as the notion of learning a conceptual system richer than the one that one already has; we simply have no idea of what it would be like to get from a conceptually impoverished to a conceptually richer system by anything like a process of learning.[37]

But if evolution itself can be considered to be a form of learning in which organisms over phylogenetic time acquire knowledge about their environment,[38] it turns out that, despite Fodor, we do have a theory of learning, namely, natural selection, that explains how complex, adapted systems such as organisms and components of organisms can emerge from simpler or more "impoverished" ones.

Indeed, the argument that knowledge in the form of new and richer concepts cannot be constructed, but must rather already exist and be innately provided, creates a serious problem for Fodor and others who use this can't-get-there-from-here logic. Obviously, at some time (to be safe, let's say one second after the Big Bang), concepts such as RED, MOTHER, CONSERVATION OF ENERGY, and INTERROGATIVE SENTENCE did not exist, yet today they do exist in the minds of humans. How did they originate? Certainly, biological evolution must have had something to do with it since these concepts are clearly complex and functional characteristics of our species. But if Fodor is right that there is no way "to get from a conceptually impoverished to a conceptually richer system," he must also believe biological evolution to be impossible. Once it is recognized that the evolutionary process of cumulative blind variation and selection has in fact resulted in the emergence of more complex systems from simpler ones, and that an evolutionary process involving the cumulative variation and selection of ideas, thoughts, and concepts could also be an essential and universal part of human cognitive development, Fodor's argument for the impossibility of learning appears seriously flawed.

Mark Bickhard, a cognitive scientist at Lehigh University, made just such a critique, using in the following quotation the word representation to refer to Fodor's conceptualization of knowledge:

If representations cannot emerge, however, then they cannot come into being at all. A narrow focus on this point yields Fodor's innatism: neither learning nor development, as currently understood, can construct emergent representation; therefore the basic representational atoms must be already present genetically. Unfortunately, this conclusion does not follow. If representation cannot emerge, then it cannot emerge in evolution any more than it can in development. The problem is logical in nature, and is not specific to the individual. Conversely, if some way were posited in which evolution could yield emergent representation, then there is no a priori reason why that emergence would not be just as available in the development of the individual. Fodor's innatism, then, simply misses the basic issue. If representation cannot emerge, then it is impossible for it to exist, and evolution is in no better position in this respect than is individual development; on the other hand, if representation can emerge, then there is something wrong with the models of learning and development that cannot account for that emergence. When those models are corrected, that emergence should be as available to the individual as to evolution. In either case, Fodor's strong innatism does not follow.[39]

But isn't it true that things can happen during evolution that cannot happen during the lifetime of an individual human? Human hearts and arms evolved over a very long period of time through among-organism selection. Yet no human can "learn" to grow another heart or arm during a lifetime because such biological structures are determined by genes that do not change during one's lifetime.[40] In contrast, the cognitive abilities underlying language and conceptual knowledge are dependent on the structure of the brain, and the brain is remarkably adaptive ontogenetically, whereas the genome is not.[41] As explained in chapter 5, the brain retains the ability to make adaptive changes through a variation and selection of synapses. Thus it is at least conceivable that new linguistic and conceptual knowledge could emerge as a result of such within-organism selection.

None of this, of course, proves that Chomsky and Fodor are wrong in their assertions that our knowledge of language and concepts is innately determined. But it does argue against their reasoning, and the reasoning of other cognitive and linguistic innatists, that this knowledge must be innate. If the human genome could have acquired such knowledge by way of the among-organism selection of human evolution, then it must be considered at least a possibility that the brain could acquire similar knowledge by way of the within-organism selection of synapses. The implications of within-organism selection for such innatist views of human cognition will be considered again at the end of chapter 15.

Language Acquisition as Selection

Chomsky's and Fodor's views of language acquisition are undeniably very popular among linguists and cognitive scientists today. Nonetheless, noteworthy opposition to their perspective exists, much of it coming from psychologists who take a less linguistic and more functional perspective on language, its acquisition by children, and its use. A number of these individuals have adopted, either explicitly or implicitly, a selectionist view of language learning.

Let us leave syntax for a while and consider what is involved in learning the meanings of individual words. Since children appear to learn new words and their meanings so quickly, it might first appear that it is simply a matter of forming an association between each new word they hear and some object (for example, cat), quality (black), relation (on), action or state (eating) that the child can perceive and that is perhaps even pointed out by a helpful adult. But further reflection indicates that learning vocabulary cannot be quite that simple.

The difficulty inherent in determining what a word means was pointed out by W. V. Quine, arguably America's most influential living philosopher.[42] He uses the example of a linguist visiting a strange country whose language he does not know. During his visit, the linguist hears the word gavagai used in the presence of a small, furry mammal with long ears and initially assumes that gavagai is equivalent to the English rabbit. But on further reflection he realizes that gavagai could actually refer to the concept ANIMAL or MAMMAL or HEAD or FUR or RABBIT-LIKE SHAPE or HOP, or perhaps even something quite unrelated to the rabbit such as the time of day. Gavagai could even be the proper name of a person who in some way resembles a rabbit, or an expletive to curse the appearance of yet another garden pest. Quine argues that no matter how much evidence our linguist collects, he simply has no way of ever being certain that two words from different languages have the same meaning.

The child faces essentially the same conundrum.[43] Even if a helpful mother points to an animal and says "cat," how is the child to know that the word cat refers to the animal itself (actually, a species of animal) and not to its color, its fur, its relationship to the carpet, the cat-plus-the-carpet-it-is-sitting-on, the sound it is making, or its current behavior of scratching itself? When one realizes the infinite possibilities concerning the meaning of any word, it becomes clear that it is not possible for an adult to provide information that would reliably transmit the meaning of a word to a child (or to any other individual, for that matter). The child can only suppose that the word cat refers to some concept already in her mind, since she surely has no direct access to those concepts and meanings in the adult's mind, but she can never be absolutely sure which it is. That children do make guesses and jump to unjustified, tentative conclusions is clear when a child refers to a small black dog as a "cat" or, perhaps more humorously, when she refers to the visiting parish priest as "Daddy."

It is informative to compare the child in this situation with that of the scientist testing theories, such as that water must be heated to 100°C for it to boil. For the scientist, no amount of evidence can be taken as conclusive proof of the theory. One can boil water using heat from burning gas, electrical resistance, or solar energy and find that the source of heat makes no difference in the boiling point. One can boil water in vessels made of steel, iron, aluminum, stone, or plastic, or do the experiment at different times of day and during different phases of the moon and obtain what appears to be additional evidence for the theory. These findings may appear to lend support to the theory, but they cannot prove it. Indeed, a water-boiling experiment conducted using an accurate thermometer at an altitude of 2000 meters above sea level will show that the theory is in fact false, since the boiling point of water depends on air pressure, which is reduced at higher altitudes.

The child's situation with respect to words and their meanings appears analogous. A young child growing up with little or no contact with nonsibling children may over a period of many years be presented with absolutely overwhelming evidence that mommy refers to the one particular woman who is almost always close by and who feeds, clothes, bathes, and cuddles him. Imagine little Johnny's surprise when during his first visit to kindergarten at the age of five he hears another child using the word mommy to refer to a woman whom he has never seen before! Johnny then has no option other than to reject his initially "well-supported" hypothesis about the meaning of mommy. Eventually he will replace it with the theory that mommy refers to not one particular person, or even a class of persons, but rather to a special kind of relationship between one human being and another. And of course, even this meaning is subject to revision as young Johnny hears or reads about a cow, dog, or cat who is the "mommy" of a calf, puppy, or kitten.

But although learning word meanings must necessarily proceed through a process of theory construction, rejection, and revision, it is also clear that the meaning theories that children entertain are often either accurate or quite close to the adult meaning, since it doesn't seem to take many cycles of trial-and-error elimination to arrive at the accepted meaning. If this were not the case, children would be hard pressed to learn new vocabulary as quickly and easily as they do.

Donald T. Campbell, whose variation-and-selection perspective on learning and thought was described in chapter 9, has theorized that children are aided in their guesses as to the meanings of new words by an innate expectation that words refer to the more easily perceivable, stable aspects (or entities) of their environment, a characteristic he refers to as "entitativity."[44] Thus, since a cup is perceived as a single entity that can be separated from and used independent of the rest of its environment, a child will expect that there is a word that refers to a cup, and not one that refers to the combination of both cup and saucer or to just the handle and bottom of the cup. Similarly, the child will expect that cat is more likely to refer to the animal she sees moving across the rug rather than to a combination of the cat and rug, or to just the cat's head and tail. As Steven Pinker concluded (referring to Quine's gavagai example):

. . . humans [are] innately constrained to make only certain kinds of guesses--probably correct kinds--about how the world and its occupants work. Let's say the word-learning baby has a brain that carves the world into discrete, bounded, cohesive objects and into the actions they undergo, and that the baby forms mental categories that lump together objects that are of the same kind. Let's also say that babies are designed to expect a language to contain words for kinds of objects and words for kinds of actions--nouns and verbs, more or less. Then the undetached rabbit parts, rabbit-trod ground, intermittent rabbiting, and other accurate descriptions of the scene will, fortunately, not occur to them as possible meanings of gavagai.[45]

Such strategies, and likely many others,[46] are very useful in constraining or biasing the child's theories of word meaning, but the fact remains that neither the child nor the adult can ever be absolutely confident that his meaning for a word is identical to that of any other person. Even consulting a dictionary provides no absolute assurance, since a dictionary can only define words through the meanings of other words whose meanings are also unverifiable. But the more interaction a person has with other speakers of the language, the more confident (though never certain) he can be that meanings are shared, since such interaction provides for increased opportunities for the rejection, revision, and resulting fine-tuning of meanings.[47] And as the child's vocabulary increases, already learned words can be used effectively to narrow down the meanings of new words.

It turns out that acquiring the meaning of words also has important implications for acquiring syntax. MIT linguist and cognitive scientist Steven Pinker, whose important article with Paul Bloom on language evolution was mentioned earlier in this chapter, has pointed out that many puzzling exceptions to some basic syntactic patterns in English would appear to make those patterns very difficult, if not impossible, for children to learn. Consider the following sentences (an asterisk precedes words and sentences that are ungrammatical in English):

(1) Beth sold the cookies to Eric.

(2) Beth sold Eric the cookies.

(3) Beth pulled the cookies to Eric.

(4) *Beth pulled Eric the cookies.

From the first two sentences, it is clear that a speaker of English can use one of two different grammatical structures for sentences containing both a direct object (the cookies) and indirect object (Eric). We can put the direct object after the verb followed by to and the indirect object. Or we can drop the to and switch the positions of the two objects. But notice that although the second structure seems to work fine for the verb sold, it does not sound right to most speakers of English for the verb pulled as used in (4), despite the fact that both verbs behave similarly in (1) and (3).

Let's consider a few more sentences to show that this is not an isolated example.

(5) Christopher kicked Erin.

(6) Erin was kicked by Christopher.

(7) Christopher resembled Erin.

(8) *Erin was resembled by Christopher.

Here we have examples of the active and passive voices. In the active voice construction of sentence (5), the doer of the action is before the verb and the recipient after the verb. But in the passive voice construction of (6), the recipient is before the verb (to which was has been added) and the doer is now after the verb (after which by has been added). Note that countless verbs could be substituted into sentences (5) and (6) and yield grammatical sentences, such as loved, heard, kissed, believed, and served. But for some reason the passive construction using resembled in (8) is clearly not an acceptable English sentence.

Now if children learned language by simply listening to and memorizing sentences, and if they never said a sentence that they hadn't already heard, these inconsistencies would pose no problem. But countless studies and observations reveal that children are not conservative in their language learning. We already noted their use of past tense forms they could not have heard from an adult, such as *cutted, *drawed, and *breaked. Some additional examples of children's creativity in attempting to figure out and apply rules of English grammar are:

(9) *How was it shoelaced?

(10) *Jay said me no.

(11) *I'm just gonna fall this on her.

(12) *I'm gonna pour it with water.[48]

So instead of being linguistically conservative, children are quite creative speakers in venturing beyond the words and sentences they hear others produce. Now if it is true (as Pinker believes) that children receive no useful information from adults concerning which sentences they produce are ungrammatical,[49] then the fact that we do not produce such ungrammatical sentences as adults is one of the most interesting dilemmas of human learning. If children obtain no information concerning the grammaticality of their sentences, how are they able to eliminate the ungrammatical ones? This is often referred to as the problem of the "learnability" of language.

Pinker attempted to provide a solution to this problem by showing that the syntactic exceptions of the types shown above are not arbitrary but depend on often quite subtle differences in the meanings of the verbs and the meanings of syntactic constructions. For example, the so-called dative indirect object can be used when the verb of the sentence indicates a change of possession, as in (2) Beth sold Eric the cookies, but not when only motion is implied as in (4) *Beth pulled Eric the cookies. Also, if the verb implies acting upon an object, then a passive form is acceptable, as in Adam was hit by Anne and (6), but usually not otherwise as in *Money is lacked by Matilda and (8).
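Pinker's proposal can be made concrete with a toy lexicon in which each verb carries semantic features that license (or block) particular constructions. The feature names and values below are simplified assumptions of mine, far cruder than Pinker's actual semantic criteria, but they show how meaning could determine which syntactic frames a verb accepts.

```python
# Illustrative verb features; the values are assumptions simplified
# from Pinker's much subtler lexicosemantic distinctions.
VERBS = {
    "sold":      {"change_of_possession": True,  "acts_upon": True},
    "pulled":    {"change_of_possession": False, "acts_upon": True},
    "kicked":    {"change_of_possession": False, "acts_upon": True},
    "resembled": {"change_of_possession": False, "acts_upon": False},
}

def allows_double_object(verb):
    # "Beth sold Eric the cookies" but not *"Beth pulled Eric the cookies"
    return VERBS[verb]["change_of_possession"]

def allows_passive(verb):
    # "Erin was kicked by Christopher" but not *"Erin was resembled by ..."
    return VERBS[verb]["acts_upon"]

print(allows_double_object("sold"), allows_double_object("pulled"))  # True False
print(allows_passive("kicked"), allows_passive("resembled"))         # True False
```

On this view the child need never be told that sentence (4) or (8) is ungrammatical; getting the verbs' meanings right rules those sentences out automatically.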

By Pinker's account, then, learning the meanings of words is essential to producing syntactic sentences. How, then, does he suggest that these meanings are learned? To quote, with a bit of added commentary in brackets: "What we need to show is that the child is capable of entertaining as a hypothesis any possible verb meaning [that is, consider any of a large number of possible variations], and that he or she is capable of eliminating any incorrect hypotheses [and consequently selecting the better ones] as a result of observing how the verb is used across situations."[50] Pinker sounds even more Darwinian and selectionist when discussing the learning of morphemes (the meaningful entities that make up words) in stating that "as the child continues to work on that morpheme over a large set of sentences, all incorrect hypotheses will be discarded at some point or another, any correct hypothesis will be hypothesized sooner or later . . . and only the correct ones will survive in the limit."[51]
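The variation-and-selection flavor of this procedure can be sketched in a few lines: candidate meanings for a verb are hypotheses, and each observed situation of use eliminates the inconsistent ones. The candidate predicates and "situations" below are invented purely for illustration.

```python
# Selectionist sketch: each candidate meaning is a predicate that must
# hold in every situation where the verb is heard. Hypotheses that
# conflict with an observation are eliminated; the rest survive.
candidates = {
    "give-like (possession changes)": lambda s: s["possession_changed"],
    "pull-like (only motion)":        lambda s: s["moved"],
    "see-like (no motion at all)":    lambda s: not s["moved"],
}

# Two observed uses of the unknown verb.
observations = [
    {"possession_changed": True, "moved": True},
    {"possession_changed": True, "moved": False},
]

for situation in observations:
    # Keep only the hypotheses consistent with the situation just seen.
    candidates = {name: test for name, test in candidates.items()
                  if test(situation)}

print(sorted(candidates))  # ['give-like (possession changes)']
```

As in Pinker's quotation, incorrect hypotheses are discarded as evidence accumulates, and only the correct one "survives in the limit"; no adult ever needs to tell the child which guesses were wrong.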

For the "unlearning" (elimination) of overgeneralized verb forms such as *drawed and *hitted, Pinker (as well as several other researchers) invokes what is called the uniqueness principle by which the child expects that there cannot be two ways of expressing exactly the same meaning. So when the child hears an adult say Nicholas drew a nice picture and it is clear that the adult is referring to the past act of drawing, the child will understand that drew has the same meaning as *drawed, and he will eventually replace the latter with the former.[52]
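The uniqueness principle amounts to a blocking mechanism: an irregular form heard from adults preempts the form the child's regular rule would otherwise generate. A minimal sketch of this idea (the function names and the tiny lexicon are invented purely for illustration):

```python
# Toy sketch of the uniqueness principle ("blocking"):
# a memorized irregular past tense preempts the regular rule,
# so *drawed is eventually eliminated in favor of drew.

irregulars = {}  # irregular forms the child has heard adults use

def hear_past(verb, form):
    """Record an irregular past-tense form heard from an adult."""
    irregulars[verb] = form

def past_tense(verb):
    # An attested irregular form blocks the rule-generated one;
    # otherwise the child overgeneralizes the regular -ed rule.
    return irregulars.get(verb, verb + "ed")

print(past_tense("draw"))   # "drawed" -- overgeneralized, nothing heard yet
hear_past("draw", "drew")   # adult says "Nicholas drew a nice picture"
print(past_tense("draw"))   # "drew" -- the irregular form now blocks *drawed
print(past_tense("walk"))   # "walked" -- the regular rule still applies
```

The regular rule is never discarded; it is simply outcompeted, verb by verb, wherever a unique irregular form has been heard.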

Although Pinker would almost certainly not characterize his theory in this way, he nonetheless has proposed a procedure by which children are able to generate guesses about the structure of the language they hear and then eliminate the incorrect ones without the benefit of having adults indicate which utterances are incorrect, that is, without access to negative evidence. To pull this off, children must be sensitive to some very subtle semantic distinctions among verbs, which is in itself quite remarkable. It leads Pinker to speculate that such "lexicosemantic" concepts (which appear to pertain only to language learning and use) must be innate and part of a separate language component of the mind having little to do with other cognitive abilities. But his overall theory of language learning can nonetheless be understood as selectionist in that innate knowledge, arising from natural selection among humans during biological evolution, interacts with selectionist cognitive processes of hypothesis formation and elimination (selection within humans) to arrive at adult language competence.

Whereas Pinker's theory of language learning can be construed as implicitly selectionist, the one proposed by American psycholinguists Elizabeth Bates and Brian MacWhinney, called the competition model, is explicitly so. According to the competition model, the child's learning of word meanings and grammar has three necessary stages:

First, the child develops a function to express. We will call this functional acquisition. Then the child makes a first stab at a way of mapping the function into a form. We will call this jumping in. Then a period of competition ensues during which the range of the form is narrowed or widened.[53]

Let us take a brief look at each of these three stages. Functional acquisition has to do with the child's assignment of meaning to the objects, actions, and relationships around her. These meaning functions must be developed before she can understand language referring to them, and before she can use language herself to express them.

The child then has to associate the sounds of the language she hears with these meaning functions. Since at the early stages of language acquisition she can do no better than make a guess as to the meaning of a word, phrase, or sentence, this stage is referred to as jumping in. Such initial guesses may be quite wide of the mark, but as the child learns more and more word meanings, this knowledge can be used to help discover the meanings of new words in much the same way that the final words of a crossword puzzle are usually easier to identify than the first ones attempted. These jump-ins are, of course, nothing but preliminary guesses as to the meaning of a word (or grammatical form) and are the necessary source of variation for subsequent selection.

Selection is accomplished by a process of competition in which words and grammatical forms compete for meanings based on the assumption that two different forms must have different meanings (as in Pinker's uniqueness principle), and if no difference can be found, one of them is wrong. For example, the child may hear the words plate and saucer referring to what initially appear to be the same type of object. These two words will then compete for these two meanings until, after several presentations and perhaps some correction ("That's not a plate, it's a saucer"), plate wins out for round, thin objects on which food is placed, and saucer wins out for round, thin objects on which cups are placed. MacWhinney likens this to the competition of two species for the same environment resulting in each species establishing its niche in that part of the environment for which it is best adapted.[54] In other situations, one of the forms may be eliminated entirely (become extinct) as when *goed and *cutted are eventually replaced by went and cut.
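The niche partitioning that MacWhinney describes can be sketched as a simple strength-accumulation scheme: each form gains associative strength for the meanings it is heard with, and each meaning is eventually claimed by its strongest competitor. This is only an illustrative toy, not the competition model's actual mathematics; the words, meaning labels, and exposure counts are hypothetical.

```python
from collections import defaultdict

# Toy sketch of competition-model niche partitioning: forms gain
# strength for the meanings they co-occur with, and each meaning
# is ultimately claimed by its strongest competitor.

strength = defaultdict(float)

def hear(form, meaning, weight=1.0):
    """Strengthen the association between a form and a meaning."""
    strength[(form, meaning)] += weight

def winner(meaning, forms):
    """The form with the greatest strength for this meaning wins its niche."""
    return max(forms, key=lambda f: strength[(f, meaning)])

# Early exposure: both words heard with overlapping referents,
# so "plate" and "saucer" initially compete for the same territory.
for _ in range(2):
    hear("plate", "dish-for-food");  hear("plate", "dish-for-cup")
    hear("saucer", "dish-for-food"); hear("saucer", "dish-for-cup")

# Later exposure (including correction) differentiates the niches.
for _ in range(5):
    hear("plate", "dish-for-food")   # adults use "plate" for food dishes
    hear("saucer", "dish-for-cup")   # and "saucer" for cup dishes

print(winner("dish-for-food", ["plate", "saucer"]))  # plate
print(winner("dish-for-cup", ["plate", "saucer"]))   # saucer
```

Each form ends up "best adapted" to one region of the semantic environment, just as the species analogy suggests; a form whose strength is everywhere outcompeted (like *goed) simply goes extinct.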

Such competition is not limited to word meanings. Different languages use different ways of expressing grammatical relations, and the child must learn these syntactic rules as well. In English, the order of words in a sentence is the primary determinant in assigning the roles of subject (almost always placed before the verb) and object (usually after the verb), so that in the sentence The food ate the dog, the food would normally be understood as the entity doing the eating despite the fact that this makes little real-world sense (perhaps the dog fell into a vat of highly acidic hot sauce?). But the same sentence structure in certain other languages (such as Spanish in La comida comió el perro) would be immediately understood as the more sensible "the dog ate the food." In their extensive research on both children learning their first language and students and adults learning second languages, Bates and MacWhinney and their associates have shown that learners rely on certain cues to resolve the competitions among words, meanings, and syntactic forms.

It should be clear from even this brief description that the competition model is selectionist. Although Bates and MacWhinney do not use that word or its variants, the process of competition they propose is clearly one through which selection (and elimination) takes place. Children make guesses concerning the meanings and forms of the language they hear and eventually fine-tune these guesses by cumulative selection as words and forms compete for various meanings and functions. The theory is informed by both linguistics and psychology, and it draws support from a considerable body of research. And it does not assume that the child possesses an extraordinary store of detailed innate linguistic knowledge. The selectionism is clear when MacWhinney states that "the underlying idea in the Competition Model is that mental processing is competitive in the classical Darwinian sense."[55]

A Near-Common Denominator: Selection

We have now considered several theories of language learning that differ in a number of respects. One useful way of comparing and contrasting them is to consider the equation: innate knowledge × experience × learning = language knowledge.

This equation states that the child's ability to use language is the result of the interaction of the child's innate knowledge, experience, and learning. Innate knowledge is considered in the broad sense, including the brain and vocal tract structures shaped by biological evolution, in addition to any more general cognitive or specifically linguistic knowledge that could be considered to be a part of the child's biological endowment. Note that the interaction among innate knowledge, experience, and learning is considered to involve multiplication (not addition), since this recognizes that if any one of the three factors did not exist (were zero), the child's language knowledge would also not exist (would also be zero).
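The multiplicative character of the interaction is easy to make concrete: with multiplication, any factor equal to zero yields zero language knowledge, whereas an additive model would not have this property. The numeric values below are, of course, purely illustrative; none of these factors is actually measurable on a single scale.

```python
def language_knowledge(innate, experience, learning):
    # Multiplicative interaction: if any one factor is absent (zero),
    # no language knowledge results -- unlike an additive model, in
    # which the remaining two factors could still contribute.
    return innate * experience * learning

print(language_knowledge(1.0, 1.0, 1.0))  # all three factors present: 1.0
print(language_knowledge(1.0, 0.0, 1.0))  # no linguistic experience: 0.0
```

The theories surveyed above can then be seen as assigning different relative weights to the three factors while all requiring each to be nonzero.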

Using this formula, it can be seen that the various theories of child language acquisition discussed thus far differ in the importance they ascribe to the three factors in a more or less compensatory fashion. Skinner's and other behaviorists' accounts of language learning emphasize the role of experience and learning and downplay innate knowledge.

Chomsky and Fodor, in contrast, minimize the role of both experience (recall Chomsky's argument of the poverty of the stimulus) and learning (Fodor believing that any form of learning is impossible) while emphasizing the importance of innate knowledge. Indeed, the standard practice of most linguists today is to put as much as they possibly can in the innate knowledge factor, which they refer to as "universal grammar," while still allowing children to be able to learn different languages, that is, while providing the minimum necessary role for experience, since it is clear that children speak the same language they hear. It should also be noted that linguists' conception of language knowledge is much richer and more sophisticated than that conceived of by Skinner.

Pinker appears to place more emphasis on experience than do Chomsky and Fodor, but he nonetheless maintains one aspect of the poverty of the stimulus in his belief that negative evidence, which would make it easy for the child to reject mistaken hypotheses about the language being acquired, is not available to the child. Pinker also emphasizes the role of innate knowledge, since his theory depends on the child being able to classify verbs in quite subtle and sophisticated ways using categories and concepts he believes to be quite specific to the domain of language.

Bates and MacWhinney's competition model of language learning contrasts with Chomsky's and Pinker's views and brings us back closer to Skinner's in minimizing the role of innate knowledge and providing a larger role for learning. Bates and MacWhinney also place more importance on experience than either Chomsky or Pinker, believing that children have access to negative evidence about language as provided by the corrections and repetitions of their parents and possibly other adults.

But although these theorists differ considerably in the relative importance ascribed to these three elements, they appear to be in much greater agreement (than perhaps they would care to admit) on the role of selection in language learning within the factors they each consider most important. As we saw in chapter 7, Skinner has on many occasions drawn a parallel between operant conditioning and natural selection,[56] and the selectionist nature of the competition model has already been noted.

Pinker, although appearing somewhat reluctant to recognize the necessary variation component in language acquisition, suggests that he sees all learning as a selectionist process in noting that "despite all its complex guises, learning can always be analyzed as a set of 'hypotheses' the organism is capable of entertaining and of a 'confirmation function' by which the environmental input tells the organism which one to keep."[57] His contribution to our appreciation of the role of among-organism selection in the evolution of language was recognized in the first section of this chapter.[58]

The one notable exception is Chomsky, who not only sees no place for selection in language learning, but rejects a Darwinian account of the evolution of language itself. It is tempting to speculate that this latter stance is related to his innatist beliefs concerning language acquisition, since if he were to admit that language gradually evolved along with our species through natural selection among humans, he might have to confront the possibility that language knowledge could also emerge through within-human selection processes in the growing mind of the child.

But regardless of the role that innate linguistic knowledge may play, it can only go so far. It cannot provide the child with knowledge of the specific sounds used in her mother tongue, the meanings of the words she hears, or knowledge of how each grammatical form maps to a function. Additional knowledge must somehow be developed as she adapts her developing linguistic system to her linguistic environment and communicative needs. The form that this additional constructed knowledge can take may be strongly biased and constrained by already achieved innate knowledge about human language. However, the variation and selection of linguistic hypotheses cannot be completely eliminated from the adaptive process of language acquisition. Wherever it may be that knowledge obtained through biological evolution leaves off, we can expect a selectionist process to take over in the generation of varied hypotheses concerning the meanings and functions of words and grammatical structures coupled with the elimination and fine-tuning of those hypotheses found by the child to be inadequate in some way. And although few linguists or language acquisition researchers now describe and model child language learning as a Darwinian process, the success that selectionist models of learning have in other areas (to be described in chapters 13 and 14) seems certain eventually to provide new insights into one of the most remarkable feats of human learning.[59]

The Use of Language

Let us now finally turn to a brief consideration of how language functions, enabling us to communicate our thoughts and intentions to others. Much could be said about this topic considering the prodigious amounts of relevant research conducted by psychologists, linguists, psycholinguists, and educators. I will make no attempt to review all this research here, but will rather provide a concise argument, using a few examples, that language use also involves a Darwinian process of variation and selection.

Once two or more people have acquired the same meanings for words (semantics) and knowledge of the same rules for combining words to express meaning (syntax), it might seem that language could be used to transmit meaning from one to another. Surely, if we understand the words dog, cat, bit, and the, and know that in English the normal ordering of sentences is subject-verb-object, my declaration of "The dog bit the cat" should provide the obvious information to you. But in practice things are not so straightforward, the reason being that words themselves do not carry meaning. Rather, they can only elicit meanings that already exist in the brain of the listener. And the meaning that they elicit depends on the listener's relevant experiences and the context in which the words are used. For example, consider the following five sentences.

(1) Where did you put the newspaper?

(2) There was an interesting story in the newspaper yesterday.

(3) The newspaper is going on strike.

(4) Workers are demonstrating outside the newspaper.

(5) The newspaper is experiencing financial difficulties.[60]

It is readily apparent that the word newspaper has a quite different meaning in each of these sentences. In (1) a particular physical copy of the newspaper is intended. In (2) the word refers to all copies of a particular edition of the newspaper. It refers to employees in (3), and in (4) to the building where the publication is produced. Finally, in (5) newspaper means the institution that publishes the newspaper.

But it is not just the meaning of individual words that depends on context and the experiences and imagination of the listener, but larger stretches of words as well. To demonstrate this, consider the following passage:

The procedure is actually quite simple. First you arrange things into different groups depending on their makeup. Of course, one pile may be sufficient depending on how much there is to do. If you have to go somewhere else due to lack of facilities that is the next step, otherwise you are pretty well set. It is important not to overdo any particular endeavor. That is, it is better to do too few things at once than too many. In the short run this may not seem important, but complications from doing too many can easily arise. A mistake can be expensive. The manipulation of the appropriate mechanisms should be self-explanatory, and we need not dwell on it here. At first the whole procedure will seem complicated. Soon, however, it will become just another facet of life. It is difficult to foresee any end to the necessity for this task in the immediate future, but then one never can tell.[61]

I would venture to guess that you did not have difficulty understanding the meaning of any of the individual words in this passage. But I would also venture to guess that the passage as a whole probably did not make much sense to you when you first read it. But if I now inform you that the passage has something to do with washing clothes (and assuming that you have had some experience in washing clothes), reading it again will likely be quite a different experience as it will now elicit meanings it did not before. This is because I helped you to constrain your hypothesis about what the passage is about. But again, the meaning must actually be created by you and is not transmitted by the words, phrases, or sentences of the passage.

These examples are meant to demonstrate that language comprehension is not a matter of receiving meaning from a speaker or writer, but rather is an active process of constructing meaning as the listener or reader attempts to make the words, intentions, and context of the situation fit. As child language researcher Gordon Wells put it:

When I communicate with other people, whether it be to inform, request, or persuade, what I have in mind is an idea--an event, action, or outcome--that I intend they should understand. However, this idea arises from my mental model of the world, which is itself the product of my unique personal biography. Nobody else has exactly the same mental model of the world, since nobody else has had exactly the same experience. It follows, therefore, that nobody can have exactly the same ideas I have.

What all this leads to is a recognition that one never knows what other people mean by what they say or write. One can only make an informed guess, taking into account all the cues that are available: from the communication context, from one's own relevant experience, and from the actual linguistic signal. To put it differently, I cannot know what idea is in your mind as you speak or write. I can only know what ideas I would have had in mind if I had produced the same lexico-grammatical sequence as I believe you to have produced in the context that I think you think we currently share.[62]

So according to Wells (and he is certainly not alone in his interpretation), understanding language involves making informed guesses about the intent of the speaker or writer. Some guesses will be wrong and will be quickly eliminated. Others will be wrong but not so easily eliminated, resulting in misunderstanding, which is most likely when the individuals involved are from different cultures, age groups, sexes, or social classes.[63] Of course, we expect that most guesses will be quite close to the speaker's or writer's intended meaning. And that they usually are in normal conversation with our family (except very young children), friends, and work associates is what makes it appear as if using language does involve the transmission of meaning from one person to another. But this is an illusion, which is quickly revealed when we experience difficulties in communication, and when we recognize that language can at best elicit and help select meanings that already exist in the listener's or reader's head. This selectionist view of language use has important implications for understanding the process of education, to which we turn next.

[1]Darwin (1874, pp. 88-89).

[2]See Berk (1994) for an account of how the private, self-directed speech of the child gradually turns into the silent thoughts of the adult.

[3]Mentioning these advantages of spoken language is by no means intended to imply that sign language is not a powerful and expressive system of communication. No hearing communities have been known to use sign language instead of spoken language, however.

[4]Bickerton (1990, p. 106).

[5]Pinker & Bloom (1990, p. 711).

[6]Darwin (1859/1966, p. 191).

[7]It is of interest to note that the human infant also shares the high larynx of non-human mammals, making it possible simultaneously to drink through the mouth and breathe through the nose. This ability is soon lost and linguistic ability gained as the larynx drops, permitting the infant to make all the human speech sounds.

[8]Lieberman (1991, p. 56).

[9]Letters in square brackets are symbols for language sounds (phonemes) as used in the International Phonetic Alphabet.

[10]Lieberman (1991, p. 65).

[11] Lieberman (1984) proposes:

. . . that the extinction of Neanderthal hominids was due to the competition of modern human beings who were better adapted for speech and language. The synergetic effect of rapid data transmission through the medium of encoded speech and the cognitive power of the large hominid brain probably yielded the full human linguistic system. The rapid changes in human culture that occurred shortly after the replacement of the Neanderthals could be the result of a difference in the way in which humans thought. Though it is impossible to prove that human language and thought were the causative agents, the replacement of the Neanderthal population--adapted for strength and agility--by a population that was inferior save for enhanced speech abilities is consistent with this hypothesis. (p. 329)

[12]Kimura (1979; quoted in Lieberman, 1991, p. 79).

[13]See Lieberman (1991, p. 80).

[14]Goodall (1986; quoted in Lieberman, 1991, p. 52).

[15]Lieberman (1991, p. 109).

[16]Lieberman (1991, pp. 80, 81).

[17]Chomsky (1988a, p. 23).

[18]Gould & Lewontin (1979).

[19]Gould & Lewontin (1979, p. 285).

[20]The following is adapted from Lieberman (1991, pp. 8, 9).

[21]Pinker & Bloom (1990, p. 726). This paper provides an excellent discussion of issues involved in understanding language as the product of Darwinian natural selection. Of particular interest are the 31 critical commentaries that follow the article and Pinker and Bloom's subsequent responses.

[22]Pinker & Bloom (1990, p. 721).

[23]Reader (1988, p. 143).

[24]Konner (1982, p. 171).

[25]Calvin (1990, p. 207) discusses the ideas of Nicholas Humphrey and Richard Dawkins as aired in a BBC radio program.

[26]The argument that language is necessary for a complete human sense of self and consciousness was made by Helen Keller, who became deaf and blind shortly after birth and learned a language based on touch at the age of eight. As she recounts: "When I learned the meaning of `I' and `me' and found that I was something, I began to think. Then consciousness first existed for me" (Keller, 1904, p. 145).

[27]Skinner (1957, p. 81).

[28]Skinner (1957, pp. 1, 2).

[29]Skinner (1957, p. 164).

[30]Skinner (1957, p. 58).

[31]Skinner (1957, p. 206).

[32]Chomsky's (1959) review of Skinner's book is considered to be an important event in the start of what has become known as the cognitive revolution in psychology.

[33]Chomsky (1964, p. 558).

[34]Bickerton (1990, pp. 57, 58).

[35]On this characteristic English contrasts strongly with languages such as Latin, Navajo, and Walpiri, which have quite free word orders.

[36]Fodor (1980, p. 143).

[37]Fodor (1980, p. 149).

[38]Munz's (1993, p. 154) concept of organisms as "embodied theories" is useful here.

[39]Bickhard (1991, pp. 16-17). See also Bickhard & Terveen (1995, pp. 25ff.) for additional critique of cognitive innatism.

[40]Note that this is not the case for the genes that determine antibody production, as discussed in chapter 4.

[41]Again, except for that part of the genome underlying the production of antibodies.

[42]Quine (1960).

[43]See Macnamara (1972, p. 3).

[44]Campbell (1973).

[45]Pinker (1994, p. 154).

[46]See Gleitman (1994) and Markman (1994) for additional constraints that children bring to the task of learning word meanings.

[47]The tentative and fallible nature of our learning of word meanings was made clear to me several years ago when I realized that the word befriend made no sense in the context of a newspaper article I was reading. Checking the meaning of this word in my dictionary, I was quite surprised to learn that it means "to make a friend" whereas for more than 30 years of my life I had understood it as meaning "to lose a friend," somewhat analogous, I suppose, to the way that behead involves losing a head.

[48]These four sentences are taken from Pinker (1989, pp. 19ff.).

[49]A study conducted by Brown & Hanlon (1970) indicated that parents do not provide corrective information concerning the grammatical errors made by their children. This study has been widely cited, especially by those who make innatist arguments for language acquisition. However, more recent studies provide evidence that children do have access to information from adults concerning the grammaticality of their utterances, information that would make it even easier for them to reject incorrect hypotheses about the language being learned. These studies found that parents respond differentially to the ungrammatical utterances of their children by often repeating verbatim well-formed sentences in contrast to repeating with changes, or requesting clarification for, sentences containing errors (Bohannon & Stanowicz, 1990; Demetras, Post, & Snow, 1986; Hirsch-Pasek, Treiman, & Schneiderman, 1984; Penner, 1987; see also Gordon, 1990; and Bohannon, MacWhinney, & Snow, 1990 for contrasting views of this research and its importance). Although it has not yet been demonstrated convincingly that children actually use such information in learning language, the availability of such negative evidence has the potential of making language acquisition easier for the child without relying on innate linguistic knowledge. But then again, see Marcus (1993) for arguments against the role of adult feedback in language acquisition.

[50]Pinker (1989, p. 255; comments added in brackets).

[51]Pinker (1989, p. 255).

[52]Pinker (1989, p. 290).

[53]MacWhinney (1987, p. 287).

[54]MacWhinney (1987, p. 292).

[55]MacWhinney (1989, p. 65).

[56]See Skinner (1966, 1981).

[57]Pinker (1989, pp. 166, 167). I would like to see Pinker's use of quotation marks around "confirmation function" as a recognition that hypotheses can never be confirmed. And of course the environment does not actually tell anyone which hypotheses to keep, but rather provides information for the learner (or scientist) concerning which ones should be rejected or revised.

[58]Pinker & Bloom (1990).

[59]See Clark & Roberts (1993) for an interesting application of a cumulative selectionist learning process (called genetic algorithms and described later in this book in chapter 14) to language acquisition and language change based on Chomsky's principles-and-parameters approach to linguistic theory.

[60]Georgia Green brought these examples to my attention.

[61]Bransford & Johnson (1972, p. 722).

[62]Wells (1986, pp. 216, 217).

[63]See Tannen (1990) for an interesting account of language-based misunderstandings that arise between men and women.