Elizabeth Svoboda, writing for Nautilus:
Even the advancement of general cognitive skill, however, may be too narrow a picture of the evolution of language. University of Edinburgh computational linguist Simon Kirby argues that, while the human brain may be a necessary foundation for language, it is not sufficient to explain it. The beginnings of language, Kirby says, were profoundly shaped by the dynamic interplay of human culture itself.
Kirby took a unique approach to probing the origins of language: He taught human participants novel languages he had made up. He and his colleagues showed human subjects cards with different shapes and pictures on them, taught them the words for these pictures, and tested them. “Whatever they do, whether they get it right or wrong, we teach it to the next person,” Kirby says. “It’s rather like the game Telephone.”
Remarkably, as the language passed from one learner to the next, it began to acquire cogent structure. After 10 generations, the language had changed to make it easier for human speakers to process. Most notably, it began to show “compositionality,” meaning that parts of words corresponded to their meaning—shapes with four sides, for instance, might all have a prefix like “ikeke.” Thanks to these predictable properties, learners developed a mental framework they could easily fit new words into. “Participants not only learn everything we show them,” Kirby says, “but they can correctly guess words we didn’t even train them on.”
Wow, what a blast from the past!
In 2003 I took Patrick Winston’s graduate class on artificial intelligence (“The Human Intelligence Enterprise”), and my final project explored Kirby’s fascinating (and somewhat counterintuitive) 2001 paper, “Spontaneous Evolution of Linguistic Structure.”
In that paper, simple chunking rules in individual learners are enough for grammar to emerge from repeated conversations among a small group of simulated learning agents, after about 10 generations. (Think random grunts to pidgin to creole.)
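Kirby’s actual learners induce context-free grammar rules through a chunking procedure, which is more machinery than fits in a blog post, but the iterated-learning loop itself is tiny. Here is a toy Python sketch of that loop; the midpoint-split “learner,” the syllable inventory, and the (shape, color) meaning space are my own illustrative simplifications, not anything from the paper:

```python
import random

random.seed(0)  # reproducible toy run

SHAPES = ["square", "circle", "triangle", "star"]
COLORS = ["red", "blue", "green", "black"]
MEANINGS = [(shape, color) for shape in SHAPES for color in COLORS]
SYLLABLES = ["ka", "ti", "mu", "po", "ze", "la", "ni", "fo"]

def invent():
    """A holistic, unanalyzed random word (a first-generation grunt)."""
    return "".join(random.choices(SYLLABLES, k=3))

class Learner:
    """Toy learner: votes for a prefix per shape and a suffix per color.

    A crude stand-in for Kirby's chunking grammar-inducer: each observed
    word is split at its midpoint, the first half credited to the shape
    and the second half to the color; majority vote picks each morpheme.
    """

    def __init__(self, observations):
        prefix_votes, suffix_votes = {}, {}
        for (shape, color), word in observations:
            half = len(word) // 2
            prefix_votes.setdefault(shape, []).append(word[:half])
            suffix_votes.setdefault(color, []).append(word[half:])
        self.prefixes = {s: max(set(v), key=v.count) for s, v in prefix_votes.items()}
        self.suffixes = {c: max(set(v), key=v.count) for c, v in suffix_votes.items()}

    def speak(self, meaning):
        shape, color = meaning
        # Compose from learned chunks; invent a fresh chunk for unseen parts.
        return (self.prefixes.get(shape) or invent()) + (self.suffixes.get(color) or invent())

def iterate(generations=10, bottleneck=8):
    # Generation 0: a purely holistic language with no internal structure.
    language = {m: invent() for m in MEANINGS}
    for _ in range(generations):
        # Transmission bottleneck: the next learner sees only SOME of the
        # language, so it must generalize to express every meaning.
        sample = random.sample(MEANINGS, bottleneck)
        learner = Learner([(m, language[m]) for m in sample])
        language = {m: learner.speak(m) for m in MEANINGS}
    return language

if __name__ == "__main__":
    for meaning, word in sorted(iterate().items()):
        print(meaning, "->", word)
```

The bottleneck is the crucial ingredient: each learner sees only part of the language, so it has to generalize, and its recombined chunks are exactly what the next generation observes. After 10 generations the printout should show every shape sharing a prefix and every color sharing a suffix, give or take some drift from the crude midpoint split.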
Kirby’s model assumed that meaning was conveyed along with utterances 100% correctly, 100% of the time; my project tested that assumption by introducing noise into both the utterances and the transmitted meanings, only to discover that this actually accelerated the emergence of grammar.
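On top of a sketch like the one above, the noise modification amounts to a wrapper around the transmission step. The noise model below is purely illustrative, not the one from my actual project:

```python
def noisy(word, rate=0.1):
    """Corrupt an utterance: each two-character syllable may be swapped out."""
    chunks = [word[i:i + 2] for i in range(0, len(word), 2)]
    return "".join(
        random.choice(SYLLABLES) if random.random() < rate else chunk
        for chunk in chunks
    )

# In iterate(), the learner would then train on corrupted observations:
#   learner = Learner([(m, noisy(language[m])) for m in sample])
```

One way to read the result: noise makes rote memorization of holistic words unreliable, so compositional chunks, which are reinforced across many utterances, win out even faster.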
If you have an interest in linguistics and computation but haven’t read Kirby’s paper, you should!