Sticking the tongue out: Early imitation in infants

Albert Einstein sticking out his tongue at a neonate in an attempt to test their imitation of tongue protrusion.

The nativism-empiricism debate haunts the fields of language acquisition and evolution on more than one level. How much of children’s social and cognitive ability has to be present at birth, and how much is acquired through experience, and therefore malleable? Classically, this debate revolves around the poverty of the stimulus: how much does a child have to take for granted in her environment, and how much can she learn from the input?

Research into imitation has its own version of the poverty of the stimulus, the correspondence problem. The correspondence problem can be summed up as follows: when you are imitating someone, you need to know which parts of your body map onto the body of the person you’re trying to imitate. If they wiggle their finger, you can establish correspondence by noticing that your hand looks similar to theirs, and that you can make the same movement with it, too. But this is much trickier with parts of your body that are out of your sight. If you want to imitate someone sticking their tongue out, you first have to realise that you have a tongue, too, and then work out how to move it in such a way that it matches your partner’s movements.


Imitation and Social Cognition in Humans and Chimpanzees (II): Rational Imitation in Human Infants and Human-Raised Chimps

In my last post I wrote about two experiments on imitation in young children and chimpanzees by Lyons et al. (2005) and Horner & Whiten (2005). Their results suggested that young children tend to copy both the ‘necessary’ and the ‘unnecessary’ parts of an action performed by a demonstrator who shows them how to get a reward out of a puzzle box, whereas chimps only copy the parts necessary to get the reward.

One important question raised by these experiments was whether these results apply only to wild chimpanzees or whether they also hold for enculturated, human-raised chimps. This is an important question because it is possible that chimpanzees raised in these kinds of richly interactive contexts show more sensitivity to human intentionality.

Buttelmann et al. (2007) tested just that. They used the “rational imitation” paradigm, which features two conditions:

a) The subjects are shown an action in which the specific manner of the action is not purposive and intentional but results from the demonstrator being occupied with something else. For example, he may be carrying something, so that he has to use his foot to turn on a light (often called the Hands Occupied Condition).

b) The subjects are shown an action in which the demonstrator chooses a specific manner of doing something on purpose. For example, he may have his hands free but still choose to turn on the light with his foot (Hands Free Condition).

(taken from Call & Tomasello 2008)


What Makes Humans Unique? (III): Self-Domestication, Social Cognition, and Physical Cognition

In my last post I summed up some proposals for what (among other things) makes human cognition unique. But one thing we should bear in mind, I think, is that our cognitive style may be more of an idiosyncrasy, the product of a highly specific cognitive specialization, than a definitive quantitative and qualitative advance over other styles of animal cognition. In this post I will look at studies which further point in that direction.

Chimpanzees, for example, beat humans at certain memory tasks (Inoue & Matsuzawa 2007) and behave more rationally in reward situations (Jensen et al. 2007).

In addition, it has been shown that in tasks in the social domain, which are generally assumed to be cognitively complex, domesticated animals such as dogs and goats (Kaminski et al. 2005) fare similarly well to or even outperform chimpanzees.

Social Cognition and Self-Domestication

It is entirely possible that the first signs of human uniqueness were at first simply side effects of our self-domesticating lifestyle acting on a complex primate brain, the same way the evolution of social intelligence in dogs and goats is hypothesised to have come about (Hare & Tomasello 2005).

This line of reasoning is also supported by domesticated silver foxes, which have been bred for tameness over a period of 50 years but developed other interesting characteristics as a by-product. To quote from an excellent post on the topic over at A Blog Around the Clock (see also here):

“They started having splotched and piebald coloration of their coats, floppy ears, white tips of their tails and paws. Their body proportions changed. They started barking. They improved on their performance in cognitive experiments. They started breeding earlier in spring, and many of them started breeding twice a year.”

What seems most interesting to me, however, is another by-product of their experimental domestication: they also improved in the domain of social cognition. For example, like dogs, they are able to understand human communicative gestures such as pointing. This is all the more striking because, as mentioned above, chimpanzees do not understand human communicative gestures like helpful pointing; neither do wolves or non-domesticated silver foxes (Hare et al. 2005).


Some Links #12: What if there had never been a cognitive revolution?

What if there had never been a cognitive revolution? Apparently, nothing would be all that different, according to Nicolas Baumard over at ICCI. It’s all speculative, in a similar vein to alternative-history fiction (I recommend Making History by Stephen Fry and The Difference Engine by William Gibson and Bruce Sterling), with Baumard stating:

My point here is that these key ideas would have emerged even without a Cognitive Revolution. Take for instance the idea that the mind cannot be a blank slate. This idea is totally natural to evolutionary biologists. What about the mind as “a complex system composed of many interacting parts”? Without going back to La Mettrie, Hutcheson or Descartes, one can argue that the idea of modularity is at the core of the research program of neuropsychology since its beginning (the same is true, albeit at a lesser degree, for evolutionary biology). We should not forget as well that, with or without the Cognitive Revolution, brain imaging techniques would have emerged and would have joined neuropsychology and evolutionary biology in decomposing the mind. Add the methodological advances of developmental psychology or social psychology – which were not part of the Cognitive revolution – and you get a pretty big part of today’s ‘Cognition and Culture’.

‘Mad Men’-ese. Ben Zimmer has a cool article on Mad Men (easily one of the best shows to have emerged in recent years) and its dedication to accurately portraying 1960s dialogue. But with such dedication come equally dedicated, and pedantic, criticisms of some of the lines used. For example, Zimmer points to Don’s line “The window for this apology is closing” as anachronistic, since this metaphorical use of window is tied to the 1970s. On another note: the new season of Mad Men begins tomorrow (25th June) in America.

A growing isolated brain can organize itself. Deric Bownds points to an article by Zhou et al. (2010), in which a mouse’s neocortex was disconnected from the rest of its brain to see how the surface map developed. The results:

During these weeks, the mutant mice, despite having disconnected brains, display a variety of behaviors: eating, drinking, walking, and swimming. Thus, “protomap” formation, namely cortical lamination and formation of areas, proceed normally in absence of extrinsic connections, but survival of projection neurons and acquisition of mature morphological and some electrophysiological features depend on the establishment of normal cortical–subcortical relationships.

Things I’d like to see: a nice, simple, colourful website on evidence-based social policy. Being an avid reader of Ben Goldacre’s Bad Science column, and having read his book of the same name, I was surprised to find that he has another blog. Anyway, the linked post is fairly self-explanatory: he’s calling for someone to create a website looking at evidence-based social policy (something he’s been discussing since at least 2007). I’m a big fan of this idea, which would see social policy based less on rhetorical wrangling and more on actual evidence:

There are three key stages in evidence-based practise: you generate evidence; you collate and appraise it, and then you disseminate and implement. It feels to me like the last bit is currently underdone, and often it takes one clear information hub, or an organisation devoted to promoting something, to move things on.

Why money makes you unhappy. Money is apparently not very good at making us happy. Jonah Lehrer writes about a study exploring the experience-stretching hypothesis, and how it relates to money and happiness. Basically, the argument is that because money allows us to enjoy the best things in life, we actually end up lessening our ability to enjoy the mundane aspects of our lives. As the mundane aspects are the most frequent, this isn’t necessarily a good thing. This comes on the back of another paper claiming that the United States, currently the richest nation on Earth, is slowly becoming less satisfied with life. As the current study states:

Taken together, our findings provide evidence for the provocative notion that having access to the best things in life may actually undermine one’s ability to reap enjoyment from life’s small pleasures. Our research demonstrates that a simple reminder of wealth produces the same deleterious effects as actual wealth on an individual’s ability to savor, suggesting that perceived access to pleasurable experiences may be sufficient to impair everyday savoring. In other words, one need not actually visit the pyramids of Egypt or spend a week at the legendary Banff spas in Canada for one’s savoring ability to be impaired—simply knowing that these peak experiences are readily available may increase one’s tendency to take the small pleasures of daily life for granted.