Sunday, July 25, 2010

The neural response to emotional robots

When it comes to robotics, Japan is way ahead of the rest of the world. Reality is quickly catching up with science fiction as robots are used and developed for increasingly complex behaviours. There are now Japanese robots that function as security guards, trainers for professional skills, and even pets and social companions.

Robots are taking over roles that were once thought to make humans unique. An archetypal example of this is the use of robots for cognitive therapy. Robots are also being used for social assistance and personal care of the elderly. But how does robot social support fare against human support? Is it really possible to empathize with and emotionally respond to a robot while simultaneously knowing that it is just a robot?


Researchers from – you guessed it – Japan recently collaborated with an international team to investigate how our brains respond to robots expressing human emotions. A humanoid robot, called WE-4RII, was specially designed to make facial expressions for emotions such as disgust, anger and joy. Study participants had their brains scanned with fMRI as they watched either WE-4RII expressing emotions or a human expressing the same emotions. The subjects were asked to attend either to the way the robot/human face was moving or to the emotion being depicted.
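To make the logic of the design concrete, here is a minimal sketch of the 2×2 factorial structure (agent: human vs. robot; task: attend to motion vs. attend to emotion) and the kind of interaction contrast such designs support. The condition names, ordering, and contrast are my own assumptions for illustration; this is not the authors' analysis code.

```python
# Illustrative sketch of the 2x2 factorial design behind the study
# (agent: human/robot x task: attend-to-motion/attend-to-emotion).
# NOT the authors' pipeline; names and ordering are assumed here.
from itertools import product

agents = ["human", "robot"]
tasks = ["attend_motion", "attend_emotion"]

# The four cells of the factorial design, in a fixed order.
conditions = [f"{a}_{t}" for a, t in product(agents, tasks)]

# A standard interaction contrast: does the human-vs-robot difference
# itself differ between the two attention tasks?
# (human_motion - human_emotion) - (robot_motion - robot_emotion)
interaction = {
    "human_attend_motion": +1,
    "human_attend_emotion": -1,
    "robot_attend_motion": -1,
    "robot_attend_emotion": +1,
}

contrast_vector = [interaction[c] for c in conditions]
print(conditions)
print(contrast_vector)  # [1, -1, -1, 1]
```

A contrast of this shape is what lets the analysis separate effects of who is expressing the emotion from effects of what the viewer is attending to.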


When subjects attended to the emotions, activity in brain areas involved in processing emotions (such as the left anterior insula for the perception of disgust) was reduced for robot compared to human stimuli. At the same time, attending to emotions rather than motions increased activity in the ‘motor resonance’ (i.e., the controversial mirror neuron) circuit.


The experimenters suggested this means we don’t emotionally resonate with robots as strongly as we do with humans, but that we relate better to a robot’s facial motions when we attend to the emotion it is expressing.


The latter claim may seem counterintuitive, but it matches behavioural experiments that suggest a ‘motor interference effect’ for robotic movements. That is, when we attend to robotic movements, we don’t register them as movements we ourselves could make, so we don’t resonate with them. But when we stop attending to the movements, we no longer notice their robotic nature, and we therefore imitate them to a greater degree. I wonder if this would be much different for people who are really good at dancing the robot.


Still, the experiment doesn’t demonstrate that we can’t emotionally relate to robots to the same degree as to humans. The robot used was clearly mechanical compared to the human actors, so subjects knew the robot was a robot. If you couldn’t tell whether a face was robot or human, however, you would expect the brain to respond to both in the same way.


An interesting follow-up study could address this problem by using more human-looking robots and including two groups of subjects: those who know they are watching robots and those who do not. Then we’d be able to see how prior knowledge affects our innate reactions to emotional facial expressions.


And another central question remains. Now that robots are taking over traditional human roles, when will we start scanning their ‘brains’ and investigating whether they can empathize with us? I won’t be visiting any hot-shot robot psychotherapist until we have data that shows they care.



Chaminade, T., Zecca, M., Blakemore, S.-J., Takanishi, A., Frith, C. D., Micera, S., Dario, P., Rizzolatti, G., Gallese, V., & Umiltà, M. A. (2010). Brain Response to a Humanoid Robot in Areas Implicated in the Perception of Human Emotional Gestures. PLoS ONE.

4 comments:

  1. Exactly what did subjects think they were attending to when told to attend to the robot's emotions?

    You might mention that the acceptance of robots in caregiving situations in Japan is part of the larger Japanese social issue of accepting non-Japanese into society. It is not surprising that many elderly Japanese would prefer a Japanese robot to a gaijin human.

  2. Given that most of our recognition of our own emotions comes via interpretative analysis of our facial musculature's action, body stance, pulse, hormone levels, etc., due to the lack of vertical connections from the limbic system, it stands to reason this works for anything else as well.
    Being just the monkey on the back of an autonomous moving object, we can but wait and see what it'll do.

  3. This comment has been removed by a blog administrator.

  4. This comment has been removed by a blog administrator.
