A team of researchers at the University of Duisburg-Essen in Germany has found that humans can be susceptible to emotional manipulation by a robot. In their paper published in the open-access journal PLOS ONE, the group describes experiments they carried out with human volunteers interacting with robots and what they found.
Back in 2007, a team of researchers carried out a study called “Begging computer does not want to die.” Volunteers were asked to switch off a robot cat, but were unsure what to do when the cat begged them not to turn it off. In this new effort, the researchers replicated that experiment using more volunteers and a different robot.
The new study involved 89 volunteers who were asked to interact with a Nao robot under the guise of helping it become more intelligent. At the conclusion of the interaction, a researcher would ask the volunteer to turn off the robot, only to have the robot beg them not to do so. In addition to its spoken pleas, the robot displayed bodily actions meant to bolster its request. Some volunteers served as controls: they were asked to turn off the robot but heard no begging from it.
The researchers report that, ultimately, 43 of the volunteers were confronted with a choice between complying with the researchers’ request and the robot’s. Thirteen of them chose to heed the robot’s wishes, and all of the others took longer to turn off the robot than did those in the control group. The researchers suggest these findings indicate that humans have such a strong tendency to anthropomorphize robots that we can fall prey to emotional manipulation. They also note that the type and length of socializing before the shutdown request did not appear to have any impact on the volunteers’ decisions.
Each of the volunteers was interviewed after their interaction with the robot; those who had refused to turn it off were asked why. The researchers report that many refused simply because the robot asked them to. Others said they felt sorry for the robot or worried that they were doing something wrong.