If you’re bluffing your way through a game of high-stakes poker, it’s a good idea to avoid shifty, nervous eye movements, which just might give your hand away.
But it’s not just during poker that our eyes can betray us. A recent study suggests the way our eyes move actually reveals a scary amount about what we feel inside – to the point where AI can predict somebody’s personality traits simply by watching their eyes.
“Thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits,” explains neuropsychologist Tobias Loetscher from the University of South Australia.
There’s a body of previous research suggesting our eye movements signal things about the way we think and feel – cues that humans consciously or unconsciously pick up on in their interactions with one another.
But can these eye movements – and what they internally represent – be similarly appreciated by something that isn’t human?
That’s what Loetscher and his team wanted to find out, so they recruited 50 volunteers to fill out questionnaires that would indicate where each participant fell in terms of the so-called Big Five personality traits: openness, conscientiousness, neuroticism, agreeableness, and extraversion.
Each of the (student) participants also wore an eye-tracking headset, which recorded their eye movements for roughly ten minutes while they were sent out to visit a store and purchase something.
When the team had their machine-learning system analyse the data recorded by the eye trackers, they found it was able to pick out patterns in the eye movements and match them to the participants' personality profiles.
“One key contribution of our work is to demonstrate, for the first time, that an individual’s level of neuroticism, extraversion, agreeableness, conscientiousness, and perceptual curiosity [another personality trait] can be predicted only from eye movements recorded during an everyday task,” the authors write in their paper.
It’s worth pointing out that while the AI was able to predict these personality traits, it didn’t do so with particularly high accuracy – but the researchers say its predictions were still reliably better than chance (by up to 15 percent) for those traits.
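To give a rough sense of what "better than chance" means here, the sketch below shows one common way such a comparison is set up: summarise each recording into a handful of gaze features, train a classifier to predict a trait score, and compare its cross-validated accuracy against a baseline that ignores the eye data entirely. The feature names, the random-forest choice, and the synthetic data are illustrative assumptions, not details from the study itself.

```python
# Illustrative sketch only: synthetic gaze features and labels, not the study's data or pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-participant gaze features:
# mean fixation duration, saccade rate, blink rate, pupil-diameter variability.
n_participants = 50
X = rng.normal(size=(n_participants, 4))

# Hypothetical binary label, e.g. "high" vs "low" neuroticism from a questionnaire median split.
# The label is made to depend weakly on the features so there is a small real signal to find.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.5, size=n_participants) > 0).astype(int)

# Classifier trained on gaze features vs. a chance-level baseline that ignores them.
model = RandomForestClassifier(n_estimators=200, random_state=0)
baseline = DummyClassifier(strategy="most_frequent")

model_acc = cross_val_score(model, X, y, cv=5).mean()
chance_acc = cross_val_score(baseline, X, y, cv=5).mean()

print(f"classifier accuracy: {model_acc:.2f}")
print(f"chance baseline:     {chance_acc:.2f}")
print(f"improvement over chance: {100 * (model_acc - chance_acc):.0f} percentage points")
```

A modest gap between the two accuracy figures is exactly the kind of result being described: informative enough to show the eyes carry a signal, but nowhere near precise enough to pin down an individual's personality.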
With further refinement, this kind of technology could dramatically improve interactions with machines, the researchers think, giving things like virtual assistants a way of reading our mood or personality.
“People are always looking for improved, personalised services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues,” Loetscher says.
“This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals.”
Of course, there’s a more dystopian angle to the findings too. If machines can peer into our psyche by tracking nothing more than our eye movements, there could be disturbing privacy implications – especially if people don’t want a machine trying to guess how they’re feeling.
“If the same information could be gained from eye recordings or speech frequency then it could easily be recorded and used without people’s knowledge,” neuroscientist Olivia Carter from the University of Melbourne, who wasn’t involved with the research, told New Scientist.
That’s something scientists will have to keep in mind as these systems continue to evolve, with Loetscher and co. suggesting these abilities could one day be built into a wave of socially interactive robots – capable of interpreting tell-tale eye movements, and even mimicking them to seem more human.
Welcome to the future.
The findings are reported in Frontiers in Human Neuroscience.