New machine learning research has established a link between how we move our eyes and our personalities.
The study was a collaboration between the University of South Australia, the University of Stuttgart, Flinders University, and the Max Planck Institute for Informatics in Germany.
The team used a state-of-the-art machine learning algorithm to demonstrate the link between personality and eye movements.
It found that eye movements can reveal whether someone is sociable, conscientious, or curious, with the algorithm reliably recognising four of the ‘big five’ personality traits: neuroticism, extroversion, agreeableness, and conscientiousness. (‘Openness to experience’ is the fifth.)
The 42 participants in the study went about everyday tasks on their university campuses while researchers tracked their eye movements. They then had their personalities assessed using traditional questionnaires.
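The study's actual pipeline is not reproduced in the article, but the general approach it describes, training a classifier to map eye-movement features onto questionnaire-derived trait labels, can be sketched roughly as below. The feature names, the synthetic data, and the choice of a random-forest classifier are all assumptions for illustration, not details confirmed by the source.

```python
# Hypothetical sketch: predicting a personality-trait label from
# eye-movement features. All feature names and data are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 42  # matches the study's participant count

# Invented per-participant features, e.g. mean fixation duration,
# saccade amplitude, blink rate, pupil-diameter variability.
X = rng.normal(size=(n, 4))

# Invented binary label, e.g. above/below the median questionnaire
# score for one trait, loosely tied to the first feature.
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation matters here because with only 42 participants a single train/test split would give a very noisy accuracy estimate.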
The quest for emotion recognition
The University of South Australia’s Dr Tobias Loetscher believes the study provides new evidence of links between eye movements and personality traits, in what has been an under-investigated field, offering important insights into social signal processing and social robotics. In the university’s own report, he said:
“There’s certainly the potential for these findings to improve human-machine interactions. People are always looking for improved, personalised services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues.
“This research provides opportunities to develop robots and computers so that they can become more natural and better at interpreting human social signals.”
Dr Loetscher also emphasised that the research used real-world data, rather than lab-based experiments. “This research has tracked and measured the visual behaviour of people going about their everyday tasks, providing more natural responses than if they were in a lab,” he said.
“And thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits.”
Internet of Business says
This collaboration marks a new frontier for AI research. If conversational AI is to become truly intelligent and capable, it must be able to understand non-verbal cues.
We convey a huge amount in our facial expressions and gestures – sometimes completely changing the meaning of spoken words. A sideways glance can turn a sincere comment into a sarcastic jibe, for example.
So it’s no surprise that researchers elsewhere are also taking on the monumental task of so-called affective computing, and with it attempting to tackle the cultural, gender, and age differences that can affect our mannerisms. Just last week, for example, MIT revealed its own research into emotion-reading AI.
There’s even been research into using AI to detect emotions from subtle changes in facial skin tone caused by variations in blood flow.