Cues for future robot ‘learning’

The eyes give key clues to what we’re thinking and how we respond to our surroundings.

Their movements, recorded by high-tech eye-tracking glasses, also give insight into a range of key human behaviours and personality traits, as revealed by an international study conducted at Flinders University.

Led by European researchers based at the Machine Learning and Robotics Lab at the University of Stuttgart and the Max Planck Institute for Informatics in Germany, the project monitored the eye gaze patterns of 50 Flinders University students and staff.

Organised by Flinders PhD candidate in cognitive psychology Stephanie Morey and UniSA senior lecturer Dr Tobias Loetscher, the study involved participants wearing eye-tracking glasses during a short walk around campus before completing a series of personality questionnaires back in the Brain and Cognition Laboratory.

The paper, published in Frontiers in Human Neuroscience, will contribute to understanding how non-verbal social cues, such as a person’s eye movements, can be used to develop computer systems and robots that can interact effectively and safely with humans.

It also suggests personality may play an important role in explaining how we take in visual information from our world, says Ms Morey.

“Our colleagues in Germany were then able to use novel machine learning techniques to analyse the findings from the sample group,” she says.

SensoMotoric Instruments eye-tracking glasses can track the wearer’s eye movements, as well as identify what they’re looking at. Photo: SensoMotoric Instruments (Germany).

“We found that the machine learning method was better than chance at using the eye movement data to predict four of the Big Five personality traits – neuroticism, extraversion, agreeableness, and conscientiousness.”

The fifth of the Big Five personality traits is openness to experience.

Most people display all of these traits to some degree, but the strength of each trait varies from person to person.
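The published paper describes its own feature set and modelling pipeline; purely as an illustration of the general approach, the sketch below uses synthetic data, hypothetical per-participant gaze features and scikit-learn’s RandomForestClassifier to show how summary eye-movement features might be used to predict a binned trait score and then compared against a chance-level baseline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row of summary gaze features per participant
# (e.g. mean fixation duration, saccade amplitude, blink rate, pupil diameter).
n_participants, n_features = 50, 20
X = rng.normal(size=(n_participants, n_features))

# Questionnaire scores for one personality trait, binned into low/medium/high classes.
trait_scores = rng.normal(size=n_participants)
y = np.digitize(trait_scores, np.quantile(trait_scores, [1 / 3, 2 / 3]))

# A classifier trained on the gaze features, compared with a chance-level baseline.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
baseline = DummyClassifier(strategy="most_frequent")

clf_acc = cross_val_score(clf, X, y, cv=5).mean()
chance_acc = cross_val_score(baseline, X, y, cv=5).mean()
print(f"gaze-based classifier: {clf_acc:.2f}, chance baseline: {chance_acc:.2f}")
```

With synthetic random data the two accuracies will sit near one another; the study’s reported result is that, on real gaze recordings, the trained models exceeded this chance level for four of the five traits.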

“Though this study is only preliminary, these results are a step forward in learning how robotics or artificial intelligence might be able to read human social cues, such as our gaze patterns or pupil dilations, to successfully interact with humans,” Ms Morey says.

“If machines can learn to identify different behavioural traits or characteristics in humans, this raises many interesting questions about how we can develop effective human-robot interactions, potentially leading to the design of systems that are specifically suited to their human user.”

The head-mounted video-based eye trackers from SensoMotoric Instruments (SMI) track the positions and movements of the participant’s eyes while recording what the participant sees via a camera on the front of the glasses. The glasses then map the wearer’s gaze onto the scene video, so researchers can determine where, and at what, participants were looking in their environment.
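The article doesn’t detail the glasses’ export format; as a rough illustration of that “matching up” step, the sketch below assumes gaze is exported as normalised (x, y) coordinates and converts them to pixel positions on the corresponding scene-camera frame, which could then be overlaid on the video to show what was being fixated.

```python
import numpy as np

def gaze_to_scene_pixels(gaze_norm, frame_width, frame_height):
    """Map normalised gaze coordinates (0-1, origin at top-left) onto
    scene-camera pixel coordinates for the matching video frame."""
    gaze_norm = np.asarray(gaze_norm, dtype=float)
    x_px = np.clip(gaze_norm[:, 0], 0, 1) * (frame_width - 1)
    y_px = np.clip(gaze_norm[:, 1], 0, 1) * (frame_height - 1)
    return np.column_stack([x_px, y_px]).round().astype(int)

# Example: three gaze samples mapped onto a 1280x960 scene video frame.
samples = [[0.50, 0.45], [0.62, 0.40], [0.10, 0.90]]
print(gaze_to_scene_pixels(samples, 1280, 960))
```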

The paper, ‘Eye Movements During Everyday Behavior Predict Personality Traits,’ by Sabrina Hoppe, Tobias Loetscher, Stephanie Morey and Andreas Bulling (2018), has been published in Frontiers in Human Neuroscience, 12:105. doi: 10.3389/fnhum.2018.00105
