School of Electronic Engineering and Computer Science

Using robots to reach out to isolated people

Researchers from Queen Mary University of London are taking part in a major new project looking at how cutting-edge robotics can enable people to participate in public spaces, as a place to meet and share ideas without being there in person.

The £2m three-year project, Being There: Humans and Robots in Public Spaces, funded by the EPSRC, will examine how robotics can help to bridge the gap between the way we communicate in person and online.

It brings together researchers from Queen Mary and the Universities of Bath, Exeter, Oxford, and the Bristol Robotics Laboratory* to look at the social and technological aspects of being able to appear in public in proxy forms, via a range of advanced robotics platforms.

Dr Hatice Gunes, assistant professor in digital media at Queen Mary’s School of Electronic Engineering and Computer Science, will lead the analysis of human nonverbal behaviour and emotional expressions in human-robot interactions.

“We are excited about extending our research in automatic analysis of individual emotions and nonverbal behaviour to public space settings where multiple people, multiple groups and even robots will be meeting and interacting,” said Dr Gunes.

The research team will create a ‘living laboratory’, using state-of-the-art technologies to measure how people respond to, and interact with, other people who are acting through a robot representative.

The scientists will use an advanced programmable humanoid robot ‘Nao’, which they will take into public spaces around Bristol to measure human interaction with robots.

Nao will be controlled remotely: researchers will be able to see through its eyes and speak through its mouth, while directing where it looks and walks.

Dr Gunes added: “Interactions between individuals in a public space generate a rich set of explicit and implicit data, from gestures, visual cues, and body language to long-term patterns of interaction and group movement.

“We are interested in obtaining fast and robust means for quantifying emotions and behaviour in collective settings.

“This in turn can provide a real-time feedback mechanism that can be used to regulate the public space, which we’re creating. This might encourage the crowd to rethink the site’s purpose, as well as their own individual and spatial interactions, and provide new perspectives along the way”.

Supporting this process, digital creatives from Bristol’s iShed will work alongside the researchers, bringing their expertise in public engagement to help take the research out of the lab and into a range of public spaces in Bristol.

Dr Gunes and her group will be creating technology for automatic and continuous analysis of emotions in naturalistic multi-participant and multi-group environments, where people will be interacting both with robots and with other people.

They will use their expertise in affective computing, a field of study that focuses on developing systems that are sensitive to human emotions and behaviour, and in multimodal information processing, focusing on human nonverbal behavioural cues from different sources.

*Bristol Robotics Laboratory is a collaborative partnership between the University of the West of England (UWE, Bristol) and the University of Bristol.
