
A novel robotic interface for walking in virtual reality developed at Queen Mary University

Researchers from the Centre for Advanced Robotics at Queen Mary have developed a novel human-machine interface for walking in virtual reality (VR). The paper describing the system, by PhD candidate Ata Otaran and Dr Ildar Farkhatdinov (Senior Lecturer, School of Electronic Engineering and Computer Science), has been accepted for publication in IEEE Transactions on Visualization and Computer Graphics.

Image: a seated man wearing a VR headset using the robotic platform

Virtual reality technology is successfully applied in many domains, from Internet-based teleconferencing and edutainment to medical training and robot control. Robotics researchers from the Human Augmentation and Interactive Robotics team have been developing VR-based interactive systems for remote robotic manipulation in extreme environments, medical training simulators and computer gaming applications.

Their recent work allows VR users to walk through virtual scenes using natural leg movements and to physically feel the simulated virtual terrain through their sense of touch. The system consists of an ankle robotic platform actuated by an electric motor and a software algorithm that maps ankle angular movements onto the walking pattern of a virtual avatar. To control the avatar's walking, a seated user places their feet on the platform and performs alternating left/right foot-tapping movements, corresponding to the ankle joint flexion that occurs during normal walking. The algorithm matches the measured ankle movements to a walking pattern and uses this information to control the avatar's gait. As a result, the user can steer the avatar through a virtual scene as if they were actually walking on the virtual terrain.
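The published paper is the authoritative description of the algorithm; purely as an illustration, the minimal sketch below shows one plausible way alternating foot taps could be turned into avatar walking speed, assuming a stream of left/right ankle flexion angles from the platform's sensors. All names and thresholds here (GaitState, STEP_THRESHOLD_RAD, the gain and decay constants) are hypothetical, not taken from the paper.

```python
# Illustrative sketch (not the authors' code): mapping alternating
# ankle flexion angles from a foot platform to avatar walking speed.
# All thresholds and constants are hypothetical assumptions.

from dataclasses import dataclass

STEP_THRESHOLD_RAD = 0.15   # assumed flexion angle that counts as a "tap"
SPEED_PER_STEP = 0.5        # assumed speed gain per detected step (m/s)
SPEED_DECAY = 0.95          # avatar coasts to a stop when tapping stops

@dataclass
class GaitState:
    last_foot: str = "right"   # which foot produced the previous step
    speed: float = 0.0         # current avatar forward speed (m/s)

def update_gait(state: GaitState, left_angle: float, right_angle: float) -> GaitState:
    """Advance the gait state from one pair of ankle flexion readings (radians)."""
    # Count a step only when the *opposite* foot crosses the threshold,
    # mimicking the alternating left/right pattern of natural walking.
    if state.last_foot == "right" and left_angle > STEP_THRESHOLD_RAD:
        state.last_foot = "left"
        state.speed += SPEED_PER_STEP
    elif state.last_foot == "left" and right_angle > STEP_THRESHOLD_RAD:
        state.last_foot = "right"
        state.speed += SPEED_PER_STEP
    else:
        # No new step detected: decay the speed towards zero.
        state.speed *= SPEED_DECAY
    return state
```

In a game-engine loop, a function like update_gait would run once per frame, and the resulting speed would drive the avatar's forward translation through the scene.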

Ata Otaran and Ildar Farkhatdinov conducted a series of experimental studies and demonstrated that the system can be used efficiently and easily by novice users. All experimental participants were able to walk through a virtual scene successfully and could distinguish between different types of simulated virtual terrain, such as hard floor, soft carpet and ice. The system improves immersion and usability, and contributes to an increased sense of embodiment: users felt as if their bodies were actually walking in the virtual scene. The research team looks forward to applying these findings to practical human-computer interaction applications in computer gaming, training simulators and telerobotics.
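The paper's haptic rendering details are not reproduced here; as a hedged illustration only, one common way to make terrains feel different is to drive the platform's motor with a spring-damper contact torque whose parameters vary per terrain. The terrain names and stiffness/damping values below are hypothetical assumptions, not measurements from the study.

```python
# Illustrative sketch (not the authors' implementation): rendering
# terrain "feel" as a spring-damper torque on the ankle platform.
# All terrain parameters are hypothetical assumptions.

TERRAINS = {
    # name: (stiffness Nm/rad, damping Nm*s/rad)
    "hard_floor":  (40.0, 0.5),   # stiff, little give
    "soft_carpet": (10.0, 2.0),   # compliant, heavily damped
    "ice":         (40.0, 0.05),  # stiff but almost undamped (slippery feel)
}

def contact_torque(terrain: str, penetration: float, velocity: float) -> float:
    """Torque (Nm) the motor applies while the virtual foot contacts the ground.

    penetration: platform deflection below virtual ground level (rad)
    velocity: platform angular velocity (rad/s)
    """
    stiffness, damping = TERRAINS[terrain]
    if penetration <= 0.0:
        return 0.0  # foot not in contact with the virtual ground
    # Classic spring-damper contact model: resist proportionally to
    # penetration depth and penetration speed.
    return stiffness * penetration + damping * velocity
```

Under this kind of model, a stiff, undamped response reads as hard or slippery ground, while a compliant, damped one reads as carpet, which is consistent with the terrain categories participants distinguished in the study.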

A video demonstration can be found here: https://www.youtube.com/watch?v=OicUbh_kaYQ

