Tuesday, July 5, 2011

Soon! Robots that react to human facial expressions


Washington: Scientists studying facial movements are aiming to create socially aware companion robots and graphical characters that can recognise human facial expressions.

Our brains process many tiny and subtle cues about faces whenever we interact with other people, and scientists from Queen Mary, University of London and UCL (University College London) are now investigating whether robots and computers can learn to do the same thing.

“We will be showing some of the latest research from the EU funded LIREC project, which aims to create socially aware companion robots and graphical characters,” said Professor Peter McOwan, from the School of Electronic Engineering and Computer Science at Queen Mary, University of London.

“Robots are going to increasingly form part of our daily lives – for instance robotic aids used in hospitals or much later down the road sophisticated machines that we will have working in our homes.

“Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible - understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness,” he added.
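As a rough illustration of the kind of cue detection involved (a minimal sketch, not the LIREC project's actual software), the snippet below uses OpenCV's stock Haar cascades to find a face in a webcam frame and then check for a smile, an overt sign of happiness. The cascade files are the ones that ship with the opencv-python package; the parameter values are illustrative defaults, not tuned settings.

import cv2  # OpenCV, assumed installed as opencv-python

# Stock Haar cascades bundled with OpenCV (illustrative choice, not LIREC's method)
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]  # search for a smile only inside the detected face region
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
        print("happy" if len(smiles) > 0 else "neutral")
cap.release()

Real systems of the kind described in the article go well beyond this, tracking facial motion over time rather than classifying a single frozen frame, which is the point Professor Johnston makes below.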

Co-researcher Professor Alan Johnston, from the UCL Division of Psychology and Language Sciences, explained: “A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements.”

“Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans as it allows experimenters to study facial motion in isolation from the form of the face,” he said.
