Software to Imitate Real-Time Facial Expressions on Screen

December 27, 2012, By Sanjeev Ramachandran

Video games have made huge strides from yesteryear, especially in their graphics. I remember playing arcade games and 8-bit video games where most of the characters looked like they were built of tiny blocks.

But that has changed, and the change still surprises me. The improvement has been so dramatic that the graphics of some games leave us open-mouthed.

Games like Crysis and Skyrim have explored and improved on detailing the virtual world. There are, however, quite a few areas left to improve, especially the facial features of game characters.


Some games, like the latest edition of Far Cry, have good facial expressions, but at times you feel that they could have done more. And that’s where Thibaut Weise of EPFL’s Computer Graphics and Geometry Laboratory comes into play.

His vision is to give virtual avatars real-time facial expressions, so that they reflect their user's facial emotions on screen. While that may sound simple on paper, making it work takes some sweat.

Weise has started building a system to realize that goal. He has developed software that he hopes to turn into a tool for animation studios and video game developers.

The process uses a camera equipped with motion and depth sensors, somewhat similar to what we see in the Kinect.
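The article doesn't say which hardware or library Weise's system uses, but as a rough idea of what "Kinect-like" capture looks like in practice, here is a minimal sketch that reads depth and color frames from an OpenNI-compatible depth sensor, assuming an OpenCV build with OpenNI support. The depth map is what lets a tracker recover 3D face geometry rather than just a flat picture.

```python
# Minimal sketch: grab synchronized depth and color frames from a
# Kinect-style sensor via OpenCV's OpenNI backend (assumed available).
import cv2

cap = cv2.VideoCapture(cv2.CAP_OPENNI)   # open the first OpenNI device
if not cap.isOpened():
    raise RuntimeError("No OpenNI-compatible depth camera found")

while True:
    if not cap.grab():                   # grab one synchronized frame set
        break
    _, depth = cap.retrieve(flag=cv2.CAP_OPENNI_DEPTH_MAP)   # 16-bit depth in mm
    _, color = cap.retrieve(flag=cv2.CAP_OPENNI_BGR_IMAGE)   # regular BGR image
    # Scale the depth map down so it can be shown as an 8-bit image.
    cv2.imshow("depth", (depth / 16).astype("uint8"))
    cv2.imshow("color", color)
    if cv2.waitKey(1) == 27:             # Esc to quit
        break

cap.release()
```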

The software needs a short training period before it can produce accurate results: roughly ten minutes for it to become familiar with the user's face and facial features.
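A common way systems like this work is to spend that training period fitting a user-specific expression model, then solve for expression weights on every frame and feed them to the avatar's rig. The sketch below illustrates that calibrate-then-track idea; every function name in it (fit_user_blendshapes, solve_expression_weights, and so on) is a hypothetical placeholder, since the article gives no details of Weise's actual code.

```python
# Illustrative calibrate-then-track pipeline. All helper functions are
# hypothetical placeholders, not Weise's actual API.

CALIBRATION_POSES = ["neutral", "smile", "frown", "jaw_open", "brows_up"]

def calibrate(camera):
    """Record a few example poses and fit a face model for this user."""
    scans = {pose: camera.capture_depth_scan(prompt=pose)
             for pose in CALIBRATION_POSES}
    # Adapt a generic face model to this particular user's geometry.
    return fit_user_blendshapes(scans)

def track(camera, user_model, avatar):
    """Per frame: estimate expression weights and drive the avatar rig."""
    while camera.is_running():
        depth_frame = camera.capture_depth_frame()
        # Find the mix of expressions (e.g. smile 0.7, jaw open 0.2) that
        # best explains the observed depth data for this user.
        weights = solve_expression_weights(user_model, depth_frame)
        avatar.apply_blendshape_weights(weights)   # avatar mimics the user
```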

It will take some time before you can watch your virtual avatar scrunching up its face the way you do. But still, it would be cool to see the imitation on screen as you play along.
