Faceshift uses the Kinect to bring your face and emotions into videogames

We already know there’s research into getting the Kinect to recognize emotions, and we’ve seen how far facial tech has come in videogames. Now a team of researchers from the École Polytechnique Fédérale de Lausanne (EPFL) is looking to marry the two ideas into technology that lets you express your emotions through your virtual avatars.

Faceshift, as spotted by Gizmodo, effectively turns the Kinect into a real-time motion capture system that requires no tracking dots. The system combines the Kinect’s color video with its depth map to translate your facial expressions onto a prepared 3D model. All you have to do is sit in front of your Kinect and rage at Black Ops II, and your character will have the same face in the final kill cam.
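
Under the hood, markerless capture systems like this typically fit a set of blendshape weights to each incoming depth frame, so the captured face is expressed as a mix of prebuilt expressions on the 3D model. Here is a minimal sketch of that fitting step in Python, using synthetic NumPy data in place of real Kinect frames; the array sizes, names, and the plain least-squares solve are illustrative assumptions, not Faceshift’s actual pipeline or API.

```python
# Conceptual sketch of blendshape fitting, the core idea behind
# markerless facial capture. All data here is synthetic; a real system
# would get the target points from the Kinect's depth map after
# aligning it to the face model.
import numpy as np

rng = np.random.default_rng(0)

n_vertices = 500          # vertices on the face mesh (illustrative)
n_blendshapes = 40        # size of the expression basis (illustrative)

# Neutral face and blendshape offsets (what a prepared 3D model provides).
neutral = rng.normal(size=(n_vertices, 3))
blendshapes = rng.normal(size=(n_blendshapes, n_vertices, 3))

# Pretend the sensor observed a face that mixes a few expressions.
true_weights = np.zeros(n_blendshapes)
true_weights[[3, 7, 12]] = [0.8, 0.5, 0.3]   # e.g. brow raise + smile
observed = neutral + np.tensordot(true_weights, blendshapes, axes=1)
observed += rng.normal(scale=0.01, size=observed.shape)  # sensor noise

# Fit weights by least squares: observed ≈ neutral + sum_i w_i * B_i.
A = blendshapes.reshape(n_blendshapes, -1).T   # (3*V, K) basis matrix
b = (observed - neutral).ravel()               # residual to explain
weights, *_ = np.linalg.lstsq(A, b, rcond=None)
weights = np.clip(weights, 0.0, 1.0)           # keep expressions plausible

print("recovered non-zero weights:", np.round(weights[weights > 0.05], 2))
```

In a real capture loop, a solve like this would run once per frame, and the recovered weights would drive the game character’s facial rig directly.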

The system translates every eyebrow raise and head tilt, and it even tracks where your eyes are looking. This sort of technology could lead to a whole new generation of games like Second Life or The Sims, with videogame characters that wear your face. When Master Chief finally takes off his helmet, maybe he’ll be wearing yours.

Okay, maybe it's not so cool.

The Faceshift team is already rolling out its technology as a developer kit for use in videogames, animated movies, TV shows, and Web chat systems.

Would you use your real face in videogames? Leave a comment.
