Facebook’s Freaky Tech Instantly Turns Users’ Full Bodies Into Moving Avatars
By Mikelle Leow, 02 May 2019
Video screenshot via Facebook for Developers
Facebook is attempting to make online communication feel as if it’s happening in real life by turning users into full-body, hyperrealistic versions of themselves in virtual reality.
The social network previewed an early version of this technology on day two of its F8 developer conference, where it also unveiled a major redesign of its interface. The eerie system is now able to replicate a person’s appearance, clothing, and motions and turn them into moving avatars.
In a sneak peek shown by Oculus researcher Ronald Mallet, the technology successfully recreates a man’s skeleton and movements.
Because non-verbal language is essential in real-world conversations, the prototype even tracks and replicates the person’s muscle use to indicate his “intention.”
His skin and outfit are also reconstructed instantly, without the manual customization required when designing a character in The Sims.
Facebook hopes to roll out this technology in games, as previewed in a clip of the same man playing soccer on a simulated field with a woman. Whereas current VR apps typically render only a person’s head, Facebook’s VR project matches their arm, body, and leg movements almost identically, with only a slight lag.
In addition, the company can now simulate people’s faces and turn them into lifelike 3D avatars that come to life when they speak or change their expressions. Facebook’s variation looks far more realistic than Apple’s ‘Animoji’ and is designed to make it seem as though users are talking to each other in the real world.
However, Mallet stresses that it will still be some time before users can generate full-body virtual avatars, as Facebook is not yet able to implement this technology with mainstream VR sensors. Further, the company needs to ensure that the avatars are kept secure, and may require users to verify them through facial or fingerprint sensors.
You can watch the hour-and-45-minute-long keynote in full below.
[via CNN, video and screenshots via Facebook for Developers]