Technology & Data

Realities for everyone – Facebook puts pre-emptive guardrails on the future of AR and VR


On the second day of Facebook’s annual F8 conference, the technology company introduced its vision for a safe and inclusive future in digital realities.

The San Jose, California event saw Facebook announce a range of updates across its family of platforms – Facebook, Instagram, WhatsApp and Messenger – as well as its AR and VR offerings.

Led by chief technology officer Mike Schroepfer, Facebook’s AR/VR team described the risk of implicit and inherent bias being built into the technology. As an example, Facebook pointed to one of its AR effects triggered by a hand gesture.

To ensure the software “delivers quality AR effects for everyone,” Facebook says its training data must include a range of skin tones under a variety of lighting conditions, so that the system recognises any human hand in front of the camera.
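As an illustration of what that kind of check can involve (a minimal sketch with hypothetical field names and records, not Facebook’s actual pipeline), a team might measure a hand-detection model’s accuracy separately for each skin-tone and lighting subgroup rather than relying on a single overall figure:

```python
from collections import defaultdict

# Hypothetical evaluation records: each entry pairs a model prediction with
# the ground truth and the skin-tone/lighting metadata it was captured under.
results = [
    {"skin_tone": "type_I",  "lighting": "low",    "predicted": True,  "actual": True},
    {"skin_tone": "type_VI", "lighting": "low",    "predicted": False, "actual": True},
    {"skin_tone": "type_VI", "lighting": "bright", "predicted": True,  "actual": True},
    # ... many more records, drawn from a test set balanced across groups
]

def accuracy_by_group(records):
    """Report detection accuracy per (skin tone, lighting) subgroup, so a
    model that looks fine on average cannot hide poor performance on
    under-represented groups."""
    totals = defaultdict(lambda: [0, 0])  # group -> [correct, total]
    for r in records:
        key = (r["skin_tone"], r["lighting"])
        totals[key][0] += r["predicted"] == r["actual"]
        totals[key][1] += 1
    return {k: correct / total for k, (correct, total) in totals.items()}

print(accuracy_by_group(results))
```

Breaking results out per subgroup makes it obvious when a model that appears accurate overall is failing for a particular combination of skin tone and lighting.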

In VR, Facebook says its Oculus engineers use a similar process for voice commands while a user is wearing a headset, training on representative data across dialects, ages and genders.
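A minimal sketch of how “representative data” can be assembled (the metadata labels and sampling scheme below are assumptions for illustration, not Facebook’s method): draw the same number of voice clips from every dialect, age band and gender bucket, so that no single group dominates the training set.

```python
import random
from collections import defaultdict

# Hypothetical utterance records: audio clip IDs tagged with speaker metadata.
utterances = [
    {"clip_id": "a1", "dialect": "AU", "age_band": "18-29", "gender": "female"},
    {"clip_id": "a2", "dialect": "US", "age_band": "30-49", "gender": "male"},
    {"clip_id": "a3", "dialect": "IN", "age_band": "50+",   "gender": "female"},
    # ... thousands more clips
]

def balanced_sample(records, per_group, seed=0):
    """Draw up to `per_group` clips from every (dialect, age band, gender)
    bucket, so the training set does not over-represent any one group."""
    random.seed(seed)
    buckets = defaultdict(list)
    for r in records:
        buckets[(r["dialect"], r["age_band"], r["gender"])].append(r)
    sample = []
    for clips in buckets.values():
        sample.extend(random.sample(clips, min(per_group, len(clips))))
    return sample

training_set = balanced_sample(utterances, per_group=500)
```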

Earlier this year Facebook also introduced its Codec Avatars project, which uses computer vision technology to map a real-time model of a user’s face while they are wearing a VR headset. The avatar can then be used to represent the user in a digital environment, reflecting their facial movements and speech from the real world.

One technology the Facebook team says it is keenly aware of is ‘deepfakes’ – a process that takes pre-existing images and video to create a digital fabrication of a person’s face. This digital fabrication can then be mapped on top of another person’s face or manipulated to appear in an entirely new environment, context or action.

“Deepfakes are an existential threat to our telepresence project because trust is so intrinsically related to communication,” says Yaser Sheikh, director of research at Facebook Reality Labs (FRL) in Pittsburgh, in a Facebook blog post.

Facebook says the Reality Labs team is thinking programmatically about safeguards to keep avatar data safe – with regular reviews from privacy, security and IT experts to ensure the “most rigorous safeguards possible.”

“We’ve considered all possible use cases for this technology,” says Chuck Hoover, general manager at FRL. “We’re aware of the risk and routinely talk about the positive and negative impacts this technology can have.

“As a lab, we’re excited about making this technology, but only if it’s done the right way. Everyone knows how important this research is and how important it is that people trust it.”

 

Image credit: Lucrezia Carnelos

Josh Loh

Josh Loh is assistant editor at MarketingMag.com.au
