close-up of a blue eye

The increased acceptance of telecommuting, the rise of the geographically independent freelancer, and excellent communication software are only three factors that encourage a society where people communicate with each other over vast distances. This is a trend that shows no sign of slowing down. The freelance economy is growing, coworking spaces are popular, and software companies such as Buffer, Doist, and Toggl are entirely remote.

However, despite the abundance of wonderful communication tools that aid this trend, digital communication is still a long way from matching the quality of an in-person conversation. This largely has to do with the absence of body language and facial expressions. Even over video, our ability to read someone’s facial expressions is hampered by issues such as latency and low resolution.

Tracking Facial Expressions

Several companies in the VR industry are dedicating themselves to improving the quality of digital communication. One such company, and the largest one at that, is Facebook. This should come as no surprise, as it ties nicely into their core mission of building communities and bringing people closer together. What better way to do so than VR?

Facebook has already launched Facebook Spaces, a VR app where you create a digital avatar and invite up to three of your Facebook friends to interact with. Right now, they’re investing heavily in making their digital avatars as lifelike as possible. More specifically, they’re researching photorealistic avatars with high-quality face tracking.

Facebook calls these avatars “Codec Avatars.” The impressive results from the UploadVR video above are made possible because of several cameras that are pointed at the user’s eyes and at the lower part of their face, as well as sophisticated machine learning algorithms. Unfortunately, Facebook employees say that the technology is still years away from being implemented in consumer headsets.

Eye-Tracking in VR Headsets

While full facial emulation might still be years away, eye-tracking technology in VR headsets is not. HTC introduced the HTC Vive Pro Eye at CES in January of this year, a headset that can accurately track eye movement in VR. Qualcomm’s next-gen mobile VR headsets will have eye-tracking too, as will the HoloLens 2.

Although it might seem like a minor technical innovation, eye-tracking will significantly improve many aspects of virtual reality, the first of which is a more natural interaction with other avatars in VR. It’s surprising how much more immersive a conversation becomes when the avatar you’re talking to has realistic eye movement.

Secondly, eye-tracking enables foveated rendering. This is a big one. Foveated rendering lowers the resolution in the areas of the screen you’re not looking at, mimicking the human eye. Let me show you how: keep reading this sentence, but try to see the details of an object in your peripheral vision. It’s impossible. You know it’s there, but it’s blurry. Foveated rendering does the same, but in VR.

blurry image toward the edges

An example of foveated rendering (look at the lower corners in particular)

Foveated rendering delivers a more realistic experience while drastically reducing power consumption and the load on your GPU. But to pull it off, you need a device with highly precise and fast eye-tracking.
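To make the idea concrete, here is a minimal sketch of the core decision foveated rendering makes for every region of the screen: full detail near the gaze point, progressively coarser shading toward the periphery. The function, its name, and the pixel radii are all illustrative assumptions for this article, not values or APIs from any real headset or engine.

```python
import math

def shading_rate(pixel, gaze, fovea_radius=200.0, falloff=600.0):
    """Return a fractional shading rate for a screen pixel, where
    1.0 means full resolution and 0.25 means quarter resolution.

    Pixels within `fovea_radius` (in screen pixels) of the gaze
    point render at full detail; beyond that, the rate falls off
    linearly to a coarse floor in the periphery. All numbers here
    are illustrative, not figures from any shipping headset.
    """
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= fovea_radius:
        return 1.0
    # Linear falloff down to a floor of quarter resolution.
    t = min((dist - fovea_radius) / falloff, 1.0)
    return 1.0 - 0.75 * t

# Looking straight at a pixel: render it at full resolution.
print(shading_rate((960, 540), (960, 540)))   # 1.0
# A pixel far out in the periphery: quarter resolution is enough.
print(shading_rate((1900, 540), (960, 540)))  # 0.25
```

Real implementations make this decision per tile or via hardware variable-rate shading rather than per pixel, but the principle is the same: spend GPU time where the fovea is actually pointed.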

Thirdly, eye-tracking can increase the comfort of a VR experience. The cameras used for eye-tracking can calculate the user’s interpupillary distance (IPD) and, through software, help them adjust the IPD slider found on headsets such as the Oculus Quest and the HTC Vive. Alternatively, a VR manufacturer could develop a headset that makes those adjustments automatically.
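The IPD estimate itself is simple geometry once the eye cameras have located both pupils. A hedged sketch, assuming the tracker reports 3-D pupil centres in a shared millimetre-scale coordinate frame (real headsets expose this through their own SDKs, and the function names here are invented for illustration):

```python
import math

def estimate_ipd(left_pupil, right_pupil):
    """Estimate interpupillary distance in millimetres from the 3-D
    pupil centres reported by the eye-tracking cameras. Assumes both
    points are expressed in the same millimetre-scale frame."""
    return math.dist(left_pupil, right_pupil)

def average_ipd(samples):
    """Average per-frame estimates to smooth out tracking noise
    before suggesting (or applying) a lens adjustment."""
    return sum(estimate_ipd(l, r) for l, r in samples) / len(samples)

# Two noisy frames of a user with a ~64 mm IPD.
frames = [((0, 0, 0), (63, 0, 0)),
          ((0, 0, 0), (65, 0, 0))]
print(average_ipd(frames))  # 64.0
```

A headset with motorized lenses could feed this averaged value straight into the adjustment mechanism; one with a manual slider could simply display it to the user.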

Then there is a plethora of other potential uses for eye-tracking. Eyes are great biometric identifiers, so eye-tracking could be used for security, to simplify profile management, to better understand what the user is looking at, and much more…

Another Step Forward

The VR industry keeps pushing boundaries, and this is another one. The ability to track where we’re looking in a simulation will make that simulation more immersive, reduce the required computational power, and make digital avatars more lifelike. Eventually, great communication will no longer require physical proximity. All it will require is a VR headset and the blink of an eye.