If you’re old enough to have been around computers and video games in the 90s, you might remember the Nintendo Virtual Boy, the Virtual I/O i-glasses and the VFX-1, icons of a generation of unsuccessful virtual reality headsets. VR tried to push into the consumer market then, but failed epically, due in large part to technological and hardware limits.
Fast forward 20 years, and virtual reality and its sibling technologies, augmented and mixed reality, are back in the limelight with the likes of Oculus Rift, HTC Vive, Magic Leap and Microsoft HoloLens. And the buzz they’ve created dwarfs that of any of the previous cycles. This time around, however, the technology is much more mature, and the AR/VR/MR industry is slated to be worth more than $100 billion by 2021. Beyond gaming and entertainment, VR, AR and MR are moving into practical domains such as medicine and education, and they will be a big part of the future of professional work.
But this doesn’t mean that our synthetic realities don’t have hurdles to overcome. The graphics still need polishing, wearing headsets is tiresome and dizzying, and interacting with virtual objects is still not as smooth as it should be.
Fortunately, a few technologies can help overcome these hurdles. One of them is eye tracking, the technique of measuring eye activity and gaze movement. Eye tracking has been around for decades, but only in the past few years has it entered the consumer market in earnest. The technology has become much less expensive and much more compact, and it can be integrated into smartphones and VR/AR headsets without disrupting the form factor or increasing costs.
Here’s how eye tracking can make a big difference in developing the next generation of virtual and augmented reality technologies.
Better graphics with foveated rendering

Current VR headsets have 4K graphics output, a huge improvement over previous generations of VR gear. However, a fully natural experience requires a 24K display for each eye, which is beyond any technology currently available.
Fortunately, VR headset manufacturers can take advantage of how the human visual system works to simulate higher-quality graphics without huge hardware upgrades. The human eye covers 165 degrees horizontally and 135 degrees vertically, but it only perceives fine detail within about 5 degrees of the direction it is looking. This small part of the visual field projects onto the region of the retina called the fovea.
Foveated rendering is a technique that concentrates graphics resources on the parts of the image where the eye perceives the most detail. With the help of eye tracking, VR headsets can determine where the user is looking and provide high-quality rendering in the immediate area around that point, while allocating fewer resources to the rest of the image. This creates the impression of higher-quality graphics without waiting for the next generation of graphics hardware.
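The core idea can be sketched in a few lines: map each region of the frame to a shading rate based on its angular distance (eccentricity) from the gaze point. The specific radii and rates below are illustrative assumptions, not values from any real headset; production implementations tune these per display and per eye-tracker latency.

```python
import math

# Assumed thresholds, in degrees of eccentricity from the gaze point.
FOVEAL_RADIUS_DEG = 5.0   # full-resolution zone (matches the ~5-degree fovea)
MID_RADIUS_DEG = 20.0     # intermediate-resolution ring

def shading_rate(pixel_x_deg, pixel_y_deg, gaze_x_deg, gaze_y_deg):
    """Return the fraction of full resolution at which to render a region,
    based on its angular distance from where the eye tracker says the
    user is looking."""
    eccentricity = math.hypot(pixel_x_deg - gaze_x_deg,
                              pixel_y_deg - gaze_y_deg)
    if eccentricity <= FOVEAL_RADIUS_DEG:
        return 1.0   # full resolution where the eye sees detail
    if eccentricity <= MID_RADIUS_DEG:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution in the far periphery

# With the gaze at the center of the view, a region 3 degrees away renders
# at full quality, while one 30 degrees away renders at a quarter.
print(shading_rate(3.0, 0.0, 0.0, 0.0))   # 1.0
print(shading_rate(30.0, 0.0, 0.0, 0.0))  # 0.25
```

Because only a small disc around the gaze point renders at full resolution, the bulk of the frame can be shaded far more cheaply without the user noticing.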
Focused rendering

One of the challenges augmented reality headsets face is providing users with balanced information. Overwhelming the wearer with too much information can have an adverse effect, frustrating users by occluding their view of the real world or causing distraction and confusion. Users might miss important information when there are too many AR objects on the display. Ultimately, too much AR overlay can create safety hazards and cause physical harm to users.
To prevent these negative effects, AR developers have to reduce the number of AR objects they overlay on the display, possibly depriving users of information pertinent to the task at hand.
Eye tracking will enable AR developers to provide focused rendering. AR applications with eye tracking integration can show overlays only for the area the user is looking at, and hide the remaining AR objects or make them semi-transparent to avoid blocking the user’s view.
For instance, doctors or medics wearing smart glasses will be able to see the vital signs of patients only when they direct their gaze at them as opposed to seeing a bunch of figures floating in the corner of their field of vision all the time. Likewise, an AR application for watching sports will display the full stats of players when the user looks at them, and fade them when the user’s gaze drifts away.
Focused rendering is the AR equivalent of VR’s foveated rendering.
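One simple way to implement this behavior is to drive each overlay’s opacity by its angular distance from the gaze point: fully opaque near the gaze, fading to semi-transparent further out. The focus radius and opacity values below are assumptions for illustration, not any AR SDK’s actual parameters.

```python
import math

# Assumed focus radius, in degrees from the gaze point.
FOCUS_RADIUS_DEG = 10.0

def overlay_opacity(overlay_x_deg, overlay_y_deg, gaze_x_deg, gaze_y_deg):
    """Return an opacity in [0.2, 1.0] for an AR overlay: fully opaque
    inside the focus radius, fading linearly down to a faint 0.2 so
    distant overlays don't block the user's view of the real world."""
    distance = math.hypot(overlay_x_deg - gaze_x_deg,
                          overlay_y_deg - gaze_y_deg)
    if distance <= FOCUS_RADIUS_DEG:
        return 1.0
    fade = max(0.0, 1.0 - (distance - FOCUS_RADIUS_DEG) / FOCUS_RADIUS_DEG)
    return round(0.2 + 0.8 * fade, 2)

# An overlay 5 degrees from the gaze is fully visible; one 25 degrees
# away is rendered as a faint, semi-transparent hint.
print(overlay_opacity(5.0, 0.0, 0.0, 0.0))   # 1.0
print(overlay_opacity(25.0, 0.0, 0.0, 0.0))  # 0.2
```

A smooth fade, rather than an abrupt show/hide, keeps peripheral overlays discoverable without letting them clutter the wearer’s vision.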
A more interactive experience
By knowing where the user is looking, AR and VR applications can provide a richer experience. For instance, in VR gaming, characters can react when the player makes eye contact or stares at sensitive locations, such as a character’s purse. In multiuser VR environments, avatars will be able to reflect users’ gazes instead of looking like zombies with blank stares.
In AR applications, overlay objects will be able to react to the user’s gaze, such as becoming animated, performing functions or altering their visibility status.
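A common pattern for this kind of gaze-driven reaction is dwell detection: an object fires its behavior once the gaze has rested on it long enough, so a passing glance does not trigger anything. The class name, callback, and 0.5-second threshold below are illustrative assumptions, not a real engine’s API.

```python
# Assumed dwell threshold before an object reacts to the user's gaze.
DWELL_THRESHOLD_S = 0.5

class GazeTarget:
    """A virtual object that fires a callback once the user's gaze has
    dwelled on it for DWELL_THRESHOLD_S seconds."""

    def __init__(self, name, on_gaze):
        self.name = name
        self.on_gaze = on_gaze   # callback fired once per completed dwell
        self._dwell = 0.0
        self._triggered = False

    def update(self, is_gazed_at, dt):
        """Call once per frame with the eye tracker's hit-test result for
        this object and the frame time dt in seconds."""
        if is_gazed_at:
            self._dwell += dt
            if self._dwell >= DWELL_THRESHOLD_S and not self._triggered:
                self._triggered = True
                self.on_gaze(self.name)
        else:
            # Gaze left the object: reset so it can react again later.
            self._dwell = 0.0
            self._triggered = False

events = []
npc = GazeTarget("guard", lambda name: events.append(f"{name} reacts"))
for _ in range(40):              # ~0.67 s of gaze at 60 fps
    npc.update(True, 1 / 60)
print(events)  # ['guard reacts']  (fires once, after the dwell threshold)
```

Resetting the timer when the gaze leaves prevents accidental triggers from quick glances, while the one-shot flag keeps the object from reacting on every subsequent frame.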
Gaze-based interaction also addresses the problem of hand controllers. Due to the limited field of view of current headsets, users must keep their hands outstretched for controllers to be visible in the display, which can cause cramps and fatigue over long periods of use. Eye tracking gives users a second medium for interacting with VR and AR environments, minimizing the need for hand controllers.
Less physical and eye strain on the user
In real life, we change the direction of our view by moving our eyes sideways or vertically. But in AR and VR headsets without eye tracking, users must move their entire heads to shift their view. This puts a lot of strain on users’ necks, especially as they have to support the weight of the headset while constantly tilting their heads in different directions.
This can become an issue in work conditions, where users have to wear the headsets for long periods, and in fast-paced gaming action, where users constantly have to look in different directions.
Moreover, the lack of sync between the graphics and the eye movements can create artifacts that make the user uncomfortable.
Adding eye tracking to AR and VR headsets will ensure that users get as natural an experience as possible. Consequently, they’ll be able to use the headsets for longer periods with less strain on the head and neck.
Valuable eye data
Eye tracking devices generate a lot of valuable data about users’ eye movements. For instance, in medical and educational AR and VR applications, eye tracking can help spot pain points and direct users’ attention where it matters.
Eye tracking data also has analytical value, helping application developers obtain insights into how users view and interact with the virtual environments of AR and VR applications.
However, one important consideration is the invasive use of eye tracking data. Advertisers will want access to this information to optimize ad placement, which can become intrusive if used aggressively.
Eye tracking data also raises privacy concerns. Users might not want a cloud server storing information about where their gaze lingers and what objects they like to stare at. That’s why developers should explicitly ask users for permission before collecting data from eye tracking devices.
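In code, the consent requirement can be enforced with a simple gate in front of the collection pipeline, so that no gaze sample is stored before the user explicitly opts in. The class and sample format below are a minimal sketch under that assumption, not any platform’s real privacy API.

```python
class GazeDataCollector:
    """Buffers gaze samples only after the user has explicitly opted in."""

    def __init__(self):
        self.consent_given = False
        self._buffer = []

    def grant_consent(self):
        """Record the user's explicit opt-in before any collection starts."""
        self.consent_given = True

    def record(self, timestamp, gaze_x, gaze_y):
        """Buffer one gaze sample; silently drop it without consent."""
        if not self.consent_given:
            return False
        self._buffer.append((timestamp, gaze_x, gaze_y))
        return True

collector = GazeDataCollector()
print(collector.record(0.0, 0.1, 0.2))  # False: no consent, nothing stored
collector.grant_consent()
print(collector.record(0.1, 0.1, 0.2))  # True: sample buffered
```

Defaulting to "drop" rather than "collect" means a bug elsewhere in the application cannot leak gaze data before the user has agreed.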
The future of AR and VR
Several manufacturers are already integrating eye tracking technology into their upcoming AR and VR headsets. We’re also seeing new mobile devices that come packed with eye tracking capability, which will bring the same benefits to mobile AR and VR headsets.
The AR/VR industry is still nascent (at least this iteration of headsets is still in its early stages), and it will undergo a lot of change down the road. Eye tracking integration is just the beginning.