Why eye-tracking sensors will be VR’s next must-have feature

Opinion: Computing Editor Michael Passingham explains the joys of VR eye tracking
VR is in its very early days and there are heaps of problems to overcome, both in hardware and in software, but one I hadn’t thought of until yesterday has shot straight to the top of my list. Well, second on my list, after wireless VR. But that’s another story.
Yesterday at the Game Developers Conference (GDC) in San Francisco, Swedish eye-tracking firm Tobii gave a talk in which it, naturally, extolled the virtues of embedded eye trackers in VR headsets. We’ve already reviewed Tobii’s second consumer-level eye tracker and concluded that it isn’t the perfect solution for flat-screen gaming, and that not enough games support it.
But at GDC Tobii finally put forward a pretty convincing case as to why we should be demanding eye tracker support in the next generation of VR headsets, and even provided some early in-game examples of how it would work.
I’ve distilled my top five reasons why eye tracking will be essential in VR.
1. Performance
This is perhaps the most interesting one, and it’s based on a concept called foveated rendering that’s regained popularity thanks to VR. Foveated rendering takes advantage of the fact that our brains don’t bother grabbing much detail from our peripheral vision, instead taking in just the bare minimum and filling in the gaps with what we expect to be there.
With foveated rendering, the game tracks the player’s eyes and renders in high detail only the part of the scene they’re actually looking at. This means games can look way better without increasing the graphical horsepower required to run them. The rest of the scene can be shown at a lower quality, preserving just the few things our peripheral vision is more sensitive to, such as colour and contrast.
This solves one of the bigger problems with VR: rendering games at the high refresh rates and resolutions headsets demand is extremely hard, which is why the graphics hardware required for VR is so expensive. Rendering everything in the player’s periphery in full detail when they might not actually be looking at it is exceptionally wasteful.
This is a best-case scenario, of course, since actually programming this sort of tech is no doubt quite hard. However, it’s something GPU firm Nvidia is currently working on, so it’s not beyond the realms of possibility.
Take a look at a few exaggerated foveated rendering examples in the video below:
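If you want a feel for how the gaze-to-detail mapping could work in code, here’s a minimal Python sketch. It’s purely illustrative and not Nvidia’s or Tobii’s actual pipeline: the tile grid, quality tiers and radius values are all assumptions for the example.

```python
import math

def quality_for_tile(tile_center, gaze_point, fovea_radius=0.10, mid_radius=0.25):
    """Pick a shading quality for a screen tile based on how far it sits
    (in normalised screen units) from the tracked gaze point."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    dist = math.hypot(dx, dy)
    if dist <= fovea_radius:
        return "full"     # the player is actually looking here
    if dist <= mid_radius:
        return "half"     # near periphery: reduced detail
    return "quarter"      # far periphery: colour and contrast matter most

# Example: a tile in the top-right corner while the player looks near the centre
print(quality_for_tile((0.9, 0.9), gaze_point=(0.4, 0.5)))  # -> "quarter"
```

In a real renderer this decision would happen on the GPU, per tile or per pixel, every frame, but the principle is the same: detail follows the fovea.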
2. Player interactions
The benefit most gamers will actually notice is massively improved interaction with fellow players in multiplayer games. Instead of avatars wearing that thousand-yard stare where you’re never quite sure whether they’re looking at you, eye tracking lets you see exactly where they’re looking. Plus, they can wink at you! What’s not to like?
Game developer Against Gravity has put together a very short demo of how eye tracking makes playing poker in VR more fun, which you can see below.
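To make the idea concrete, here’s a rough Python sketch of the kind of gaze state a game could replicate to other players and apply to your avatar. This is my own illustration rather than how Against Gravity’s demo actually works; the field names and thresholds are made up.

```python
def avatar_gaze_state(gaze_dir, left_eye_openness, right_eye_openness):
    """Pack the locally tracked gaze into a small state dict that could be
    sent over the network and applied to the player's avatar remotely."""
    return {
        "gaze_dir": gaze_dir,                      # where the eyes point (unit vector)
        "left_open": round(left_eye_openness, 2),  # 0 = closed, 1 = fully open
        "right_open": round(right_eye_openness, 2),
        "winking": left_eye_openness < 0.1 and right_eye_openness > 0.9,
    }

# Example: player winks with their left eye while glancing slightly upwards
print(avatar_gaze_state(gaze_dir=[0.0, 0.2, 0.98],
                        left_eye_openness=0.05,
                        right_eye_openness=1.0))
```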
3. Calibration
Another problem unique to VR is the extremely narrow sweet spot where image quality is at its best. The headset has to be strapped on tight and your eyes have to sit in exactly the right place in front of the lenses. With eye-tracking sensors, Tobii reckons it’ll be able to help VR gamers get their headsets on straight, using on-screen prompts to nudge the headset into the best position, or by adapting the image based on where the player’s eyes end up.
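As a sketch of how such a prompt might work (an assumption on my part, not Tobii’s actual calibration routine; the sweet-spot coordinates and tolerance are invented), the headset could check how far each tracked pupil sits from the lens sweet spot:

```python
def fit_check(left_pupil, right_pupil, sweet_spot=(0.5, 0.5), tolerance=0.08):
    """Return a prompt if either tracked pupil sits too far from the lens
    'sweet spot' (all values in normalised sensor coordinates, 0..1)."""
    for name, (px, py) in (("left", left_pupil), ("right", right_pupil)):
        offset = ((px - sweet_spot[0]) ** 2 + (py - sweet_spot[1]) ** 2) ** 0.5
        if offset > tolerance:
            return f"Adjust the headset: {name} eye is {offset:.2f} off-centre"
    return "Headset position looks good"

# Example: the left eye is sitting noticeably off-centre
print(fit_check(left_pupil=(0.62, 0.48), right_pupil=(0.51, 0.50)))
```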
4. Understand gamers better
A fun one for developers: it would be technically possible to record exactly where gamers look the most when playing a VR game, data that could be shown visually on the levels themselves. This would be a handy way of understanding whether players can actually see the things the game wants them to be looking at, and why some players get stuck or bored in a level.
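A minimal sketch of the data-collection side, assuming the game can sample world-space gaze hit points; the grid size and data structure here are my own illustration rather than anything Tobii showed:

```python
from collections import Counter

def accumulate_gaze(samples, cell_size=1.0):
    """Bucket world-space gaze hit points into a coarse grid so designers
    can see which parts of a level draw the most attention."""
    heatmap = Counter()
    for x, y, z in samples:
        cell = (round(x / cell_size), round(y / cell_size), round(z / cell_size))
        heatmap[cell] += 1
    return heatmap

# Example: three gaze samples, two of which land in the same grid cell
samples = [(10.2, 1.5, 3.0), (10.4, 1.6, 3.1), (25.0, 2.0, 8.0)]
print(accumulate_gaze(samples).most_common(1))
```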
5. Make physical challenges easier
This is a subtle but useful one for games that require physical accuracy, such as throwing stones at bottles or playing darts. We all know that throws become more accurate when you’re looking at the object you’re attempting to hit, but that’s impossible to recreate in VR without eye tracking.
When the game knows where the player is looking, it can activate a little bit (or a lot) of auto-aim to simulate that increased accuracy. This could make precision-based VR games easier to play if implemented correctly.
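Here’s a minimal sketch of what that gaze-assisted aim could look like, assuming the game has a throw direction and a gaze direction as unit vectors. The blend factor is an invented example value, not a real implementation:

```python
def assist_throw(throw_dir, gaze_dir, strength=0.3):
    """Nudge a throw direction towards the direction the player is looking.
    strength=0 means no assist, 1 means the throw goes exactly where they look."""
    blended = [
        (1 - strength) * t + strength * g
        for t, g in zip(throw_dir, gaze_dir)
    ]
    # Re-normalise so only the direction changes, not the throw speed.
    length = sum(c * c for c in blended) ** 0.5
    return [c / length for c in blended]

# Example: the throw drifts slightly right of the bottle the player is staring at
print(assist_throw(throw_dir=[0.2, 0.0, 1.0], gaze_dir=[0.0, 0.0, 1.0]))
```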
Is any of this actually going to happen?
Valve has already started experimenting with this technology, implementing tech from eye-tracking firm SMI as well as Tobii. Valve has also added eye-tracking support to its OpenVR software development kit, so the path is well and truly clear for headset makers to start adding eye trackers to their hardware.
Tobii has also announced its own partnership with the open-standards group Khronos to produce an open platform that allows eye tracking to be built easily into future VR headsets.
Keep your eyes on it.