
Tobii proves that eye tracking is VR’s next killer feature

Devindra Hardawar, @devindra

January 13, 2018
 


There are plenty of ways virtual reality headsets could get better. They could offer higher-resolution screens (like the new Vive Pro), a wider field of view and improved built-in tracking sensors. But another feature might be even more essential: eye tracking. It's not a new concept; we've been following FOVE's eye-tracking headset, as well as 7Invensun's Vive accessory, for a few years now. But it seems more important than ever as consumer VR ramps up. Tobii, a company that's been exploring the potential of eye tracking for a while, is hoping to integrate its technology into the next generation of VR headsets. And based on some demos I saw, it's clearly not a question of if VR headsets will get eye tracking. It's when.

Tobii first showed off its technology integrated into the HTC Vive at GDC last year. But at CES, it unveiled some new experiences to demonstrate the benefits of eye tracking. I was surprised to find that, aside from some sensor rings around the Vive's lenses, it didn't look as if the company had added much to the headset. Instead, its hardware fits seamlessly inside the Vive.

To calibrate the tracking, I followed a dot around the display for a while using just my eyes. Then I was presented with a mirror that reflected my VR avatar. It tracked my head movements, as usual, but the eyes were blank and expressionless. Next, I moved on to another mirror with eye tracking enabled. When I blinked, my avatar blinked. It's a small thing, but it went a long way toward making the experience feel more immersive. I wasn't constantly reminded of the limitations of expression in VR.

Then I moved on to a screen featuring two robots. When I glanced at them, they made direct eye contact and responded with text messages. There was an uncanny social awareness to them, as if they were actually aware that I wanted to have a conversation. This sort of feedback could easily make it seem like you're chatting organically with game characters. And it could be even more useful in social VR environments; just imagine how awkward it'd be if we were stuck with boring avatars that didn't reflect our eye movements.

Another surprisingly simple demo involved throwing rocks at far-off bottles. Without eye tracking, it was almost impossible to knock anything down accurately. But with the feature turned on, all I had to do was focus on a bottle, throw the rock with enough virtual momentum, and down it went. As I smoothly knocked down most of the bottles in the scene, I almost felt like I had superpowered accuracy.

New for CES was a trio of experiences showing off Tobii's technology. One was a virtual living room, where I could select something to watch just by moving my eyes across a media library. Today, you'd have to either rely on a controller's touchpad or crane your entire head around to interact with virtual objects, a clunky replacement for something you can do easily in real life, like scrolling through your Netflix queue. Gaze selection isn't just smoother; it adds a capability that simply wasn't possible without eye tracking.

Next, I found myself sitting in a virtual loft playing an augmented reality game. On my left was Mars, while Earth was on my right. The goal was to launch rockets from Mars and make them hit the alien ships floating around Earth. I could spin both planets, which changed the angle of the rockets and the ships, and I also had a button for turning Tobii's tech on and off. Naturally, the game was much easier to play when I could just look at a planet and rotate it with the Vive controller's touchpad. Selecting a planet manually with the controller was far less fluid and made the game nearly impossible to play.

I also played through a scenario similar to Star Trek: Bridge Crew, which involved manipulating a daunting number of buttons and dials on a spaceship. If you've played that Star Trek VR game, you know that one of its toughest parts is hitting the right button at the right time. With eye tracking in Tobii's scenario, I only had to look at a button to select it. The company's tracking did a solid job of choosing the right button most of the time, even though the demo had plenty of other selectable objects nearby.
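Tobii doesn't reveal how its software decides which object you're focused on, but the basic idea is easy to sketch: treat the reported gaze as a ray and select whichever interactive object sits closest to it, within a small angular tolerance. The Python snippet below is a minimal, hypothetical illustration of that approach; the function names, threshold and button layout are invented for the example and aren't Tobii's actual API.

```python
import math

# Hypothetical sketch of gaze-based selection, not Tobii's actual implementation.
# Assumes the eye tracker reports a gaze origin and a (roughly) unit direction
# vector in the same coordinate space as the selectable objects.

def angular_offset(gaze_origin, gaze_dir, target_pos):
    """Angle (radians) between the gaze ray and the direction to a target."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    length = math.sqrt(sum(c * c for c in to_target))
    if length == 0:
        return 0.0
    dot = sum(d * c for d, c in zip(gaze_dir, to_target)) / length
    return math.acos(max(-1.0, min(1.0, dot)))

def pick_target(gaze_origin, gaze_dir, targets, max_angle=math.radians(3)):
    """Return the target closest to the gaze ray, or None if nothing is near it."""
    best, best_angle = None, max_angle
    for target in targets:
        angle = angular_offset(gaze_origin, gaze_dir, target["position"])
        if angle < best_angle:
            best, best_angle = target, angle
    return best

# Example: three buttons on a virtual console, with the gaze pointed almost at "fire".
buttons = [
    {"name": "fire",    "position": (0.00, 1.2, 2.0)},
    {"name": "shields", "position": (0.30, 1.2, 2.0)},
    {"name": "warp",    "position": (-0.30, 1.2, 2.0)},
]
selected = pick_target((0, 1.2, 0), (0.01, 0.0, 1.0), buttons)
print(selected["name"] if selected else "nothing selected")
```

In practice, a real implementation would also smooth the gaze signal and require a brief dwell time before committing to a selection, since raw eye-tracking data jitters constantly.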

In addition to simply making VR interaction more fluid, Tobii claims that eye tracking will also allow for more efficient foveated rendering. That's a technique that concentrates your computer's graphics power on the part of the image you're focused on, while rendering your peripheral vision at lower quality. Normally, a headset has to be rendered at high quality across the entire screen, no matter where you're actually looking. With eye tracking, the system can devote full quality to exactly what your eyes are pointed at, while slightly downgrading everything around it. Tobii quietly enabled the feature during my last demo, and I was surprised that I didn't even notice it in action. The big benefit? It could make it easier to run VR on slower systems.
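Tobii didn't share the specifics of its foveated pipeline, but the underlying math is simple enough to sketch: estimate how far each region of the screen sits from the gaze point in degrees of visual angle, then shade the distant regions at a reduced rate. The Python sketch below is purely illustrative; the zone sizes, shading rates and tile grid are made-up numbers, not figures from Tobii.

```python
import math

# Illustrative sketch of gaze-contingent foveated rendering, not any vendor's
# actual implementation. Idea: shade the tiles the eyes are pointed at in full
# detail, and reduce the shading rate as tiles get farther from the gaze point.

def shading_rate(tile_center, gaze_point, fov_degrees=100.0, screen_size=1.0):
    """Fractional shading rate for a screen tile, based on how far (in degrees
    of visual angle) its center is from the current gaze point."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity = math.hypot(dx, dy) / screen_size * fov_degrees
    if eccentricity < 10:      # foveal region: full quality
        return 1.0
    elif eccentricity < 25:    # near periphery: half the shading work
        return 0.5
    else:                      # far periphery: quarter quality
        return 0.25

def average_workload(gaze_point, tiles_per_axis=16):
    """Average shading cost over a tile grid, relative to full-quality rendering."""
    rates = []
    for i in range(tiles_per_axis):
        for j in range(tiles_per_axis):
            center = ((i + 0.5) / tiles_per_axis, (j + 0.5) / tiles_per_axis)
            rates.append(shading_rate(center, gaze_point))
    return sum(rates) / len(rates)

# Whether the gaze lands in the middle of the screen or near a corner, most
# tiles fall in the periphery, so the GPU shades far fewer pixels overall.
print(average_workload((0.5, 0.5)))   # roughly a third of the full-quality cost
print(average_workload((0.1, 0.1)))
```

Even with invented numbers, the payoff is visible: when full quality is confined to wherever the eyes actually land, the GPU shades only a fraction of the pixels it otherwise would, which is exactly the headroom that could let slower systems run VR.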


While VR is the most immediate and obvious fit for Tobii, the company is still aiming to work with more PC manufacturers to build eye tracking into their laptops. Currently, more than 100 games support the technology as well. You can also expect to see Tobii's eye tracking in even thinner laptops over the next few years. (Right now it's mainly relegated to beefy gaming notebooks.) The company gave me a glimpse at its upcoming "IS5" sensor design, which is significantly smaller and slimmer than its current module. In particular, the camera has been dramatically shrunk down.

Tobii’s CEO, Henrik Eskilsson, told us that eye tracking will eventually be viewed as a requirement for VR. And I’m inclined to believe him. Accurate eye tracking delivers a better sense of presence, which is really the ultimate goal for virtual reality. Trying Tobii’s technology for just 30 minutes has already ruined me for every VR headset without it. I’d call that a success.

Click here to catch up on the latest news from CES 2018.


 
