Look, ma... hands

Surprise! Oculus Quest becomes first VR set with native hand tracking—this week

Goes live "this week"; a promising October tease has us cautiously optimistic.

Starting this week, the Oculus Quest VR headset becomes even more tantalizing by adding a feature we've never seen ship as a built-in option in a VR system: hand tracking. VR users will be able to put down their controllers and use their fingers to manipulate VR worlds, as tracked by Quest's array of built-in cameras.

The feature received a tease at October's Oculus Connect 6 conference and got an "early 2020" launch window from Facebook CEO Mark Zuckerberg. But someone on the Oculus engineering team clearly got ahead of Lord Zuck's schedule, shipping the feature a bit early: it will land in an "experimental" tab in Quest's settings menus as a free update by week's end.

Today's news comes with two important asterisks. First, there's no fully fledged VR software available for the feature yet. At launch, the experimental feature will only work within Oculus Quest's root menu, which at least includes photo and multimedia viewing tabs. Within "a week" of the toggle going live, a Software Development Kit (SDK) for Quest hand tracking will follow for Oculus developers, allowing them to tap into Oculus' hand-tracking system and potentially implement it in various games and apps.

And second, Oculus is limiting its hand-tracking framework to the Quest ecosystem. This update isn't coming to the PC-centric Rift or Rift S headsets, and it won't work if you use Oculus Link to connect a Quest to your favorite PC VR games.

First of its kind, for a reason

Normally in VR, users grab onto controllers full of triggers and buttons. For some VR software, a piece of handheld plastic makes sense: it can sell the sensation of holding a weapon or VR item, and it adds haptic feedback like rumbling when your real-life hand gets near VR objects. But there's something to be said for lifting your empty hands in the VR sky and seeing your real fingers wiggle, which, based on pre-release tests, we can confirm Oculus Quest hand tracking nails.

We've seen hand-tracking experiments on other VR headsets, but these have largely come in the form of proprietary add-ons like Leap Motion, which require additional hardware and a bolted-on rendering pipeline. These systems have been impressive enough as tested at various tech expos, but VR hand tracking has always been underwhelming in execution—just imprecise enough, in terms of recognizing individual fingers and "pinch" gestures, compared to the "it just works" appeal of a compatible controller.

Quest's native hand-tracking support, on the other hand, taps into the headset's existing camera array, and it may very well work without adding a processing burden to the system (though we'll have to wait for the SDK to know for sure). That reduced friction is the best news for a hand-tracking system that, at launch, is admittedly simple and limited.

“Fwsshht” like Wolverine, but not yet

Right now, Quest turns your empty hands into laser pointers that can manipulate menus. Leave your hands somewhat open, like you're about to pinch a pesky fly buzzing around, to make a pointer appear on a distant menu, as aimed by your hands' orientation. Quickly pinch your index finger and thumb to "click" any menu buttons, or hold your pinch to drag menu elements like a scrolling list or volume slider.
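The pinch-to-click interaction described above boils down to a simple idea: treat the distance between the thumb and index fingertips as a button. The sketch below is purely illustrative and assumes nothing about Oculus' actual SDK (which isn't public yet); the function names, the tuple-based point format, and the ~2 cm threshold are all assumptions for the sake of the example.

```python
import math

# Illustrative sketch only; not Oculus' API. Assume a hand tracker
# reports 3D fingertip positions as (x, y, z) tuples in meters.

PINCH_THRESHOLD_M = 0.02  # assumed: ~2 cm gap counts as a closed pinch


def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def is_pinching(thumb_tip, index_tip):
    """A 'click' fires when the thumb-index gap closes below the threshold."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD_M


# Fingertips 6 cm apart: open hand, no click.
print(is_pinching((0.0, 0.0, 0.0), (0.06, 0.0, 0.0)))  # prints False
# Fingertips 1 cm apart: pinch registered.
print(is_pinching((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # prints True
```

The "hold your pinch to drag" behavior Oculus describes would just be this same test sampled every frame: a click on the falling edge, a drag while it stays true.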

Based on tests in October, we know that the system natively recognizes a few hand gestures, particularly the balling of fists. (In one fantasy-themed test, I could dunk my real-life hands in a vat of virtual goo, then ball my hands into fists to make Wolverine-like blades "fwsshht" out of my virtual knuckles.) But those tests revealed two weaknesses: the inability to recognize hands when they are touching each other, and a relatively narrow "vision cone" for hand tracking. If your hands aren't front-and-center in your VR field of vision, they'll vanish and require an awkward moment to reappear.

When I did things the way Oculus wanted me to, at least, the system as tested in October was fast and accurate. I could wave my hands around, then point at distant objects or poke nearby ones and expect instant visual feedback. As a result, I expect the hand-tracking update to be an interesting one for apps outside the gaming ecosystem, from real-life job training simulations to media apps. I doubt that this system will replace controllers in demanding games and apps any time soon, but as an experimental freebie, and one whose simplest use cases actually work, I'm OK with it.

What's more, this update puts the ball in other VR headset makers' courts. The Valve Index and most Windows Mixed Reality sets include a similar array of outward-facing cameras; Oculus' primary differentiation here is software engineering, not unique cameras. Who's next to step up to hand tracking?

Listing image by Oculus
